GPEmu: A GPU emulator for rapid, low-cost deep learning prototyping [pdf] (vldb.org)
80 points by matt_d 6 days ago | 13 comments
  • mdaniel 6 days ago

    Sadly, there's no license file in the repo, and I have no idea what licensing weight the associated fields in setup.py carry <https://github.com/mengwanguc/gpemu/blob/27e9534ee0c3d594030...>
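
    For anyone who hasn't clicked through: the fields in question are plain setuptools metadata. Something like this (an illustrative sketch, not copied from the repo):

        from setuptools import setup

        # setuptools lets a package declare a license purely as metadata,
        # without any LICENSE file existing in the source tree.
        setup(
            name="gpemu",
            version="0.1",
            license="MIT",
            classifiers=["License :: OSI Approved :: MIT License"],
        )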

    • 0points 5 days ago | parent

      MIT is a well-known FOSS license.

      • devturn 5 days ago | parent

        Nobody here is doubting that. Your parent comment said:

        > I have no idea what licensing weight the associated fields in setup.py carry

        That's a valid concern. I had the same question myself.

        • immibis 5 days ago | parent

          The relevant weight is: if the author of the copyrighted work sues you in a court of law, will the evidence convince the judge that the author gave you permission to use the work the way you did?

          • IX-103 5 days ago | parent

            That assumes you are willing to pay for lawyers. If not, the relevant weight is "will the author (or any subsequent copyright owners) sue you".

            • immibis 4 days ago | parent

              If I sued you for wearing green pants, would you stop wearing green pants?

  • almostgotcaught 6 days ago

    > To emulate DL workloads without actual GPUs, we replace GPU-related steps (steps #3–5, and Step 2 if GPU-based) with simple sleep(T) calls, where T represents the projected time for each step.

    This is a model (of GPU arch/system/runtime/etc.) being used to feed downstream analysis. Pretty silly, because if you're going to model these things (which are extremely difficult to model!), you should at least have real GPUs around to calibrate/recalibrate the model.
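
    To make the quoted mechanism concrete, a minimal Python sketch (based only on the quoted description; the step names, the projected_time table, and the recalibrate helper are hypothetical, not GPEmu's actual API):

        import time

        # Hypothetical per-step time model, e.g. fitted offline from traces
        # collected on a real GPU. This is the part that needs calibration.
        projected_time = {
            "h2d_copy": 0.004,  # seconds: host-to-device transfer
            "forward": 0.012,   # seconds: forward pass on the GPU
            "backward": 0.025,  # seconds: backward pass on the GPU
        }

        def emulated_gpu_step(step_name: str) -> None:
            # Instead of launching a kernel, sleep for the projected duration.
            time.sleep(projected_time[step_name])

        def train_iteration_emulated() -> None:
            # CPU-side steps (data loading, preprocessing) still run for real;
            # only the GPU-related steps are replaced with sleeps.
            for step in ("h2d_copy", "forward", "backward"):
                emulated_gpu_step(step)

        def recalibrate(step_name: str, measured: float, alpha: float = 0.2) -> None:
            # The critique above: without occasional measurements from a real
            # GPU, the projections drift. An exponential moving average over
            # fresh measurements is one simple way to keep them honest.
            projected_time[step_name] *= 1 - alpha
            projected_time[step_name] += alpha * measured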

  • socalgal2 6 days ago

    What is the difference between a GPU emulator (maybe specifically GPEmu) and, say, llvmpipe?

  • Voloskaya 5 days ago

    I was thinking about building something like this, because it would be *very useful* if it worked well, so I got excited for a sec. But this does not seem to be an active project; the last commit was 10 months ago.

  • Retr0id 5 days ago

    Does it not work out more expensive to emulate a GPU vs just renting time on a real one?

    • Voloskaya 5 days ago | parent

      This isn't actually an emulator in the proper sense of the word. It does not give you correct outputs, but it will try to simulate the actual time it would take a real GPU to perform the series of operations you care about.

      This could be useful, e.g., for performance profiling, optimization, etc.
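
      For instance, a timing-only model lets you ask capacity-planning questions with no hardware at all. A sketch (all numbers made up):

          # Is the data loader or the GPU the bottleneck? With per-step time
          # projections you can answer this without renting anything.
          cpu_load_time = 0.030  # s/batch, measured on the real CPU
          gpu_step_time = 0.041  # s/batch, projected (made-up number)

          for num_workers in (1, 2, 4, 8):
              # Loading overlaps with GPU work; more workers shrink the
              # effective per-batch load time.
              effective_load = cpu_load_time / num_workers
              batch_time = max(effective_load, gpu_step_time)
              print(f"{num_workers} workers: ~{batch_time * 1000:.0f} ms/batch")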

    • MangoToupe 5 days ago | parent

      I imagine this is only true for high-throughput loads. For development, a full GPU is likely a waste.
