The PowerPC Has Still Got It (Llama on G4 Laptop) (hackster.io)
53 points by stmw 2 days ago | 18 comments
  • buildbot 2 days ago

    Oh someone else is as silly as I am? I hacked this together a few months ago as well! I guess I should have written it up.

    I’ve been getting llama.cpp going on various weird, old systems as I can and qwen3.c where llama.cpp has no hope. So far, I’ve tried various sparc generations (IIi, IIIi, Fujitsu M10, and an Oracle M7), a C8900 PA-RISC, some riscv boards, an Alpha 21264, POWER 9, and many X86 and ARM systems of course.

    • _rpf 2 days ago | parent

      I'm compelled to humblebrag my SGI at this point … https://youtu.be/mzI8U7S0FDc?si=D70WbAak7_k7Ebrr

      • buildbot a day ago | parent

        Amazing!!! It’s really fun to see an SGI system + a modern LLM.

    • actionfromafar 2 days ago | parent

      Now, try https://www.winuae.net/

    • yjftsjthsd-h 2 days ago | parent

      Yes, you should definitely write that up and post it :)

  • jchw 2 days ago

    I am pretty sure Apple did not design or manufacture PowerPC chips at any point, so I'm not sure how that would be considered "custom" silicon.

    And anyway, the source article seems a bit more interesting.

    https://www.theresistornetwork.com/2025/03/thinking-differen...

    • stmw a day ago | parent

      Apple was the "A" in the AIM alliance that created PowerPC, together with IBM and Motorola. https://wiki.preterhuman.net/The_Somerset_Design_Center

      • jchw a day ago | parent

        I'm aware of that, but unless I just missed something, I've never heard that they were ever designing chips.

        • stmw a day ago | parent

          Yeah, I don't think this was an equal 1/3 + 1/3 + 1/3 partnership, but various written histories do document Apple engineering involvement:

          "So, with the goal of maintaining RS/6000 software compatibility, a team of architects from IBM, Apple, and Motorola set out to refine the architecture ... IBM and Motorola, with Apple engineering participation, have put into operation a new design center to develop future PowerPC microprocessors. The Somerset Design Center is a 37,000 square-foot facility located in Austin, Texas, staffed primarily by Motorola and IBM with approximately 300 engineering professionals. The design center is presently working concurrently on three separate PowerPC microprocessors." (https://www.thefreelibrary.com/History+of+the+PowerPC+archit...)

          The intro to the PowerPC Architecture book includes the following:

          "We would like to acknowledge Keith Diefendorff, Ron Hochsprung, Rich Oehler, and John Sell for providing the technical leadership that made it possible for the group of architects, programmers, and designers from Apple, Motorola, and IBM to produce an architecture that met the goals established by the alliance these companies formed.

          Many people contributed to the definition of the architecture, and it is not practical to name each of them here. However, a core group worked long hours over an extended period contributing ideas, evaluating options, debating costs and benefits of each proposal, and working together toward the goal of establishing a competitive architecture for the member companies of the alliance. This group of dedicated professionals included Richard Arndt, Roger Bailey, Al Chang, Barry Dorfman, Greg Grohoski, Randy Groves, Bill Hay, Marty Hopkins, Jim Kahle, Chin-Cheng Kau, Cathy May, Chuck Moore, Bill Moyer, John Muhich, Brett Olsson, John O'Quin, Mark Rogers, Tom Sartorius, Mike Shebanow, Ed Silha, Rick Simpson, Hank Warren, Lynn West, Andy Wottreng, and Mike Yamamura."

    • buildbot 2 days ago | parent

      Interesting, that article says llama.c, not llama.cpp. I actually got llama.cpp going on a G4 a while back, I guess I should write that up.

      Edit - I just can’t read, original article was llama.c

      Gotta push my powerpc llama.cpp fork now for sure!

    • DogRunner 2 days ago | parent

      Apple didn't design the PowerPC or make custom variants; Motorola and IBM did. AltiVec in particular was added by Motorola, and IBM was reluctant to add it to their PowerPC CPUs when Apple asked for help during Motorola's 500 MHz glitch bug back in the day.

      There is nice coverage of this topic at https://www.youtube.com/watch?v=Tld91M_bcEI (Why the Original Apple Silicon Failed)

  • pizlonator 2 days ago

    That's awesome!

    I think that's the 12" G4 - still my favorite laptop ever, in terms of looks and form factor.

    • forgotoldacc 2 days ago | parent

      That generation of Apple laptops was my tech awakening. I always thought of computers as tools just for office work and nothing I'd ever want to use. But one day I sat down in front of a G4 iBook and was like, man, this thing is beautiful. And it's pretty fun to use. I got an iMac a couple weeks after that and it set me on my programming career.

      And just looking at that picture in the article, that keyboard is beautiful. Apple truly had some incredible design sense. It's very unfortunate how rough their design decisions have been the past few years.

      • stmw a day ago | parent

        Re: "very unfortunate how rough their design decisions have been the past few years" - one sometimes wonders if this is the inspiration:

        https://vinpaq.com/compaq-collection

  • markgall 2 days ago

    I still have my old PowerBook G4 from 2005, with some not-that-old Debian currently installed. Every time my main laptop goes out of commission, I get the G4 back out and use it for a few days. It's good enough for most of my work, though modern web-browsing is a challenge. (Maybe one that somebody has solved; I haven't dug at all.)

    • yjftsjthsd-h 2 days ago | parent

      > though modern web-browsing is a challenge. (Maybe one that somebody has solved, I haven't dug at all.)

      The usual solution is to run the real browser somewhere else and remote into it, e.g. https://github.com/tenox7/wrp or https://www.brow.sh/

  • anon291 2 days ago

    There's nothing mysterious about AI. It's matrix and tensor ops, which have been used for decades now. Hardware is pretty good at such things because the memory accesses are nicely arranged.

    • nuc1e0n 2 days ago | parent

      The memory accesses being nicely arranged is kind of why the focus has moved to AI in recent years. Moore's law is that much easier to keep going if parallelization increases, as with GPUs and SIMD on CPUs. That extra silicon needs to be made productive somehow to be justified.