There is this part in there:
> Are our tools just worse now? Was early 2000s PHP actually good?
Not sure how rhetorical that was, but of course? PHP is a super efficient language that is tailor-made for writing dynamic web sites, unlike Go. The author mentions a couple of the features that made the original version easier to write and easier to maintain; they exist precisely for this use case, like $_GET.
And if something like a template engine is needed, as it will be once the project gets a little bigger, then PHP supports that just fine.
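To illustrate what I mean (a made-up sketch, not code from the article): a whole dynamic page can live in one file, and the query string is already parsed for you.

    <?php
    // Hypothetical one-file "app": ?name=Alice is already parsed into $_GET.
    $name = isset($_GET['name']) ? $_GET['name'] : 'guest';
    $name = htmlspecialchars($name); // escape before echoing back
    ?>
    <html>
      <body>
        <h1>Hello, <?php echo $name; ?>!</h1>
      </body>
    </html>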
> Max didn't need a request router, he just put his PHP file at the right place on the disk.
The tendency to abstract code away leads to complexity, while a genuinely useful abstraction is about minimizing complexity. Here, the placement of PHP files makes stuff easier -> it's a good abstraction.
And that's why the original code is so much better.
> Max didn't need a request router, he just put his PHP file at the right place on the disk.
This also elides a bit of complexity; if I assume I already have the Nginx and gunicorn process then my Python web server isn’t much worse. (Back in the day, LAMP stack used Apache.)
I’ll for sure grant the templating and web serving language features though.
It still runs much of Facebook, I think
Only kind of, they have their own language (Hack) that descends from PHP. It’s JIT instead of interpreted, and breaks back-compat in a few ways.
PHP is also JITed nowadays. Currently I believe the main advantage is that Hack is async: you can fire multiple SQL/HTTP requests in parallel and cut some wall time.
This part really hits home. The first time I got to see a huge enterprise C project, I could not believe how simple the code was. Few to no tricks.
> To be perfectly honest, as a teenager I never thought Max was all that great at programming. I thought his style was overly-simplistic. I thought he just didn't know any better. But 15 years on, I now see that the simplicity that I dismissed as naive was actually what made his code great.
I've seen the opposite in many Rust codebases. In some repos, devs like to use every clever trick in the Rust book, which makes me uncomfortable because it takes me a while to understand exactly what they are doing.
It’s a fun trip down memory lane, but the real story today, the sadder story, is that there is no longer any use for simple little programs like this that scratch an itch.
They’ve all been solved 100x over by founders who’ve been funded on this site. It used to make sense to have a directory or cgi-bin of helpful scripts. Now it only makes sense as a bit of nostalgia.
I miss the days when we had less, could get less done in a day… but felt more ownership over it. Those days are gone.
I would argue those days are coming back. Thanks to LLMs, I have probably 10x more "utility" scripts/programs than I had 2 years ago. Rather than bang my head against the wall for a couple hours to figure out how to (just barely) do something in Python to scratch an itch, I can get a nice, well documented, reusable and versatile tool in seconds. I'm less inclined than ever to go find some library or product that kinda does what I need it to do, and instead create a simple tool of my own that does exactly what I need it to.
Just please if you ever give that tool to someone else to use, understand, maintain, or fix, mention that it was created using an LLM. Maybe ask your LLM to mention itself in a comment near the top of the file.
The 'as is' nature of open source applies regardless of whether a human or LLM wrote the code.
Who said it was open source?
> They’ve all been solved 100x over by founders who’ve been funded on this site.
I’m kind of getting tired of software made by “founders,” who are just looking to monetize me and get their exit, as opposed to software written by normal users just wanting to be useful. I know I’m on the wrong website for this, but the world could use fewer “founders” trying to turn my eyeballs and attention into revenue.
There is still use for small niche programs. I host my own gif repository, a website for collecting vinyls and my own weather dashboard. I don’t expect anyone else to use these sites so they’re tailored to my user experience and it’s great.
I have many thousands of small tools and sites. Some have a few other users, most do not. It makes me productive, so hey.
> It’s a fun trip down memory lane, but the real story today, the sadder story, is that there is no longer any use for simple little programs like this that scratch an itch.
> They’ve all been solved 100x over by founders who’ve been funded on this site. It used to make sense to have a directory or cgi-bin of helpful scripts. Now it only makes sense as a bit of nostalgia.
Why does it make more sense to learn the syntax for someone else's helper scripts than to roll my own, if the latter is as easy or easier, and afterwards I know how to solve the problem myself?
Because time is finite and you probably set out to achieve something else which is now on hold. Nothing wrong with distractions but let's not glorify them :).
> Because time is finite and you probably set out to achieve something else which is now on hold. Nothing wrong with distractions but let's not glorify them :).
That's true, but it was also true before. To the extent that solving a problem to learn the details of solving it was ever worthwhile, which I think is and was quite a lot, I'd say it's still true now, even though there are lots of almost-but-not-quite solutions out there. That doesn't mean that you should solve all problems on your own, but I think you also shouldn't always use someone else's solution.
It still makes sense to self-host, to have that ownership.
"Itch-scratching" programming is all I ever do now, as my career pivoted away from being a full time developer a long time ago.
But they're personal itches, not productizable itches. The joy is still there, though.
Reading this reminds me of the era that, back when I was in college (which was not long ago), we envisioned would happen: individuals and societies building their own independent custom stuff (both hardware and software) with the power of computers in everyone's hands. I am sure that is still happening in small pockets, but most of the 'stuff' we use is built by large mindless corporations over which we have almost no control, and which prioritize profits over the well-being of their employees and the community.
I don't know for sure what the problem was (I have my theories) or why we could not get to a place where most people build their own custom products.
This is something that's been on my mind a lot over the past few years. I think things were on that trajectory, but somewhere along the line it got out of whack.
User interfaces became more user-friendly [0], while developer experience - though simpler in many ways - also became more complex, to handle the complex demands of modern software while maintaining a smooth user experience. In isolation both of these things make sense. But taken together, it means that instead of developer and user experience converging into a middle place where tools are a bit easier to learn and interfaces a bit more involved, they've diverged further, to where all the cognitive load is placed on the development side and the user expects an entirely frictionless experience.
Specialization is at the core of our big interconnected society, so it's not a surprising outcome if you look at the past century or two of civilization. But at the same time I think there's something lost when roles become too segregated. In the same way homesteading has its own niche popularity, I believe there's a latent demand for digital homesteading too; we see its fringes in the slow rise of things like Neocities, the indie web, and open source software over the past few years.
Personally I think we just have yet to see the 'killer app' for digital homesteading, some sort of central pillar or set of principles to grow around. The (small) web is the closest we have at the moment, but it carries a lot of technical baggage with it, too much to be able to walk the fine line needed between approachability and flexibility.
Anyway, that's enough rambling for now. I'll save the rest for a blog post.
[0] user-friendly as in being able to use it without learning anything first; not that that's necessarily in the user's best interest
A bunch of useful insights in your reply. I really liked the point about user interfaces getting simpler while the developer experience gets more complex. A counterargument that comes to mind: the violin has one of the most difficult UIs, yet a lot of people spend a lot of time mastering it and enjoy creating music with it, often independently or in small bands. How could that happen with more people in development? Maybe making the developer experience more joyful is the way to go. I'm not against specialization, but specialization can be done at a small community level too.
Who said you can't do that now? This article is about Apache and PHP and a single script. You can do that today.
Oh the point wasn't that we can't do it now. The point is that not enough people choose to make their own custom software and the systemic reason behind it.
Mostly unrelated, but I dislike how normalized AI art is.
I don't think it's unrelated at all. I saw the same picture and just closed the tab right away. Why should I read this article? The whole thing might be written by an LLM.
I think adding an AI image to filter out readers who think that way might have been intentional.
I certainly consider it a good idea, now that it has come to mind.
Could you also tag it “AI enhanced” or some such for us as well? Thanks.
Nope. I don't use AI to write anything. I will just put an obviously AI-generated image in the article to give those who make assumptions a reason to bail.
and it will work very well.
What’s wrong with using AI to write something?
The neo-Luddite filter
If anything, the ubiquity of the style he used makes it into a deliberate meme. It's a little joke.
Your comment reminds me of people complaining about how using emoji in communications/text has become normalized. Generating images with AI is pretty fun and seems like an appropriate thing to do for a personal blog. As in, this is the exact sort of place where it's most appropriate.
It's not like this person was ever going to pay someone to make a cartoon drawing so nobody lost their livelihood over it. Seems like a harmless visual identifier (that helps you remember if you read the article if you stumble across it again later).
Is it really such a bad thing when people use generative AI for fun or for their hobbies? This isn't the New York Times.
Same. I would have preferred a lo-fi stick figure drawn on a napkin. The cartoon Max detracts from the rest of the article, which is a good read.
Dislike it or not, it's normalised, as is everything else that is useful.
It’s not really useful in most instances.
Useful? I concur with a sibling comment. I stopped reading and closed the article as soon as I saw it.
This happened to me too (almost subconsciously, I might add). I'm actually not anti-AI at all, maybe just a bit uninterested in AI-made art, since I don't see much use for it beyond generating fun pictures of Golden Retrievers in silly situations. But this imitation-Ghibli art style is probably one of the least pleasing things to my eye that people love making: it's so round and without edge, its colors are washed out in a very inoffensive way, and it doesn't even look like the source material. I wouldn't be so aggrieved by it, I think, if there hadn't been that wave where everyone and their dog was making pictures in that style. Sorry, just a small rant tangentially related to the article, which is fine. :)
But imgbin is a tiny project with exactly one moving part that only interacts with the file system and the user.
When the project becomes more complex, things change for the worse.
Also, you need to protect modules not only from errors, but from the other programmers in your team.
I appreciated this, just because my name is Max and any sort of validation from HN is hard to come by.
Also, the image kinda looks like me. It's not me though. I don't think.
To make it a fair comparison, you also need to consider all the old-school Apache and PHP config files required to get that beautiful little script working. :) I still have battle scars.
Ahh, LAMP stacks… I remember there was a distro that had everything preconfigured for /var/www/ and hardened, but for the life of me I can't remember its name.
A lot of distros did and still do that. Getting an Apache instance up and running with PHP running as a CGI process was just a matter of installing the right packages on RedHat-derived distros going back to the early 2000s, for example.
They weren't hardened at all. Installing LAMP is one thing, ensuring it's secure is another. Even RedHat would send an SA to your place to do that for you.
Fair enough. I wasn't getting the emphasis on hardening in your comment since the parent was just talking about the "battle scars" of configuration.
Re: hardening - I guess I deployed a lot of "insecure" LAMP-style boxes. My experience, mainly w/ Fedora Core and then CentOS, was to turn off all unnecessary services, apply security updates, limit inbound and outbound connectivity to only the bare minimum necessary w/ iptables, make sure only public key auth was configured for SSH, and make sure no default passwords or accounts were enabled. Depending on the application grubbing thru SELinux logs and adjusting labels might be necessary. I don't recall what tweaks there were on the default Apache or PHP configs, but I'm sure there were some (not allowing overrides thru .htaccess files in user-writeable directories, making sure PHP error messages weren't returned to clients, not allowing directory listings in directories without a default document, etc).
Everything else was in the application and whatever stupidity it had (world-writeable directories in shitty PHP apps, etc). That was always case-by-case.
It didn't strike me as a horribly difficult thing to be better-than-average in security posture. I'm sure I was missing a lot of obvious stuff, in retrospect, but I think I had the basics covered.
My point was there was a distro circa 1997-2003 or so that had all of that pre-baked. No having to mess with SELinux (or disabling it!), iptables, php.ini, apache's httpd.conf, or any of that other than putting your project into /var/www/ and doing a chown -R www on it.
You actually don't need to. Just upload this little php script to a shared host for $1/mo and call it a day.
Early 2000s PHP was a DSL for very simple web apps. So it's no surprise it excels at that.
People soon found out that it was not very good at complex web apps, though.
These days, there's almost no demand for very simple web apps, partially because common use cases are covered by SaaS providers, and those with a need and the money for custom web apps have seen all the fancy stuff that's possible and want it.
So it's no surprise that today's languages and frameworks are more concerned with making complex web apps manageable, and don't optimize much (or at all) for the "very simple" case.
> These days, there's almost no demand for very simple web apps, partially because common use cases are covered by SaaS providers, and those with a need and the money for custom web apps have seen all the fancy stuff that's possible and want it.
I dunno about that.
In 2000, one needed a cluster of backends to handle, say, a webapp built for 5000 concurrent requests.
In 2025, a single monolith running on a single VM, using a single DB on another instance can vertically scale to handle 100k concurrent users. Put a load balancer in front of 10 instances of that monolith and use RO DB followers for RO queries, and you can easily handle 10x that load.
> So it's no surprise that today's languages and frameworks are more concerned with making complex web apps manageable, and don't optimize much (or at all) for the "very simple" case.
Maybe the goal is to make complex web apps manageable, but in practice what I see are even very simple web apps being made with those frameworks.
I disagree. I would say most of the migration from PHP was due to the appeal of one language for frontend and backend, and fashion/hype. PHP is still very usable for server-side rendering and APIs. You say "very simple" as if you can't have complex systems with PHP.
I see the current state of web development as a spiral of complexity with a lot of performance pitfalls. Over-engineering seems to be the default.
> I would say most of the migration from PHP was due to the appeal of one language for frontend and backend
Definitely not. PHP lost far more market share to Java, C#, and Ruby on Rails than to Node.js.
> PHP is still very usable for server-side rendering and APIs.
Not "is still", but "has become". It has changed a lot since the PHP 3 days.
> You say "very simple" as if you can't have complex systems with PHP.
With early 2000s PHP, you really couldn't, not without suffering constantly from the language's inadequacies.
> I see the current state of web development as a spiral of complexity with a lot of performance pitfalls. Over-engineering seems to be the default.
I don't disagree, but that seems to happen most of all in the frontend space.
> People soon found out that it was not very good at complex web apps, though.
They eventually made it fit for purpose with Laravel ;-)
Max wrote a simple PHP script. Mel wrote delay loops by accessing previous memory addresses on drums. They are not the same.
I didn't get the reference. For anyone else in my shoes: https://en.wikipedia.org/wiki/The_Story_of_Mel
I remembered this story the other day but couldn’t remember the name - and then coincidentally came across this link to it! https://users.cs.utah.edu/~elb/folklore/mel.html
I think that's the point. Max did things in the stupidest way that could possibly work, and it did work, and was simpler than the "smart" way, so was he less of a "real" programmer than Mel?
I think for a kid, Max's code was great, but ultimately you do need to learn to think about things like error handling, especially if your code is intended to go into "production" (i.e., someone besides yourself will use/host it).
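The kind of thing I mean, as a rough sketch (my own hypothetical code, not Max's): even a one-file upload handler should at least check that the move succeeded.

    <?php
    // Hypothetical upload handler: same spirit as the original, plus basic error handling.
    if (!isset($_FILES['image']) || $_FILES['image']['error'] !== UPLOAD_ERR_OK) {
        http_response_code(400);
        die('Upload failed.');
    }
    $dest = 'uploads/' . basename($_FILES['image']['name']);
    if (!move_uploaded_file($_FILES['image']['tmp_name'], $dest)) {
        http_response_code(500);
        die('Could not save the file.');
    }
    echo 'Saved as ' . htmlspecialchars($dest);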
I'd argue that early computers are simple enough that one can actually put everything into one's head. Nowadays it is impossible.
No, they are not, which makes the case for breaking up applications whenever possible. Some think that means microservices, but that's not my point.
The image-sharing example is pretty good, because it only needs to share images. In, shall we say, more commercial settings, it would grow to handle metadata, scaling, comments, video sharing, account management, and everything in between. When that happens, Max's approach breaks down.
If you keep your systems as "image sharing", "comments", and "blog" and just string them together via convention or simply hard-coded links, you can keep the simple solutions. This comes at the cost of integration, but for many uses that's perfectly fine.
Edit: Oh, that Mel.
I think Max's brain was not polluted with terror, and he showed trust in his tools.
Today many devs (and not programmers) are always suspicious, terrified of the potential of something going wrong because someone will point a finger, even if the error is harmless or improbable.
My experience is that many modern devs are incapable of assigning significance or probabilities; they are usually not creative, fearful of "not using best practices", and do not take into consideration the anthropic aspect of software.
My 2 cents.
For years, every external pentest of every perimeter at companies with old-school stuff like this has been finding these things and exploiting them, and there are usually several webshells and other weird stuff already on the server by the time they get to it. Very often the company forgot, or didn't know they had the thing.
The end state of running 15-year-old unmaintained PHP is that you accumulate webshells on your server or it gets wiped. Or you just lose it or forget about it, or the server stops running, because the same dev practices that got you the PHP mean you probably don't bother with things like backups, config management, version control, IaC, etc. (I don't mean the author, who probably does care about those things, I just mean in general).
If these things are not a big deal (often it is not! and it's fun!) then absolutely go for it. In a non-work context I have no issues.
TBH I'm not 100% sure that either the PHP version _or_ the Go version of that code is free from RCE-style problems. I think it depends on server config (modern PHP defaults are probably fine), binary versions (an old exiftool would bone you, for example), OS (Windows path handling can be surprising), and internal details of how the commands handle flags and paths (rough sketch of what I mean below). But as you point out, it probably doesn't matter.
Am I just doing the meme? :)
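For what it's worth, the sketch I mean, in rough PHP (hypothetical; I'm assuming the script shells out to exiftool, and $filename here is just a stand-in for user input):

    <?php
    // Hypothetical hardening of a shell-out: escapeshellarg() quotes the path for
    // the shell, and prefixing the uploads directory means the argument can never
    // start with "-" and get read as a flag by the command.
    $path = 'uploads/' . basename($filename);
    exec('exiftool ' . escapeshellarg($path), $output, $status);
    if ($status !== 0) {
        error_log('exiftool failed with status ' . $status);
    }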
OTOH, even a small programming mistake can be exceptionally expensive. Remember CrowdStrike: a NULL dereference that cost hundreds of millions of dollars.
An unsafe string can be abused as an attack vector against your system.
> For years Imagebin was wide open to the public and anybody could upload their own images to it. Almost nobody did.
There's your explanation for why it could be so simple.