It's an interesting take on an IR. Its goal is to support the current C/C++/JS backends but also to make generating native assembly easy.
It also doesn't rely on lexical scopes to do analysis for things like lifetimes, nil tracking, destructors, etc. Instead it uses versioned variables, AFAICT, to enable those features more directly. That should make the compiler implementation much simpler for 99% of cases versus traditional SSA blocks.
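To make the "versioned variables" idea concrete, here is a minimal sketch of SSA-style renaming on straight-line code. This illustrates the general technique only, not Nimony's actual IR; the `version` function and its statement format are made up for the example. The point is that once every assignment mints a unique name, questions like "is this value used again?" become simple scans rather than scoped dataflow analyses.

```python
# Hedged sketch: SSA-style variable versioning on straight-line code.
# Each statement is (target, [operand names]); not Nimony's real format.

def version(stmts):
    """Rename each assignment target to a fresh version: x -> x.0, x.1, ..."""
    counter = {}  # variable name -> latest version number
    out = []
    for target, operands in stmts:
        # Uses refer to the *current* version of each operand.
        used = [f"{v}.{counter[v]}" for v in operands]
        # Each assignment mints a new version of the target.
        counter[target] = counter.get(target, -1) + 1
        out.append((f"{target}.{counter[target]}", used))
    return out

# x = 1; y = x; x = x + y  (operands listed by name)
prog = [("x", []), ("y", ["x"]), ("x", ["x", "y"])]
print(version(prog))
# -> [('x.0', []), ('y.0', ['x.0']), ('x.1', ['x.0', 'y.0'])]
```

With unique definitions like `x.0` and `x.1`, the last use of each value is directly visible, which is what makes lifetime and destructor analysis cheaper.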
Unfortunately I'm busy writing Nim code and not able to play with the new Nimony compiler framework. I'm excited about incremental compilation and borrow checking features though.
Nim seems to be almost a pet project of a single individual. Is that just my interpretation or is it an actual representation of reality?
If that's correct and it's mostly the work of one person - how? Are they a genius? Is creating your own programming language from scratch something anyone can accomplish if they just go for it?
Or is it just something that shouldn't be trusted/used for commercial purposes because it's not as "legit" as a newer language like Rust, for example?
It's just a weird vibe - it seems like it should be so much more popular than it is.
I am really only familiar with Python, in which I’m pretty sure that the .py becomes .pyc and then CPython translates .pyc into machine instructions.
How does this differ? Is an IR the same idea as Python’s .pyc?
> and then CPython translates .pyc into machine instructions.
What do you mean? CPython is a bytecode compiler and a virtual machine interpreting that bytecode. Or are you talking about the new experimental JIT?