Just did a bit of a deep dive into dithering myself, for my project of building an e-paper laptop: https://peterme.net/building-an-epaper-laptop-dithering.html It compares error diffusion algorithms as well as Bayer, blue noise, and some more novel approaches. Just in case anyone wants to read a lot more about dithering!
Nice writeup. I've been looking at this for a print-on-demand project and found that physical ink bleed changes the constraints quite a bit compared to e-paper. In my experience error diffusion often gets muddy due to dot gain, whereas ordered dithering seems to handle the physical expansion of the ink better.
After implementing a number of dithering approaches, including blue noise and the three-line approach used in modern games, I’ve found that quasi-random sequences give the best results. Have you tried them out?
https://extremelearning.com.au/unreasonable-effectiveness-of...
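For anyone curious, the core of it is tiny. Here's a minimal sketch in Rust, assuming plain thresholding against the R2 sequence (the constants come from the plastic number; function names are just for illustration):

```rust
// R2-sequence dithering: the threshold at pixel (x, y) is
// frac(x*a1 + y*a2), with a1 and a2 derived from the plastic number.
fn r2_threshold(x: u32, y: u32) -> f64 {
    // Plastic number: the real root of p^3 = p + 1.
    const P: f64 = 1.32471795724474602596;
    const A1: f64 = 1.0 / P;
    const A2: f64 = 1.0 / (P * P);
    (x as f64 * A1 + y as f64 * A2).fract()
}

fn dither_pixel(gray: u8, x: u32, y: u32) -> bool {
    // true = white, false = black; gray is in 0..=255.
    (gray as f64 / 255.0) > r2_threshold(x, y)
}
```

Part of the appeal is that it's stateless and needs no precomputed texture, unlike blue noise.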
Ooh, I haven't actually! I'll need to implement and test this for sure. Looking at the results though it does remind me of a dither (https://pippin.gimp.org/a_dither/), which I guess makes sense since they are created in a broadly similar way.
What is the advantage over blue noise? I've had very good results with a 64x64 blue noise texture and it's pretty fast on a modern GPU. Are quasirandom sequences faster or better quality?
(There's no TAA in my use case, so there's no advantage for interleaved gradient noise there.)
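For context, my blue noise pass is basically just a threshold against the tiled texture. A rough CPU-side sketch in Rust rather than the actual shader, assuming a precomputed 64x64 tile:

```rust
/// Threshold each pixel against a tiled 64x64 blue noise texture.
/// `noise` is a precomputed tile with values in 0..=255 (e.g. from a
/// void-and-cluster generator); `gray` is the source pixel.
fn blue_noise_dither(gray: u8, x: usize, y: usize, noise: &[u8; 64 * 64]) -> bool {
    let t = noise[(y % 64) * 64 + (x % 64)];
    gray > t
}
```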
EDIT: Actually, I remember trying R2 sequences for dither. I didn't think it looked much better than interleaved gradient noise, but my bigger problem was figuring out how to add a temporal component. I tried generalizing it to 3 dimensions, but the result wasn't great. I also tried shifting it around, but I thought animated interleaved gradient noise still looked better. This was my shadertoy: https://www.shadertoy.com/view/33cXzM
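For reference, the interleaved gradient noise in that comparison is Jimenez's one-liner, ported here from the usual GLSL to Rust:

```rust
// Jimenez's interleaved gradient noise. Returns a value in [0, 1)
// for pixel coordinates (x, y).
fn ign(x: f32, y: f32) -> f32 {
    (52.9829189 * (0.06711056 * x + 0.00583715 * y).fract()).fract()
}
```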
I used ordered dithering in my ZX Spectrum raytracer (https://gabrielgambetta.com/zx-raytracer.html#fourth-iterati...). In this case it's applied to a color image, but since every 8x8-pixel block can only have one of two colors (one of these fun limitations of the Spectrum), it's effectively monochrome dithering.
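Not my actual Spectrum code, but the idea sketched in Rust: within each 8x8 attribute cell you've already committed to an ink and a paper colour, so each pixel reduces to a thresholded on/off decision (a generic 4x4 Bayer matrix here, for illustration):

```rust
// Standard 4x4 Bayer index matrix.
const BAYER4: [[u8; 4]; 4] = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
];

/// `level` in [0.0, 1.0]: how close the pixel is to the ink colour
/// versus the paper colour. Returns true for ink, false for paper.
fn ink_or_paper(level: f32, x: usize, y: usize) -> bool {
    let threshold = (BAYER4[y % 4][x % 4] as f32 + 0.5) / 16.0;
    level > threshold
}
```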
Spectrum Basic was my first programming language, so that gives me all sorts of nostalgia feels. Your work is awesome.
Normally I am not a fan of gimmicky page formats but this series really hits it out of the park with well-considered presentation.
I can't wait until the next installment on error diffusion. I still think Atkinson dithering looks great, so much so that I made a web component to dither images.
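The algorithm itself is pleasantly small. A minimal sketch in Rust (the idea, not my web component's actual code): Atkinson splits each pixel's quantization error into eighths and pushes it to six neighbours, deliberately diffusing only 3/4 of it, which is what gives it that bright, high-contrast look.

```rust
// In-place Atkinson dithering of a grayscale image stored as i16
// (to allow negative intermediate values while error accumulates).
fn atkinson(img: &mut [i16], w: usize, h: usize) {
    // Neighbour offsets relative to the current pixel; each gets 1/8
    // of the error, so 2/8 of it is intentionally dropped.
    const TAPS: [(isize, isize); 6] = [(1, 0), (2, 0), (-1, 1), (0, 1), (1, 1), (0, 2)];
    for y in 0..h {
        for x in 0..w {
            let old = img[y * w + x];
            let new = if old < 128 { 0 } else { 255 };
            img[y * w + x] = new;
            let err = (old - new) / 8;
            for (dx, dy) in TAPS {
                let (nx, ny) = (x as isize + dx, y as isize + dy);
                if nx >= 0 && (nx as usize) < w && ny < h as isize {
                    img[ny as usize * w + nx as usize] += err;
                }
            }
        }
    }
}
```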
I built a blue noise generator and dithering library in Rust and TypeScript. It generates blue noise textures and applies blue noise dithering to images. There’s a small web demo to try it out [1]. The code is open source [2][3].
[1] https://blue-noise.blode.co [2] https://github.com/mblode/blue-noise-rust [3] https://github.com/mblode/blue-noise-typescript
There is something very satisfying about viewing media at 100% of your screen's resolution. Every pixel is crisp and plays a role. It's a joy you don't get from watching videos or viewing scaled images.
This is really nice work, as are the other posts.
If the author stops by, I'd be interested to hear about the tech used.
Bookmarking this. Clear explanations of graphics algorithms are surprisingly rare.
Bayer dithering in particular is part of the signature look of Flipnote Studio animations, which you may recognize from animators like kekeflipnote (e.g. https://youtu.be/Ut-fJCc0zS4)
Bayer dithering was also employed heavily on the original PlayStation. The PS1's GPU was capable of Gouraud shading with 24-bit color precision, but the limited capacity (1 MB) and bandwidth of VRAM made it preferable to use 16-bit framebuffers and textures. In an attempt to make the resulting color bands less noticeable, Sony thus added the ability to dither pixels written to the framebuffer on-the-fly using a 4x4 Bayer matrix hardcoded in the GPU [1]. On a period-accurate CRT TV using a cheap composite video cable, the picture would get blurred enough to hide away the dithering artifacts; obviously an emulator or a modern LCD TV will quickly reveal them, resulting in a distinct grainy look that is often replicated in modern "PS1-style" indie games.
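For the curious, the matrix and the truncation step are documented in [1]; here's a sketch of the behaviour in Rust (an illustration, not Sony's implementation):

```rust
// Sketch of the PS1 GPU's dithering step: each 8-bit colour channel
// gets a signed offset from the hardcoded 4x4 matrix (values per
// psx-spx [1]) before being truncated to 5 bits for the framebuffer.
const DITHER: [[i16; 4]; 4] = [
    [-4,  0, -3,  1],
    [ 2, -2,  3, -1],
    [-3,  1, -4,  0],
    [ 3, -1,  2, -2],
];

fn dither_channel(value: u8, x: usize, y: usize) -> u8 {
    let offset = DITHER[y % 4][x % 4];
    let adjusted = (value as i16 + offset).clamp(0, 255);
    (adjusted >> 3) as u8 // 8-bit to 5-bit truncation
}
```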
Interestingly enough, despite the GPU being completely incapable of "true" 24-bit rendering, Sony decided to ship the PS1 with a 24-bit video DAC and the ability to display 24-bit framebuffers regardless. This ended up being used mainly for title screens and video playback, as the PS1's hardware MJPEG decoder retained support for 24-bit output.
[1]: https://psx-spx.consoledev.net/graphicsprocessingunitgpu/#24...
Thank you
In Chrome it says "Loading assets, please wait..." and hangs, but it works for me in Firefox.
I find these sites that try to feed you stuff at a bite-sized pace extremely disrespectful.
Half the posts here are people promoting their own projects without even mentioning the (really impressive) OP. Bit weird.
When you look at something like Pietà by Michelangelo or Lolita by Vladimir Nabokov, you realise that some humans are given abilities that far exceed your own and that you will never reach their level.
When this happens, you need to stop and appreciate the sheer genius of the creator.
This is one of those posts.
I don’t know about all that; I’m just saying I thought people were being a bit rude.
Is it self-promotion, or just "hey, cool, I care enough about this that I built something too"?
It's ok for people to get excited about shared passions
First post was great; this should be interesting!