Quick link: Latest coding blogs
These animations are created by small programs I've written, called "shaders", which run on your GPU - the graphics processor in your computer.
If you're using ancient hardware or a low-spec phone, the shaders may appear to run but will, unfortunately, display only a black background. If you're using an ancient browser, on the other hand, you may instead see a message that WebGL is not supported. In the former case, upgrade your hardware; in the latter, upgrade your browser!
Hover over one and you'll see a play/pause button, a fullscreen option and a rewind facility. In the bottom one (Yin-Yang), click in the right half for more symmetry, the left for less.
Instead of your CPU computing the colours of all the pixels to be displayed for every screen image (ideally about sixty times a second), it sends little programs called shaders to the GPU, which the GPU then runs in parallel, thousands of times over, on very specialised processors to generate every pixel value for the viewport.
By manipulating mathematical functions according to the position of the pixel on the screen and the time since the animation started, it's possible to generate all sorts of interesting effects.
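To give a rough flavour of this, here's a toy version in Python rather than a real shading language: a single function that computes one pixel's colour from its position and the elapsed time. All the names here are my own invention for illustration, not part of any shader API - on a real GPU the equivalent function would be written in GLSL and invoked once per pixel, all in parallel.

```python
import math

def shade(x, y, t, width=8, height=8):
    """Toy 'fragment shader': compute one pixel's colour (r, g, b)
    from its position and the time since the animation started."""
    # Normalise pixel coordinates to the 0..1 range.
    u, v = x / width, y / height
    # Mix sine waves of position and time; each channel ends up in 0..1.
    r = 0.5 + 0.5 * math.sin(6.28 * (u + t))
    g = 0.5 + 0.5 * math.sin(6.28 * (v + t * 0.7))
    b = 0.5 + 0.5 * math.sin(6.28 * (u + v))
    return (r, g, b)

# On a CPU we'd loop over every pixel like this, sixty times a second;
# a GPU instead runs shade() for all pixels simultaneously.
frame = [[shade(x, y, t=0.0) for x in range(8)] for y in range(8)]
```

Varying which functions you feed the position and time into - and how you combine them - is where all the interesting effects come from.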
I have only recently discovered this form of programming, and it gives me great joy to have found a new way of making art from mathematics.
But shaders are not just about art, or even just about rendering data to the screen. The ability of graphics processors to do thousands of calculations in parallel gives them applications in Bitcoin mining and artificial intelligence, for example. In fact, some of the newest GPUs have special hardware instructions that accelerate the matrix arithmetic at the heart of neural networks, making it far more feasible to train such networks on very large datasets.
The use of GPUs and massive parallelisation of computation for all sorts of other applications is a hot area of research currently, and we can expect interesting developments over the next few years. Maybe people will soon forget why they're called GPUs, as the original specialisation of the word disappears and they become really useful general-purpose workhorses utilised for a wide range of tasks in our digital lives.