Two new hacks this time, Marbling and Binary Horizon, plus a few minor updates.
Several of the old hacks have been re-enabled on Android, because it turns out that while they didn't work in the emulator, they do work on real Android hardware (of which I have none).
This release was built on macOS 11.6 instead of 10.14, so I think that means that it has native Apple M1 code in it -- though I doubt you'll notice any performance difference over the Rosetta 2 emulation.
Also Apple completely changed how code signing works again, because hey, six more months have passed, it's clearly time for a redesign of the most incomprehensible part of their entire ecosystem, right? Is it better? No. No it is not, it's just differently awful. Again. Anyway, let me know if there are signing issues.
I also had to update Sparkle (the "Check for Updates" library), so hopefully auto-updates still work. Let me know.
Marbling, the new one written by me, started out fairly simple, but then it took the optimization train to Crazytown. Here's the comment from the top of the source:
This generates a random field with Perlin Noise (Perlin's page, SIGGRAPH 2002 paper, Wikipedia entry), then permutes it with Fractal Brownian Motion (Wikipedia page, Book of Shaders, Shader Toy) to create images that somewhat resemble clouds, or the striations in marble, depending on the parameters selected and the colors chosen.
These algorithms lend themselves well to SIMD supercomputers, which is to say GPUs. Ideally, this program would be written in a shader language, but XScreenSaver still targets OpenGL systems that don't support GLSL, so we are doing the crazy thing here of trying to run this highly parallelizable algorithm on the CPU instead of the GPU. This sort-of works out because modern CPUs have a fair amount of parallel-computation features on their side of the fence as well. (Generally speaking, your CPU is a Cray and your GPU is a Connection Machine, except that your phone does not typically require liquid nitrogen cooling and a dedicated power plant.)
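One cheap way to get at those CPU-side parallel features, without hand-writing intrinsics, is to shape the per-pixel loops so the compiler's auto-vectorizer (SSE/AVX on x86, NEON on ARM) can chew on them: contiguous arrays, no data-dependent branches, `restrict` pointers. A sketch, with a made-up function name and a trivial "colorize" stage standing in for the real work:

```c
#include <stddef.h>

/* Illustrative only: convert one row of noise values (in [0,1]) to
   8-bit pixels.  The loop body is branch-predictable straight-line
   arithmetic over contiguous, non-aliasing arrays, which is the shape
   auto-vectorizers handle well. */
void colorize_row (const float *restrict turbulence,
                   unsigned char *restrict out, size_t n)
{
  size_t i;
  for (i = 0; i < n; i++) {
    float t = turbulence[i];
    if (t < 0) t = 0; else if (t > 1) t = 1;  /* clamp; compiles to min/max */
    out[i] = (unsigned char) (t * 255.0f + 0.5f);
  }
}
```

Whether the compiler actually vectorizes this is worth verifying (e.g. `gcc -O3 -fopt-info-vec`), but loops of this shape generally do get the SIMD treatment.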
Update: Oh yeah, I forgot to mention. A while back someone requested an Apple TV version of XScreenSaver, so I took a crack at it. There's a tvOS target in the Xcode project, but when you launch it, it never instantiates SaverRunner. I imagine there's some xib or storyboard problem, but I couldn't figure it out so I gave up. If someone can get me past that, I'll take another look. BTW, it turns out that one of the hurdles of porting from iOS to tvOS is that tvOS doesn't have newfangled UI elements like checkboxes and sliders.
Oooo, the start of your binary horizon video reminds me of my nightmare where I was trapped in abandoned ground spider tunnels. Nice!
Oh wow! "Marbling" may actually be an acceptable substitute for the old "Fluid" screensaver for OSX that never made the transition to Intel. Thanks!
It reminded me a bit of Acidwarp. Amazingly it has a homepage about 100 years after I last ran it: http://www.noah.org/acidwarp/
Mention of GLSL makes me wonder about pulling from ShaderToy for potential screensavers. Though I mostly use ShaderToy because https://shadertoy.com/view/llK3Dy more or less makes GPUs weak at the knees. Well, OK, a GTX 1080 with 8GB of RAM (and presumably anything newer) could run that at 4K at 120Hz, but most other things, in laptops and such, are commendable if they push 30FPS. Regardless, it's a decent benchmark; so few lines of code for so much fan spin is impressive. Oh hey, look, someone already made a ShaderToy screensaver plugin for Kodi (what XBMC is called these days, if you aren't running the Plex fork, that is): https://github.com/xbmc/screensaver.shadertoy
p.s. I don't know about the Connection Machine's cooling, but I am pretty sure that Crays were more or less using Fluorinert™ since around 1982 (citation here: https://www.3m.com/3M/en_US/data-center-us/applications/immersion-cooling/fluorinert-electronic-liquids/ though it was not without its drawbacks, as mentioned here: https://en.wikipedia.org/wiki/Fluorinert#Toxicity). Liquid nitrogen was pretty rare even for late-20th-century computing; better alternatives existed (and no, they weren't typically water either, which is conductive and more or less always a terrible idea for cooling, despite what they market to gamers who "build rigs" and "overclock" consumer-grade gear). Even Fluorinert™ is relatively uncommon these days; I think 3M et al. came up with even newer alternatives (e.g. Novec™ Engineered Fluids).

Perhaps, in the future, if GaN (gallium nitride, as contrasted with the GaAs [gallium arsenide] used by some late-1980s-vintage Crays) gets more market foothold, we'll see more designs which can be passively cooled, or more correctly may not need cooling, because GaN supposedly can operate at hundreds of degrees Celsius. Maybe not so useful for thwarting global climate change, but it always seemed like bad engineering to waste more energy on cooling things rather than having hardware which didn't need cooling. I won't be holding my breath on such research personally, though the GaNext project, which is looking to fab a RISC-V CPU in GaN among other goals, has been interesting from my vantage. Despite such caveats, there are certainly some "smart" phones on the market (targeted towards gamers, e.g. some Asus ROG phones) which have fan attachments, which just seems like the opposite of what I would ever want to fit in my pocket, incumbent battery drain and all.
p.p.s. Most of NVidia's GPUs, at least, are a bunch of 32-bit ARM cores (their term "FP32" reflects this, wherein the F is for FALCON [Fast Logic Controller], P I think is for processor, and 32 is for, well, 32-bit). Since around 2016 they've been relatively public about their next-gen designs being planned as 64-bit and RISC-V based, but YouTube talks from Joe Xie are probably better sources for such things than me regurgitating such lectures in typed text. Do they owe that lineage to the Connection Machine? Maybe! Certainly there are other GPU paradigms in existence which may draw closer provenance. I, for one, mostly just miss when Symbolics had a GPU called a "Framethrower" and wonder if it is even possible to give such a product a more apropos name.