Tuesday, July 31, 2018

Generating Colorful Random Backgrounds in the Browser

This is a short one because I was working on too many things at once and I wasn't able to get any of them done. So I figured I'd talk about a small thing I made a while back for some of my web pages. It's a random blurred color generator that creates a new random background every time it's rendered. I'll take you through my process: how I started with a prototype on the server side and slowly made my way to the final client-side product.

Software is all about constant iteration to improve a system. Let's start with our Minimum Viable Product, a command that can be run server side to generate the image we want. This command requires ImageMagick. For all of our benchmarking, we'll be generating a 1080p image. Our first command is as follows:
convert -size 1920x1080 plasma:fractal -blur 0x100 jpg:-
This creates a random plasma image, blurs it, and writes a JPEG to standard output, so you'll want to redirect it into a file. An example of its output is here:

It's pretty, right? The problem is that it takes 3.14 seconds on average to render. That's crazy long. It takes longer to render the image on my fast computer than it does to download the image in a browser. This isn't acceptable. Not only that, but the execution time grows quickly for larger images. We need to do better.

This is where the critical thinking comes in. The MVP works, but it isn't good enough. What can we do to make it better? Well, we don't have to blur the full-resolution image. The whole point of blurring is to lose detail, which is basically what shrinking an image and scaling it back up does anyway, right? That means we hardly need to blur at all, because most of the smoothing will be done by the resize algorithm; we only blur to get rid of the artifacting the resize leaves behind. We're left with this:
convert -size 1920x1080 plasma:fractal -filter Gaussian -resize 2.5% -define filter:sigma=2 -resize 4000% jpg:-

The same effect is achieved. On average we'll get sharper contrast between color transitions (it's best to think of this as a height-map, as we'll see later), but it's looking good. We've also gotten the time down to an average of about 1.1 seconds, which is a great improvement. But can we do better? I think we can.
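To see why the downscale can stand in for most of the blur, here's a rough sketch of the same idea on a grayscale buffer in plain JavaScript (the helper names are mine, not from the actual code): averaging blocks on the way down throws away the high-frequency detail, and the upscale keeps it gone.

```javascript
// Box-downsample a grayscale image by `factor`, averaging each block.
// This mirrors what `-resize 2.5%` does: high frequencies vanish.
function downsample(src, w, h, factor) {
  const dw = Math.floor(w / factor), dh = Math.floor(h / factor);
  const out = new Float64Array(dw * dh);
  for (let y = 0; y < dh; y++) {
    for (let x = 0; x < dw; x++) {
      let sum = 0;
      for (let dy = 0; dy < factor; dy++)
        for (let dx = 0; dx < factor; dx++)
          sum += src[(y * factor + dy) * w + (x * factor + dx)];
      out[y * dw + x] = sum / (factor * factor);
    }
  }
  return out;
}

// Nearest-neighbour upscale. A real pipeline uses a smooth filter
// (ImageMagick's Gaussian above) to hide the block edges this leaves.
function upsample(src, w, h, factor) {
  const out = new Float64Array(w * factor * h * factor);
  for (let y = 0; y < h * factor; y++)
    for (let x = 0; x < w * factor; x++)
      out[y * w * factor + x] =
        src[Math.floor(y / factor) * w + Math.floor(x / factor)];
  return out;
}
```

A 4x4 checkerboard of 0s and 255s downsampled by 2 becomes a flat 127.5 everywhere; upscaling it never brings the detail back, which is exactly the effect we wanted from the expensive blur.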

We don't really need to start with a full 1920x1080 canvas. We can start at 1/40th that size and scale it up, instead of starting large, shrinking it, and scaling it back up. This saves both memory and the time spent generating the plasma. We arrive at this, our third iteration:
convert -size 48x27 plasma:fractal -filter Gaussian -define filter:sigma=2 -resize 4000% jpg:-

This command executes in about 0.5 seconds, a vast improvement over the 3.14 seconds we started with. But what if we offloaded this task to the client side without hurting page performance too much?

That's exactly what I did. I looked at how ImageMagick generates its plasma images and came across the Diamond-Square algorithm, which is used to randomly generate height-maps. This was perfect: I could use three instances of the algorithm to generate a height-map for each color channel and blend them together. From there, I wrote the RGB data matrix straight into the image data array that I instructed Chrome to draw on a canvas. Chrome will GPU-accelerate anything it can, so I have that on my side.
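The idea looks roughly like this. This is a minimal sketch of Diamond-Square and the three-channel blend, not the actual adapted code from the post (all names here are mine):

```javascript
// Generate a (2^n + 1) x (2^n + 1) height-map with Diamond-Square.
// Heights land roughly in [0, 1]; jitter shrinks each pass by `roughness`.
function diamondSquare(n, roughness = 0.6) {
  const size = (1 << n) + 1;
  const map = new Float64Array(size * size);
  const get = (x, y) => map[y * size + x];
  const set = (x, y, v) => { map[y * size + x] = v; };
  // Seed the four corners with random heights.
  set(0, 0, Math.random());
  set(size - 1, 0, Math.random());
  set(0, size - 1, Math.random());
  set(size - 1, size - 1, Math.random());
  let scale = roughness;
  for (let step = size - 1; step > 1; step >>= 1) {
    const half = step >> 1;
    // Diamond step: each square's centre = mean of its corners + jitter.
    for (let y = half; y < size; y += step)
      for (let x = half; x < size; x += step)
        set(x, y, (get(x - half, y - half) + get(x + half, y - half) +
                   get(x - half, y + half) + get(x + half, y + half)) / 4 +
                  (Math.random() - 0.5) * scale);
    // Square step: each edge midpoint = mean of its neighbours + jitter.
    for (let y = 0; y < size; y += half)
      for (let x = (y / half) % 2 === 0 ? half : 0; x < size; x += step) {
        let sum = 0, cnt = 0;
        if (x - half >= 0)   { sum += get(x - half, y); cnt++; }
        if (x + half < size) { sum += get(x + half, y); cnt++; }
        if (y - half >= 0)   { sum += get(x, y - half); cnt++; }
        if (y + half < size) { sum += get(x, y + half); cnt++; }
        set(x, y, sum / cnt + (Math.random() - 0.5) * scale);
      }
    scale *= roughness;
  }
  return { map, size };
}

// Blend three independent height-maps into one RGBA pixel buffer,
// one map per color channel; Uint8ClampedArray clamps to [0, 255].
function plasmaRGBA(n) {
  const channels = [diamondSquare(n), diamondSquare(n), diamondSquare(n)];
  const size = channels[0].size;
  const data = new Uint8ClampedArray(size * size * 4);
  for (let i = 0; i < size * size; i++) {
    for (let c = 0; c < 3; c++)
      data[i * 4 + c] = channels[c].map[i] * 255;
    data[i * 4 + 3] = 255; // fully opaque
  }
  return { data, size };
}
```

The resulting `data` array has exactly the layout a canvas `ImageData` expects, which is why it can be written straight to the screen.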


This is the actual drawing on the canvas. It's close, but to make it look like the earlier versions, we'll need to stretch it out. Luckily, we can let Chrome's layout engine apply that transformation for us, and we end up with a crazy amount of optimization for free. From page load to the first draw of color, it usually takes about 350ms on my computer. That's pretty great, and most of that time isn't even my script running.
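The stretch trick boils down to keeping the canvas backing store tiny and sizing it with CSS. A sketch, assuming a hypothetical `makeRGBA(w, h)` helper that returns a `Uint8ClampedArray` like the one above:

```javascript
// Paint a small RGBA buffer onto a canvas, then let layout scale it up.
// `makeRGBA(w, h)` is a stand-in for whatever builds the pixel buffer.
function paintBackground(canvas, makeRGBA) {
  const w = canvas.width;   // keep the backing store small, e.g. 129x129
  const h = canvas.height;
  const ctx = canvas.getContext('2d');
  const img = ctx.createImageData(w, h);
  img.data.set(makeRGBA(w, h));
  ctx.putImageData(img, 0, 0);
  // Stretch via layout, not via pixels: the browser's compositor
  // (GPU-accelerated where it can be) does the upscaling for free.
  canvas.style.width = '100%';
  canvas.style.height = '100%';
}
```

Since the blur already destroyed all the fine detail, nobody can tell the compositor is interpolating a handful of real pixels across the whole viewport.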


The performance of this script is pretty great. I adapted a Diamond-Square implementation I found at https://github.com/cgiffard/DiamondSquare. You can try this code out here. It should work in everything except some IE versions (no surprise there). Run the Chrome profiler on it and see how it performs on your machine.

But this whole experience of iterating and vastly improving on an MVP is an important one. I always try to crank out a working prototype and then optimize later; this tactic helps me avoid premature optimization and keeps me focused on functionality first. Hopefully this silly little blog post will be useful to someone in some capacity. I hope to get back to bigger things next month, but it's amazing how time slips away from you. Just yesterday it was the beginning of the month, and now it's the end.
