I actually gasped at seeing the enhanced image – I had no idea the technology had advanced so far.
Over the years I’ve dabbled with applying colorization and other image processing effects to old pictures. Anyone interested in history or genealogy has likely done the same; nearly all photo editing apps have at least a few enhancement tools built in, and some of the more powerful online versions use AI to guess at the contents of the image in order to automatically add color to black and white photos.
Results were rarely satisfying. Colorized images looked washed-out and the color choices could be laughably wrong. And because the underlying software was developed using modern photographs taken in color, processed black and white images might even lose quality – old portraits often have a shallow depth of field, for example, and apps may “fix” that by sharpening up the background.
Then on a whim, I recently took a photo from a 1920 Press Democrat and uploaded it to an AI website. I didn’t expect much improvement; my experience was that the software would probably despeckle the picture but not materially improve it. Still, I wanted the best possible image since the woman was key to the article I was researching.
To repeat myself: I actually gasped.
There was so much signal noise in the PD original it was difficult to read the woman’s expression; was she glaring angrily at the photographer? Did she look tired, or even sick? But with one click of the mouse, out of that hazy static emerged a clear and sharply-focused image of a woman’s face – with eyelashes, even! (This is the same woman shown in the graphic above the title.)
I took that portrait and ran it through a different AI-based program for colorization. The result was acceptable, but now she had the pale, sepia skin tone often seen in colorized photos.
Curious if colors in the photo could be improved further, I opened it in a graphics editor. Using just two menu options, I was able to give her a suitably realistic complexion. 1 Even though the portrait was still far from perfect, you could almost imagine it was a selfie made today – as long as you don’t squint too closely.
WHAT WAS USED FOR TESTING
To test colorization, my criterion was that the apps had to be available to everyone. That meant they had to be on the internet, free to use, and work automatically, without requiring technical knowledge or experience.
Except as noted, all images in this article were generated using Hotpot to first sharpen and improve the black and white photo, followed by applying the “Fix Face” enhancer if the image was a portrait. Colorization was done using Palette, which offers a variety of filters that can be applied to the image. Both Hotpot and Palette require payment for processing larger images.
The entire process took less than ten minutes, and most of that was spent experimenting with menu options in the editor. All of the tools were free to use for images of this size. Anyone can do this.
Automatic colorization is really quite a breakthrough, particularly when you consider it combines two remarkable achievements that were both considered futuristic stuff just a few years ago.
The first step is object recognition – how the computer recognizes, without any prompting from the user, that the image is the face of a woman, as opposed to the picture of a truck, a cat, or something else. It wasn’t until 2001 that anyone demonstrated it could be done in real time.
Gentle Reader should recall from the previous article how AI chatbots were “trained” using text scraped from the internet. Image processing AI programs are also trained on datasets, such as Stanford’s ImageNet, which now contains over fourteen million images organized by descriptions.2 That enabled a giant leap forward in correctly identifying various parts of an image, as well as in building a library of color information about the objects.
Although all colorizing AI apps start by tapping into the same basic training about what’s in the image, what they do with that data can produce widely different results, though all default to very conservative color choices. The differences lie in how much – and how well – the colors can be tweaked to produce a result pleasing to your eye.
(RIGHT: Same location today)
In the first colorized image below the Img2Go app recognized there was lots of foliage, which the AI decided is always the exact same shade of green, no matter what kind of tree. The street and sidewalk are likewise greenish. (Another app lightly tinted both the sky and street blue.) As a final gripe, Img2Go super-sharpened the entire image without asking.
Colors were far more realistic in the lower image, particularly the palm trees. The sidewalk and (unpaved) street look right and although the sky is light overcast, it’s not so dark as to make the shadows look out of place. But despite those improvements, it’s still easy to see this photo has been colorized.
The sad truth is that no matter what is done, these images will likely always look colorized. Future advances in AI will most likely make better color choices, but there are limits because of issues with the quality of the original photograph itself.
Most black and white photos taken before circa 1930 used film that pushed contrast higher than a human eye would see; it was also not very sensitive to red light and oversensitive to blue (this article explains more). As a result, what was blue in real life appears lighter in a vintage black and white photo while red areas look much darker, sometimes appearing almost black. So when looking at a picture taken during the “Golden Age of Photography,” keep in mind it’s actually quite distorted and information has been irretrievably lost. Ideally some of this info could be provided in object descriptors, such as “Photographed 1921 using Kodak Verichrome (orthochromatic).” Ha, ha, no.
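The tonal distortion is easy to simulate. The sketch below collapses RGB to grayscale twice: once with modern luminance-style weights and once with made-up “orthochromatic” weights that ignore red and overweight blue. The exact numbers are illustrative only – real spectral response curves varied by film stock – but the ordering effect is the point: a red object renders near-black while a blue one renders light.

```python
import numpy as np

# Channel weights are illustrative assumptions, not measured film data.
PANCHROMATIC = np.array([0.299, 0.587, 0.114])   # modern-style luminance weights
ORTHOCHROMATIC = np.array([0.0, 0.42, 0.58])     # blind to red, oversensitive to blue

def to_gray(rgb, weights):
    """Collapse an RGB image (H, W, 3; values 0-1) to a single gray channel."""
    return rgb @ weights

# One pure-red pixel and one pure-blue pixel.
red = np.zeros((1, 1, 3)); red[..., 0] = 1.0
blue = np.zeros((1, 1, 3)); blue[..., 2] = 1.0

print(to_gray(red, ORTHOCHROMATIC))   # red renders as 0.0 -- almost black on film
print(to_gray(blue, ORTHOCHROMATIC))  # blue renders as 0.58 -- unexpectedly light
print(to_gray(red, PANCHROMATIC))     # same red is 0.299 with modern weights
```

No colorizer can recover which gray came from which hue; that information was lost at exposure time.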
Making matters worse, the current crop of AI colorization apps train using modern photos. Pictures are desaturated and then the image generator tries to predict the correct colors, testing accuracy by comparing them to what’s found in the original (here’s an article with examples). Needless to say, a modern photo with its colors removed still looks like a modern photo because it keeps all the same tones. As I understand it, this is a reason why colorized vintage images often look brownish and washed-out – the color model is simply wrong. It’s like trying to translate a novel written in Spanish by using an Italian dictionary; the surprise is the trick can work as well as it does.
Yet it seems to me the people building these apps don’t understand users want to colorize OLD photos, and not, you know, re-colorize a color picture they just took with their iPhone. All of the colorizing apps I’ve tested using their automatic option performed no better than fair on vintage images. All, that is, except for one: Palette.
Palette is far from perfect and, like all the other apps, it has bugs – notably a tendency to not color within the lines. It offers twenty preset filters that can be applied to an image; some are “meh” like its competitors and some may be flawed – but often there are two or three that are gasp-worthy.
Developer Emil Wallner seems to have an understanding of vintage film – both color and black and white – that others lack. The results often have warmer tones and bolder colors, both of which can look more historically accurate. Palette also looks at small gradations in the black and white image to offer dramatic color choices.
Below is an auto decorated for the 1910 Rose Carnival (more details). Other colorizers painted the floral decoration completely green – as did the default Palette filter. But in the original image the flowers over the hood and fender are slightly lighter. Because the old film was more sensitive to blue, Palette painted them that color. This surely must be a bug, right? Roses ain’t blue! Yet it turns out Palette’s color choice isn’t bad at all, because those aren’t roses, or even actual flowers. They’re fakes made out of colored paper. We know that because of an item in the Press Democrat from two years earlier, where the Floats Committee declared parade entries could no longer combine artificial floribunda with the real thing. Since the headlights and other parts of the auto are clearly paper, the rest of the decorations must be as well.
The next image shows a woman cuddling her kitty on the front steps – a poignant snapshot that has long been a personal favorite. I’m guessing the year to be c. 1915 based on the size of the yucca, which was probably planted after the house was built in 1910 (the home is still there).
This is a deceptively difficult image to get right. All of the AI colorizers (including Palette’s “base” version) barely apply any tint at all, rendering it in such pale browns I sometimes struggled to see any color. But again, Palette found small measurable differences in the gray levels that suggest the front of the house had five shades. As with the “blue roses,” we can’t know if Palette’s colors are what actually appeared, but they are historically appropriate. This image also demonstrates Palette’s bug of not always identifying object borders correctly – look at the color shift on the window frames.
Some images defy colorization, at least using any of the AI-based apps currently available. This rare 1890 photo of Santa Rosa House, the famous old hotel at the corner of First and Main, is only partially colorized, with many people left in black and white. Palette performed marginally better than others, but this is a common problem with all AI colorizers when there’s not enough data to recognize an object. To be usable this photograph would need to be rescanned at a much higher resolution and have additional preprocessing enhancements. The horses look good, though.
The final image is an 1885 portrait of Julio Carrillo. Palette did its usual fine job with color but, like the portrait of the young woman discussed at the top, the real improvement is Hotpot’s “Fix Face” option, which adds dimension to his skin and beard.
But the garbled image of the woman was significantly improved while the overall quality of Julio’s portrait was not. In the original his eyes look more thoughtful; Hotpot very slightly lowered his eyelids, which is only apparent when you do a blink test.
So here’s the $64,000 Question: Should we be colorizing at all?
The “pro” side falls back to the same few arguments: People today expect color; photographers were hand-coloring back in the 19th century and no one objected; colorization can bring out hidden details. And let’s face it, being able to colorize a picture with the click of a mouse or tap on a screen is just pretty damn cool.
The “con” side begins by pointing out the apps are still doing a lot of guesswork. Yes, Palette has a remarkable ability to mimic old film and apply historically plausible colors, but it has no way of knowing what some of the original colors were – on the auto flowers, yellow would have been as good a choice as blue. Why is the woman’s hair a different color than her eyebrows? And although her forehead did not appear strikingly darker in the poor quality newspaper original, Hotpot turned it into a change of skin tone (or lighting?) which was further exaggerated by Palette. It feels like the same issues I raised with AI chatbots, which too often answer basic history questions partially or completely wrong. The technology may work satisfactorily 90% of the time, but when – and if – we will clear the hurdle of that last ten percent always seems to be a few years away.
And then there’s the briar patch of ethical issues, arising both from the tech and from those who use it. There’s been much discussion in articles and online forums about skin color automatically being lightened, and whether this is because the majority of faces in those mammoth image datasets are Caucasian. There are also examples of AI colorizers darkening the skin of Black people so much that facial features are nearly lost.
Colorization always has been somewhat controversial (here’s an article mostly about pre-AI objections) but there was an uproar in 2021 after an artist altered photos of Cambodian genocide victims, sometimes adding smiles to their mugshots as well as colorizing them. In the wake of that incident a “Colorizer’s Code of Conduct” was written, which I encourage everyone who’s planning to try AI colorization to read. It’s short and hits all the important points.
Even if you’re simply colorizing your grandparents’ wedding portrait to post on Facebook, it’s still important to acknowledge the image has been significantly altered and an original black and white version exists – that thread of history must be unbroken. And who knows? Perhaps someday a member of your family will want to re-colorize it when we have AI apps that work more reliably.
1 The final image enhancement was done using GIMP, a free Photoshop-like app that will run on all desktop computers. Three options from the “Colors” menu were used:
Auto > Color Enhance (maximizes saturation)
Hue-Saturation: reduce saturation (-40)
Brightness-Contrast: adjust brightness (+75) and contrast (-10)
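For readers who prefer scripting, roughly equivalent adjustments can be done in Python. This sketch approximates the three GIMP operations with simple numpy math on a 0–1 RGB array; the mapping of GIMP’s slider values to the -1..+1 amounts here is my own rough guess, not GIMP’s actual internals.

```python
import numpy as np

def adjust_saturation(rgb, amount):
    """Blend each pixel away from (or toward) its gray value.
    amount: -1.0 (grayscale) .. 0 (unchanged) .. +1.0 (more saturated)."""
    gray = rgb.mean(axis=-1, keepdims=True)
    return np.clip(gray + (rgb - gray) * (1.0 + amount), 0.0, 1.0)

def adjust_brightness_contrast(rgb, brightness, contrast):
    """brightness and contrast in -1.0..+1.0; contrast pivots around mid-gray."""
    out = (rgb - 0.5) * (1.0 + contrast) + 0.5 + brightness
    return np.clip(out, 0.0, 1.0)

# Rough analogue of the footnote's recipe: maximize saturation, pull it
# back (-40 -> -0.4), then brightness +75 -> +0.75 and contrast -10 -> -0.1.
img = np.random.default_rng(1).random((8, 8, 3))
img = adjust_saturation(img, 1.0)             # stand-in for Auto > Color Enhance
img = adjust_saturation(img, -0.4)            # Hue-Saturation (-40)
img = adjust_brightness_contrast(img, 0.75, -0.1)
print(img.min() >= 0.0 and img.max() <= 1.0)  # True -- output stays in range
```

The point is only that the whole footnote recipe is three lines of arithmetic; any image editor or library will do.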
2 ImageNet initially received a huge boost in the number of images and descriptors in its dataset by incorporating data from Flickr, one of the earliest photo social media sites to allow users to add tags describing a picture. Descriptions are also crowdsourced, which has created problems with troublemakers adding malicious tags – including racist slurs and disinformation – to the descriptors.