Algorithm successfully simulates color perception for #theDress image

Which piece resembles your color perception of #theDress image?
Simulation of individual differences in the appearance of #theDress image, including the two extremes: white/gold (top left: A-1) and blue/black (bottom right: G-7). The original image corresponds to the center (D-4). Credit: Ichiro Kuriki

Ichiro Kuriki of Tohoku University has proposed a novel algorithm for simulating the color appearance of objects under chromatic illuminants. The figure shows the result of applying this algorithm to #theDress image.

"#theDress image" refers to a photo that went viral on the internet in February 2015, when viewers disagreed over the colors seen in the dress. The discussion revealed differences in human color perception and prompted studies in vision science.

How do we perceive color? The details are not yet fully understood, even for colors people experience every day, and the question remains one of the fundamental problems of vision science. Objects reflect the light of an illuminant from their surfaces, so the light reaching the retina changes whenever the illuminant changes. Yet people scarcely notice shifts in object colors: although slight shifts remain, the human visual system largely compensates for changes in the illuminant.
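
To illustrate the kind of compensation involved, here is a minimal sketch of a textbook von Kries-style correction. This is a generic illustration of discounting an illuminant, not Kuriki's algorithm, and the RGB values and illuminant color are made up for the example.

```python
import numpy as np

def von_kries_adapt(rgb, illuminant_rgb):
    """Crude illustration of illuminant compensation (von Kries-style scaling).

    rgb: (..., 3) array of linear RGB values observed under the illuminant.
    illuminant_rgb: (3,) linear RGB of the illuminant (e.g., its color on a white patch).
    Returns values rescaled so that the illuminant itself maps to neutral gray.
    This is a textbook approximation, not the algorithm described in the article.
    """
    rgb = np.asarray(rgb, dtype=float)
    illum = np.asarray(illuminant_rgb, dtype=float)
    return np.clip(rgb / illum, 0.0, None)

# A surface that reflects equally in all channels, lit by a bluish illuminant:
bluish_light = np.array([0.7, 0.8, 1.0])
observed = np.array([0.35, 0.40, 0.50])   # surface reflectance 0.5 under that light
print(von_kries_adapt(observed, bluish_light))  # ~[0.5, 0.5, 0.5] -> seen as gray
```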

Several groups have proposed algorithms to simulate these shifts in color appearance, but problems remain, including the issue of achromatic points. Achromatic points are the colors that appear colorless, from white through gray to black, under a given illuminant, and they serve as the reference for judging hue and vividness. An achromatic point is therefore a keystone of any color-appearance simulation, yet previous models from other groups required complicated formulas to estimate it.
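
As an illustration of why the achromatic point matters, the toy function below evaluates the hue and vividness (chroma) of a color relative to a given achromatic point in an opponent-color plane such as CIELAB (a*, b*). The coordinates used are hypothetical and the calculation is a generic one, not taken from the paper.

```python
import numpy as np

def hue_and_chroma(ab, achromatic_point):
    """Evaluate hue angle and chroma of a color relative to an achromatic point.

    ab: (a*, b*)-like opponent coordinates of the color.
    achromatic_point: coordinates that appear neutral under the current illuminant.
    Hue and vividness are judged from the offset to that neutral point,
    which is why the achromatic point anchors the whole simulation.
    """
    da, db = np.subtract(ab, achromatic_point)
    chroma = float(np.hypot(da, db))
    hue_deg = float(np.degrees(np.arctan2(db, da)) % 360.0)
    return hue_deg, chroma

# Under a bluish illuminant the achromatic point itself drifts toward blue,
# so the same measured color looks less blue (smaller chroma) to the observer.
print(hue_and_chroma((5.0, -20.0), (0.0, 0.0)))     # relative to a daylight neutral
print(hue_and_chroma((5.0, -20.0), (0.0, -15.0)))   # relative to a bluish neutral
```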

Kuriki previously discovered a simple method for approximating these achromatic points under a chromatic illuminant. By combining it with a lightness adjustment, he proposed a simple algorithm that simulates color appearance under a colored illuminant. He applied the algorithm to #theDress image, which is widely known for the large individual differences in its color appearance.
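
The published equations are given in the i-Perception paper. Purely as a sketch of the general idea, the toy function below shifts colors toward the assumed illuminant by an amount that grows with lightness and then rescales lightness; the particular weighting and parameters are placeholders, not Kuriki's formulas.

```python
import numpy as np

def simulate_appearance(lab, illum_ab, lightness_gain=1.0, max_shift=0.6):
    """Toy simulation of color appearance under a chromatic illuminant.

    lab: (..., 3) array of CIELAB values of the original image.
    illum_ab: (a*, b*) chromaticity offset of the assumed illuminant.
    The shift toward the illuminant is assumed to grow with lightness
    (brighter colors shift more), and L* is rescaled to mimic a dimmer or
    brighter assumed illuminant. These formulas are placeholders only.
    """
    lab = np.asarray(lab, dtype=float).copy()
    L = lab[..., 0]
    weight = max_shift * np.clip(L / 100.0, 0.0, 1.0)      # lightness-dependent shift
    lab[..., 1] += weight * illum_ab[0]
    lab[..., 2] += weight * illum_ab[1]
    lab[..., 0] = np.clip(L * lightness_gain, 0.0, 100.0)  # simulate dimmer/brighter light
    return lab

# Example: push the image toward a warm (yellowish) illuminant and dim it.
patch = np.array([[70.0, 0.0, 0.0], [30.0, 0.0, 0.0]])     # light and dark gray
print(simulate_appearance(patch, illum_ab=(5.0, 25.0), lightness_gain=0.8))
```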

Which piece resembles your color perception of #theDress image?
The color appearances matched by 15 observers were distributed widely across this chart, not restricted to the diagonal between A-1 and G-7 as had been assumed in previous studies of #theDress image. Credit: Ichiro Kuriki

Such variability is known to originate from differences in the color and intensity of the illuminant that viewers estimate to be falling on the dress. For example, a viewer who assumes a dim, bluish illuminant perceives the dress as white/gold. By systematically varying the color and intensity of the assumed illuminant, the algorithm successfully simulated the differences in the color appearance of #theDress under these various assumptions (see figure); one of the pieces may resemble what you perceive in #theDress image.
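
Building on the toy simulate_appearance() sketch above, a grid like the one in the figure could in principle be produced by sweeping the assumed illuminant's chromaticity and intensity. The ranges and the mapping to grid cells below are illustrative assumptions, not the values used in the study.

```python
import numpy as np
# Reuses the toy simulate_appearance() sketch above; the ranges below are made up.

blue_to_yellow = np.linspace(-30.0, 30.0, 7)   # assumed illuminant b* offset per column
dim_to_bright = np.linspace(0.6, 1.4, 7)       # assumed intensity gain per row

def appearance_grid(lab_image):
    """Return a 7x7 grid of simulated images, one per illuminant assumption,
    loosely echoing the A-1 ... G-7 layout of the figure."""
    return [[simulate_appearance(lab_image, illum_ab=(0.0, b), lightness_gain=g)
             for b in blue_to_yellow]
            for g in dim_to_bright]
```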

The method also preserves the wider color range in the darker parts of the image, even when lightness is adjusted to simulate a dimmer scene. This is advantageous for high-dynamic-range displays such as OLED screens.
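
One way to picture this property: if dimming acts on lightness while leaving the chromatic components untouched, dark pixels of different hue remain distinguishable. The snippet below is only a schematic illustration of that idea, not the method's actual processing.

```python
import numpy as np

def dim_lightness_only(lab, gain):
    """Illustrative dimming that scales only L*; a* and b* are untouched,
    so chromatic differences between dark pixels are not crushed."""
    lab = np.asarray(lab, dtype=float).copy()
    lab[..., 0] = np.clip(lab[..., 0] * gain, 0.0, 100.0)
    return lab

# Two dark pixels with different hues remain distinguishable after dimming,
# which matters for displays (e.g., OLED) that can actually render such shadows.
dark_pixels = np.array([[20.0, 15.0, -10.0], [20.0, -12.0, 8.0]])
print(dim_lightness_only(dark_pixels, 0.5))
```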


More information: Ichiro Kuriki, A Novel Method of Color Appearance Simulation Using Achromatic Point Locus With Lightness Dependence, i-Perception (2018). DOI: 10.1177/2041669518761731