Why do all the pictures you take underwater look blandly blue-green? The answer has to do with how light travels through water. Derya Akkaynak, an oceangoing engineer, has figured out a way to recover the colorful brilliance of the deep.
I really see this as the start of the Artificial Intelligence boom in marine science.
My name is Derya Akkaynak.
And I am an oceanographer and engineer.
I specialize in problems of imaging and vision underwater.
I am now in the Lembeh Strait in Indonesia to test a new algorithm that I developed called Sea-thru, which takes an underwater image and removes the water from it, so the scene looks just as it would if you had taken the photo on land.
Underwater images typically have an overwhelming color cast, green or blue, depending on where you took them.
Objects at far distances are obscured by a layer of what we call backscatter; think of it as a layer of haze.
So the farther you are from the objects in the scene, the more haze you get.
Because light gets absorbed and scattered as it travels through the water, colors fade away.
That's why underwater images look so dull and distorted all the time.
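The degradation being described is commonly modeled as two effects that both depend on distance: the direct signal from the object fades exponentially, while backscattered "veiling light" builds up toward a constant. A minimal sketch of that forward model, using a simplified single-coefficient form with illustrative coefficient values (the real coefficients are scene-dependent and not the ones used by Sea-thru):

```python
import numpy as np

# Illustrative per-channel coefficients (assumed values, not measured).
# Red attenuates fastest in seawater, which is why reds vanish first.
beta = np.array([0.40, 0.12, 0.08])   # attenuation (1/m) for R, G, B
B_inf = np.array([0.05, 0.30, 0.35])  # backscatter ("veiling light") at infinity

def observed_color(J, z):
    """Simplified image-formation model for a pixel at distance z (meters):
    the direct signal J fades with exp(-beta*z) while backscatter haze
    grows toward B_inf as transmission drops."""
    transmission = np.exp(-beta * z)
    return J * transmission + B_inf * (1.0 - transmission)

J = np.array([0.8, 0.4, 0.3])      # true (in-air) color of a reddish object
near = observed_color(J, 1.0)      # still recognizably red
far = observed_color(J, 15.0)      # dominated by blue-green haze
```

At 15 meters the red channel has all but vanished and the observed color is mostly haze, which matches the blue-green cast described above.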
I'm diving with a regular consumer camera, and I carry a color chart with me.
Every time I see a reef with large 3D structure, I place my color chart at the base of the reef, and then I swim away about 15 meters.
Then I start swimming toward the reef and the color chart, photographing them from slightly different angles until I reach the reef. Then I swim over the reef and photograph the top and all sides.
Once I have the distance information, all I do is take the raw photos back to my computer and apply the method, which uses a mathematical formula that goes through each pixel, calculates what the degradation should be, and removes it.
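With the camera-to-object distance known per pixel, the correction amounts to inverting the formation model: estimate the backscatter, subtract it, then divide out the attenuation. A hedged sketch of that inversion under a simplified single-coefficient model with illustrative coefficients (Sea-thru itself estimates its coefficients from the image and range map, which is not shown here):

```python
import numpy as np

# Illustrative (assumed) per-channel coefficients, as in the forward model.
beta = np.array([0.40, 0.12, 0.08])   # attenuation (1/m) for R, G, B
B_inf = np.array([0.05, 0.30, 0.35])  # backscatter at infinity

def restore_color(I, z):
    """Invert the simplified formation model for a pixel observed
    at known distance z (meters)."""
    transmission = np.exp(-beta * z)
    backscatter = B_inf * (1.0 - transmission)   # estimated haze layer
    direct = I - backscatter                     # remove the haze
    return np.clip(direct / transmission, 0.0, 1.0)  # undo attenuation

# Round trip: degrade a known color at 5 m, then restore it.
J_true = np.array([0.8, 0.4, 0.3])
t = np.exp(-beta * 5.0)
I_obs = J_true * t + B_inf * (1.0 - t)
J_rec = restore_color(I_obs, 5.0)
```

Because the same model is applied forward and backward with the same coefficients, the restored color matches the original exactly; in practice the hard part, and the contribution of Sea-thru, is recovering those coefficients from real images.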
TEXT: The technique could be of value to scientists trying to understand the impacts of climate change on coral and other marine systems.
Currently, in the field of underwater imaging and marine science, we are at a standstill.
So if you're a biologist, for example, studying corals on the seafloor, and you made a survey of a reef and would like to see what species or what composition of animals are found there, for the most part you have to do that manually, because the water takes away so much, or adds so many artifacts, that you can't see the true colors of the species you're looking for.
This method is not photoshopping an image.
It's not enhancing or pumping up the colors in an image.
It's a physically accurate correction, rather than a visually pleasing modification.
I imagine that in addition to scientists, recreational divers and underwater photographers would also be very interested in using this method, because finally they can remove all the degrading components from their images and see the vivid colors of a scene just as they would if that scene were on land.
Reporter: Erik Olsen
Additional Editing: Jeffery DelViscio
Animation: Jeffery DelViscio