Dartmouth Faculty Research and Scholarship Today
Dartmouth College

 
Hany Farid: Digital Forensics
Written by Susan Knapp

Hany Farid is an assistant professor of computer science with joint appointments at Dartmouth's Institute for Security Technology Studies and at the Center for Cognitive Neuroscience.

Once images have been digitized, it's easier to manipulate them. Looking into the underlying statistics can ferret out evidence of image tampering.

"Seeing is no longer believing. Actually, what you see is largely irrelevant," says Hany Farid, referring to a new kind of image tampering, now rampant in our digital age. While photos have been altered in the past, from airbrushing in fashion magazines and aliens in tabloid newspapers, to giant lizards in the movies, computers are making it much easier for more and more people to manipulate images.

Take for instance these digitized images. Marilyn Monroe died in 1962, before Farid was born, so he couldn't have spent time with her. But the photo makes it look as if the two are chums. We instinctively know that someone has manipulated these two images and put them together in a convincing portrait. But what if it were not so obvious? Suppose we had a photo of two competing CEOs talking over a document labeled "confidential - merger," or a photo of Saddam Hussein shaking hands with Osama bin Laden? How could we tell whether they were real?

Farid explains that "regular" photos are hard to change without special expertise in altering negatives or darkroom privileges that would allow someone to influence the printing process. However, once images have been digitized, they are far easier to manipulate.

A digital image is a collection of pixels or dots, and each pixel contains numbers that correspond to a color or brightness value. When marrying two images to make one convincing composite, you have to alter pixels. They have to be stretched, shaded, twisted, and otherwise changed. The end result is, more often than not, a realistic, believable image.
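As a rough illustration (not Farid's own code, and with random arrays standing in for real photographs), the sketch below treats an image as a grid of numbers and builds a naive composite by overwriting and blending some of them:

```python
# A minimal sketch, assuming hypothetical images: random arrays stand in for
# real photographs. A digital image is just a grid of numbers, and a naive
# composite is made by overwriting some of those numbers.
import numpy as np

# Two 8-bit grayscale "photos", 100 x 100 pixels; each entry is a brightness
# value from 0 (black) to 255 (white).
image_a = np.random.randint(0, 256, size=(100, 100), dtype=np.uint8)
image_b = np.random.randint(0, 256, size=(100, 100), dtype=np.uint8)

# Marry the two images: paste a 40 x 40 patch of image_b into image_a,
# then lightly blend the pasted region so the seam is less obvious to the eye.
composite = image_a.copy()
composite[30:70, 30:70] = image_b[30:70, 30:70]
patch = composite[29:71, 29:71].astype(float)
composite[29:71, 29:71] = ((patch + np.roll(patch, 1, axis=0)) / 2).astype(np.uint8)

# The altered pixels now hold values produced by arithmetic, not by a camera.
print(int((composite != image_a).sum()), "of", image_a.size, "pixels were rewritten")
```

Even a crude paste-and-blend like this leaves the spliced region carrying values that no single camera would have produced, which is exactly the kind of trace a statistical detector can look for.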

"It's not easy to look at an image these days and decide if it's real or not," says Farid. "Being a computational person, I can look at the underlying code of the image for clues."

Farid has become a detective of sorts looking for the evidence inevitably left behind after image tinkering. He's found that statistical clues lurk in all digital images, and the ones that have been tampered with contain altered statistics.

"Natural digital photographs aren't random," he says. "In the same way that placing a monkey in front of a typewriter is unlikely to produce a play by Shakespeare, a random set of pixels thrown on a page is unlikely to yield a natural image. It means that there are underlying statistics and regularities in naturally occurring images. So we build models of natural images to capture those statistics. Various types of digital tampering disturb these statistics, and we can distinguish the real from the manipulated."

Farid and graduate students Siwei Lyu and Alin Popescu have built a statistical model that captures the mathematical regularities inherent in natural images. Because these statistics fundamentally change when images are altered, the model can be used to detect digital tampering. This approach employs a variety of techniques, from digital image compression (image decompositions such as wavelets) to machine learning (pattern recognition techniques).
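The details of the group's model are beyond the scope of this article, but the hypothetical sketch below suggests the general shape of such a pipeline: summarize an image by statistics of its wavelet subbands, then hand those features to a pattern classifier. The wavelet, the particular statistics, and the support-vector classifier used here are illustrative assumptions, not the team's actual design.

```python
# A rough, hypothetical sketch of a wavelet-statistics-plus-classifier pipeline.
# Illustrative only; not the model built by Farid, Lyu, and Popescu.
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC

def wavelet_statistics(image, levels=3):
    """Summarize an image by simple statistics of its wavelet detail subbands."""
    coeffs = pywt.wavedec2(image.astype(float), "db4", level=levels)
    features = []
    for detail in coeffs[1:]:          # (horizontal, vertical, diagonal) per level
        for band in detail:
            band = band.ravel()
            features += [band.mean(), band.var(), skew(band), kurtosis(band)]
    return np.array(features)

# Hypothetical training data: random arrays stand in for real photographs,
# labeled 1 (natural) or 0 (tampered); a real system would train on actual images.
natural = [np.random.rand(128, 128) for _ in range(20)]
tampered = [np.random.rand(128, 128) for _ in range(20)]
X = np.array([wavelet_statistics(im) for im in natural + tampered])
y = np.array([1] * 20 + [0] * 20)

classifier = SVC(kernel="rbf").fit(X, y)
suspect = np.random.rand(128, 128)
print("looks natural" if classifier.predict([wavelet_statistics(suspect)])[0] else "possibly tampered")
```

The point of the design is the division of labor the article describes: the image decomposition exposes the regularities of natural images, and the pattern-recognition stage learns how tampering disturbs them.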

One particularly sneaky form of digital tampering is using a computer to hide messages, or other images, within images. The art of concealing and sending messages, called steganography, has been around as long as people have had secret information to relay. It's come a long way, however, since the days of invisible inks and encrypted Morse code delivered over secret radio frequencies. Computers and the Internet provide a new twist.

This potential criminal activity intrigued Farid, so he set his mathematical mind to work in solving the puzzle of how to scour digital images for traces of hidden or embedded messages.

As with images that have been spliced together, Farid discovered that the mathematical statistics underlying the image are again fundamentally changed with steganography. When embedding secret messages, you manipulate pixels slightly, and then the recipient, who has the key to the manipulations, can decode the message. Even if your eye is not capable of seeing the embedded message, the mathematical models know that something is awry.
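For a concrete, if simplified, picture of how a message can ride inside an image, the sketch below uses least-significant-bit embedding, a common textbook scheme (not one attributed to Farid's work): only the lowest bit of each pixel is touched, too small a change for the eye to see, but enough to disturb the image's bit-level statistics.

```python
# An illustrative sketch of least-significant-bit (LSB) steganography.
import numpy as np

def embed_message(image, message):
    """Hide the bits of an ASCII message in the lowest bit of each pixel."""
    bits = [(byte >> i) & 1 for byte in message.encode("ascii") for i in range(8)]
    flat = image.flatten()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits   # overwrite least-significant bits
    return flat.reshape(image.shape)

def extract_message(image, length):
    """Recover a message of known length; here the recipient's 'key' is just that length."""
    bits = image.flatten()[:length * 8] & 1
    return bytes(
        int("".join(str(b) for b in reversed(bits[i:i + 8])), 2)
        for i in range(0, len(bits), 8)
    ).decode("ascii")

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_message(cover.copy(), "meet at noon")
print(extract_message(stego, len("meet at noon")))           # -> "meet at noon"
print(np.abs(stego.astype(int) - cover.astype(int)).max())   # every pixel changed by at most 1
```

No pixel moves by more than one brightness step, so the picture looks untouched; the statistical fingerprint of those low-order bits, however, no longer matches that of a natural photograph.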

Farid and his graduate students have also used the model to detect other kinds of digital tampering, such as determining if digital audio has been altered, or distinguishing between natural photos and computer graphics.

Remember the photo of the competing CEOs? What if this image led to insider trading? If the case eventually reached a courtroom, the questions would remain: Was the photo real? How could you prove it?

"This technology to manipulate and change digital media is developing at an incredible rate," says Farid. "But our ability to contend with its ramifications is still in the Dark Ages. I'm always asked if this technology would stand up in a court of law." He explains that the simple answer is, "eventually." Farid predicts there will be skepticism and a great deal of scientific and legal debate. But eventually, he believes that some form of his technology or someone else's will be incorporated into our legal system.

Research into digital tampering dovetails with the focus of Dartmouth's Institute for Security Technology Studies. ISTS serves as a principal national center for counter-terrorism technology research, development, and assessment. Farid, whose work is funded in part by ISTS, is collaborating with experts there to ensure his research is understood by law enforcement officials and policy makers, from government representatives to legislators to corporate leaders.


 

Copyright © 2003 Trustees of Dartmouth College