
Friday, March 1, 2013

M.I.T. Computer Program Reveals Invisible Motion in Video

A 30-second video of a newborn baby shows the infant silently snoozing in his crib, his breathing barely perceptible. But when the video is run through an algorithm that can amplify both movement and color, the baby’s face blinks crimson with each tiny heartbeat.

The amplification process, called Eulerian Video Magnification, is the brainchild of a team of scientists at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory.

The team originally developed the program to monitor newborn babies without making physical contact. But they quickly learned that the algorithm could be applied to other videos to reveal changes imperceptible to the naked eye. Prof. William T. Freeman, a leader on the team, imagines its use in search and rescue, so that rescuers could tell from a distance whether someone trapped on a ledge, say, is still breathing.

“Once we amplify these small motions, there’s like a whole new world you can look at,” he said.

The system works by homing in on specific pixels in a video over the course of time. Frame by frame, the program identifies minute changes in color and then amplifies them up to 100 times, turning, say, a subtle shift toward pink into a bright crimson.

The scientists who developed it believe it could also have applications in industries like manufacturing and oil exploration. For example, a factory technician could film a machine to check for small movements in bolts that might indicate an impending breakdown. In one video presented by the scientists, a stationary crane sits on a construction site, so still it could be a photograph. But once run through the program, the crane appears to sway precariously in the wind, perhaps tipping workers off to a potential hazard.

It is important to note that the crane does not actually move as much as the video seems to show; the motion amplification exaggerates a sway that would otherwise be imperceptible.
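For the technically curious, the per-pixel operation described above can be sketched in a few lines of Python with NumPy and SciPy. This is a simplified illustration of the idea, not the team’s published code: the actual method also smooths each frame spatially with an image pyramid before filtering, a step omitted here, and the frequency band and gain below are assumptions chosen for a heartbeat-like signal.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def amplify_color(frames, fps, lo_hz=0.8, hi_hz=3.0, gain=50.0):
        """Toy Eulerian-style color amplification.

        frames: float array of shape (T, H, W, C), values in [0, 1].
        fps:    frame rate of the video.
        lo_hz, hi_hz: temporal band to amplify; 0.8-3 Hz covers
                      resting heart rates of about 50-180 beats/min.
        gain:   how strongly the filtered signal is magnified.
        """
        # Design a temporal bandpass filter, applied per pixel along time.
        nyquist = fps / 2.0
        b, a = butter(2, [lo_hz / nyquist, hi_hz / nyquist], btype="band")

        # Isolate the subtle periodic variation in each pixel's time
        # series, scale it up, and add it back onto the original video.
        filtered = filtfilt(b, a, frames, axis=0)
        return np.clip(frames + gain * filtered, 0.0, 1.0)

Run on a stack of video frames, the function returns the same video with variations inside the chosen band, such as the faint flush of a heartbeat, magnified into something the eye can see.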

The program originally gained attention last summer when the team presented it at the annual computer graphics conference known as Siggraph in Los Angeles.

Since then, the M.I.T. team has refined the algorithm, producing results with noticeably better clarity and accuracy.

Michael Rubinstein, a doctoral student and co-author on the project, said that after the presentation and subsequent media coverage, the team was inundated with e-mails inquiring about the availability of the program for uses ranging from health care to lie detection in law enforcement. Some people, Mr. Rubinstein said, asked how the program might be used with Google Glass to see changes in a person’s face while gambling.

“People wanted to be able to analyze their opponent during a poker game or blackjack and be able to know whether they’re cheating or not, just by the variation in their heart rate,” he said.
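The idea those correspondents had in mind rests on a well-known observation: blood flow subtly modulates skin color, most visibly in the green channel, at the frequency of the pulse. What follows is a rough, hypothetical sketch of reading a heart rate from a face video; the face region is assumed to be known in advance, and the 0.8-3 Hz band again corresponds to roughly 50-180 beats per minute.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def estimate_heart_rate(frames, fps, face_box):
        """Estimate pulse in beats per minute from a face video.

        frames:   float array (T, H, W, 3), RGB values in [0, 1].
        face_box: (top, bottom, left, right) pixel bounds of the face.
        """
        top, bottom, left, right = face_box
        # Average the green channel over the face region; blood flow
        # modulates skin color most strongly there.
        green = frames[:, top:bottom, left:right, 1].mean(axis=(1, 2))

        # Keep only plausible heart-rate frequencies (0.8-3 Hz).
        nyq = fps / 2.0
        b, a = butter(2, [0.8 / nyq, 3.0 / nyq], btype="band")
        pulse = filtfilt(b, a, green - green.mean())

        # The dominant frequency in that band is the heart rate.
        spectrum = np.abs(np.fft.rfft(pulse))
        freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
        return 60.0 * freqs[np.argmax(spectrum)]

A real system would detect the face automatically and compensate for motion and lighting changes, which this sketch ignores.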

The team posted the code online and made it available to anyone who wanted to download it and run the program, though doing so required some technical expertise, because the interface was not simple to use. Last week, Quanta Research Cambridge, the research arm of a Taiwan-based laptop manufacturer that helped finance the project, provided a way for people to upload video clips to its Web site and see them run through the program.

The project is also financed by the National Science Foundation and Royal Dutch Shell, among others.

The team is also working to turn the program into a smartphone app. “I want people to look around and see what’s out there in this world of tiny motions,” said Professor Freeman.

source: http://bits.blogs.nytimes.com



