Facial recognition: Facebook photo matching just the start

As facial recognition tech moves onto the streets of your town, will your privacy be a casualty?

The Internet was in an uproar earlier this year following Facebook's launch of facial recognition software for its photo services, enabling users to identify their friends in photos automatically--and without their permission. Though critics described that move as creepy, the controversial technology may now be on the verge of widespread use.

For instance, this month a Massachusetts company called BI² Technologies will roll out a handheld facial recognition add-on for the iPhone to 40 law enforcement agencies. The device will allow police to conduct a quick check to see whether a suspect has a criminal record--either by scanning the suspect's iris or taking a photo of the individual's face.

Earlier this week, reports surfaced that the military and the Georgia Tech Research Institute had begun testing autonomous aerial drones that could use facial recognition software to identify and attack human targets--in effect, letting the software make the assessment that determines who gets killed.

And in yet another development, the Federal Trade Commission announced earlier this week that it will hold a free public workshop on December 8, 2011, to examine various issues related to personal privacy, consumer protection, and facial recognition technology.

[Read: "Facebook Photo Tagging: A Privacy Guide"]

Of course, the government and large private companies have had access to facial recognition software for years. The pressing question today is: What happens to privacy when everyone has access to the technology? Already smaller businesses--and even private individuals--are developing sometimes amazing, sometimes very creepy uses for the security-focused software.

In Las Vegas, advertisers have taken a page from Minority Report, the 2002 Tom Cruise movie, and are using facial recognition to target ads to passers-by. For instance, if a woman in her mid-twenties walks past an advertising kiosk, its built-in software will identify her likely age and gender and then display ads for products deemed appealing to her demographic.
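None of the kiosk vendors have published their code, but the basic logic is easy to sketch. The Python snippet below is a minimal illustration only; the estimate_age_gender() classifier, the Demographic type, and the ad catalog are all invented placeholders for whatever proprietary model and inventory a real kiosk would use.

    # Minimal sketch of a demographic-targeted ad kiosk.
    # estimate_age_gender() is a stand-in for a real age/gender model;
    # the ad catalog is invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Demographic:
        age: int
        gender: str  # "female" or "male"

    # Invented ad inventory keyed by (gender, age bracket).
    AD_CATALOG = {
        ("female", "18-29"): "spa-weekend-promo",
        ("female", "30-49"): "designer-handbag-sale",
        ("male", "18-29"): "energy-drink-launch",
        ("male", "30-49"): "steakhouse-happy-hour",
    }

    def estimate_age_gender(face_image) -> Demographic:
        # Placeholder: a real kiosk would run a trained classifier here.
        return Demographic(age=26, gender="female")

    def age_bracket(age: int) -> str:
        return "18-29" if age < 30 else "30-49"

    def pick_ad(face_image) -> str:
        """Estimate the passer-by's demographic and return an ad to display."""
        demo = estimate_age_gender(face_image)
        return AD_CATALOG.get((demo.gender, age_bracket(demo.age)), "generic-brand-spot")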

Meanwhile, in Chicago, a startup called SceneTap links facial recognition technology to cameras in bars and clubs so that users can figure out which bars have the most desirable (in their opinion) ratio of women to men--before they even arrive.

If you think the corporate implications are unsettling, wait until the general population gets deeply involved in using facial recognition technology. One recent instance: In the wake of the August London riots, a Google group of private citizens called London Riots Facial Recognition emerged with the aim of using publicly available records and facial recognition software to identify rioters for the police as a form of citizen activism (or vigilante justice, depending on how you feel about it). The group finally abandoned its efforts when its experimental facial recognition app yielded disappointing results.

Though the members of London Riots Facial Recognition undoubtedly believed that they were working for the greater good, what happens when people other than concerned citizens get their hands on the technology? It shouldn't take too long for us to find out.

Present-Day Reality Check

The use of facial recognition software by governments and online social networks continues to provide headline fodder. A Boston-area man had his driver's license revoked after the U.S. Department of Homeland Security ran a facial recognition scan of a database of Massachusetts driver's-license photos and the system flagged his license as a possible phony. It later emerged that the system had confused the man's face with someone else's.

In England, law enforcement officials ran photos of August riot suspects through Scotland Yard's newly updated face-matching program, which is under consideration for use during the 2012 Summer Olympics in the UK. In Canada, an insurance company invited Vancouver police to use its facial recognition software to help identify rioting fans after the Vancouver Canucks hockey team lost the seventh game of the NHL championship series.

And of course Facebook endured a hailstorm of criticism in June when it announced its plans to roll out a facial recognition feature for its members, providing semiautomatic tagging of photos uploaded to the social network.

[Read: "Facebook Facial Recognition: Its Quiet Rise and Dangerous Future"]

One Facebook critic was Eric Schmidt, executive chairman of Google, who said earlier this year that the "surprising accuracy" of existing facial recognition software was "very concerning" to his company and that Google was "unlikely" to build a facial-recognition search system in the future.

Indeed, Google seems to have been so concerned by the technology that Schmidt declined to implement it even though his company already had the know-how to make it. "We built that technology and withheld it," Schmidt said. "People could use it in a very bad way."


Off-the-Shelf Facial Recognition

You don't need the power of a government or an Internet behemoth to make facial recognition work for you. At this year's Black Hat security conference (held in Las Vegas in August), a team of researchers from Carnegie Mellon University demonstrated how much they could accomplish with existing off-the-shelf technology.

The team took photos of people's faces and ran those images through an off-the-shelf facial recognition program called PittPatt (which Google recently acquired). In the demonstration, the program took less than 3 seconds to compare the researchers' photos with images publicly available on Facebook and return 10 possible matches, along with the names attached to them. The matches proved accurate more than 30 percent of the time.

The team then used information gleaned from Facebook profiles to guess the birth dates or birthplaces of the people that the software had accurately identified. With that information, they predicted the first five digits of each person's Social Security number and were accurate about 27 percent of the time.
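Neither the CMU code nor PittPatt itself is publicly available, so the following Python sketch only approximates the face-matching half of the pipeline, using the open-source face_recognition library as a stand-in; the probe photo, the folder of downloaded profile pictures, and the top-10 cutoff are assumptions for illustration.

    # Sketch of matching a probe face against a folder of publicly
    # available profile photos, using the open-source face_recognition
    # library as a stand-in for PittPatt. File names are assumptions.
    import os
    import face_recognition

    def best_matches(probe_path, gallery_dir, top_n=10):
        """Return the top_n gallery photos most similar to the probe face."""
        probe_image = face_recognition.load_image_file(probe_path)
        probe_encodings = face_recognition.face_encodings(probe_image)
        if not probe_encodings:
            return []  # no face found in the probe photo
        probe = probe_encodings[0]

        scored = []
        for name in os.listdir(gallery_dir):
            image = face_recognition.load_image_file(os.path.join(gallery_dir, name))
            for encoding in face_recognition.face_encodings(image):
                distance = face_recognition.face_distance([encoding], probe)[0]
                scored.append((distance, name))  # lower distance = closer match
        scored.sort()
        return scored[:top_n]

    # Example: print the ten closest candidates for one probe photo.
    for distance, name in best_matches("probe.jpg", "facebook_photos"):
        print(name, round(float(distance), 2))

In the actual study, the names and profile data attached to the matched photos are what made the follow-on guesses about birth dates, birthplaces, and Social Security numbers possible.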

"The bigger picture here was to show that we're getting closer to a world where online and offline data blend seamlessly, where you can start with an anonymous face in the street and you can end up identifying something extremely sensitive about the person by combining these different technologies," says the leader of the team, Carnegie Mellon assistant professor Alessandro Acquisti.

It's Not Just Big Brother--Watch Out for Little Brother

While the demonstration by Acquisti's crew may make anyone who cares about privacy queasy, the concepts used in the demo aren't ready for "Little Brother" yet. "If you asked me if I could go out into the streets of New York and identify anyone and everyone, the answer is no," Acquisti says.

That's because the off-the-shelf system that the researchers used won't scale to a task of that magnitude. "If you wanted to identify anyone in the streets of a large city, you'd need a database of hundreds of millions of people, and--given the computational power available now--it's still not possible to do these face match-ups in real time," Acquisti explains.

Still, because so much facial information is available online at places like Facebook and Flickr, preventing that information from being used to intrude on individual privacy is almost impossible, according to Harry Lewis, a computer science professor at Harvard University. Lewis told PCWorld: "A private individual in a public--but what was previously thought of as anonymous--place is no longer going to find themselves anonymous."

People are quick to express concern about technologies like facial recognition in the hands of Big Brother, Lewis acknowledges. "But let's not get so worried about Big Brother that we forget about the fact that Little Brother is going to be able to do exactly the same thing," he says.

Lewis also points out that, in principle, Big Brother can be controlled through regulation and legislation, but "we can't regulate what Little Brother does about public information, unless we want to surrender our civil rights of freedom of speech."

Closed-Circuit Cameras: A Precedent

Some people argue, however, that anonymity began eroding long before facial recognition appeared on the scene. The proliferation of closed-circuit television cameras is an example of that trend. "The Big Brother thing is just technology catching up to what's always been there," says George Brostoff, founder and CEO of Sensible Vision in Covert, Michigan.

Sensible Vision makes facial recognition software designed for authenticating a person's identity. When users install Sensible Vision's software--Fast Access--on their computer and then sit in front of that PC, the software recognizes their face and logs them in automatically. If a user leaves the computer, the software detects his or her absence and prevents anyone else from using the unit. The company sells both personal and enterprise versions of the software.
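Sensible Vision hasn't published how Fast Access works internally, but the behavior it describes--unlock while the enrolled face is visible, lock when it disappears--can be sketched in a few lines. The Python below is a rough illustration, assuming the open-source face_recognition and OpenCV libraries and using print statements as stand-ins for real session lock/unlock hooks.

    # Rough sketch of presence-based locking in the spirit of Fast Access.
    # face_recognition and OpenCV stand in for the product's own engine;
    # the lock/unlock functions are placeholders, not a real OS hook.
    import time
    import cv2
    import face_recognition

    def unlock_workstation():
        print("unlocked")   # placeholder for a real session-unlock call

    def lock_workstation():
        print("locked")     # placeholder for a real session-lock call

    # Enroll the authorized user from a reference photo (assumed file name).
    enrolled = face_recognition.face_encodings(
        face_recognition.load_image_file("enrolled_user.jpg"))[0]

    camera = cv2.VideoCapture(0)
    unlocked = False
    try:
        while True:
            ok, frame = camera.read()
            if not ok:
                break
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            present = any(
                face_recognition.compare_faces([enrolled], encoding)[0]
                for encoding in face_recognition.face_encodings(rgb))
            if present and not unlocked:
                unlock_workstation()
                unlocked = True
            elif not present and unlocked:
                lock_workstation()
                unlocked = False
            time.sleep(1)  # poll roughly once per second
    finally:
        camera.release()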

In the long run, many problems involving potentially invasive technologies such as facial recognition simply work themselves out, according to Stewart Hefferman, CEO of OmniPerception, a Guildford, UK, company that makes object and facial recognition software.

"There are ways, through technology and legislation, of making sure that people's privacy is protected while deriving the benefits of a technology," Hefferman says.

Staff Editor David Daw of PCWorld contributed to this story.

Follow freelance technology writer John P. Mello Jr. and Today@PCWorld on Twitter.
