In and around Tokyo, face-recognition cameras have started to take photos of passersby at various locations. Supermarket chains, shopping malls and vending machines inside JR East stations all have been using face-recognition software to identify the sex and age of individuals who come within line of sight of their cameras.
That information is then used to determine what product to recommend and which particular ads draw people’s attention, or simply to acquire data on what kind of people stop by.
All of that might seem like one more little technological marvel in gadget-loving Japan. However, the use of that visual data must be scrutinized, and measures must be taken to ensure that the right to privacy is maintained.
Even though companies claim they do not store the data and that individuals remain anonymous, the technology for recognizing faces and linking them to commercial behavior and private data has the potential for misuse.
Face-recognition software now has the capacity to easily connect with online search engines and open information sources.
At Carnegie Mellon University in the United States, a researcher using off-the-shelf software and a cheap digital camera was able to put a name to the face of one-third of the students passing by on campus within three seconds. At the 2001 Super Bowl in Tampa, police used similar software to identify 19 attendees with criminal records.
While technology may help with security and police issues, its use by private as well as public entities needs careful regulation.
Face-recognition technology is improving and its use is spreading. Already, Facebook uses a function that automatically tags friends in pictures. Facebook reportedly hosts some 90 billion photos, with an estimated 6 billion new photos uploaded each month.
Connecting sources of individual information with surreptitiously acquired photos has the potential to easily identify large numbers of people without their knowledge or consent.
Face data collection may soon be an important research and marketing technique, so pressure from companies to use it will continue to increase. Analyzing photos taken by small cameras provides a wealth of biometric data.
Even when the identity of the person in the photo is not determined, the age, gender, clothing, direction of eyes and other data can be captured, analyzed and stored. Making connections between the photos and other data has the potential for intrusion into individual privacy.
Marketing companies claim that the images themselves are not stored, and that only abstract data, such as age, gender, time and date, and where the eyes of the individual look, is analyzed. Yet most of the small cameras in use are not easily noticeable, and at some displays in Tokyo, passersby were not informed that cameras were in operation or for what purpose. They should be informed clearly and directly.
How people are to be informed and how permission can be granted are important issues of regulation that government agencies should start working on immediately.
The guidelines under the Protection of Personal Information Law in Japan state that camera images are part of individuals’ personal information. However, no ministry or agency has yet established sufficient rules concerning privacy issues related to visual information.
Visual data connected to one’s identity deserve just as much protection as other personal information. Already, lawmakers and advocates in the U.S. are calling for regulation of this new technology. Japan should address the issue in line with its own laws and regulations.
Thus far, nothing restricts corporations from profiting from anyone’s secretly obtained biometric information or from disseminating that information without the person’s consent. Even if companies cannot use the image of a person itself, they can use all the other data suggested within that image. Stronger regulations should be established for when, why and how that information may be, or, better yet, may not be, used.
Since the advent of the Internet, a face can be identified in a flash from anywhere in the world. Taking, storing and using such data is now easier than ever before.
Without sufficient safeguards, the data collected via face-recognition software could also be used for illicit, unpermitted or simply embarrassing purposes.
In Japan, where laws protecting privacy are generally respected, no one wants to have his or her own picture sent to an employer after protesting nuclear power, getting a tattoo or making an expensive purchase at a name-brand store. Privacy deserves to be legally protected and socially respected.
Facial features are one of the most important markers of identity. Every passport and ID card carries a face photo. The new face-recognition technology will only increase in quality and scope. As it does, the right to not have one’s image used without permission should be legally protected.
Laws need to keep pace with the technology, though that is not always so easy. In Japan, regulation lags while technology speeds ahead. The government needs to ensure that the restless progress of consumer marketing does not invade the privacy rights of individuals.