Two Harvard students recently revealed how combining Meta’s smart glasses with facial recognition software can expose a person’s identity and personal information in seconds, according to a report by Ashley Belanger for Ars Technica.

Per her report, AnhPhu Nguyen and Caine Ardayfio modified a pair of Ray-Ban Meta Smart Glasses to integrate them with PimEyes, a facial recognition engine that performs reverse image searches, along with a large language model (LLM). The system can pull personal data such as names, phone numbers, and addresses from the web in seconds. Nguyen explained that the technology, dubbed “I-XRAY,” raises alarming privacy concerns: it can identify strangers in public places simply by looking at them through the glasses.

Are we ready for a world where our data is exposed at a glance? @CaineArdayfio and I offer an answer to protect yourself here: https://t.co/LhxModhDpk pic.twitter.com/Oo35TxBNtD

— AnhPhu Nguyen (@AnhPhuNguyen1) September 30, 2024

The students conducted tests at a subway station, scanning the faces of unsuspecting commuters and accessing publicly available information through people-search databases. Some individuals were tricked into believing the students knew them, based on personal details retrieved in mere seconds. The pair described their project as a demonstration of how easily someone could use such technology for malicious purposes. “Some guy could just find a girl’s home address on the train and follow her home,” Nguyen warned.

I-XRAY combines recent advancements in LLMs and facial recognition, automating data extraction that would previously have required significant time and effort. Meta’s Ray-Ban glasses (with clear lenses) were chosen for the project because their inconspicuous design makes them look like ordinary glasses. The students even disabled the glasses’ recording light to keep the scanning undetectable, further underscoring the risks.
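Based on that description, the pipeline can be sketched at a high level. The Python below is purely illustrative: every function is a hypothetical stub with names of our own choosing, since Nguyen and Ardayfio never released the actual I-XRAY code, and no real service APIs are invoked.

```python
# Purely illustrative sketch of a pipeline like the one described above.
# Every function here is a hypothetical, non-functional stub: the students
# never released the actual I-XRAY code, and no real service APIs are used.

from dataclasses import dataclass


@dataclass
class Dossier:
    """Aggregated public data about one detected face."""
    name: str | None = None
    address: str | None = None
    phone: str | None = None


def detect_faces(frame: bytes) -> list[bytes]:
    """Hypothetical: crop face regions out of a video frame streamed from the glasses."""
    return []


def reverse_face_search(face: bytes) -> list[str]:
    """Hypothetical: submit a face to a reverse image engine (PimEyes, in the
    students' setup) and return URLs of pages where that face appears."""
    return []


def build_dossier(page_urls: list[str]) -> Dossier:
    """Hypothetical: use an LLM to extract a name from the matched pages,
    then look that name up in people-search databases."""
    return Dossier()


def process_frame(frame: bytes) -> list[Dossier]:
    # Glasses video -> face detection -> reverse image search -> LLM + people-search lookup.
    return [build_dossier(reverse_face_search(face)) for face in detect_faces(frame)]
```

What the sketch illustrates is how little glue is needed: once streaming cameras, reverse face search, and LLM-based extraction exist as off-the-shelf capabilities, chaining them together is the easy part.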

Despite the project’s success, Nguyen and Ardayfio stressed that they have no intention of releasing the code behind I-XRAY; their aim, they explained, was to draw attention to the growing threats to privacy. They encouraged individuals to opt out of invasive search engines such as PimEyes to protect their data.

While privacy laws in the European Union require consent to collect facial recognition data, no comparable federal protection exists in the U.S., where bad actors could exploit this technology. The students emphasized, however, that their project is not unique; similar technologies are already being developed. Clearview AI, for instance, a company specializing in facial recognition for law enforcement, has reportedly explored using smart glasses for face scanning. That prospect has raised significant concerns, given Clearview’s controversial practices and its stated goal of including nearly every human face in its database.

Nguyen and Ardayfio provided instructions for removing personal information from reverse face search engines like PimEyes and Facecheck ID, and from people-search databases like FastPeopleSearch, CheckThem, and Instant Checkmate. Yet their tests showed that even opting out might not guarantee anonymity, as some subjects were still easily identified. Despite the warnings, the disturbing reality is that technologies like I-XRAY may soon be within reach of anyone with access to the right tools.

Featured Image via Pixabay