However, some experts caution that future versions of the technology are ripe for abuse. For example, it could enable stalkers or child abusers, says ethicist Jacob Metcalf of Data & Society, a nonprofit research center that focuses on the social implications of emerging technologies. A stalker could download a person's images from Instagram without consent and, if those images contained shiny surfaces, deploy the algorithm to try to reconstruct the person's surroundings and infer private information about them. "You better believe that there are a lot of people who will use a Python package to scrape photos off Instagram," says Metcalf. "They could find a photo of a celebrity or of a kid that has a reflective surface and try to do something."

Park points out that Instagram images don't contain the 3D depth information his algorithm needs in order to work. He also says that his team considered potential misuse, particularly privacy violations such as surveillance, although they do not discuss these ethical considerations explicitly in the version of the paper currently available. Park says that image and video platforms like YouTube could, in the future, automatically detect reflective surfaces in videos and then blur or otherwise process the footage to keep the reconstruction algorithm from working. "Future research could enable privacy-preserving cameras or software that limits what can be inferred about the environment from reflections," Park wrote in an email to WIRED. He also says that the algorithm is not currently accurate enough to pose a threat.

Metcalf thinks Park and his co-authors should state these ethical considerations directly in the paper. In fact, he thinks that the data science community as a whole needs to consistently include ethics sections in its publications. "I want to be clear: this isn't a criticism of these researchers specifically, but of the norms of data science," says Metcalf. "The norms of data science as an academic discipline have not yet grappled with the fact that papers like this have potentially enormous impact on people's wellbeing."

These ethical discussions can influence the direction of future research in the field, says Raji. “Some researchers will be like, ‘It doesn’t mean anything if I state what my intent is with the research; people are going to do what they’re going to do,'” she says. “But what they don’t realize is that the ethical statements often shape the development of the field itself.”

In an email response to WIRED, Park wrote that the team will include an ethics section in the official version of the paper released in association with the conference, which is scheduled to take place in June.

Park’s team isn’t the first to realize that snack packaging can be used as a sensor. In 2014, Davis and his colleagues demonstrated that you could use a bag of chips as a microphone. They played a MIDI file of “Mary Had a Little Lamb” at the chip bag, and by processing a high-speed video of the bag’s vibrations, they could play the song back.
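The core idea can be sketched in miniature: sound vibrating an object produces tiny frame-to-frame brightness changes in a high-speed video, and collapsing each frame to a single number yields a one-dimensional signal whose dominant frequency matches the sound. The toy Python simulation below is not the researchers' actual pipeline (which tracks sub-pixel motion across the image); it fabricates synthetic frames with a faint 440 Hz flicker and recovers that tone.

```python
import numpy as np

FPS = 2000        # simulated high-speed camera frame rate (Hz)
TONE = 440.0      # frequency of the simulated vibration (Hz)
N_FRAMES = 2000   # one second of "video"

rng = np.random.default_rng(0)
t = np.arange(N_FRAMES) / FPS

# Each tiny 8x8 "frame" brightens and dims slightly with the
# vibration, on top of a static scene plus per-pixel noise.
base = rng.uniform(100, 150, size=(8, 8))
flicker = 0.5 * np.sin(2 * np.pi * TONE * t)
frames = (base[None, :, :]
          + flicker[:, None, None]
          + rng.normal(0, 0.1, size=(N_FRAMES, 8, 8)))

# Collapse each frame to its mean intensity -> a 1-D "audio" signal.
signal = frames.mean(axis=(1, 2))
signal -= signal.mean()

# The dominant frequency of that signal recovers the tone.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(N_FRAMES, d=1 / FPS)
recovered = freqs[spectrum.argmax()]
print(round(recovered))  # 440
```

Averaging over all pixels is the crudest possible motion estimate; it works here only because the flicker is global and the noise averages out, which hints at why the real system needed sub-pixel tracking and thousands of frames per second.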

“There’s a surprising amount of information in images of everyday objects that are just kind of sitting there,” says Davis. With the right algorithms, it seems, any faint rustle or glint of light can now tell a tale.
