Recent findings from the MIT Computer Science and Artificial Intelligence Laboratory reveal that light sensors in smartphones, tablets, and notebooks can detect gestures and actions without the need for cameras, posing potential privacy risks.

Modern mobile devices incorporate various sensors — proximity, touch, microphones, accelerometers, and gyroscopes — to enhance performance and user experience. These technological advancements, however, can compromise user privacy and data confidentiality by increasing the risk of exposure to external parties.

The European Network and Information Security Agency (ENISA) recommends that smartphone app developers limit sensor data access, advising against the automatic collection of geolocation data, for instance.

A 2021 Spanish study highlighted on ScienceDirect, “SmartCAMPP – Smartphone-based continuous authentication leveraging motion sensors with privacy preservation,” illustrates how motion sensors used for continuous authentication may inadvertently expose users’ health status, underscoring privacy issues.

Image sensors in smartphones and tablets, used for various photographic activities, represent a significant security risk without stringent permission controls, potentially allowing unauthorized access to extensive private image collections.


The CSAIL’s research, using simulations and a bespoke algorithm, demonstrates ambient light sensors’ ability to detect hand gestures, like web page scrolling, on a mobile device.
The principal privacy threat stems from the potential for sensitive personal data, such as health and banking information, to be exposed and captured by hackers.
Future research aims to refine the algorithm to reconstruct video data from various sensor types across mobile devices, expanding the study to include smart TVs, thereby enhancing data protection efforts.

Mobile Devices and Privacy: Unveiling Ambient Light Sensors’ Role

In the realm of mobile devices and privacy, it is crucial to recognize that most sensors in smartphones, tablets, and notebooks operate without requiring any user permissions or privileges. This is especially true of ambient light sensors, which automatically adjust screen brightness to the surrounding light, raising or lowering illumination as needed to save energy and improve user comfort.
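To illustrate the sensor's intended role, here is a minimal Python sketch of how an auto-brightness policy might map ambient light readings (in lux) to a backlight level; the breakpoints and values are assumptions chosen for demonstration, not any vendor's actual curve.

```python
def backlight_level(ambient_lux: float) -> float:
    """Map an ambient light reading (lux) to a screen backlight level in [0, 1]."""
    # Illustrative breakpoints: ~10 lux (dim room) up to ~10,000 lux (daylight).
    breakpoints = [(10.0, 0.15), (100.0, 0.35), (1_000.0, 0.65), (10_000.0, 1.0)]
    if ambient_lux <= breakpoints[0][0]:
        return breakpoints[0][1]
    for (lo_lux, lo_lvl), (hi_lux, hi_lvl) in zip(breakpoints, breakpoints[1:]):
        if ambient_lux <= hi_lux:
            # Linear interpolation between the two surrounding breakpoints.
            t = (ambient_lux - lo_lux) / (hi_lux - lo_lux)
            return lo_lvl + t * (hi_lvl - lo_lvl)
    return 1.0


if __name__ == "__main__":
    for lux in (5, 50, 500, 5_000, 50_000):
        print(f"{lux:>6} lux -> backlight {backlight_level(lux):.2f}")
```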

“Ambient light sensors are traditionally seen as low-risk to user data security since they don’t appear to capture environmental images.” Researchers at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT set out to challenge precisely this view.

Their study, “Imaging privacy threats from an ambient light sensor,” published on January 10, 2024, in Science Advances, refutes this assumption by showing that ambient light sensors can indeed compromise privacy.

Unauthorized Detection of User Gestures and Actions

The CSAIL team countered the common belief among app developers and device manufacturers that ambient light sensors pose minimal privacy risks, as they reportedly do not capture significant private data. Contrary to this, MIT’s simulations demonstrated that these sensors could passively capture images of user interactions with their devices, such as hand gestures and screen touches, without any camera aid. By employing a mathematical algorithm, the researchers were able to reconstruct these interactions from minimal visible data, revealing the potential for privacy breaches through simple hand and finger movements.
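To make the idea of reconstructing images from a single light reading more concrete, the sketch below simulates the general inverse-imaging principle in Python/NumPy: a sequence of known on-screen brightness patterns, one scalar sensor reading per pattern, and a regularised least-squares inversion that recovers a coarse map of whatever is blocking the screen. This is an illustration under simplified assumptions (known patterns, a linear sensor model), not the CSAIL team's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16                        # side length of the coarse occlusion map to recover
pixels = N * N

# Ground truth: 1 where a crude "finger" blocks light from reaching the sensor.
truth = np.zeros((N, N))
truth[4:12, 7:9] = 1.0
x_true = truth.ravel()

# Each displayed frame is a known random brightness pattern over the N x N regions.
frames = 3 * pixels
A = rng.random((frames, pixels))

# Sensor model: one scalar reading per frame, proportional to the light coming
# from unblocked regions, plus a little measurement noise.
readings = A @ (1.0 - x_true) + 0.01 * rng.standard_normal(frames)

# Regularised least squares recovers how much each region contributed,
# i.e. an estimate of (1 - occlusion).
lam = 1e-2
contrib = np.linalg.solve(A.T @ A + lam * np.eye(pixels), A.T @ readings)
occlusion_estimate = np.clip(1.0 - contrib, 0.0, 1.0).reshape(N, N)

print("mean reconstruction error:", np.abs(occlusion_estimate - truth).mean())
```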

The study points out that apps with direct screen access could exploit this sensor data to monitor user actions, such as web page scrolling, for marketing purposes without user consent. More alarmingly, MIT’s research suggests that sensitive personal data, such as health or financial information, could be exposed through these seemingly innocuous sensors.

Conducted Tests

Through three laboratory demonstrations using an Android tablet, the researchers showcased the sensors’ ability to detect different types of hand touches.

In the initial experiment, a mannequin positioned in front of the tablet’s screen was used to show that touches from various types of hands (the mannequin’s, a cardboard cut-out of a hand, and finally a human hand) could be recognized, each illuminated by the screen and recorded by the light sensor.

An experimental setup capturing the hand contact of a dummy with the display (A); a cardboard cutout shaped like an open hand touching the screen, followed by a human finger pointing at the monitor (B); pixelated fingerprints of the open hand and pointing hand, revealing physical interactions with the device’s screen (C); final images retrieved and assembled by the algorithm developed by the team (D). (Source: “Imaging privacy threats from an ambient light sensor” – Computer Science and Artificial Intelligence Laboratory at MIT – https://www.science.org/doi/10.1126/sciadv.adj3608)

The subsequent experiment, undertaken solely with human hands, revealed that the manner in which individuals interact with their smartphone or tablet screen—through touching, scrolling, and rotating—can be observed by unauthorized parties connecting to the device, “at a rate of one frame every 3.3 minutes,” as noted by the researchers. They further noted that “with an enhanced light sensor, malevolent entities could in real time intercept the user’s engagement with their device.”
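A quick back-of-the-envelope calculation puts that quoted rate in perspective; the number of frames assumed for a single gesture below is purely illustrative.

```python
# The capture rate quoted by the researchers: one reconstructed frame every 3.3 minutes.
seconds_per_frame = 3.3 * 60                    # 198 seconds per frame
frames_per_hour = 3600 / seconds_per_frame      # about 18 frames per hour

# Assumed (illustrative) number of frames needed to show one interaction.
gesture_frames = 10
minutes_for_gesture = gesture_frames * seconds_per_frame / 60   # about 33 minutes

print(f"{frames_per_hour:.1f} frames per hour")
print(f"~{minutes_for_gesture:.0f} minutes to assemble a {gesture_frames}-frame sequence")
```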

In the concluding experiment, the team demonstrated an additional risk posed to users while engaging with their mobile device to view short videos or clips. They detailed the experiment’s result thus:

“A human hand was positioned in front of the light sensor, against a backdrop of scenes from the cartoon Tom and Jerry and a whiteboard situated behind the participant, reflecting light onto the device’s screen. In this scenario, the light sensor detected minor variations in light intensity across each video frame, converting them into visual representations of the user’s hand movements while viewing the video.”
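As a toy illustration of that last point (a sketch under assumed conditions, not the study's own pipeline), the snippet below shows how, if the brightness of the video being played is known, the per-frame sensor readings can be normalised against it so that the residual tracks how much of the screen the viewer's hand is covering.

```python
import numpy as np

rng = np.random.default_rng(1)
frames = 200

# Known content: per-frame average brightness of the video being watched.
video_brightness = 0.5 + 0.3 * np.sin(np.linspace(0.0, 12.0, frames))

# Hypothetical slow drift of a hand covering up to 40% of the screen.
hand_coverage = np.clip(np.cumsum(rng.normal(0.0, 0.02, frames)), 0.0, 0.4)

# Sensor model: each reading scales with the unblocked screen area, plus noise.
readings = video_brightness * (1.0 - hand_coverage) + 0.005 * rng.standard_normal(frames)

# Knowing the video, the observer normalises each reading by the expected
# brightness; the residual is an estimate of per-frame hand coverage.
coverage_estimate = 1.0 - readings / video_brightness

print("mean estimation error:", np.abs(coverage_estimate - hand_coverage).mean())
```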

One test illustrated how even the subtlest gestures in front of the screen, like watching a video or scrolling, could be captured and reconstructed to reveal user interactions, highlighting the real-time threat posed by more powerful sensors.

Glimpses of the future

This breakthrough research opens up discussions on the broader implications of mobile device privacy, suggesting that what was once considered safe could indeed pose new threats to user data.

Future studies aim to refine the algorithm to extend its application to other sensor types, potentially in smart TVs, broadening the understanding of privacy risks in our increasingly connected world.

Let us now attempt to predict future scenarios by using the STEPS matrix to assess the impacts of the research team’s technological and methodological advancements from social, technological, economic, political, and sustainability viewpoints.

S – SOCIAL: often, through ignorance or carelessness, we fail to apply the proper security measures to protect the personal data stored on our mobile devices. In the future, increasingly accurate simulations of the known and as yet unknown risks to our privacy would enable us to become more competent and security-aware users of smartphones, tablets, notebooks and smart TVs.

T – TECHNOLOGY: artificial neural networks could, in the future, support the evolution of the mathematical algorithm developed to reconstruct the images picked up by the light sensors of mobile devices. More generally, artificial intelligence techniques would contribute to more realistic and effective simulations of data-privacy threats.

E – ECONOMY: when mobile devices are business tools containing information related to professional activities, the economic impact of a breach is very high, with implications for the organisation’s business as well. Hence, in the future, the practice of reconstructing the scene of a ‘hacker threat’ in order to analyse new risks of illicit data access, and to demonstrate that they are concretely possible, will help prevent economic losses.

P – POLITICAL: to date, the Garante per la protezione dei dati personali (the Italian data protection authority) has always been particularly attentive in suggesting how to ‘protect oneself’ from smartphone sensors such as microphones, “to avoid indiscreet listening”. In the future, it will have to take into account the potential dangers posed by ‘all’ types of sensors on mobile devices, drafting new sets of guidelines aimed at users.

S – SUSTAINABILITY: understanding how each of the different sensors built into the mobile devices made available to employees carries its own specific weight in defining the level of protection of such devices will, in the future, translate into more sustainable practices from a security perspective, reflected in the timely training of workers on the prevention of hacker threats and on the correct behaviour to adopt.

Written by: