Researchers keep finding new ways that advertisers can track users across websites and apps by ‘fingerprinting’ the unique characteristics of their devices.
Some of these identifiers are well known, including phone numbers, IMEIs, and the Wi-Fi and Bluetooth MAC addresses, which is why access to this data is controlled using permissions.
But iOS and Android devices have a lot of other hardware that could, in theory, be used to achieve the same end.
In the study SensorID: Sensor Calibration Fingerprinting for Smartphones, Cambridge University researchers give some insight into the latest candidate – sensor calibration fingerprinting.
If sensors don’t sound like a big deal, remember that today’s smartphones are stuffed with them in the form of accelerometers, magnetometers, gyroscopes, GPS, cameras, microphones, ambient light sensors, barometers, proximity sensors, and many others.
Researchers have for some time been looking at whether these sensors could be used to identify devices with machine-learning algorithms, without much success, but the Cambridge researchers finally cracked the problem with a novel proof of concept for iOS devices using M-series motion co-processors.
And there’s a good reason why sensors represent an attractive target, say the researchers: apps and websites can read them without requesting any special permission.

In other words, unlike traditional fingerprinting, nobody is going to stop them, ask them to seek permission for what they’re doing, or even notice it’s happening at all, rendering the whole exercise invisible.
For advertisers, that’s the perfect form of device fingerprinting – one nobody notices.
It turns out that MEMS (Micro-Electro-Mechanical Systems) sensors are inaccurate in tiny ways that can be used to tell one device from another:
Natural variation during the manufacture of embedded sensors means that the output of each sensor is unique and therefore they may be exploited to create a device fingerprint.
For high-end devices (all Apple devices and Google’s Pixel 2 and 3 smartphones), manufacturers try to compensate for this by applying a calibration process to each device during manufacture.
The catch is that this calibration creates its own identifier: the level of compensation applied to each device can be inferred from the sensor’s output, and with it the underlying, device-unique inaccuracy.
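To see how such inference can work in principle, here is a minimal sketch, not the paper’s actual algorithm, with a hypothetical ADC step size and calibration values. Because raw readings are quantised into integer counts before a per-device gain is applied, calibrated outputs land on a lattice whose spacing leaks the gain:

```python
import numpy as np

LSB = 0.01  # assumed ADC quantisation step (illustrative, not a real spec)

def calibrated_reading(true_rate, gain, bias):
    """Simulate a calibrated sensor: the ADC quantises the raw measurement
    into integer counts, then firmware applies the factory gain and bias."""
    counts = np.round((true_rate + bias) / LSB)     # integer ADC counts
    return gain * (counts * LSB) - gain * bias

def estimate_step(samples):
    """Attacker side: calibrated outputs lie on a lattice with spacing
    gain * LSB, so the smallest nonzero pairwise difference approximates it."""
    diffs = np.abs(np.subtract.outer(samples, samples))
    return diffs[diffs > 1e-9].min()

rng = np.random.default_rng(0)
for device in range(3):
    # per-device calibration parameters, varying with manufacture
    gain = 1.0 + rng.normal(0, 0.01)
    bias = rng.normal(0, 0.05)
    motion = rng.normal(0, 0.5, size=200)           # arbitrary device motion
    samples = calibrated_reading(motion, gain, bias)
    print(f"device {device}: gain*LSB={gain * LSB:.6f} "
          f"estimated={estimate_step(samples):.6f}")
```

The point is not the specific arithmetic but the property it illustrates: the estimate depends only on the device’s factory calibration, so it stays stable across sessions, apps, and websites, and collecting it needs no permission prompt.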
The good news is that when the researchers reported their findings to Apple in August 2018, the company fixed the issue (CVE-2019-8541) in iOS 12.2, released in March 2019.
Apple adopted the researchers’ suggestion of adding random noise to the analogue-to-digital converter output and removing default access to motion sensors in Safari.
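Why does adding random noise help? A rough, self-contained illustration (the step size and calibration values below are hypothetical): dithering the converter output with uniform noise of about one quantisation step destroys the lattice structure an attacker would otherwise measure.

```python
import numpy as np

LSB = 0.01                      # assumed ADC step size, purely illustrative
GAIN, BIAS = 1.013, 0.042      # hypothetical per-device calibration values

def min_gap(samples):
    """Smallest nonzero pairwise difference: with quantised input this
    recovers the lattice spacing GAIN * LSB, i.e. the fingerprint."""
    d = np.abs(np.subtract.outer(samples, samples))
    return d[d > 1e-12].min()

rng = np.random.default_rng(1)
motion = rng.normal(0, 0.5, size=200)
counts = np.round((motion + BIAS) / LSB)            # clean ADC counts

clean = GAIN * (counts * LSB) - GAIN * BIAS
# mitigation: dither the ADC output with uniform noise of one step
dithered = counts + rng.uniform(-0.5, 0.5, size=200)
noisy = GAIN * (dithered * LSB) - GAIN * BIAS

print("lattice spacing without noise:", min_gap(clean))
print("apparent spacing with noise:  ", min_gap(noisy))
```

With the noise injected, the pairwise gaps no longer cluster at multiples of GAIN * LSB, so the calibration constants can no longer be read off the output.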
That’s just as well because, ironically, Apple’s iOS devices are far more susceptible to calibration fingerprinting than the bulk of Android devices, where the complicated calibration stage is often skipped for cost reasons.
However, higher-end Android devices that do use calibration could still be affected. Google was told about the issue in December 2018 but, unlike Apple, has yet to address it with a fix.
Research like this serves to emphasise a familiar theme. If advertisers and websites want to track devices, they have plenty to aim at.
It seems highly unlikely that any would jump through the complex hoops needed to crunch sensor data when there are many simpler ways to achieve the same end.
Culled from infosec News Ireland