Researchers at the University of Washington and Harvard Law School recently published a pioneering study analyzing the technical capabilities of 16 electronic monitoring (EM) smartphone applications used as “alternatives” to criminal and civil detention. The study, billed as “the first systematic analysis of the electronic monitoring app ecosystem,” confirmed the concerns of many advocates: EM apps allow access to large swaths of information, often contain third-party trackers, and are often unreliable. The study also raises further questions about the lack of transparency in the EM app ecosystem, despite the increasing reliance of local, state, and federal government agencies on these applications.
As of 2020, more than 2.3 million people were incarcerated in the United States, and another 4.5 million were under some form of “community supervision,” including probation, parole, pretrial release, and the juvenile and immigration detention systems. While ankle monitors have long been used by agencies as an “alternative” to detention, local, state, and federal government agencies are increasingly turning to smartphone apps to fill this function. The way it works is simple: instead of detention or an ankle monitor, the person agrees to download an EM app to their own phone, which allows the agency to track their location and may impose additional conditions, such as check-ins involving facial or voice recognition. The lower cost of requiring people to use their own hardware for EM likely explains the explosion of EM apps in recent years. Although there is no accurate count of the total number of people using EM apps as an alternative to detention, in the immigration context alone nearly 100,000 people are on EM through the BI SmartLINK app today, up from just over 12,000 in 2018. This widespread use underscores the need for the public to understand these applications and the information they collect, maintain, and share.
The study’s technical analysis, the first of its kind for these types of applications, identified several categories of problems across the 16 apps surveyed. These include privacy issues with the permissions the apps request (and often require), concerns about the third-party libraries and trackers they embed, questions about whom they send data to and how, as well as basic usability issues and app crashes.
App Permissions

When an app wants to collect data from your phone, for example by taking a picture with your camera or capturing your GPS location, it must first ask for your permission to interact with that part of your device. For this reason, knowing which permissions an app requests gives a good idea of what data it can collect. And while denying unnecessary permission requests is a great way to protect your personal data, people under EM orders often don’t have that luxury: some EM apps simply won’t work until all permissions have been granted.
Perhaps unsurprisingly, nearly all of the apps in the study ask for permissions like GPS location, camera, and microphone access, which are likely used for various check-ins with a person’s supervising agency. But some apps request more exotic permissions. Two of the apps examined request access to the phone’s contact list, which the authors note can be combined with the “read phone state” permission to monitor whom someone is talking to and how often. And three other apps request the “activity recognition” permission, which can determine whether the user is in a vehicle, on a bicycle, running, or standing still.
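To make the permissions discussion concrete, here is a minimal Python sketch of the kind of audit described above: given the permissions an app declares, separate those plausibly needed for monitoring check-ins from those that go beyond it. The permission strings are real Android constants, but the two groupings and the example app are our own illustration, not the study’s taxonomy.

```python
# Permissions plausibly tied to EM check-ins (location, camera, mic).
EXPECTED_FOR_EM = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
}

# Permissions that seem to go beyond typical monitoring needs.
BEYOND_EM_NEEDS = {
    "android.permission.READ_CONTACTS",
    "android.permission.READ_PHONE_STATE",
    "android.permission.ACTIVITY_RECOGNITION",
}

def audit_permissions(requested):
    """Split an app's requested permissions into expected vs. excessive."""
    requested = set(requested)
    return {
        "expected": sorted(requested & EXPECTED_FOR_EM),
        "excessive": sorted(requested & BEYOND_EM_NEEDS),
        "other": sorted(requested - EXPECTED_FOR_EM - BEYOND_EM_NEEDS),
    }

# Hypothetical EM app manifest permissions.
report = audit_permissions([
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.CAMERA",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_PHONE_STATE",
])
print(report["excessive"])
# → ['android.permission.READ_CONTACTS', 'android.permission.READ_PHONE_STATE']
```

In a real audit the requested list would come from the app’s manifest (e.g. via `aapt dump permissions`), but the flagging logic is the same.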
Third-Party Libraries and Trackers
Application developers almost never write every line of code that goes into their programs; instead, they rely on so-called “libraries” of software written by third-party developers. The mere presence of third-party libraries in an app is hardly a red flag in and of itself. However, because some libraries are written to collect and upload user tracking data, their presence in an application can signal an intent to track, and even monetize, user data.
The study found that nearly every app uses a Google Analytics library of some kind. As EFF has previously argued, Google Analytics might not be particularly invasive if it were used in only one app, but combined with its use almost everywhere across the web, it gives Google a comprehensive view of individuals’ online behavior. Worse yet, the app Sprokit “appears to contain the code needed for the Google AdMob and Facebook Ads SDK to serve ads.” If that is indeed the case, Sprokit’s developers are engaged in the appalling practice of monetizing their captive audience.
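As a rough illustration of how such trackers can be detected statically, the sketch below scans an app’s bundled class names for known tracker SDK namespaces, a simplified version of what tools like Exodus Privacy do. The package prefixes are real SDK namespaces; the example class list is hypothetical.

```python
# Map of tracker SDK package prefixes to human-readable names.
TRACKER_PREFIXES = {
    "com.google.android.gms.analytics": "Google Analytics",
    "com.google.firebase.analytics": "Firebase Analytics",
    "com.google.android.gms.ads": "Google AdMob",
    "com.facebook.ads": "Facebook Audience Network",
}

def find_trackers(class_names):
    """Return the tracker SDKs whose namespaces appear among an app's classes."""
    found = set()
    for name in class_names:
        for prefix, tracker in TRACKER_PREFIXES.items():
            if name.startswith(prefix):
                found.add(tracker)
    return found

# Hypothetical class list extracted from an app's DEX files.
classes = [
    "com.example.emapp.CheckInActivity",
    "com.google.firebase.analytics.FirebaseAnalytics",
    "com.facebook.ads.AdView",
]
print(sorted(find_trackers(classes)))
# → ['Facebook Audience Network', 'Firebase Analytics']
```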
Network Traffic

The study aimed to learn what network traffic these applications send during normal operation, but it was limited by the lack of active accounts for any of the apps (either because the researchers were unable to create their own or declined to do so to avoid agreeing to the terms of service). Even so, by installing software that let them intercept and inspect the apps’ connections, the researchers were able to draw some troubling conclusions about several of the studied apps.
Almost half of the applications made requests to web domains that could be uniquely associated with the app. This matters because even though these web requests are encrypted, the domain they are directed to is not, which means that whoever controls the network a user is on (a coffee shop, airport, school, employer, Airbnb host, etc.) could theoretically determine that the person is under EM. One app we have already mentioned, Sprokit, was particularly egregious in how often it sent data: every five minutes it phoned home to a Facebook ad-network endpoint with a host of data points collected from the phone’s sensors and other sensitive sources.
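The domain-leakage point can be sketched in a few lines: even over HTTPS, an on-path observer sees destination hostnames via DNS lookups and the TLS Server Name Indication (SNI) field, so a hostname unique to one EM vendor reveals the app’s presence without decrypting anything. The vendor domain and mapping below are hypothetical.

```python
# Hostnames that (hypothetically) belong to exactly one EM vendor.
APP_SPECIFIC_DOMAINS = {
    "api.example-em-vendor.com": "ExampleEM",  # hypothetical vendor domain
}
# Hostnames shared by many apps reveal little on their own.
SHARED_DOMAINS = {"graph.facebook.com", "www.google-analytics.com"}

def infer_apps(observed_hostnames):
    """What a network observer could infer from hostnames alone (no decryption)."""
    return {
        APP_SPECIFIC_DOMAINS[h]
        for h in observed_hostnames
        if h in APP_SPECIFIC_DOMAINS
    }

seen = ["graph.facebook.com", "api.example-em-vendor.com"]
print(infer_apps(seen))
# → {'ExampleEM'}
```

This is why an app that talks only to shared ad and analytics endpoints leaks less about its identity than one with its own vendor domain, even though both may be sending sensitive data.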
It should be noted that due to study limitations, this is not a comprehensive picture of the behavior of each EM application. There are still a number of important open questions about what data they send and how they send it.
Application Errors and Technical Issues
As with any software, EM applications are prone to bugs. But unlike with other applications, if someone under EM has problems with their app, they risk violating the terms of their court order, which could result in disciplinary action or even incarceration. People subjected to ankle monitors face similar problems.
To examine how bugs and other issues with EM apps affect the people forced to use them, the researchers conducted a qualitative analysis of the apps’ Google Play Store reviews. These reviews were overwhelmingly negative. Several users reported that they were unable to successfully check in through the app, sometimes due to GPS or face-recognition errors, and other times because they never received the check-in notification. One user describes such an issue in their review: “I was having trouble checking in and not alerting my phone which prompted my probation officer to call and threaten to file an arrest warrant because I missed the check-in, which is very frustrating and annoying.”
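A qualitative review analysis of this kind can be approximated by coding each review against a set of failure categories. The sketch below tags reviews by keyword; the categories and keywords are our own illustration, not the study’s actual codebook.

```python
# Failure categories and the keywords that (crudely) indicate them.
CATEGORIES = {
    "check-in failure": ["check in", "check-in", "checking in"],
    "GPS error": ["gps", "location"],
    "face recognition error": ["face", "facial"],
    "missed notification": ["notification", "alert"],
}

def tag_review(text):
    """Return the failure categories whose keywords appear in a review."""
    lowered = text.lower()
    return sorted(
        category
        for category, keywords in CATEGORIES.items()
        if any(kw in lowered for kw in keywords)
    )

review = ("I was having trouble checking in and it wasn't alerting my "
          "phone, so my probation officer threatened an arrest warrant.")
print(tag_review(review))
# → ['check-in failure', 'missed notification']
```

Real qualitative coding is done by human reviewers, not keyword matching, but a tally like this shows how recurring complaints can be grouped and counted.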
The Legal Landscape

The study also examined the legal context in which challenges to EM applications might arise. Ultimately, legal challenges to EM apps are likely to be difficult because, although the Fourth Amendment’s touchstone for searches and seizures is “reasonableness,” courts have long held that probationers and parolees have diminished expectations of privacy, which are weighed against the government’s interests in preventing recidivism and reintegrating former prisoners into society.
Furthermore, the government would likely be able to sidestep Fourth Amendment challenges by claiming that the person consented to the EM application. But as we have argued in other contexts, so-called “consent searches” are a legal fiction. They often occur in highly coercive settings, such as a traffic stop or a home inspection, and leave little room for the average person to feel comfortable saying no. Likewise, here, the “choice” to submit to an EM application is hardly a choice at all, especially when incarceration is the alternative.
This study is the first comprehensive analysis of the EM app ecosystem, and it lays a critical foundation for public understanding of these applications and their harms. It also raises additional questions that EM app developers and the government agencies that contract with them must answer, including:
- Why EM apps ask for dangerous permissions that seem unrelated to typical electronic monitoring needs, such as access to phone contacts or detailed phone state information
- What developers of EM apps that lack privacy policies do with the data they collect
- What protections people under EM have against unlawful searches of their personal data, whether by law enforcement or by data brokers and advertisers buying that data
- What additional information would be revealed if researchers could test these EM apps with active accounts
- What information about the technical capabilities of EM applications is actually provided to the government agencies that contract with EM vendors and to the people placed on these apps
People who are forced to use EM applications deserve answers to these questions, and so does the general public, as electronic surveillance is increasingly adopted in our criminal and civil legal systems.