U.S. Senator Edward J. Markey’s long-running probe into Amazon’s Ring surveillance doorbell system reached a new boiling point this week with fresh findings that lay bare deep structural gaps in the company’s privacy protections.
The latest disclosures, released by Markey, indicate that Ring's new facial recognition feature, "Familiar Faces," launched this week despite what the Senator calls "reckless" failures to safeguard the biometric data of people who are unknowingly scanned.
The development comes at the height of the holiday delivery season. Markey warns that Amazon has effectively forced its vast network of delivery drivers to “surrender their biometric data” simply by doing their jobs.
In an October 31 letter, Markey urged Amazon to abandon its plans to add facial recognition technology to Ring doorbells, arguing that the feature would collect biometric information on anyone who appears in front of a camera, not just device owners.
Amazon’s response confirmed that Familiar Faces is designed to recognize and tag individuals who appear repeatedly in front of a user’s camera and then send personalized alerts such as “Emma at Front Door” instead of generic notifications.
The feature is optional and disabled by default, but once activated it scans all faces that come into view, regardless of whether those individuals have consented.
Markey’s office says Amazon’s answers to his questions reveal a stark divide between the protections available to Ring customers and the complete lack of rights for everyone else.
According to the Senator’s summary of the response, Ring’s privacy protections “only apply to device owners,” not visitors, neighbors, or passersby whose faces may be scanned and logged.
Device owners are told through in-app prompts that Familiar Faces uses biometric data and that they should comply with local consent laws, but Amazon itself does not provide any mechanism to secure that consent from non-users or even notify them that facial recognition is in use.
The company’s letter confirms that reference data for “familiar faces” is retained until the owner deletes it, while images of unnamed faces are removed after thirty days. Ring emphasizes that users “maintain control” over their tagged faces and can delete them at any time.
Non-owners who appear in someone else’s footage, however, must contact the individual device owner directly to seek deletion, because Ring maintains that owners are “best positioned to identify and manage specific videos in which individuals may appear.”
For delivery drivers who may encounter hundreds of Ring devices on a route, Markey argues, this structure effectively makes meaningful opt-out impossible.
“Despite my warnings,” he said, “Amazon unleashed a new privacy nightmare on the American people by releasing its Ring doorbell facial recognition feature without any meaningful privacy protections.”
He warned that the rollout is “a giant step toward a dystopian future where Americans cannot leave their homes without being tracked and surveilled,” and stressed that by releasing the feature during the holidays, Amazon is ensuring that delivery drivers, including its own, are repeatedly scanned with no realistic way to avoid the system.
The clash over Familiar Faces is the latest chapter in Markey’s six-year confrontation with Amazon over Ring. His 2019 investigation found that the company had partnered with over 400 police departments and maintained what he described as “egregiously lax” privacy and civil-rights protections.
Markey said that at the time, Ring had no evidentiary standards for law enforcement requests, no policies limiting how long police could retain shared footage, no clear restrictions on third-party sharing by departments, and no meaningful controls to keep users from capturing footage of children or activities beyond their property lines.
Amazon declined then to say whether it would integrate facial recognition into Ring products, an omission that Markey now points to as an early warning sign.
In 2022, Markey’s probe shifted to audio capture and law enforcement access through Ring’s Neighbors Public Safety Service (NPSS). The Senator disclosed that Ring had reported 2,161 law enforcement agencies on NPSS – more than a five-fold increase in partnerships since November 2019 – and confirmed that the company had provided user videos to police under an “emergency circumstance exception” at least eleven times that year, without obtaining the owner’s consent.
Markey also highlighted the company’s refusal to commit to disabling default audio recording, to make end-to-end encryption the default for users, or to rule out future integration of voice recognition technology.
Those earlier findings feed directly into the current fight over facial recognition. In his October letter, Markey warned that combining facial recognition technology with Ring’s existing surveillance infrastructure creates “a dramatic expansion of surveillance technology” that would collect biometric data on all individuals appearing in front of a Ring doorbell camera, including unwitting visitors and passersby.
He stressed that although Ring owners must opt in to activate Familiar Faces, that safeguard "does not extend to individuals who are unknowingly captured on video by a Ring doorbell camera," people who never receive notice or a chance to opt out of having their faces scanned and logged in a database.
Markey’s letter also situates Ring inside a broader ecosystem of government surveillance. Citing reports that Immigration and Customs Enforcement has rolled out facial recognition tools to officers’ phones, he noted it is “not hard to imagine immigration officials seeking access to Ring’s biometric data” for deportation targeting or other enforcement actions.
For Markey, this is precisely the kind of convergence between private-sector surveillance capabilities and powerful government agencies that opponents of facial recognition have been warning about for years.
Amazon’s response attempts to reassure critics on some fronts while leaving others unresolved. The company says Ring does not use customer biometric data to train machine learning models or improve algorithms, except in limited datasets where participants have expressly consented, such as beta trials.
Amazon said internal privacy and security teams conduct privacy impact assessments and that Ring carries out bias-mitigation testing across diverse demographic groups, including gender, ethnicity, age, and other variables.
But the company does not publish accuracy or bias statistics, leaving no public way to evaluate whether false matches disproportionately affect people of color, immigrants, or other vulnerable groups.
On data-sharing with police, Amazon directs Markey to its law enforcement guidelines and confirms that Ring does not share biometric data through NPSS, and that it has no mechanism for law enforcement to access live streams.
At the same time, Amazon conceded that NPSS has continued to grow. Ring now counts 2,723 police departments on the platform, alongside 626 fire departments, 43 animal services agencies, 27 agencies addressing homelessness, drug addiction or mental health, and 157 “other” entities such as local governments and community groups, an increase of nearly 600 police participants since Markey’s 2022 findings.
Ring’s own law enforcement guidelines confirm that the emergency disclosure pathway flagged in 2022 remains operational. The company “reserves the right to respond immediately to urgent law enforcement requests for information in cases involving imminent danger of death or serious physical injury,” and notes that such requests can be submitted through an emergency channel in Amazon’s law enforcement portal.
Markey argues that adding biometric identifiers to this ecosystem only heightens the stakes, particularly when non-users have no effective say over whether their data enters the system at all.
The Senator’s latest Ring findings also intersect with his legislative push.
Alongside Senator Jeff Merkley and Representatives Pramila Jayapal and Ayanna Pressley, Markey is sponsoring the Facial Recognition and Biometric Technology Moratorium Act. It would prohibit federal agencies from using biometric surveillance technology and tie certain federal grants to state and local moratoria on biometrics.
The bill’s moratorium would explicitly cover information derived from privately operated biometric systems, a direct response to the kind of public–private surveillance hybrids embodied by Ring’s integrations with law enforcement.
For its part, Amazon continues to describe Familiar Faces as “responsible innovation” meant to help customers reduce nuisance alerts and better recognize routine visitors.
But Markey insists the convenience benefit for doorbell owners is not worth the cost to everyone else. “Americans should not have to fear being tracked and recorded while visiting a friend’s home or walking past a neighbor’s house,” he said in October. “Amazon’s system forces non-consenting bystanders into a biometric database without their knowledge or consent. This is an unacceptable privacy violation.”