CBP’s body-worn camera rules collide with consumer AI glasses

When a Customs and Border Protection (CBP) officer was filmed wearing Ray-Ban Meta smart glasses during a Los Angeles immigration enforcement action in June, it raised the obvious question of whether consumer AI eyewear is creeping into federal policing, and into Department of Homeland Security (DHS) biometric ID pipelines.
The officer’s wearing of the glasses appears to have violated CBP regulations, which bar the use of both approved and personal recording devices for the “purpose of capturing individuals who are engaged in activity protected by the First Amendment” absent reasonable suspicion that the situation is likely to become an enforcement action. The person who filmed the CBP officer does not appear, from the video, to have been the subject of an enforcement action.
Neither CBP nor DHS officials have responded to media requests for comment.
DHS sources told Biometric Update on condition of anonymity that there is no official policy specifically for the use of Ray-Ban Meta smart glasses, and that existing regulations do not allow for the personal use of a recording device when on duty.
“This is just another heartburn in a string of heartburns for them,” one of the sources said, adding, “I would suspect the guy got his ass reamed – not for what he did, but for getting caught. It’s not a good look.”
The incident falls within a broader pattern of increasing adoption of surveillance technologies by CBP under the Trump administration that includes facial recognition, access to biometric and other databases, AI-enabled detection systems, and even technologies like small drones or sensors capable of “seeing through walls.”
CBP’s operative policy in this matter is Directive 4320-030B, which was updated in May and has been in effect since August 2021. It defines “Incident-Driven Video Recording Systems” (IDVRS) as CBP-owned cameras that are vehicle-mounted, non-integrated vessel systems, and body-worn units.
The policy draws a bright line around the use of personally owned devices, explicitly stating that “cellphones are not included in the definition of IDVRS and should not be used as a primary means to record enforcement actions.”
The general rules go even further, stating that on-duty personnel “will use only CBP-issued and approved IDVRS,” and “no personally owned devices may be used in lieu of IDVRS to record law enforcement encounters.”
Activation is also spelled out. Officers and agents “should record enforcement encounters at the start of the event or as soon as safely possible thereafter,” then “deactivate the IDVRS once their participation … has concluded.” Supervisors can require a statement if a camera wasn’t activated, and personnel are encouraged to advise those they encounter that they are being recorded when it won’t interfere with safety.
The prohibitions section is equally direct. CBP cameras are not to be used to film coworkers outside an enforcement encounter, to capture employee assessments outside training, to record privileged conversations, or in places with a reasonable expectation of privacy such as locker rooms or restrooms. And they are also not to be used for the “purpose of capturing individuals who are engaged in activity protected by the First Amendment.”
Retention and storage of data also is tightly controlled. Recorded data must be uploaded to a designated CBP-approved system and “shall not be downloaded or recorded for personal use or posted onto a personally owned device or website.”
Files that are tagged as non-evidentiary, evidentiary, or potentially evidentiary have different retention categories. Under the current published directive, non-evidentiary data is scheduled for retention of up to 90 days in alignment with the National Archives and Records Administration (NARA) guidance.
For potentially evidentiary material, CBP’s operational intent is a three-year retention, but this schedule remains under NARA review; until approval is granted, such material may be retained longer. Evidentiary data tied to a case file is preserved according to that case file’s records schedule, which can extend for decades.
A complementary Privacy Impact Assessment (PIA) describes the technical stack. Beginning in 2021, CBP connected thousands of Border Patrol body cameras to “a cloud-based digital evidence platform,” with role-based access controls and audit logs.
While internal program planning has referenced a shift from an earlier 180-day non-evidentiary target during the evaluation phase to shorter operational periods, the publicly posted directive reflects the current retention framework: up to 90 days for non-evidentiary data; a proposed three years for potentially evidentiary material, still under NARA review; and evidentiary material retained in accordance with the applicable case file’s records schedule, which can extend for decades and, in some cases, up to 75 years.
Those constraints matter for consumer AI eyewear. Meta’s Ray-Ban glasses have a camera, mics, live-streaming, and “Meta AI with vision,” but Meta says the product does not ship with built-in facial recognition.
Multiple outlets have nevertheless shown it’s trivial to route the glasses’ video to a phone or laptop and run third-party face-matching pipelines in near-real time. This capability exists in the wild, but it is not an approved CBP function.
Could Meta glasses be integrated with DHS biometric systems in real time?
Practically, that would mean serving as a capture front-end to CBP’s Traveler Verification Service (TVS), the cloud facial-comparison service that powers Simplified Arrival at airports, seaports, and some land environments.
Simplified Arrival is an enhanced international arrival process that uses facial biometrics to automate the manual document checks that are already required for admission into the United States.
TVS takes images from approved fixed or partner cameras, matches them against pre-staged galleries built from government holdings (passports, visas, Automated Biometric Identification System), and purges the transient match images from the TVS cloud within defined windows.
None of CBP’s TVS documentation contemplates officer-worn consumer eyewear as a capture device, and the agency’s IDVRS rules would block using personal devices as evidence systems even if the optics and security hurdles were solved.
To deploy glasses as a TVS capture source, CBP would need to procure and issue compliant hardware, accredit the software, update PIAs and System of Records Notices, and revise Directive 4320-030B and component Standard Operating Procedures. There is no indication that has happened.
The same governance posture appears in DHS’s department-wide body-camera policy, which directs components to operate within strict, agency-owned systems, to promulgate detailed component policies, and to align training, access controls, and retention with federal records law. That framework leans hard against ad-hoc recording on personal or consumer devices.
So, what explains the CBP officer wearing Ray-Ban Meta glasses in Los Angeles? The most benign possibility is that they were sunglasses worn as eyewear, with the camera inactive, which is still a bad look for an agency that must project respect for privacy at a time of unprecedented deployment into the nation’s interior.
A less benign scenario is off-policy recording. A middle-ground possibility is that the wearer was using the glasses strictly as a Bluetooth headset for hands-free calls or prompts to a phone-based assistant.
Still, any of those uses would collide with the spirit, and some with the letter, of the IDVRS rule that forbids “personally owned devices … in lieu of IDVRS to record law enforcement encounters” and the storage rules that bar posting footage to personal apps or sites.
If recording occurred, it would also create a retention, disclosure, and chain-of-custody mess because IDVRS footage must flow into CBP’s approved evidence platform with tagging, audit, and NARA-aligned deletion, not into a consumer cloud.
Could Meta’s glasses themselves perform biometric identification? Meta’s Ray‑Ban smart glasses do not include native facial recognition, a point the company reiterates. What makes misuse possible, however, is the livestream capability and the phone link that lets third parties tap the video stream and process it with external AI tools.
This is not theoretical. In late 2024, Harvard students AnhPhu Nguyen and Caine Ardayfio publicly demonstrated a system they called I‑XRAY, which livestreamed from the glasses (via Instagram) to a computer program. The system ran facial recognition using PimEyes, then pulled personal information (names, addresses, phone numbers) from public sources and delivered it back through a phone app in seconds.
Nguyen and Ardayfio said they built I‑XRAY not to deploy it, but to highlight how real-time doxxing is feasible using existing consumer tech.
Publicly documented demonstrations and technical discussions have shown that a real-time face-matching pipeline using Meta’s smart glasses can be built with off-the-shelf tools. The process begins with video capture through the glasses’ livestream feed, which is accessed on a connected phone or laptop.
From there, individual frames are extracted and run through face-detection software, such as RetinaFace or YOLO operating either on the device itself or a tethered computer.
Once a face is detected, frameworks like ArcFace or InsightFace generate an embedding, which can then be matched against a prebuilt gallery stored locally or sent to a cloud-based search service like PimEyes.
If a match is found, the identity can be enriched with open-source personal data that includes phone numbers, home addresses, and social media profiles before the results are sent back to the wearer.
That feedback may appear in an on-screen app or be spoken aloud through the glasses, often completing the entire process in as little as five seconds. The technology exists today; the glasses simply serve as an inconspicuous capture device.
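The gallery-matching step at the heart of such pipelines reduces to comparing embedding vectors by cosine similarity. The sketch below is illustrative only: the `match_face` function, the gallery names, the threshold value, and the tiny four-dimensional vectors are invented for the example; real pipelines would use 512-dimensional embeddings produced by a model such as ArcFace.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_face(probe, gallery, threshold=0.35):
    """Return (name, score) for the best gallery match, or (None, score)
    when the best score falls below the acceptance threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Hypothetical gallery with toy 4-dimensional embeddings.
gallery = {
    "person_a": [0.9, 0.1, 0.0, 0.1],
    "person_b": [0.0, 0.8, 0.5, 0.1],
}
name, score = match_face([0.88, 0.12, 0.02, 0.1], gallery)
# The probe is near-identical to person_a's embedding, so it matches.
```

The expensive steps — detection and embedding extraction — run per frame; the comparison itself is a cheap dot product, which is why the end-to-end loop can close in seconds.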
Meta tries to mitigate misuse with a built-in LED recording indicator that can’t be disabled, but in bright light or crowded scenarios it’s easy to miss. Worse, “stealth” sticker accessories have appeared online attempting to mask the LED, though tests show these often fail or trigger disablement warnings.
For a federal law-enforcement component bound by a specific evidence workflow and a biometric program with its own privacy rules, that plug-and-play path is a legal and policy non-starter.
Presently, there is no evidence DHS or CBP has bought Meta’s smart glasses or otherwise integrated Meta platforms into any biometric program. DHS records do not show a smart-glasses budget line item. If such a pivot were underway, there should be procurement solicitations, an authority-to-operate trail, and privacy documentation.
What does exist is a mature CBP body-camera program that lives in a fenced evidence ecosystem, and a separate, large-scale facial-comparison service that is designed for fixed, controlled capture points at the border. This architecture isn’t easily bridged by a pair of sunglasses, especially not someone’s personal pair.
The policy language is unambiguous, the retention rules are prescriptive, and the biometric stack is purpose-built. Until CBP rewrites its own rules and runs the procurement and privacy gauntlet, Meta’s glasses sit firmly outside the fence.
Article Topics
biometric matching | biometrics | CBP | consumer electronics | DHS | facial recognition | law enforcement | Meta glasses | real-time biometrics | smart glasses | wearables
