The university did not respond to repeated questions on whether CMU currently uses facial recognition technology. Photo via Adobe Stock. (Photo illustration by Natasha Vicens/PublicSource)

Update (07/15/22): Due to community feedback on the draft, Carnegie Mellon University has “decided not to move forward with further consideration of this policy document” on video surveillance, the university stated on its website. CMU has a “rigorous” policy development process, and revisions are common and expected as feedback is gathered, CMU stated.

The statement noted that the CMU police department “has never deployed facial recognition tools on our campus and has no plans to do so.”

Reported 7/6/22: Carnegie Mellon University has drafted a video surveillance policy that would allow its police to use facial recognition technology during criminal investigations.

The provision regarding facial recognition technology, only one sentence long, does not specify the type of technology CMU would use or describe how the university would employ facial recognition in a criminal investigation.

Peter Kerwin, director of media relations at CMU, said the policy has not been finalized and is still in its internal review stage. He said the university has begun a process to ensure that its community members can provide feedback on the policy before it’s finalized, which typically includes a 30-day comment period through the policy website.

The draft policy, obtained by PublicSource, is intended to guide CMU’s use of security cameras and video security systems as well as the ways the university retains or releases what the cameras record. These tools can help the university detect, prevent and investigate crimes and threats to public safety, the policy states.

The university did not respond to repeated questions on whether CMU currently uses facial recognition technology, how the university intends to use the technology and what technology the university plans to use.

Researchers and civil liberties advocates have argued that facial recognition technology threatens privacy, normalizes surveillance and poses disproportionate risks to people of color and other marginalized identities. A university’s use of this technology, some told PublicSource, could stifle student expression and engagement on campus.

“It really is inconsistent with concepts of academic freedom and intellectual curiosity to use a technology that can track students everywhere they go,” said Chad Marlow, senior policy counsel at the ACLU, who has focused on issues related to privacy, technology and surveillance. “It’s very antithetical to a really robust educational experience.”

An excerpt of the draft policy. The provision regarding facial recognition technology is only one sentence long.

The draft states that CMU’s use of security cameras and video security systems must not “unduly constrain” the campus community’s civil liberties, academic freedom, personal privacy or freedom of expression and assembly. All students, faculty and staff need to know about the policy, the draft reads.

The university police department managed 615 cameras as of February 2022.

Campus surveillance

In Pittsburgh, the Department of Public Safety can’t use or acquire facial recognition technology without receiving approval from city council, with temporary exceptions granted in emergencies. But that legislation, passed in fall 2020, only applies to city entities and does not cover JNET, a system with facial recognition capabilities that all law enforcement agencies in Pennsylvania can use.

The number of colleges and universities that use facial recognition technology is unclear.

Fight for the Future, a national nonprofit that advocates for digital rights, says it asked prominent colleges and universities about their plans to use facial recognition technology. More than 70 — including CMU — are included on a list from the organization as having stated that they won’t use the technology, while more than 40 are listed as either using or possibly using it.

Shobita Parthasarathy, director of the Science, Technology and Public Policy program at the University of Michigan, co-authored a 2020 report that found facial recognition technology in K-12 schools is likely to disproportionately surveil marginalized students, collect student data without consent and more narrowly define acceptable student behavior, among other implications.

The authors recommended that the technology be banned in K-12 schools. Parthasarathy doesn’t believe universities should use the technology, either.

“What these universities promise to their students is a place where they can explore new horizons, test their independence, figure out who they are, exercise their rights and freedoms,” she said. “That could be at odds with the adoption of these kinds of surveillance technologies.”

The Purnell Center for the Arts on Carnegie Mellon University’s campus. (Photo by Clare Sheedy/PublicSource)

In an era in which educational settings can turn into targets, though, limited use of the technology has won favor among some academics.

Following the shooting at Robb Elementary School in Uvalde, Texas, CMU’s Biometrics Center held an online summit in which guest panelists discussed how technology could help improve school safety. Dr. Marios Savvides, the center’s director and founder, said at the panel’s conclusion that facial recognition technology could complement artificial intelligence-based weapons detection.

“Even facial recognition, if it’s allowed, can be useful. Somebody who’s been detected holding a gun can then be tracked through the corridors to give real-time alerts, where that person holding a gun is running towards,” Savvides said. 

“We can use AI in an ethical fashion, and use AI for good.”

The Pennsylvania Chiefs of Police Association has accredited the CMU police department since 2007. The association requires accredited agencies to have a written directive providing guidance to officers on whether, and how, they can use facial recognition technology. The association does not dictate the type of technology an agency uses or collect and maintain copies of the written directives.

A match through facial recognition technology gives police a starting point in an investigation, but it may not always be accurate, said James Adams, accreditation program coordinator for the association. Police have to be “very careful” if a match found through this technology is the only piece of evidence connecting a person to an investigation, he said.

“What that hit should mean is, you have a person that you should take a look at as a possible suspect and develop corroborating or independent evidence above and beyond,” Adams said.

CMU’s use of facial recognition technology, if implemented during its three-year accreditation cycle, would be noted in annual reports police agencies must provide the association, he said. He declined to provide any annual reports but said CMU “truly” embraces accreditation.

Marlow of the ACLU said that, as a research university, CMU is well-positioned to understand that facial recognition technology poses risks to people of color and presents privacy concerns for students and people who live nearby.

An earlier national analysis of facial recognition software found that algorithms more often misidentified Black women in a type of matching that can be used in law enforcement, a difference the researchers said could lead to false accusations. Even with perfect results, opponents believe law enforcement’s use of the technology could still perpetuate racist outcomes.

“Why would an enlightened university want to make life on or around their campus more dangerous for students of color?” Marlow said.

Clarity and trust

Facial recognition technology is evolving, and while there have been “horror stories,” there have also been successes, Adams said. He attributed most of the harmful use cases to police rushing to judgment based on a match and said he strongly believes the technology shouldn’t be tossed out or ignored on that account.

“I think, right now, it’s more important that there’s sound policy by an agency if they’re going to utilize facial recognition and give their officers guidance as far as how it’s used,” he said.

A university’s use of facial recognition technology raises questions about how it will manage and ensure the security of the collected student data as well as what procedures it will put in place for redress if there’s an incorrect match, Parthasarathy said.

The draft does not include information explicitly on how CMU would manage data collected through facial recognition technology but states that the university must:

  • Store video images in a secure location
  • Store video images for between 32 and 90 days, unless needed for legal reasons or other approved uses
  • Provide access to camera views and video images on the Pittsburgh campus only to CMU police and those granted access by the police chief
  • Prohibit personnel from accessing, using and sharing collected information except for university purposes, including law enforcement investigations
  • Prohibit the release of recorded images without the authorization of the university’s police chief and vice president for operations, in consultation with CMU’s general counsel.

The draft also states that CMU cannot capture audio through security cameras or video security systems or install them in places where there’s a reasonable expectation of privacy, such as bathrooms and dorm rooms. Security cameras will not be monitored constantly “under normal operating conditions,” and campus police will place signs in certain locations to inform viewers that the area may be under surveillance, the draft reads.

Those who violate the policy may face disciplinary action, according to the draft.

Carnegie Mellon University’s campus in Oakland. (Photo by Clare Sheedy/PublicSource)

If a university does adopt facial recognition technology, Parthasarathy said it must include community members in decisions about its procurement and implementation. There needs to be “a lot of clarity” about how the technology will be used and how the university will protect the rights of its community, she said.

“I still don’t think it’s necessary,” said Molly Kleinman, managing director of the University of Michigan’s Science, Technology and Public Policy program and co-author of the report on facial recognition in K-12 schools. “But if the university is going to use this tool, they should be responsible about it.”

According to the draft policy, CMU’s chief information officer would meet at least annually with an advisory group of “alumni, academic and administrative leadership representation” to provide guidance on issues relating to the policy’s implementation and enforcement. The group would review the policy at least once every three years.

The university police department would also conduct an annual audit summarizing the department’s compliance with the policy, the recorded video images that were accessed for university purposes and those that were released outside the university, according to the draft.

Marlow said CMU should prohibit the use of facial recognition technology entirely. What’s at stake, he said, is whether people feel comfortable and safe being on or around the university’s campus.

Students’ trust in their university is at stake, too, Kleinman said.

“It risks breaching the trust between students and the institution and alienating them from the institution in some ways that I think university leaders probably aren’t considering right now.”

This story was fact-checked by Charlie Wolfson.

Emma Folts covers higher education at PublicSource, in partnership with Open Campus. She can be reached at
