Researchers at the University of Iowa have invented a way to autonomously detect on-camera violence that could someday flag or even prevent the abuse of seniors and other vulnerable people.
In a patent application published April 8, the system is described as using motion-tracking and deep-learning methods to identify possible abuse on surveillance cameras. "The alert doesn't go directly to a police station," Karim Abdel-Malek, a professor of biomedical engineering at the University of Iowa and a lead inventor of the technology, told The Academic Times. "It goes to maybe a supervisor, and they actually get a picture of what's going on, so they can go review the video and see what happened."
For the past 15 years, Abdel-Malek and his colleagues have been working on virtual soldiers named Santos and Sophia — physics-based human simulators used by the U.S. military and private companies to study body movement. From that project, the team developed a camera capable of tracking human motion in 3D.
"If you've seen the movie 'Avatar' in 3D, they put markers on the body with reflectors and then shine infrared light to capture the motion," he explained. "Then they use that motion to replicate onto the character. What we did is [allow] a single camera — any webcam or security camera — to do the same thing. That was the first invention."
The technology spun off in several different directions, including a security company, Malum Terminus Technologies, Inc., which hosts a platform for autonomously monitoring live surveillance feeds for drawn weapons, slipping hazards, people who have fallen and other safety risks. "For example, the University of Iowa has 3,000 cameras all over [campus]," Abdel-Malek said. "What we do is tap into these cameras through the servers and run the program, which is able to say, 'Someone with a gun is approaching the school.'"
With the ability to measure human movement, including the speed of arm and leg motions, the team decided to use its invention to detect when someone is struck. To be flagged by the system, a person would have to raise their arm above a certain level and strike someone with a velocity and acceleration above specific thresholds.
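The flagging logic described above can be sketched as a simple rule over tracked wrist positions. This is a minimal illustration, not the patented method: the threshold values, the `WristSample` structure and the finite-difference estimates are all hypothetical, since the application does not publish its actual parameters.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only; the real system's
# values and features are not disclosed in the patent application.
VELOCITY_THRESHOLD = 4.0       # m/s
ACCELERATION_THRESHOLD = 30.0  # m/s^2

@dataclass
class WristSample:
    t: float      # timestamp in seconds
    y_rel: float  # wrist height relative to the shoulder, meters (up = positive)
    x: float      # horizontal wrist position, meters

def flag_strike(samples: list[WristSample]) -> bool:
    """Flag a possible strike: the wrist must rise above the shoulder,
    and the hand must exceed both velocity and acceleration thresholds."""
    if len(samples) < 3:
        return False
    # Condition 1: arm raised above a certain level (here, above the shoulder).
    if not any(s.y_rel > 0.0 for s in samples):
        return False
    # Finite-difference speed estimates between consecutive frames.
    vels = []
    for a, b in zip(samples, samples[1:]):
        dt = b.t - a.t
        dist = ((b.x - a.x) ** 2 + (b.y_rel - a.y_rel) ** 2) ** 0.5
        vels.append(dist / dt)
    # Acceleration magnitude from consecutive speed estimates.
    accs = [abs(v2 - v1) / (s2.t - s1.t)
            for v1, v2, s1, s2 in zip(vels, vels[1:], samples, samples[2:])]
    # Condition 2: both velocity and acceleration thresholds exceeded.
    return max(vels) > VELOCITY_THRESHOLD and max(accs) > ACCELERATION_THRESHOLD
```

A fast downward swing sampled at 20 frames per second would trip both thresholds, while ordinary reaching or waving motions would stay below them; in practice the pose estimates would come from the team's single-camera 3D motion tracker rather than being supplied directly.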
Currently, the system requires one camera and a computer with enough processing power to run deep learning algorithms. "What we do is use just a regular camera and send the signal back to be analyzed on a regular computer that has a little more power," said Abdel-Malek.
"Cameras in general are just all over, they're just ubiquitous," he continued. "It's cheap and available: You don't have to change the camera, just run a program in the background. Cameras today all lead to a server, all internet-connected, so you can tap into a camera just about anywhere and run the software."
The technology is designed to work in real time, capturing physical abuse as it happens, and could be used in a variety of public spaces with security cameras, such as schools. But the researchers believe its primary purpose could be detecting and dissuading violence in nursing homes and senior care facilities that have security cameras in shared living spaces.
"Part of why that's such a unique application is that older folks often don't report being abused for fear of their care, their life," Abdel-Malek said. "Right now, cameras capture [the abuse] but it doesn't get noticed. Nobody goes back to review what happened if they don't know something has happened."
Abdel-Malek concedes that there's a potentially dystopian element to the technology from the standpoint of people concerned with widespread video surveillance, or more specifically, with being misidentified as an abuser by a machine-based system. "That concern is real, but technology has to advance and be used for good," he said. "It's like saying, 'Don't write the computer program because somebody will be able to hack it' — true, people have hacked banks and the Pentagon. You can always use technology in a bad way. Our job is to advance the technology and make it do good things for people."
In any case, the new violence-detection technology has just passed the proof-of-concept stage, and it could be up to a year before it's rolled out as a product, Abdel-Malek said. He hopes that it will eventually be used to protect elderly people before they're hurt: "We would rather that nobody does physical abuse and that the cameras would be there to say, 'Don't even think about it.'"
The application for the patent, "System and method for the autonomous identification of physical abuse," was filed Oct. 2, 2020, with the U.S. Patent and Trademark Office. It was published April 8, 2021, with the application number 17/061980. The inventors of the pending patent are Karim Abdel-Malek, Kimberly Farrell, Rajan M. Bhatt, Jasbir S. Arora, Kevin C. Kregel, Landon C. Evans, Robert B. Wallace and Daniel L. Clay. The assignee is the University of Iowa Research Foundation.
Parola Analytics provided technical research for this story.