
‘A lot safer place:’ Calgary university uses AI to monitor threats

The technology alerts officials when anything out of the ordinary occurs
Calgary’s Mount Royal University is turning to artificial intelligence technology to make the school safer and allow security officers to catch criminals in the act. The iCetana technology alerts officials on a bank of video screens, shown in a handout photo, when anything out of the ordinary occurs in the normal flow of university hallways and outside the facility. (THE CANADIAN PRESS/HO-Mount Royal University)

A post-secondary school in southern Alberta is turning to artificial intelligence to make its buildings safer and to allow security officers to catch criminals in the act.

The technology at Mount Royal University in Calgary alerts officials when anything out of the ordinary occurs in the normal flow of university hallways or on its grounds.

Developed in Australia, the iCetana system breaks everything down to pixels and “learns” the movement patterns of people, equipment and vehicles across the campus over a 14-day period. It comes to recognize shapes, sizes and movement — but not people or specific objects.

“What it will do if people normally walk in this hallway, and now they’re running, it’s going to … flash it on the screen,” explains Grant Sommerfeld, the university’s associate vice-president of facilities management.

“All the system does essentially is track the movement of pixels. It could be a gun. It could be a backpack, a selfie stick. It’s not going to automatically know that’s a gun. It doesn’t look for images.”
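In rough terms, the approach Sommerfeld describes can be sketched in code: learn what typical pixel movement looks like over a baseline period, then flag frames whose movement deviates from that pattern. The sketch below is a minimal illustration of that idea only, not iCetana’s proprietary algorithm; the camera source, grid size, baseline length and threshold are assumptions for the example.

```python
# Minimal sketch of pixel-motion anomaly detection: learn "normal" motion
# statistics per region of the frame, then flag unusual deviations.
# This is illustrative only, not iCetana's actual method.
import cv2
import numpy as np

GRID = 8           # split each frame into an 8x8 grid of regions
Z_THRESHOLD = 4.0  # how unusual a region's motion must be to raise an alert

def motion_per_region(prev_gray, gray):
    """Mean absolute pixel change in each grid cell between two frames."""
    diff = cv2.absdiff(prev_gray, gray).astype(np.float32)
    h, w = diff.shape
    cells = diff[: h // GRID * GRID, : w // GRID * GRID]
    cells = cells.reshape(GRID, h // GRID, GRID, w // GRID)
    return cells.mean(axis=(1, 3))  # one motion value per grid cell

def main(source=0, baseline_frames=500):
    cap = cv2.VideoCapture(source)
    ok, frame = cap.read()
    if not ok:
        raise SystemExit("camera not available")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # "Learning" phase: collect per-region motion statistics.
    samples = []
    for _ in range(baseline_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        samples.append(motion_per_region(prev, gray))
        prev = gray
    mean = np.mean(samples, axis=0)
    std = np.std(samples, axis=0) + 1e-6

    # Monitoring phase: flag regions whose motion deviates from the baseline.
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        z = (motion_per_region(prev, gray) - mean) / std
        prev = gray
        if (np.abs(z) > Z_THRESHOLD).any():
            print("Unusual movement detected; route this feed to an operator.")

if __name__ == "__main__":
    main()
```

Note that, as in the university’s system, nothing in this sketch identifies people or objects; it only reacts to movement that does not match the learned pattern.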


Sommerfeld says security officials used to watch an outdated bank of video screens, which would cycle through cameras on campus.

“Security staff faced with a wall of monitors and a full shift could almost get hypnotized by it and essentially it just becomes like wallpaper.”

New, high-resolution, 360-degree cameras that have been installed across campus catch details the old ones would have missed. And the screens are black unless something out of place has been detected.

“When there’s a change in the movement of the pixels on these screens, it flashes up in security with 15 seconds of what was happening before the pattern changed. It fast-forwards … and presents a real-time image,” says Sommerfeld.

“The human sitting at the desk decides we’ve got to go and investigate or decides, no, that’s fine.”
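The “15 seconds of what was happening before” behaviour amounts to keeping a rolling pre-roll buffer for each camera and handing it to the operator when an alert fires. The sketch below shows that mechanism in general terms; the frame rate and alert hook are assumptions, not details of Mount Royal’s deployment.

```python
# Minimal sketch of a pre-roll buffer: keep the most recent frames so an
# alert can replay what led up to it before switching to the live feed.
from collections import deque

FPS = 15
PRE_ROLL_SECONDS = 15

class PreRollBuffer:
    """Holds the last few seconds of frames for one camera."""
    def __init__(self):
        self.frames = deque(maxlen=FPS * PRE_ROLL_SECONDS)

    def push(self, frame):
        # Called for every frame; old frames fall out automatically.
        self.frames.append(frame)

    def on_alert(self):
        # Return the pre-roll clip for the operator, who then watches the
        # live feed and decides whether to send someone to investigate.
        return list(self.frames)
```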

Sommerfeld says the system also recognizes something as simple as a backpack left in an empty hallway.

“It may be somebody forgot a backpack or it could be more sinister than that. It just knows that something is not normal in the movement on that hallway.”

Sommerfeld says security staff can intervene right away if there is a problem, unlike previously when video was used as evidence after the fact.

“It allows us to become much more proactive in terms of intervening and stopping crimes or bad behaviour,” he says. “I think it makes the school a lot safer place.”

Other artificial intelligence systems are being developed, especially in the aftermath of school shootings.

A computer software company out of Philadelphia has developed its own AI threat detection. It identifies weapons and sends images of the firearms and the shooter to law enforcement and school officials.

“We’re not in Canada right now,” says ZeroEyes spokesman Rob Huberty, who adds the technology is still part of pilot projects.

Huberty says the software focuses on active shooter situations and uses video systems already in place.

“We basically take all the weapons that have been used in school shootings as our models and we constantly add to our database,” Huberty says.

“A lot of people in school shootings expose their weapons fairly early, a lot of times in the parking lot, so we can let everybody know before any shot is fired.”
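The pipeline Huberty describes can be outlined as: read frames from cameras already in place, run a detector trained on firearm imagery, and push an alert with the captured image when a weapon is spotted. ZeroEyes’ actual models and alerting channels are not public, so the detector and notification functions below are hypothetical stand-ins for illustration only.

```python
# Minimal sketch of weapon-detection alerting over an existing camera feed.
# detect_weapons() and notify_officials() are placeholder stand-ins, not
# ZeroEyes' software.
import cv2

CONFIDENCE_THRESHOLD = 0.8

def detect_weapons(frame):
    """Placeholder: a real system would run a model trained on firearm
    imagery here and return (label, confidence, bounding_box) tuples."""
    return []

def notify_officials(frame, detections):
    """Placeholder for sending the captured frame to police and school staff."""
    cv2.imwrite("alert_frame.jpg", frame)
    print(f"ALERT: {len(detections)} possible weapon(s) detected")

def monitor(source=0):
    cap = cv2.VideoCapture(source)  # reuse the video system already in place
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        detections = [d for d in detect_weapons(frame)
                      if d[1] >= CONFIDENCE_THRESHOLD]
        if detections:
            notify_officials(frame, detections)

if __name__ == "__main__":
    monitor()
```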

Bill Graveland, The Canadian Press
