Ten Years After One of the Most Infamous Tragedies in US History, the Tech Sector Works to Prevent the Next One

December 15, 2022 - Baystreet.ca


On December 14, 2012, the United States suffered a horrific tragedy at Sandy Hook Elementary School in Connecticut. Ten years later, the nation still mourns the unthinkable horrors of gun violence against young children.

In an effort to prevent future atrocities like the one in 2012, the tech sector has been working diligently to bring personal security into the 21st century.

Technology leader Leidos Holdings (NYSE:LDOS), for example, just announced a major airport deal: the company has been selected by New-South Synergy to upgrade US Transportation Security Administration (TSA) checkpoints at Hartsfield-Jackson Atlanta International Airport (ATL).

Questionably effective practices, such as the 100ml liquid rule at airports, may soon be gone, and the TSA now allows travelers to enroll in TSA PreCheck, a program that breaks down to only $17 per year.

In their place, the future of security looks set to include the kind of futuristic tools predicted in years past, including many new forms of security robots.

Security robots, in particular, are autonomous robotic systems designed to perform their tasks without human intervention. They're already being deployed in public spaces such as casinos, parking lots, airports, banks, hotels, and public parks.

According to Markets and Markets, the Global Security Robots market is currently estimated at US$31.7 billion and is projected to reach US$71.8 billion by 2027, growing at an exceptional CAGR of 17.8% from 2022 to 2027.
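As a quick sanity check of those figures (assuming a 2022 base year and simple annual compounding over five years), the implied growth rate works out to (71.8 / 31.7)^(1/5) − 1 ≈ 0.178, or roughly 17.8% per year, consistent with the reported CAGR.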

And a new 2023 US Public Safety Trends Report released by Mark43 found that members of public safety organizations believe leveraging technology and data analysis is critical to addressing crime and violence, while continuing to strengthen community engagement.

“We heard from first responders across the country and to best serve their communities, they said they need access to modernized systems to help increase efficiency and decrease the number of hours lost to their daily responsibility of handling reporting and administrative tasks,” said Matt Polega, Co-founder and Head of External Affairs, Mark43, which works with over 200 agencies across the U.S. and U.K., including Boston, DC, Seattle, San Antonio, Atlanta and Cumbria (U.K.). “That time could be better spent on-site and in the community. The 2023 U.S. Public Safety Trends Report shows that technology plays a central role in everything a public safety agency does, and by using Mark43 technology, police departments can improve the safety and quality of life for all.”

Beyond robots, other public security needs are being addressed with AI, including artificial intelligence-based gun detection software in surveillance cameras, deployed as part of an effort to curb gun-related crimes on SEPTA transit property in Pennsylvania.

“We’re dealing with, unfortunately, around the country, issues related to gun violence, and while those incidents are very rare on SEPTA, even one is too many,” Andrew Busch, director of media relations for SEPTA, told The Daily Pennsylvanian.

At its November monthly meeting, SEPTA approved a six-month pilot program of the technology and a $63,000 budget for its implementation. The pilot will cover 300 of the 30,000 live cameras currently installed in SEPTA stations.
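For illustration only, here is a minimal sketch of how a frame-by-frame detection pipeline of this kind might be wired together. The detector model, scoring logic, confidence threshold, and alert step are hypothetical placeholders and do not represent SEPTA's or any vendor's actual software.

```python
# Illustrative sketch only: a generic frame-by-frame detection loop.
# The model file, scoring logic, and alert step below are hypothetical
# placeholders, not SEPTA's or any vendor's actual system.
import cv2  # OpenCV: video capture plus a generic DNN inference module

CONFIDENCE_THRESHOLD = 0.85  # assumed value; real deployments tune this to limit false alarms


def frames(stream_url):
    """Yield frames from a camera stream until it closes."""
    cap = cv2.VideoCapture(stream_url)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        yield frame
    cap.release()


def firearm_score(model, frame):
    """Run the detector on one frame and return a single confidence score (simplified)."""
    blob = cv2.dnn.blobFromImage(frame, scalefactor=1 / 255.0, size=(640, 640))
    model.setInput(blob)
    outputs = model.forward()    # raw detections; exact layout depends on the model
    return float(outputs.max())  # simplification: treat the top score as the firearm score


def monitor(stream_url, model_path="firearm_detector.onnx"):
    """Flag frames whose score crosses the threshold for human review."""
    model = cv2.dnn.readNetFromONNX(model_path)  # hypothetical pretrained detector
    for frame in frames(stream_url):
        score = firearm_score(model, frame)
        if score >= CONFIDENCE_THRESHOLD:
            # A real deployment would notify a human operator for review,
            # not trigger an automatic response.
            print(f"Possible firearm detected (score={score:.2f}) - flag for operator review")
```

In practice, systems like this are tuned to keep false alarms low and route any flagged frame to a human operator rather than triggering an automated response.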

In addition to the new AI technology, SEPTA is planning to place unarmed security guards in stations to keep watch and enforce rules, such as the smoking ban and fare payment.

“We’re trying to help people who are in crisis situations and treat them with compassion and dignity and help them get the services that they may need. But also we, as part of that, need to make clear that the station or a train or a bus aren’t suitable to be shelters or to be places where drugs can be used out in the open,” added Busch.

Governments are now taking a serious look at the potential value of AI and how to capture it. When developing and deploying AI use cases, it is critical that they proactively consider and address the fast-changing universe of privacy, security, and ethical risks that AI technologies can expose them to.

Citizens can be catalysts for change by actively demanding higher-quality, faster, and more personalized services from their respective public- and private-sector entities, while also being vocal about the privacy, security, and ethical risks associated with AI.

The biggest potential impact could come from private companies, which can adopt and even pioneer AI use cases, build employee capabilities, double down on AI investments, and take a systematic approach to identifying and managing the risks of this brave new world of advancing technology.