Application - Solution Overview + Alignment

Solution Name

SIID™ Technologies

One-line solution summary:

Software that identifies instances of unfair and biased practices, triggers targeted training, and evaluates its effectiveness.

What specific problem are you solving?

Police brutality is a major issue in the U.S., with 1,021 fatal police shootings in 2020. Americans have watched shocking violence committed by police officers, resulting in injuries and deaths that disproportionately impact people of color, including Michael Brown, Sandra Bland, Jamar Clark, Philando Castile, Breonna Taylor, and George Floyd, whose killing in May 2020 by Minneapolis police sparked global outrage and led to one of the largest settlements for police misconduct.

An analysis of 200 million traffic stops across the U.S. by Stanford researchers showed that Black and Latino drivers were stopped and searched at twice the rate of White drivers, even though White drivers were more likely to be found with drugs, guns, or other contraband. The study revealed a widespread disparity in policing practices across the country. Although many law enforcement agencies already conduct implicit bias training, there has been no scientific evaluation of its effect on policing.

Whether implicit or explicit, biases are socialized into us all and have a great impact on how we make decisions. SIID™ acts as an early-intervention and risk-mitigation software solution that can assist law enforcement by improving practices, preventing misconduct, evaluating training, saving lives, and building trust with the communities they serve.

Pitch your solution.

“Giving officers the tools to de-escalate a situation can often be the difference between life and death.” (Nevada Assembly Speaker Jason Frierson)

SIID™ is a software solution that uses AI to help law enforcement leadership debias and improve their practices, policies, and decision-making. Our algorithm builds upon research by Stanford Professor and SPARQ Co-Director Jennifer Eberhardt, who found racial disparities in officer respect through language in body camera footage, and Emily Owens' experimental evidence on supervision, training, and policing in the community.

Law enforcement agencies whose officers wear body cameras and that have funds allocated to anti-bias training can use our solution to identify instances of bias in training, communications, and policies, as well as to measure improvements and community impact over time.

Explain why you selected this stage of development for your solution.

We are currently building a repository and the detection algorithm for SIID™. We've proven the concept by developing a natural language processor (NLP) that functions like a “Grammarly” for bias: it helps end-users become more aware of their unconscious biases and helps HR departments monitor workplace culture, mitigate the risk of biased decision-making and communication, and promote mindfulness.
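To make the concept concrete, the minimal sketch below shows the general shape of such a flagger. It is purely illustrative: the phrase list, labels, and flag_bias function are hypothetical placeholders, and the actual SIID™ prototype uses a trained NLP model rather than keyword matching.

```python
import re

# Hypothetical, hand-picked lexicon for illustration only; the real
# prototype relies on a trained NLP model, not a keyword list.
BIASED_PHRASES = {
    "you people": "othering language",
    "thug": "coded/racialized descriptor",
    "articulate for a": "backhanded compliment",
}

def flag_bias(text: str) -> list[dict]:
    """Flag potentially biased phrases, returning the span, the phrase,
    and a short explanation, much as a grammar checker underlines an
    issue and explains it."""
    flags = []
    lowered = text.lower()
    for phrase, label in BIASED_PHRASES.items():
        for match in re.finditer(re.escape(phrase), lowered):
            flags.append({
                "span": (match.start(), match.end()),
                "phrase": phrase,
                "issue": label,
            })
    return flags

if __name__ == "__main__":
    sample = "You people did well, and he is articulate for a rookie."
    for flag in flag_bias(sample):
        print(flag)
```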

The prototype will serve as the baseline algorithm for all future solutions and can be deployed into collaborative workspaces such as Slack, Teams, and Discord, as well as across social media platforms. We plan to conduct closed beta testing for both workforce-development and civic-tech use cases. We've already received interest in piloting our solution from three small to medium-sized corporate teams and one police agency. By Q3 2022, we hope to engage up to 600 law enforcement officers.

Our solution's stage of development:

Prototype: A venture or organization building and testing its product, service, or business model

Where are you based?

Minneapolis, MN, USA

Solution Team

  • Eri O'Diah, Founder, SIID™ Technologies