Highlights from the article:
"The software clearly is not ready for use in a law enforcement capacity," Ting said. "These mistakes, we can kind of chuckle at it, but if you get arrested and it's on your record, it can be hard to get housing, get a job. It has real impacts."
"The body camera technology is just very far from being accurate," Friedman said. "Until the issues regarding accuracy and racial bias are resolved, we shouldn't be using it."
"Critics contend that the software is particularly problematic when it comes to identifying women, people of color and young people. Ting said those demographics were especially troubling to him, since communities of color have historically often been excessively targeted by police, and immigrant communities are feeling threatened by federal crackdowns on illegal immigration."
Basically, the reason I'm posting this is that facial recognition technology is used everywhere right now, especially on platforms like Facebook. And here it is being used for very serious purposes, and it is messing up left and right. A human face is far less complex than the entire human body, and the software still can't get it right. So don't worry, AI isn't going to steal your job in the next decade, or maybe two. Even if it gets good, it won't be trusted for a long, long time.