Black Girl Banned From Michigan Skating Rink After Facial Recognition Software Misidentifies Her

By: Zack Linly, The Root

Another day, another incident that indicates facial recognition software is faulty and probably racist.

The Root has published numerous stories demonstrating how voice and facial recognition software often leads to racial bias. We’ve reported stories showing that Black people are more vulnerable to being misidentified by this technology than our white counterparts. Despite this, many companies and law enforcement agencies continue to use technology that allows machines to unilaterally identify human beings—because what can go wrong, amirite?

Recently, a Black teenager was removed from and banned from a Livonia, Mich., skating rink. The teen and her parents say the business’ facial recognition software mistook her for another Black girl who had previously gotten into a brawl there. Not only was the teen not involved in the fight, according to the family, but she had never even been to that particular rink before.

“I was like, ‘That’s not me. Who is that?’” Lamya Robinson told Fox 2 Detroit of her experience at Riverside Arena skating rink last Saturday. “I was so confused because I’ve never been there.”

Lamya’s mother, Juliea Robinson, said she dropped her daughter off at the rink to hang out with friends, but she was barred from entering after the facial recognition software scanned her face and identified her as someone she reportedly was not. (So, am I the only one who thinks it’s weird that a skating rink is scanning the faces of its customers? Does everyone who enters get scanned or is it a random thing? Does this skating rink double as Skynet? What kind of Minority Report stuff is going on here?)

“To me, it’s basically racial profiling,” Robinson told Fox 2. “You’re just saying every young Black, brown girl with glasses fits the profile and that’s not right.”

Lamya’s father, Derek Robinson, also expressed contempt for the business and its technology which he said put his daughter at risk.

“You all put my daughter out of the establishment by herself, not knowing what could have happened,” he said. “It just happened to be a blessing that she was calling in frustration to talk to her cousin, but at the same time he pretty much said I’m not that far, let me go see what’s wrong with her.”

From Fox 2:

We have a statement from the skating rink which reads in part:

“One of our managers asked Ms. Robinson (Lamya’s mother) to call back sometime during the week. He explained to her, this is our usual process, as sometimes the line is quite long and it’s a hard look into things when the system is running.

“The software had her daughter at a 97 percent match. This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of, if there was a mistake, we apologize for that.”

Call me crazy (but not as crazy as this crazy facial recognition policy is), but maybe a better way of apologizing would be for the company to stop using the software. After all, this isn’t even the first or second story to come out of Michigan regarding a Black person being victimized by faulty software that appears to think we all look alike.

Last year in Detroit, 25-year-old Michael Oliver was falsely accused of a felony after facial recognition software identified him as the person who had reached into a teacher’s car and snatched the teacher’s phone while the teacher was recording a fight.

Earlier this year, we reported the story of Robert Williams, 43, who filed a lawsuit against the Detroit Police Department after he was wrongfully arrested and identified as a shoplifting suspect by the department’s facial recognition software.

In fact, Fox 2 reports that Williams testified on Capitol Hill Tuesday, saying, “I just don’t think it’s right, that my picture was used in some type of lineup, and I never been in trouble.”

As for Lamya, it’s unclear whether Riverside Arena has determined that a mistake was made and will permit her to return to the rink if she chooses to visit in the future. Then again, would you even want to go back to a place knowing you might be misidentified by the machine that scans your face upon entry?
