Sidestep the law, take all your photos, and target our kids for surveillance: that's how the American Civil Liberties Union characterized the services of a U.S. facial recognition startup when it issued a call to arms this week.

We’re not talking about technology that merely detects or reports human faces (although that exists too), but rather automated software that can recognize faces and group photographs into a convenient, viewable “profile.”

Such a profile can be searched, sorted, tagged and even saved into a Facebook profile, where a pedophile could discover a vulnerable young girl by piecing her photo together with those collected by other abusive predators, and then exploit them.

Read from the ACLU: “Big Brother Is Coming For You: School Children Isolated, Seized, and Persecuted As School Camera Surveillance Increases”

A screenshot from Susanne M. Schopflin’s documentary Free For All shows that in Chicago, schools are tracking students as they walk to class and have begun photographing their faces as they head back.

Read from the ACLU: “The Law Must Be Revamped to Protect Us From A Surveillance State”

When the Electronic Frontier Foundation and a number of civil rights organizations put out a call to arms against Facebook-style facial recognition last week, the American Civil Liberties Union (ACLU) joined them, signaling that its concerns about information sharing and information profiling are widely shared across the tech community.

Alex Abdo, a staff attorney with the ACLU, wrote in a blog post that he is concerned about whether the companies providing facial recognition services are ready for public scrutiny. He laid out his concerns at length:

[W]e know that Facebook's face recognition capabilities are far from limited. Rather than working only with faces it has already matched, Facebook's technology recognizes new faces from photos as they are uploaded, and Facebook doesn't even wait for our photos to be made "public" before sharing them with other Facebook apps like WhatsApp.

We also know that Facebook doesn't always comply with the law. Back in February of 2018, a judge ruled that Facebook violated wiretap laws when it used facial recognition to pair photos of my friends and me with a WhatsApp feature that lets people forward photos to their friends.

Now that Facebook is offering public face recognition, the entire company must move swiftly to respond to these and other challenges. Facebook must ensure that any facial recognition software on its platform complies with human rights standards. It must make the technology completely transparent, so that users can determine for themselves whether it is working properly. It must protect the data of its users and communities, which means its algorithms must be transparent and thoroughly tested. It must put strict limits on the use of facial recognition and be open about how it uses the technology. People should be able to control how their data is used, in addition to understanding the dangers of sharing their information.

Facebook said last week that it has agreed to stop offering facial recognition features in photos shared on WhatsApp and Instagram.

This year, Facebook said that it would be rolling out new privacy tools to protect its users. But critics argue that Facebook isn’t doing enough.

“A good first step is to clearly explain how and why your face recognition will work and who will be able to use your data,” Abdo wrote. “But the next step is to make sure that your data doesn’t end up in an ad targeting system that can determine whether or not you’re a child.”