In October, the Ontario City Council unanimously approved an expansion and upgrade to the city’s video camera system that includes facial recognition technology, gives more people access to video footage, and extends the current camera footprint to three very public places: parks.

The price tag to Ontario citizens for the upgrade and expansion is $34,000; however, what concerns me most is the long-term, unforeseen cost of facial recognition technology with no rules in place.

I want the city to ditch that technology altogether. If it won’t, it needs to put strict rules in place governing the cameras.

When I talked to an Oregon ACLU spokeswoman and to Portland lawyer Jack Orchard about the city’s camera upgrade, they pointed out a lot of concerns. They also had a lot of questions, and so do I.

Why is a small city moving toward an emerging technology that other major cities are rapidly moving away from?

New opponents of facial recognition include major cities across the nation, such as Boise, San Francisco and Oakland, all of which recently enacted bans on the technology, citing infringements on people’s privacy and liberty as well as the overall unreliability of the systems.

In addition, metro areas have seen racial disparity play out again and again with these cameras, as the technology does not work well at recognizing people of color. The resulting false positives can exacerbate racial disparities in law enforcement.

Even City Manager Adam Brown told me in an email that “we still have a long way to go” when it comes to the reliability of the cameras. He told me he saw this play out firsthand at an International City/County Management Association conference in Nashville, when an AI device tasked with finding a chihuahua produced several different dogs as well as blueberry muffins.

Will the new camera system really deter crimes?

Brown told the City Council that the existing cameras, which lack AI technology, have already deterred crime and that ultimately the goal with the upgrade is “merely to deter and to be able to solve crime and accidents quickly.”

If the cameras are needed as a deterrent for crime, why the need to recognize people who did not commit one?

Large cities have already shown that such cameras are ineffective in helping solve crimes.

Chicago has 10,000 cameras, yet over three years of arrests throughout the city, the cameras assisted in less than 1% of them.

Why wasn’t the City Council given a presentation highlighting both pros and cons of such a system?

Brown didn’t mention a single negative aspect of facial recognition technology in his presentation to the City Council, making it seem more like a sales pitch than a decision that should be weighed long and hard before jumping in.

In his near-sales-pitch approach, he said the upgrade would “bring us into the 21st century to do more with the resources we have by extending our eyes.”

He talked about the benefits, but not the drawbacks, of multiple points, including moving to a cloud-based storage system. He touted the scalability of the system rather than telling the council that giving access to other individuals, especially outside law enforcement agencies, should only be done with careful consideration.

He sold them on warranties.

He even told the council that after a webinar “the staff is very happy with it.”

After little discussion among the council, Brown got a unanimous pass on his one-sided proposal, and he didn’t even suggest any rules to go alongside it.

Wait a minute! Where are the rules?

“Every single individual camera can be given permission to as many or as few people as we like,” Brown told the council.

This statement brings up a host of other questions: Who is going to be the gatekeeper who decides who gets access to the cameras? Will a person get access just to check on the cameras? If so, why? Will the city allow City Council members access, and if so, will they still have access during election season, when they might be able to see footage of other political candidates?

Will the city allow police to monitor the cameras in what could amount to unconstitutional, warrantless searches? Will law enforcement feel pressure to break state law and work with agencies, such as ICE, to identify illegal immigrants?

Because facial recognition is a system that is prone to misuse, there absolutely need to be rules in place.

While the state has retention laws for video footage, the city has not set any parameters with which to govern itself. We’ve seen some near malfeasance play out recently with drone video footage handed over to the city. In regard to that footage, Ontario Police Chief Steven Romero, at about noon on Friday, told The Argus Observer that no criminal charges were identified or pursued as a result of the event. He said the violations that were identified fell under city ordinance rules and that any action on them would be up to Brown.

Brown replied that he didn’t plan to take administrative action unless he heard otherwise from the council. Does this mean the City Council got to see that footage in order to reach such a decision? If so, why was it not presented to them in a public meeting?

And even later on Friday, Chief Romero said he hoped to find free time to continue going through the footage to “see if it has anything more.”

But the state sets laws governing how long he has to do that, and after repeated questioning, Romero and Brown still have not answered how long they plan to keep that drone footage or whether their timeline and reasons are in accordance with state law.

This is exactly why the city needs rules over facial recognition. Having no guidelines in place about who can access a system, how they can use it, when they can use it or why, creates a recipe for abuse.

Cases of misuse across the nation have included people trying to see what their ex is up to, checking in on political opponents and applying pressure to gain unlawful access to video.

I want to believe that every person who will have access to the cameras — especially those in parks, where government typically has very limited control — would only use them for the right reasons. However, as Orchard reminds us, “There are bad people out there who use access to info and technology in a way that provides benefit to them and no benefit to the public.”

As the camera installation has already begun, I urge the city to establish rules posthaste governing who can access the cameras, and when and why. Without these rules in place, due process is at risk.

Leslie Thompson is the editor at The Argus Observer. She can be reached at (541) 823-4818 or by emailing lesliet@argusobserver.com. To comment on this story, go to www.argusobserver.com.
