Tuesday, 23 Jul 2019
Technology

Facial-recognition startup Owal pushes ahead despite controversy surrounding the technology – Crain’s New York Business

The controversies have not suppressed the appetite for the technology. A year after its launch, Owal, based in Downtown Brooklyn in the Future Data Lab at NYU Tandon, is operating in more than 100 buildings, the majority of them residential properties in New York.

Fried has built privacy safeguards into his services, and he hopes to demonstrate a benign use of the technology in settings as sensitive as people’s homes. But he acknowledges he has work to do.

“We got into this space thinking we could solve a problem that was both socially and technically difficult,” Fried said. “We may not agree on all the answers, but at least we have the right ideas.”

Owal started with a behavioral-recognition service—essentially a closed-circuit system that sends alerts to management when it picks up unusual activity. In the past year it has flagged a domestic abuse incident in a hallway, a man wielding a machete in a stairwell and someone wheeling a motorcycle into an elevator.

Owal then ventured into facial recognition, and it now deploys the service in about a quarter of its portfolio, including assisted-living facilities, which use it to prevent residents from wandering.

In New York rentals, the technology’s sole purpose is to keep a kind of score of tenant activity. With the help of residents’ head shots and electronic keys, the system can look for illegal Airbnb rentals or sublets and traffic that might indicate a brothel or a gambling parlor, for example.

Fried said it provides only leads. The system might notice that John Doe has not been in the building in three months and someone else is using his key. Management might then do a LexisNexis search to see if he has registered a car in another state, or it might check for tenant complaints.
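
The article does not describe Owal's internals, but the kind of rule Fried outlines can be sketched in a few lines. Everything below is hypothetical—the names, data shapes and 90-day threshold are assumptions for illustration, not Owal's actual code—and it simply shows how a long absence combined with continued key activity could surface a lead for manual review.

    from datetime import datetime, timedelta

    # Hypothetical sketch only; Owal's actual system is not public.
    ABSENCE_THRESHOLD = timedelta(days=90)   # roughly "not seen in three months"

    def flag_possible_sublet(tenant, face_sightings, key_events, now=None):
        """Return a lead, not a conclusion, when a tenant appears absent
        while their electronic key keeps being used."""
        now = now or datetime.now()
        sightings = face_sightings.get(tenant, [])
        last_seen = max(sightings) if sightings else datetime.min
        key_used_since = any(
            e["key_holder"] == tenant and e["timestamp"] > last_seen
            for e in key_events
        )
        if now - last_seen > ABSENCE_THRESHOLD and key_used_since:
            return {
                "tenant": tenant,
                "last_seen": last_seen,
                "note": "Key activity without a matching resident; review manually.",
            }
        return None

A flag like this would only prompt the kind of human follow-up described above—a records search or a check of tenant complaints—rather than any automatic action.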

As a business, Owal is targeting owners, who face hefty fines if the city finds illegal rentals on their property. But Fried said the service also benefits tenants, who likely don’t want a constant stream of strangers in their hallways.

“We wanted to build a system to help with quality of life but one that didn’t destroy privacy,” Fried said. “We’re concerned with knowing: Did this person reach a threshold where we don’t believe he’s been in town for the last six months? It’s like a FICO score.”

Owal allows the video and data to be handled exclusively by a division of building management, such as a fraud department, that operates off-site and has no relationship with tenants or even knows who they are. For smaller landlords, Owal serves as the fraud department; it keeps the encrypted video and data in-house and out of building owners’ hands.
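
Purely as an illustration of that separation (the role names and function below are assumptions, not Owal's implementation), the arrangement amounts to a simple access rule: the people who deal with tenants never hold the key to the footage.

    from enum import Enum, auto

    class Role(Enum):
        ON_SITE_MANAGEMENT = auto()   # knows the tenants; no access to footage
        FRAUD_REVIEW = auto()         # off-site, reviews footage, never meets tenants

    def may_decrypt_footage(role: Role) -> bool:
        # Only the segregated fraud-review role can unlock the stored video;
        # for smaller landlords, Owal itself plays that role.
        return role is Role.FRAUD_REVIEW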

But whether Owal still infringes on privacy rights is far from settled. Some privacy advocates argue that tenants should not only be informed that they are being surveilled by smart systems but also be given a choice not to be.

“I think every tenant has the right to say no until they totally understand how their rights might be abridged in this scenario,” said Brad Hoylman, the Manhattan state senator who sponsored the bill to ban facial-recognition technology in residential settings.

Hoylman was also leery of Owal’s behavioral-recognition system.

“If I’m signing a lease, I expect my personal liberty to be protected, and that includes not having my behavior monitored or subject to scrutiny,” he said. “I think tenants are right to think these types of things could be used against them.”

Some privacy experts, however, argue that tenants need to be informed merely that a service is in place—albeit with stipulations over how their information will be used and guarantees that it will be protected.

“It’s important that when these technologies are pitched as beneficial to all parties that notice be given to all parties so folks can assess that for themselves,” said John Verdi, vice president of policy at the Future of Privacy Forum, a think tank in Washington, D.C. “These relationships need to be partnerships.”

Fried said he would welcome regulation of facial-recognition systems, including a requirement to notify tenants and others when the technology is in use.

In fact, his contract with building owners requires them to provide notification. Compliance is spotty, though. It has not been an issue with commercial and health care customers and with most of the landlords using the behavioral-analytics product. But Fried acknowledged that landlords using the facial-recognition service generally have not informed their residents, possibly out of fear of blowback.

He added that for those landlords the program is new; they are still testing it and have not established a policy. In addition, the company has a consulting relationship with building owners and cannot force them to comply.