Last spring, local entrepreneur John Hobart traveled to Amman, Jordan with a small team of developers from Coria, a tech company he co-founded, with the lofty goal of using data science to craft a useful response to the refugee crisis.
The mission technically started three years ago when Hobart launched Coria right here in Portland, and since then it’s grown to employ over 40 people. Its main focus is consulting with clients — typically big businesses — and contextualizing their data and web traffic for marketing purposes. But Coria’s also involved in several side projects that Hobart says use emergent technology to benefit the public. One of these projects just started and, through a partnership with the Greater Portland Immigrant Welcome Center and Southern Maine Community College, aims to create a job and training network for immigrants and refugees in Portland.
The second project, the one that took him to Jordan, involves facial recognition technology Hobart developed himself to reunite families separated in the ongoing refugee crisis.
Because of war, unrest, and natural disasters, the world today is facing the largest number of displaced people since World War II — some 25 million people — many of whom settle in Jordan. According to the UN Refugee Agency, Jordan hosts one of the three largest refugee populations in the world. The country has a population of 9.5 million, 3 million of whom are not native to the country.
Hobart, a 43-year-old white man originally from Canada but now living in Acton, Maine, is aware of the countless other Westerners who have dreamed of making a positive impact in the Middle East, to little avail. People with instant plans to “solve the refugee crisis” typically don’t have a deep understanding of the long and complex history of the region, or the tools necessary to navigate the cultural and religious differences. “Do-gooders,” he called them during our interview, while letting me know he wasn’t one of them.
“We just wanted to find out if the stuff that we’re good at could be useful to others,” said Hobart, who felt mobilized by both the frequency of anti-immigrant rhetoric here in the U.S., and the severity of the refugee situation abroad.
John Hobart, the co-founder of Coria. Photo by Francis Flisiuk.
But after arriving and consulting with representatives from the United Nations, UNICEF, and the International Committee of the Red Cross, groups working on the front lines of this humanitarian crisis, Hobart quickly realized they didn’t need his help.
“They have their own awesome data nerds there, doing some amazing work,” he said.
It took deeper conversations with locals and closer inspections of some of Jordan’s refugee camps for Hobart’s team to finally settle on facial recognition technology as a meaningful way to leverage their highly specific skills.
“Immediately, one of the recurring themes we experienced was the need for families to be reunited,” said Hobart. “About 28,000 people are forced to flee their homes every day. Families are split up across camps and spontaneous settlements.”
Hobart told me of one Jordanian aid worker he met, whose job it was to visit families with missing loved ones and show them photos from the local morgue to see if there was a match. According to that worker, the process was slow and painful.
“We saw a big need for an on-the-ground identification service,” said Hobart. “Even for identifying dead people.”
Hobart’s team built just that: software that can scan and identify faces in real time. To use it, one logs onto the program on a computer and loads photos of the individuals they're looking for; then a wearable body camera scans the faces of people it encounters and sends an alert whenever a match is made.
John Hobart showing off his technology by capturing all the faces of audience members at the TEDxDirigo Rise event at the State Theatre last month. Photo by TEDxDirigo and Michael Eric Berube.
Hobart wanted to put this technology in the hands of aid workers. Then, in theory, those living and working with refugees in Jordan and elsewhere could wear these smart cameras and accurately identify anyone they're looking for.
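Coria hasn't published how its matching works, but systems like the one described typically reduce each face to a numeric "embedding" and compare camera frames against the loaded photos by similarity. The sketch below illustrates that general idea with made-up embedding vectors and an assumed similarity threshold — not Hobart's actual code:

```python
import numpy as np

# Assumed threshold: cosine similarity above this counts as a match.
MATCH_THRESHOLD = 0.6

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(watchlist, frame_embeddings, threshold=MATCH_THRESHOLD):
    """Compare each face seen in a camera frame against loaded photos.

    watchlist: dict mapping a person's name -> reference embedding
               (in a real system, embeddings would come from a
               face-recognition model applied to each photo or frame)
    frame_embeddings: embeddings for the faces in the current frame
    Returns a list of (name, similarity) alerts.
    """
    alerts = []
    for face in frame_embeddings:
        for name, reference in watchlist.items():
            similarity = cosine_similarity(reference, face)
            if similarity >= threshold:
                alerts.append((name, similarity))
    return alerts

# Toy example: one person loaded, two faces seen by the camera.
watchlist = {"missing_person": [1.0, 0.0, 0.0]}
frame = [[0.9, 0.1, 0.0],   # close to the reference -> should alert
         [0.0, 1.0, 0.0]]   # dissimilar -> no alert
alerts = find_matches(watchlist, frame)
```

In production systems, the embeddings would be hundreds of dimensions long and the threshold tuned to trade off false alerts against missed matches — the same trade-off behind the privacy and misidentification concerns discussed below.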
The implications of scanning faces in the Middle East
While impressive, it’s easy to see where this technology could go wrong. Many refugees and asylum seekers prefer to stay anonymous, and it’s no surprise that face-scanning software doesn’t sit well with many of them, especially considering it could fall into the hands of hackers, or government agencies looking to deport or imprison them.
"Humanitarian principles should include an emphasis on human dignity," said Christy Delafield.
Delafield is a member of Mercy Corps, a U.S.-based organization that she says has worked to resettle millions of Syrian refugees in and around Jordan, many of whom have been separated from their families.
“From a protection standpoint, there are lots of reasons people might choose to protect their privacy,” said Delafield. “They are coming from a conflict zone where they may have seen forced recruitment or detention by various armed actors. They may not only be scared for themselves, but also for other members of their family. There are a lot of open questions about how data like this would be secured, who would have access to it, and for what purposes. The important thing to remember is it's not up to us to make choices for others, especially when it comes to risk and safety.”
In fact, no relief organization that Coria has pitched this tech to has agreed to adopt it, citing, according to Hobart, these exact concerns.
Hobart said the International Committee of the Red Cross declined his pitch because they need to maintain amicable relations with the Jordanian government. Deploying facial recognition software in regions that thousands of immigrants, both documented and undocumented, travel through and live in could jeopardize those relationships, they told him. According to Hobart, he heard worries that the software could be “co-opted by the Israelis, the Russians, the Chinese, the Jordanian secret service, or the Americans” for purposes far different than he intended.
“That’s the biggest problem we’re facing right now,” said Hobart. “All of the aid groups are highly sensitive to the security environment that they're in.”
Delafield said that conventional methods used to help reunite families include outreach on television, radio, and social media. There’s also a mechanism in place for people to apply for professional help through local authorities and the U.N. Refugee Agency. Her organization, Mercy Corps, also hosts a digital information hub specifically built for refugees and asylum seekers at Khabrona.Info.
Although Mercy Corps is always interested in finding technological solutions, Delafield was hesitant to endorse Coria's software because it’s unclear whether the individuals in front of the body camera could opt out of having their face scanned.
"While not all refugees are uncomfortable with the use of biometric data, many have concerns about safety issues that might result from improper use of that data," said Delafield. "It's essential that individuals have the opportunity to decide for themselves if they are comfortable before a scan takes place. If it's a body camera, I'd be concerned that there might not be an opportunity for truly informed consent."
Young boys in Syria. Photo by Sumaya Agha/Mercy Corps
Despite the ethical concerns, Hobart hasn’t given up on his software and is currently pitching it to Refugees International, an organization active in Syria and Somalia, as well as other advocacy groups he referred to as “smaller and more relaxed.”
Weighing the pros and cons of deploying Coria's software illustrates an important aspect of technology that exploits highly personal data: like any powerful tool, it can be both helpful and harmful.
For the record, Hobart recognizes the ethical ambiguity of any technology utilizing intimate data. He holds out hope that as long as these emergent technologies aren’t controlled solely by the few, they can do more good than harm.
“Some ways are fantastic and some ways are really creepy,” said Hobart. “There are very little regulations on emergent technologies. We need to self-govern our use of emergent technology because our lawmakers can’t be trusted to do it for us. Emergent technologies have a tendency to be weaponized.”
Here are a few examples of why he might say that.
From potential injustices...
Back in April, President Trump expanded the use of facial recognition tech in airports as a means to identify which visa holders actually leave the U.S. before their visa expires. As The Verge reported earlier this year, Customs and Border Protection agents are newly allowed to share their cache of high-quality photos with FBI agents, who could cross-reference them to find people with outstanding warrants or inadequate documentation.
Alvaro Bedoya, who studies facial recognition at Georgetown University Law’s Center on Privacy and Technology, told The Verge that “suddenly you’re moving from this world in which you’re just verifying identity to another world where the act of flying is cause for a law enforcement search.”
The sketchiness doesn’t end there. According to a 2016 study from the Georgetown Law Center on Privacy and Technology, facial recognition technologies tend to be less accurate when scanning non-white faces. Couple that with the fact that visa holders tend to be non-white, and there's a possibility for bias that could lead to serious civil rights problems for ordinary travelers.
And then there’s the fact that biometrics are not afforded the same protections as passwords or PIN codes under the Fifth Amendment, because they’re considered “attributes of the body.” This means that a police officer could theoretically compel a suspect to unlock a device like Apple's new iPhone X just by looking at it, whereas they can’t legally demand somebody's password.
...to annoying ads
Speaking of the iPhone X, which for the price of $999 offers consumers a full-screen interface that can be unlocked with a mere glance, its release last month reignited ongoing fears about the future of privacy.
Consumers online questioned how foolproof the protection actually is, despite Apple’s assurance that there’s a one-in-a-million chance of a random face gaining access — so far the promise rings true, as users who tried to break into the devices with photos and masks have been unsuccessful.
But facial recognition technology raises broad ethical concerns, the likes of which journalists, privacy experts, and consumers have been discussing for years. Each year we learn a little bit more about what big technology companies plan to do with the immense amount of personal data they’ve compiled on their users. For example, in October, after a model in Cyprus fired off a viral tweet, we learned that Apple has been quietly categorizing women’s lingerie photos, storing them under the search terms “bra” and “brassiere” while not organizing photos of men in the same way.
More recently, further dystopian anxieties were stoked when Reuters reported that Apple will make some facial recognition data available to third-party app developers. Imagine, for example, how valuable data on your facial expressions could be to advertisers looking to monitor how you respond (and when) to an app or product.
Apple is, of course, not the only tech company in the facial recognition market testing the limits of what consumers are comfortable with: Google can identify faces, as can Facebook, Samsung, and Snapchat, and unlike Apple's, their data is stored “in the cloud” on private servers.
Considering all this, it’s no wonder that many are asking: should we be concerned about the sudden ubiquity of facial recognition technology?