If You Feel You’re Largely Overlooked … You Are So Wrong
Content Insider #667 – Just Checkin
By Andy Marken – (andy@markencom.com)
“Don’t forget, you’re behind enemy lines.” – Mark Snow, “Person of Interest,” Warner Bros., 2011-2016
The right to privacy is in a constant state of flux.
You know, yes … but.
The EU has its widely published and broadly enforced General Data Protection Regulation (GDPR) while California and other US states have their own versions of consumer privacy acts (CPAs).
In fact, every country has some type of personal data protection, including China with its Personal Information Security Specification.
At the same time, there are a ton of store/application tracking systems all designed to assist you and make things easier for you.
Google, Facebook, Alibaba, Baidu, and other social media keep track of your search and discussion activities to offer you ideas, information and products you’ll probably be interested in.
Amazon, Tencent, and your store loyalty cards follow your purchase and review activities to improve their services and offer you discounts/recommendations on similar products/services.
Netflix and every VOD (ad, transaction and subscription) service follow your viewing likes/dislikes, how much you watch, what screen you watch it on and when you watch it to improve the content they develop for you.
All of that legislative oversight is strictly to protect you from … them.
Safety Costs – Don’t let them kid you, safety comes at a cost as facial recognition technology is used by every city and town, in one way or another, to ensure that bad folks get caught.
To provide personal protection and assistance, businesses and governments have added a new level of safety/security with camera systems and AI-enabled facial recognition solutions … everywhere.
But unlike DNA and fingerprints, this data can be acquired without a person’s knowledge or consent.
No matter where you live/work/relax, you’re on camera the minute you leave your apartment/home.
Don’t Look Up – The best advice is to keep your hoodie up and head down if you don’t want to be seen doing the wrong thing … or just don’t do it. Hiding your face can lead to legal action in some countries.
Use the ATM, walk past a couple of stores, pop in and get a cup of coffee, take the train to work or drive through the tollbooth and park in the garage, where your company ID logs you in automatically.
You may leave work at noon to grab some lunch; and on the way back to the office, use your smartphone to grab a photo of an outfit you want to check online.
You head home after a tough day of shuffling numbers. You walk up to your door, tell Ring to open the door, tell Alexa you’re home and ask to listen to some music; then later, you watch something on Netflix.
At 11:00 p.m., you tell Alexa to lock the door (and windows), lower the thermostat and tuck yourself into bed to prepare for yet another day of obscure freedom.
Throughout the day there are home, business, local, city and state camera systems that help law enforcement capture criminals/wrongdoers to keep you safe and secure.
No, you didn’t exactly ask for the protection or the ease of making decisions/selections, it just sorta’ arrived.
The growth of video surveillance systems and facial recognition can probably be traced to the loss of life on 9/11 and increased violence around the globe.
Facial recognition technology maps faces and compares them to lists of images of “interest” – suspects, missing persons, customers, persons of interest.
And they are increasingly being used everywhere – streets, shopping centers, offices, sports and entertainment venues.
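For a concrete feel of that “map and compare” step, here is a minimal sketch using the open-source Python face_recognition library; the watchlist files and camera frame are hypothetical stand-ins, not any vendor’s or agency’s actual system.

```python
# Minimal sketch of "map a face, compare it to a watchlist" using the
# open-source face_recognition library. File paths are illustrative only.
import face_recognition

# Hypothetical watchlist: names mapped to reference photos
WATCHLIST = {
    "person_a": "watchlist/person_a.jpg",
    "person_b": "watchlist/person_b.jpg",
}

# Step 1: "map" each watchlist face into a 128-dimension encoding
known_names, known_encodings = [], []
for name, path in WATCHLIST.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:                      # skip images where no face was found
        known_names.append(name)
        known_encodings.append(encodings[0])

# Step 2: map any faces in a camera frame and compare them to the watchlist
frame = face_recognition.load_image_file("camera/frame_001.jpg")
for unknown in face_recognition.face_encodings(frame):
    distances = face_recognition.face_distance(known_encodings, unknown)
    matches = face_recognition.compare_faces(known_encodings, unknown, tolerance=0.6)
    for name, dist, hit in zip(known_names, distances, matches):
        if hit:
            print(f"Possible match: {name} (distance {dist:.2f})")
```

The tolerance threshold is where misidentification enters: loosen it and false matches climb, tighten it and true matches are missed, which is exactly the accuracy and bias problem described later in this piece.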
The technology proponents say, “If you’ve got something to be worried about, you should probably be worried; otherwise, it’s probably one of the safest technologies today.”
Still, it poses a major threat to personal privacy and disproportionately harms marginalized groups.
And it’s still a horrible work in progress.
Google suspended its facial recognition research program so it could focus on making the software less racially biased. The company was trying to obtain more pictures of people with darker skin to counteract algorithmic bias in public datasets, which are predominantly white.
The goal, according to an article in the New York Times, was to ensure the company has a fair and secure feature that works across different skin tones and face shapes.
Such issues are of little concern to China, which has been very open regarding their use of cameras and AI-enabled facial recognition software. Anyone applying for mobile or internet service must have their face scanned to verify that they have a valid ID.
In Plain Sight – Protesters in Hong Kong continued making their statements but added face masks to ensure they couldn’t be identified.
During the Hong Kong riots, demonstrators were banned from wearing face masks so they could be more easily identified.
In a 60 Minutes segment early this year, a young lady in China was asked about her concerns regarding the rapid development and use of AI in emotion and facial recognition technology.
Her response was, “I don’t think about it much.”
It’s probably a lot better that way!
But China is far from the only locale using facial tracking information.
Brits Smile – London ranks second among the world’s major cities in the number of cameras located strategically around town to protect its population.
Britain uses 420,000 closed-circuit cameras in London, more than any city except Beijing.
The system has often flagged people who had already been dealt with by the legal system.
France is set to become the first European country to use facial recognition technology to give citizens a secure digital identity.
Dubbed Alicem, an acronym for “certified online authentication on mobile,” the application will enable “any individual to prove his/her identity on the internet in a secure manner,” says the interior ministry website.
The program reads the chip on an electronic passport and cross-references its biometric photo with the phone user via facial recognition to confirm the identity.
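As a rough illustration of that cross-check, here is a minimal sketch under the same face_recognition library assumption; read_chip_photo() and capture_selfie() are hypothetical placeholders for the e-passport’s NFC chip read and the phone’s live capture, not the actual Alicem implementation.

```python
# Sketch of an Alicem-style check: compare the biometric photo stored on the
# passport chip with a selfie captured on the phone. Helper functions are
# hypothetical placeholders for the real NFC read and liveness-checked capture.
import face_recognition

def read_chip_photo() -> str:
    # Hypothetical: the real flow reads this image from the e-passport chip over NFC.
    return "chip_photo.jpg"

def capture_selfie() -> str:
    # Hypothetical: the real flow uses a live, liveness-checked capture on the phone.
    return "selfie.jpg"

chip_encoding = face_recognition.face_encodings(
    face_recognition.load_image_file(read_chip_photo()))[0]
selfie_encoding = face_recognition.face_encodings(
    face_recognition.load_image_file(capture_selfie()))[0]

distance = face_recognition.face_distance([chip_encoding], selfie_encoding)[0]
print("Identity confirmed" if distance < 0.6 else "Identity not confirmed")
```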
Thorn: Digital Defenders of Children, a nonprofit dedicated to stopping child sex trafficking and exploitation, has its products used in 23 countries around the world.
Thorn officials note that thousands of minors are coerced into prostitution every day; and over the past year, the organization identified/rescued over 6,000 children around the U.S. Every day, 150,000 escort ads are posted in cities and towns around the country.
In Detroit, a surveillance program was begun in 2016 to track shooters and carjackers around the city.
The program has met with mixed results.
Researchers found that darker-skinned women were identified as men 31 percent of the time, and other studies found that African-Americans’ facial data produced false matches more often than whites’ did.
Analysts concluded that a single algorithm cannot be used for both groups with equal accuracy.
Acknowledging many of the problems that needed to be solved, one pastor said that the tools – carefully used – can help bring answers to people who ask him – “what happened to my child, my loved one? Who did this?”
The city’s chief of police said that with the system and witness assistance, officers were able to apprehend a man now facing charges of three counts of first-degree murder and two counts of assault.
A new generation of cameras/software that enable real-time identity checks is causing concerns among government officials and citizens.
Maybe Not – With DMV photo databases serving as the prime source of images for facial recognition applications and an increasing number of young people not acquiring a license, that shrinking pool may be one of the main reasons countries are issuing ID cards for citizens.
The major sources of facial recognition inputs are state and national driver’s license and Real ID databases.
At the same time, individuals continually add to the photo/video pool by posting where they are, what they are doing and who they are doing it with to all of their social media “opportunities,” making online images even more valuable, more useful.
While this information is theoretically protected by national statutes in most countries, law enforcement agencies have historically been exempt, and their advanced data-mining systems have been able to intrude on people’s privacy while staying within the law.
And yes, the pros and cons are getting a lot of attention from law enforcement, business folks and civil liberty groups around the globe.
Early this year, San Francisco became the first U.S. city to ban the technology and other U.S. cities have since followed.
Despite a large number of well-publicized examples of facial recognition misidentifying individuals and struggling to recognize certain types of faces (including women and people with darker skin), most people feel it can be an effective tool in solving crimes and delivering customer convenience.
It’s already being used by airlines.
Welcome Aboard – JetBlue was the first airline to use facial recognition in place of passports or personal IDs for boarding. People are “happy” they’re in the system.
JetBlue no longer scans boarding passes; they scan your face.
There was no opt-in; Homeland Security provided its database of citizens’ faces to JetBlue.
Passengers have been delighted because they speed through the check-in line without having to constantly show their passport or ID.
Neighborhood Network – Working with police departments around the U.S., Amazon has expanded the sale of their home Ring visual security solution and enabled law enforcement to almost instantly tap into neighborhood cameras to capture images of potential criminals.
Amazon has established partnerships with more than 500 police departments, which give officers direct access to Ring owners and their doorbell video to investigate crimes. The relationship includes free or discounted products – streamlining Amazon’s marketing efforts – and the company’s facial recognition technology.
Concerned groups say the relationships give communities a sense that crime is on the rise – it’s actually declining – and exacerbate racism and racial profiling biases.
Still, there are no widely accepted legislative guidelines regarding the technology’s deployment or use.
The use of the technologies also raises questions:
- Do you have the right to your own face, individual privacy?
- Who is responsible for the protection of the information?
- Can I even remove my face from this database?
- How much of this technology is really necessary?
- What would sensible legislation look like?
- How do we ensure businesses and law enforcement don’t abuse it?
- Misidentification can and will happen, so how does someone get their images and “danger score” removed from the system?
The downsides of the technology are on full display in China.
Leading Figures – China is at the forefront in capturing images of almost every citizen, so the AI-enabled facial recognition can quickly help officials to locate persons of interest.
More than 200 million surveillance cameras around the country are used to track shoppers in stores, prevent violent crime and catch jaywalkers. Virtually every Chinese citizen is in the country’s massive facial database, and individuals are constantly tracked.
Can citizens trust western countries to act differently?
Mass surveillance programs run by a wide number of national agencies seem to indicate that they could follow the same course.
As John Greer noted, “You’ve granted it the power to see everything, to index, order, and control the lives of ordinary people.”
It may already be time to put the genie back in the bottle.
Andy Marken – andy@markencom.com – is an author of more than 600 articles on management, marketing, communications, industry trends in media & entertainment, consumer electronics, software and applications. Internationally recognized marketing/communications consultant with a broad range of technical and industry expertise especially in storage, storage management and film/video production fields. Extended range of relationships with business, industry trade press, online media and industry analysts/consultants.