iPhones, Biometrics, FLoC, New Zealand, and Russia
Harpal Singh / Unsplash
#136: The FBI wanted to unlock the San Bernardino shooter’s iPhone. It turned to a little-known Australian firm.
In the wake of a December 2015 shooting in San Bernardino, California, the FBI wanted to unlock the attacker’s iPhone 5c in hopes it would aid the investigation. There was just one problem: the phone was encrypted, and by design only a correctly entered PIN could decrypt it.
They could try to brute-force the PIN by programmatically trying every possible combination of digits, not a hard thing to do with today’s computing power. But Apple’s iOS was designed to erase the phone’s data after ten failed attempts. So the FBI obtained a court order asking Apple to remove that feature. Apple, in the name of privacy, refused.
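As a toy illustration (this is a simulation, not Apple’s actual code), here is why the erase feature defeats naive brute force: with a wipe triggered after ten consecutive failures, an exhaustive search trips the wipe almost immediately.

```python
import itertools


class SimulatedLock:
    """Toy model of a PIN lock that wipes its data after ten
    consecutive failed attempts (hypothetical, for illustration)."""

    def __init__(self, pin: str, max_failures: int = 10):
        self._pin = pin
        self._max_failures = max_failures
        self._failures = 0
        self.wiped = False

    def try_pin(self, guess: str) -> bool:
        if self.wiped:
            return False  # data is gone; no guess can succeed
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self._max_failures:
            self.wiped = True  # the feature the FBI wanted removed
        return False


def brute_force(lock: SimulatedLock):
    """Enumerate every 4-digit PIN in order. Unless the wipe is
    disabled, this fails for any PIN past the first ten guesses."""
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if lock.try_pin(guess):
            return guess
    return None
```

A PIN among the first ten candidates is found, but anything later triggers the wipe first, which is exactly why Azimuth’s value lay in bypassing the erase mechanism rather than the encryption itself.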
As they were pursuing legal action against Apple, the FBI found help through an undisclosed third party and reported publicly that they had successfully decrypted the phone. The identity of that third party and the method by which they cracked the phone were unknown until now. A Washington Post investigation revealed it was a little-known Australian security firm, Azimuth Security.
The Azimuth engineers never actually broke the encryption. Rather, they found a vulnerability in an open-source piece of code Apple was using, exploited it, then chained it with another vulnerability that gave them access to the phone’s processor. This allowed them to brute-force the PIN without triggering data erasure. They were reportedly paid $900,000 for the work.
The Post’s findings nicely give this chapter of the Crypto Wars closure while also highlighting how software bugs can easily make us all susceptible to surveillance at the hands of a determined party.
April 15, 2021 | TechCrunch by Natasha Lomas | ~741 words
Forty Members of European Parliament, or MEPs, wrote an open letter to the European Commission asking it to ban “all untargeted and indiscriminate mass surveillance, no matter how many people are exposed to the system.”
The letter comes as the Commission is expected to propose legislation next week to regulate artificial intelligence in the EU. A copy of the draft legislation leaked early last week. This group of MEPs fears it doesn’t go far enough in curbing abuse of biometric surveillance and, in some cases, could even overturn protections already in place.
April 16, 2021 | The Verge by Dieter Bohn | ~1603 words
As discussed in surveil-link #134, Google began to roll out a pilot test of its FLoC framework to select users of its Chrome browser. The framework, as explained in surveil-link #77, is meant to replace third-party cookies as a means of tracking individual users across the internet by placing them in “cohorts” of other users with similar browsing histories. The EFF promptly called the move “a terrible idea,” and now other organizations are following suit.
The developers behind popular internet browsers Vivaldi and Brave, the privacy-centric search engine DuckDuckGo, as well as the folks at WordPress – which claims to serve 41% of websites on the web – have all stated that they will block the implementation of it in their products.
Vivaldi and Brave are both built on an open-source project known as Chromium – think of it as the core engine that powers Chrome. They will simply disable sites’ ability to access their users’ FLoC data. DuckDuckGo and WordPress won’t allow any of their sites or pages to be used in building user cohorts, a move some news sites have already made and one I plan to make on surveillance.today in the near future.
Apple’s Safari and Mozilla’s Firefox aren’t built on Chromium, so they would have to go out of their way to implement FLoC; Microsoft’s Edge and Opera are Chromium-based but haven’t enabled it. The article contains comments from nearly all of them about how they plan to balance advertising and privacy in their products going forward.
April 16, 2021 | RNZ Morning Report | 5:04 (audio)
New Zealand Police recently upgraded their surveillance infrastructure and have called in two experts to help them decide what features they should and should not use. The analysis will last for six months and its findings will be published publicly. The audio contains an interview with the EFF’s Matthew Guariglia outlining the things they should watch out for.
While this may seem like a minor story, I include it to highlight how differently governments around the world are approaching surveillance regulation. Some cities, like York, Pennsylvania, are struggling with cameras; juxtapose that with the EU, whose own elected officials are calling for a ban on facial recognition. These issues affect everyone all over the world, and some legislators are handling them better than others.
April 17, 2021 | Washington Post by Robyn Dixon | ~1331 words
Speaking of some legislators handling surveillance regulation better than others: Russia is increasingly relying on facial recognition and other surveillance tools to crack down on opposition. Moscow alone has over 200,000 cameras equipped with facial recognition, which are used in 70% of investigations, according to the city’s mayor. Sarkis Darvinyan, a civil and digital rights attorney based in Moscow, says they’re being used to track and arrest protesters, and that in January and February alone over a dozen protesters were arrested in their homes or on the city metro.
One activist, Sergei Abanichev, was arrested in 2019 after facial recognition identified him throwing a paper cup toward police during a demonstration. Because of that incident, he didn’t participate in the recent protests, but that didn’t stop the system from flagging him as “high alert” on January 31, leading to him being detained for hours. Over 4,500 people were arrested that same day.
He expressed concern to the Post:
“If somebody knows that he or she can be tracked down, they may change their behavior. They may decide not to go to a rally, not to participate in a protest or maybe even not to go to the mosque for prayer, because they know that they can be tracked down.”
The system doesn’t just make residents’ movements known to the police: thanks to some officers looking to make a quick buck, access is also for sale on Russia’s digital black market. One journalist claimed that for around $400, one could buy live access to the entire system. But the corruption doesn’t stop there. None of these cameras can apparently be found around the homes of government officials. Bellingcat, a collective of internet sleuths, recently released a phone call in which a Russian agent involved in the poisoning of opposition leader Alexei Navalny admitted the cameras were turned off during the operation. Bellingcat also published a document citing a “malfunction” of the cameras around the time and place of the death of a journalist and activist outside a Moscow theater.
I was recently invited to be a regular contributor to San Jose Spotlight, a nonprofit newsroom based in San Jose, CA, and my first official piece dropped last Wednesday. In it I address how San Jose lags behind its Bay Area neighbors on privacy legislation, the balance between privacy and security, and the possibility of vaccine passports coming to the Bay Area.
My column will be published the second Wednesday of every month.
And one more shameless plug. I read a lot of articles and pieces having to do with surveillance and other tech issues. Sadly, I can’t include all of them in surveil-links; I try to include the best stories that deal directly with surveillance and privacy. But if you want to see pieces that are tangentially related, or privacy pieces that didn’t make the cut, follow me on Twitter! I’m @egd_io.
Enjoying Surveil-links? Subscribe today and never miss it!
Get surveil-links directly to your inbox | Listen wherever you get your podcasts | Read here or subscribe to the RSS feed
Enjoy this piece? Consider letting me know by leaving a tip! 🙏🏻