Digital Witness of the IT Privacy and Security Weekly update for the week ending October 25th, 2022


Daml’ers,


For this update, it’s all digits to hand. We go from Japan and Iran to Prime deliveries from a van.
Digital Witness

Allegations of TikTok’s Chinese monitor, e-mail proof that Parler shares monikers while France fines an unrepentant photograph chronicler.

We have programming language standings, new Australian fines landing and the malware that Qatar is demanding.

And if that bad rap came through clear, we promise it only gets better from here.


JP: Japan steps up push to get public buy-in to digital IDs

Japan has stepped up its push to catch up on digitization by telling a reluctant public they have to sign up for digital IDs or possibly lose access to their public health insurance.

As the name implies, the “My Number” initiative is about assigning numbers to people, similar to Social Security numbers in the U.S.

Many Japanese worry the information might be misused or that their personal information might be stolen.

Some view the My Number effort as a violation of their right to privacy.

So the system that kicked off in 2016 has never fully caught on.

Fax machines are still commonplace, and many Japanese conduct much of their business in person, with cash.

Some bureaucratic procedures can be done online, but many Japanese offices still require “inkan,” or seals for stamping, for identification, and insist on people bringing paper forms to offices.

Now the government is asking people to apply for plastic My Number cards equipped with microchips and photos, to be linked to driver’s licenses and public health insurance plans.

Health insurance cards now in use, which lack photos, will be discontinued in late 2024.

People will be required to use My Number cards instead.

So what’s the upshot for you? Force us? That approach has drawn a backlash: an online petition demanding that the current health cards be kept gathered more than 100,000 signatures in just a few days.


Global: Ring Cameras Are Being Used To Control and Surveil Overworked Delivery Workers

Networked doorbell surveillance cameras like Amazon’s Ring are everywhere and have changed the nature of delivery work by letting customers take on the role of bosses to monitor, control, and discipline workers, according to a recent report by the Data & Society tech research institute.

“The growing popularity of Ring and other networked doorbell cameras has normalized home and neighborhood surveillance in the name of safety and security,” Data & Society’s Labor Futures program director Aiha Nguyen and research analyst Eve Zelickson write.

“But for delivery drivers, this has meant their work is increasingly surveilled by the doorbell cameras and supervised by customers. The result is a collision between the American ideas of private property and the business imperatives of doing a job.”

Just as important as the adoption of Ring cameras, however, is the rise of delivery work and its transformation into gig labor.

As the report lays out, Ring cameras allow customers to surveil delivery workers and discipline their labor by, for example, sharing shaming footage online.

This dovetails with the “gig-ification” of Amazon’s delivery workers in two ways: labor dynamics and customer behavior.

“Gig workers, including Flex drivers, are sold on the promise of flexibility, independence and freedom. Amazon tells Flex drivers that they have complete control over their schedule, and can work on their terms and in their space,” Nguyen and Zelickson write.

“Through interviews with Flex drivers, it became apparent that these marketed perks have hidden costs: drivers often have to compete for shifts, spend hours trying to get reimbursed for lost wages, pay for wear and tear on their vehicle, and have no control over where they work.”

That competition between workers manifests in other ways too, namely in acquiescing to customer demands when delivering purchases to their homes.

Even without cameras, customers have made onerous demands of Flex drivers, even as those drivers are pressed to complete unrealistic and dangerous routes while meeting unsafe and demanding productivity quotas.

The introduction of surveillance cameras at the delivery destination, however, adds another level of surveillance to the gig-ification.

The report’s conclusion is clear: Amazon has deputized its customers and made them partners in a scheme that encourages antagonistic social relations, undermines labor rights, and provides cover for a march towards increasingly ambitious monopolistic exploits.

So what’s the upshot for you? As Nguyen and Zelickson point out, it is ingenious how Amazon has “managed to transform what was once a labor cost (i.e., supervising work and asset protection) into a revenue stream through the sale of doorbell cameras and subscription services to residents who then perform the labor of securing their own doorstep.”


US: Forbes Alleges ByteDance Planned to Use TikTok to Monitor Locations of Specific American Citizens

A China-based team at TikTok’s parent company, ByteDance, planned to use the TikTok app to monitor the personal location of some specific American citizens, according to materials reviewed by Forbes.

The team behind the monitoring project — ByteDance’s Internal Audit and Risk Control department — is led by Beijing-based executive Song Ye, who reports to ByteDance cofounder and CEO Rubo Liang.

The team primarily conducts investigations into potential misconduct by current and former ByteDance employees.

But in at least two cases, the Internal Audit team also planned to collect TikTok data about the location of a U.S. citizen who had never had an employment relationship with the company, the materials show.

It is unclear from the materials whether data about these Americans was actually collected; however, the plan was for a Beijing-based ByteDance team to obtain location data from U.S. users’ devices.

Challenging the article, TikTok responded on Twitter that their service “does not collect precise GPS location information from U.S. users, meaning TikTok could not monitor U.S. users in the way the article suggested.”

But Forbes’ senior writer thinks that’s a misleading denial, writing on Twitter that "We never mentioned GPS in the story.

In fact, we quoted their spokesperson saying they collect approximate locations via IP address.

Not using GPS does not mean they could not use that approximate location to monitor certain individuals."

TikTok also acknowledged on Twitter that they do have a team that will “acquire the information they need to conduct internal investigations of violations of the company codes of conduct,” but says the team follows a specific set of policies and processes “as is standard in companies across our industry.”

In Forbes’ article, TikTok spokesperson Maureen Shanahan said that TikTok collects approximate location information (based on IP addresses) to “among other things, help show relevant content and ads to users, comply with applicable laws, and detect and prevent fraud and inauthentic behavior.”

But Forbes’ senior writer said in their article that “the material reviewed by Forbes indicates that ByteDance’s Internal Audit team was planning to use this location information to surveil individual American citizens, not to target ads or any of these other purposes.”

The Internal Audit and Risk Control team runs regular audits and investigations of TikTok and ByteDance employees, for infractions like conflicts of interest and misuse of company resources, and also for leaks of confidential information.

Internal materials reviewed by Forbes show that senior executives, including TikTok CEO Shou Zi Chew, have ordered the team to investigate individual employees, and that it has investigated employees even after they left the company.

"Neither TikTok nor ByteDance denied anything we reported, either in the pre-publication process, when we told them what we planned to report and asked for comment, or since then.

They have also not requested a story update."

So what’s the upshot for you? You could not make this up (and neither could we).


FR: France Fines Clearview AI Maximum Possible For GDPR Breaches

Clearview AI, the controversial facial recognition firm that scrapes selfies and other personal data off the Internet without consent to feed an AI-powered identity-matching service it sells to law enforcement and others, has been hit with another fine in Europe.

This one comes after it failed to respond to an order last year from the CNIL, France’s privacy watchdog, to stop its unlawful processing of French citizens’ information and delete their data.

Clearview responded to that order by, well, ghosting the regulator – thereby adding a third GDPR breach (non-cooperation with the regulator) to its earlier tally.

Here’s the CNIL’s summary of Clearview’s breaches:

  • Unlawful processing of personal data (breach of Article 6 of the GDPR)
  • Individuals’ rights not respected (Articles 12, 15, and 17 of the GDPR)
  • Lack of cooperation with the CNIL (Article 31 of the GDPR)

The statement from Clearview’s PR firm? "There is no way to determine if a person has French citizenship purely from a public photo from the internet, and therefore it is impossible to delete data from French residents.

Clearview AI only collects publicly available information from the internet, just like any other search engine, and besides, we don’t do business in France."

So what’s the upshot for you? Clearview had two months to respond. It didn’t, so now it gets fined $19.57 million. If they have any money left over after this, we think they should hire better advisers.


US: Parler Accidentally Doxed Elite Members When Announcing Kanye West Takeover

Parler was so excited to tell its users that the artist formerly known as Kanye West had decided to buy the social media network, it accidentally doxed its VIP members.

The platform has been embraced by conservatives who departed Twitter over allegations of political censorship, and West, a known lover of controversy, agreed to buy it earlier last week so those users could “freely express” themselves.

But in an email announcing the rapper’s involvement, the company publicly CC’d the 300-plus email addresses of its verified VIP members instead of blind-copying (BCC’ing) them, leaving their personal contact details visible to everyone else in the email chain.
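
For anyone who has forgotten the difference, here is a minimal, purely illustrative sketch using Python’s standard smtplib and email modules (the addresses and SMTP server are placeholders, and this is obviously not how Parler sends mail): BCC recipients are handed to the mail server at send time and never appear in the message headers, so recipients cannot see each other.

```python
import smtplib
from email.message import EmailMessage

vip_addresses = ["vip1@example.com", "vip2@example.com"]  # hypothetical recipients

msg = EmailMessage()
msg["Subject"] = "Big announcement"
msg["From"] = "news@example.com"
msg["To"] = "undisclosed-recipients:;"  # nothing identifying in the visible headers
msg.set_content("Hello VIPs!")

# The BCC effect: the recipient list goes only to the SMTP server,
# not into the headers that every recipient can read.
with smtplib.SMTP("smtp.example.com") as server:  # placeholder server
    server.send_message(msg, to_addrs=vip_addresses)
```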

So what’s the upshot for you? Some of the well-known names in the email chain include Sen. Ted Cruz, former President Donald Trump, and Rep. Matt Gaetz.


RU: Interested to see what AI can do for animated artwork?

Watch what Dmitrii Tochilkin does with a video called “A Year” using Stable Diffusion and his own custom algorithm.

You’ve read about the “sentient AI” that got the Google engineer fired; see what AI and an evolving algorithm do in the medium of video.

So what’s the upshot for you? OK, it’s a little blurry, patchy, and maybe not the smoothest reel we’ve seen, but we are going places we could never have imagined, in far less production time.


US: Conflict of interest?

About four years ago, former Google CEO Eric Schmidt was appointed to the National Security Commission on Artificial Intelligence by the chairman of the House Armed Services Committee.

It was a powerful perch. Congress tasked the new group with a broad mandate: to advise the U.S. government on how to advance the development of artificial intelligence, machine learning, and other technologies to enhance the national security of the United States.

The mandate was simple: Congress directed the new body to advise on how to enhance American competitiveness in AI against its adversaries, build the AI workforce of the future, and develop data and ethical procedures.

In short, the commission, which Schmidt soon took charge of as chairman, was tasked with coming up with recommendations for almost every aspect of a vital and emerging industry.

The panel did far more under his leadership.

It wrote proposed legislation that later became law and steered billions of dollars of taxpayer funds to the industry he helped build – and that he was actively investing in while running the group.

If you’re going to be leading a commission that is steering the direction of government AI and making recommendations for how we should promote this sector and scientific exploration in this area, you really shouldn’t also be dipping your hand in the pot and helping yourself to AI investments.

His credentials, however, were impeccable given his deep experience in Silicon Valley, his experience advising the Defense Department, and a vast personal fortune estimated at about $20 billion.

Five months after his appointment, Schmidt made a little-noticed private investment in an initial seed round of financing for a startup company called Beacon, which uses AI in the company’s supply chain products for shippers who manage freight logistics, according to CNBC’s review of investment information in database Crunchbase.

There is no indication that Schmidt broke any ethics rules or did anything unlawful while chairing the commission.

The commission was, by design, an outside advisory group of industry participants, and its other members included well-known tech executives including Oracle CEO Safra Catz, Amazon Web Services CEO Andy Jassy, and Microsoft Chief Scientific Officer Dr. Eric Horvitz, among others.

Schmidt’s investment was just the first of a handful of direct investments he would make in AI startup companies during his tenure as chairman of the AI commission.

“Venture capital firms financed, in part, by Schmidt and his private family foundation also made dozens of additional investments in AI companies during Schmidt’s tenure, giving Schmidt an economic stake in the industry even as he developed new regulations and encouraged taxpayer financing for it,” adds CNBC.

"Altogether, Schmidt and entities connected to him made more than 50 investments in AI companies while he was chairman of the federal commission on AI.

Information on his investments isn’t publicly available."

So what’s the upshot for you? “All that activity meant that at the same time Schmidt was wielding enormous influence over the future of federal AI policy, he was also potentially positioning himself to profit personally from the most promising young AI companies.”

Citing people close to Schmidt, the report says his investments were disclosed in a private filing to the U.S. government at the time and the public and news media had no access to that document.


US: FCC poised to ban all U.S. sales of new Huawei and ZTE equipment

The Federal Communications Commission (FCC) plans to ban all sales of new Huawei and ZTE telecommunications devices in the U.S. — as well as some sales of video surveillance equipment from three other Chinese firms — out of national security concerns.

Why it matters: The move, which marks the first time the FCC has banned electronic equipment on national security grounds, closes a vise on the two Chinese companies that began tightening during the Trump administration.

The ban marks the culmination of years of warnings from security researchers, analysts, and intelligence agencies that the Chinese government could use Chinese-made telecommunications equipment to spy on Americans.

The FCC order will also determine the scope of a ban on the sales of video surveillance equipment used for public safety. This would affect the Chinese companies Hytera Communications Corporation, Hikvision, and Dahua Technology Company.

So what’s the upshot for you? This will have a greater impact on the smaller U.S. telecom companies that opt for lower-price-point equipment.


AU: Australia to toughen privacy laws with a huge hike in penalties for breaches

Australia has confirmed an incoming legislative change will significantly strengthen its online privacy laws following a spate of data breaches in recent weeks – such as the Optus telco breach last month.

"Unfortunately, significant privacy breaches in recent weeks have shown existing safeguards are inadequate.

It’s not enough for a penalty for a major data breach to be seen as the cost of doing business," said its attorney-general, Mark Dreyfus, in a statement at the weekend.

“We need better laws to regulate how companies manage the huge amount of data they collect, and bigger penalties to incentivize better behavior.”

The changes will be made via an amendment to the country’s privacy laws, following a long process of consultation on reforms.

Dreyfus said the Privacy Legislation Amendment (Enforcement and Other Measures) Bill 2022 will increase the maximum penalty that can be applied under the Privacy Act 1988 for serious or repeated privacy breaches from the current AU$2.22 million (~US$1.4M) to whichever is the greater of the following (see the short calculation sketch after this list):

  • AU$50 million (~US$32M);
  • 3x the value of any benefit obtained through the misuse of information; or
  • 30% of a company’s adjusted turnover in the relevant period.
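
For the arithmetically inclined, here is a minimal Python sketch of that “greater of” rule. The benefit and turnover figures are invented purely for illustration.

```python
def max_penalty_aud(benefit_from_misuse: float, adjusted_turnover: float) -> float:
    """Greater of AU$50M, 3x the benefit obtained, or 30% of adjusted turnover."""
    return max(50_000_000, 3 * benefit_from_misuse, 0.30 * adjusted_turnover)

# Hypothetical breach: AU$10M benefit from misused data, AU$400M adjusted turnover.
print(f"Maximum penalty: AU${max_penalty_aud(10_000_000, 400_000_000):,.0f}")
# -> Maximum penalty: AU$120,000,000 (30% of turnover is the binding figure here)
```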

So what’s the upshot for you? Companies sit in the middle, with attackers deploying ever-evolving data-breach tricks on one side and regulators with their fines on the other.

It does not excuse them from being stupid in their data handling, but the responsibility of keeping collected data safe has got to be growing more uncomfortable with each new piece of global legislation.


IR: Hacktivists say they stole 100,000 emails from Iran’s nuclear energy agency

Iran’s Atomic Energy Organization has laughed off claims that the email systems of a subsidiary were compromised and that important operational data about a nuclear power plant was exposed.

An activist group that calls itself Black Reward and claims to be from Iran took to Telegram last Friday with claims it had accessed an email server run by a company related to Iran’s Atomic Energy Organization and exfiltrated 324 inboxes comprising over 100,000 messages totaling over 50 GB of files.

Black Reward claimed the content of the haul includes construction plans for a nuclear power plant, personal information of Iranians who work for the Organization, and passport details of Russian engineers who assist Iran’s nuclear power efforts.

So what’s the upshot for you? Black Reward has started posting the info to prove its bona fides.

It recommends the info be accessed in a virtual machine, as the Atomic Energy Organization’s emails are rife with viruses.


Global: PayPal is Getting More Secure Passkey Logins

PayPal announced yesterday that passkeys are being added as a new, password-less login method to secure PayPal accounts for iPhone, iPad, and Mac users on PayPal.com, with plans to expand passkeys to other platforms as they add support.

PayPal passkeys rolled out to US customers yesterday and will be available to “additional countries” in early 2023.

Passkeys are a new type of login credential that uses cryptographic key pairs to do away with passwords altogether.

That way, they’re resistant to phishing attempts and are designed to avoid sharing passkey data between platforms, addressing the weakness of current password-based authentication.
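
As a rough illustration of the key-pair idea (not the actual FIDO2/WebAuthn exchange PayPal and Apple implement), here is a sketch using the third-party cryptography package: the private key stays on the device, the server stores only the public key, and a login is a signed challenge rather than a shared secret.

```python
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Registration: the device generates a key pair and sends only the
# public key to the server; the private key never leaves the device.
device_private_key = Ed25519PrivateKey.generate()
server_stored_public_key = device_private_key.public_key()

# Login: the server issues a random challenge, the device signs it...
challenge = os.urandom(32)
signature = device_private_key.sign(challenge)

# ...and the server verifies the signature against the stored public key.
# No password ever crosses the wire, so there is nothing to phish or leak.
try:
    server_stored_public_key.verify(signature, challenge)
    print("Login accepted")
except InvalidSignature:
    print("Login rejected")
```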

Apple, Google, and Microsoft have all pledged to bring the FIDO Alliance standard to their respective OSes.

The newly released updates for Apple devices are ready to go, while earlier this month Google started testing support in Android and Chrome so developers can see how it works, and Microsoft has said its accounts will work with passkeys in the “near future.”

So what’s the upshot for you? Existing PayPal users will find an option to create a passkey after logging into their account via a desktop browser or mobile web on devices running iOS 16, iPadOS 16.1, or macOS Ventura.

Users will be prompted to authenticate the setup with Apple Face ID or Touch ID.

A passkey will then be automatically generated and synced across Apple devices with iCloud Keychain.


CH: Nym’s Plan to Boost Internet Privacy Through ‘Mixnets’

Harry Halpin, speaking from Nym’s headquarters in Neuchâtel, Switzerland, frames the problem: how do I communicate with you so that no one else knows I’m communicating with you, even if our messages are encrypted?

You can get a sense of what people are saying from the pattern of communication: Who are you talking with, when are your conversations, and how long do they last…?

There are two key elements: One is the “mixnet,” a technology invented by David Chaum in 1979 that my team has improved.

It relies on the premise that you can’t be anonymous by yourself; you can only be anonymous in a crowd.

You start with a message and break it into smaller units, communications packets, that you can think of as playing cards.

Next, you encrypt each card and randomly send it to a “mixnode” — a computer where it will be mixed with cards from other senders.

This happens three separate times and at three separate mixnodes. Then each card is delivered to the intended recipient, where all the cards from the original message are decrypted and put back into the proper order.

No person who oversees mixing at a single mixnode can know both the card’s origin and its destination. In other words, no one can know who you are talking to.
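
To make the playing-card analogy a little more concrete, here is a toy Python sketch of the routing idea. It is nothing like Nym’s actual implementation (no Sphinx packet format, no layered encryption, no cover traffic); it only shows packets from several senders being shuffled together at each of three hops, so that packet order no longer links origin to destination.

```python
import random

MIXNODES = ["mixnode-1", "mixnode-2", "mixnode-3"]  # three separate hops, as described above

def split_message(sender: str, recipient: str, text: str, size: int = 8):
    """Break a message into small 'playing card' packets."""
    return [{"sender": sender, "recipient": recipient, "seq": i, "body": text[i:i + size]}
            for i in range(0, len(text), size)]

def mix(batch):
    """A mixnode collects packets from many senders and shuffles them together.
    (In the real system each packet is also onion-encrypted, so a node could
    not read the sender/recipient fields shown here in the clear.)"""
    random.shuffle(batch)
    return batch

# Packets from two senders enter the network together: anonymous in a crowd.
batch = split_message("alice", "bob", "meet at noon") + \
        split_message("carol", "dave", "the answer is 42")

for node in MIXNODES:  # three mixings at three different nodes
    batch = mix(batch)

# At each destination, that recipient's cards are put back in the proper order.
for recipient in ("bob", "dave"):
    cards = sorted((p for p in batch if p["recipient"] == recipient), key=lambda p: p["seq"])
    print(recipient, "receives:", "".join(p["body"] for p in cards))
```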

Q: That was the original mixnet, so what improvements have you made?

Halpin: For one thing, we make use of the notion of entropy, a measure of randomness that was invented for this application by Claudia Diaz, a computer privacy professor at KU Leuven and Nym’s chief scientist.

Each packet you receive on the Nym network has a probability attached to it that tells you, for instance, the odds that it came from any given individual…

Our system uses a statistical process that allows you both to measure entropy and maximize it — the greater the entropy, the greater the anonymity.

There are no other systems out there today that can let users know how private their communications are.
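
The entropy measure Halpin describes can be pictured with ordinary Shannon entropy over the probability distribution of possible senders for a packet: the flatter the distribution, the bigger the effective anonymity set. A minimal sketch (the probabilities here are invented, and this is the generic textbook formula rather than Nym’s own estimator):

```python
import math

def shannon_entropy(probs):
    """Entropy, in bits, of a probability distribution over possible senders."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A packet that could plausibly have come from any of 8 senders...
uniform = [1 / 8] * 8
# ...versus one that almost certainly came from a single sender.
skewed = [0.93] + [0.01] * 7

print(f"uniform anonymity set: {shannon_entropy(uniform):.2f} bits")  # 3.00 bits
print(f"skewed anonymity set:  {shannon_entropy(skewed):.2f} bits")   # about 0.56 bits
```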

Q: What’s the second key element you referred to?

Halpin: Mixnets, as I said, have been around a long time. The reason they’ve never taken off has a lot to do with economics. Where do the people who are going to do the mixing come from, and how do you pay them?

We think we have an answer that came out of a conversation I had in 2017 with Adam Back, the cryptographer whose Hashcash scheme became the basis of bitcoin’s “proof of work” algorithm.

I asked him what he would do if he were to redesign bitcoin. He said it would be great if all the computer processing done to verify cryptocurrency transactions — by solving cryptographic puzzles that have no practical value outside of bitcoin — could instead be used to ensure privacy.

The computationally expensive part of privacy is the mixing, so it occurred to me that we could use a bitcoin-inspired system to incentivize people to do the mixing. We built our company around that idea…

A new paper that came out in June shows that this approach can lead to an economically sustainable mixnet…

We are not building a currency system or trying to replace the dollar. We just want to provide privacy to ordinary people.

So what’s the upshot for you? Ultimately, whether privacy is judged important enough will come down to the price of “incentivizing” the network that supports it. Getting the balance right, and attracting a large enough user base, could make this a cheap, viable solution for privacy.


Global: The RedMonk Programming Language Rankings: June 2022

To be included in this analysis, a language must be observable within both GitHub and Stack Overflow. If a given language is not present in this analysis, that’s why.

Here, in order of popularity, are the rankings as of June 2022:

  1. JavaScript
  2. Python
  3. Java
  4. PHP
  5. C#
  6. CSS
  7. C++
  8. TypeScript
  9. Ruby
  10. C
  11. Swift
  12. R
  13. Objective-C
  14. Shell
  15. Scala
  16. Go
  17. PowerShell
  18. Kotlin
  19. Rust
  20. Dart

So what’s the upshot for you? Daml is coming.


QA: Going to that “hot” World Cup in Qatar? You need to install some malware on your phone first.

https://www.schneier.com/blog/archives/2022/10/qatar-spyware.html

Everyone traveling to Qatar during the football World Cup will be asked to download two apps called Ehteraz and Hayya.

Briefly, Ehteraz is a covid-19 tracking app, while Hayya is an official World Cup app used to keep track of match tickets and to access the free Metro in Qatar.

In particular, the covid-19 app Ehteraz asks for several permissions on your mobile, like access to read, delete, or change all content on the phone, as well as permission to connect to WiFi and Bluetooth, override other apps, and prevent the phone from switching to sleep mode.

The Ehteraz app, which everyone over 18 coming to Qatar must download, is also granted a number of other permissions, such as an overview of your exact location, the ability to make direct calls via your phone, and the ability to disable your screen lock.

The Hayya app does not ask for as much but also has a number of critical aspects.

Among other things, the app asks for access to share your personal information with almost no restrictions.

In addition, the Hayya app provides access to determine the phone’s exact location, prevent the device from going into sleep mode, and view the phone’s network connections.

So what’s the upshot for you? Apparently, not putting the apps on your phone has not caused travelers to Qatar any issues up until very recently, but the games don’t start until the 20th of November.

May we suggest taking an old, second-hand, or burner phone if you are making that trip?



And our quote of the week: “I don’t know why people are so keen to put the details of their private life in public; they forget that invisibility is a superpower.” - Banksy


That’s it for this week. Stay safe, stay secure, witness the digits, and see you in se7en.


