Daml’ers,
This week we exercise all your senses: from sound and vision to …smell.
We have car stories worthy of serious rubbernecking with almost no brand of automobile left out.
There’s the NYC school district where teachers have fallen behind the kids in educating themselves.
We’re served an update on the quietest tech rollout the fast food industry has probably ever cooked up.
And then we get a load of privacy stories that may have you wondering where the technology and privacy divergence will leave humanity.
Finally, we end with a couple of products from last week’s Las Vegas Consumer Electronics Show that, frankly, “stink.”
Yes, there may be a foulness in the air, but the adventure promises to be the best yet, so grab a clothespin, apply it to your nose, take a deep breath and “Let’s go!”
Global: Google Home speakers allowed hackers to snoop on conversations
A bug in the Google Home smart speaker allowed installing a backdoor account that could be used to control it remotely and to turn it into a snooping device by accessing the microphone feed.
Researcher Matt Kunze discovered the issue and received $107,500 for responsibly reporting it to Google last year. Earlier this week, the researcher published technical details about the finding and an attack scenario to show how the flaw could be leveraged.
The researcher discovered that adding a new user to the target device is a two-step process that requires the device name, certificate, and “cloud ID” from its local API. With this info, they could send a link request to the Google server.
To add a rogue user to a target Google Home device, the analyst implemented the link process in a Python script that automated the exfiltration of the local device data and reproduced the linking request.
The linking request that carries the device ID data (downrightnifty.me)
The attack is summarized in the researcher’s blog as follows:
- The attacker wishes to spy on the victim within wireless proximity of the Google Home (but does NOT have the victim’s Wi-Fi password).
- The attacker discovers the victim’s Google Home by listening for MAC addresses with prefixes associated with Google Inc. (e.g. E4:F0:42).
- The attacker sends deauth packets to disconnect the device from its network and make it enter setup mode.
- The attacker connects to the device’s setup network and requests its device info (name, cert, cloud ID).
- The attacker connects to the internet and uses the obtained device info to link their account to the victim’s device.
- The attacker can now spy on the victim through their Google Home over the internet (no need to be close to the device anymore).
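The first step of the chain, discovering devices by vendor MAC prefix (OUI), amounts to a simple string match. Here is a minimal sketch of that step only; the `E4:F0:42` prefix is the example cited in the write-up, and the function name and prefix list are our own illustration:

```python
# Google-assigned MAC vendor prefixes (OUIs). E4:F0:42 is the example
# cited in the researcher's write-up; this set is illustrative only.
GOOGLE_OUIS = {"E4:F0:42"}

def is_google_device(mac: str) -> bool:
    """Return True if the MAC address's first three octets (the vendor
    OUI) match a known Google prefix."""
    return mac.upper().replace("-", ":")[:8] in GOOGLE_OUIS

print(is_google_device("e4:f0:42:ab:cd:ef"))  # True
print(is_google_device("00:11:22:ab:cd:ef"))  # False
```

The same matching is useful defensively: scanning your own network for these prefixes tells you which devices an attacker within Wi-Fi range could target.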
So what’s the upshot for you? Getting Google to correctly “hear” a command is hard enough. Do we think there is a real threat from this? Certainly, so give some thought to where you place these speakers.
Global: Now Amazon S3 Encrypts New Objects By Default
Amazon Simple Storage Service (Amazon S3) now encrypts all new objects by default, automatically applying server-side encryption (SSE-S3) to each new object unless you specify a different encryption option.
SSE-S3 was first launched in 2011.
“Amazon S3 server-side encryption handles all encryption, decryption, and key management in a totally transparent fashion.
When you PUT an object, we generate a unique key, encrypt your data with the key, and then encrypt the key with a [root] key.”
So what’s the upshot for you? This change is effective now, in all AWS Regions, including on AWS GovCloud (US) and AWS China Regions.
There is no additional cost for default object-level encryption.
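The PUT flow quoted above is classic envelope encryption: a unique per-object data key encrypts the data, and a root key encrypts the data key. This toy sketch shows the shape of it; the XOR cipher is a stand-in for AES-256 and nothing here is AWS’s actual implementation:

```python
import secrets

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR as a stand-in for AES-256.
    # Illustration only: do NOT use XOR for real encryption.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# 1. A unique data key is generated for each object.
data_key = secrets.token_bytes(32)
# 2. The object is encrypted with that data key.
ciphertext = toy_cipher(b"my object bytes", data_key)
# 3. The data key itself is encrypted ("wrapped") with a root key
#    that the storage service manages.
root_key = secrets.token_bytes(32)
wrapped_key = toy_cipher(data_key, root_key)
# Decryption reverses the steps: unwrap the data key, decrypt the object.
assert toy_cipher(ciphertext, toy_cipher(wrapped_key, root_key)) == b"my object bytes"
```

The point of the envelope: the root key never touches object data directly, and rotating it only requires re-wrapping the small data keys, not re-encrypting every object.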
US: The BMW i Vision Dee is a concept car that literally changes color
This CES BMW i Vision Dee concept car is giving us mood-ring flashbacks: BMW showed off the color-changing concept vehicle that one day could match drivers’ emotions.
Even the outside of the car can emote: the front, the area around the headlights and the “grille” (which, on this car, is really a display panel), can exhibit different shapes and hues, creating something like facial expressions. The car can show different moods or reactions, such as approval, happiness, or astonishment, according to BMW.
The type of content, from basic driving information to cartoon characters, that is shown in the windshield and window displays is controlled using a slider control on the dashboard that is, itself, merely a projection.
Rather than having a physical control or a permanent touchscreen, the “Mixed Reality Slider,” as BMW calls it, is projected onto the dashboard while sensors on the surface detect a finger sliding across the control.
With the control, a user can select from five different levels of digital content in the window displays.
The levels range from just the most basic driving information to augmented reality information relating to what’s outside – all the way to fully virtual worlds that obscure everything outside. (Presumably, the fully virtual experience is for use when the car is not being driven.)
So what’s the upshot for you? We’re imagining a sea of red at rush hour.
Global: Try taking the horse to work. It may be more secure.
Sam Curry and cohorts have published a huge list of vulnerabilities across the automotive industry.
Manufacturers affected include:
Acura
BMW
Ferrari
Ford
Genesis
Honda
Hyundai
Infiniti
Jaguar
Kia
Land Rover
Mercedes-Benz
Nissan
Porsche
Reviver
Rolls-Royce
SiriusXM (entertainment)
Spireon (GPS Vehicle Tracking and Management)
Toyota
They were able to do things like remote unlock vehicles, precision-locate them, break into their internal infrastructure, do customer account takeovers, pull customer data, and much more.
So what’s the upshot for you? This presents a pretty interesting and somewhat terrifying read (depending on what brand of car you own).
US: San Jose Police Announce Three Stolen Vehicles Recovered Using Automatic License Plate Reader
Saturday night in the Silicon Valley city of San Jose, the assistant police chief tweeted out praise for their recently-upgraded Automatic License Plate Readers (ALPR).
Officers in Air3 [police helicopter], monitoring the ALPR system, got alerted to 3 stolen cars.
They directed ground units to the cars. “All 3 drivers in custody! No dangerous vehicle pursuits occurred, nor were they needed.”
Some context: The San Jose Spotlight (a nonprofit local news site) noted that prior to last year license plate readers had been mounted exclusively on police patrol cars (and in use since 2006).
But last year the San Jose Police Department launched a new “pilot program” with four cameras mounted at a busy intersection that “captured nearly 300,000 plate scans in just the last month, according to city data.”
By August this had led to plans for 150 more stationary ALPR cameras, a local TV station reported. “Just this week, police said they solved an armed robbery and arrested a suspected shooter thanks to the cameras.”
During a forum to update the community, San Jose police also mentioned success stories in other cities like Vallejo where they’ve reported a 100% increase in identifying stolen vehicles.
San Jose is now installing hundreds around the city and the first batch is coming in the next two to three months…
The biggest concern among those attending Wednesday’s meeting was privacy.
But the city made it clear the data is only shared with trained police officers and certain city staff… no out-of-state or federal agencies.
“Anytime that someone from the San Jose Police Department accesses the ALPR system, they have to input a reason, the specific plates they are looking for and all of that information is logged so that we can keep track of how many times it’s being used and what it’s being used for,” said Albert Gehami, Digital Privacy Officer for San Jose.
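The access-logging workflow Gehami describes (every query requires a stated reason, specific plates, and a logged record) can be sketched roughly as follows. The field names and function are our own illustration, not the city’s actual system:

```python
from datetime import datetime, timezone

# Hypothetical audit trail: every ALPR query must record who searched,
# which plates, why, and when. Illustration only.
audit_log = []

def query_alpr(officer_id: str, plates: list, reason: str) -> None:
    """Record an ALPR lookup; refuse queries with no stated reason."""
    if not reason:
        raise ValueError("a reason is required for every ALPR query")
    audit_log.append({
        "officer": officer_id,
        "plates": plates,
        "reason": reason,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

query_alpr("badge-1234", ["8ABC123"], "stolen vehicle report 23-0042")
```

The design choice worth noting is that the reason is mandatory at query time, which is what makes after-the-fact auditing of “how many times it’s being used and what it’s being used for” possible at all.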
More privacy concerns were raised in September, reports the San Jose Spotlight:
The San Jose City Council unanimously approved a policy Tuesday that formally bans the police department from selling any license plate data, using that information for investigating a person’s immigration status, or for monitoring legally protected activities like protests or rallies.
So what’s the upshot for you? An EFF position paper argues that “ALPR data is gathered indiscriminately, collecting information on millions of ordinary people.”
“By plotting vehicle times and locations and tracing past movements, police can use stored data to paint a very specific portrait of drivers’ lives, determining past patterns of behavior and possibly even predicting future ones — in spite of the fact that the vast majority of people whose license plate data is collected and stored have not even been accused of a crime… [ALPR technology] allows officers to track everyone…”
US: A 250-foot drop and all four occupants survive. Volvo? Subaru? No… Tesla
The four-door, white Tesla was traveling south on Highway 1 just south of the Tom Lantos Tunnel between Pacifica and Montara on Monday morning when it veered off the road.
It plunged at least 250 feet below that road, and a dramatic rescue mission ensued.
“The damage to the vehicle would indicate that it hit, and then flipped several times.” The car came to rest on its wheels.
“There has been no determination as to what driving mode the Tesla was in; however, that does not appear to be a contributing factor in this incident,” the highway patrol said.
Based on the evidence collected, “investigators developed probable cause to believe this incident was an intentional act."
So what’s the upshot for you? Mom and the kids are in the hospital but dad? Dad’s going to jail.
US: Researchers Could Track the GPS Location of All of California’s New Digital License Plates
A team of security researchers managed to gain “super administrative access” to Reviver, the company behind California’s new digital license plates which launched last year.
That access allowed them to track the physical GPS location of all Reviver customers and change a section of text at the bottom of the license plate designed for personalized messages to whatever they wished, according to a blog post from the researchers.
“An actual attacker could remotely update, track, or delete anyone’s REVIVER plate,” Sam Curry, a bug bounty hunter, wrote in the blog post.
Curry wrote that he and a group of friends started finding vulnerabilities across the automotive industry. That included Reviver.
California launched the option to buy digital license plates in October.
Reviver is the sole provider of these plates, and says that the plates are legal to drive nationwide, and “legal to purchase in a growing number of states.” In the blog post, Curry writes the researchers were interested in Reviver because the license plate’s features meant it could be used to track vehicles.
After digging around the app and then a Reviver website, the researchers found Reviver assigned different roles to user accounts.
Those included “CONSUMER” and “CORPORATE.”
Eventually, the researchers identified a role called “REVIVER” and managed to change their account to it, which in turn granted them access to all sorts of data and capabilities, including tracking the location of vehicles.
“We could take any of the normal API calls (viewing vehicle location, updating vehicle plates, adding new users to accounts) and perform the action using our super administrator account with full authorization,” Curry writes. “We could additionally access any dealer (e.g. Mercedes-Benz dealerships will often package REVIVER plates) and update the default image used by the dealer when the newly purchased vehicle still had DEALER tags.”
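The root cause, as described, is that privilege followed a client-controllable role field. A minimal sketch of the server-side check that should gate such an API (the role names come from the write-up; the code and function names are our own illustration):

```python
from dataclasses import dataclass

# Roles named in the researchers' write-up.
ROLES = {"CONSUMER", "CORPORATE", "REVIVER"}

@dataclass
class User:
    email: str
    role: str  # must be loaded from the server's own records

def can_view_any_vehicle_location(user: User) -> bool:
    """Only the super-administrator role may query arbitrary vehicles.
    Crucially, `user.role` must come from server-side state, never from
    a role field supplied in the request body."""
    return user.role == "REVIVER"

print(can_view_any_vehicle_location(User("driver@example.com", "CONSUMER")))  # False
```

The lesson generalizes: an API that lets a client update its own account object, including the role attribute, has effectively no authorization at all.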
Reviver told Motherboard in a statement that it patched the issues identified by the researchers. "We are proud of our team’s quick response, which patched our application in under 24 hours and took further measures to prevent this from occurring in the future.
Our investigation confirmed that this potential vulnerability has not been misused.
Customer information has not been affected, and there is no evidence of ongoing risk related to this report.
As part of our commitment to data security and privacy, we also used this opportunity to identify and implement additional safeguards to supplement our existing, significant protections," the statement read.
So what’s the upshot for you? And you want a set of these digital plates why?
US: McDonald’s unveils first automated location, social media worried it will cut ‘millions’ of jobs
McDonald’s quietly opened its first automated restaurant at the end of last year, with machines handling everything from taking orders to delivering the food – and dividing opinions everywhere.
“When you step inside the test restaurant concept, you’ll notice it’s considerably smaller than a traditional McDonald’s restaurant in the U.S.,” McDonald’s said in a statement.
“Why? The features—inside and outside—are geared toward customers who are planning to dine at home or on the go.”
Customers can fully avoid interacting with any humans during their order and pickup.
The Fort Worth, Texas, location uses technology to minimize human interaction when ordering and picking up food.
The restaurant features an “Order Ahead Lane” where customers can receive orders by conveyor belt, Newsweek reported.
The initiative is part of McDonald’s “Accelerating the Arches” plan, which works to grow and innovate the customer experience, but not everyone is pleased with the direction the restaurant chain has chosen.
So what’s the upshot for you? McDonald’s has kept this opening ultra low profile with no announcements and no press releases.
Why? They expect a hugely negative public reaction to the loss of these jobs. And we notice that, even since we wrote up these notes, McDonald’s has taken down the one web page detailing the automated branch.
US: Spoiler alert: New York City schools ban access to ChatGPT over fears of cheating and misinformation
The New York City Department of Education has blocked access to ChatGPT on its networks and devices over fears the AI tool will harm students’ education.
A spokesperson for the department, Jenna Lyle, told Chalkbeat New York – the education-focused news site that first reported the story — that the ban was due to potential “negative impacts on student learning, and concerns regarding the safety and accuracy of the content.”
“While the tool may be able to provide quick and easy answers to questions, it does not build critical thinking and problem-solving skills, which are essential for academic and lifelong success,” said Lyle.
So what’s the upshot for you? We bet there aren’t too many under-17s who haven’t already tested this out on their homework and judged it “not quite good enough.”
FR: France Fines Apple for Illegally Harvesting iPhone Owners’ Data for Ads
“France’s data protection authority, CNIL, fined Apple €8 million (about $8.5 million) Wednesday for illegally harvesting iPhone owners’ data for targeted ads without proper consent.”
It’s an unusual sanction for the iPhone maker, which has faced fewer legal penalties over privacy than its Big Tech competitors.
Apple makes privacy a selling point for its devices, plastering “Privacy. That’s iPhone.” across 40-foot billboards across the world… Apple failed to “obtain the consent of French iPhone users (iOS 14.6 version) before depositing and / or writing identifiers used for advertising purposes on their terminals,” the CNIL said in a statement.
The CNIL’s fine calls out the search ads in Apple’s App Store, specifically. A French court fined the company over $1 million in December over its commercial practices related to the App Store…
With iPhones running iOS 14.6 and below, Apple’s Personalized Advertising privacy setting was turned on by default, leaving users to seek out the control on their own if they wanted to protect their information.
That violates EU privacy law, according to the CNIL… The newer versions of the iPhone operating system corrected the problem, presenting users with a prompt before the advertising data was collected.
An Apple spokesperson responds: "We are disappointed with this decision given the CNIL has previously recognized that how we serve search ads in the App Store prioritizes user privacy, and we will appeal.
Apple Search Ads goes further than any other digital advertising platform we are aware of by providing users with a clear choice as to whether or not they would like personalized ads.”
So what’s the upshot for you? France’s fine could be “a signal that Apple may face a less friendly regulatory future in Europe.”
US: Please Don’t Film Us in 2023
The Verge is decrying “a genre of video that derives its entertainment value from unwitting passersby” — like filming pedestrians in a neighborhood in New York City.
Many viewers on TikTok ate it up, but others pushed back on the idea that there’s humor in filming and posting an unsuspecting neighbor for content.
This year, I saw more and more resistance to the practice that’s become normal or even expected… [P]eople who have been featured in videos unbeknownst to them have pointed out that even if there’s no ill will, it’s just unnerving and weird to be filmed by others as if you’re bit characters in the story of their life.
One TikTok user, @hilmaafklint, landed in a stranger’s vlog when they filmed her to show her outfit.
She didn’t realize it had happened until another stranger recognized her and tagged her in the video.
“It’s weird at best, and creepy and a safety hazard at worst,” she says in a video…
Even before TikTok, public space had become an arena for constant content creation; if you step outside, there’s a chance you’ll end up in someone’s video.
It could be minimally invasive, sure, but it could also shine an unwanted spotlight on the banal moments that just happen to get caught on film.
This makeshift, individualized surveillance apparatus exists beyond the state-sponsored systems — the ones where tech companies will hand over electronic doorbell footage without a warrant or where elected officials allow police to watch surveillance footage in real-time.
We’re watched enough as it is.
So what’s the upshot for you? So if you’re someone who makes content for the internet, consider this heartfelt advice and a heads-up. If you’re filming someone for a video, please ask for their consent.
And if we catch you recording us for content, we might just smack your phone away.
US: Meet the Spy Tech Companies Helping Landlords Evict People
Some renters may savor the convenience of “smart-home” technologies like keyless entry and internet-connected doorbell cameras.
But tech companies are increasingly selling these solutions to landlords for a more nefarious purpose: spying on tenants in order to evict them or raise their rent.
Erin McElroy, a professor of American Studies at the University of Texas at Austin who tracks eviction trends, also says that digital surveillance of residential buildings is increasing, particularly in New York City, which she calls the “landlord tech epicenter.”
Any camera system can document possibly eviction-worthy behavior, but McElroy identified two companies, Teman and Reliant Safety, that use the biometrics of tenants with the explicit goal of facilitating evictions.
These companies are part of an expanding industry known as “proptech,” encompassing all the technology used for acquiring and managing real estate.
A report by Future Market Insights predicts that proptech will quadruple its current value, becoming an $86.5 billion industry by 2032. It is also sprouting start-ups to ease all aspects of the business – including the unsavory ones. […]
Reliant Safety, which claims to watch over 20,000 apartment units nationwide, has a less colorful corporate pedigree.
It is owned by the Omni Organization, a private developer founded in 2004 that “acquires, rehabilitates, builds and manages quality affordable housing throughout the United States,” according to its website.
The company claims it has acquired and managed more than 17,000 affordable housing units. Many of the properties it lists are in New York City.
Omni’s website features spotless apartment complexes under blue skies and boasts about sponsorship of after-school programs, food giveaways, and homeless transition programs.
Reliant’s website features videos that depict various violations detected by its surveillance cameras.
The website has a page of “Lease Violations” it says its system has detected, which include things such as “pet urination in hallway,” “hallway fistfight,” “improper mattress disposal,” “tenant slips in hallway,” as well as several alleged assaults, videos of fistfights in hallways, drug sales at doorways and break-ins through smashed windows.
Almost all of them show Black or brown people and almost all are labeled as being from The Bronx – where, in 2016, Omni opened a 140-unit affordable housing building at 655 Morris Avenue that boasted about “state-of-the-art facial recognition building access” running on ubiquitous cameras in common areas.
Reliant presents these as “case studies” and lists outcomes that include arrest and eviction.
Part of its package of services is “illegal sublet detection” using biometrics submitted by tenants to suss out anyone not authorized to be there.
So what’s the upshot for you? Reliant claims its products are rooting out illegal and dangerous activity, but the use of surveillance and biometrics to further extend policing into minority communities could become a major cause for concern to privacy advocates.
Global: A Roomba recorded a woman on the toilet. How did the screenshots end up on Facebook?
https://www.technologyreview.com/2023/01/10/1066500/roomba-irobot-robot-vacuum-be
An investigation recently revealed how images of a minor and a tester on the toilet ended up on social media.
iRobot said it had consent to collect this kind of data from inside homes – but participants say otherwise.
When Greg unboxed a new Roomba robot vacuum cleaner in December 2019, he thought he knew what he was getting into.
He would allow the preproduction test version of iRobot’s Roomba J series device to roam around his house, let it collect all sorts of data to help improve its artificial intelligence, and provide feedback to iRobot about his user experience.
He had done this all before. Outside of his day job as an engineer at a software company, Greg had been beta-testing products for the past decade.
He estimates that he’s tested over 50 products in that time – everything from sneakers to smart home cameras.
But what Greg didn’t know – and does not believe he consented to – was that iRobot would share test users’ data in a sprawling, global data supply chain, where everything (and every person) captured by the device’s front-facing cameras could be seen, and perhaps annotated, by low-paid contractors outside the United States who could screenshot and share images at their will.
Greg, who asked that we identify him only by his first name because he signed a nondisclosure agreement with iRobot, is not the only test user who feels dismayed and betrayed.
Nearly a dozen people who participated in iRobot’s data collection efforts between 2019 and 2022 have come forward in the weeks since MIT Technology Review published an investigation into how the company uses images captured from inside real homes to train its artificial intelligence.
The participants have shared similar concerns about how iRobot handled their data – and whether those practices conform with the company’s own data protection promises.
So what’s the upshot for you? Thankfully, this has only happened in a very limited number of test cases.
Still, having something map your home and identify pertinent elements within it when you only wanted it to Hoover your floor could end up being detrimental to you over time.
US: CES’s ‘Worst in Show’ Criticized Over Privacy, Security, and Environmental Threats
https://www.theregister.com/2023/01/06/ces_w
Kyle Wiens, CEO of the how-to repair site iFixit, said: “We are seeing, across the gamut, products that impact our privacy, products that create cybersecurity risks, that have overarchingly long-term environmental impacts, disposable products, and flat-out just things that maybe should not exist.
And this year, as in many past years at CES, it’s almost impossible to tell from the products and the advertising copy around them! They’re just not telling you what their actual business model is, and because of that — you don’t know what’s going on with your privacy."
After warning about the specific privacy implications of an add-on that turns an ordinary toilet into a smart toilet, they noted there was a close runner-up for worst in privacy: the increasing number of scam products that “are basically based on the digital version of phrenology, like trying to predict your emotions based upon reading your face or other things like that. There’s a whole other category of things that claim to do things that they cannot remotely do.”
To judge the worst in the show by environmental impact, Consumer Reports sent the Associate Director for their Product Sustainability, Research, and Testing team, who chose the 55-inch portable “Displace TV” for being powered only by four lithium-ion batteries (rather than, say, a traditional power cord).
And the “worst in show” award for repairability went to the Ember Mug 2+ — a $200 travel mug “with electronics and a battery inside…designed to keep your coffee hot.” Kyle Wiens, iFixit’s CEO, first noted it was a product that “does not need to exist” in a world that already has equally effective double-insulated, vacuum-insulated mugs and Thermoses.
But even worse: it’s battery-powered, and that battery can’t be easily removed. If you email the company asking for support on replacing the battery, Wiens claims, they will give you a coupon for a new, disposable coffee mug. “So this is the kind of product that should not exist, doesn’t need to exist, and is doing active harm to the world.
The interesting thing is people care so much about their $200 coffee mug, the new feature has ‘Find My iPhone’ support. So not only is it harming the environment, but it’s also spying on where you’re located!”
The founder of SecuRepairs.org first warned about “the vast ecosystem of smart, connected products that are running really low-quality, vulnerable software that make our persons and our homes and businesses easy targets for hackers.”
But for the worst in the show for cybersecurity award, they then chose Roku’s new Smart TV, partly because smart TVs in general “are a problematic category when it comes to cybersecurity, because they’re basically surveillance devices, and they’re not created with security in mind.”
And partly because to this day it’s hard to tell if Roku has fixed or even acknowledged its past vulnerabilities — and hasn’t implemented a prominent bug bounty program.
“They’re not alone in this. This is a problem that affects electronics makers of all different shapes and sizes at CES, and it’s something that as a society, we just need to start paying a lot more attention to.”
And US PIRG’s “Right to Repair” campaign director gave the “Who Asked For This” award to Neutrogena’s “SkinStacks” 3D printer for edible skin-nutrient gummies – which are personalized after phone-based face scans. (“Why just sell vitamins when you could also add in proprietary refills and biometric data harvesting?”)
So what’s the upshot for you? Our favs for the worst of the show have to be:
- The U-Scan health monitor from Withings. You put it in your toilet, pee on it, and get a urinalysis after the data is analyzed in China. Great… and when it runs out of power, you simply take it out and charge it with a USB Type-C adapter. Think about that.
But beating that out just ever so slightly is:
- The “Aroma Shooter,” a wearable “necklace” of sorts that transmits smells relevant to what’s being shown on your computer screen. It uses “solid-state” cartridges that come in a variety of “flavors.” Along with the Aroma Player app, it’s a platform that “integrates smell into your favorite games and movies” and allows you to create and distribute your own digital smellscapes.
Our Quote of the week: “If it is communicating with your phone, it’s generally communicating to the cloud too. But more importantly, if a product is gathering data about you and communicating with the cloud, you have to ask yourself: is this company selling something to me, or are they selling me to other people?” - Kyle Wiens
That’s it for this week. Stay safe, stay secure, don’t forget the clothespin (on your nose), and see you in se7en.