This should be a wake-up call for politically engaged funders and anyone who cares about civil society. It’s not that we need less conservative algorithms; it’s that whoever controls the algorithms has a disproportionate say over the electorate’s view of the world.
A private intelligence company run by a young founder is now taking that hacked data from what it says are more than 50 million computers and reselling it for profit to a wide range of industries, including debt collectors, couples in divorce proceedings, and even companies looking to poach their rivals’ customers.
“The golden rule of privacy… is that information collected for one purpose can’t be repurposed without permission. We don’t have that rule.” Instead, Nojeim explained, the US has a “sectoral approach” which is weakest in areas related to law enforcement access. “We’re at a point in the march forward of technology,” he said, where “more human thought than ever in the history of mankind is becoming available to the government without the need for a warrant or court permission. Where that goes depends entirely on the goodwill of professionals who enforce vague laws that have been designed intentionally to give them flexibility in emergencies. Unfortunately, those laws are being exploited by the Administration, which is making false claims about emergency [and] national security risk.”
When the government can track where you go, whom you associate with, and what you spend your money on, it violates the Fourth Amendment. It also chills First Amendment freedom of expression, undermines your freedom to travel, and destroys what Justice Louis Brandeis famously called “the right to be let alone” — the fundamental privacy right that underlies American liberty.
I think in a lot of cases when people claim they have nothing to hide, they jump to thinking about illegal or malicious things. In fact, privacy, for me, isn’t about “hiding” things at all. You should be able to have the space—both in the physical and digital world—to not be surveilled or have your actions tracked. People should be able to act without intrusion from others. That doesn’t mean you’re hiding anything; you just don’t want to share everything you do with everyone (or anyone). And that’s why privacy is considered a fundamental human right.
The refusal of these kinds of AI to admit ignorance or incapacity, and their obstinate preference for generating incorrect but plausible-looking answers instead, are among their most dangerous characteristics. It is extremely easy for a user to pose a question to an LLM, get what looks like a valid answer, and then trust it, without doing the careful inspection necessary to check that it is actually right.
This is an area where human experts and LLMs diverge. Both are capable of making mistakes, but humans who have expertise in a certain area are generally more aware of the limits and gaps in their knowledge.
If you've used 23andMe, you should follow these steps to delete your data and (if applicable) destroy the original DNA sample before the company is sold. Otherwise, your genetic information will be subject to the whims of whatever company acquires the assets of 23andMe.
Amazon is a paperclip-maximizing artificial intelligence, and the paperclip it wants to maximize is profits, and the path to maximum profits is to charge infinity dollars for things that cost you zero dollars. Infinite prices and nonexistent wages are Amazon's twin pole-stars. Amazon warehouse workers don't have to be injured at three times the industry average, but maiming workers is cheaper than keeping them in good health. Once Amazon vanquished its competitors and captured the majority of US consumers, it raised prices and used its market dominance to force everyone else to raise their prices, too. Call it "bezosflation."
And therein lies a fundamental problem with the location data broker industry: Those of us who use cellphone apps cannot and do not know which of the hundreds of data brokers is buying and trading in our personal information, and we have very little choice in the matter.
It’s become common knowledge that our personal data is collected, sold, and bought as we use the internet. We often chalk this up to advertising. Ads may feel creepy at times, but isn't that an acceptable tradeoff to access apps and content for free?
Maybe, if serving ads was the only way our data was used. But Lena Cohen at the Electronic Frontier Foundation explains how the "Real Time Bidding" (RTB) process behind targeted ads is exploited to collect personal information for a broader range of purposes:
The moment you visit a website or app with ad space, it asks a company that runs ad auctions to determine which ads it will display for you. This involves sending information about you and the content you’re viewing to the ad auction company.
The ad auction company packages all the information they can gather about you into a “bid request” and broadcasts it to thousands of potential advertisers.
The bid request may contain personal information like your unique advertising ID, location, IP address, device details, interests, and demographic information. The information in bid requests is called “bidstream data” and can easily be linked to real people.
Advertisers use the personal information in each bid request, along with data profiles they’ve built about you over time, to decide whether to bid on ad space.
Advertisers, and their ad buying platforms, can store the personal data in the bid request regardless of whether or not they bid on ad space.
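To make “bidstream data” concrete, here is a minimal sketch of the kind of payload a bid request can carry. The field names loosely follow the OpenRTB convention, but the structure, the app, and every value shown are illustrative assumptions, not a capture of any real exchange’s traffic.

```python
import json

# Minimal sketch of a bid request, loosely modeled on OpenRTB-style field names.
# Everything here (the app, IDs, coordinates, demographics) is an illustrative assumption.
bid_request = {
    "id": "auction-7f3c9a",                        # one auction for one ad slot
    "app": {"bundle": "com.example.weatherapp"},   # hypothetical app requesting the ad
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID: a persistent identifier
        "ip": "203.0.113.42",
        "os": "Android",
        "make": "Samsung",
        "model": "SM-G991B",
        "geo": {"lat": 38.8977, "lon": -77.0365},  # precise location, when the app can share it
    },
    "user": {
        "yob": 1987,                               # year of birth
        "gender": "F",
        "keywords": "parenting,fitness",           # inferred interests
    },
}

# This whole object is broadcast to every auction participant, not just the winner.
print(json.dumps(bid_request, indent=2))
```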
Anyone who participates in the auction receives all of this personal information, even if they never bid or never intend to place an advertisement. This incentivizes companies to participate in as many bidding networks as possible, cross-referencing and combining data to build extensive profiles of users, as the example below and the sketch that follows it illustrate:
Mobilewalla collected data on over a billion people, with an estimated 60% sourced directly from RTB auctions. The company then sold this data for a range of invasive purposes, including tracking union organizers, tracking people at Black Lives Matter protests, and compiling home addresses of healthcare employees for recruitment by competing employers. It also categorized people into custom groups for advertisers, such as “pregnant women,” “Hispanic churchgoers,” and “members of the LGBTQ+ community.”
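Because the advertising ID rides along in every request, stitching records from many apps and auctions into a single profile is essentially a join on that one key. Here is a minimal sketch under that assumption; the apps, coordinates, and records are invented for illustration, not drawn from any broker’s actual data.

```python
from collections import defaultdict

# Hypothetical bidstream records a broker might log across several auctions.
# The advertising ID ("ifa") is the shared key linking them to one device.
records = [
    {"ifa": "38400000-8cf0-11bd-b23e-10b96e40000d", "app": "com.example.weatherapp",  "lat": 38.8977, "lon": -77.0365},
    {"ifa": "38400000-8cf0-11bd-b23e-10b96e40000d", "app": "com.example.prayertimes", "lat": 38.8895, "lon": -77.0353},
    {"ifa": "38400000-8cf0-11bd-b23e-10b96e40000d", "app": "com.example.datingapp",   "lat": 38.9072, "lon": -77.0369},
]

# Group by advertising ID: one device, many apps, a trail of locations.
profiles = defaultdict(lambda: {"apps": set(), "locations": []})
for r in records:
    profile = profiles[r["ifa"]]
    profile["apps"].add(r["app"])
    profile["locations"].append((r["lat"], r["lon"]))

# App usage hints at interests, religion, or relationships; the location trail
# ties them to real places -- the raw material for segments like those above.
for ifa, profile in profiles.items():
    print(ifa, sorted(profile["apps"]), f"{len(profile['locations'])} location points")
```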
Any data broker that participates in this system can store, consolidate, and sell this information for any purpose outside of advertising:
Some sell their tools to governments for mass surveillance. In the U.S., this is used as a workaround for Fourth Amendment protections. Agencies claim users have inherently consented to their data being sold to law enforcement by nature of using the internet. They use this to access troves of information that would otherwise require a warrant.
There are some steps you can take to better protect yourself. But the most thorough and permanent solution is to ban behavioral advertising altogether.