And therein lies a fundamental problem with the location data broker industry: Those of us who use cellphone apps cannot and do not know which of the hundreds of data brokers is buying and trading in our personal information, and we have very little choice in the matter.
It’s become common knowledge that our personal data is collected, sold, and bought as we use the internet. We often chalk this up to advertising. Ads may feel creepy at times, but isn't that an acceptable tradeoff to access apps and content for free?
Maybe, if serving ads were the only way our data was used. But Lena Cohen at the Electronic Frontier Foundation explains how the "Real Time Bidding" (RTB) process behind targeted ads is exploited to collect personal information for a much broader range of purposes:
The moment you visit a website or app with ad space, it asks a company that runs ad auctions to determine which ads it will display for you. This involves sending information about you and the content you’re viewing to the ad auction company.
The ad auction company packages all the information they can gather about you into a “bid request” and broadcasts it to thousands of potential advertisers.
The bid request may contain personal information like your unique advertising ID, location, IP address, device details, interests, and demographic information. The information in bid requests is called “bidstream data” and can easily be linked to real people.
Advertisers use the personal information in each bid request, along with data profiles they’ve built about you over time, to decide whether to bid on ad space.
Advertisers, and their ad buying platforms, can store the personal data in the bid request regardless of whether or not they bid on ad space.
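To make that broadcast concrete, here is a minimal sketch of the personal data a single bid request can carry. The field names below are loosely modeled on the OpenRTB format, but the structure and values are illustrative assumptions, not an excerpt from any spec:

```python
import json

# Illustrative sketch of the personal data one bid request can carry.
# Field names loosely follow OpenRTB conventions; the values and exact
# structure are assumptions for illustration, not real spec output.
bid_request = {
    "id": "auction-7f3a",  # one auction, broadcast to thousands of bidders
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # unique advertising ID
        "ip": "203.0.113.42",
        "geo": {"lat": 47.6062, "lon": -122.3321},  # precise location
        "os": "iOS",
        "model": "iPhone14,5",
    },
    "user": {
        "yob": 1991,      # year of birth
        "gender": "F",
        "keywords": "parenting,health,churchgoer",  # inferred interests
    },
    "site": {"page": "https://example.com/article"},  # what you're reading
}

# Every auction participant receives this payload whether or not they bid,
# which is why it can be stored, consolidated, and resold as "bidstream data".
print(json.dumps(bid_request, indent=2))
```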
Anyone who participates in the auction, even if they don't bid or ever plan to place an advertisement, receives all of this personal information. This incentivizes companies to participate in as many bidding networks as possible, cross-referencing and combining data to build out extensive profiles of users:
Mobilewalla collected data on over a billion people, with an estimated 60% sourced directly from RTB auctions. The company then sold this data for a range of invasive purposes, including tracking union organizers, tracking people at Black Lives Matter protests, and compiling home addresses of healthcare employees for recruitment by competing employers. It also categorized people into custom groups for advertisers, such as “pregnant women,” “Hispanic churchgoers,” and “members of the LGBTQ+ community.”
Any data broker that participates in this system can store, consolidate, and sell this information for any purpose outside of advertising:
Some sell their tools to governments for mass surveillance. In the U.S., this is used as a workaround for Fourth Amendment protections. Agencies claim users have inherently consented to their data being sold to law enforcement simply by using the internet, and they use this to access troves of information that would otherwise require a warrant.
There are some steps you can take to better protect yourself. But the most thorough and permanent solution is to ban behavioral advertising altogether.
Cultural technologies are more than just inventions. They “fundamentally alter how we think, create, and make meaning.”
The railroad, the telegraph, and the mechanical clock each began as tools, but their intersection created something far more profound. The need to coordinate train schedules across distances led to standardized time zones, fundamentally transforming how humans conceptualize and experience time itself. This wasn’t just about making trains run on time – it reshaped human consciousness, creating new concepts of punctuality, new forms of social coordination, and new ways of thinking about time as something that could be wasted or saved.
We set arbitrary limits on what we believe we're "allowed" to do. This list is helpful to revisit from time to time. Each time I read it, I rethink the limits I've placed on myself.
Sometimes the stated values or purpose of a person or organization are useful for judging what they intended to do, but intent never matters nearly as much as impact, so we have to treat the actual outputs of that person or system as the ultimate truth when we assess them.
When we spend our energy pinpointing how systems are supposed to work, we lose sight of their actual impact.
by Emily M. Bender, Alex Hanna, and Adrienne Williams
New(ish) technology, same people and problems. This episode of Mystery AI Hype Theater 3000 explores the ways billionaires with little-to-no experience in education continue pushing solutions based on... what they guess teaching entails?
Adrienne Williams (former educator and current researcher/organizer) reflects:
Most of what I did as a teacher wasn't actually the learning. The kids pick up that learning very quickly if they're happy and they're comfortable and they've eaten food and they aren't being bullied.
...
When they have ten thousand other things going on in their head, the last thing that's going to help is some vacant AI bot just saying whatever, hallucinating whenever it wants to.
The whole episode is worth a listen, but if you're in a crunch you can read the transcript. Other highlights include:
Bill Gates has a history of unilaterally pushing education "reforms" without consulting actual educators, only realizing and acknowledging they don't work after permanently altering the American education system
Silicon Valley "disrupters" exclusively experiment on students in the poorest, least-funded districts because top-rated schools (where they likely send their own children) don't consider the products to be as revolutionary as they claim
AI platforms tend to provide "blueprints" that are only effective when teachers put in time, energy, and expertise to fill in the gaps
This design illustrates a limitation of LLMs: they lack the understanding and context to reliably follow granular rules about which responses to avoid. The only way to completely block unwanted responses is with blunt, unnuanced filters.
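As a rough illustration of why such filters are blunt, here is a minimal sketch of a keyword-based filter. The blocklist, function, and examples are hypothetical, not any platform's actual moderation code:

```python
import re

# A minimal sketch of the kind of blunt, unnuanced filter described above.
# The blocklist and the check are illustrative assumptions.
BLOCKLIST = {"diagnosis", "dosage", "self-harm"}

def blunt_filter(response: str) -> str:
    """Suppress any response containing a blocked term, regardless of context."""
    tokens = set(re.findall(r"[a-z'-]+", response.lower()))
    if tokens & BLOCKLIST:
        return "I can't help with that."
    return response

# The filter can't follow a granular rule like "avoid giving medical advice,
# but factual definitions are fine": both of these trip the same wire.
print(blunt_filter("A diagnosis is a clinician's identification of an illness."))
print(blunt_filter("Based on your symptoms, my diagnosis is..."))
```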
When you’re optimizing for efficiency, you’re getting rid of redundancies. But when patients’ lives are at stake, you actually want redundancy. You want extra slack in the system. You want multiple sets of eyes on a patient in a hospital.