How to poison the data that Big Tech uses to surveil you

Published: March 8, 2021

Algorithms are meaningless without good data. The public can exploit that to demand change.

Every day, your life leaves a trail of digital breadcrumbs that tech giants use to track you. You send an email, order some food, stream a show. They get back valuable packets of data to build up their understanding of your preferences. That data is fed into machine-learning algorithms to target you with ads and recommendations. Google cashes your data in for over $120 billion a year of ad revenue.

Increasingly, we can no longer opt out of this arrangement. In 2019 Kashmir Hill, then a reporter for Gizmodo, famously tried to cut five major tech giants out of her life. She spent six weeks being miserable, struggling to perform basic digital functions. The tech giants, meanwhile, didn’t even feel an itch.

Now researchers at Northwestern University are suggesting new ways to redress this power imbalance by treating our collective data as a bargaining chip. Tech giants may have fancy algorithms at their disposal, but they are meaningless without enough of the right data to train on.

In a new paper being presented at the Association for Computing Machinery’s Fairness, Accountability, and Transparency conference next week, researchers including PhD students Nicholas Vincent and Hanlin Li propose three ways the public can exploit this to their advantage:

  • Data strikes, inspired by the idea of labor strikes, which involve withholding or deleting your data so a tech firm cannot use it—leaving a platform or installing privacy tools, for instance.
  • Data poisoning, which involves contributing meaningless or harmful data. AdNauseam, for example, is a browser extension that clicks on every single ad served to you, thus confusing Google’s ad-targeting algorithms (a toy sketch of the idea follows this list).
  • Conscious data contribution, which involves giving meaningful data to the competitor of a platform you want to protest, such as by uploading your Facebook photos to Tumblr instead.
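
To make the poisoning tactic concrete, here is a minimal, self-contained simulation. It is a sketch only: the topic list, the click probabilities, and the "targeter" are invented stand-ins, not AdNauseam's actual code or Google's real ad-targeting system. A toy targeter guesses each user's interest from click counts, and indiscriminate AdNauseam-style clicks flatten that signal until the guess is barely better than chance.

```python
import random
from collections import Counter

# Hypothetical interest categories a toy ad targeter might track.
TOPICS = ["sports", "cooking", "travel", "finance", "gaming"]

def genuine_clicks(true_interest: str, n: int) -> list[str]:
    """A user mostly clicks ads in their real topic of interest, with some natural drift."""
    return [true_interest if random.random() < 0.8 else random.choice(TOPICS) for _ in range(n)]

def poison_clicks(n: int) -> list[str]:
    """AdNauseam-style noise: indiscriminate clicks spread over every topic shown."""
    return [random.choice(TOPICS) for _ in range(n)]

def inferred_interest(clicks: list[str]) -> str:
    """The toy targeter's 'model': whichever topic the user clicked most often."""
    return Counter(clicks).most_common(1)[0][0]

def targeting_accuracy(noise_ratio: float, users: int = 2000, clicks_per_user: int = 50) -> float:
    """Fraction of simulated users whose true interest the targeter still guesses correctly."""
    hits = 0
    for _ in range(users):
        true_interest = random.choice(TOPICS)
        n_noise = int(clicks_per_user * noise_ratio)
        history = genuine_clicks(true_interest, clicks_per_user - n_noise) + poison_clicks(n_noise)
        hits += inferred_interest(history) == true_interest
    return hits / users

if __name__ == "__main__":
    random.seed(0)
    for ratio in (0.0, 0.5, 0.8, 0.95):
        print(f"noise clicks: {ratio:>4.0%}  ->  targeting accuracy: {targeting_accuracy(ratio):.2f}")
```

The exact numbers are meaningless; the point is the shape of the curve: the more of a click history that is noise, the less the profile reveals, which is the leverage the researchers describe.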

People already use many of these tactics to protect their own privacy. If you’ve ever used an ad blocker or another browser extension that modifies your search results to exclude certain websites, you’ve engaged in data striking and reclaimed some agency over the use of your data. But as Hill found, sporadic individual actions like these don’t do much to get tech giants to change their behaviors.

What if millions of people were to coordinate to poison a tech giant’s data well, though? That might just give them some leverage to assert their demands.

There may have already been a few examples of this. In January, millions of users deleted their WhatsApp accounts and moved to competitors like Signal and Telegram after Facebook announced that it would begin sharing WhatsApp data with the rest of the company. The exodus caused Facebook to delay its policy changes.

Just this week, Google also announced that it would stop tracking individuals across the web and targeting ads at them. While it’s unclear whether this is a real change or just a rebranding, says Vincent, it’s possible that the increased use of tools like AdNauseam contributed to that decision by degrading the effectiveness of the company’s algorithms. (Of course, it’s ultimately hard to tell. “The only person who really knows how effectively a data leverage movement impacted a system is the tech company,” he says.)

Vincent and Li think these campaigns can complement strategies such as policy advocacy and worker organizing in the movement to resist Big Tech.

“It’s exciting to see this kind of work,” says Ali Alkhatib, a research fellow at the University of San Francisco’s Center for Applied Data Ethics, who was not involved in the research. “It was really interesting to see them thinking about the collective or holistic view: we can mess with the well and make demands with that threat, because it is our data and it all goes into this well together.”

There is still work to be done to make these campaigns more widespread. Computer scientists could play an important role in making more tools like AdNauseam, for example, which would help lower the barrier to participating in such tactics. Policymakers could help too. Data strikes are most effective when bolstered by strong data privacy laws, such as the European Union’s General Data Protection Regulation (GDPR), which gives consumers the right to request the deletion of their data. Without such regulation, it’s harder to guarantee that a tech company will give you the option to scrub your digital records, even if you remove your account.

And some questions remain to be answered. How many people does a data strike need to damage a company’s algorithm? And what kind of data would be most effective in poisoning a particular system? In a simulation involving a movie recommendation algorithm, for example, the researchers found that if 30% of users went on strike, it could cut the system’s accuracy by 50%. But every machine-learning system is different, and companies constantly update them. The researchers hope that more people in the machine-learning community can run similar simulations of different companies’ systems and identify their vulnerabilities.
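
The researchers' 30-percent figure comes from their own experiments, but the flavor of such a simulation is easy to reproduce in miniature. The sketch below is a stand-in, not the paper's code or dataset: the ratings matrix is synthetic, the sizes and hyperparameters are invented, and the recommender is a small alternating-least-squares matrix factorization chosen for brevity. It trains the same toy model twice, once on everyone's ratings and once after a randomly chosen 30% of users withhold theirs, and prints the held-out error in each case.

```python
import numpy as np

rng = np.random.default_rng(0)

N_USERS, N_ITEMS, DIM = 300, 600, 5          # invented sizes, not the paper's dataset
TRAIN_PER_USER, TEST_PER_USER = 12, 4
LAMBDA, ITERS = 0.1, 15                      # ridge regularization and ALS sweeps

# Synthetic "movie tastes": rating = user factors . item factors + noise.
true_u = rng.normal(size=(N_USERS, DIM))
true_v = rng.normal(size=(N_ITEMS, DIM))
ratings = true_u @ true_v.T + rng.normal(scale=0.5, size=(N_USERS, N_ITEMS))

# Each user reveals a few ratings to the platform; the rest are held out for scoring.
train_mask = np.zeros((N_USERS, N_ITEMS), dtype=bool)
test_mask = np.zeros((N_USERS, N_ITEMS), dtype=bool)
for u in range(N_USERS):
    picks = rng.choice(N_ITEMS, size=TRAIN_PER_USER + TEST_PER_USER, replace=False)
    train_mask[u, picks[:TRAIN_PER_USER]] = True
    test_mask[u, picks[TRAIN_PER_USER:]] = True

def als_factorize(observed: np.ndarray):
    """Fit a small matrix-factorization recommender by alternating least squares,
    using only the ratings marked True in `observed`."""
    U = rng.normal(scale=0.1, size=(N_USERS, DIM))
    V = rng.normal(scale=0.1, size=(N_ITEMS, DIM))
    reg = LAMBDA * np.eye(DIM)
    for _ in range(ITERS):
        for u in range(N_USERS):                 # update user factors
            idx = np.nonzero(observed[u])[0]
            if idx.size:
                Vo = V[idx]
                U[u] = np.linalg.solve(Vo.T @ Vo + reg, Vo.T @ ratings[u, idx])
        for i in range(N_ITEMS):                 # update item factors
            idx = np.nonzero(observed[:, i])[0]
            if idx.size:
                Uo = U[idx]
                V[i] = np.linalg.solve(Uo.T @ Uo + reg, Uo.T @ ratings[idx, i])
    return U, V

def heldout_rmse(contributors: np.ndarray, eval_users: np.ndarray) -> float:
    """Train only on contributors' ratings; score RMSE on eval_users' held-out ratings."""
    U, V = als_factorize(train_mask & contributors[:, None])
    preds = U @ V.T
    m = test_mask & eval_users[:, None]
    return float(np.sqrt(np.mean((ratings[m] - preds[m]) ** 2)))

everyone = np.ones(N_USERS, dtype=bool)
strikers = rng.random(N_USERS) < 0.30            # 30% of users withhold (delete) their data
stayers = ~strikers

print(f"held-out RMSE, trained on everyone : {heldout_rmse(everyone, stayers):.3f}")
print(f"held-out RMSE, after a 30% strike  : {heldout_rmse(stayers, stayers):.3f}")
```

Because the remaining users' own ratings are untouched in both runs, any rise in error comes purely from the collective data that walked out the door, which is the "data leverage" the researchers are trying to quantify.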

Alkhatib suggests that scholars should do more research on how to inspire collective data action as well. “Collective action is really hard,” he says. “Getting people to follow through on ongoing action is one challenge. And then there’s the challenge of how do you keep a group of people who are very transient—in this case it might be people who are using a search engine for five seconds—to see themselves as part of a community that actually has longevity?”

These tactics might also have downstream consequences that need careful examination, he adds. Could data poisoning end up just adding more work for content moderators and other people tasked with cleaning and labeling the companies’ training data?

But overall, Vincent, Li, and Alkhatib are optimistic that data leverage could turn into a persuasive tool to shape how tech giants treat our data and our privacy. “AI systems are dependent on data. It’s just a fact about how they work,” Vincent says. “Ultimately, that is a way the public can gain power.”
