
You’ve Been Flagged as a Threat: Predictive AI Technology Puts a Target on Your Back

Published: May 11, 2022

“The government solution to a problem is usually as bad as the problem and very often makes the problem worse.”—Milton Friedman

You’ve been flagged as a threat.

Before long, every household in America will be similarly flagged and assigned a threat score.

Without having ever knowingly committed a crime or been convicted of one, you and your fellow citizens have likely been assessed for behaviors the government might consider devious, dangerous or concerning; assigned a threat score based on your associations, activities and viewpoints; and catalogued in a government database according to how you should be approached by police and other government agencies based on your particular threat level.

If you’re not unnerved over the ramifications of how such a program could be used and abused, keep reading.

It’s just a matter of time before you find yourself wrongly accused, investigated and confronted by police based on a data-driven algorithm or risk assessment compiled by a computer program run by artificial intelligence.

Consider the case of Michael Williams, who spent almost a year in jail for a crime he didn’t commit. Williams was behind the wheel when a passing car fired at his vehicle, killing his 25-year-old passenger Safarian Herring, who had hitched a ride.

Despite the fact that Williams had no motive, there were no eyewitnesses to the shooting, no gun was found in the car, and Williams himself drove Herring to the hospital, police charged the 65-year-old man with first-degree murder based on ShotSpotter, a gunshot detection program that had picked up a loud bang on its network of surveillance microphones and triangulated the noise to correspond with a noiseless security video showing Williams’ car driving through an intersection. The case was eventually dismissed for lack of evidence.
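To see how a system like ShotSpotter ties a loud bang to a street address, it helps to sketch the underlying idea: if several microphones at known positions register the same sound at slightly different times, those time gaps can be solved for a source location (time-difference-of-arrival multilateration). The sensor positions, timestamps and solver below are invented for illustration; they are not ShotSpotter’s proprietary method.

```python
# A minimal time-difference-of-arrival (TDOA) sketch, assuming four
# microphones at known positions heard the same bang. All positions,
# timestamps and the solver choice are hypothetical illustrations,
# not ShotSpotter's actual algorithm.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at 20 C

# Hypothetical microphone positions (x, y) in meters and the times (seconds)
# at which each one registered the bang.
mics = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
arrival_times = np.array([1.002, 1.310, 1.290, 1.480])

def residuals(source_xy):
    """Mismatch between measured and predicted arrival-time gaps,
    using the first microphone as the reference."""
    dists = np.linalg.norm(mics - source_xy, axis=1)
    predicted_gaps = (dists - dists[0]) / SPEED_OF_SOUND
    measured_gaps = arrival_times - arrival_times[0]
    return predicted_gaps - measured_gaps

# Start at the center of the sensor grid and refine with least squares.
estimate = least_squares(residuals, x0=mics.mean(axis=0)).x
print(f"Estimated source location: ({estimate[0]:.1f} m, {estimate[1]:.1f} m)")
```

Everything downstream rides on the assumption that the detected bang was a gunshot in the first place, which is precisely where these systems stumble.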

Although gunshot detection programs like ShotSpotter are gaining popularity with law enforcement agencies, prosecutors and courts alike, they are riddled with flaws, mistaking “dumpsters, trucks, motorcycles, helicopters, fireworks, construction, trash pickup and church bells…for gunshots.”

As an Associated Press investigation found, “the system can miss live gunfire right under its microphones, or misclassify the sounds of fireworks or cars backfiring as gunshots.”

In one community, ShotSpotter worked less than 50% of the time.

Then there’s the human element of corruption, which invariably gets added to the mix. In some cases, “employees have changed sounds detected by the system to say that they are gunshots.” Forensic reports prepared by ShotSpotter’s employees have also “been used in court to improperly claim that a defendant shot at police, or provide questionable counts of the number of shots allegedly fired by defendants.”

The same company that owns ShotSpotter also owns a predictive policing program that aims to use gunshot detection data to “predict” crime before it happens. Both Presidents Biden and Trump have pushed for greater use of these predictive programs to combat gun violence in communities, despite the fact that they have not been found to reduce gun violence or increase community safety.

The rationale behind this fusion of widespread surveillance, behavior prediction technologies, data mining, precognitive technology, and neighborhood and family snitch programs is purportedly to enable the government to take preemptive steps to combat crime (or whatever the government has chosen to outlaw at any given time).

This is precrime, straight out of the realm of dystopian science fiction movies such as Minority Report. It purports to prevent crimes before they happen, but in fact it’s just another means of getting the citizenry in the government’s crosshairs in order to lock down the nation.

Even Social Services is getting in on the action, with computer algorithms attempting to predict which households might be guilty of child abuse and neglect.

All it takes is an AI bot flagging a household for potential neglect for that family to be investigated, found guilty and have its children placed in foster care.

Mind you, potential neglect can include everything from inadequate housing to poor hygiene, but is different from physical or sexual abuse.

According to an investigative report by the Associated Press, once incidents of potential neglect are reported to a child protection hotline, the reports are run through a screening process that pulls together “personal data collected from birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets.” The algorithm then calculates the child’s potential risk and assigns a score of 1 to 20 to predict the risk that a child will be placed in foster care in the two years after they are investigated. “The higher the number, the greater the risk. Social workers then use their discretion to decide whether to investigate.”
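Stripped of specifics, what the AP describes is a scoring pipeline: records from several government data sets are merged into features, a model turns them into a risk estimate, and the result is bucketed into a 1-to-20 score that a screener weighs against a threshold. The sketch below is a hypothetical illustration of that flow only; the features, weights and cutoff are invented and have nothing to do with the actual tool’s internals.

```python
# Hedged sketch of a records-based screening score like the one the AP
# describes: several government data sets are merged into features, a model
# produces a risk estimate, and the result is bucketed into a 1-20 score.
# Every feature, weight and threshold here is invented for illustration.
from dataclasses import dataclass

@dataclass
class HouseholdRecord:
    prior_hotline_reports: int
    months_on_public_benefits: int
    parent_jail_records: int
    substance_abuse_flags: int

# Hypothetical linear weights standing in for a trained model.
WEIGHTS = {
    "prior_hotline_reports": 0.9,
    "months_on_public_benefits": 0.02,
    "parent_jail_records": 0.7,
    "substance_abuse_flags": 0.8,
}

def screening_score(record: HouseholdRecord) -> int:
    """Map merged records to a 1-20 score; higher means 'greater risk'."""
    raw = sum(WEIGHTS[name] * getattr(record, name) for name in WEIGHTS)
    # Squash the raw value into the 1-20 range used by the screeners.
    return max(1, min(20, round(raw)))

INVESTIGATE_AT_OR_ABOVE = 15  # hypothetical "mandatory screen-in" cutoff

family = HouseholdRecord(prior_hotline_reports=3, months_on_public_benefits=48,
                         parent_jail_records=1, substance_abuse_flags=2)
score = screening_score(family)
print(score, "-> investigate" if score >= INVESTIGATE_AT_OR_ABOVE
      else "-> screener discretion")
```

Notice that every input is a record of prior contact with public systems, which is why, as noted below, the families who appear most often in those systems score the highest.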

Other predictive models being used across the country strive to “assess a child’s risk for death and severe injury, whether children should be placed in foster care and if so, where.”

Incredibly, there’s no way for a family to know if AI predictive technology was responsible for their being targeted, investigated and separated from their children. As the AP notes, “Families and their attorneys can never be sure of the algorithm’s role in their lives either because they aren’t allowed to know the scores.”

One thing we do know, however, is that the system disproportionately targets poor, black families for intervention, disruption and possibly displacement, because much of the data being used is gleaned from lower income and minority communities.

The technology is also far from infallible. In one county alone, a technical glitch presented social workers with the wrong scores, either underestimating or overestimating a child’s risk.

Yet fallible or not, AI predictive screening programs are being used widely across the country by government agencies to surveil and target families for investigation. The fallout of this oversurveillance, according to Aysha Schomburg, the associate commissioner of the U.S. Children’s Bureau, is “mass family separation.”

The impact of these kinds of AI predictive tools is being felt in almost every area of life.

Under the pretext of helping overwhelmed government agencies work more efficiently, AI predictive and surveillance technologies are being used to classify, segregate and flag the populace with little concern for privacy rights or due process.

All of this sorting, sifting and calculating is being done swiftly, secretly and incessantly with the help of AI technology and a surveillance state that monitors your every move.

Where this becomes particularly dangerous is when the government takes preemptive steps to combat crime or abuse, or whatever the government has chosen to outlaw at any given time.

In this way, government agents—with the help of automated eyes and ears, a growing arsenal of high-tech software, hardware and techniques, government propaganda urging Americans to turn into spies and snitches, as well as social media and behavior sensing software—are spinning a sticky spider-web of threat assessments, behavioral sensing warnings, flagged “words,” and “suspicious” activity reports aimed at snaring potential enemies of the state.
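What a flagged “word” or automated suspicious-activity report looks like in practice is rarely disclosed, but the basic mechanics are not mysterious: scan a stream of posts against a watchword list and log a report for every hit. The watchwords, categories and report format below are invented for illustration; real monitoring programs keep theirs secret.

```python
# Hedged sketch of keyword "flagging": scan text against a watchword list
# and emit a suspicious-activity record for each match. The watchwords,
# categories and report fields are invented for illustration.
import re
from datetime import datetime, timezone

# Hypothetical watchword list; real programs do not publish theirs.
WATCHWORDS = {
    "protest": "civil unrest",
    "militia": "extremism",
    "prepper": "extremism",
    "tyranny": "anti-government sentiment",
}

def flag_post(author: str, text: str) -> list[dict]:
    """Return one 'suspicious activity' record per watchword hit."""
    reports = []
    for word, category in WATCHWORDS.items():
        if re.search(rf"\b{re.escape(word)}\b", text, re.IGNORECASE):
            reports.append({
                "author": author,
                "matched_word": word,
                "category": category,
                "seen_at": datetime.now(timezone.utc).isoformat(),
            })
    return reports

for report in flag_post("veteran_01", "Heading to the protest downtown."):
    print(report)
```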

Are you a military veteran suffering from post-traumatic stress disorder? Have you expressed controversial, despondent or angry views on social media? Do you associate with people who have criminal records or subscribe to conspiracy theories? Were you seen looking angry at the grocery store? Is your appearance unkempt in public? Has your driving been erratic? Did the previous occupants of your home have any run-ins with police?

All of these details and more are being used by AI technology to create a profile of you that will impact your dealings with government.

It’s the American police state rolled up into one oppressive pre-crime and pre-thought crime package, and the end result is the death of due process.

In a nutshell, due process was intended as a bulwark against government abuses. Due process prohibits the government from depriving anyone of “Life, Liberty, and Property” without first ensuring that an individual’s rights have been recognized and respected and that they have been given the opportunity to know the charges against them and defend against those charges.

With the advent of government-funded AI predictive policing programs that surveil and flag someone as a potential threat to be investigated and treated as dangerous, there can be no assurance of due process: you have already been turned into a suspect.

To disentangle yourself from the fallout of such a threat assessment, the burden of proof rests on you to prove your innocence.

You see the problem?

It used to be that every person had the right to be presumed innocent until proven guilty, and the burden of proof rested with one’s accusers. That presumption of innocence has since been turned on its head by a surveillance state that renders us all suspects and by overcriminalization, which renders us all potentially guilty of some wrongdoing or other.

Combine predictive AI technology with surveillance and overcriminalization, then add militarized police crashing through doors in the middle of the night to serve a routine warrant, and you’ll be lucky to escape with your life.

Yet be warned: once you get snagged by a surveillance camera, flagged by an AI predictive screening program, and placed on a government watch list—whether it’s a watch list for child neglect, a mental health watch list, a dissident watch list, a terrorist watch list, or a red flag gun watch list—there’s no clear-cut way to get off, whether or not you should actually be on there.

You will be tracked wherever you go, flagged as a potential threat and dealt with accordingly.

If you’re not scared yet, you should be.

We’ve made it too easy for the government to identify, label, target, defuse and detain anyone it views as a potential threat for a variety of reasons that run the gamut from mental illness to having a military background to challenging its authority to just being on the government’s list of persona non grata.

As I make clear in my book Battlefield America: The War on the American People and in its fictional counterpart The Erik Blair Diaries, you don’t even have to be a dissident to get flagged by the government for surveillance, censorship and detention.

All you really need to be is a citizen of the American police state.

ABOUT JOHN W. WHITEHEAD

Constitutional attorney and author John W. Whitehead is founder and president of The Rutherford Institute. His books Battlefield America: The War on the American People and A Government of Wolves: The Emerging American Police State are available at www.amazon.com. He can be contacted at johnw@rutherford.org. Nisha Whitehead is the Executive Director of The Rutherford Institute. Information about The Rutherford Institute is available at www.rutherford.org.
