Why Are Cops Around the World Using This Outlandish Mind-Reading Tool?

Published: December 7, 2019

Source: ProPublica

This story was originally published by ProPublica.

This article was produced in partnership with the South Bend Tribune, a member of ProPublica’s Local Reporting Network in 2018.

ProPublica is a nonprofit newsroom that investigates abuses of power.

The police gave Ricky Joyner a pen and a nine-page questionnaire.

Write what you did, beginning to end, on the day Sandra Hernandez disappeared, one question asked.

“Went ot work …,” Joyner wrote, transposing the letters in “to.” “Went home toke shower got dress pick Sandra up … went out to eat … went the movies … toke Sandra home … stop at [bar] for little while, then spent the night with a grilfriend.”

“Did you cause Sandra to become missing?” another question asked.

“No,” Joyner wrote.

“How do you feel now that you have completed this form?”

“Yes,” Joyner wrote, that one word the entirety of his answer.

When Hernandez went missing in Elkhart, Indiana, in March of 1992, the police suspected Joyner might be responsible. But Joyner, who worked with Hernandez at a door-manufacturing company, denied having anything to do with her disappearance.

To assess Joyner’s credibility, Elkhart police turned to a tool — well known to many police departments, little known to the public — called Scientific Content Analysis, or SCAN for short.

A detective, trained in SCAN, reviewed Joyner’s written answers. He also examined the answers of a second suspect who filled out the same questionnaire. After conducting his analysis, the detective typed up a two-page report. The second suspect’s responses were “truthful,” the detective concluded. Joyner’s, he determined, were “deceptive.”

He noted that while summarizing the day Hernandez disappeared, Joyner had not used the word “I,” writing, for example, “went home,” not, “I went home.” “That in itself is a signal of deception,” the detective wrote. Instead of writing “my girlfriend,” Joyner had written “a girlfriend.” What’s more, the detective wrote, Joyner’s handwriting was larger and more spread out in the answer’s last two lines than in the previous seven.

When asked why the police should believe his answers, Joyner had written, “I have nothing to hide.”

“This is not the same as stating I did not lie,” the detective wrote.

When Hernandez was later found dead, Joyner was charged with, and convicted of, murder.

In July, ProPublica and the South Bend Tribune wrote about the questionable evidence used against Joyner at trial. But in Joyner’s case, as in many others, the police set the investigation’s course early on using an investigative tool that exists out of public view. Such tools rarely, if ever, make it into the courtroom because they’re too unreliable to clear even the low threshold for evidence allowed at trial.

SCAN, a product sold by a company called the Laboratory for Scientific Interrogation (LSI), has, in the words of four scholars in a 2016 study, “no empirical support” — meaning, there’s no dependable research showing that it works.

Scientific Content Analysis is akin to other investigative tools scrutinized by ProPublica, including bloodstain-pattern analysis and photo analysis. These analytical techniques promise a degree of certainty — about how blood came to spray across a wall, or whether a particular plaid shirt was worn by a robber — that can guide an investigator or shore up a case. The trial evidence presented against Joyner included yet another example: a prosecution expert testified that two plastic garbage bags — one found in Joyner’s apartment, the other around Hernandez’s head — had “definitely” once been connected. (A statistician said in an interview that this testimony was laced with “a lot of unproven assertions.”) Law enforcement officials hold these tools out as science, even though they have little or no scientific backing.

SCAN’s creator has written, “I am pleased to say SCAN has helped solve thousands of cases over the years.”

While police in Elkhart and elsewhere have used the tool to make critical decisions that can establish an investigation’s direction, SCAN has escaped the scrutiny that comes with being offered in court as proof. Appellate opinions often refer to key pieces of evidence used at trial, but a search of legal databases with opinions from around the country turns up precious few mentions of SCAN.

The detective who used SCAN in the Joyner case was Steve Rezutko. He resigned from the Elkhart police in 2001 after an internal investigation found he had engaged in sexual misconduct with an informant. He died, in an apparent suicide, this year.

In 1994, two years after Hernandez’s death, Rezutko was asked in a deposition to describe his training in SCAN.

“Not great,” Rezutko said. “Been to two schools. At the time, I hadn’t done an awful lot, maybe 40 or 50 interpretations, but I had been to a weeklong school in Indianapolis under the guy who … developed the procedure.”

Joyner’s lawyer asked whether a person’s ability to read and comprehend the English language could affect the results of the questionnaire.

“Well ... you struggle with the same questions I struggled with when I went through the school, went through the sessions,” Rezutko said. “I guess it’s kind of like two and two is four. Why is it four? It’s two and two is four all over the world. Why it is I have no idea.”

Rezutko, like officers across the country, took it on faith that SCAN works, without really understanding how or why.

Local, state and federal agencies from the Louisville Metro Police Department to the Michigan State Police to the U.S. State Department have paid for SCAN training. The LSI website lists 417 agencies nationwide, from small-town police departments to the military, that have been trained in SCAN — and that list isn’t comprehensive, because additional ones show up in procurement databases and in public records obtained by ProPublica. Training recipients abroad include law enforcement agencies in Australia, Belgium, Canada, Israel, Mexico, the Netherlands, Singapore, South Africa and the United Kingdom.

The tool’s lack of scientific grounding aside, criminal investigators have been quick to seize upon sales pitches for training, exemplified by a company commander with the famed Texas Rangers, who, in an email to his fellow majors, wrote that SCAN’s creator is “a true master at detecting deception.”


For Avinoam Sapir, the creator of SCAN, sifting truth from deception is as simple as one, two, three.

1. Give the subject a pen and paper.

2. Ask the subject to write down his/her version of what happened.

3. Analyze the statement and solve the case.

Those steps appear on the website for Sapir’s company, based in Phoenix. “SCAN Unlocks the Mystery!” the homepage says, alongside a logo of a question mark stamped on someone’s brain. The site includes dozens of testimonials with no names attached. “Since January when I first attended your course, everybody I meet just walks up to me and confesses!” one says. Acronyms abound (VIEW: Verbal Inquiry - the Effective Witness; REASON: REport Automated SOlution Notes), as do products for sale. “Coming Soon! SCAN Analysis of the Mueller Report,” the website teased this year. LSI offers guidebooks, software, kits, discount packages, cassette tapes of seminars and, for computer wallpaper, a picture of a KGB interrogation room.

SCAN saves time, the site says. It saves money. Police can fax a questionnaire to a hundred people at once, the site says. Those hundred people can fax it back “and then, in less than an hour, the investigator will be able to review the questionnaires and solve the case.” “Past students … have reported a dramatic increase in the amount of information obtained from people,” the site says. “Thus, costly and time-consuming outside investigation was reduced to a minimum.”

SCAN works, the site says. “Analysis of statements has been found to be highly accurate and supported by a validation survey conducted in a U.S. governmental agency. In that survey, when SCAN was compared to other methods, the validity of SCAN reached above 95%,” the site says, without identifying the agency or citing or linking to any survey.

Sapir has outlined his background on LinkedIn and in books he’s written, including one in which he uses SCAN to analyze the biblical book of Genesis. He was born in 1949 in Israel. He got a bachelor’s degree in psychology and criminology at Bar-Ilan University and a master’s in criminology at Tel Aviv University. His master’s thesis was on “Interrogation in Jewish Law.” He served in Israeli military intelligence Unit 8200 (a high-tech spy agency akin to America’s NSA). He became a polygraph examiner with the Israel police. In the mid-1980s, he moved to the United States, where he began teaching SCAN to investigators “on six continents.”

Sapir declined to be interviewed for this story. An email response from his company said, “We are proud that over the past 30+ years, LSI and SCAN have promoted justice in society, both for victims of crime and for innocent suspects.”

SCAN’s purpose, the email said, “is not to accuse but to clear the innocent. ... We have had tens of thousands of past students, who have used SCAN for solving hundreds of thousands of cases; and in the end, the solution of each case was based on physical evidence (which SCAN helped to locate) and/or the subject’s freely given confession. SCAN is being tested every day by finding information from within the text, to be confirmed immediately by independent outside investigation. These confirmations are the rock upon which SCAN is based. After all, reality is the ultimate test in science.”

Sapir has described the principles of SCAN on the LSI website and in products that he sells, including two books, sample analyses, a DVD of a television appearance and a bound anthology of newsletters he has written with dozens of case studies.

 

Screenshot of SCAN’s website. The Laboratory for Scientific Interrogation lists hundreds of state and local law enforcement agencies that have received training in SCAN. They come from 49 states, plus the District of Columbia.

With SCAN, Sapir encourages the asking of a simple, open question: What happened? After the person writes a statement, the SCAN investigator looks for signs of deception, analyzing, among other things, pronouns used, changes in vocabulary, what’s left out and how much of a statement is devoted to what happened before, during and after an event. Indications of truthfulness include use of the past tense, first-person singular (“I went to the store”); pronouns, such as “my,” which signal commitment; and direct denials, the best being: “I did not do it.” Signs of deception include lack of memory, spontaneous corrections and swapping one word in for another — for example, writing “kids” in one place and “children” in another.

The SCAN analyst need not know anything about the person or the case. In fact, that’s preferable, Sapir writes. Outside knowledge might contaminate the analysis, and all that matters is the written statement. Sapir likens SCAN to Sudoku, only with words, not numbers, sentences, not squares: “Everything must fit — left to right, and top to bottom.”

The SCAN course teaches students to diagram, circling pronouns and coloring in a statement with blue, green, purple, yellow, orange and pink. Yellow = “‘Unimportant’ information. For example: ‘Brushed my teeth.’” Pink = “Missing time or missing information. … For example: ‘Later on.’ ‘I don’t remember.’”

To show how SCAN works, Sapir, over the years, has used his invention to analyze statements from public officials and people in the news, including two former FBI directors, Robert Mueller and James Comey.

This year, ProPublica purchased Sapir’s sample analysis of the Mueller report ($2.99, on Kindle).

The report on its first page says the FBI opened an investigation “into whether individuals associated with the Trump Campaign were coordinating with the Russian government.”

“Please note,” Sapir writes: “The report says, ‘whether…’ and not ‘whether or not.’”

“By the omission of ‘or not’ it seems that the FBI was already concentrating on only one option,” Sapir writes.

To grammarians, the use of “or not” in that sentence would be redundant and therefore poor writing. To Sapir, the words’ absence reveals intent.

ProPublica also purchased Sapir’s analysis of Comey’s 2018 memoir, “A Higher Loyalty” ($5, as part of a package deal). A hard copy arrived in the mail with an introductory note from Sapir saying: “Realizing that the analysis of this book has political impact, I decided not to put this analysis on the internet.”

The analysis says: “There are several signals that Comey might be a victim of sexual abuse in childhood.” In the 290-page memoir, Sapir notes 14 instances in which Comey describes the opening or closing of a door, be it to a garage, an office or a minivan. “This activity when it enters an ‘open statement’ is correlated very strongly to child abuse in the speaker’s past,” Sapir writes. “This is due to the fact that child abuse starts when the door opens and it ends when the door is closed.”

Comey sometimes refers to former congressman Anthony Weiner’s computer as a “laptop.” Other times he calls the computer a “computer.” “This is ‘unjustified change of language’ indicating that deception might be present,” Sapir writes.

Twenty times, Sapir writes, Comey used the verb “left.” One time he used the verb “departed.” The ratio — 20 “left” to 1 “departed” — makes the scene with the latter word “quite likely to be deceptive,” Sapir writes. He footnotes this sentence and writes: “[‘Average’ vs. ‘deviation’ = 3 vs. 1 = likely deception; 4 vs. 1 = definite deception].” For these numbers, the footnote provides no source.

A ProPublica reporter emailed Comey, asking to interview him about SCAN and the above analysis of his book.

Comey emailed back: “No comment. Never heard of the alleged tool. (And by using the word ‘never’ in conjunction with the word ‘heard,’ I mean only ‘never heard’ and not to suggest childhood trauma. Yikes.) I’m sorry about your five bucks.”

 

The color coding key for Scientific Content Analysis, or SCAN. It teaches students to diagram, circling every pronoun and coloring in a statement with blue, green, purple, yellow, orange and pink. (Courtesy of Washington State Department of Fish and Wildlife, in response to a public records request)

Although Sapir has conducted SCAN training for more than 30 years, publicly available photos or video of him can be hard to find. But ProPublica purchased a DVD from LSI (cost, $10) showing Sapir on a local-access television station in Sterling Heights, Michigan, in the early 1990s.

On the show he is interviewed by Pat Lehman, then the city’s community relations director.

Lehman asked Sapir about Anita Hill, who testified in 1991 at the confirmation hearings for then-Supreme Court nominee Clarence Thomas. Hill told a Senate committee that when she worked for Thomas at the U.S. Department of Education, “I had a normal social life with other men outside of the office.”

“Let’s take this sentence,” Sapir told Lehman.

“There is only a certain group in society that can label themselves as normal,” Sapir said. “Only the people who were labeled abnormal before.”

“Oh, oh my goodness,” Lehman said.

Sapir went on.

“I don’t have a certificate that I am normal. I mean, I would imagine you don’t have one,” he told Lehman, who smiled and chuckled, a look of fascination on her face. “Who has a certificate that is normal? Only someone who was abnormal before he was labeled normal.”

“Hmm,” Lehman said.

“Think of it,” Sapir said.

In her Senate testimony, Hill once referred to herself as an “individual.” Another time she referred to herself as a “person.”

“Anita Hill never called herself a woman,” Sapir said on the television program. “Never did, even once.” Sapir wondered if Hill “didn’t have some, I would say, problem with sexual identity.”

Sapir also offered his take on Magic Johnson, who, in 1991, announced he was HIV positive. When interviewed by Connie Chung, Johnson said he contracted the virus while having sex with a woman.

But Sapir expressed doubts — to a newspaper in San Antonio and in his interview with Lehman.

Chung had asked Johnson about rumors that he might be gay or bisexual. Johnson had responded, “But I’m not gay.”

Sapir analyzed this exchange for Lehman. “He said, ‘I’m not gay.’ So let’s go on from there. So, you know, I have a calculator. See I have a calculator? I take two. We punch two. Why do we punch two? Because she said you might be either gay or bisexual. That’s two. And he denied one. Yeah? We deduct one, what is the total?”

“One,” Lehman said.

“The other one,” Sapir said.

 

“We Deduct One, What Is the Total?”

On a local-access television station, Avinoam Sapir, the creator of Scientific Content Analysis, attached a lot of meaning to a few words from Magic Johnson.


In 2009, in his first days in office, President Barack Obama signed Executive Order 13491, barring federal agents from using waterboarding and similar torture while gathering intelligence. The order also did something else: It created a special interagency task force to study the effectiveness of various approaches to interrogation.

In short, the government wanted to know: Which techniques work? Which ones don’t? In 2010, the research began. A task force, given the unwieldy name of the High-Value Detainee Interrogation Group (acronym, HIG), contracted with “world-renowned, Ph.D.-level scientists” who specialized in interrogation.

Three agencies make up the HIG: the FBI, CIA and the U.S. Department of Defense.

The HIG’s research program conducted tests, canvassed the scholarship on interrogations and produced scores of peer-reviewed articles. In September 2016, the HIG produced a 93-page review of its findings.

The review devoted just one paragraph to SCAN. Its synopsis was short but withering. SCAN “is widely employed in spite of a lack of supporting research,” the review said. Studies commonly cited in support of SCAN were scientifically flawed, the review said. “When all 12 SCAN criteria were used in a laboratory study, SCAN did not distinguish truth-tellers from liars above the level of chance,” the review said. The synopsis also specifically challenged two of those 12 criteria, noting: “Both gaps in memory and spontaneous corrections have been shown to be indicators of truth, contrary to what is claimed by SCAN.”

In a footnote, the review identified three specific agencies that use SCAN: the FBI, CIA and U.S. Army military intelligence, which falls under the Department of Defense.

Those were the very agencies responsible for this report, concluding there’s no reliable science behind SCAN.

GovSpend.com, which aggregates purchase orders from local, state and federal agencies, turns up four contracts between the Department of Defense and LSI. In 2015, the year before the HIG report, the Defense Department executed two contracts for a combined $97,000 on training from LSI. In 2014, the department spent at least $41,000 on SCAN training, and in 2012, $16,320. There was no competitive bidding, because the training had only one source, according to GovSpend’s data.

A spokesperson at the Pentagon said these contracts were awarded for the Defense Department’s Counter Narcoterrorism Technology Program Office and Counternarcotics and Global Threats Division.

GovSpend’s database is not comprehensive. But it does turn up one contract between the FBI and Sapir’s company. In 2018 — two years after the HIG report, citing the lack of scientific support for SCAN — the FBI spent at least $1,800 for training from LSI. (A public records request for that contract is pending.)

In line with the HIG report, the LSI website says the CIA has received training in SCAN. But when ProPublica submitted a public-records request for those records, the CIA responded that it “can neither confirm nor deny the existence or nonexistence of records responsive to your request. The fact of the existence or nonexistence of such records is itself currently and properly classified.”

The CIA declined to be interviewed about SCAN. “The CIA doesn’t discuss its sources and methods,” spokesperson Chelsea Robinson said. The FBI also declined, issuing this statement: “The FBI uses a variety of tools and techniques in the course of our investigations. We consider each case individually and use the appropriate lawful methods. We decline to discuss particular contracts.”

In 2016, the same year the federal task force released its review of interrogation techniques, four scholars published a study on SCAN in the journal Frontiers in Psychology. The authors — three from the Netherlands, one from England — noted that there had been only four prior studies in peer-reviewed journals on SCAN’s effectiveness. Each of those studies (in 1996, 2012, 2014 and 2015) concluded that SCAN failed to help discriminate between truthful and fabricated statements. The 2016 study found the same. Raters trained in SCAN evaluated 234 statements — 117 true, 117 false. Their results in trying to separate fact from fiction were about the same as chance.

“Scientific Content Analysis has no empirical support to date,” the authors wrote in their conclusion. “As a result, we discourage the application of SCAN in its current form.”

One of the four authors was Aldert Vrij, a psychology professor at England’s University of Portsmouth who has published hundreds of articles or book chapters on verbal and nonverbal cues to deception. In 2008, he produced a textbook, “Detecting Lies and Deceit,” in which he devoted a chapter to SCAN. He hadn’t initially intended to, because research on SCAN was scarce and no researcher had strongly recommended its use, Vrij wrote in the book. But what changed his mind was a seminar he gave in the summer of 2006, attended by about 100 criminal investigators from countries that included the U.S., the United Kingdom, Canada and the Netherlands.

He asked the investigators which lie-detection tool they used the most. To Vrij’s surprise, the most frequent answer, “from investigators on both sides of the Atlantic,” was SCAN.

Vrij began putting SCAN to the test. In 2011, he co-authored an article in Law and Human Behavior on an experiment he helped conduct. The participants included 61 students at Israel’s Bar-Ilan University, Sapir’s alma mater, split into three groups. One group’s members committed a mock theft of a statistics exam from a departmental mailbox, then provided written statements, lying about everything they did that day. The members of a second group likewise stole the exam, but lied only about their thievery, while telling the truth about everything else. The students in the third group were innocent. They committed no theft and, in written statements, told no lies.

Coders analyzed the statements using SCAN criteria. Their results failed to discriminate between the three groups. “In sum, no support for the use of SCAN was found in the experiment,” the authors concluded.

Vrij has faulted SCAN for lack of standardization: Different evaluators seize upon different criteria. When asked about this by a ProPublica reporter, Vrij responded by email: “Yes, this is very dangerous because the outcome does not depend so much on the tool but on the person who is using it. Different users therefore can come to different conclusions when assessing the same case.”

 

One of Vrij’s co-authors on the Frontiers in Psychology article was Glynis Bogaard, now a psychology professor at Maastricht University in the Netherlands. Bogaard, who researched SCAN for her Ph.D. project, attended a training workshop led by Sapir in Ghent, Belgium, in 2010.

During the training, Bogaard said, Sapir never offered any scientific support for SCAN. When she asked for data, Sapir claimed he had it but wasn’t going to publish it. Bogaard recalled one of Sapir’s claims: “He said when people talk about closing and opening doors — this can be your home or car door or whatever — this means you’ve been sexually abused when you were younger. I couldn’t believe my ears.”

The workshop’s other attendees were Dutch and Belgian police officers, who, to Bogaard’s dismay, were much less skeptical. During lunch, she talked to them in Dutch: “Sapir doesn’t speak Dutch, so he couldn’t follow.”

She asked them: “Do you believe all this, do you want to use this?”

“And most of them actually believed what he said. Most of them said, ‘Well, this looks very promising.’”

Bogaard said police who believe in SCAN don’t put much stock in the research challenging its effectiveness, because those studies are performed in a controlled environment; asking someone to make up a story in a lab is one thing, they say, but taking a statement from a suspect in a real crime is another.

Steven Drizin, a Northwestern University law professor who specializes in wrongful convictions, said SCAN and assorted other lie-detection tools suffer from “over-claim syndrome” — big claims made without scientific grounding. Asked why police would trust such tools, Drizin said: “A lot has to do with hubris — a belief on the part of police officers that they can tell when someone is lying to them with a high degree of accuracy. These tools play into that belief and confirm that belief.”

In 1997, a trial in North Carolina offered a rare example of SCAN making its way into court. A social worker trained in SCAN analyzed a questionnaire filled out by the defendant, a foster mother charged with child abuse. The defendant’s reference to trivial things — for example, “I brushed my teeth” — signaled deception, the social worker testified, according to a story in the Raleigh News & Observer. And the defendant’s reference to “the baby” — instead of “my” baby or using the toddler’s name — indicated child abuse, the social worker testified.

“At times, [the social worker’s] testimony prompted courtroom spectators to roll their eyes,” the news story said.

The foster mother was acquitted.


On its website, LSI lists law enforcement agencies from 49 states, plus the District of Columbia, that have received training in SCAN.

Despite the LSI website’s outdated design and dusty references (cassette tapes? fax machines?), SCAN’s appeal isn’t confined to some bygone era.

Through public-records requests, ProPublica obtained documents from 40 state and local agencies that have purchased SCAN training, most within the last 10 years. In 2014, the Borough of Madison, New Jersey, spent $2,500 on training for two police lieutenants, two detectives and a sergeant. In 2017, the Franklin County Sheriff’s Office in Ohio paid $999.99 to get training for four detectives. That same year, Louisville police paid $5,000 to train 12 officers, including sergeants and detectives in sex crimes and homicides. In 2018, more than two dozen members of the Michigan State Police attended LSI’s basic or advanced workshop.

Over the years, the agencies that have spent public money to get SCAN training include the Maryland State Police; the Washington State Department of Fish and Wildlife; the Pennsylvania Office of the Attorney General; the prosecutor’s offices in New Jersey’s Middlesex, Morris and Union counties; and local police departments large (Los Angeles) and small (Apple Valley, Minnesota), according to records obtained by ProPublica.

Those 40 departments are just a sampling. ProPublica also submitted records requests to more than three dozen other federal, state and local agencies. Some requests are still pending — for example, the documents pertaining to a $132,500 purchase for SCAN training by the U.S. State Department in 2014. Some were denied because the reporter didn’t live where the records were requested (Tennessee, Alabama). And in one instance, ProPublica refused to pay when the Virginia State Police estimated that its charge for the records would be $35,007.09, which was $34,907.09 more than any other agency wanted.

On occasion, agencies that did provide records redacted attendees’ names. “It is better for the public good not to release the names of the particular people with this specialized skill,” wrote the Middlesex County Prosecutor’s Office in New Jersey. The Pennsylvania Attorney General’s Office also withheld names, saying it can’t identify individuals “performing an undercover or covert law enforcement activity.”

Records that were disclosed included sales brochures from LSI, featuring liberal use of italics, bold letters, multiple fonts and type sizes, all caps and exclamation points. One flyer is emblazoned at the top:

“ATTN: TRAINING OFFICERS AND ALL INVESTIGATORS

Turn every investigator into a ‘walking polygraph’!”

Those who completed the training received a gold-seal certificate, in letters black and blue, in English and Hebrew, with a quote from Deuteronomy: “Then shalt thou inquire, and make search, and ask diligently.”

ProPublica and the Tribune reached out to more than 20 agencies that have received SCAN training. Most declined interview requests or didn’t respond. One exception was Sgt. Mark Miller, who investigates homicides for the Maryland State Police. He received SCAN training last year from Sapir. “He’s a phenomenal teacher,” Miller said. The sergeant called SCAN a “pretty good tool,” which he now uses on occasion. “It’s like cooking,” Miller said. “You might use salt in one dish, in the next you might use salsa.”

 

Certificate of completion. Those who complete SCAN training receive a gold-seal certificate with a quote from Deuteronomy: “Then shalt thou inquire, and make search, and ask diligently.” (Courtesy of Washington State Department of Fish and Wildlife, in response to a public records request)

ProPublica obtained emails and other written communications of law enforcement administrators touting SCAN internally or to colleagues in other departments. An officer with the Franklin County Sheriff’s Office in Ohio wrote: “Most instructors in this field learned from Mr. Sapir and are 4th or 5th generation students. Mr. Sapir is considered the expert in this field.” A 2017 announcement from the Florham Park Police Detective Bureau in New Jersey said: “We have a couple Officers who took this course years ago and still use the teachings regularly. … This will help you know immediately when someone is being deceptive.”

But when it comes to enthusiasm for SCAN, it would be hard to top the Texas Rangers, one of the country’s most recognizable law enforcement agencies (big hat; western boots; badge above left shirt pocket; fanny packs prohibited). In 2017, Brian Burzynski, a company commander, endorsed SCAN in an email to his fellow majors. He had received the training in 1997 and 1998, attending the advanced course in Avinoam Sapir’s home in Phoenix. The training now being offered was “really, really cheap, especially since Avinoam is teaching the course,” Burzynski wrote, adding: “He is a true master at detecting deception, and his technique is way better than a polygraph.”

“I’ve never attended any course on interview and interrogation that was more useful, accurate, or productive than the LSI SCAN course,” Burzynski wrote.

In December 2017, the Rangers dispatched 26 people to get the training. The registration cost $4,999. Travel expenses added another $10,265.46.

This year, the Rangers sent two more waves. In July, seven Rangers received SCAN training in Austin. One week later, an additional 10 Rangers attended the course in Kingsville. The total cost, including registration and travel, came to about $10,000.

In September of this year, Burzynski was promoted to assistant chief, the Rangers’ No. 2 position.

The Rangers declined to be interviewed but issued a statement about their use of SCAN, saying in part: “The Rangers consider each investigative interviewing technique as a tool in their tool belt, and we believe that our investigations benefit from our Rangers having as many tools as possible at their disposal. Regardless of the techniques employed, the Texas Rangers must independently corroborate the veracity of statements and confessions.”

In Elkhart, Indiana, where Ricky Joyner was convicted of the 1992 murder of Sandra Hernandez, the last officer trained in SCAN retired a couple of years ago, Lt. Travis Snider said. Whether the department would use SCAN again, Snider said, “That’s hard to tell.”

 

Just south of Elkhart is Kosciusko County; there, Capt. Travis Marsh commands the sheriff’s department’s investigative and patrol divisions.

Marsh told a Tribune reporter that he has used SCAN for almost 20 years.

He received training in 2000 in Indianapolis, along with officers from the Greenwood and Syracuse police departments, the sheriff’s departments for Elkhart, Hamilton and Marion counties, and the Indiana State Police. Also getting the training was a man who worked for National City Bank.

Marsh said he has since performed SCAN analyses in dozens of cases, including robberies, arsons and sexual assaults. More often than not, he said, SCAN pointed him in the right direction.

Once, Marsh said, a sergeant tested him, giving him statements in a burglary case where a victim, suspect and witness were all women. Marsh said he concluded, correctly, that the suspect was a “disgruntled former lover” of the victim. “I had no idea what orientation they were,” Marsh said of the women. “The sergeant was quite surprised.”

Marsh has even applied SCAN at home. Years ago his wife left a note saying she and the kids were off doing one thing, whereas Marsh, analyzing her writing, could tell they had actually gone shopping. His wife has not left him another note in at least 15 years, Marsh said.

Marsh said he understands why some people might be skeptical of SCAN. But he believes in it, just as Steve Rezutko did in the early 1990s.

“You ask me how does SCAN work, I can’t tell you that,” Marsh said. “It really is, for lack of a better term, a faith-based system because you can’t see behind the curtain.”

 

Christian Sheckler covers criminal justice for the South Bend Tribune. Email him at csheckler@sbtinfo.com and follow him on Twitter at @jcsheckler.

Katie Zavadski, Alex Mierjeski and Doris Burke contributed to this report.
