|Studies in Time|
Notes on "Framing"
Shaping opinion and response through language.
With research on the "backfire effect" and the persistence of misconceptions.
26 April 2011. Modified: 27 January 2013. First published at Intraspec.ca; moved here 16 June 2013.
What is a "frame"?
Source: Textual silence and the discourse of homelessness
Thomas Huckin, Discourse & Society 2002;13(3):347-372. p.354.
A frame is a socially based, abstract, high-level knowledge structure that organizes certain information about the world into a coherent whole; it is 'a general, standardized, predefined structure (in the sense that it already belongs to the receiver's knowledge of the world) which allows re-cognition and guides perception' (Donati, 1992:141). Writers and speakers commonly frame public issues by mentioning certain relevant topics and subtopics while ignoring others. In so doing, they are in effect setting the context so as to invoke a certain context model, i.e. give the text representation a certain 'slant'. Donati, P. (1992) 'Political Discourse Analysis', in M. Diani and R. Eyerman (eds.) [...]
What is a "context model"?
Source: Cognitive Context Models and Discourse
Teun A. van Dijk, 'Cognitive Context Models and Discourse', in M. Stamenow (ed.), Language Structure, Discourse and the Access to Consciousness. Amsterdam: Benjamins. (1997:189-226) pp.192-194, passim.
During a conversation, a lecture, doctor-patient interaction, reading the newspaper or watching TV, participants of course also need to mentally monitor such encounters themselves, e.g., by planning, executing, controlling or indeed understanding them. It is here proposed that such ongoing, continuously updated episodic representations should be conceptualized as a special type of models, viz., context models. [...] [C]ontexts typically consist of at least the following major categories, possibly each with their own internal schematic structure, as if they were sub-models:
Our context models make us more or less susceptible to the framed message.
How does framing work?
The facts never speak for themselves, which is why scientists [...]
The Ethics of Framing
"Cough or sneeze in your sleeve" is an excellent example of a complex theme framed in simple terms that communicate information of real benefit to others. The origin of "cough in your sleeve" is difficult to pin down, but some remember it from childhood: "Rather than coughing into your hands and spreading germs by touching everything you come into contact with, you should cough into your sleeve instead. This method will keep your saliva to yourself, a trick most of us learned in kindergarten or from our multi-tasking mothers who didn't have time for consistent hand washing."
We launched an aggressive communication strategy to get the word out to the American people, primarily about vaccination, since this is the single safest and most effective way to protect public health, but also about "what you can do to keep flu from spreading: cough in your sleeve; keep surfaces clean; stay home when you're sick."
"Cough in your sleeve" proved an extremely successful framing of the facts of contagion. It is a call for specific action in the interest of self-protection and shared responsibility in a social matrix.
But framing can also be used in a manner most of us would regard as unethical, particularly when objective reportage is compromised by personal bias, which leads to the omission or distortion of "facts" in the frame. A good case in point is presented by Tucker Carlson, editor of The Daily Caller, in his recent exposés of email archives from Journolist, a now-defunct listserv comprising several hundred liberal journalists, like-minded professors, and activists. These archives suggest a concerted effort on the part of certain journalists to frame information in pursuit of their own political biases, rather than to convey the facts in an objective manner. ⇒ And of course, Carlson himself exhibits such traits; see, e.g., Erik Wemple's Washington Post piece, Dutch historian [Rutger Bregman] exposes Tucker Carlson's fraud (20.02.2019).
For Journolist founder Ezra Klein's take on it, see On Journolist, and Dave Weigel (25.06.10), in which he writes that, "insofar as the current version of Journolist has seen its archives become a weapon, and insofar as people's careers are now at stake, it has to die". Klein is a "26-year-old Washington Post blogger [...] who makes trenchant observations about health care and other complicated policy issues" and "could be seen as relatively inexperienced [...]", writes columnist Kathleen Parker. But while his postscript is an interesting exploration of personal motive and an attempt to place events in meaningful context, Klein seems to miss the key point. The fact is, we want to hold "reporters" to certain standards of conduct as professional framers of information in the public interest, and those standards preclude prejudicial or pejorative distortions or elisions of the facts with intent to manipulate public opinion. There is a difference, one hopes, between a reporter and an activist.
The Pew Research Center for the People & the Press released a survey report (12.09.09) entitled Press Accuracy Rating Hits Two Decade Low: Public Evaluations of the News Media: 1985-2009. Among the findings:
Just 29% of Americans say that news organizations generally get the facts straight, while 63% say that news stories are often inaccurate. In the initial survey in this series about the news media's performance in 1985, 55% said news stories were accurate while 34% said they were inaccurate. That percentage had fallen sharply by the late 1990s and has remained low over the last decade. Similarly, only about a quarter (26%) now say that news organizations are careful that their reporting is not politically biased, compared with 60% who say news organizations are politically biased. And the percentages saying that news organizations are independent of powerful people and organizations (20%) or are willing to admit their mistakes (21%) now also match all-time lows.
The Journolist controversy may be overblown in some respects (there is no foul in the personal exchange of opinions, for example), but in this case, more than the matter of media bias and mistaken facts, at issue is the intent to dissemble, disparage, and manipulate. In Getting the message on Journolist's controversial postings, Howard Kurtz (Washington Post, 23.07.10) writes that
Ezra Klein, who recently abolished the group, says members were "loose with their language" because they were having what amounted to an off-the-record bull session. "The Daily Caller has been rankly dishonest. . . . It's an attempt to rip quotes out of context and make it look like a conspiracy." Klein says there is no evidence that members collectively carried out the strategies being debated: "What would be disturbing is if people came to a conclusion together, and you looked the next day and it appeared in everyone's blog or everyone's column." None of this quite adds up to a Vast Left-Wing Conspiracy, and there is no reason to believe that some conservative commentators don't have similar discussions. But there is no escaping the fact that some of the list's liberal literati come off sounding like cagey political operatives.
What it illustrates, on all sides, is the influence of personal perspective on the identification and assessment of 'fact' and significance.
In Politics, Sometimes the Facts Don't Matter
Source: In Politics, Sometimes The Facts Don't Matter
New research suggests that misinformed people rarely change their minds when presented with the facts and often become even more attached to their beliefs. The finding raises questions about a key principle of a strong democracy: that a well-informed electorate is best.
Talk of the Nation, NPR (00:30:17)
How facts backfire:
[...] In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger. This bodes ill for a democracy, because most voters the people making decisions about how the country runs aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper. "The general idea is that it's absolutely threatening to admit you're wrong," says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon known as "backfire" is "a natural defense mechanism to avoid that cognitive dissonance." [...]
Source: When Corrections Fail: The Persistence of Political Misperceptions
PSYOP: Psychological Operations
Framing as a weapon...
The term "Psychological Operations" (PSYOP, PSY-OP) was superseded in 2010 by "Military Information Support Operations" (MISO).
Source: Psychological Operations Tactics, Techniques, and Procedures
PSYOP are planned operations that convey selected information and indicators to foreign target audiences (TAs) to influence their emotions, motives, objective reasoning, and ultimately, the behavior of foreign governments, organizations, groups, and individuals. The purpose of all PSYOP is to create in neutral, friendly, or hostile foreign groups the emotions, attitudes, or desired behavior that support the achievement of U.S. national objectives and the military mission. In doing so, PSYOP influence not only policy and decisions, but also the ability to govern, the ability to command, the will to fight, the will to obey, and the will to support. The combination of PSYOP products and actions create in the selected TAs a behavior that supports U.S. national policy objectives and the theater commander's intentions at the strategic, operational, and tactical levels.
PSYWAR: History of American propaganda for political and corporate control of Americans
Source: Psychological Operations Field Manual
Knowledge of propaganda techniques is necessary to improve one's own propaganda and to uncover enemy PSYOP stratagems. Techniques, however, are not substitutes for the procedures in PSYOP planning, development, or dissemination. Techniques may be categorized as: characteristics of the content (self-evident) [...]
The shift from a "needs" to a "desire" culture:
Framing and repetition...
The corporate tax cut debate
Framed in Canada (27.01.11)
[...] Given recent polling that indicates the majority of Canadians don't like the idea of corporate tax cuts, it could be a risky wedge issue on which to stake an election. But conservatives in Canada are careful students of framing. They understand what neuroscience is teaching us: repetition changes minds.
There are several fascinating experiments that prove this phenomenon. For instance, psychologist Ian Skurnik asked senior citizens to sit through a computer presentation of a series of health warnings that were randomly identified as either true or false: Aspirin destroys tooth enamel (true); Corn chips contain twice as much fat as potato chips (false). Quizzed a few days later, the seniors remembered the false statements as true; repetition had rewired their brains to believe falsehoods. Kimberlee Weaver of Virginia Tech did a study that showed if one person tells you that something is true, and tells you that over and over again, you are likely to conclude that the opinion is widely held. Norbert Schwarz from the University of Michigan helped show that even when the task is to educate the public with a myth-busting fact sheet, people walk away remembering the repeated myths as truth.
Astute politicians use repetition to build support for their side. Harper's Conservatives are nothing if not disciplined message bearers. Their carefully scripted frame, which key Ministers are diligently repeating, is simple: Corporations = job creators; corporate tax cuts = job creation. Finance Minister Jim Flaherty has used it: "If we want more jobs, higher wages, an improved standard of living for all of us, Canada needs to be an attractive place for job-creators to do business and invest." Government House Leader John Baird has used it: "We are reducing taxes for businesses because it creates jobs and it creates economic growth," said Baird. "Our tax rates for job creators is one of the measures sustaining our fragile economic recovery." And you can expect to hear a lot more of the Conservative frame. [...]
A key finding that has emerged in communications research over the years is that when propaganda fails, it's because audiences are active. They ask questions.
Communications professor Aaron Delwiche, Trinity University, San Antonio TX; propagandacritic.com.
See Military experts say psy-ops isn't brainwashing: psychological operations just convince the enemy to change behavior.
All things considered...
We sometimes hold to biased cognitions, inaccurate beliefs and interpretations, even when presented with evidence that those cognitions are falsely predicated, because we want to maintain our point of view (motivated reasoning).H We may defend against cognitive dissonance (an inner sense of disquiet or anxiety resulting from simultaneously held contradictory ideas) by ignoring or rationalizing the evidence in terms of our desired conclusion.A, F, I, K The interview, article, and research presented at left explore this phenomenon, as well as the "backfire effect" by which such cognitive bias is more strongly reinforced. Notes go to references appended here. The opposite side of the coin is the manner in which the factual evidence is framed, and by whom. When we are overloaded with information, dealing with contradictory frames presented by supposedly "expert" or "authoritative" sources, each competing for our attention, filters become all the more important in making sense of our world. As David Shenk writes (2003),
"[t]he psychological reaction to ... an overabundance of information and competing expert opinions is to simply avoid coming to conclusions. 'You can't choose any one study, any one voice, any one spokesperson or a point of view', explains psychologist Robert Cialdini. 'So what do you do? It turns out that the answer is, you don't do anything. You reserve judgment. You wait and see what the predominance of opinion evolves to be.'" But waiting for the emergence of predominant opinion may not be practicable; the psychoemotional need for an operative understanding may be immediate. Cognitive processes (strategies for constructing and evaluating belief)H engage the experiential information at hand, in the interests of intrapsychic coherence and stability, driven by unconscious variables.F
In objective terms the resulting solution may be more expedient than factual, but it nevertheless serves as an effective filter, a context model, in terms of which we can respond. Ideally, self-perceptionJ and self-affirmationE, F make one more open to evidence that runs contrary to a motivated stance but, given the postmodern fluidity of "fact" and the decenteredness of meaning in an age of spin and sound bites, is it any wonder that we formulate and defend self-serving belief, political or otherwise?
"The Scream", Edvard Munch
(dated 1893? 1910?)
Image Credit: AFP/Getty; Adapted.
Click to enlarge, read article.