
A Little Knowledge Is a Dangerous Thing

Many years ago, a microeconomics instructor of mine made what was at that time the unexceptional observation that more information is always better than less information, that “more information weakly dominates less information.” The “weakly” bit meant that, if the information was relevant to a choice, knowing it would improve one’s position. And if the information was not relevant to a choice, it could simply be ignored, with the choice remaining what it would have been without the added bit of information. Thus, more information never makes outcomes worse, and sometimes improves them. So more information is always to be preferred to less information.
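To make the instructor’s claim concrete, here is a toy decision problem sketched in Python. It is my own illustration, not an example from the lecture or from Sunstein’s book, and the payoffs, prior, and perfectly revealing signal are all assumptions made for the sake of the arithmetic: an agent who can observe the state before choosing can never do worse, in expectation, than one who must commit under the prior.

```python
# Toy illustration (my own, with assumed numbers): free, relevant information
# weakly dominates no information in a two-state, two-action decision problem.

prior = {"rain": 0.3, "sun": 0.7}
payoff = {  # payoff[action][state]
    "umbrella":    {"rain": 1.0,  "sun": 0.4},
    "no_umbrella": {"rain": -1.0, "sun": 1.0},
}

def expected_payoff(action, beliefs):
    return sum(p * payoff[action][state] for state, p in beliefs.items())

# Uninformed: commit to the single best action under the prior.
value_without = max(expected_payoff(a, prior) for a in payoff)

# Informed by a free, perfectly revealing signal: choose the best action
# state by state, then average over the prior.
value_with = sum(p * max(payoff[a][s] for a in payoff) for s, p in prior.items())

assert value_with >= value_without  # never worse: weak dominance
print(value_without, value_with)    # 0.58 versus 1.0 in this example
```

The inequality in the last lines is the “weak dominance” the instructor had in mind; if the information were irrelevant to the choice, the two values would simply coincide.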

Yet the claim’s generality did not seem quite right. There was, first, the issue of whether the value of the learned information was greater than the cost of acquiring the information in the first place. The possibility of “rational ignorance” is well known and is not problematic. Our everyday lives provide any number of examples in which we choose to remain ignorant of something because the cost of acquiring information is greater than the expected benefit of getting the information. Yet even if information were free, it didn’t seem as though more information would always weakly dominate less information. The best I could do on the spur of the moment was this counterexample: a six-year-old who learns the information “Santa Claus does not exist” is not left weakly better off than he or she was before learning it. The professor smirked and continued on to another topic.
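Rational ignorance itself reduces to a one-line comparison. The sketch below is my own gloss with made-up numbers, not Sunstein’s formulation: acquire the information only when its expected value exceeds the cost of getting it.

```python
# My illustration of "rational ignorance" (the numbers are assumptions):
# stay uninformed whenever learning costs more than it is expected to be worth.

expected_value_of_information = 0.42  # expected gain from deciding informed
acquisition_cost = 1.00               # time, money, and attention spent learning

if expected_value_of_information > acquisition_cost:
    print("acquire the information")
else:
    print("remain rationally ignorant")
```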

The 2019 film The Farewell presents a more substantive, if contestable, counterexample. A Chinese family, with siblings and grandchildren scattered throughout the United States and Japan, learns that the family matriarch in China has lung cancer and only a few months to live. The family chooses not to tell the grandmother of her diagnosis, choosing instead, as one of her sons puts it, to “bear her burden for her.” The Americanized granddaughter resists the sentiment, arguing that it is not right to keep her grandmother in the dark about something so important, although she grumpily complies with the family’s corporate decision. The film presents several scenes in which family members discuss their contrasting cultural attitudes, pitting informational disclosure as a means of promoting a person’s agency or autonomy against corporate and paternalistic attitudes toward information revelation.

Cass Sunstein presents a surprisingly skeptical view of “more-information-is-always-better” claims in his new book, Too Much Information. Pertinent to the two cases above, for example, Sunstein, a law professor and head of the White House Office of Information and Regulatory Affairs for several years during the Obama administration, argues that the affective or emotional impact of information should be included as a fundamental consideration in crafting government disclosure-of-information policies. He writes that one of the “primary goals” of his analysis in the book “is to offer a plea for focusing intensely on the emotional effects of receiving information” in assessing whether disclosure in fact leaves people “better or worse” off.

More broadly, however, against the view that more information always weakly dominates less information, Sunstein discusses the many different ways in which disclosing more information does not necessarily leave people better off.

Inefficient Disclosure Policies

Sunstein targets government disclosure policies more than academic analysis. (Indeed, few economists today would make the unqualified statement of my micro professor.) In particular, Sunstein takes aim at basing government disclosure policies on a non-utilitarian “right to know.” He argues that the right-to-know rubric leads policymakers to adopt disclosure policies when disclosure is at best useless and at worst downright counterproductive.

He writes, “The primary question in this book is simple: When should government require companies, employers, hospitals, and others to disclose information?” His answer, he writes, is simple, although perhaps deceptively simple: government should require disclosure “when information would significantly improve people’s lives.” The surprise is that the book focuses mainly on the argument that judging when disclosure “improves people’s lives” can be so complicated that government policymakers often should not attempt it except under carefully identified conditions.

To be sure, Sunstein nods at many cases in which disclosure policies provide information that improves people’s choices. Yet the surprising focus of his book is on the many cases in which more information simply wastes government and private time and resources or actually worsens people’s lives.

Information can . . . improve people’s lives if it makes them happier. Unfortunately, some information does not improve people’s lives in any way. It does not improve their decisions, and it does not make them happier. Sometimes it is useless. Sometimes it makes them miserable. Sometimes it makes their decisions worse.

Examples abound in the book of how more information can sometimes worsen people’s lives. On the lighter side, Sunstein discusses a response to a policy he implemented when working for the White House that, among other things, would require nutritional posting for popcorn in movie theatres. He was taken aback when a friend sent him an email with the subject line, “Sunstein Ruins Popcorn.” More seriously, Sunstein discusses false inferences that consumers sometimes draw from disclosures, resulting in worse decisions. For example, from a disclosure that a food contains genetically modified organisms (GMOs), consumers infer that it is less safe than non-modified food. Yet the disclosure is meant for informational purposes only; it is not intended to communicate danger. Consumers may avoid purchasing a product they would otherwise prefer, perhaps purchasing a more expensive alternative, because they misperceive the purpose of the disclosure. The welfare losses from misperceiving the purpose of disclosure requirements can be huge in the aggregate. So, too, the costs imposed on people in the process of the government acquiring and then disclosing information sometimes far exceed the minimal benefits derived from disclosure.

At the level of policy-making, Sunstein discusses many disclosures that work. For example, warning labels on cigarettes. But he dwells on understanding the reasons in cases in which disclosure either didn’t seem to work or produced unintended consequences, such as those discussed above.

Sunstein’s Skepticism toward Disclosure Policies

Early in the book Sunstein writes that the volume’s purpose is to provide a framework “to clarify not only when mandatory disclosure is a good idea, but the form that mandatory disclosure should take.” Yet it turns out that he actually provides little leverage on those questions, aside from the broad point that more information does not always weakly dominate less information.

For example, after criticizing “willingness-to-pay” analysis, Sunstein punts on any policy framework for disclosures, arguing instead that government agencies need to do more research before imposing disclosure requirements:

In the future, it would be far better for agencies to make progress in answering difficult questions about the actual effects of information on people’s lives. Those effects might be strongly positive or strongly negative. The next generation of work on disclosure requirements—and regulatory benefits in general—should make it a priority to produce those answers.

I don’t disagree with Sunstein’s conclusion—and Sunstein should be commended for his principled ambivalence. Yet the conclusion that disclosure requirements might produce “strongly positive or strongly negative” effects on people’s lives does not provide the promised analytical leverage that would “clarify . . . when mandatory disclosure is a good idea.”


Similarly, at the end of the very next chapter, Sunstein also concludes his discussion with a call for additional study:

Further research is needed to gain a better understanding of when, why, and how disclosure requirements have intended or unintended consequences, as well as how policies can be improved. But one thing is clear: psychology changes everything.

Here, again, Sunstein’s conclusions counsel skepticism regarding disclosure requirements, except when drafted under very specific circumstances. No framework for policy here, either.

Sunstein further develops his skeptical theme in a chapter devoted to the costs of what he calls “sludge,” which is the administrative burden imposed by government requirements that people provide it with information. Sludge imposes “serious costs in terms of time, frustration, money, humiliation, and sometimes even health.” Sometimes the information is useful, but Sunstein is skeptical. He argues that the government often imposes informational requirements without considering whether the benefit of acquiring the information is worth the cost. He recommends that “in the future, it should be a high priority for deregulation and deregulators” to acquire information on whether the cost of “sludge” is worth the benefits. He again concludes skeptically, opining that “in many cases . . . acquisition of the relevant information will demonstrate that sludge is not worth the candle.”

Sunstein’s broad doubts and questions regarding mandatory disclosure policies are important in themselves. They are doubly notable, however, given his stature among the Democratic intellectual and governing elite. The book reads almost as though Sunstein started the book with one hypothesis in mind—that he would develop a framework that would help with developing sensible government disclosure policies going forward—but he instead became increasingly skeptical of his initial project as he worked through the research.

It merits stress that he does not provide a broadside against any and all disclosure requirements. Yet he pointedly opposes broadly aimed disclosure requirements. Instead, he would have them drafted on carefully identified, case-by-case bases.

Curious Textual Choices

Beyond his substantive points, Sunstein curiously passes over several pieces of technical vocabulary that would have helped his analysis.

First, unless I missed a passing use of it, the phrase “rational ignorance” never appears in a book in which the concept plays a central role. It’s an odd omission, not least because the term is well known today. More importantly, the label is so self-explanatory, even to non-academics, that its use would have provided a helpful means of organizing and framing one of the fundamental concepts Sunstein deploys in the book.

Similarly, there are well-known alternative specifications of people’s preferences that would seemingly have helped explain some of the behavior that puzzles Sunstein. For example, Sunstein devotes an entire chapter to a puzzle regarding people’s behavior toward Facebook: people report being happier when they quit Facebook, yet they stay on it; even those who do quit, and report being happier as a result, say they would like to keep using it.

The puzzle disappears, however, if we hypothesize that people were engaging in mini-maxing behavior. That is, people make decisions based on criteria with simple informational requirements, such as minimizing the maximum losses they face. Most people in cases like this are not attentive to finely changing probabilities or less-than-maximum losses. For Facebook, people are concerned that they will miss out on something big if they quit Facebook. Their mini-maxing fear induces them to continue on Facebook despite the non-maximum losses they incur by remaining on the social medium. The oddity of this omission is that at numerous other points of the book, Sunstein expressly invokes people’s use of “shortcuts” or “heuristics”—approaches that include mini-maxing—as explanations for otherwise puzzling behavior.
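For what it is worth, the heuristic is easy to state precisely. The following sketch is my own rendering of mini-maxing applied to the Facebook case, not a model from the book, and the loss figures are illustrative assumptions: ignore the probabilities and the smaller losses, and simply avoid the option whose worst outcome is worst.

```python
# My sketch of the mini-maxing heuristic (loss numbers are assumptions):
# pick the action whose worst-case loss is smallest.

losses = {  # losses[action][scenario]; higher numbers are worse
    "quit Facebook":    {"miss something big": 10.0, "ordinary week": 0.0},
    "stay on Facebook": {"miss something big": 0.0,  "ordinary week": 1.0},
}

def worst_case(action):
    return max(losses[action].values())

# No probabilities, no attention to the smaller everyday losses of staying:
# just pick the action whose worst outcome is least bad.
choice = min(losses, key=worst_case)
print(choice)  # "stay on Facebook": the fear of missing something big dominates
```

With these numbers, expected-loss reasoning would favor quitting so long as the chance of missing something big is sufficiently small; the worst-case rule favors staying regardless, which is exactly the behavior that puzzles Sunstein.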

Finally, Sunstein could have usefully resurrected the old distinction between “risk” and “uncertainty.” “Risk” applies to outcomes occurring with a generally known and well-behaved probability distribution (like a roulette wheel, or the weather), while “uncertainty” applies to outcomes in which the underlying probability distribution is not known or not well behaved (the proverbial “black swan” event). Again, Sunstein invokes the underlying concept, but could have usefully drawn on the language to help organize parts of his argument.
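The distinction can be put in a few lines. The sketch below is my own illustration with assumed numbers, not drawn from Sunstein: under risk the probabilities are trustworthy and expected value is a sensible guide, while under uncertainty no such distribution is available and a cautious rule such as worst-case reasoning tends to take over.

```python
# My illustration of "risk" versus "uncertainty" (numbers are assumptions).

outcomes = [-1.0, 0.0, 2.0]

# Risk: a known, well-behaved probability distribution over the outcomes,
# so expected value is a meaningful guide.
known_probs = [0.2, 0.5, 0.3]
expected_value = sum(p * x for p, x in zip(known_probs, outcomes))

# Uncertainty: no trustworthy distribution, so one falls back on something
# like the worst case (the proverbial black swan is what worries us).
worst_case = min(outcomes)

print(expected_value, worst_case)  # 0.4 under risk; plan around -1.0 under uncertainty
```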

These terminological quibbles aside, Sunstein’s book provides a host of reasons to approach disclosure requirements skeptically. That does not mean opposing such policies across the board. Nonetheless, Sunstein seemingly would limit the crafting of disclosure policies to carefully identified, empirically rich cases. This would seem to be an abundantly sensible approach to government disclosure policies.

Reader Discussion

Law & Liberty welcomes civil and lively discussion of its articles. Abusive comments will not be tolerated. We reserve the right to delete comments - or ban users - without notification or explanation.

l, on January 26, 2021 at 7:09 am

Cass worked for the least open administration in US history, so his position isn't surprising. Why put warning labels on unhealthy food products, or tell Americans you lied about FISA warrants? No need to disclose, correct?

El roam, on January 26, 2021 at 9:51 am

Excellent post. Here we have, in policy terms, a novel or under-theorized doctrine indeed. One must agree with it.

Endless and complicated parameters could be raised or implied here. Just one point worth noting:

It seems to me that one major concept has been missed here: the strategic implications or consequences of such a doctrine for the public mood and public trust.

The public considers truth and transparency to be merits per se. Hiding information, as a matter of deliberate public policy, may erode public trust in government, notwithstanding the discretion exercised case by case. Even the vague understanding that such a policy exists can cause that erosion.

Thanks

gabe, on January 26, 2021 at 11:43 am

Apparently, Governor Gavin Newsom read Sunstein's book. California's justification for NOT revealing actual ChiComm flu statistics is that the "people are unable to understand them."
In other words, the people are too stupid.

Paladin, on January 26, 2021 at 11:47 am

It would seem empirically obvious, indeed, "a truth universally acknowledged" in a nation steadily dumbing itself down while drowning itself in more and more and ever more information, that (as Rogers suggests and Cass' book would seem to imply) "it ain't necessarily so" that more information is better. To the contrary, it seems apparent that more information is often worse because it produces information overload, indifference to quality, a numbing of the capacity of discernment, and a dulling of the ability to distinguish good from bad information. This is true not just of mandatory labelling and disclosures and the like; it is true of all information that may be publicly distributed, including distribution of "news" through journalism and the distribution of information through education. The disclosure of (or the search for) information should be directed toward improving competence by fostering debate, which, in turn, promotes the search for more and better information so as to elevate competence, which in turn generates further debate.

BTW: this web blog site does little to serve the competence-raising/debate-fostering purpose of information disclosure because it devotes so much of its written effort to modest articles, like the one above, which are of no significance to competence-raising and debate-fostering, and it devotes so little of its effort (so few of its articles) to discussing matters of competence-raising/debate-fostering importance, for example, the raging political fires now engulfing our nation.

"I'm just sayin'"

N.D., on January 26, 2021 at 4:55 pm

“A Little Knowledge Is a Dangerous Thing”

Perhaps, but then again, oftentimes it is necessary to explore what differentiates places, persons, or things from one another, which can lead to that little bit of knowledge that is key to getting to the heart of the matter. For example, what differentiates SARS-CoV-1 and SARS-CoV-2 seems to be that SARS-CoV-2 "has an affinity to human receptors that is multiple times higher than that of SARS-CoV-1," due to the addition of a furin cleavage site in the spike protein.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7111780/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7457603/

R2L, on January 26, 2021 at 8:55 pm

Paladin, please do keep on "just sayin". Candor should always be welcome, even when "disagreeable" to the reader/listener.

In fact, Professor Rogers strikes me as writing more clearly and cleanly than many of the other essayists/reviewers here at L&L. He almost seems to recognize that about himself, when he writes above: "Our everyday lives provide any number of examples in which we choose to remain ignorant of something because the cost of acquiring information is greater than the expected benefit of getting the information." That is the task we face every time we approach an L&L essay, review, or podcast: is it going to be worth my time to engage with this information entity? About halfway through the essay today I was beginning to regret starting my reading, but he managed to keep me engaged to the end. He even got me over and past the "mini-maxing behavior" in a way that made sense, and supplied evidence that even an Obama supporter like Cass Sunstein might have a modicum of conservatively oriented sense. As Mr. Rogers suggests, Sunstein should have had sufficient insight and experience to push forward to a more detailed discussion of the merits and limits of specific kinds of information disclosure, both of the government-to-citizen variety and the expensive and often questionable "sludge" extraction from citizens/business owners to the government. Sunstein has probably never heard of Warren Meyer or his small-businessman blog at www.coyoteblog.com. But it is to his credit that he can call it "sludge," so he recognizes how little it can add to governmental knowledge compared to the cost of extracting, preparing, and delivering it.

But in regard to a six-year-old learning that Santa does not exist: that may disturb his equanimity at age 6, but it will toughen him up to understand at age 8, 9, 10, 11, 12, etc., that there are many myths and allegories presented as factual that do indeed need to be treated less literally [macroeconomics, systemic racism, "liberty and justice for all", "a nation of laws, not of men", "innocent until proven guilty", etc.]. But then, as a young parent, a grandparent, and a contributing member of society, he learns again that Santa does exist, if perhaps not in the guise originally presented.

A little knowledge is a dangerous thing, but no knowledge and wrong "knowledge" are still more dangerous. One half of the country believes the other half is dangerous because they possess said wrong knowledge.

Paladin, on January 27, 2021 at 2:49 pm

Thoughtful comment, properly appreciative of Rogers' considerable ability and of Sunstein's predictable limitations, as to which his self-demeaning service in the Obama Cause (I think of Tuesday, November 4, 2008, as Bastille Day in America, the official birthday of the Revolutionary Democrat Party) is res ipsa loquitur.

samuel, on January 27, 2021 at 12:34 pm

A little knowledge is dangerous; that's why no one should attend the public school system. It's just training for government tax slaves. If you can't afford private school or home school, you can't afford children.

Forbes, on January 29, 2021 at 1:46 pm

"At the level of policy-making, Sunstein discusses many disclosures that work. For example, warning labels on cigarettes."
If that's "labels that work," I'd hate to see examples of failure. Warning labels were introduced in 1966. Half a century on, 15% of those over age 15 smoke cigarettes. Social stigma, smoking prohibitions, and knowledge of health effects appear far more relevant to reductions in use than do warning labels.

