Is there a tension between creativity and accuracy?

On Twitter, I’ve been chatting with my friend Julia Galef about tensions between thinking creatively and thinking in a way that reduces error.

Of course, all other things being equal, I’m in favour of reducing error in our thinking!

However, all other things are not always equal.

In particular, I believe “there’s a tension, too, between behaviours which maximize accuracy & which maximize creativity… A lot of important truths come from v. irrational ppl.”

Julia has summarized some of her thinking in a blog post, where she disagrees, writing: “I totally agree that we need more experimentation with “crazy ideas”! I’m just skeptical that rationality is, on the margin, in tension with that goal.”

Before getting to Julia’s arguments, I want to flesh out the idea of a tension between maximizing creativity and maximizing accuracy.

Consider the following statement of Feynman’s, on the need to fool himself into believing that he had a creative edge in his work. He’s talking about his early ideas on how to develop a theory of electrons and light (which became, after many years, quantum electrodynamics). The statement is a little jarring to modern sensibilities, but please look past that to the idea he’s trying to convey:

I told myself [of his competitors]: “They’re on the wrong track: I’ve got the track!” Now, in the end, I had to give up those ideas and go over to their ideas of retarded action and so on – my original idea of electrons not acting on themselves disappeared, but because I had been working so hard I found something. So, as long as I can drive myself one way or the other, it’s okay. Even if it’s an illusion, it still makes me go, and this is the kind of thing that keeps me going through the depths.

It’s like the African savages who are going into battle – first they have to gather around and beat drums and jump up and down to build up their energy to fight. I feel the same way, building up my energy by talking to myself and telling myself, “They are trying to do it this way, I’m going to do it that way” and then I get excited and I can go back to work again.

Many of the most creative scientists I know are extremely determined people, willing to explore unusual positions for years. Sometimes, those positions are well grounded. And sometimes, even well after the fact, it’s obvious they were fooling themselves, but somehow their early errors helped them find their way to the truth. They were, to use the mathematician Goro Shimura’s phrase, “gifted with the special capability of making many mistakes, mostly in the right direction”.

An extreme example is the physicist Joseph Weber, who pioneered gravitational wave astronomy. The verdict of both his contemporaries and of history is that he was fooling himself: his systems simply didn’t work the way he thought. On the other hand, even though he fooled himself for decades, the principals on the (successful!) LIGO project have repeatedly acknowledged that his work was a major stimulus for them to work on finding gravitational waves. In retrospect, it’s difficult to be anything other than glad that Weber clung so tenaciously to his erroneous beliefs.

For me, what matters here is that: (a) much of Weber’s work was based on an unreasonable belief; and (b) on net, it helped speed up important discoveries.

Weber demonstrates my point in an extreme form. He was outright wrong, and remained so, and yet his erroneous example still served a useful purpose, helping inspire others to pursue ideas that eventually worked. In some sense, this is a collective (rather than individual) version of my point. More common is the case – like Feynman – of a person who may cling to mistaken beliefs for a long period, but ultimately uses that as a bridge to new discovery.

Turning to Julia’s post, she responds to my argument with: “In general, I think overconfidence stifles experimentation”, and argues that the great majority of people in society reject “crazy” ideas – say, seasteading – because they’re overconfident in conventional wisdom.

I agree that people often mistakenly reject unusual ideas because they’re overconfident in the conventional wisdom.

However, I don’t think it’s relevant to my argument. Being overconfident in beliefs that most people hold is not at all the same as being overconfident in beliefs that few people hold.

You may wonder whether the underlying cognitive mechanisms are the same: perhaps there’s some kind of broad disposition to overconfidence?

But if that were the case, you’d expect that someone overconfident in their own unusual ideas would also, in other areas, be overconfident in the conventional wisdom.

However, my anecdotal experience is that a colleague willing to pursue unusual ideas of their own is often particularly sympathetic to unusual ideas from other people in other areas. This suggests that being overconfident in your own crazy ideas isn’t likely to stifle other experimentation.

Julia also suggests several variants on the “strategy of temporarily suspending your disbelief and throwing yourself headlong into something for a while, allowing your emotional state to be as if you were 100% confident.”

In a sense, Feynman and Weber were practicing an extreme version of this strategy. I don’t know Weber’s work well, but it’s notable that in the details of Feynman’s work he was good at ferreting out error, and not fooling himself. He wasn’t always rigorous – mathematicians have, for instance, spent decades trying to make the path integral rigorous – but there was usually a strong core argument. Indeed, Feynman delivered a very stimulating speech on the value of careful thought in scientific work.

How can this careful approach to the details of argument be reconciled with his remarks about the need to fool yourself in creative work?

I never met Feynman, and can’t say how he reconciled the two points of view. But my own approach in creative work, and I believe many others also take this approach, is to carve out a sort of creative cocoon around nascent ideas.

Consider Apple designer Jony Ive’s remarks at a memorial after Steve Jobs’ death:

Steve used to say to me — and he used to say this a lot — “Hey Jony, here’s a dopey idea.”

And sometimes they were. Really dopey. Sometimes they were truly dreadful. But sometimes they took the air from the room and they left us both completely silent. Bold, crazy, magnificent ideas. Or quiet simple ones, which in their subtlety, their detail, they were utterly profound. And just as Steve loved ideas, and loved making stuff, he treated the process of creativity with a rare and a wonderful reverence. You see, I think he better than anyone understood that while ideas ultimately can be so powerful, they begin as fragile, barely formed thoughts, so easily missed, so easily compromised, so easily just squished.

To be creative, you need to recognize those barely formed thoughts, thoughts which are usually wrong and poorly formed in many ways, but which have some kernel of originality and importance and truth. And if they seem important enough to be worth pursuing, you construct a creative cocoon around them, a set of stories you tell yourself to protect the idea not just from others, but from your own self-doubts. The purpose of those stories isn’t to be an airtight defence. It’s to give you the confidence to nurture the idea, possibly for years, to find out if there’s something really there.

And so even someone who has extremely high standards for the final details of their work may have an important component of their thinking which relies on rather woolly arguments. And they may well need to cling to that cocoon. Perhaps other approaches are possible. But my own experience is that this is often the case.

Postscript

Julia finishes her post with:

One last point: Even if it turned out to be true that irrationality is necessary for innovators, that’s only a weak defense of your original claim, which was that I’m significantly overrating the value of rationality in general. Remember, “coming up with brilliant new ideas” is just one domain in which we could evaluate the potential value-add of increased rationality. There are lots of other domains to consider, such as designing policy, allocating philanthropic funds, military strategy, etc. We could certainly talk about those separately; for now, I’m just noting that you made this original claim about the dubious value of rationality in general, but then your argument focused on this one particular domain, innovation.

To clarify, I didn’t intend my claim to be about rationality in general: the tension I see is specifically between creativity and accuracy.

That said, this tension does leak into other areas.

If you’re a funder, say, trying to determine what to fund in AI research, you go and talk to AI experts. And many of those people are likely to have cultivated their own creative cocoons, which will inform their remarks. How a funder should deal with that is a separate essay. My point here is simply that this process of creative cocooning isn’t easily untangled from things like evaluation of work.

Comments

  1. I think that this is why we’re often more creative in collaboration than when we work on our own. When we collaborate, we can role-play so that one person is doing the “creative daydreaming” and the other is doing the rigorous cross-checks. When we work on our own, we have to do two opposing things at the same time.

  2. ^ absolutely agreed with Izabella. Also, when we’re working on our own, we operate in a familiar territory of our own assumptions and perspectives. Sometimes just explaining something to somebody else, or watching somebody else do something we were already thinking, etc – triggers all sorts of new thoughts and conceptual collisions.

  3. I think the argument got caught up in a slightly deceptive frame early on. “Rational vs. irrational” makes it seem like the conflict is over how epistemically rigorous to be overall, whereas I think the conflict is properly framed as being over the strictness of individual steps. You allude to this, but I think it’s worth outlining explicitly:

    Consider traversal of a knowledge graph with ideas at vertices and inferences as edges. You can imagine people following different rules for exploring the graph, and experiencing conflict when they encounter someone who claims their traversal strategy is closer to a global optimum. In this frame, some people are more concerned that if the rigor of an individual edge is too low, you could wind up somewhere crazy. Other people think it’s fine to wind up in weird places, because you can always try to confirm later that there is a valid path to that location, if it is worthwhile to do so. These two groups have likely had very different experiences that told them how dangerous/valuable these sorts of actions are (along with innate differences in risk preference). A rough sketch of the two strategies is below.
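    A minimal, illustrative sketch of this frame, with the graph, the rigor scores, and the thresholds all invented for the example: two explorers differ only in how well-justified an inference must be before they will follow it, and the stricter one never even reaches a vertex that the looser one can later try to justify.

        # Toy model: an idea graph where each edge carries a "rigor" score in [0, 1],
        # standing for how well-justified the inference is. An explorer only follows
        # edges whose rigor clears their personal threshold.

        def explore(graph, start, min_rigor, max_steps=100):
            """Walk the idea graph, following only edges with rigor >= min_rigor."""
            seen, stack = {start}, [start]
            while stack and max_steps > 0:
                idea = stack.pop()
                max_steps -= 1
                for next_idea, rigor in graph.get(idea, []):
                    if rigor >= min_rigor and next_idea not in seen:
                        seen.add(next_idea)
                        stack.append(next_idea)
            return seen

        # A shaky hunch (rigor 0.2) links the conventional view to a "crazy" idea;
        # a later, more careful argument (rigor 0.9) links onward from there.
        toy = {
            "conventional view": [("crazy idea", 0.2)],
            "crazy idea": [("useful result", 0.9)],
        }

        print(explore(toy, "conventional view", min_rigor=0.8))  # strict: never leaves the start
        print(explore(toy, "conventional view", min_rigor=0.1))  # loose: reaches the useful result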

  4. Creativity is closely related to some forms of schizophrenia, a conspiracy theory of ideas or concepts. A conspiracy theory usually collapses under rigorous tests. It can then be defended by adding detail, or given up.

    We live in a gilded age: every thought is imagined to have already been thought, every idea already invented. Suggesting otherwise is viewed as being a dreamer, a fool, or delusional.

    Not being able to argue instantly, with refined logic, for every point is then concluded to be a victory over a new idea. Add to that that creative people are usually not the convincing salesman type of person, and you have the basis for the scientific slowdown we are currently in.

  5. “Modern sensibilities” is a rather strange way to put it; such sensibilities are “modern” only in the West. (And: modern, but still not internalized, apparently – as opposed to similar sensibilities about, say, slurs against Jewish people.)

  6. Lateral thinking, and thus creative thinking, correlates with loose associations, and thus with inaccuracy. Creativity is somehow a result of errors, just like metaphors. You won’t understand a metaphor by looking at the words’ dictionary entries. Metaphors are contextually counter-determined. Ever thought about a deaf mouth? No? Because there is nothing deaf in a mouth’s (common) scope of meaning. But now you try to correlate them. If you succeed, it’s like mental short-circuiting. You can’t come up with good metaphors by logically deducing them. They just happen. Creativity happens the same way. You can’t force creativity, but you can maximize lateral thinking and thus lower your ideas’ resistance to shorting with totally unrelated ideas.

    But how are you supposed to reason logically if your mind is down the rabbit hole? The way our brain works, you can’t have both: creativity is irrational.

  7. I like to phrase it a little more provocatively: it’s about smelling the fragrance in one’s own BS.

    Humans have probably evolved to not waste cognitive resources. The simplest way to maximize accuracy (and not “waste” resources) is to never try anything risky. Just about every radically new idea (almost tautologically) has gone through a phase where it was criticized by conventional wisdom, and needed ardent defenders willing to stick it out while there were many hanging questions. In the face of the fact that most new ideas fail, it’s difficult to invest the effort to pursue something original unless one is “irrationally” invested in it.

    As a corollary, I conjecture that the need for irrational faith will be higher when trying to be original in more resource-intensive endeavors. On the flip side, the tension between creativity and correctness should dissolve when it’s cheaper to check whether something works. This must be part of why programming (especially interactively) helps unleash creativity — it is very easy to unload the emotional burden of checking for correctness! In the same spirit, I think Izabella makes a great point regarding collaboration helping creativity; that comment has certainly helped me better understand my inclinations.
