UCLA faculty voice: Low interest rates are bad for your brain
Dr. Peter Whybrow is director of the Semel Institute for Neuroscience and Human Behavior at UCLA and the author of “The Well-Tuned Brain: Neuroscience and the Life Well Lived.” This column appeared Sept. 18 on Zócalo Public Square.
In today’s interdependent, turbocharged world, we all feel the downstream shocks from China’s wobbling experiment in casino capitalism — and they are painfully familiar. A euphoric bull market, driving stock growth of 150 percent in one year, suddenly crashes, with shares losing one-third of their value. Despite government intervention, the slide continues and begins to spread. Panic ensues and markets tumble worldwide in a frantic sell-off.
Millions of novice investors in China, mesmerized by the promise of astonishing profits, had been lured into buying grossly overvalued stocks. Lack of investing experience, easy finance, hyped promotion, the mushrooming of day traders, “margin accounts” fostering stock purchases on borrowed money — it’s all chillingly reminiscent of America’s dot-com bubble of the late 1990s, or the sub-prime mortgage madness that preceded the worldwide financial crash of 2008.
Market booms and busts are not new, as history tells us. Holland’s tulip mania in the 1630s; the South Sea Bubble a century later; the financial turmoil that gripped the nascent United States in the 1790s (ensuring Alexander Hamilton’s place on the $10 bill); plus the speculative frenzies of the late 1800s and the Gilded Age, are topped in their drama only by the freewheeling flapper years of the 1920s that ended in the disastrous crash of ’29 and the Great Depression. Bubbles, it seems, are a feature of financial markets and increasingly so in contemporary times, despite today’s technical wizardry. As Mark Twain observed, while history may not repeat itself, it surely does rhyme.
So why do we find learning from history so difficult?
As a neuroscientist interested in the behavior of capital markets, I believe insights into the persistent cycles of boom and bust are to be found in how the brain balances risk against reward and how our current financial preoccupation with short-term profit has “retuned” that process, distorting the choices we make.
In the brain, as in the marketplace, the evaluation of risk is the foundation of prudent choice. Evolved over millions of years, the human brain is a hybrid. There is an ancient, pre-conscious core—developed in the interest of survival when life was nasty, brutish, and short — that makes us instinctively selfish creatures, emotionally focused on near-term reward, and driven by habit. The more recently developed frontal cortex, which sits above the eye sockets and is sometimes called the “executive” brain, blends this emotional striving with kaleidoscopic streams of stimuli that monitor immediate experience. In complement, a conscious, deliberative “thinking” cycle, drawing heavily on memory and predominantly concerned with behavioral control and avoidance of harm, runs through the lateral, outer region of the frontal cortex. It is in the crosstalk of these parallel cycles of perception and action—seeking balance between the fear of pain or loss and the expectation of pleasure or profit—that choices are made.
Over the last three decades the enticements of affluence, hand-in-glove with reduced financial regulation, have distorted this crosstalk and lured the ancient emotional self into the bullish pursuit of profit, overriding objective control and disrupting the brain’s “internal market.”
Risk assessment in the brain is a process not dissimilar to the barter that sets prices in real-world markets, balancing future reward against potential hazard. However, in the consumer society, which now represents some 70 percent of America’s economic activity and which in the interest of continued growth promotes the joys and benefits of easy credit, the temptations of immediate gratification can rapidly overshadow the brain’s reasoned assessment of downside risk.
Credit makes it possible to have in hand today what otherwise we would have to postpone. Of course, the debt that is incurred mortgages future earnings, but the ancient emotional brain has a hard time evaluating long-term costs, especially when easily available financing offers a false sense of security. In neurobehavioral terms, easy credit builds dangerous intuitive habits that can hijack the brain’s perception-action cycle: The more we enjoy a painless, immediate reward, the more we want to repeat it. In consequence, reason — to paraphrase David Hume — becomes enslaved to passion. We learn to be thoughtless.
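The gap between a painless, immediate reward and its long-term cost can be made concrete with a little compound-interest arithmetic. The sketch below uses purely hypothetical numbers (a $3,000 balance at 18 percent APR, paid down at a fixed $100 a month; none of these figures come from the column):

```python
# Hypothetical illustration: how a modest debt mortgages future earnings.
# All numbers here are assumptions chosen for the example.
balance = 3000.0          # starting credit-card balance, dollars
monthly_rate = 0.18 / 12  # 18% APR, compounded monthly
payment = 100.0           # fixed monthly payment

months = 0
total_paid = 0.0
while balance > 0:
    balance *= 1 + monthly_rate   # interest accrues first
    pay = min(payment, balance)   # final payment may be smaller
    balance -= pay
    total_paid += pay
    months += 1

print(f"months to repay: {months}")       # 41 months
print(f"total paid: ${total_paid:,.2f}")  # about $4,015
```

With these hypothetical terms, the $3,000 purchase takes 41 months to pay off and costs roughly $4,015 in all — about a thousand dollars of future earnings surrendered to interest, a cost the “ancient emotional brain” rarely weighs at the moment of purchase.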
The run-up to America’s 2008 financial crisis offers an illustration. Faced with a global savings glut and low interest rates in the closing decade of the 20th century, pension funds and other endowments were demanding safe, higher-yield investments. Similarly, the big international banks were exploring innovative ways of increasing profit and managing risk by bundling loans into credit-financed portfolios, thus reducing the cash reserves required to cover potential defaults. By the late 1990s, as the excitement of the dot-com boom declined and banking laws were progressively liberalized, the concept of creating such “credit derivatives” was accepted by federal regulators.
At first, reason prevailed among the bankers, and the portfolios were constructed largely from the assets of companies with proven track records. As the opportunities for short-term profit were recognized, however, greed took hold. Soon home mortgages of dubious asset quality were being added to the mix. The securities so derived were complex and difficult to understand, with risk virtually impossible to assess accurately, even for the financiers promoting them. But because the income of the hustlers selling the mortgages was tied to the units sold, that detail was largely ignored. In consequence, the home-mortgage bubble rapidly inflated, creating unknown trillions in imaginary wealth. Once again, in a frenzied search for rapid return, the passions of the ancient brain had outstripped executive reasoning. Then, inevitably, as the bubble collapsed, true risk stared us in the face. Panic ensued as the markets froze, precipitating the Great Recession.
In rational moments, most of us agree that mortgaging the future to excessive debt is not prudent behavior. In the run-up to the 2008 financial crisis, however, homeowners, investors, the banks, and the government were essentially complicit in doing just that. Aided and abetted by the enticements of short-term profit, we had shifted in our thinking from credit as the healthy driver of economic growth to the steady accumulation of debt as basic to the economics of everyday life. Collectively, as individuals and as a nation, from the neuroscience perspective we had been hard at work retuning the brain’s internal market. Economists give such retuning the quaint name of “moral hazard.”
It hasn’t helped our reasoned decision-making that, in this electronic age, money as a tangible asset is fast becoming invisible. Friedrich Hayek, in “The Fatal Conceit,” warned against such abstraction: “The moment that barter is replaced by indirect exchange mediated by money, ready intelligibility ceases.” Money, reduced now to a string of numbers recording the pay we receive, the sums we borrow, and the stocks we buy, has become divorced from tangible economic reality. And yet the abstract concept of “money” is ever more central to everyday life. While it is possible to have too much to eat or too much to drink, in the abstract one can never have enough money.
In this world of short-term self-interest where risk is a secondary concern, it is perhaps not surprising that China allowed its debt to quadruple prior to the recent meltdown. But for all the wagging of tongues about the Chinese debacle, and our wringing of hands about the U.S. stock market’s gyrations in its wake, the evidence is that we have become addicted to debt as a way of life. A 2015 McKinsey report estimates that global debt has grown by $57 trillion since 2008. That is a rise of 17 percentage points relative to global GDP, and a yearly increase of 5.3 percent, not far from the annual growth rate of 7.3 percent that marked the “boom” years before the Great Recession.
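The growth figures above can be sanity-checked with back-of-envelope arithmetic. This sketch assumes the underlying numbers commonly cited from the McKinsey report (global debt of roughly $142 trillion at the end of 2007, reaching about $199 trillion by mid-2014, a span of about 6.5 years); treat them as approximations, not authoritative data:

```python
# Back-of-envelope check of the debt-growth figures cited above.
# Inputs are approximations commonly cited from the 2015 McKinsey report.
debt_2007 = 142.0  # global debt, trillions of dollars, end of 2007
debt_2014 = 199.0  # global debt, trillions of dollars, mid-2014
years = 6.5        # elapsed time between the two measurements

growth = debt_2014 - debt_2007                     # absolute growth
cagr = (debt_2014 / debt_2007) ** (1 / years) - 1  # compound annual rate

print(f"absolute growth: ${growth:.0f} trillion")  # $57 trillion
print(f"compound annual growth rate: {cagr:.1%}")  # 5.3%
```

The compound annual rate works out to about 5.3 percent, matching the yearly increase quoted in the column.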
For many, this ongoing saga calls into question the integrity of our financial and government institutions and those who lead them. And indeed they do bear responsibility, but it is a fool’s errand to expect governments to set boundaries that we increasingly avoid setting for ourselves. To learn from our mistakes we must accept ourselves for who we are. In truth, the ancient brain that serves us each day — evolved in scarcity, focused on the short term and habit driven — is poorly matched to the frenzied affluence of contemporary material culture.
If we are to change the future of our cultural institutions, we must first each take responsibility for changing ourselves. We must become mindful of the longer-term consequences of our actions, for the sake of personal health, the stability of our economy, and the quality of the environment that sustains us. In particular, we must ask: Is the debt-driven short-term market model, with its power to foster greed and erode the social contract, an adaptive strategy for the 21st century?