Lessons in leverage: How monetary policy could cause financial instability
Why raising interest rates might make banks riskier, not safer

The chief tool in the central bank’s arsenal is the interest rate. Traditionally, monetary policy uses interest rates to control inflation: raising rates helps curb inflation spikes. However, the global financial crisis taught us that controlling inflation alone is not enough to prevent a financial crash.
As a result, since 2008 there have been growing calls to use interest rates in a way that supports the stability of the financial system, in addition to controlling inflation. However, my research questions the effectiveness of this approach.
Understanding leverage
Bank leverage is the ratio of a bank’s assets to its equity: in effect, how much it borrows relative to how much of its own money it holds. After the 2008 crisis, central banks across the world broadly agreed that, given its role in financial crises, bank leverage should be a major focus of financial stability policy.
To understand leverage, think of buying a house. If you buy a house with a one per cent deposit, you will be more leveraged than if you bought the same house with a 10 per cent deposit – you’re borrowing more relative to the equity that you’re putting in. Intuitively, we understand that this is a more financially risky position to be in.
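The arithmetic behind this intuition can be sketched in a few lines of Python (the house price below is an illustrative figure, not taken from the research):

```python
def leverage(assets: float, equity: float) -> float:
    """Leverage = assets / equity: value controlled per unit of own funds."""
    return assets / equity

house_price = 300_000  # hypothetical house price, for illustration only

# A 10 per cent deposit: 30,000 of equity controls a 300,000 asset.
lev_10 = leverage(house_price, 0.10 * house_price)  # 10x leveraged

# A one per cent deposit: 3,000 of equity controls the same asset.
lev_1 = leverage(house_price, 0.01 * house_price)   # 100x leveraged

print(lev_10, lev_1)  # 10.0 100.0
```

The smaller the deposit, the larger the multiple, and the smaller the fall in the house’s value needed to wipe out the buyer’s equity entirely.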
Now, imagine a bank operating under similar conditions: the more it borrows compared to its equity, the riskier it becomes. Therefore, bringing bank leverage down should reduce banks’ riskiness and, in turn, reduce the chances of collapses that trigger a financial crash.
Here’s where interest rates come into play. According to theory, unexpected rate increases can reduce leverage by making borrowing more expensive. Just as higher rates might discourage you from taking out a big loan, banks might borrow less, reducing their leverage and making them less risky.
On the flip side, lower interest rates encourage borrowing and increase leverage, potentially raising risks. This is why theoretical models claim that central banks could make the banking system safer by raising rates.
The evidence
While it may seem intuitive that monetary policy can support financial stability by reducing leverage, the theory has not been empirically tested. My research fills this gap and reveals a surprising result: unexpected rate rises not only fail to reduce leverage, they actually increase it, thereby increasing risk.
Why does this happen? In a nutshell, banks are not households. When interest rates go up, banks, like individuals, borrow less. But, unlike individuals, banks also lend money out to others in the form of loans, credit cards and mortgages.
When interest rates rise unexpectedly, many borrowers – homeowners, businesses and individuals – struggle with higher loan payments, leading to defaults. These defaults shrink banks’ assets and equity at the same time. Instead of making banks safer, this chain reaction raises their leverage. I call this the “loan-loss mechanism”, and I show that it captures the overall response of banks to interest rate changes from as early as 1984.
For their leverage to decrease, banks would have to see their assets fall and/or their equity rise. But my results show that when interest rates rise, both come down. Conventional theory does not account for this loan-loss mechanism.
In fact, the latter effect – the fall in equity due to loan defaults – dominates: the losses are so large that the net effect is an increase in bank leverage. Going back to the house purchase example, this is like borrowing less on your mortgage while the value of your house drops – your loan-to-value ratio worsens even though you are borrowing less.
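A stylised balance-sheet calculation makes the mechanism concrete. The numbers here are hypothetical, chosen only to show how leverage can rise even as a bank borrows less:

```python
# Stylised bank balance sheet before an unexpected rate rise (hypothetical).
assets = 100.0   # loans and other assets
equity = 10.0    # assets minus liabilities
print(assets / equity)  # leverage before: 10.0

# After the rate rise: the bank cuts its borrowing, but loan defaults
# wipe value off the balance sheet. A loan loss hits assets and equity
# one-for-one; paying down borrowing shrinks assets but not equity.
loan_losses = 4.0    # defaulted loans written off
deleveraging = 2.0   # reduction in borrowing

assets -= loan_losses + deleveraging   # assets fall to 94.0
equity -= loan_losses                  # equity falls to 6.0

print(assets / equity)  # leverage after: ~15.7 -- higher, not lower
```

Because equity is the smaller number, the same loan loss moves it proportionally much more than it moves assets, so the ratio rises despite the reduced borrowing.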
The role of variable-rate loans
To explain why loan losses occur, it’s important to consider the role of variable-rate loans. When interest rates rise, banks face higher deposit costs because they must pay more interest to depositors. To hedge this interest rate risk, banks issue variable-rate loans whose repayments adjust upwards with rates, allowing them to earn more from borrowers when rates go up.
While this strategy works in theory, in reality it shifts the risk onto borrowers. Higher rates make loans more expensive, increasing the likelihood of defaults and posing a credit risk for the bank. In essence, banks are engaging in risk transformation: they trade one type of risk (interest rate risk) for another (credit risk). My research shows that it is indeed banks with the highest proportion of variable-rate loans that experience the most borrower defaults during rate hikes.
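A rough illustration of how that risk lands on borrowers: the standard amortising-loan payment formula shows what happens to a monthly repayment when a variable rate resets. The loan terms below are hypothetical:

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortising-loan payment: P * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of payments
    return principal * r / (1 - (1 + r) ** -n)

# A 200,000 variable-rate mortgage over 25 years (illustrative terms).
before = monthly_payment(200_000, 0.02, 25)  # at 2 per cent
after = monthly_payment(200_000, 0.05, 25)   # after a reset to 5 per cent

# The payment jumps by roughly 38 per cent overnight.
print(round(before), round(after))
```

A jump of that size is what pushes stretched borrowers into default, converting the bank’s interest rate risk into credit risk.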
Implications for policymakers
For central banks and regulators, these findings are significant. They suggest that trying to achieve both low inflation and financial stability using interest rate hikes alone might backfire. Unexpected rate increases can inadvertently raise bank leverage, making the system more fragile instead of safer.
The lesson here is clear: monetary policy must account for unintended consequences, such as the loan-loss mechanism, that can undermine financial stability. Central banks may need to rely on other tools, such as macroprudential regulation, to directly reduce risks in the banking sector.