The Risk Paradox: Why We Fear Sharks But Text While Driving


Introduction
Welcome back to PsyberSpace! I’m your host, Leslie Poston, and today we’re taking a look at why our brains struggle to assess risk. Risk assessment is one of the most critical cognitive skills we rely on daily. From deciding whether to change jobs to determining if it's safe to cross the street, we're constantly making calculations about danger and opportunity. But here's the catch—humans are terrible at it. We worry about things that are statistically unlikely, ignore dangers that are creeping up on us, and allow emotion, misinformation, and stress to completely distort our perception of what's risky and what's not.
We’re breaking down the science of why we make bad risk decisions—in love, money, society, and even global crises. We'll talk about the psychology behind our risk biases, how social media and politics warp our judgment, and why even our own brains may be sabotaging us after something like Covid. And don't worry—we won't just leave you with the problems. By the end of this episode, we'll talk about how you can get better at assessing risk and making smarter choices in your daily life.
The truth is, understanding risk isn't just about avoiding danger—it's about knowing which opportunities are worth taking. Whether you're wondering about a career change, a new relationship, or how worried you should be about the latest headline, this episode will help you navigate uncertainty with more clarity and confidence.
Section 1: Risk in Personal Relationships – Love, Dating, and Trust
Let's start with something deeply personal: relationship risk. Whether it's dating, friendships, or work relationships, people often make decisions that seem—at least in hindsight—completely irrational. Why do we stay in toxic relationships, take forever to leave bad jobs, or ghost potential partners instead of just saying, "Hey, I'm not interested"?
A lot of this comes down to attachment styles and how we perceive risk in relationships. If you're anxiously attached, you may fear rejection so much that you overcommit to a failing relationship. On the other hand, avoidantly attached individuals tend to view commitment itself as a risk and may bail too early. Research from adult attachment theory shows that these patterns form in childhood but continue to shape our risk calculations in adult relationships. Your brain is essentially running calculations based on your earliest experiences—deciding whether closeness represents safety or danger.
Then there's the sunk cost fallacy. We invest time, energy, and emotions into people, and even when it's clear a relationship isn't working, we convince ourselves to stay because we don't want to "lose" what we've already put in. This is why you hear people say things like "But we've been together for five years!" as if time invested is a reason to invest more, rather than cutting losses. Similarly, in dating, people fall into the gambler's fallacy—thinking that because they've had a streak of bad dates, the next one is due to be amazing. In reality, past experiences don't dictate future odds unless we change something about our behavior.
And what about work? Interestingly, we tend to tolerate toxic workplaces longer than toxic friendships. Why? Partly because we assess workplace risk differently—quitting a job comes with financial uncertainty (as well as healthcare uncertainty in the United States), while cutting off a bad friend doesn't. There's also the psychological pressure of workplace culture, where leaving is framed as failure rather than a healthy choice. Studies show that the average person stays in a job they hate for 18 months before quitting, while they might drop a toxic friend within weeks of recognizing the problem.
People often make opposite risk mistakes in different domains. The same person might stay in a terrible relationship for years but quit a good job over a minor frustration. Or they might ghost a promising dating prospect because of tiny "red flags" while putting up with genuinely abusive behavior from family members or bosses. These contradictions show that risk assessment isn't rational—it's deeply emotional and contextual.
Section 2: Risk in Personal & Corporate Finance – Why We Suck at Money Decisions
Money might seem like a rational space where people carefully weigh pros and cons. But nope—we're just as bad at financial risk assessment as we are in relationships.
One of the biggest psychological factors at play is loss aversion. Research shows that people feel the pain of losing $10 far more intensely than the joy of gaining $10. This leads to overly cautious investment decisions, staying in bad financial situations, or refusing to take small, calculated risks that could pay off in the long run. Behavioral economists have found that the emotional impact of a loss is about twice as powerful as the positive impact of an equivalent gain—which helps explain why people hold onto losing stocks too long (hoping to avoid that painful loss) while selling winners too early (to lock in that smaller pleasure of a gain).
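That roughly two-to-one asymmetry can be sketched as a tiny value function. This is a deliberate simplification of Kahneman and Tversky's prospect theory (which also includes diminishing sensitivity), and the loss-aversion coefficient of 2.0 is an illustrative round number, not a measured constant:

```python
def felt_value(outcome, loss_aversion=2.0):
    """Subjective 'felt' value of a dollar gain or loss.

    Losses are weighted more heavily than equivalent gains,
    matching the roughly 2:1 asymmetry described above.
    """
    if outcome >= 0:
        return outcome
    return loss_aversion * outcome  # a $10 loss feels like losing $20

# A $10 gain and a $10 loss don't cancel out emotionally:
net_feeling = felt_value(10) + felt_value(-10)
print(net_feeling)  # -10: the mixed outcome still feels like a net loss
```

That negative net feeling is why a coin flip offering "win $10 or lose $10" strikes most people as a bad bet, even though its objective expected value is zero.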
Then there's present bias, where people prefer immediate rewards over long-term gains. This is why credit card debt is so common—spending money today feels great, and the future consequences feel far away. Our brains are wired to prioritize immediate rewards over distant ones, a trait that served us well when survival meant focusing on immediate threats rather than long-term planning. The same happens in corporate decision-making, where CEOs sometimes take short-term risks for immediate stock gains, even if those decisions harm the company long-term. One study found that CEOs make significantly riskier decisions in the final year of their contract—essentially sacrificing the company's future stability for immediate performance metrics, a behavior that we can see replicated across the entire C-suite.
And let's talk overconfidence bias—the belief that you're the exception to the rules. This is what drives people to gamble on risky stocks or crypto, thinking they can "beat the market," when in reality, even most professionals struggle to outperform index funds. This overconfidence means people often take big swings in areas they don't understand while being irrationally cautious in areas where modest, consistent investment would yield better results. Look at how many people pour money into speculative investments while failing to contribute to their retirement accounts—the exact opposite of smart risk management.
Risk perception in business often rewards boldness while punishing measured caution. Consider how start-up culture glorifies the "move fast and break things" mentality, even though research shows that more sustainable businesses typically take a balanced approach to risk. This creates particularly challenging dynamics for employees, who often bear the consequences of leadership's risk-taking without sharing in the rewards. It's no coincidence that companies with the highest CEO-to-worker pay ratios also tend to have the most volatile business strategies.
Section 3: The Psychology of Risk Perception
One of the strangest things about human psychology is how bad we are not just at perceiving risk, but also at prioritizing risks. Why do we fear shark attacks more than heart disease? Why does terrorism feel scarier than climate change?
The availability heuristic explains a lot of this. Our brains assess risk based on what comes to mind most easily, which means we overestimate the danger of highly publicized but rare events (like plane crashes) while underestimating mundane but deadly risks (like high blood pressure). After a plane crash makes headlines, ticket sales drop—even though driving to the airport is statistically far more dangerous than flying. Our brains aren't wired to think in probabilities; they're wired to think in stories. And vivid, dramatic stories stick with us, distorting our perception of what's likely to happen.
We're also biased toward optimism and normalcy—believing bad things won't happen to us. It's why people put off buying insurance or assume they won't get scammed. Research shows that even when given exact probabilities of negative outcomes, people consistently believe their personal risk is lower than the average person's. This optimism bias can be healthy in moderation—it gives us the courage to take necessary risks—but it becomes dangerous when it leads to ignoring genuine threats. Meanwhile, we overestimate risks we can't control, like nuclear accidents, while downplaying dangers in our control, like texting while driving.
The illusion of control plays a significant role here too. People tend to feel safer when they're in charge, even when the statistics don't back this up. It's why people fear flying (where someone else is in control) more than driving (where they're at the wheel), despite driving being far deadlier. This illusion extends beyond physical risk to financial and social risks as well. Investors often believe they can time the market better than professionals. Gamblers believe their personal rituals influence random outcomes. And social media users believe they can control their digital footprint, despite overwhelming evidence to the contrary.
What makes risk assessment even more complicated is how our perception changes based on context. The same person who refuses to eat GMO foods because of perceived health risks might drive without a seatbelt or vape regularly. This compartmentalization shows that risk assessment isn't just about calculating probability—it's about how risks make us feel, who we trust to give us information, and what behaviors are normalized in our social groups.
Section 4: The News Media, Social Media, and Risk Distortion
Media plays a huge role in warping our risk perception. The news thrives on fear, highlighting rare but shocking events—like mass shootings or AI job losses—while underreporting slower, more systemic issues like healthcare failures or climate change.
The fundamental problem is that media outlets, whether traditional or social, have incentives that don't align with accurate risk assessment. Sensation sells. A single shark attack will dominate headlines for weeks, while the thousands of deaths from diabetes that occurred during the same period go unmentioned. This creates a deeply distorted view of what's actually dangerous. Research has found that people who consume more legacy news media tend to have less accurate perceptions of risk, overestimating the likelihood of murder, terrorism, and natural disasters while underestimating health risks like heart disease.
Then there's social media. Fear spreads fast, and platforms reward content that makes you anxious or angry. This leads to doomscrolling—a psychological trap where people keep consuming negative news, reinforcing anxiety and desensitization. Studies of social media engagement have found that false and emotionally charged content spreads several times faster than accurate, neutral news. This creates a media environment where even legitimate risks get blown out of proportion, making it harder to distinguish between minor concerns and genuine threats.
The microtargeting capabilities of today's media platforms make this even worse. Algorithms detect what makes you scared or angry and feed you more of it, creating personalized echo chambers of fear. Someone worried about climate change gets fed increasingly apocalyptic climate content. Someone anxious about crime gets bombarded with stories of violence, even if crime rates in their area are low. These personalized fear silos make collective, rational risk assessment almost impossible.
What's particularly troubling is how this distortion leads to risk fatigue. When everything is framed as a crisis, people eventually tune out or become numb to warnings. We've seen this play out with pandemic alerts, climate change warnings, and economic forecasts. By crying wolf too often (or making every wolf seem equally dangerous), media makes it harder for people to respond appropriately when genuine threats emerge.
The result? We panic over the wrong risks while ignoring slow-moving but deeply consequential dangers. We fear dramatic, acute risks while missing chronic, escalating problems until they reach crisis levels. And increasingly, we distrust the very sources that could help us make better risk assessments, creating a vacuum where conspiracy theories and unfounded fears flourish.
Section 5: Covid and the Erosion of Risk Literacy
We’re all a little tired of having to examine our reactions to Covid, but let’s take two or three minutes to talk about the risk assessment aspect of it. Covid broke people's ability to assess risk, and not just socially—it changed our brains through two pathways: stress and infection. Research shows that Covid infections can cause gray matter loss in areas responsible for decision-making and empathy. Long Covid sufferers report increased impulsivity, brain fog, and poor judgment.
The neurological impact is just beginning to be understood. Studies using MRI scans have found changes in the prefrontal cortex and the amygdala—brain regions critical for risk assessment and emotional regulation—in people who had Covid, even mild cases. This isn't just about subjective symptoms; it's measurable brain changes that affect how people process threat and safety information. One particularly concerning study found that these changes persisted for months after infection, potentially explaining why risk perception seems to have shifted permanently for many people.
Beyond the neurological changes, there was also the stress of the unknown and the psychological whiplash of mixed messaging. People went from overreacting (sanitizing groceries, hoarding supplies) to underreacting (ignoring airborne transmission, taking off masks too early, and dismissing long Covid risks). This inconsistency created a perfect storm for eroding trust in public health guidance and in scientific expertise more broadly.
The chronic stress of the pandemic itself rewired our collective risk assessment. Prolonged uncertainty and threat lead to what psychologists call "allostatic load"—essentially, your brain's threat detection system gets worn out from constant activation. The result? Some people become hypervigilant, seeing danger everywhere, while others swing to the opposite extreme, becoming numb to genuine risks. Neither response leads to accurate risk assessment.
What's interesting is how the pandemic highlighted existing disparities in risk perception and management. Communities with high trust in institutions generally followed risk-reduction measures more consistently. Those with historic reasons to distrust medical or government authorities often rejected guidance—sometimes to their detriment, but not without understandable historical context. These patterns reveal that risk assessment isn't just individual psychology; it's shaped by collective experience and social trust.
The pandemic also accelerated a troubling pattern we've seen with other public health measures: the rejection of expert guidance followed by gradual acceptance, followed by forgetting we ever resisted. Just as people initially rejected seatbelts, bicycle helmets, and smoking bans—only to later wonder why anyone would oppose such obvious safety measures—we may eventually look back at Covid precautions with similar retrospective clarity. The question is what damage will be done in the meantime, and what the next rejected risk guidance will be.
Section 6: Trump, Musk, and the Psychology of Political & Technological Chaos
Speaking of cultural forces, let's talk about disruption as a brand and its impact on risk assessment. For that we’ll turn briefly to politics in the United States. Both Trump and Musk have built their images on chaotic decision-making, normalizing risk-taking behaviors that are obviously reckless in other contexts.
For Musk, this means gambling on tech and finance experiments with other people’s money—crypto, AI, Twitter—while for Trump, it's about making political instability seem normal. The constant churn of scandal and controversy has made people numb to real threats, like the erosion of democracy or the spread of misinformation. Psychologists call this "crisis fatigue"—when the alarm bells ring constantly, people stop hearing them altogether.
What's particularly significant about these two figures is how they've utilized legacy media’s tendency to amplify anger and anxiety as well as tactics normally seen in religion or cults to gamify risk-taking. At this point, their supporters don't just tolerate chaos; they have begun to celebrate it as evidence of genius or authenticity. Every erratic tweet, every norm-breaking statement becomes proof of their willingness to "disrupt" the status quo. This reframing transforms what would rightly be seen as dangerous impulsivity into a supposed strategic advantage.
The psychological impact of this risk normalization extends far beyond politics or tech. Research shows that people's risk tolerance in one domain often bleeds into others. When political and business leaders model reckless behavior without immediate consequences, it shifts everyone's baseline for what seems reasonable. Studies have found correlations between periods of political volatility and increased risk-taking in personal finance and health behaviors.
We're also seeing this manifest in what some researchers are calling "tech bro risk culture"—an approach to risk that glorifies bold, reckless moves while dismissing careful planning as timid or boring. This mindset is particularly evident in financial behavior among younger people. The GameStop short squeeze, meme stock investing, and crypto gambling all reflect a growing comfort with extreme financial risk, often justified as rebellion against an unfair system.
What makes this concerning is the asymmetry of consequences. When Musk tweets something that tanks Tesla stock, or when Trump makes a statement that destabilizes an alliance, the most severe consequences rarely fall on them personally. Instead, regular people—investors, employees, citizens—bear the brunt of their risk-taking. This disconnect between who takes risks and who suffers consequences creates deeply distorted risk incentives throughout society.
Section 7: Can We Get Better at Assessing Risk?
So, how do we fix this? First, we need to train ourselves to think in probabilities, not emotions. Understanding base rates, using scenario planning, and asking "who benefits from my fear?" can make us better decision-makers.
Probabilistic thinking doesn't come naturally to most people, but it can be learned. Start by looking for the base rate—that’s the background probability of something happening regardless of your specific situation. If you're worried about a rare disease, check how common it actually is before getting caught in a spiral of health anxiety. If you're concerned about a new technology, look at the track record of similar innovations rather than fixating on worst-case scenarios. This simple habit can dramatically improve risk assessment.
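Base-rate reasoning is just Bayes' rule in action. Here's a minimal sketch with the classic rare-disease setup; the prevalence and test-accuracy numbers are invented for illustration, not taken from any real test:

```python
def posterior_probability(base_rate, true_positive_rate, false_positive_rate):
    """Bayes' rule: probability you actually have the condition,
    given a positive test result."""
    p_positive = (base_rate * true_positive_rate
                  + (1 - base_rate) * false_positive_rate)
    return base_rate * true_positive_rate / p_positive

# Hypothetical rare disease: 1 in 1,000 people have it.
# The test catches 99% of real cases but also flags 5% of healthy people.
p = posterior_probability(base_rate=0.001,
                          true_positive_rate=0.99,
                          false_positive_rate=0.05)
print(f"{p:.1%}")  # about 1.9%: a positive result is still probably a false alarm
```

The intuitive answer is "99%," but because the condition is so rare, the false positives from the healthy majority swamp the true positives. Checking the base rate first is what protects you from that spiral of health anxiety.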
Scenario planning is another powerful tool. Instead of thinking in binary terms (will this happen or not?), consider multiple possible outcomes with different probabilities. Professional risk managers use this approach to prepare for uncertainty without becoming paralyzed by it. You can apply the same technique to personal decisions: When considering a job change, for instance, map out several possible outcomes—from best case to worst case—and create plans for each. This converts vague anxiety into concrete contingencies.
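One way to make that mapping concrete is a simple expected-value table over scenarios. The job-change probabilities and salary figures below are made up purely to show the shape of the exercise:

```python
# Hypothetical job-change scenarios: (label, probability, salary change in $k/year)
scenarios = [
    ("new job thrives",          0.5,  20),
    ("new job is merely okay",   0.3,   5),
    ("new job fails, must move", 0.2, -30),
]

# Probabilities must cover all considered outcomes.
assert abs(sum(p for _, p, _ in scenarios) - 1.0) < 1e-9

expected_value = sum(p * payoff for _, p, payoff in scenarios)
worst_case = min(payoff for _, _, payoff in scenarios)

print(f"expected change: {expected_value:+.1f}k/year")  # +5.5k/year
print(f"worst case to plan for: {worst_case}k/year")    # -30k/year
```

The point isn't the exact numbers—you're guessing at the probabilities anyway. It's that writing out the scenarios forces you to name the worst case and plan for it, which is exactly what converts vague anxiety into concrete contingencies.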
Also, we need to clean up our information diet. Cutting out low-quality, fear-based media and learning to pause before reacting emotionally can rewire how we assess danger. This doesn't mean ignoring genuine risks, but rather being selective about sources and aware of how information is framed to manipulate emotions. Pay attention to whether a news source provides context and probabilities or just isolated incidents designed to provoke fear.
Collective risk literacy matters too. Research shows that people make better risk decisions when they discuss options with others who bring different perspectives. Create habits of checking your risk perceptions with trusted friends who think differently than you do. Are you overlooking something? Overreacting? The social aspect of risk assessment can provide valuable course corrections.
Finally, cultivate genuine expertise in areas where you regularly need to assess risk. Whether it's health, finance, relationships, or technology, deepen your knowledge beyond headlines and clickbait. The more familiar you are with a domain, the less susceptible you'll be to manipulation and fear-mongering. True experts aren't those who claim certainty about everything, but those who understand the probabilities and complex factors that influence outcomes.
Conclusion
Risk assessment is a skill—one we can improve. The more we understand why we misjudge risk, the better we can navigate love, money, society, and global crises.
What's powerful about better risk assessment is how it transforms decision-making across all domains of life. From healthier relationships to smarter financial choices, from more informed voting to more productive worry (focusing on risks you can actually mitigate), improving your risk literacy pays dividends everywhere.
The good news is that while our brains come with built-in biases that distort risk perception, we also have the capacity to recognize and compensate for these biases. Every time you catch yourself overreacting to a sensationalized news story, or notice you're ignoring a significant but unsexy risk like retirement planning, you're building this mental muscle.
In a world of increasing uncertainty, the ability to accurately assess risk isn't just a nice-to-have skill—it's becoming essential for wellbeing. Whether it's navigating a pandemic, making sense of economic shifts, or simply deciding which relationships are worth investing in, risk literacy enables better choices in a complex world.
So what's the biggest risk you ever misjudged? What might have happened if you'd had better tools for assessing that situation? Share your stories on our social media or the Patreon—I'd love to hear how this resonates with your experiences.
Thanks for tuning in to another episode of PsyberSpace! I’m your host Leslie Poston, signing off. Until next week, take smart risks—and stay curious.
