In August, Twitter CEO Jack Dorsey was interviewed on the New York Times podcast The Daily, where he was asked explicitly what his company will do if President Donald Trump uses Twitter to declare himself the winner of the 2020 election before the results have been decided. Dorsey paused, then provided a vague answer about learning lessons from the confusion that occurred in 2000 with the Florida recount and working with “peers and civil society to really understand what’s going on.” It was 88 days before the election, and my heart sank.

For those of us who study misinformation and investigate online efforts to interfere with democratic processes around the world, this election feels like our Olympics. It can be hard to remember just how different attitudes around the threat of false and misleading information were back in November 2016, when, two days after the presidential election, Mark Zuckerberg famously claimed that it was “crazy” to suggest that fake news had affected the outcome. Now a misinformation field has emerged, with new journals inspiring cross-disciplinary research, millions of dollars in funding spent on nonprofits and start-ups, and new forms of regulation, from the European Union’s Code of Practice on Disinformation to U.S. legislation prohibiting so-called deepfakes.

Over the past four years, planning for the impact of misinformation on the 2020 election has taken the form of a dizzying number of conferences, research projects and initiatives that warned us about the effects of rumors, conspiracies and falsehoods on democracies. Recent months were supposed to be the home stretch. So when Dorsey failed to give a concrete answer to a question about a highly likely scenario, it felt like watching a teammate fall on their face when they should have been nailing the dismount.

Every platform, newsroom, election authority and civil society group should have a detailed response plan for a number of anticipated scenarios—because we have seen them play out before. The most common form of disinformation is that which sows doubt about the election process itself: flyers promoting the wrong election date, videos of ballot boxes that look like they have been tampered with, false claims about being able to vote online circulating on social media and in closed groups on WhatsApp. The low cost of creating and disseminating disinformation allows bad actors to test thousands of different ideas and concepts—they are just looking for one that could do real damage.

We have not grappled with the severity of the situation. Social media platforms seem to have only recently recognized that this election might not end neatly on November 3. Nonprofits whose employees are exhausted after months of COVID-related misinformation work are still scrambling for resources. The public has not been adequately trained to manage the onslaught of misinformation polluting their feeds. Most newsrooms have not run through scenarios to practice how they will cover, say, bombshell leaks in the run-up to Election Day or after the election, when the outcome might be disputed. In the spring of 2017, France saw #MacronLeaks, the release of 20,000 e-mails connected to Emmanuel Macron’s campaign and financial history two days before the election. Because of a French law that prohibits media mentions of elections in the final 48 hours of the campaign, the impact was limited. The U.S. has no such protections.

The panic is palpable now. My e-mail inbox is full of requests from platforms to join belatedly assembled task forces and from start-ups wondering whether some technology could be quickly built to “move the needle” on election integrity. There are near-daily updates to platform policies, but these amendments are not comprehensive, lack transparency and have not been independently assessed.

Ultimately the rise of misinformation, polarization and emotion-filled content is our new reality, and the biggest threat we face in this moment is voter suppression. So rather than “muting” friends and family members when they post conspiracy theories on Facebook, start a conversation about the serious damage that rumors and falsehoods are doing to our lives, our health, our relationships and our communities. Do not focus on the veracity of what is being posted; use empathetic and inclusive language to ask how people are voting. No one should be shamed for sharing misinformation because we are all susceptible to it—especially now, when our worlds have been turned upside down and many of us are operating in fight-or-flight mode. To avoid losing ourselves in the noise, we have to help one another adapt.