It was in May 2022 that I first deactivated my Twitter account. Not usually one to follow my gut when it comes to big decisions, this time I couldn’t help noticing how drained I felt after engaging on that platform. I was also getting an increasingly bad feeling about how the platform could, and likely would, change at the whim of its presumptive new owner.
At the time, it did feel like a big decision. I’d built up a following that wasn’t big by most people’s standards. But what it lacked in quantity it made up for in quality, rich with connections to other academics whose work I admired.
It also went against the grain for someone whose work focuses on teaching with technology and the impacts of technology on how we think, remember, and learn. A good chunk of my recent work has also focused on tech-related mythbusting, debunking all kinds of exaggerated claims about the ways in which technology supposedly erodes our mental capabilities. Whether it’s getting people to tone down rhetoric on taking notes with laptops, questioning the depth of generational divides around technology, or trying to put the brakes on full-fledged moral panic, I’m usually there to say that it’s not all that bad.
Social media, though, is a different kettle of fish. It’s always been a bit of an oddball in the landscape of online socializing and relationship-building, providing a stage for negative personality traits to play out and contributing to anxiety, depression, and sleep deprivation among certain demographics (most notably teenagers). And within the landscape of social media platforms, Twitter is odder still.
People use Twitter for different reasons than they use other platforms, and its users may even have different sorts of personalities, tilting more toward information seekers and those pursuing specific interests than users of other platforms do. More research these days tells us that we shouldn’t lump all social media together when weighing its good and bad points, especially now that newer forms (TikTok, BeReal) are giving the established giants a run for their money.
So it makes sense to apply a fresh critical lens every time we make decisions about our own social media use, asking ourselves what each platform’s particular features and affordances are, and how even subtle differences among them can lead to vastly different experiences.
It’s even more important when we’re sending other people into digital environments, such as our students. I’ve been intrigued by ideas for using Twitter for class activities, and impressed when instructors manage to make those go well. But I’ve also cautioned faculty that it is a frequently hostile environment for women and people of color, enough so that I have never advocated for making Twitter a required part of a course. Yes, there are various safeguards and settings that could help buffer the risks. But those are far from perfect, and I know I could never forgive myself if a student’s ill-considered, awkwardly worded, or overly controversial tweet got them cancelled.
This is not just a concern born of the last few years of online mayhem. Consider the case from years back of a young woman targeted for massive online scorn and harassment after a Twitter joke landed badly. I’m not going to spell out her name here, but if you want to see how the whole business unfolded, Jon Ronson’s book So You’ve Been Publicly Shamed offers a devastating account of it. She really did lose everything in this wrenching (and arguably the first) case of Twitter cancellation, and Ronson’s conclusion is that her gender had a lot to do with it.
That risk has been there all along: that for some reason, a student’s class-related tweets would end up backfiring on them, on you, or both. But this risk is likely only going to become more of an issue as – as most experts are predicting – Twitter becomes even more chaotic and even less reflective in the months to come.
That sense of impending chaos was what finally pushed me over the edge last May. It wasn’t that I was harassed or feeling like I was about to be. It was the feeling that any comment I made, any post I tweeted, immediately became a chore, just another thing I had to check and babysit and manage in the midst of the daily barrage of tasks.
I was seeing a few other academic users cast the stay-or-go question as one of persistence, of staying in the fight for … truth, the right to use Twitter as we pleased, informed viewpoints about our shared profession? I was never sure what exactly we were supposed to stay and advocate for. But in that moment of exhaustion back in May, I became acutely aware of something I think we all know about social media, but that’s easy to forget: Used properly, it is a product that serves your needs as a consumer. When it doesn’t, you walk away. It’s not a cause that needs you to pour energy into it, it’s a commercial entity that needs to earn your business or die trying.
Courtney Heard (@[email protected]) put it best, posting: “I think the fundamental idea Elon is missing…is that we were all on Twitter because we got something of value out of it. When you strip that value & replace it with incel edge lords with the vocabulary of a Teletubby, there is no longer a reason for us to be there. It’s not that we’re not “tough” enough to endure, it’s that it no longer provides us with value. Logging in was always a consumer choice. You have to make us want to.”
To be honest, Twitter had also become a bad fit for my own vulnerabilities and not-great habit patterns. Within the platform, it is difficult-to-impossible to avoid breaking news, leading me into unplanned scrolling binges during work time. Interspersed with all this doomscrolling was the jolt of anxiety whenever beloved figures were trending for one random reason or another (fortunately, as far as I’m aware, Dolly Parton is currently okay).
And as fun (bordering on addictive) as it was to collect praise about new articles, books, and blog posts, tweeting them out came with the risk of getting roasted instead. I realized that this was generating a constant, low-level anticipatory anxiety that was enough to contribute to writer’s block. Fortunately, I never ended up at a total standstill, but we all know that writing is hard enough as it is. Why set myself up for feeling even more anxious about what the crowd would say next? Yes, anything I put out into the world is still fair game to be dissected online somewhere, but I won’t be there for it – which feels like a big relief.
What’s next after bailing on Twitter? To satisfy the need to stay in touch with others in my field, I’m falling back on an old alternative, LinkedIn. It’s all business, buttoned-up and boring as can be – which suits me just fine. It doesn’t hold out the promise of amassing thousands of reactions and comments, but it does get content out in front of other professionals in my field, people who will see it as relevant rather than as an opportunity to take pot shots at the ivory tower.
Soon I’ll also be tweaking the way I get blog posts and research out in front of my network, most likely starting with a free Substack newsletter. I’m also checking out the new platforms explicitly billing themselves as Twitter replacements, such as Mastodon and Post. Reddit is still chugging along after all these years, and while it’s not without problems, perhaps we will see a revival of it as a way to connect with larger and more fluid audiences. I think there’s also going to be a future in bespoke, niche online spaces, such as Gather communities.
If any of those catch on, they might also fit the bill for the other thing I really liked Twitter for: engaging at conferences. I’ve found that this back channel enhances the conference experience in a way that nothing else ever has, offering opportunities to share in-the-moment reactions with others who are right there in the room, and to praise speakers publicly when a talk stands out. In theory, there are ways to achieve the same outcomes via the chat and social room functions that come standard with most conference applications I’ve seen. So far, though, these never seem to take off with the same momentum that’s sparked by a Twitter hashtag.
As for using Twitter in classes, I think there’s no ready substitute here either. The reasons come down to the unique features Twitter brings to the table. There’s the extreme concision, which pushes writers to condense and edit their thoughts. Compare this to the typical post-once-reply-twice discussion board assignment, where minimum word counts create perverse incentives to write more and say less. Twitter is public enough to encourage students to think through what they say, and active enough to ensure some kind of response in a short period of time – unlike the typical academic essay assignment, written for an audience of one who usually needs a week or more to answer.
Is it conceivable that some new platform or tool could provide similar features and affordances, while protecting students from the worst of Twitter’s dangers? I don’t know, but I think that now is an excellent time for educational technology developers to start work on alternatives.
Rest assured that for now, if you’ve still got your Twitter handle, I won’t hold it against you. And I sincerely hope that your experiences continue to be good ones. But unless something radically changes, I’m out. Out, but excited to explore what comes next.