On March 5, 2020, as the COVID-19 pandemic was dominating news headlines, South Carolina Senator Lindsey Graham introduced the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act—known as the EARN IT Act—to Congress. The EARN IT Act intends to radically expand the surveillance of sexual speech online, calling for the formation of a nineteen-person commission to develop vaguely defined “best practices” that ISPs and content platforms like social media sites will be strongly incentivized to institute—facing hefty fines and potential criminal charges if they refuse.1 As the ACLU noted in their opposition letter to the Senate, the proposed commission developing these best practices will be constituted solely of Department of Justice officials, elected officials, and industry representatives, with no representation of LGBTQIA+ communities, sex workers, or other marginalized communities that will be impacted by the bill.2 In the ACLU’s words,
After SESTA/FOSTA, platforms censored a great deal of legal sex-related speech, disproportionately harming the LGBTQ community, and the speech of sex workers, generally, harming their ability to organize and engage online. The EARN IT Act will incentivize similar censorship efforts by platforms. Platforms will again ban and censor sex-related speech, especially if it relates to youth. These sex-related speech censorship regimes are particularly harmful to LGBTQ communities and to sex worker communities because their advocacy often discusses or relates to matters involving sex and sex education. Furthermore, censoring the online speech of the LGBTQ community also harms LGBTQ youth, who often first explore their identities by seeking information and building community online, before engaging with their identities offline, especially if their friends or family may not accept who they are.3
Beyond the further expansion of censorship of LGBTQIA+ communities, the EARN IT Act also threatens to force backdoors into the encryption protocols of digital communications technologies. Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory, described EARN IT as a “bait and switch” that attempts to mobilize people’s anger at “Big Tech” toward a long-standing governmental desire to ban strong encryption.4 The ACLU has identified strong encryption as essential not only to the political protests demanding racial justice in the United States but also to less visible efforts to organize the LGBTQIA+ community, institute HIV prevention, deliver public health resources to marginalized communities, and safeguard domestic violence victims.5 These impacts will reverberate internationally. As has been seen time and again, social media companies and ISPs that are either (1) headquartered in the United States or (2) dependent on doing business in the United States tend to implement rather uniform content moderation procedures across their entire platforms. The standards set here will impact people across the world. Further, as the ACLU’s Kate Ruane notes, offering a backdoor to encryption for the US government will make it difficult for these companies to resist similar requests from foreign governments, including those that actively criminalize or persecute the LGBTQIA+ community.6
Myriad institutions, including the ACLU, the Electronic Frontier Foundation, Human Rights Watch, the Wikimedia Foundation, and FreedomWorks, have submitted open letters to Congress outlining the dangers of the EARN IT Act, much as was done in the case of FOSTA-SESTA in 2018. Sex workers have also been organizing against the act, creatively adapting their tactics to social distancing measures. For instance, Veil Machine, a sex worker–led art collective, put on a twelve-hour virtual variety show/peepshow called “E-Viction” that was meant to highlight the ways in which sex workers and marginalized people are being evicted from digital spaces.7 These actions are eerily similar to those that unfolded prior to FOSTA—briefs and letters filed by free speech–oriented nonprofits and tech platforms and small-scale activism from feminist and LGBTQIA+ communities and sex workers. While these efforts are truly admirable, and I want to find hope in them, the signs that I have been piecing together in my research for this book point toward a trend in the opposite direction. The EARN IT Act is just another bulwark meant to further solidify and entrench the social conservative position. If it passes, it will be wielded with lethal force against marginalized communities. If it doesn’t, another will soon take its place, leveraging the rhetorical force of child sex trafficking to distract from its more malignant intentions.
The theorist Michel Foucault once argued that, contrary to the popular opinion that the Victorian era was sexually repressive, beneath the surface, people could not stop talking about sex. Today, we are experiencing the opposite. In an era popularly conceived to be sexually liberated and heterogeneous, with all forms and representations of sex readily available at the stroke of a key, beneath the surface, sex is being ignored by most and targeted for repression by a small but influential subset of the population. As we’ve seen throughout this book, there is a growing sentiment in online discourse that sexual expression needs to be combated. While this opinion may be held by a minority of internet users, it is given an atypical amount of power in shaping our online discourse and thus the future of the internet. As I’ve shown, this is largely due to two primary factors. First, the privileges this heteronormative, white, bourgeois minority enjoys—ranging from technical literacy to strong organizational structures, strategies, and tactics to the media coverage their taboo transgressions generate—allow them to exert a disproportionate amount of power on the internet. Second, they have formed strong alliances and become the unlikely bedfellows of evangelical Christians and anti-porn feminists, allowing them to form a multifaceted discourse that shifts emphases from scientific rationality to feminist critique to Christian conservative family values to violent misogyny based on what is convenient given the context in which and the audience for whom their message is disseminated.
The influence of this growing sentiment against sexual expression can be seen everywhere, from the coders developing digital tools and technologies to the underlying code for major internet platforms to the “human algorithms” that oversee content moderation online to the way the US government understands, legislates for, and regulates the internet. As we’ve seen, many coders hold misogynistic and anti-LGBTQIA+ sentiments, a problem exacerbated by the lack of diversity in the tech sector. Whether intentional or not, these biases get embedded into the structure of the algorithms they produce in the form of biased data inputs or biased parameters for machine learning. As they say, garbage in, garbage out. The result is algorithms that reinforce cultural biases and prejudices in a way that is largely opaque to the public and at a worrisome new scale. Once these systems are trained and embedded into our digital infrastructures, they are very costly to change. In line with the hacker ethic of continually patching bugs in a product like bailing water from a sinking ship, the most frequent solution is to suggest ad hoc readjustments and the addition of human review for edge cases. However, this will always lead to two problems. First, the human review process is produced by the same companies that built the biased system in the first place, and their normative viewpoints tend to inflect their protocols for human review just as much as their code for algorithmic review. Second, because of capitalist incentives to maximize profits—and, in fact, the legal obligation of publicly traded companies to do so—this content moderation labor will always be farmed out to contract laborers; in the case of sexual expression, it will be outsourced to undertrained and overworked Indian and Filipino laborers. Reviewers will have mere seconds to make determinations about the content being reviewed, and rather than reflecting the contextual and localized community standards in which the content was produced and circulated, these judgments will be made according to the most conservative global standards to protect platform brand integrity and advertiser revenues. This isn’t likely to change unless the public relations costs internet platforms incur when they have to apologize for overblocking LGBTQIA+ content come to exceed the cost of reconstructing their algorithms, retraining their moderators, and hiring more moderators who are better trained and given more time to review sexually expressive content.
Further, it is difficult to trust that these companies could achieve such a change even if they were well-intentioned. For example, while tech companies are at least paying lip service to feminism and LGBTQIA+ civil rights and, in some instances, installing people dedicated to progress on these issues in middle management positions, too frequently these measures are rendered moot by the coders who work in isolation from them and the top executives who flout them in an attempt to buy their high school fantasies of unlimited heterosexual and misogynistic access to female bodies. The limitations placed on these progressive midlevel employees were demonstrated all too clearly in Google’s firing of Timnit Gebru in December 2020. Even if there were well-intentioned tech executives, they would still be subject to US law and regulations, which, as we’ve seen particularly in the case of FOSTA, are increasingly oriented toward combating sexual expression on the internet, none more so than LGBTQIA+ and feminist sexual expression.
The impact of this curtailing of sexual expression is always disproportionately borne by those already structurally positioned for disempowerment and marginalization, most notably women and LGBTQIA+ communities, but also communities marginalized by race, nationality, and ability. This is most noticeable when we examine who bears the burden of “overblocking,” the phenomenon in which unintended pieces of content are blocked because content filters are designed to be overbroad—any “catch-all” filter will also catch a lot of nonpornographic content in the process. In the instance of art, we can clearly see that while canonical, Western (read: white, male, Eurocentric, colonial, and the like) art can trigger content filters, it is considered embarrassing when it does so. This art is indexed to prevent this from happening, and there are specific “carve-outs” in the content moderation review procedures meant to protect it from being censored. This is not the case for other forms of art, whether it is less thoroughly indexed art from decolonial communities or the amateur art produced by online communities—a particularly salient practice in online LGBTQIA+ communities.
For additional evidence of the undue burden borne by these communities, one need only look at which sex educational and nonpornographic sexually expressive content gets censored on the internet. As I’ve shown, it is inordinately LGBTQIA+ community resources, activist groups, and sex educators that are getting censored by overbroad content moderation algorithms and human reviewers. This occurs for a number of reasons, including that (1) their identities are at least partially tethered to sexual expression, and thus LGBTQIA+ discourse requires freedom of sexual expression to exist; (2) they don’t have the institutional support or financial resources to seek redress from internet platforms and ISPs when their content gets blocked as if it were pornography; and (3) the cultural pornographication of LGBTQIA+ identity is exacerbated by the frequency of LGBTQIA+ terms being used in descriptions of mainstream heteroporn—e.g., “bisexual girl in MFF threesome,” “lesbian dominatrix uses strap-on”—which floods algorithms with signals that words like “bisexual” or “lesbian” are dirty words. This was never about simply blocking hard-core pornography but about the pornographication of a large group of people’s everyday lives, identities, and forms of self-expression.
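To make that last mechanism concrete, consider a minimal sketch, in Python, of how a classifier trained on metadata skewed in this way comes to treat identity terms as pornographic signals. The corpus, labels, and model here are toy assumptions for illustration, not any platform’s actual system:

```python
# A toy illustration (not any platform's real model) of how biased
# training data teaches a classifier that identity terms are "dirty words."
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training set: identity terms co-occur almost exclusively
# with porn-labeled titles, as they do in mainstream heteroporn metadata.
titles = [
    "bisexual girl in MFF threesome",        # porn
    "lesbian dominatrix uses strap-on",      # porn
    "hot lesbian teens kissing",             # porn
    "bisexual amateur casting video",        # porn
    "weekend gardening tips for beginners",  # not porn
    "how to fix a leaky kitchen faucet",     # not porn
]
labels = [1, 1, 1, 1, 0, 0]  # 1 = porn, 0 = not porn

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(titles)
model = MultinomialNB().fit(X, labels)

# A nonpornographic community announcement now trips the filter,
# because "bisexual" and "lesbian" were only ever seen in porn titles.
post = ["bisexual and lesbian support group meets tuesday"]
print(model.predict(vectorizer.transform(post)))  # -> [1], flagged as porn
```

Because the identity terms only ever appear alongside porn labels in the training data, the model has no basis for distinguishing a support group announcement from a porn title.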
These effects are felt particularly acutely by LGBTQIA+ children. When children are positioned as naturally “pure,” with no inner sexual drives, they find themselves increasingly dependent on adult “protection” and evacuated of all agency and autonomy.8 As Henry Giroux argues,
Unable to understand childhood as a historical, social, and political construction enmeshed in relations of power, many adults shroud children in an aura of innocence and protectedness that erases any viable notion of adult responsibility even as it evokes it. In fact, the ascription of innocence largely permits adults to not assume responsibility for their role in setting children up for failure, for abandoning them to the dictates of marketplace mentalities that remove the supportive and nurturing networks that provide young people with adequate healthcare, food, housing, and educational opportunities.9
For our purposes, we can expressly view these educational opportunities through the lens of sex education. By asserting that children and adolescents are pure, sexless beings, parents and other authority figures simultaneously deny their responsibility for educating children about sex and sexuality. Instead, children are left to learn about sex and sexuality from the pornographic marketplace, which misrepresents sex and sexuality badly enough for heterosexual children and is wildly nonrepresentative for LGBTQIA+ children and adolescents looking to learn about and explore their sexuality.
What makes all of this even harder to swallow is that this entire system of porn censorship is not really slowing down the production, distribution, and consumption of pornography. Mainstream heteroporn proliferates, as do the structurally produced ills of sex work within this largely heteropatriarchal mode of production. Sure, content moderation does a rather good job of keeping nudity off Facebook, YouTube, and Google Images, and technology companies celebrate themselves for putting pornography out of sight and thus out of mind. What these systems actually do, however, is make it more difficult to accidentally stumble upon porn. They don’t make it much more difficult at all to find porn if you are looking for it, even if you are not supposed to be able to find it (as in the instance of many adolescents). This focus on preventing exposure to porn at some times while facilitating access to it at other times has had very problematic effects on the range of sexual expression that can be readily found in pornography. By setting themselves up as gatekeepers and trying to determine the exact instances when a person may want to view pornography, technology companies play into the hands of the mainstream heteroporn industry. The mainstream heteroporn industry alone is capitalized, horizontally integrated, and vertically integrated enough to force its product through this gauntlet of censorship. Through a combination of SEO, sophisticated hub-and-spoke affiliate networks, and legal representation, mainstream heteroporn producers make sure their content is always available and nearly exclusively so. Barred from revenue by undue censorship, deprioritization in search, shadow bans, and content demonetization, and lacking the size and capital to contest these decisions, niche producers of feminist and LGBTQIA+ pornography at best become largely invisible and at worst cease to exist.
In essence, we increasingly find ourselves in a digital world where sexual expression is considered to be a private matter, not meant to take place on social networks but only to be consumed or enacted in private. I imagine here an archetypical person who holes up in a room with the door closed, wakes up their screen, and signals that now is the time they’d like to engage with mainstream heteroporn. If this engagement with heteronormative porn is confined to the privacy of the bedroom, then the engagement with queer porn can be understood as once again confined to the silence and invisibility of the closet. This increasing tendency to bracket pornography to a digital bedroom, safely distant from the social media we increasingly understand as our digital public sphere, privileges heteronormativity. As we’ve seen, LGBTQIA+ discourse requires some level of tolerance for sexual speech in the public sphere—as does much of feminist discourse (e.g., addressing marital rape requires that the private become public for just and democratic solutions to be found). As if it weren’t enough that the bulk of sexual expression is filtered out of public discourse, nonpornographic LGBTQIA+ discourse is overblocked, and LGBTQIA+ pornography is rendered invisible or nonexistent.
The result of all of this is what I have called “the digital closet.” The digital personae of LGBTQIA+ people are forcibly stripped of all sexual expressivity after having been pornographied, and they are forced to digitally segregate that aspect of themselves from their everyday online existence. To not have your account banned, to not have your content censored, to not find yourself demonetized, or, in short, to participate in this new internet-mediated world of ours, you must relegate a certain part of your identity to a digital closet—usually one with a gym bag containing the few odd bits of pornography that push the boundaries of the “abnormal” sexual desires that you’ve been able to scrape out from the homogeneous glut of mainstream heteroporn (with little help from tube sites or Google Search). As Michèle Barrett and Mary McIntosh note, this can lead to “a prison whose walls and bars are constructed of the ideas of domestic privacy and autonomy.”10
Taken individually, each instance of heteronormative bias I’ve examined throughout the book and recapitulated above is rather easily dismissed by technology companies’ public relations departments as simply a mistake made by enormously complex systems operating at web scale on billions of pieces of content or as the rogue misogyny and homophobia of a few bad actors. Perhaps even more unfortunately, these arguments are convincing to a sizable portion of their users. Lisa Nakamura has found similar explanations for racism online, which is often positioned as “a ‘glitch’ or malfunction of a network designed to broadcast a signal, a signal that is hijacked or polluted by the pirate racist.”11 Following Nakamura, I hope to have shown that heteronormativity is not a glitch online but a feature of the internet writ large. By connecting a broad overview of misogynist and heteronormative discourse online (chapter 1) to the coding practices and content moderation policies at technology companies (chapter 2) and demonstrating their broad, enduring, and consistent negative impact on LGBTQIA+ communities over time (chapters 3 and 4), my hope is that in aggregate these many cases and examples might serve as a convincing gestalt from which we can begin to see the growing heteronormativity of the internet.
For some, this will likely still be an ersatz argument, lacking the smoking gun of a direct admission of guilt or the empirical evidence of a heteronormative module embedded in every algorithm on the internet, as some people are still wont to give the benefit of the doubt to technology companies. Unfortunately, with black-boxed proprietary algorithms dominating the internet and a scattered and ephemeral archive of overzealous censorship, it will be difficult to ever convince these people. Further, the fact that cultural, political, and economic victories are never securely won but must continually be refought can inspire cynicism, apathy, and, in the worst cases, nihilism. However, this is the harsh reality that we must face. Just as the door to the closet seemed to have been pried open with the blood, sweat, and tears of millions of people, its logic is being rearticulated in our digital world and embedded in the infrastructure of the internet. This battlefront has been reopened, and like a hydra, heteronormativity has reared another head.
In light of this, critique is not enough. In my opinion, ending the book here would be dodging the key question implicit in any such critique; namely, what can be done? For those willing to see the whole that emerges from these many parts, there are some steps we might take, ranging from revisionist actions that might make the argument more convincing and ameliorate some of the worst heteronormative abuses of power to revolutionary ones that might reshape the internet and society for the next generation. While I will outline the beginnings of some potential strategies and tactics that might be useful in the battle at hand, I would like to offer some caveats. My ideas here will be partial, perspectival, and quite possibly wrong. There can be no singular answer to this most difficult of questions, and someone who enjoys my privileges is perhaps least qualified to respond. As such, I’d invite you to correct my response, to critique my critique, even if it means tearing down everything I’ve pieced together here to start anew or exposing the normativity in my own analysis. It is my hope that smarter and more qualified people than me will be determining the course of action needed in response to the digital closet.
The revisionist response to the digital closet includes collective actions that we might all engage in to strengthen our case against the increasing heteronormativity of the internet and to ameliorate some of the harms that it inflicts, unduly borne by the most marginalized in our communities. The revisionist response is meant to provide some framework for what can be done immediately or in the short term while more expansive responses are formulated and implemented. This is a culture war of many fronts and will take a steadfast, diverse, and distributed set of actors committed to many different strategies and tactics over different time frames to make significant progress. Toward that end, here are a few of the action items that I think are readily achievable and can be articulated within the preexisting framework and discourse on the internet, free speech, and civil rights that are prominent in Silicon Valley.
Vigilance and Accountability through Data Collection
We—and by this I mean the alliance of people willing to work toward queering our internet architecture—need more, better, and longer-duration data on internet censorship. While we could demand this from companies themselves—or we could demand that our governments demand it on our behalf—it is unlikely that they will provide it. The possibility of spammers reverse engineering their filtration systems from this data would endanger their ad revenue too greatly for them to provide this information willingly. If it cannot be obtained by demand, it ought to be collected independently by research centers, universities, and community members. Some initial efforts have been made in this direction, but they are not well funded or robust enough. Ideally, everyone on social media would know where to go and how to submit a report of the overzealous censorship of sexual speech. With a large enough dataset, we can make much more convincing arguments; we can demonstrate that heteronormativity is not a glitch but a feature of the internet.
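What might one record in such a dataset? Here is a minimal sketch, in Python, of the fields a crowdsourced overblocking report could capture; the field names and categories are my own assumptions, not the schema of any existing project:

```python
# A minimal sketch of what a crowdsourced censorship report might record.
# Field names are illustrative assumptions, not any existing project's schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CensorshipReport:
    platform: str            # e.g., "Instagram", "YouTube", "Tumblr"
    action: str              # e.g., "removal", "demonetization", "shadowban"
    content_type: str        # e.g., "sex education", "art", "activism"
    stated_reason: str       # the policy the platform cited, if any
    community_tags: list[str] = field(default_factory=list)  # e.g., ["LGBTQIA+"]
    appeal_outcome: str = "none"   # "none", "pending", "denied", "reinstated"
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# With enough of these records, aggregate analysis can show that
# overblocking is patterned, not a random "glitch":
report = CensorshipReport(
    platform="Instagram",
    action="removal",
    content_type="sex education",
    stated_reason="adult nudity and sexual activity",
    community_tags=["LGBTQIA+", "sex worker"],
)
```

Standardized records like these are what would let researchers demonstrate that takedowns cluster around particular communities rather than occurring at random.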
Initiate a Public Discourse on Sexual Speech
We need to be having a much more robust conversation about what constitutes pornography and in which contexts, about when it is actually in the best interests of children and adolescents to censor it, and about how best to do so. This conversation needs to better reflect LGBTQIA+, sex-positive, and sex-critical voices. We need to figure out what values we actually share and examine how they intersect with civic justice. In doing so, we need to consider the evidence we have about sexual speech, and pornography in particular.
More and Better Evidence on the Impact of Sexual Speech
Throughout my research for this book, it was a struggle to connect the incredibly heterogeneous and siloed empirical evidence that came to bear on sexual speech online. This is no wonder, as disciplinary boundaries often prevent the very confluences of ideas necessary to address a problem like this. The problem is only exacerbated by the difficulty of getting funding and institutional review board (IRB) approval for studies on the impact of sexual speech, especially when they examine people under eighteen years old. It would be helpful if we advocated for more, better, and reproducible studies of the impact that sexual speech has on people, studies that are then confirmed through multiple repeat trials. This same energy ought to be applied as well to researching the material impacts of online sex work so that we can better understand the needs of digital sex workers. The social sciences are particularly well equipped to do this if we make it a priority.
Anti-Censorship Commitment
In 2007, Google shareholders voted down a sweeping anti-censorship initiative.12 Similar initiatives have been introduced at or suggested to other internet platforms to no avail. We ought to press these companies to reconsider anti-censorship commitments and press our governments to put similar commitments into legislation and bureaucratic regulations as well. While anathema to shareholders, these commitments easily fit within the techno-libertarian, free speech–oriented ethos of the technology sector and can be argued for on grounds that are thus familiar to tech executives. Extracting a specific commitment to protecting LGBTQIA+ discourse online would be particularly beneficial, as such a commitment could be brought to bear as pressure on companies to redress grievances more quickly and thoroughly.
Better Adjudication Mechanisms
One of the more opaque aspects of content moderation online is the set of adjudication mechanisms available to people who believe their content was blocked unjustly or in error. The accounts that I came across repeatedly showed tech companies sending out mixed messages, responding to each complaint with vague form letters, or ignoring requests for adjudication altogether. We ought to advocate for more carve-outs for LGBTQIA+ discourse and sexual speech and for specific channels of adjudication for content that may have been blocked due to heteronormativity and/or homophobia. This is a rather low-cost solution and fits within the content moderation workflow that already exists at most tech companies—it is a simple matter of prioritizing and escalating LGBTQIA+ content to the more senior and better-trained moderators and/or instating targeted carve-outs to preserve LGBTQIA+ discourse. These costs, it could be argued, would easily be offset by the benefits of avoiding the embarrassing public relations nightmares of censoring clearly nonpornographic LGBTQIA+ content.
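To show how little this asks of existing systems, here is a hypothetical routing rule of the kind that could sit on top of a moderation queue; the queue names, tags, and thresholds are illustrative assumptions, not any company’s actual workflow:

```python
# A hypothetical routing rule for a moderation queue, sketching the
# escalation path proposed above; names and thresholds are illustrative.
def route_flagged_item(item: dict) -> str:
    """Decide which review queue a flagged post goes to."""
    # Carve-out: content self-tagged or classified as LGBTQIA+ discourse
    # is escalated to senior, better-trained reviewers instead of the
    # default high-speed queue.
    if "lgbtqia" in item.get("topic_tags", []):
        return "senior_review"       # more time, more context, more training
    # Borderline scores also get a slower, more careful look.
    if 0.4 <= item.get("porn_score", 0.0) <= 0.8:
        return "extended_review"
    return "standard_review"         # clear-cut cases stay in the fast queue

print(route_flagged_item(
    {"topic_tags": ["lgbtqia", "sex education"], "porn_score": 0.55}
))  # -> senior_review
```

The point is that escalation is a branch in a routing function, not a re-architecture of the moderation pipeline.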
Demand AI Explicability
Big data and AI ethics are rapidly growing discourses that increasingly stress the need for neural network explicability and interpretability. Some computer scientists argue that this will unnecessarily handcuff the development of AI systems.13 However, it is the only means for having a public discourse on such systems. Recent trends in neural network research have begun to demonstrate methods for feature visualization and attribution in neural network applications.14 We ought to demand that companies applying machine learning and neural networks to content moderation institute more robust feature visualization and attribution and make these outputs publicly available so that we might better understand how their algorithms are working and offer constructive criticism for improving them.
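As a toy illustration of what attribution output could look like, the following sketch scores a post with each token removed and attributes the drop in the score to that token. The scorer here is a deliberately crude stand-in; in practice it would be the platform’s actual moderation model:

```python
# A minimal occlusion-based attribution sketch: score the text with each
# token removed and attribute the drop in score to that token. The scorer
# is a stand-in for illustration, not a real moderation model.
def keyword_scorer(text: str) -> float:
    """Stand-in 'porn probability' scorer for illustration only."""
    dirty = {"lesbian": 0.4, "bisexual": 0.4, "threesome": 0.5}
    return min(1.0, sum(w for t, w in dirty.items() if t in text.lower()))

def token_attributions(text: str, score_fn) -> dict[str, float]:
    tokens = text.split()
    base = score_fn(text)
    attributions = {}
    for i, tok in enumerate(tokens):
        occluded = " ".join(tokens[:i] + tokens[i + 1:])
        attributions[tok] = base - score_fn(occluded)  # score drop = blame
    return attributions

text = "bisexual and lesbian support group meets tuesday"
print(token_attributions(text, keyword_scorer))
# The identity terms receive all the blame for the high score.
```

If platforms published attributions like these for blocked content, the public could see directly when identity terms themselves are what trip a filter, and criticism could be aimed at the specific failure rather than at an opaque system.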
Demand “Human Algorithm” Explicability
In the wake of the content moderation scandals that surrounded the 2016 US presidential election, Facebook introduced transparency measures to its content moderation policy making. This first step is laudable and ought to be replicated industry-wide. It needs to be taken further, though: transparency ought to be extended to the public or to nonprofit industry watchdogs who can keep track of who is making content moderation policies, who is influencing these policy makers, and who is enacting these policies and making decisions about individual pieces of content. Moderation of sexual content ought to be further prioritized, with more care and consideration given to policy making and more training given to content moderation laborers. Ideally, this would also include location or cultural context being factored into decision-making. More use ought to be made of the click-to-reveal dynamics implemented at companies like Facebook for potentially gory photos, allowing borderline sexual content to persist on the site behind a click-through barrier or even behind age verification, though the latter is rife with its own problems.
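A sketch of what that tiering might look like as a decision rule follows; the thresholds and outcome labels are illustrative assumptions rather than any platform’s actual policy:

```python
# A sketch of the tiered outcome described above: instead of a binary
# remove/allow decision, borderline sexual content persists behind a
# click-through (or age-verification) barrier. Thresholds are illustrative.
def moderation_outcome(porn_score: float, age_verified: bool = False) -> str:
    if porn_score < 0.3:
        return "show"                    # clearly nonsexual: no gate
    if porn_score < 0.7:
        return "show_behind_click"       # borderline: interstitial warning
    # explicit: gate behind age verification rather than deleting outright
    return "show_after_age_check" if age_verified else "hide_pending_age_check"

for score in (0.1, 0.5, 0.9):
    print(score, moderation_outcome(score))
```

The design choice worth noting is that the middle tier preserves the content rather than deleting it, which is precisely what a binary filter cannot do.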
Reinstate the Off Button
Google’s SafeSearch and the comparable systems at other companies that host but mask pornography on their platforms need to reinstate a full opt-out option. All content on these platforms should be indexed and searchable with the same ease, and the decision of when to show or not show “pornographic” results ought to be left to users rather than to keyword and behavior-based predictive analytics. Gating pornography behind a select few keywords puts mainstream heteroporn producers at an undue advantage, as they can leverage their technological prowess, access to corporate lawyers, and advertising capital to make sure their content is “optimized” to show up first in any content search. This is a relatively simple and cost-effective solution to implement and thus a demand worth making. Similar demands ought to be made if other platforms can be convinced to host sexual speech behind click-through or age verification barriers, though, as of now, this demand pertains mostly to Google.
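The core of the demand is small enough to state in a few lines of code: a stored, user-controlled setting, not a behavioral prediction, decides whether adult results are masked. A minimal sketch under those assumptions:

```python
# A sketch of search filtering governed by an explicit user setting rather
# than behavioral prediction: everything stays indexed, and the user's
# stored preference (not a predictive model) decides what is shown.
def filter_results(results: list[dict], user_safe_search: bool) -> list[dict]:
    if not user_safe_search:        # full opt-out: nothing is masked
        return results
    return [r for r in results if not r["adult"]]

index = [
    {"title": "queer feminist porn collective", "adult": True},
    {"title": "lgbtq sex education resources", "adult": False},
]
print(filter_results(index, user_safe_search=False))  # opted out: all results
print(filter_results(index, user_safe_search=True))   # opted in: masked
```

Everything remains in the index either way; the only thing the setting controls is display, which is what distinguishes a genuine opt-out from keyword gating.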
The revolutionary response to the digital closet encompasses those strategies that aim for changes that are much more difficult to achieve or need to occur over a longer time frame. The revolutionary response needs to remain flexible and responsive to social contexts and the needs of the marginalized. It is particularly difficult to imagine because we are all fed a narrative of the inevitability of our current technologies and the impossibility of thinking outside the frameworks of the nation-state and capitalism. That said, it is worth staking out some initial thoughts on what such a response might look like, even though it will inevitably fall short. Toward that end, here are some action items that might help us orient a revolutionary response to the digital closet, each of which requires a more or less radical break from our current ideology and the current state of affairs.
Defund the Police
Following the calls of the Black Lives Matter movement and other social justice organizers, we ought to make defunding the police a core strategy. The particular focus ought to be on defunding vice squads that enforce prostitution laws and criminalize sex work, as well as the branches of the Justice Department now focused on the overbroad enforcement of FOSTA. Where police departments continue to exist, we might also follow New Zealand’s model and train police officers to be more accountable and available to LGBTQIA+ and sex worker communities so that those communities can have equal access to protection under the law. Extending that concept, police also ought to be better equipped to handle the types of online harassment that digital sex workers might face, including trolling and stalking.
Legalize Sex Work Online and Offline
A tightly coupled second aim ought to be to legalize sex work, both online and offline, in recognition that the criminal justice system is not the appropriate apparatus to address the material ills of sex work. This has the added benefit of creating a loophole in FOSTA, which notably does not apply in Nevada because of state legislation on prostitution there. While a more revolutionary approach would be to demand this at the federal level, it also works as a revisionist approach, as the same idea can be applied at local and state levels perhaps more immediately.
Make Sex a Concern for the Welfare State
Again, following the trends in the current progressive movement of demanding an expansion of the welfare state—including universal health care, sweeping environmental regulations, unemployment insurance, and so on—we might add to that list that sex be treated as a public health concern and an important prong of the welfare state. I mean this both in a rehabilitative sense—offering social services like housing, health care, food, job training, education, and so on, to sex workers (regardless of whether they agree or intend to exit sex work)—and in a more proactive and positive sense. By the latter, we might begin to think about fulfilling, enjoyable, diverse sex as part of what it means to live a healthy and happy life. We might radically expand and diversify sex education, not only in public schools but also in public discourse through public service announcements and other informational campaigns. We might aim to become the society we already imagine ourselves to be, one that can openly talk about sex and sexuality in a productive, informative, transformative sense. Needless to say, I imagine this in an anti-heteronormative and feminist sense that would increasingly highlight things like consent and mutual pleasure.
Direct Action through Community Organizing
While this is less difficult to imagine, as many LGBTQIA+ and sex worker communities are already engaging in the practice, we might imagine radically expanding our communal capacities to address our concerns directly without the need to appeal to state or corporate powers. I am thinking here of the sex worker activist groups AIDS Myanmar Association, Durbar Mahila Samanwaya Committee, Veshya Anyay Mukti Parishad, and the Thai group Empower, or of trans and queer community groups looking to address violence without recourse to police, like the Safe Outside the System Collective of the Audre Lorde Project in New York City; For Crying Out Loud!, Communities Against Rape and Abuse, and the Northwest Network of Bisexual, Trans, Lesbian and Gay Survivors of Abuse in Seattle; Creative Interventions and Generative Somatics in Oakland; Community United Against Violence in San Francisco; and Philly Stands Up!15 In particular, we can look to The Revolution Starts at Home and the Creative Interventions Tool Kit as inspiration for how community problems can be solved by committed community members engaging in direct action.16 This is a model we might look to expand on and develop.
Make Communications Infrastructures and/or Social Media Platforms into Public Utilities
It has always struck me as odd that turning phone carriers, ISPs, and now social media platforms into public utilities has not been a more prominent demand among progressive organizers. It is nearly impossible to access state services or maintain gainful employment without maintaining perpetual internet and mobile phone connectivity, and it has become increasingly difficult to navigate higher education and the workplace without using social media. Internet and telephone services already function as de facto public utilities, as increasingly do social media platforms and other technologies, such as Google Search. We might take that model to rethink technology’s place in our society and either demand public ownership or an extremely restrictive private licensing agreement in which companies are allowed to provide the service for limited profit but under tight constraints aimed at the public good. This goal would subsume similar but smaller-scale goals like reinstituting net neutrality or extending net neutrality to mobile communications. The result might be universal and free access to phone and internet communications and tighter regulations on content moderation policies—making them responsive to our needs rather than advertisers’ brand images.
Fully Automated Luxury Gay Space Communism
In the end, many of the strategies here are interconnected with and dependent on a much larger revolutionary movement toward the overthrow of global capitalism and its attendant imperialist nation-states. With power concentrated in the hands of either, we’re left to fend for ourselves and take what moderate revisions we can get. While it may be a yet-to-be-imagined -ism that gives shape to an allied intersectional revolutionary movement like this, for now, the closest concept we have for imagining a society that can meet these demands of radical democracy, robust social welfare, and freedom of self-expression is, to me, communism—particularly of the variety often memed about in earnest on the internet, Fully Automated Luxury Gay Space Communism. Let’s all blast off together.