Sex workers are not the only ones switching apps to stay safe – queer populations across MENA use the same techniques. “The queer community, they use Grindr to meet up,” one lawyer told us in an interview. “If there are security alerts, then they would use more private applications.”
If Grindr is the only app to survive Apple’s policy changes, that option of switching between apps disappears. It is unclear whether any apps have been removed since the policy change was announced, and because the App Store checks apps for compliance when they are updated, we may not know for a while what enforcement will look like. But we don’t need to predict the future to see what happens when smaller apps run afoul of Apple’s rules – we can look at what’s already happened. Sanctions on Iran have already removed access to certain essential technologies.
“These bans and limits […] have decreased my access to communication and networking tools,” one Iranian queer person told ARTICLE 19. Another says the apps remain vital, despite the risks some can pose. “I knew some of the accounts on these apps were fake, however, when I had access to them, I had this hope that there’s someone like me, someone living nearby that I could reach out to,” they say. “[The] decision to ban Iranian users has killed that hope.”
Ro Isfahani, a journalist with a focus on Iran, also believes that “queer people in high-risk environments [will continue to bear] the brunt of these policies”. He was clear about the potential consequences of these seemingly technical changes, saying “this new policy is certain to further curb queer folks’ access to safe spaces that they have nurtured despite risks to their lives”.
Apple and Google are not the only companies afraid of sex. For decades, advocates who equate all sexually explicit materials with exploitation or trafficking, such as the National Center on Sexual Exploitation, formerly Morality in Media, have suggested that the best thing to do is ban sex on the internet entirely. Recently, anti-sex campaigners have been remarkably successful, driving payment card processors such as Mastercard and Visa to drop Pornhub, and pushing Instagram to equate sexually explicit content with violence as part of its sensitive content restrictions.
Some of that has been a result of (actual or manufactured) litigation fears stemming from the United States’ passage of the Fight Online Sex Trafficking Act, or FOSTA, in 2018. The law, which claimed to be about preventing sex trafficking, made changes to Section 230, a bedrock law that limits liability for online platforms. Although these changes were minor, they caused significant upheaval in companies’ willingness to host sexual content. Other events, such as the closure and prosecution of Backpage, left limited options for consenting adults who wanted to engage with sexual material online. But Apple’s policy changes, in particular, sweep far broader than what FOSTA covers. (Google’s changes hew closer to the law.)
Google and Apple’s intentions may be good – people deserve to be able to curate what they see on their phones and should only see sexually explicit content that they consent to. And advocates have been working to make apps safer for high-risk users. But kicking services out of the App Store because of the presence of NSFW materials forces apps to choose between policing people and not being able to reach them. When they do that, LGBTQ people, across countries and contexts, lose.
Afsaneh Rigot is a senior researcher at ARTICLE 19 and a Technology and Public Purpose fellow at Harvard Kennedy School’s Belfer Center. Kendra Albert is a clinical instructor at the Harvard Law School’s Cyberlaw Clinic.