Lips Founder Annie Brown Wants To Make Social Media Less Sexist And More Inclusive

“Rather than serve us, these platforms censor us.”

Annie Brown of Lips

Presented by Ascend Agency

On December 20th, 2020, Instagram updated its terms of service – presented by the tech giant as merely a reorganization and clarification of existing guidelines. However, the claim was challenged by users who noticed that the new terms placed stricter limitations on sexual content (including art, emojis, language, etc.).

Add to this the reality that the algorithms governing moderation on Instagram and other social platforms have historically been very bad at differentiating between sexual expression and sexual exploitation.

Annie Brown and the Lips team are out to change this. Launched in January 2021, the Lips app is an Instagram alternative built for historically marginalized creators who need a place on the internet to express themselves, free from biased censorship and harassment.

Lips is at the forefront of a new wave of feminist internet technologies designed to unlock opportunities for previously underserved and intersectionally marginalized communities.

Lips founder Brown explains, “Rather than serve us, these platforms censor us; stunt our businesses; allow us to be bullied, trolled, harassed; and ultimately cause us harm that perpetuates injustice and inequality. But now we’ve built something better.”

Mainstream social media platforms like Instagram have been on the defensive as of late, attempting to prove that algorithmic suppression of women, BIPOC, and other marginalized groups is nothing more than an illusion.

One such tweet from Instagram’s PR team drew over 1,000 comments, mostly from sex-positive creators and sex workers who have experienced targeted de-platforming by Instagram despite adhering to ever-changing, increasingly strict guidelines.


Despite IG’s claims, it is well-established that the social media apps we use every day have adopted a blanket-censorship policy toward sexual content, which disproportionately impacts women and the LGBTQ community. Reports have shown that bias encoded into the algorithms that moderate content makes them more likely to flag content shared by women, queer, and trans folks.

In fact, the Huffington Post published an article citing a common practice on Instagram that helps boost exposure and engagement: women changing their listed gender to male.

Brown elaborates on this point: “Artists, sex educators, and women’s health companies are among the vast array of groups whose posts are often hidden by Instagram, or ‘shadow-banned,’ resulting in limited exposure to the people who benefit from consuming their content or utilizing their services—women and LGBTQ+ persons.”

Why is this happening? Well, to Brown, there are several compounding factors. Legal changes, namely FOSTA/SESTA, have pushed big tech companies to be more cautious; however, nothing in these laws explicitly calls for the erasure of all sexually related content.

https://www.instagram.com/p/CQ6prozJRUc

Another factor, says Brown, “is the fact that the algorithms governing what content can be seen on these platforms are written primarily by white, cis, heterosexual men – causing datasets to be biased.”

Brown describes Lips’ social media platform as a jumping-off point for the feminist technology company to begin solving the Internet’s sexism problem. Here are two of the impressive technologies Lips has in the works:

  • [Patented] Blockchain privacy features that can protect survivors of domestic abuse or sexual harassment, and are therefore equipped to better protect everyone
  • [Patented] Contextual, more nuanced, and co-moderated algorithms to correct for the 73% of LGBTQIA+ content wrongly flagged as “inappropriate”

Brown states, “An algorithm that can accurately distinguish sexual expression from sexual exploitation will help larger internet companies finally put an end to child trafficking on their sites.”
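
For illustration only, here is a minimal sketch of the distinction Brown is describing: a blanket keyword filter flags any post containing a restricted term, while a context-aware check weighs additional signals before deciding. This is not Lips’ patented algorithm; every keyword, signal, weight, and threshold below is a hypothetical placeholder.

```python
# Illustrative toy contrast between blanket keyword filtering and
# context-aware moderation. NOT Lips' patented algorithm; every keyword,
# signal, weight, and threshold is a hypothetical placeholder.

BLOCKED_KEYWORDS = {"vaginal health", "lesbian", "bisexual", "gender"}

def blanket_filter(post_text: str) -> bool:
    """Flags a post if any restricted keyword appears, regardless of context."""
    text = post_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

def contextual_filter(post_text: str, context: dict) -> bool:
    """Flags a post only when a keyword hit combines with risk signals.

    `context` carries hypothetical signals, e.g. whether the account is a
    verified health educator or whether the post solicits a transaction.
    """
    if not blanket_filter(post_text):
        return False
    risk = 1.0  # a keyword hit alone starts at full risk
    if context.get("verified_health_educator"):
        risk -= 0.6
    if context.get("educational_caption"):
        risk -= 0.3
    if context.get("solicits_transaction"):
        risk += 0.5
    return risk >= 0.8  # hypothetical threshold

post = "A guide to vaginal health after childbirth."
signals = {"verified_health_educator": True, "educational_caption": True}
print(blanket_filter(post))              # True: the keyword alone triggers a flag
print(contextual_filter(post, signals))  # False: context keeps the post visible
```

In this toy example, the same women’s health post is hidden by the blanket filter but stays visible once context is taken into account, which is the kind of nuance Brown argues current moderation systems lack.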

These technologies are also important from a mental health perspective, says Brown: “Unlocking people’s confidence in their bodies and sexualities empowers them in every other aspect of their lives, from health to finances.”

Brown is not alone in her effort to make the Internet a more inclusive place. In our interview, she mentioned several women who are fighting alongside Lips to raise awareness about social media discrimination.

In the summer of 2019, Unbound and Dame Products – two women’s wellness companies – organized a protest outside of Facebook’s New York headquarters to bring awareness to the selective enforcement of advertising guidelines, which allows ads for men’s sexual health products while banning brands targeting women.

As a part of a report by Salty titled “An Investigation into Algorithmic Bias in Content Policing on Instagram,” Polly Rodriguez, CEO of Unbound, stated that her company “is consistently banned from advertising on Facebook and Instagram and it’s debilitating to our business.

“It’s also infuriating, because we see endless ads on the same platforms for erectile dysfunction medication, penis pumps, and ‘manscaping’ razors. Why are penises normal but the female and non-binary body considered a threat?”

The global women’s health technology market is projected to exceed $50B by 2025, and yet keywords currently flagged in ads and articles on social media platforms include “vaginal health,” “lesbian,” “bisexual,” and “gender,” making it very difficult to promote this growing and important industry.

“Right now,” says Brown, “there are so many potential revenue opportunities being lost because platforms are not adequately serving women, non-binary, BIPOC, and LGBTQIA+ people in digital space.”

Like Brown, women’s health investor and advocate Maria Velissaris, Founding Partner at SteelSky Ventures, is passionate about seeing companies develop solutions that improve access, care, and outcomes so that women can thrive. “Censorship of descriptive language impedes the ability of these companies to scale,” says Velissaris.

Femtech companies like Joylux, which makes a therapeutic device developed to help women with incontinence, vaginal dryness, and other conditions related to aging and childbirth, are often victims of gender-biased algorithms. Although the product is a clinically backed, FDA-approved medical device, social media algorithms identify and categorize it as an adult sex toy.

As Velissaris explains, “These companies are at an extreme disadvantage when marketing on social platforms because language around female sexual health and wellness is banned.”

She continues, “There is an obvious double standard, as language around men’s reproductive health, such as ‘sperm’ and ‘erectile dysfunction,’ is not banned.”

Bans on advertising and marketing perpetuate gender inequities in healthcare and drive inequitable funding for women’s health companies.

“If companies can’t advertise, they can’t sell products. If they can’t sell products, they can’t secure investment. If they can’t secure investment, the company folds,” says Velissaris. “If the company folds, we don’t have products/services to improve women’s healthcare, which is a very underserved market and in need of significant innovation.”

While biased data is one part of the issue, another major problem is the algorithmic model itself. During our interview, Brown mentioned the good work the Center for Humane Technology (CHT) is doing to bring light to this issue.

The mission of CHT, a non-profit, “is to drive a comprehensive shift toward humane technology that supports our well-being, democracy, and shared information environment.”

Cailleach De Weingart-Ryan, the former Chief of Staff at CHT, explains that because these algorithms are optimized around engagement, it makes sense that the models would replicate societal biases by deprioritizing content (such as women’s health or LGBTQ content) that goes against norms.

“The underlying business models of FB, Instagram and other social media platforms require engagement and ‘eye-balls’ to generate advertising dollars,” says Weingart-Ryan.

“What is most engaging to the human brain includes radicalized versions of the truth and things that confirm our pre-existing beliefs. What is less engaging – the boring and/or complicated truth and/or things that challenge our existing beliefs.”

In other words, minority groups are less familiar to many users and present belief systems and perspectives that might challenge what individuals already consider to be true. As a result, their content may not get the same “reach” and “virality” boost that a white male spouting radical conspiracy theories aligned with existing beliefs and biases gets.

Weingart-Ryan reminds us that there is no person or persons behind the scenes at Facebook actively silencing minority groups: “There are humans, however, who built an algorithm that promotes and expands the reach of content that is likely to increase engagement. If the voices of minorities are not somehow driving engagement (and thus advertising dollars), those voices are not going to garner the same ‘reach’ as others.”
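
To make the mechanism Weingart-Ryan describes concrete, here is a toy sketch of an engagement-optimized feed. It is not any platform’s actual ranking system; the post names and scores are invented. The point is simply that ranking purely by predicted engagement gives less reach to whatever an engagement model under-scores, with no explicit decision to suppress anyone.

```python
# Toy sketch of engagement-optimized ranking. All names and scores are
# invented to illustrate the mechanism; no real platform data or model is used.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    predicted_engagement: float  # output of a hypothetical engagement model

def rank_feed(posts: list[Post]) -> list[Post]:
    """Orders the feed purely by predicted engagement.

    Nothing here targets any group directly, yet whatever the engagement
    model under-scores ends up lower in the feed and gets less reach.
    """
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = [
    Post("conspiracy_account", "radical claim confirming existing beliefs", 0.92),
    Post("health_educator", "nuanced women's health explainer", 0.41),
    Post("lgbtq_creator", "content outside mainstream norms", 0.38),
]

for position, post in enumerate(rank_feed(feed), start=1):
    print(position, post.author, post.predicted_engagement)
```

In this sketch, the nuanced health and LGBTQ posts land at the bottom of the feed only because the hypothetical engagement model scores them lower, which is exactly the indirect dynamic Weingart-Ryan points to.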

Co-designing with and for communities is at the heart of Lips’ approach to building inclusive technologies. “When built right (aka ‘with’ not ‘for’ communities), these technologies ultimately better serve everyone,” says Brown.

Brown believes that “more ethical, more secure, more inclusive technologies built by women, BIPOC, and LGBTQIA+ communities are about to become highly sought after and valuable,” which is one reason why Lips recently launched an equity crowdfunding campaign to allow users and advocates to become investors and profit as Lips grows.

“Marginalized communities have always pioneered internet technologies, but unfortunately we have not always had the opportunity to own those technologies and therefore profit off their expansion,” says Brown.

Lips is already deeply invested in their community’s well-being – from co-design sessions to community guidelines to an active user feedback group on Signal – and is committed to involving the community in decision-making going forward. To learn more about Lips and their equity campaign, visit wefunder.com/lips.
