The app will need to find new Web hosting by Sunday or go offline.
Amazon Web Services is suspending Parler’s access to its hosting services at the end of the weekend, potentially driving the service offline unless it can find a new provider.
“Because Parler cannot comply with our terms of service and poses a very real risk to public safety, we plan to suspend Parler’s account effective Sunday, January 10th, at 11:59PM PST,” Amazon wrote to Parler in an email obtained and first reported by BuzzFeed.
The email from AWS to Parler cites several examples of violent and threatening posts made in recent days, including threats to “systematically assassinate liberal leaders, liberal activists, BLM leaders and supporters,” and others. “Given the unfortunate events that transpired this past week in Washington, D.C., there is serious risk that this type of content will further incite violence,” the message adds.
Parler launched in 2018 as a “free speech” alternative to Twitter and Facebook. Through 2019 and 2020, it drew a number of conservative, right-wing, and far-right fringe users. Usage has dramatically increased in the past few days in the wake of Wednesday’s events at the US Capitol and President Donald Trump’s subsequent total ban from Twitter and other platforms.
That increased traffic has also brought increased threats of violence to the platform, which technology companies across the board seem to be taking more seriously after this week. And no wonder: the insurrectionists who attacked the Capitol made widespread use of social media to plan, carry out, and brag about their activity.
Parler, however, has not articulated a clear plan for dealing with violent threats on its platform. As Amazon wrote:
It’s clear that Parler does not have an effective process to comply with the AWS terms of service. It also seems that Parler is still trying to determine its position on content moderation. You remove some violent content when contacted by us or others, but not always with urgency. Your CEO recently stated publicly that he doesn’t “feel responsible for any of this, and neither should the platform.” This morning, you shared that you have a plan to more proactively moderate violent content, but plan to do so manually with volunteers. It’s our view that this nascent plan to use volunteers to promptly identify and remove dangerous content will not work in light of the rapidly growing number of violent posts.
Apple also removed Parler from its iOS App Store earlier today, citing similar concerns.
“Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines,” Apple wrote.