Twitter apologized on Monday, November 6th, 2017, after thousands of Twitter users complained about certain search terms being blocked, but the ‘correction’ may not have been exactly what the twitterverse was looking for.
From TechCrunch:
Earlier this week, Twitter users complained after discovering that photo and news search results were blocked for terms like “bisexual.” Twitter apologized for what it said was an error and now it’s released a more detailed update.
“We apologize for anyone negatively impacted by this bug. It is not consistent with our values as a company,” Twitter said in a thread on its support account.
Twitter explained that the company uses terms that often appear with adult content to identify sensitive media, but since many of those terms are not problematic by themselves, that means Twitter’s system needs to weigh them in context with other signals. But the list of terms it used “was out of date, had not been maintained and incorrectly included terms that are primarily used in non-sensitive contexts.”
As a result, some tweets were incorrectly identified as containing sensitive content. Twitter says it has updated its list, removing some terms, and will implement changes over the next 24 hours.
We’ve identified an error with search results for certain terms. We apologize for this. We’re working quickly to resolve & will update soon.
1 / Late last week, we discovered a technical issue that affected search results: searches for certain words related to sexuality did not populate complete results. We apologize for anyone negatively impacted by this bug. It is not consistent with our values as a company.
2 / As outlined in our media policy, media that may be considered sensitive is collapsed in places such as search results, meaning that images and videos would be presented as a link, not automatically populated. https://support.twitter.com/articles/20169199 …
3 / One of the signals we use to identify sensitive media is a list of terms that frequently appear alongside adult content. Many of these words on the list are not inherently explicit, which is why they must be used alongside other signals to determine if content is sensitive.
4 / Our implementation of this list in search allowed Tweets to be categorized based solely on text, w/out taking other signals into account. Also, the list was out of date, had not been maintained and incorrectly included terms that are primarily used in non-sensitive contexts.
5 / When all Tweets containing certain terms were incorrectly collapsed on the photos, video and news search tabs, the search results in those tabs returned an error message indicating that no content was available.
6 / We have audited the list and removed terms that should not have been included. We are making changes during the next 24 hours to correct this mistake. Once we are confident it is completely resolved, we’ll share an update here.
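The thread above describes a classic classification failure: a keyword list that was meant to be one weak signal among several was instead applied on its own. The following is a minimal, hypothetical sketch of that failure mode; the function names, the example terms, and the corroborating signals are illustrative assumptions, not Twitter's actual implementation.

```python
# Hypothetical sketch of the bug Twitter describes: a stale term list
# applied to text alone, versus the intended design where a term match
# is only one signal among several. All names here are illustrative.

SENSITIVE_TERMS = {"bisexual", "explicit_term"}  # stale list mixing benign and adult terms


def flag_text_only(text):
    """Buggy behavior: mark a tweet as sensitive based solely on its text."""
    words = set(text.lower().split())
    return bool(words & SENSITIVE_TERMS)


def flag_with_signals(text, media_marked_sensitive, account_flagged):
    """Intended behavior: a term match alone is not enough; it must be
    corroborated by other signals before results are collapsed."""
    term_hit = flag_text_only(text)
    return term_hit and (media_marked_sensitive or account_flagged)


tweet = "Proud to be bisexual"
print(flag_text_only(tweet))                   # True: incorrectly collapsed in search
print(flag_with_signals(tweet, False, False))  # False: shown normally
```

Under the buggy text-only check, a benign tweet containing a term like “bisexual” is collapsed; under the intended multi-signal check, the same tweet passes through because no other signal corroborates the match.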
Despite Twitter’s multiple apologies, many users were upset that terms like “Hitler” and “Nazi” remained fully searchable while terms related to human sexuality were blocked, especially given the barrage of criticism Twitter has received for not disabling accounts that post hate tweets or incitements to violence. Twitter’s seeming enablement of racists and its haphazard enforcement of its own policies were highlighted last month when it temporarily suspended actress Rose McGowan after she accused producer Harvey Weinstein of sexually assaulting her (Twitter has not disclosed how or why McGowan’s account was suspended).