Predictive search results are often hit and miss, but Facebook’s search bar recently missed badly.
Typing the phrase “video from” into Facebook’s search bar surfaced sexually explicit and violent autocomplete suggestions for some users.
Clicking on these suggestions did not display content matching the searched phrase, but the incident nonetheless raises questions about how Facebook’s search autocomplete works.
Facebook has said in the past that its autocomplete mixes predictions linking to profiles and pages with predictions linking to content such as posts.
In comparison, Google said its autocomplete algorithm is based on words users type, terms that have already been searched, and what other people are looking for.
Of course, offensive results still show up in Google’s autocomplete every now and then, as in 2016, when the company removed anti-Semitic suggestions. Google allows users to report offensive predictions and has an autocomplete policy that prohibits sexually explicit, violent, hateful, or harmful suggestions.
Update 3/16, 8:18 a.m. ET: In a statement provided to Mashable, Facebook said, “We’re very sorry this happened. As soon as we became aware of these offensive predictions we removed them. Facebook search predictions are representative of what people may be searching for on Facebook and are not necessarily reflective of actual content on the platform. We do not allow sexually explicit imagery, and we are committed to keeping such content off of our site. We are looking into why these search predictions appeared, and going forward we are working to improve the quality of search predictions.”