The AI system was tested on detecting harmful misinformation about COVID-19 vaccination. It was also used to identify content intended to incite violence or that simply depicts extreme incidents. Facebook used the following example of harmful content that stops short of inciting violence: "Does this guy need all his teeth?" The announcement claims that the new AI system has already helped reduce the amount of hate speech posted on Facebook, and Facebook shared a chart showing how the amount of hate speech on the platform decreased as each new technology was deployed. Facebook calls its new technology Entailment Few-Shot Learning, and it has a remarkable ability to correctly flag written text that constitutes hate speech.
The related research paper, "Entailment as Few-Shot Learner" (PDF), states that the approach outperforms other few-shot learning techniques by up to 55%, with an average improvement of 12%. Facebook's article about the research used this example: an obvious sentiment classification input/label pair, [x: "I love your national team. You should all be six meters underground." y: positive], can be rephrased as the following textual entailment sample: [x: "I love your national team. You should all be six meters underground. This is hate speech." y: entailment].
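To make the reformulation concrete, here is a minimal sketch of classification-as-entailment using an off-the-shelf NLI checkpoint through the Hugging Face Transformers zero-shot pipeline. The model name (roberta-large-mnli), the hypothesis template, and the candidate labels are illustrative assumptions; this is not Facebook's Few-Shot Learner, whose models and training data are not public.

```python
# Sketch: score a post by how strongly it entails a label-as-hypothesis.
# Assumes the public roberta-large-mnli NLI model, not Facebook's system.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="roberta-large-mnli")

post = "I love your national team. You should all be six meters underground."

# Each candidate label is wrapped in a hypothesis such as
# "This text is hate speech." and scored for entailment against the post.
result = classifier(
    post,
    candidate_labels=["hate speech", "benign"],
    hypothesis_template="This text is {}.",
)

# Labels come back sorted by entailment score, highest first.
print(result["labels"][0], round(result["scores"][0], 3))
```

In this framing, adding a new policy category only requires adding a new hypothesis sentence, which is the flexibility the entailment formulation is meant to provide.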

Facebook is working on developing artificial intelligence that approaches human intelligence. The announcement of this new technology made it clear that the goal is a human-like "flexibility and learning efficiency" that will allow the system to evolve with trends and enforce new Facebook content policies on short notice, just like a human. The technology is in its infancy, and over time Facebook envisions it becoming more sophisticated and more widely deployed: "A teachable AI system like the Few-Shot Learner can greatly improve the flexibility of our ability to detect and adapt to emerging situations. By detecting evolving and harmful content much faster and more accurately, FSL promises to be a critical piece of technology that will help us continue to evolve and address harmful content on our platforms."