(Reuters) – Facebook Inc is introducing new detection technology to stop the spread of intimate photos posted on Facebook or Instagram without people’s permission, sometimes called “revenge porn,” the company said on Friday.
“By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram,” the social networking giant said in a blog post (https://newsroom.fb.com/news/2019/03/detecting-non-consensual-intimate-images).
Facebook will also launch a support hub called “Not Without My Consent” on its safety centre page.