(Reuters) – Facebook Inc (NASDAQ:) is introducing a new detection technology to stop the spread of intimate photos posted on Facebook or Instagram without people’s permission, sometimes called “revenge porn,” the company said on Friday.

“By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram,” the social networking giant said in a blog post https://newsroom.fb.com/news/2019/03/detecting-non-consensual-intimate-images.

Facebook will also launch a support hub called “Not Without My Consent” on its safety centre page.
