Artificial Intelligence and Extended Reality May Pose Security Risks, Expert Warns – FedTech Magazine

Protect AI Systems from Manipulation

Payton predicted that “AI poisoning” will be a growing concern in 2021. As Towards Data Science notes, a “poisoning attack happens when the adversary is able to inject bad data into your model’s training pool, and hence get it to learn something it shouldn’t.”
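To make the idea concrete, here is a minimal, hypothetical sketch of a label-flipping poisoning attack. The toy nearest-neighbor classifier, the feature values, and the “benign”/“malicious” labels are all illustrative assumptions, not anything described by Payton or Towards Data Science; the point is only that a handful of mislabeled points injected into the training pool can flip a model’s decision.

```python
def predict(training_pool, x):
    """Toy 1-nearest-neighbor classifier: return the label of the
    training point closest to x (squared Euclidean distance)."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(training_pool, key=lambda point: dist(point[0], x))[1]

# Clean training pool: two well-separated clusters.
clean = [
    ([0.0, 0.0], "benign"),
    ([0.2, 0.1], "benign"),
    ([5.0, 5.0], "malicious"),
    ([5.2, 4.9], "malicious"),
]

# Attacker injects a malicious-looking sample mislabeled as benign.
poison = [([5.06, 5.01], "benign")]

probe = [5.05, 5.0]  # clearly inside the malicious cluster

print(predict(clean, probe))           # "malicious" on clean data
print(predict(clean + poison, probe))  # "benign" after poisoning
```

The same effect scales up in real systems: the attacker does not change the model’s code, only its training data, which is why such tampering can go undetected inside a black-box model.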

In solidly built AI models, Payton noted, “your [AI] coach should be self-learning and contextually aware and almost become a black box to the engineer” once it gets up and running.

“My prediction is that, as we’re implementing more AI, hackers will hack in and change that algorithm undetected, so that the AI will do things not initially in the design,” she said. “AI is going to be cybercriminals’ weapon of choice, to help them crack into more accounts, networks and data stores.”

Organizations should not abandon AI; in many cases, they need AI tools to combat cyberattacks, Payton said. AI is also needed, she said, for “resiliency and reliability in your operations and to be able to scale your operations.”

Payton recommended that organizations make sure that all of their AI componentry “has a champion challenger test where you can actually run sample decisions outside of AI, compare it to the decision that the AI came up with and have it reviewed to make sure you don’t have issues going on inside the black box.”
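The champion-challenger approach Payton describes can be sketched in a few lines. Everything below is a hypothetical illustration, assuming a fraud-style decision task: the “champion” stands in for the opaque AI model, the rule-based “challenger” replays the same sample decisions outside the AI, and disagreements are queued for human review.

```python
def champion_ai(transaction):
    # Stand-in for the black-box AI's decision (illustrative logic).
    score = 0.9 if transaction["amount"] > 10_000 else 0.1
    return "flag" if score > 0.5 else "allow"

def challenger_rules(transaction):
    # Transparent rule-based baseline run on the same inputs.
    if transaction["amount"] > 10_000 or transaction["new_account"]:
        return "flag"
    return "allow"

def audit(samples):
    """Run sample decisions through both models and collect
    disagreements for review, per the champion-challenger pattern."""
    disagreements = []
    for t in samples:
        ai, rule = champion_ai(t), challenger_rules(t)
        if ai != rule:
            disagreements.append((t, ai, rule))
    return disagreements

samples = [
    {"amount": 50, "new_account": False},
    {"amount": 20_000, "new_account": False},
    {"amount": 300, "new_account": True},  # challenger flags, AI allows
]
for t, ai, rule in audit(samples):
    print(f"Review needed: AI={ai}, challenger={rule}, input={t}")
```

A persistent pattern of disagreements on inputs the challenger handles confidently is exactly the kind of signal that something may have changed “inside the black box.”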


Watch Out for Hacks on Extended Reality Platforms

When it comes to extended reality, Payton said, she believes “it’s going to pick up in adoption in 2021, meaning it’s going to have its first public hack in 2022.”


Payton said she is bullish on extended reality (XR) technology, which is “going to give your organization the power to really revolutionize how your operations work in any pandemic, or a natural disaster or a man-made disaster.”

However, because XR platforms thrive on collecting users’ emotional reactions, they are potentially valuable data troves for malicious actors.

“When you interact with the technology, what makes you breathe in? What makes you hold your breath? What makes you happy? What makes you sad?” she said. “Because that’s all going to be correlated with a reaction to you to personalize experiences for you, it’s going to be a treasure trove to be hacked.”

Payton recommended agencies have a playbook to protect such data and prevent such attacks.

