Imagine a chess match in which the rules are constantly changing as new pieces with unexpected capabilities are developed. That is the world of cybersecurity experts, battling resourceful cybercriminals and foreign agents along the frontiers of technology.
Recently, Infosec Island took a look at three emerging technologies: quantum computing, artificial intelligence and autonomous vehicles. All have the potential to transform our lives. But in the wrong hands (and there are plenty of bad actors out there), each can also create major new cybersecurity threats and vulnerabilities.
Quantum Computing

After decades on the technology horizon, quantum computing is becoming a reality, harnessing some of the most bizarre properties of quantum physics to transform how computing is done. Computational tasks that were formerly impossible (or so time-consuming to solve, even on a supercomputer, as to be effectively impossible) will become easy.
That is the good news. The bad news, per Infosec Island, is that quantum computing can make cracking encryption systems radically easier. Vast troves of confidential data — currently protected by encryption — will be left exposed and vulnerable. This is one of the greatest looming security threats: Bad actors may already be stealing encrypted data, anticipating that they will be able to read it once they get their hands on quantum technology.
Fortunately, there is more good news. According to ZDNet, a mathematical tool called lattice cryptography can provide encryption so powerful and robust that even quantum computers will not be able to crack it. But security professionals will have a huge task in deploying lattice cryptography, along with other network security technologies, to protect existing data stores from evolving threats.
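The article does not explain how lattice cryptography works, but the core idea behind learning-with-errors (LWE) schemes, the family underlying most lattice-based encryption, can be sketched in a few lines. Everything below is our own illustration, not from any real library; the function names are hypothetical and the parameters are far too small to be secure.

```python
import random

# Toy learning-with-errors (LWE) encryption of a single bit.
# Illustrative only: real lattice schemes use much larger dimensions.
N, M, Q = 8, 16, 97          # secret dimension, public samples, modulus

def keygen():
    s = [random.randrange(Q) for _ in range(N)]              # secret key
    A = [[random.randrange(Q) for _ in range(N)] for _ in range(M)]
    e = [random.choice([-1, 0, 1]) for _ in range(M)]        # small noise
    # Public key: noisy linear equations b = A*s + e (mod Q)
    b = [(sum(a * si for a, si in zip(row, s)) + ei) % Q
         for row, ei in zip(A, e)]
    return s, (A, b)

def encrypt(pub, bit):
    A, b = pub
    rows = [i for i in range(M) if random.random() < 0.5]    # random subset
    u = [sum(A[i][j] for i in rows) % Q for j in range(N)]
    v = (sum(b[i] for i in rows) + bit * (Q // 2)) % Q       # bit hidden at Q/2
    return u, v

def decrypt(s, ct):
    u, v = ct
    # Removing u*s leaves only the small noise, plus Q//2 if bit was 1
    d = (v - sum(ui * si for ui, si in zip(u, s))) % Q
    return 1 if abs(d - Q // 2) < min(d, Q - d) else 0
```

Security rests on how hard it is to recover `s` from the noisy equations in the public key, a problem believed to resist quantum attack; standardized schemes built on this idea apply it at much larger scale.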
Artificial Intelligence

Artificial intelligence, like the digital computer itself, is a multi-use technology with a nearly limitless range of potential applications. The latest generation of AI technology exploits so-called neural nets, which are loosely modeled on some of the structures of the human brain. A neural net system is able to learn from experience in ways that its designers could not fully predict in advance.
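To make "learning from experience" concrete at the smallest possible scale, here is a single artificial neuron (a perceptron) adjusting its weights from labeled examples. This is a teaching sketch of our own, not anything from the article; full neural nets stack many such units.

```python
# A single artificial neuron (perceptron) that learns from labeled examples.
def train(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred                  # zero when already correct
            w[0] += lr * err * x1               # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Learn the logical AND function purely from its four examples
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(samples)
```

No rule for AND is ever written down; the behavior emerges from repeated correction, which is why the designers of much larger nets cannot fully predict what they will pick up from their training data.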
But this same capability, warns Infosec Island, can be applied to develop malware that is also capable of learning from experience: for example, learning how to detect network security technologies and then tricking them into allowing the malware through the protective barriers.
Neural nets and other AI technologies pose another challenge for the good guys, reported Technology Review. These tools can also be enlisted as protectors — but with some real risks. One is that we may simply put too much confidence in them. A more subtle risk is that we may not fully understand what our own cybersecurity AI is doing, or why. Things get complicated if your sentries can’t tell you exactly what they are watching out for.
Connected and Autonomous Vehicles
Also high on the list of emerging cybersecurity threats, notes Infosec Island, is hacking of connected and autonomous vehicles. Here the danger lies not so much in the specific malware technologies involved, which may be new or old, as in the inherent harm a hijacked vehicle or other device can do. A hacked database can expose confidential information, but a hacked car might hurt people.
And, as another article in Technology Review points out, the challenge of understanding neural nets applies not only to the threat of deliberate hacking, but also to uncertainties about what AIs are learning when we train them. For this reason, the article noted, the Pentagon is now putting special emphasis on understanding the actions of autonomous military devices.
On these three fronts, and others, the ever-changing chess game between cybersecurity defenders and bad actors is pushing deep into the frontiers of emerging technology. Even the (not always) “smart home” can be a target for sophisticated new forms of cyberattack.
Are you interested in joining a team that is fighting back against cyberthreats worldwide? Check out careers here.