This is an article from the fall 2017 issue of LSA Magazine.
Siri is always listening, at the ready in case we call her name. Wireless-connected medical devices allow doctors to reprogram equipment like pacemakers without surgery. Computer code controls electric wheelchairs, autonomous cars, door locks, garage door openers, refrigerators, light bulbs, thermostats, DVD players, and more unexpected products every minute.
This is the world of networked devices and appliances: the Internet of Things (IoT).
Hacking IoT devices is easy. Their internet connection allows infiltration by remote access. They’re also designed to link up with one another seamlessly, which allows a breach at one weak point to affect an entire network.
And those weak points aren’t hard to find. Companies that create smart devices often have little incentive to design strong security into their products. For example, many devices leave the factory with the exact same default password. Few people know or bother to change it after they leave the store and install the device in their home.
A clever hack can exploit the literally billions of IoT devices that connect urban communities across the United States, spreading an attack across devices in no time flat. It’s like a virus infecting a community of organisms. That’s why Stephanie Forrest (M.S. 1982, Ph.D. ’85) says, “I argue to my computer science friends that biology is the true science of security.”
Forrest, now a computer science professor at Arizona State University and an external professor at the Santa Fe Institute, earned her Ph.D. with John Holland (M.A. 1954, Ph.D. ’59). Holland himself received the first Ph.D. in computer science granted by U-M and became one of the first professors in LSA’s Department of Computer and Communication Science. He also taught in LSA’s Psychology Department and Center for the Study of Complex Systems before passing away in 2015.
Early on, Holland realized he could apply Darwin’s ideas about natural selection to computers by imagining the computing language of binary code — strings of ones and zeros — as genes, which could then evolve into better versions of themselves. He called these genetic algorithms.
Today, Forrest takes a similar approach. She applies genetic algorithms to software, so the software itself can evolve into more efficient or less hackable versions. She thinks we can design better computer security systems if our understanding of biological defense inspires our solutions.
Both cybersecurity and biology involve attackers and defenders, predators and prey. And similar observations apply in both cases. For example, diverse ecosystems tend to bounce back to health after a disturbance, while biologically homogeneous environments are not as resilient. By the same logic, if everyone uses a different computer or internet browser, the community isn’t as vulnerable to virus attacks.
But Forrest admits that staying secure is a challenge, because attackers continually innovate and devise new attacks. Defending against these attacks requires continual adaptation and repair of software bugs. The work involved can be intense and expensive, and sometimes vulnerabilities in the code remain undiscovered or unrepaired. Companies like Google have set up “bug bounty” programs, which offer rewards to conscientious coders who report major bugs before hackers can exploit them.
Forrest and her colleagues have pioneered another approach, based on Holland’s genetic algorithms, that automates the software repair process.
It works like this: Different versions of a computer program can mate. Mating merges the code of the two versions, producing new combinations and functions that sometimes work better than the originals. The quality of each version depends on how well it performs the function it was programmed to do; the process boosts promising variants and kills off the low-quality versions that don’t work as well. Mutations also come into play, an element of randomness that allows for unexpected innovation. The genetic algorithms that fuel this mating process — basically selective breeding and artificial adaptation, with computers — are superfast and require little human intervention.
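For readers who want to see the mechanics, here is a minimal sketch of that loop in Python. The bit-string “programs,” the target pattern, the population size, and the mutation rate are all illustrative assumptions made up for this example; Forrest’s actual repair tools operate on real source code, not toy bit strings.

```python
import random

# A minimal sketch of the genetic-algorithm loop described above, not Forrest's
# actual repair system. Each "program" is a string of ones and zeros (its genes),
# and fitness here is simply how many bits match an arbitrary target pattern.

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]  # stand-in for "correct behavior"
POP_SIZE = 50          # number of program versions alive at once
MUTATION_RATE = 0.02   # chance that any single bit flips
GENERATIONS = 200

def fitness(genome):
    """Quality of a version: how well it does the job it was 'programmed' to do."""
    return sum(1 for bit, want in zip(genome, TARGET) if bit == want)

def mate(parent_a, parent_b):
    """Mating: merge the code of two versions at a random crossover point."""
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(genome):
    """Mutation: a little randomness that allows for unexpected innovation."""
    return [1 - bit if random.random() < MUTATION_RATE else bit for bit in genome]

# Start with a population of random program versions.
population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

for generation in range(GENERATIONS):
    # Selection: promising variants survive; low-quality versions die.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]
    # Breeding: survivors mate to refill the population, with occasional mutations.
    children = [mutate(mate(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children
    if fitness(population[0]) == len(TARGET):
        print(f"Perfect version found after {generation} generations")
        break
```

In practice, a program-repair system would swap the toy fitness test here for something like the program’s test suite, so that a version’s quality reflects how many tests it passes.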
“It doesn’t always fix bugs the way you might,” says Forrest, “but it often fixes them.”
Because it’s automated, Forrest’s program, or something like it, would be cheaper and more efficient than waiting for humans to manually repair software bugs.
Keeping ourselves and our digital devices safe as the IoT grows and evolves requires vigilance, strong computer code, and an investment in safeguarding the devices that go to market. The vulnerability of our devices reflects our own vulnerability.
In this digital form of survival of the fittest, we are only as strong as the weakest programmer.