Experts in anthropology and cybersecurity at Kansas State University are examining the unspoken knowledge shared by cybersecurity analysts as a way to develop new automated tools that help analysts strengthen their cyberdefenses.
Xinming “Simon” Ou, associate professor of computing and information sciences, and Mike Wesch, associate professor of anthropology, recently received nearly $700,000 from the National Science Foundation to fund a three-year project that takes an anthropological approach to cybersecurity. The data gathered will be used to develop algorithms for improved cybersecurity.
Ou and Wesch, along with Sathya Chandran Sundaramurthy, of India, and Yuping Li, of China — both doctoral students in computing and information sciences — are working alongside analysts in the university’s office of information security and compliance. The researchers are using anthropological techniques to understand how analysts perform their job duties. These techniques help them uncover tacit knowledge, rather than the formal knowledge typically documented about job duties and staffing requirements for security operations centers.
“Tacit knowledge is the knowledge that we have about something that we can’t verbalize,” Wesch said. “You cannot walk into a New Guinea village and just ask people what their culture is. You have to live it and experience it to understand it.”
Researchers will translate this tacit knowledge into algorithms that will speed up various tasks and job duties performed by the analysts. For example, it takes a professional analyst between five and six minutes to find the Internet Protocol address and physical location of a computer that has been compromised by viruses and malware. An algorithm could complete the process in five to six seconds.
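The lookup described above can be sketched in code. The following is a minimal, hypothetical illustration of the manual workflow an analyst follows: resolving a flagged IP address through a DHCP lease to a MAC address, then to a physical location from a switch-port table. The data sources, field names, and formats here are illustrative assumptions, not details from the K-State project.

```python
from typing import Optional

# Toy stand-ins for the data sources an analyst would consult by hand.
# Real deployments would query live DHCP servers and switch inventories;
# these dictionaries and their fields are assumptions for illustration.
DHCP_LEASES = {
    "10.1.42.17": {"mac": "aa:bb:cc:dd:ee:01", "hostname": "lab-pc-07"},
}
SWITCH_PORT_MAP = {
    "aa:bb:cc:dd:ee:01": {"building": "Nichols Hall", "room": "126", "port": "gi1/0/12"},
}

def locate_compromised_host(ip: str) -> Optional[dict]:
    """Resolve a flagged IP address to a physical location.

    Mirrors the manual workflow: IP -> DHCP lease -> MAC -> switch-port
    location table. Returns None if any link in the chain is missing.
    """
    lease = DHCP_LEASES.get(ip)
    if lease is None:
        return None
    location = SWITCH_PORT_MAP.get(lease["mac"])
    if location is None:
        return None
    return {"ip": ip, "hostname": lease["hostname"], **location}

print(locate_compromised_host("10.1.42.17"))
```

Chaining these lookups in code is what turns a five-to-six-minute manual task into one that finishes in seconds.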
“We’d like to automate the boring, repetitive part of the tasks that aren’t heavily reliant on human intelligence but are more about humans doing them because they do not have better tool support,” Ou said. “That would free analysts to concentrate on the more complex tasks, such as investigating more large-scale, sophisticated attacks and plugging potential security holes in a network.”
The lack of understanding of the tacit knowledge in cybersecurity may be why so few commercial and open-source support tools are available to help cybersecurity analysts understand an attack in detail, Ou said. Often the tool developers do not understand the job and time requirements of security analysis, which limits their ability to design useful algorithms for these tools. As a result, finding information such as how the attacker got into the system and what data was compromised or damaged is a very labor-intensive process.
“A network is bombarded with attacks all of the time, and many of those attacks themselves are automated,” Wesch said. “We’re trying to automate parts of the defense.”
In addition to streamlining the repetitive tasks, the researchers said this unique collaboration’s findings about what comprehensive cybersecurity analysis requires will lead to better training and education for the field.
“We’re ultimately building something like a conceptual model of how cybersecurity actually works, not just how it should work from a researcher’s perspective,” Wesch said.