While being “classic” and “timeless” might work in other industries, information security (infosec) must constantly guard against resting on its laurels when it comes to strategies and solutions. So, while the People, Process, Technology (PPT) framework popularized by Bruce Schneier in the early 2000s has served us well, it has increasingly come under question. Some claim that putting people at the top no longer works, while others say the framework needs to be reworked entirely with new elements. The question is whether the concept is still relevant, or whether a new one is needed that better matches today’s technological capabilities.
The idea behind the PPT framework is that a proper balance of people, along with the processes and technology they use, will efficiently drive action and improvement. Although it has often been argued that a four-element diamond model would serve modern organizations better, the Golden Triangle continues to survive and thrive.
There is a good reason for this: the model is flexible and built around balance. The framework is most commonly visualized as a triangle, with people at the top. That made sense in the early days, but as technology changes, so should the way we think about the concept, especially in highly technical fields such as cyber protection.
While security pros in some industries might still be able to apply this framework in the traditional way, cyber protection relies more heavily on processes and technology. This means rebalancing the framework to fit the needs of the organization.
The human element is still as important as ever. Processes have been refined continually as the industry has matured, and the available technology has advanced dramatically in recent decades. A modern framework is better visualized as a Venn diagram, in which the triangle is still visible.
It is important to remember the areas where human intervention is needed. Machine learning (ML) and artificial intelligence (AI) may have taken over many tasks previously performed by people, but without human oversight, these technologies quickly become ineffective. Humans still need to review new threats, feed data and data sources into SIEMs and other systems, adjust AI/ML training models, and improve processes and technologies. Cybercrime also includes attacks for which automated remediation is unavailable, such as social engineering.
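To make those human-intervention points concrete, the sketch below shows one common pattern: automation acts on the verdicts an ML model is confident about, while anything it scores with low confidence is routed to an analyst queue. This is a minimal illustration only; the alert fields, confidence threshold, and queue names are assumptions for the example, not the API of any particular SIEM or security product.

```python
# Hypothetical sketch: route low-confidence ML verdicts to human analysts.
# Alert fields, threshold, and queues are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

CONFIDENCE_THRESHOLD = 0.85  # assumed cut-off; tune per organization


@dataclass
class Alert:
    alert_id: str
    verdict: str       # e.g., "malicious" or "benign" as scored by the model
    confidence: float  # model's confidence in that verdict, 0.0 - 1.0


@dataclass
class TriageQueues:
    auto_handled: List[Alert] = field(default_factory=list)
    human_review: List[Alert] = field(default_factory=list)


def triage(alerts: List[Alert]) -> TriageQueues:
    """Let automation act on high-confidence verdicts; escalate the rest."""
    queues = TriageQueues()
    for alert in alerts:
        if alert.confidence >= CONFIDENCE_THRESHOLD:
            queues.auto_handled.append(alert)   # automated playbook can respond
        else:
            queues.human_review.append(alert)   # an analyst reviews and labels it
    return queues


if __name__ == "__main__":
    sample = [
        Alert("A-1001", "malicious", 0.97),
        Alert("A-1002", "benign", 0.62),  # ambiguous: a person should look
    ]
    result = triage(sample)
    print(f"auto-handled: {len(result.auto_handled)}, "
          f"sent to analysts: {len(result.human_review)}")
```

The split is deliberately simple: the technology handles volume, while the judgment calls, and the labels that eventually retrain the model, stay with people.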
As cybercrime tactics improve, processes adjust to improve the response to attacks. Incident response plans must account for a growing number of possible scenarios, and processes are regularly reworked to improve efficiency and address changes in workloads. The reality is that the human touch is still required, but it shifts to different tasks, including the ongoing review of processes.
The most rapidly growing element is the technology being used. For many organizations, technology is improving and expanding at rates that are difficult to manage, yet relying on it is becoming ever more important in daily workflows. Today, a significant portion of the workload is never seen by human eyes; it is processed by the computers and applications we use to ensure efficiency. Despite this dependence on technology, we still need people to ensure the technology is working as intended and to make adjustments when it is not.
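One way people can keep that unattended technology honest is to watch the data feeding it rather than trusting its output. The hedged sketch below compares a feature’s recent values against the values the model was trained on and escalates to a person when the two diverge; the feature, data, and significance threshold are invented for illustration, and the two-sample Kolmogorov-Smirnov test is just one simple choice of drift check.

```python
# Hypothetical sketch: flag input drift that an ML model cannot notice itself.
# A two-sample Kolmogorov-Smirnov test compares a feature's recent values to
# the values the model was trained on; if the distributions diverge, a human
# reviews the model instead of trusting its output blindly.
# Feature names, data, and thresholds are illustrative assumptions.
import numpy as np
from scipy.stats import ks_2samp

P_VALUE_THRESHOLD = 0.01  # assumed significance level for declaring drift


def check_drift(training_values: np.ndarray, recent_values: np.ndarray) -> bool:
    """Return True if recent data looks statistically different from training data."""
    statistic, p_value = ks_2samp(training_values, recent_values)
    return p_value < P_VALUE_THRESHOLD


if __name__ == "__main__":
    rng = np.random.default_rng(seed=7)
    # Feature the model was trained on, e.g. bytes-per-session on a log scale.
    baseline = rng.normal(loc=5.0, scale=1.0, size=5_000)
    # Recent traffic whose behavior has shifted since training.
    live = rng.normal(loc=6.5, scale=1.3, size=1_000)

    if check_drift(baseline, live):
        print("Input drift detected: route model output to human review.")
    else:
        print("No significant drift: automated handling can continue.")
```

The check itself is trivial to automate; the point is that the decision about whether to retrain, retune, or retire the model remains a human one.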
A machine learning algorithm does not have the intuition needed to determine something is wrong with the data it puts out. If the data fits within the training model, the ML will continue outputting data that might not be relevant.
Topher Tebow, Acronis Cyber Security Analyst
Topher Tebow is a cybersecurity analyst at Acronis, with a focus on malware tracking and analysis.