Just as severe weather, such as typhoons, hurricanes, and snowstorms, is an inherent feature of the natural environment, defects and vulnerabilities are inherent characteristics of any cyber system environment.
Whether or not the comparison is entirely fair, weather is part of a natural environment over which we have little direct control, whereas the cyber environment is fundamentally a human creation. Despite this difference, the choices we make have direct consequences even when those consequences are not obvious. Take, for example, the use of leaded or diesel fuel in vehicles, or controlled burns in forests to clear land for agriculture: both degrade air quality. The same is true for software developers, who may unknowingly introduce bugs or security risks when the programs they design interact with untested, insecure network systems and cyber environments.
With resource limitations exacerbated by the pressure of accelerated timelines and business demands, not all software bugs are uncovered within the development timeline, and even when bugs are discovered, many cannot be addressed before release. Trying to achieve perfection by fixing every bug only delays production, so a risk-based approach is common in the industry: prioritize the defects that pose the greatest risk and consciously accept the rest. Technology systems will therefore inherently have weaknesses, and the potential for exposure always exists, although not every weakness is significant.
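To make the idea of risk-based triage concrete, here is a minimal sketch of how such prioritization might look in practice. Everything in it is a simplifying assumption: the defect names, the numeric scores, and the greedy likelihood-times-impact scoring are invented for illustration, and real programs use richer models (such as CVSS) and organizational context.

```python
# Hypothetical risk-based bug triage: score each known defect by
# likelihood and impact, then fix the highest-risk items that fit
# within the remaining time budget. All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Defect:
    name: str
    likelihood: float  # estimated probability of exploitation, 0..1
    impact: float      # estimated business impact if it occurs, 0..10
    fix_days: float    # estimated effort to fix

    @property
    def risk(self) -> float:
        return self.likelihood * self.impact

def triage(defects: list[Defect], days_available: float) -> list[Defect]:
    """Greedily select the highest-risk defects that fit the time budget."""
    selected = []
    for d in sorted(defects, key=lambda d: d.risk, reverse=True):
        if d.fix_days <= days_available:
            selected.append(d)
            days_available -= d.fix_days
    return selected

backlog = [
    Defect("SQL injection in login", likelihood=0.6, impact=9.0, fix_days=3),
    Defect("UI misalignment",        likelihood=0.9, impact=1.0, fix_days=1),
    Defect("Race condition in sync", likelihood=0.2, impact=7.0, fix_days=5),
]

for d in triage(backlog, days_available=6):
    print(f"Fix now: {d.name} (risk score {d.risk:.1f})")
```

The unselected defects are not forgotten; they are the accepted residual risk that the rest of this series is concerned with managing.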
Within business and IT operational environments, many conflicting factors add to the complexity of managing information security risks: people (individuals and groups), processes, and technology, all shaped by business economics, the political agendas of authorities, cultural constraints, and individuals' desires and motivations. Add to this mix cyber criminals and others seeking opportunities to exploit and compromise data, systems, and people for their own ends.
With only limited success observed in existing approaches, practitioners and researchers are constantly looking for new strategies, technologies, and solutions to understand these complexities. In recent years, my research has resulted in the conceptualization of a substantive theory, known as the piezoelectric theory, which states that if an organization's information security practices enable a prompt realignment of its systems that satisfies the systemic requirements of a changing risk condition, the potential negative effects of the new risk condition can be balanced or even counteracted.
The theory suggests a more responsive approach to security: to manage information security challenges effectively, practitioners need to be reflective in their practices, and organizations need to focus on gaining visibility of risk and remain constantly prepared for critical realignment whenever the security (or risk) posture of their cyber environment changes.
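The intuition can be illustrated with a toy numerical sketch. This is not the theory's formal statement, only an invented model: residual risk is taken as the gap between the environment's risk level and the organization's security posture, and a "responsive" organization closes most of that gap each step while a static one does not. All values are assumptions chosen for illustration.

```python
# Toy model of prompt realignment counteracting a changing risk
# condition. Residual risk = gap between environmental risk and the
# organization's posture; the responsive variant realigns each step.
def simulate(realign: bool, steps: int = 6) -> list[float]:
    environment_risk = 3.0   # current risk level of the cyber environment
    posture = 3.0            # organization's security posture
    residuals = []
    for step in range(steps):
        if step == 2:
            environment_risk = 7.0  # the risk condition changes abruptly
        if realign:
            # close most of the gap each step (prompt realignment)
            posture += 0.8 * (environment_risk - posture)
        residuals.append(max(0.0, environment_risk - posture))
    return residuals

print("responsive:", [round(r, 2) for r in simulate(realign=True)])
print("static:    ", [round(r, 2) for r in simulate(realign=False)])
```

In this sketch the responsive organization's residual risk spikes briefly and then decays toward zero after the change, while the static organization's residual risk stays elevated, which is the counteracting effect the theory describes.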
In the blog series that follows, I will discuss the issues and dilemmas of our current practices, what we mean by being responsive, and its implications for organizational and regulatory policies as well as individual practice, with reference to related case studies and examples.