GovCon Specialist Ron Lear Addresses Software Quality and Capability Issues in Cybersecurity

The Intersection of Software Quality and Cybersecurity: A Call for Action

By Ron Lear, Vice President of Models and Frameworks at ISACA

In the ever-evolving landscape of cybersecurity, the words of Jen Easterly, Director of the Cybersecurity and Infrastructure Security Agency (CISA), resonate deeply. Her assertion that “we don’t have a cybersecurity problem. We have a software quality problem” encapsulates a critical issue that has persisted for decades. While I wholeheartedly agree with her sentiment, I also recognize that the situation is more complex than it may initially appear.

The Dual Nature of the Problem

Easterly’s comments highlight a fundamental truth: the technology industry has been allowed to produce software that is often defective, insecure, and flawed. However, the responsibility does not rest solely on the shoulders of technology vendors. It is imperative to acknowledge that customers and stakeholders have also played a role in this ongoing dilemma by permitting vendors to operate without stringent requirements.

Quality, as defined by the ISO 9000:2015 vocabulary standard (on which ISO 9001:2015 relies for its terms), is the “degree to which a set of inherent characteristics of an object fulfills requirements.” This definition underscores the importance of well-defined requirements in achieving software quality. If vendors are creating subpar products, it is essential to examine the requirements provided by their customers. Are these requirements clear, complete, and consistent? Are they tied to business value?

The Importance of Clear Requirements

To address the software quality problem, technology vendors must establish a robust framework for requirements development and management. This framework should ensure that requirements are:

  • Unambiguous: Clearly articulated to avoid misinterpretation.
  • Complete: Covering all necessary aspects of the software.
  • Consistent: Aligning with one another to prevent conflicts.
  • Testable: Allowing for verification through testing.
  • Traceable: Easily linked back to their source.
  • Achievable: Realistic in terms of implementation.

These criteria are not merely checkboxes; they form the foundation of a quality-driven software development process. Without them, vendors are left to navigate a murky landscape, resulting in products that fail to meet the needs of their users.
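To make “testable” and “traceable” concrete, consider a minimal sketch in Python. The requirement ID REQ-AUTH-003, the account-lockout rule it describes, and the pytest marker linking the test back to the requirement are all illustrative assumptions, not drawn from any particular standard:

```python
# A minimal sketch of a testable, traceable requirement, using pytest.
# REQ-AUTH-003 is a hypothetical requirement: "The system shall lock an
# account after five consecutive failed login attempts."

import pytest

MAX_FAILED_ATTEMPTS = 5  # threshold taken directly from the requirement


class Account:
    def __init__(self):
        self.failed_attempts = 0
        self.locked = False

    def record_failed_login(self):
        """Count a failed login and lock the account at the threshold."""
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_FAILED_ATTEMPTS:
            self.locked = True


# A custom pytest marker (register "requirement" in pytest.ini to silence
# the unknown-marker warning) ties the test to its source requirement,
# so a trace report can show which requirements have verification coverage.
@pytest.mark.requirement("REQ-AUTH-003")
def test_account_locks_after_five_failed_attempts():
    account = Account()
    for _ in range(MAX_FAILED_ATTEMPTS):
        account.record_failed_login()
    assert account.locked
```

Because the threshold appears in both the requirement and the test, a reviewer can verify the two against each other, and a traceability report can flag any requirement that lacks such a test.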

The Need for Verification and Validation

Once requirements are established, the next step is verification and validation. Vendors must implement consistent and thorough processes for testing, peer reviews, and quality assurance. This is where governance comes into play. Organizations need a structured approach to ensure that these capabilities are not only in place but are also effectively utilized.

Moreover, measuring the effectiveness of these processes is crucial. Vendors should track how well they adhere to their requirements, budget, and schedule. This involves maintaining data on components, versions, and performance metrics, which are essential for continuous improvement.
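As a rough sketch of what such measurement data might look like, here is a simple Python record for one component release. The field names and example figures are hypothetical; a real organization would likely rely on an established project-tracking or ALM tool rather than rolling its own:

```python
# A minimal sketch of the kind of record a vendor might keep to track
# adherence to requirements, budget, and schedule per component release.
# All field names and figures are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class ReleaseMetrics:
    component: str              # e.g., "auth-service"
    version: str                # e.g., "2.4.1"
    requirements_total: int     # requirements allocated to this release
    requirements_verified: int  # requirements with passing verification
    planned_cost: float         # budgeted cost for the release
    actual_cost: float          # cost actually incurred
    planned_days: int           # scheduled duration
    actual_days: int            # actual duration

    @property
    def verification_coverage(self) -> float:
        """Fraction of requirements verified -- a basic quality signal."""
        return self.requirements_verified / self.requirements_total

    @property
    def cost_variance(self) -> float:
        """Positive values mean the release ran over budget."""
        return self.actual_cost - self.planned_cost


release = ReleaseMetrics("auth-service", "2.4.1", 40, 37,
                         planned_cost=120_000, actual_cost=134_500,
                         planned_days=60, actual_days=71)
print(f"coverage={release.verification_coverage:.0%}, "
      f"cost variance=${release.cost_variance:,.0f}")
```

Trends in coverage, cost variance, and schedule variance across releases are the raw material for the continuous improvement described above.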

A Comprehensive Capability Framework

To tackle the software quality issue, we must recognize that it spans a wide array of capabilities. At a minimum, organizations should focus on these 14 critical areas:

  1. Requirements development and management
  2. Verification and validation (testing)
  3. Design and technical solutions
  4. Product integration
  5. Peer review
  6. Process quality assurance
  7. Governance
  8. Infrastructure
  9. Performance measurement and analysis
  10. Estimating
  11. Planning
  12. Monitoring and control
  13. Configuration management
  14. Data management

Additionally, operational security requirements, such as incident resolution and supply chain management, must not be overlooked.

Leveraging Existing Best Practices

In light of these challenges, one might wonder if there are established best practices that organizations can adopt. Fortunately, there is a proven framework available: the Capability Maturity Model Integration (CMMI) V3.0. Originally developed through federally funded research at Carnegie Mellon University’s Software Engineering Institute, CMMI V3.0 offers a comprehensive model that integrates multiple core business capabilities across eight primary domains.

This model is not only adaptable for organizations of all sizes but also provides clear data demonstrating consistent outcomes and performance results. Given its proven efficacy, it raises the question: why wouldn’t the government mandate that vendors adhere to these best practices instead of relying on voluntary “secure by design” pledges?

The Limitations of Voluntary Pledges

While the idea of a voluntary pledge may seem appealing, it poses significant challenges in terms of enforcement and verification. Relying on businesses to take a pledge without any means of accountability is not a sustainable strategy. In the commercial software world, such an approach would be deemed inadequate.

Instead of reinventing the wheel, government agencies should leverage existing tools like CMMI V3.0 to ensure that organizations can implement the necessary capabilities and best practices. This would not only enhance software quality but also contribute to a more secure technological landscape.

Conclusion

The intersection of software quality and cybersecurity is a multifaceted issue that requires a collaborative effort from both technology vendors and their customers. By establishing clear requirements, implementing robust verification processes, and leveraging proven frameworks like CMMI V3.0, we can begin to address the software quality problem that underpins many of our cybersecurity challenges. It is time for all stakeholders to take responsibility and work together to create a more secure and resilient digital environment.
