The Components of Strong Cybersecurity Plans Part One: Maturity Assessment


By Mark Lanterman, Chief Technology Officer, Computer Forensic Services
Carolyn Engstrom, Director of Corporate Compliance

As organizations, companies, and individuals adapt to new technologies, awareness of the potential risks and dangers associated with these devices has grown. Headlines within the past few years have focused on a growing number of companies becoming victims of hacking, spear phishing, and other acts of cybercrime. Unfortunately, it would seem that within this digital landscape, it’s not a matter of if, but when. With these odds, many people want greater assurance that their assets are being kept safe.

In response to a growing awareness of cybersecurity trends and organizational responsibility that extends far beyond the IT department, many people request penetration testing of their organization’s security infrastructure. Penetration testing and security assessment have become nearly synonymous terms, and it is common to conflate penetration testing, vulnerability scanning, social engineering, and other security assessment components. In reality, each is a separate component of a complete security assessment.

In a series of five articles, I will condense the components of digital security programs into maturity assessment, security assessment, security auditing, technical vulnerability scanning, and penetration testing. Though penetration testing comes last in this series, it is often given the most weight in organizations’ attempts to establish strong policies and procedures, in spite of recent attention being paid to strong security postures as a whole. A comprehensive security plan requires attention to all five of the aforementioned components, conducted on a regular basis. The maturity assessment is the first step in this ongoing process.

A maturity assessment defines management desires and expectations regarding the operation of its security program. During this critical phase, the personnel, processes, and technology capabilities in several key security areas are assessed. In this manner, a contextual understanding of the organization’s security culture is developed.

Methodologies involved in this stage include a review of critical security controls in relation to the NIST Cybersecurity Framework. Comparison between established baselines and current regulatory requirements will provide information for subsequent gap assessments.

Security professionals will present management with an established risk context, measurement of capability in adopting new procedures, and will propose appropriate strategies depending on the outcomes of the initial review and baseline assessment. The purpose of this is to obtain management support in improving existing policies and developing new policies.

In terms of deliverables, the maturity assessment is indispensable in that it provides management and relevant stakeholders with a quantitative statement of an organization’s current security status: a score between 0 and 5. In its simplest form, a maturity assessment is conducted by reviewing the practices in each key security area and then assigning a maturity level to each area. Each level is associated with a numeric score. An overall score is calculated by averaging the scores from all the key areas.
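The averaging step described above can be sketched in a few lines. The key-area names and the levels assigned to them here are purely illustrative, not taken from any real assessment:

```python
# Minimal sketch of the simplest scoring approach: each key security
# area receives a maturity level (0-5), and the overall score is the
# plain average. Area names and scores are hypothetical examples.
key_area_scores = {
    "access control": 3,
    "incident response": 2,
    "asset management": 4,
    "security awareness training": 1,
}

# Overall maturity is the mean of the per-area scores.
overall = sum(key_area_scores.values()) / len(key_area_scores)
print(f"Overall maturity: {overall:.1f}")
```

For these example scores the overall maturity works out to 2.5, which would place the organization between the Ad Hoc and Defined levels.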

The levels can be summarized as follows:

  0. None: the key area has no maturity.
  1. Initial: certain practices within the key area are just beginning to be realized.
  2. Ad Hoc: some practices exist within the key area, but they are applied inconsistently.
  3. Defined: the organization is aware of the key security area and has a plan for instituting policies.
  4. Managed: the organization is equipped with metrics on what has been defined for desired policies within the key area.
  5. Optimized: in addition to knowledge and implementation of existing policies, the organization accounts for continuous improvement within the key area.

More sophisticated techniques, such as assigning various weights to key areas or practices, may be applied to achieve more variation in scoring. The key areas of a security model are measured against maturity levels that define a continuum, from least capable of consistent outcomes to an optimized, self-sufficient process of continuous improvement. Models differ in their number of levels, but three to five is typical.
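The weighted variant mentioned above can be sketched as a weighted average. The level names follow the scale described in this article; the area weights are hypothetical and would in practice reflect management's priorities:

```python
# Hedged sketch of weighted scoring: each key area carries a weight
# reflecting its importance, and the overall score is the weighted
# average of the per-area levels. Areas and weights are illustrative.
LEVELS = {0: "None", 1: "Initial", 2: "Ad Hoc",
          3: "Defined", 4: "Managed", 5: "Optimized"}

scores = {"access control": 3, "incident response": 2, "asset management": 4}
weights = {"access control": 0.5, "incident response": 0.3, "asset management": 0.2}

weighted = sum(scores[a] * weights[a] for a in scores) / sum(weights.values())
```

Weighting lets an organization make a critical area such as access control dominate the overall score, at the cost of making the result harder to compare across assessments that use different weights.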

Maturity levels are simplified stages representing a composite continuum of many factors. The following examples illustrate possible levels:

  1. Processes: undocumented to documented, manual to automated, siloed to integrated, external requirements to optimized internal requirements
  2. Technology: single purpose to orchestrated, simple to complex, open source to commercial
  3. People: generalized to specialist workforce, security awareness ranked from low to high

Due to the variety of factors to be considered, maturity assessments are subjective. Other security program components to be discussed in future articles are progressively more objective.

The Department of Energy’s Cybersecurity Capability Maturity Model (C2M2), the National Institute of Standards and Technology’s Cybersecurity Framework (CSF), the International Organization for Standardization’s ISO/IEC 27001, and the Center for Internet Security’s Critical Security Controls are among the most popular frameworks leveraged to identify the key areas and practices to be evaluated in a maturity assessment.

The primary purpose of the maturity assessment is to engage an organization’s management in developing a cybersecurity strategy. Management’s awareness of good cybersecurity practices rises when key areas are assessed in relation to creating a durable security program. Reviewing baseline results may be eye-opening, especially when considered in relation to an organization’s score on a continuum. Once the baseline maturity assessment is complete, management should identify areas needing improvement and establish a potential revision timeline. Usually, priorities are identified by observing the largest gaps between current capability and management’s desired capability.

Before other elements of the comprehensive security plan can be devised, the baseline maturity assessment serves as the initial step, followed immediately by a gap assessment. Simply put, a gap assessment is a technique used to communicate the differences between the desired and current states of an organization’s security structure. It is frequently applied when new regulatory requirements are anticipated: regulatory compliance can be understood as the desired state, the current state is revealed through the maturity assessment, and the gaps between the two are uncovered through the gap assessment. Applied more broadly, a gap assessment can compare management’s ideal security capability against the results of the maturity assessment. These inputs are elemental in forming a realistic information security strategy that aligns with management’s desired goals. When the assessment is conducted annually, improvements and trends in maturity scores can be tracked.
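The gap-assessment logic, including the rule of thumb that the largest gaps become the top priorities, can be sketched as follows. All of the area names and levels are illustrative:

```python
# Minimal sketch of a gap assessment: compare management's desired
# maturity level per key area against the current assessed level,
# then rank areas by the size of the gap. Values are hypothetical.
current = {"access control": 3, "incident response": 1, "asset management": 4}
desired = {"access control": 4, "incident response": 4, "asset management": 4}

gaps = {area: desired[area] - current[area] for area in current}

# Largest gaps first: these are the usual remediation priorities.
priorities = sorted(gaps, key=gaps.get, reverse=True)
```

In this example, incident response has a gap of three levels and would be addressed first, while asset management already meets the desired state.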

In the second article of this series, I will delve into the components of conducting a thorough security assessment based on the initial findings unearthed during the maturity and gap assessments. A security assessment identifies the risks to organizational assets based on recognized threats and vulnerabilities. These assessments encompass technical, administrative, and physical considerations. It should be noted that this part of the testing also examines human vulnerabilities in addition to an organization’s technical vulnerabilities. Security assessments are then followed by security auditing, technical vulnerability scanning, and finally penetration testing, the last and, despite its prominence, least critical aspect of establishing a strong security posture in your organization.

Mark Lanterman is certified by the United States Department of Homeland Security as a “Seized Computer Evidence Recovery Specialist,” and is certified in computer forensics by the National White-Collar Crime Center. Lanterman presents over forty CLE classes annually, and has conducted training for the United States Supreme Court, the Kansas Supreme Court, the Nebraska Supreme Court, the Louisiana Supreme Court and the Eleventh Circuit Federal Judiciary. Additionally, he is an adjunct faculty member of computer science at the University of Minnesota’s Technological Leadership Institute and is currently teaching in the Master of Science Security Technologies (MSST) program. Lanterman is also an adjunct instructor at the Mitchell Hamline Law School and the National Judicial College in Reno, NV.
