Revolutive IT™
 
Software Complexity

This article surveys the many ways of estimating software complexity and the metrics that have been created for this purpose.

Software complexity is one branch of software metrics that is focused on direct measurement of software attributes, as opposed to indirect software measures such as project milestone status and reported system failures. There are hundreds of software complexity measures, ranging from the simple, such as source lines of code, to the esoteric, such as the number of variable definition/usage associations.

An important criterion for metrics selection is uniformity of application, also known as "open reengineering". The reason "open systems" are so popular for commercial software applications is that the user is guaranteed a certain level of interoperability - the applications work together in a common framework, and applications can be ported across hardware platforms with minimal impact. The open reengineering concept is similar in that the abstract models used to represent software systems should be as independent as possible of implementation characteristics such as source code formatting and programming language. The objective is to be able to set complexity standards and interpret the resultant numbers uniformly across projects and languages. A particular complexity value should mean the same thing whether it was calculated from source code written in Ada, FORTRAN, or some other language. The most basic complexity measure, the number of lines of code, does not meet the open reengineering criterion, since it is extremely sensitive to programming language, coding style, and textual formatting of the source code. The cyclomatic complexity measure, which measures the amount of decision logic in a source code function, does meet the open reengineering criterion. It is completely independent of text formatting and is nearly independent of programming language, since the same fundamental decision structures are available and uniformly used in all procedural programming languages.

McCabe Metrics

Cyclomatic Complexity Metric (v(G))
Cyclomatic Complexity (v(G)) is a measure of the complexity of a module's decision structure. It is the number of linearly independent paths through the module and, therefore, the minimum number of paths that should be tested.

It is also the most widely used member of a class of static software metrics. Cyclomatic complexity may be considered a broad measure of soundness and confidence for a program. Introduced in 1976 by Thomas McCabe, it measures the number of linearly independent paths through a program module. This measure provides a single ordinal number that can be compared to the complexity of other programs. Cyclomatic complexity is often referred to simply as program complexity, or as McCabe's complexity. It is often used in concert with other software metrics. As one of the more widely accepted software metrics, it is intended to be independent of language and language format.

Cyclomatic complexity can be applied in several areas, including:
   - Code development risk analysis;
   - Change risk analysis in maintenance;
   - Test planning;
   - Reengineering.
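
For a structured program, v(G) can be approximated by counting decision points and adding one. The following Python sketch is an illustrative simplification, not McCabe's own tool: the node list is an assumption, and each `and`/`or` chain is counted once, which slightly undercounts compound conditions.

```python
import ast

# Simplified set of decision-point node types (an assumption for this
# sketch; a full tool would weight boolean chains and other constructs).
DECISION_NODES = (ast.If, ast.While, ast.For, ast.BoolOp, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    """Count decision points in Python source and add one for the entry path."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return decisions + 1

example = """
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    for _ in range(n):
        pass
    return "positive"
"""
# Two if-branches plus one loop give three decisions, so v(G) = 4.
print(cyclomatic_complexity(example))
```

A function with no branches at all has v(G) = 1: a single path, requiring a single test.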

Actual Complexity Metric (ac)
Actual Complexity (ac) is the number of independent paths traversed during testing.

Module Design Complexity Metric (iv(G))
Module Design Complexity (iv(G)) is the complexity of the design-reduced module and reflects the complexity of the module's calling patterns to its immediate subordinate modules. This metric differentiates between modules which will seriously complicate the design of any program they are part of and modules which simply contain complex computational logic. It is the basis upon which program design and integration complexities (S0 and S1) are calculated.

Essential Complexity Metric (ev(G))
Essential Complexity (ev(G)) is a measure of the degree to which a module contains unstructured constructs. This metric measures the degree of structuredness and the quality of the code. It is used to predict the maintenance effort and to help in the modularization process.

Pathological Complexity Metric (pv(G))
pv(G) is a measure of the degree to which a module contains extremely unstructured constructs.

Design Complexity Metric (S0)
S0 measures the amount of interaction between modules in a system.

Integration Complexity Metric (S1)
S1 measures the amount of integration testing necessary to guard against errors.

Object Integration Complexity Metric (OS1)
OS1 quantifies the number of tests necessary to fully integrate an object or class into an OO system.

Global Data Complexity Metric (gdv(G))
gdv(G) quantifies the cyclomatic complexity of a module's structure as it relates to global/parameter data. It can be no less than one and no more than the cyclomatic complexity of the original flowgraph.

McCabe Date-Related Software Metrics

Date Complexity Metric (DV)
Date Complexity Metric (DV) quantifies the complexity of a module's structure as it relates to date-related variables. It is the number of independent paths through date logic, and therefore, a measure of the testing effort with respect to date-related variables.

Tested Date Complexity Metric (TDV)
Tested Date Complexity Metric (TDV) quantifies the complexity of a module's structure as it relates to date-related variables. It is the number of independent paths through date logic that have been tested.

Date Reference Metric (DR)
Date Reference Metric (DR) measures references to date-related variables independently of control flow. It is the total number of times that date-related variables are used in a module.

Tested Date Reference Metric (TDR)
Tested Date Reference Metric (TDR) is the total number of tested references to date-related variables.

Maintenance Severity Metric (maint_severity)
Maintenance Severity Metric (maint_severity) measures how difficult it is to maintain a module.

Date Reference Severity Metric (DR_severity)
Date Reference Severity Metric (DR_severity) measures the level of date intensity within a module. It is an indicator of high levels of date related code; therefore, a module is date intense if it contains a large number of date-related variables.

Date Complexity Severity Metric (DV_severity)
Date Complexity Severity Metric (DV_severity) measures the level of date density within a module. It is an indicator of high levels of date logic in test paths; therefore, a module is date dense if it contains date-related variables in a large proportion of its structures.

Global Date Severity Metric (gdv_severity)
Global Date Severity Metric (gdv_severity) measures the potential impact of testing date-related basis paths across modules. It is based on global data test paths.

McCabe Object-Oriented Software Metrics

ENCAPSULATION

Percent Public Data (PCTPUB)
PCTPUB is the percentage of PUBLIC and PROTECTED data within a class.

Access to Public Data (PUBDATA)
PUBDATA indicates the number of accesses to PUBLIC and PROTECTED data.

POLYMORPHISM

Percent of Unoverloaded Calls (PCTCALL)
PCTCALL is the percentage of non-overloaded calls in a system.

Number of Roots (ROOTCNT)
ROOTCNT is the total number of class hierarchy roots within a program.

Fan-in (FANIN)
FANIN is the number of classes from which a class is derived.

QUALITY

Maximum v(G) (MAXV)
MAXV is the maximum cyclomatic complexity value for any single method within a class.

Maximum ev(G) (MAXEV)
MAXEV is the maximum essential complexity value for any single method within a class.

Hierarchy Quality (QUAL)
QUAL counts the number of classes within a system that are dependent upon their descendants.

Other Object-Oriented Software Metrics

Depth (DEPTH)
DEPTH indicates at what level a class is located within its class hierarchy.

Lack of Cohesion of Methods (LOCM)
LOCM is a measure of how the methods of a class interact with the data in a class.

Number of Children (NOC)
NOC is the number of classes that are derived directly from a specified class.

Response For a Class (RFC)
RFC is a count of methods implemented within a class plus the number of methods accessible to an object of this class type due to inheritance.

Weighted Methods Per Class (WMC)
WMC is a count of methods implemented within a class.
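
Several of the hierarchy-shape counts above (DEPTH, NOC, WMC) can be illustrated with Python's built-in introspection. The hierarchy and helper names below are made up for illustration; WMC is shown with unit weights, its simplest common form.

```python
# A toy class hierarchy (illustrative, not from any real system).
class Shape:
    def area(self): ...
    def perimeter(self): ...

class Polygon(Shape):
    def sides(self): ...

class Triangle(Polygon): ...
class Square(Polygon): ...

def depth(cls) -> int:
    """DEPTH: level of a class in its hierarchy (root class = 0)."""
    # The MRO lists the class itself, its ancestors, and `object`;
    # dropping the first and last leaves the inheritance depth.
    return len(cls.__mro__) - 2

def noc(cls) -> int:
    """NOC: number of classes derived directly from cls."""
    return len(cls.__subclasses__())

def wmc(cls) -> int:
    """WMC (unit-weighted): methods implemented within the class itself."""
    return sum(1 for name, attr in vars(cls).items()
               if callable(attr) and not name.startswith("_"))

print(depth(Triangle), noc(Polygon), wmc(Shape))  # 2 2 2
```

Triangle sits two levels below the root, Polygon has two direct children, and Shape implements two methods of its own.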

Halstead Software Metrics

The Halstead metrics give a primary measure of algorithm complexity, obtained by counting operators and operands.

Program Length
The total number of operator occurrences and the total number of operand occurrences.

Program Volume
The minimum number of bits required for coding the program.

Program Level and Program Difficulty
Measure the program's ability to be comprehended.

Intelligent Content
Shows the complexity of a given algorithm independent of the language used to express the algorithm.

Programming Effort
The estimated mental effort required to develop the program.

Error Estimate
Estimates the number of errors in a program.

Programming Time
The estimated amount of time to implement an algorithm.
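
All of the measures above derive from four basic counts: distinct operators (n1), distinct operands (n2), and their total occurrences (N1, N2). A minimal sketch of the standard Halstead formulas, assuming those counts have already been extracted from the source, using the conventional Stroud number of 18 for the time estimate and 3000 volume-per-bug for the error estimate:

```python
import math

def halstead(n1: int, n2: int, N1: int, N2: int) -> dict:
    """Compute the Halstead measures from the four basic counts."""
    vocabulary = n1 + n2
    length = N1 + N2                          # program length
    volume = length * math.log2(vocabulary)   # program volume, in bits
    difficulty = (n1 / 2) * (N2 / n2)         # program difficulty
    level = 1 / difficulty                    # program level
    effort = difficulty * volume              # programming effort
    time = effort / 18                        # programming time, seconds
    bugs = volume / 3000                      # delivered-error estimate
    return {"vocabulary": vocabulary, "length": length, "volume": volume,
            "difficulty": difficulty, "level": level, "effort": effort,
            "time": time, "bugs": bugs}

# e.g. a fragment with 4 distinct operators, 5 distinct operands,
# 10 operator occurrences, and 12 operand occurrences:
metrics = halstead(4, 5, 10, 12)
print(round(metrics["volume"], 1), round(metrics["difficulty"], 1))
```

Note how every measure is language-independent in the open-reengineering sense: only the counts, not the syntax, enter the formulas.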

Line Count Software Metrics

Lines of Code

Lines of Comment

Lines of Mixed Code and Comments

Lines Left Blank
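
A minimal classifier for the four line counts above, assuming Python-style `#` comments; a real tool would also handle comment markers inside string literals and multi-line constructs.

```python
def line_counts(source: str) -> dict:
    """Classify each source line as code, comment, mixed, or blank."""
    counts = {"code": 0, "comment": 0, "mixed": 0, "blank": 0}
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped:
            counts["blank"] += 1
        elif stripped.startswith("#"):
            counts["comment"] += 1
        elif "#" in stripped:          # code with a trailing comment
            counts["mixed"] += 1
        else:
            counts["code"] += 1
    return counts

sample = "x = 1\n\n# setup\ny = x + 1  # increment\n"
print(line_counts(sample))  # {'code': 1, 'comment': 1, 'mixed': 1, 'blank': 1}
```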

Henry and Kafura metrics

Coupling between modules (parameters, global variables, calls).

Bowles metrics

Module and system complexity; coupling via parameters and global variables.

Troy and Zweben metrics

Modularity or coupling; complexity of structure (maximum depth of structure chart); calls-to and called-by.

Ligier metrics

Modularity of the structure chart.


The top 10 mistakes in IT Measurements

Betting the measurement program on a single metric

Trying to find a single metric that solves all the problems and has no drawbacks

The quest for an industry-standard set of measures

Not linking measures to behaviors

Assuming that one set of measures will be good for "all time"

Measuring the wrong IT output

Measuring in business terms, but the wrong business terms

Failure to quantify in business terms; failure to plan for benefits

Neglecting the full range of IT-related outcomes

Lack of commitment; treating measurement as a non-value-added add-on



Our solutions are the most efficient, flexible, quality-driven, and cost-effective on the market, thanks to our innovative, customer-driven approach. Discover them and compare them with the solutions offered by our competitors.


 
Copyright © 2002 Atlantis Technologies. All rights reserved.