Revolutive IT™
 
Software Complexity

Many customers tell us that their applications are hard to maintain, difficult to understand, or simply too complex; in one way or another, they would like to develop or maintain an application simpler than the one they have. This is not a trivial request, because it asks for a clear definition of what makes an application good or bad, more or less maintainable, and so on. Hence the idea of introducing the concept of quality into the application itself. We are not speaking here about quality in the development process, which is the field of the SEI CMM, but about the quality of the content, i.e. the program itself.

Measurement is important because it provides a way of continuously evaluating the state of a program by assessing its modules on a fixed schedule. This makes it possible to estimate the effort needed for maintenance and enhancement, and to support future decisions about the application portfolio. Another important reason for such evaluation is a business one: IT measurement is not a purely technical affair. It is critical for organizations to treat it as an integral part of their business existence and strategy. As the saying goes, we can only manage what we can measure. That sentence alone should be enough to start an IT assessment based on an objective set of measures.

Of course, measurement for measurement's sake is not enough. Measurement must be part of the overall business strategy, which means that each organization should start a measurement program and develop a model to drive it.
The first aim of measurement is to provide a clear and objective basis for decisions. Measurement must be seen in the context of providing value to the enterprise by delivering information (not data!) along the critical management dimensions of the organization; the measures must tie into, and be used for, process improvement. In short, measurement must be part of a process feedback loop.
According to H. Rubin, many mistakes are made when talking about IT measurement; we list the top ten at the end of this article.

Many efforts have been made to define a given program's complexity, and several authors have proposed metrics to measure it. Many methods exist for evaluating the complexity and health of programs. Below we list a number of these metrics together with the names of their respective authors. We thank them for continuing research on this subject, as we know it is crucial for any business to know how good its applications are.

Atlantis Technologies can help you with IT measurement, as we specialize in this area. We provide our customers with tailorable solutions and services for complexity measurement and interpretation, software quality assessment, and other related tasks. Measuring must be done by experts with a clear objective in mind.


McCabe Metrics

Cyclomatic Complexity Metric (v(G))
Cyclomatic Complexity (v(G)) is a measure of the complexity of a module's decision structure. It is the number of linearly independent paths through the module and, therefore, the minimum number of paths that should be tested.

It is also the most widely used member of a class of static software metrics. Cyclomatic complexity may be considered a broad measure of soundness and confidence for a program. Introduced by Thomas McCabe in 1976, it measures the number of linearly independent paths through a program module. This measure provides a single ordinal number that can be compared to the complexity of other programs. Cyclomatic complexity is often referred to simply as program complexity, or as McCabe's complexity, and it is often used in concert with other software metrics. As one of the more widely accepted software metrics, it is intended to be independent of language and language format.

Cyclomatic complexity can be applied in several areas, including:
   - Code development risk analysis;
   - Change risk analysis in maintenance;
   - Test planning;
   - Reengineering.
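
Formally, v(G) = E - N + 2P, where E is the number of edges and N the number of nodes in the module's control-flow graph, and P is the number of connected components; for a single structured module this reduces to the number of decision points plus one. As a minimal illustration of that counting idea (a sketch, not McCabe's own tool; the choice of which Python constructs count as decisions is our assumption), consider:

    import ast

    # Constructs treated as decision points; this selection is an assumption.
    DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                      ast.BoolOp, ast.IfExp)

    def cyclomatic_complexity(source: str) -> int:
        """Approximate v(G): number of decision points plus one."""
        tree = ast.parse(source)
        decisions = sum(isinstance(node, DECISION_NODES)
                        for node in ast.walk(tree))
        return decisions + 1

    example = '''
    def classify(x):
        if x < 0:
            return "negative"
        for i in range(x):
            if i % 2 == 0 and i > 2:
                print(i)
        return "done"
    '''
    print(cyclomatic_complexity(example))  # 5: two ifs, one for, one 'and', +1

A commercial analyzer builds the actual flowgraph rather than counting syntax nodes, but the ordering of modules by complexity comes out much the same.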

Actual Complexity Metric (ac)
Actual Complexity (ac) is the number of independent paths traversed during testing.

Module Design Complexity Metric (iv(G))
Module Design Complexity (iv(G)) is the complexity of the design-reduced module and reflects the complexity of the module's calling patterns to its immediate subordinate modules. This metric differentiates between modules which will seriously complicate the design of any program they are part of and modules which simply contain complex computational logic. It is the basis upon which program design and integration complexities (S0 and S1) are calculated.

Essential Complexity Metric (ev(G))
Essential Complexity (ev(G)) is a measure of the degree to which a module contains unstructured constructs. This metric measures the degree of structuredness and the quality of the code. It is used to predict the maintenance effort and to help in the modularization process.

Pathological Complexity Metric (pv(G))
pv(G) is a measure of the degree to which a module contains extremely unstructured constructs.

Design Complexity Metric (S0)
S0 measures the amount of interaction between modules in a system.

Integration Complexity Metric (S1)
S1 measures the amount of integration testing necessary to guard against errors.

Object Integration Complexity Metric (OS1)
OS1 quantifies the number of tests necessary to fully integrate an object or class into an OO system.

Global Data Complexity Metric (gdv(G))
gdv(G) quantifies the cyclomatic complexity of a module's structure as it relates to global/parameter data. It can be no less than one and no more than the cyclomatic complexity of the original flowgraph.

McCabe Date-Related Software Metrics

Date Complexity Metric (DV)
Date Complexity Metric (DV) quantifies the complexity of a module's structure as it relates to date-related variables. It is the number of independent paths through date logic, and therefore, a measure of the testing effort with respect to date-related variables.

Tested Date Complexity Metric (TDV)
Tested Date Complexity Metric (TDV) quantifies the complexity of a module's structure as it relates to date-related variables. It is the number of independent paths through date logic that have been tested.

Date Reference Metric (DR)
Date Reference Metric (DR) measures references to date-related variables independently of control flow. It is the total number of times that date-related variables are used in a module.

Tested Date Reference Metric (TDR)
Tested Date Reference Metric (TDR) is the total number of tested references to date-related variables.

Maintenance Severity Metric (maint_severity)
Maintenance Severity Metric (maint_severity) measures how difficult it is to maintain a module.

Date Reference Severity Metric (DR_severity)
Date Reference Severity Metric (DR_severity) measures the level of date intensity within a module. It is an indicator of high levels of date-related code; a module is date intense if it contains a large number of date-related variables.

Date Complexity Severity Metric (DV_severity)
Date Complexity Severity Metric (DV_severity) measures the level of date density within a module. It is an indicator of high levels of date logic in test paths; a module is date dense if it contains date-related variables in a large proportion of its structures.

Global Date Severity Metric (gdv_severity)
Global Date Severity Metric (gdv_severity) measures the potential impact of testing date-related basis paths across modules. It is based on global data test paths.

McCabe Object-Oriented Software Metrics

ENCAPSULATION

Percent Public Data (PCTPUB)
PCTPUB is the percentage of PUBLIC and PROTECTED data within a class.

Access to Public Data (PUBDATA)
PUBDATA indicates the number of accesses to PUBLIC and PROTECTED data.

POLYMORPHISM

Percent of Unoverloaded Calls (PCTCALL)
PCTCALL is the percentage of non-overloaded calls in a system.

Number of Roots (ROOTCNT)
ROOTCNT is the total number of class hierarchy roots within a program.

Fan-in (FANIN)
FANIN is the number of classes from which a class is derived.

QUALITY

Maximum v(G) (MAXV)
MAXV is the maximum cyclomatic complexity value for any single method within a class.

Maximum ev(G) (MAXEV)
MAXEV is the maximum essential complexity value for any single method within a class.

Hierarchy Quality (QUAL)
QUAL counts the number of classes within a system that are dependent upon their descendants.

Other Object-Oriented Software Metrics

Depth (DEPTH)
DEPTH indicates at what level a class is located within its class hierarchy.

Lack of Cohesion of Methods (LOCM)
LOCM is a measure of how the methods of a class interact with the data in a class.

Number of Children (NOC)
NOC is the number of classes that are derived directly from a specified class.

Response For a Class (RFC)
RFC is a count of methods implemented within a class plus the number of methods accessible to an object of this class type due to inheritance.

Weighted Methods Per Class (WMC)
WMC is a count of methods implemented within a class.
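
To make two of these definitions concrete, here is a minimal sketch that computes DEPTH and NOC for live Python classes. The helper names are ours, and the depth calculation assumes single inheritance; real tools derive these metrics from source code rather than from a running interpreter.

    def depth(cls: type) -> int:
        """DEPTH: inheritance levels between cls and its root (object excluded).
        Assumes single inheritance; with multiple bases the MRO length
        is only an upper bound on the true depth."""
        return len(cls.__mro__) - 2

    def noc(cls: type) -> int:
        """NOC: number of classes derived directly from cls."""
        return len(cls.__subclasses__())

    class Shape: pass
    class Polygon(Shape): pass
    class Circle(Shape): pass
    class Triangle(Polygon): pass

    print(depth(Shape))     # 0 -- Shape is a hierarchy root
    print(depth(Triangle))  # 2 -- Triangle -> Polygon -> Shape
    print(noc(Shape))       # 2 -- Polygon and Circle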

Halstead Software Metrics

Halstead metrics give a primary measure of algorithm complexity, based on counting the operators and operands in a program.

Program Length
The total number of operator occurrences and the total number of operand occurrences.

Program Volume
The minimum number of bits required for coding the program.

Program Level and Program Difficulty
Measure how easily the program can be comprehended.

Intelligent Content
Shows the complexity of a given algorithm independent of the language used to express the algorithm.

Programming Effort
The estimated mental effort required to develop the program.

Error Estimate
Estimates the number of errors in a program.

Programming Time
The estimated amount of time to implement an algorithm.
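
All of these derive from four counts: n1 distinct operators, n2 distinct operands, and their total occurrences N1 and N2. Length is N = N1 + N2, volume is V = N * log2(n1 + n2), difficulty is D = (n1/2) * (N2/n2), level is L = 1/D, and effort is E = D * V; time and error estimates conventionally use T = E/18 seconds and B = V/3000. A minimal sketch applying these formulas, assuming the counts have already been extracted (the extraction itself is the language-specific part a real tool does with a parser):

    import math

    def halstead(n1: int, n2: int, N1: int, N2: int) -> dict:
        """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
        length = N1 + N2                         # program length
        vocabulary = n1 + n2
        volume = length * math.log2(vocabulary)  # program volume
        difficulty = (n1 / 2) * (N2 / n2)        # program difficulty
        effort = difficulty * volume             # programming effort
        return {
            "length": length,
            "volume": volume,
            "level": 1 / difficulty,             # program level
            "difficulty": difficulty,
            "effort": effort,
            "time_seconds": effort / 18,         # conventional Stroud divisor
            "estimated_errors": volume / 3000,   # conventional error estimate
        }

    # E.g. a module with 10 distinct operators occurring 25 times
    # and 8 distinct operands occurring 20 times:
    print(halstead(n1=10, n2=8, N1=25, N2=20))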

Line Count Software Metrics

Lines of Code

Lines of Comment

Lines of Mixed Code and Comments

Lines Left Blank
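
A minimal sketch of these four counts for a language with '#' line comments. It is deliberately naive (it ignores '#' inside string literals, block comments, and line continuations, all of which a real counter must handle):

    def line_counts(source: str) -> dict:
        code = comment = mixed = blank = 0
        for line in source.splitlines():
            stripped = line.strip()
            if not stripped:
                blank += 1
            elif stripped.startswith("#"):
                comment += 1
            elif "#" in stripped:
                mixed += 1      # code followed by a trailing comment
            else:
                code += 1
        return {"code": code, "comment": comment, "mixed": mixed, "blank": blank}

    sample = "x = 1\n# setup\ny = x + 1  # increment\n\nprint(y)\n"
    print(line_counts(sample))  # {'code': 2, 'comment': 1, 'mixed': 1, 'blank': 1}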

Henry and Kafura Metrics

Coupling between modules (parameters, global variables, calls).

Bowles Metrics

Module and system complexity; coupling via parameters and global variables.

Troy and Zweben Metrics

Modularity or coupling; complexity of structure (maximum depth of structure chart); calls-to and called-by.

Ligier Metrics

Modularity of the structure chart.


The Top 10 Mistakes in IT Measurement

Betting the measurement program on a single metric

Trying to find a single metric that solves all the problems and has no drawbacks

The quest for an industry standard set of measures

Not linking measures to behaviors

Assuming that one set of measures will be good for "All Time"

Measuring the wrong IT output

Measuring in business terms, but the wrong business terms

Failure to quantify in business terms; failure to plan for benefits

Neglecting the full range of IT related outcomes

Lack of commitment; treating measurement as a non-value added add-on



Our solutions are the most efficient, flexible, quality-driven and affordable on the market, thanks to our fully innovative and customer-driven approach. Discover them and compare them with the solutions provided by our competitors.


 
Copyright © 2002 Atlantis Technologies. All rights reserved.