
An evaluation of tools for static checking of C++ code

E. Arderiu Ribera, G. Cosmo, S. M. Fisher,

S. Paoli, M. Stavrianakou

CHEP2000, Padova, 10.02.2000


Origin, purpose and scope

• SPIDER-CCU project (CERN IT-IPT, LHC experiments, IT projects) to define a common C++ coding standard and a tool to automatically check code against it

SPIDER - C++ Coding Standard

• 108 rules for naming, coding and style

Tool evaluation

• Scope limited to rule-checking functionality
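To make "rule checking" concrete: many of the 108 rules are naming conventions that a tool can verify mechanically. The actual SPIDER rule texts are not reproduced here, so the conventions below (upper-case class names, an assumed "f" prefix on data members) are hypothetical illustrations of the kind of check being automated:

```cpp
#include <cctype>
#include <string>

// Hypothetical naming rules of the kind a static checker automates.
// The concrete conventions here are illustrative assumptions, not
// the actual SPIDER rule texts.

// Assumed rule: class names begin with an upper-case letter.
bool isValidClassName(const std::string& name) {
    return !name.empty() && std::isupper(static_cast<unsigned char>(name[0]));
}

// Assumed rule: data members carry an "f" prefix followed by an
// upper-case letter (e.g. fEnergy).
bool isValidDataMemberName(const std::string& name) {
    return name.size() >= 2 && name[0] == 'f' &&
           std::isupper(static_cast<unsigned char>(name[1]));
}
```

A checker applies predicates like these to every declaration it finds while parsing, which is why parser quality dominates the evaluations that follow.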


Approach and tool selection

• Involve potential users of the tool in
– the definition of evaluation criteria
– the planning of the evaluation
– the actual technical evaluations

• Take into account time and resource constraints

• Preselect tools based on technical merit


Evaluated tools

• CodeCheck 8.01 B1 (Abraxas)

• QA C++ 3.1 (Programming Research Ltd.)

• CodeWizard 3.0 (Parasoft)

• Logiscope RuleChecker (Concerto/AuditC++) 3.5 (CS Verilog S.A.)

• TestBed 5.8.4 (LDRA Ltd.)


Evaluation environment

• Evaluate on real and representative HEP C++ code
– GEANT4 toolkit
– ATLAS “Event” package
– ATLAS “classlib” utility library

• Chosen because of
– complexity
– extensive use of STL
– variety of style and expertise
– familiarity to members of the evaluation team
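The "extensive use of STL" criterion matters because nested template instantiations and iterator-heavy loops were exactly what stressed the parsers of the era. A small sketch (assumed code in the style of the evaluated libraries, not actual GEANT4/ATLAS source) of the constructs the tools had to cope with:

```cpp
#include <map>
#include <string>
#include <vector>

// Sketch of the template-heavy STL style found in the evaluation code
// bases: a typedef over nested containers plus explicit iterator loops.
// Names (HitMap, totalEnergy) are illustrative assumptions.

typedef std::map<std::string, std::vector<double> > HitMap;

// Sum every deposit in every detector; exercises nested
// const_iterator types, a known weak spot for 1999-era parsers.
double totalEnergy(const HitMap& hits) {
    double sum = 0.0;
    for (HitMap::const_iterator it = hits.begin(); it != hits.end(); ++it) {
        for (std::vector<double>::const_iterator jt = it->second.begin();
             jt != it->second.end(); ++jt) {
            sum += *jt;
        }
    }
    return sum;
}

// Small sample input for demonstration.
HitMap makeSample() {
    HitMap h;
    h["ecal"].push_back(1.5);
    h["hcal"].push_back(2.5);
    return h;
}
```

A checker that cannot instantiate these templates cannot see the declarations inside them, which is what "limitations in parsing" means in the results below.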


Evaluation criteria

• Technical

– Coverage of standard

– Addition of customised checks

– Other relevant configured checks

– Support of ANSI C++ standard

– Support of template libraries and STL

– Robustness

– Reliability

– Usability

– Customisability

– Performance


Evaluation criteria (cont’d)

• Operational
– installation, deployment and upgrade of a centrally supported tool

• Managerial
– licensing, maintenance costs, vendor information

• Other
– quality and quantity of documentation (electronic, paper, WWW)
– quality of available support


Evaluation results: CodeCheck

– limitations in parsing real code making extensive use of STL (no enhancements foreseen)
– cumbersome in terms of customisability and implementation of new rules

excluded from further evaluation


Evaluation results: TestBed

– limitations in parsing complex code
– limited number of built-in rules, no possibility of adding new rules

excluded from further evaluation


Evaluation results: Logiscope RuleChecker

– simple, easy to use, fast
– limited number of built-in rules
– limited possibility of adding new rules
– flexibility in report generation and quality limited by the proprietary language (CQL)

excluded from further evaluation


Evaluation results: CodeWizard

– at least 71 checks implemented, incl. most of the items from S. Meyers’ “Effective C++” and “More Effective C++”
– configurable to cover 71% of the “SPIDER” standard
– customisable in terms of rule selection
– customisable in terms of code inclusion/exclusion
– ability to parse ANSI C++ with STL
– possibility of using RuleWizard for the addition of customised checks; not yet usable owing to poor documentation
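A representative "Effective C++" item that such checkers flag is a polymorphic base class without a virtual destructor: deleting a derived object through a base pointer then never runs the derived destructor. The compliant version is sketched below, with a counter (names are illustrative) that makes the behaviour observable:

```cpp
// "Effective C++"-style rule: a base class deleted polymorphically
// needs a virtual destructor. The counter makes the effect observable;
// all names here are illustrative.

int destructions = 0;

class Shape {
public:
    virtual ~Shape() {}   // virtual: derived destructors will run
};

class Circle : public Shape {
public:
    ~Circle() { ++destructions; }
};

// Delete a Circle through a Shape*; because ~Shape is virtual,
// Circle::~Circle runs and the counter is incremented. Without
// "virtual" this would be undefined behaviour and the checker flags it.
void destroyViaBase() {
    Shape* s = new Circle;
    delete s;
}
```

Checks of this kind are purely structural (is the destructor declared virtual?), which is why a rule checker can report them without running the code.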


Evaluation results: CodeWizard (cont’d)

– reports in graphical and ASCII format; not customisable
– information for headers and libraries necessary; straightforward by using the makefile, but repetition of parsing and reporting
– performance equivalent to the compiler

fully evaluated


Evaluation results: QA C++

– at least 500 checks implemented, incl. ISO C++
– configurable to cover 65% of the “SPIDER” standard
– customisable in terms of rule selection
– customisable in terms of code inclusion/exclusion
– full STL support foreseen for the next release; partial analysis possible via STL stubs provided by the company
– easy to learn and use, robust
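The "STL stubs" idea: instead of parsing the full template implementation of the standard library, the checker is given a deliberately minimal stand-in that exposes only the interface client code uses. The vendor's actual stubs are proprietary, so the sketch below is an assumption about their shape, placed in a `stub` namespace so it can coexist with the real `<vector>`:

```cpp
#include <cstddef>

// Assumed shape of a vendor-style STL stub: just enough of a
// vector-like interface for client code to parse and compile,
// with deliberately trivial bodies (fixed capacity, no growth).
// Namespace "stub" is our own; real stubs would shadow std::.

namespace stub {

template <class T>
class vector {
public:
    vector() : size_(0) {}
    void push_back(const T& x) { data_[size_++] = x; }
    std::size_t size() const { return size_; }
    T& operator[](std::size_t i) { return data_[i]; }
    const T& operator[](std::size_t i) const { return data_[i]; }
private:
    T data_[64];          // a stub, not a real container
    std::size_t size_;
};

} // namespace stub
```

Because the stub is tiny, the checker's parser never has to digest the real template machinery; the trade-off is that analysis of STL usage is only partial, as the slide notes.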


Evaluation results: QA C++ (cont’d)

– information for headers and libraries necessary; possibility of single parsing and caching of headers, but makefile integration non-trivial
– powerful GUI and command line, largely interchangeable
– high-quality, customisable reports
– performance a factor of 2 slower compared to the compiler

fully evaluated, BUT a completely new version (full ANSI C++ compliance, new parser) was not available at the time of the evaluation


Conclusions

Evaluation process
– suited to the goals, pragmatic, efficient
– user involvement, careful definition of evaluation criteria and detailed planning essential

Evaluation results
– out of the five tools considered, two, CodeWizard and QA C++, were preselected on technical merit and fully evaluated
– final choice to depend on the weight given to various features, relative cost, the needs of the institutes concerned and the development of promising new tools (e.g. the Together/Enterprise CASE tool and a tool by ITC-IRST and the ALICE experiment)