Test Approach OpenDNSSEC 1.2
Transcript of Test Approach OpenDNSSEC 1.2
December 2010
Contents
- Goal
- Scope
- Phasing
- Test Scripts
- Planning
- Preconditions
Goal
Test OpenDNSSEC 1.2 by varying policies and zone contents, in order to measure the quality of ODS 1.2 and demonstrate that quality to the community.
Scope

- Bugfixes
- Performance with many small zones (50,000+)
- Shared keys
- All different RR types
- KASP functionality
- Varying zone content
- New dependencies (ldns 1.6.7, SoftHSM 1.2, dnsruby 1.51)
- Zonefetcher
- Extensive communication on test status
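The KASP variations in scope (key sharing, timings) are driven by OpenDNSSEC's kasp.xml policy file. A minimal sketch of the kind of policy a test case might vary; the element names follow OpenDNSSEC 1.x's kasp.xml, but the policy name and all concrete durations and algorithm values here are illustrative assumptions, not part of this test plan:

```xml
<KASP>
  <Policy name="test-keysharing">
    <Description>Illustrative policy varying key sharing and timings</Description>
    <Signatures>
      <Resign>PT2H</Resign>
      <Refresh>P3D</Refresh>
      <Validity>
        <Default>P7D</Default>
        <Denial>P7D</Denial>
      </Validity>
      <Jitter>PT12H</Jitter>
      <InceptionOffset>PT300S</InceptionOffset>
    </Signatures>
    <Keys>
      <TTL>PT3600S</TTL>
      <RetireSafety>PT3600S</RetireSafety>
      <PublishSafety>PT3600S</PublishSafety>
      <!-- the element a key-sharing test case toggles on and off -->
      <ShareKeys/>
      <KSK>
        <Algorithm length="2048">8</Algorithm>
        <Lifetime>P1Y</Lifetime>
        <Repository>SoftHSM</Repository>
      </KSK>
      <ZSK>
        <Algorithm length="1024">8</Algorithm>
        <Lifetime>P30D</Lifetime>
        <Repository>SoftHSM</Repository>
      </ZSK>
    </Keys>
  </Policy>
</KASP>
```

A full kasp.xml also carries Denial, Zone, Parent and other sections, omitted here for brevity; a test script would vary one element at a time (e.g. `ShareKeys`, `Resign`, `Jitter`) and check the signer's behaviour against the expectation.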
Phasing
1. Set up test organization
2. Prepare test scripts
3. Execute test scripts – System (Integration) Test (ST/SIT)
4. Report System Test
5. Execute test scripts – Acceptance Test (AT)
6. Report Acceptance Test
Test Scripts
Test script        Description                                                                Status
Conf (ST)          Varying conf on values like repository and privileges                      5%
Kasp (ST)          Varying KASP on values like key sharing and timings                        Not started
Zones (ST)         Varying RR types and other zone content                                    Not started
Auditor (ST)       Trigger the implemented auditor checks                                     Not started
Zonefetcher (SIT)  Communication of ODS with other systems like BIND                          Not started
Scenarios (AT)     Test scenarios like key rollover or switching HSMs                         80%
Performance (AT)   Performance when signing a large number of small zones or one large zone   20%
Planning

Week          Activity                                                                                 Status
Week 51 + 52  Test approach and prepare test environment by installing Virtual Machines with ODS 1.2   70%
Week 1 to 6   Prepare conf and kasp test scripts                                                       10%
Week 7        Prepare zones test script                                                                0%
Week 8        Prepare auditor test script                                                              0%
Week 9        Prepare zonefetcher test script                                                          0%
Week 10       Prepare performance test script                                                          0%
Week 11       Prepare scenarios test script                                                            0%
Week 11-12    Execute tests                                                                            0%
Week 13       Rework on test scripts and test engine                                                   0%
Week 14       Buffer                                                                                   n/a
Week 15       Final execution and reporting                                                            0%
Preconditions
- Availability of system administrators: an average of 1 hour per day for the first 2 weeks, 3 hours per week in the following weeks;
- Availability of the puppet system to create a "one-button" test environment in the form of Virtual Machines with the designated software pre-installed (ODS, SoftHSM);
- Part-time availability of Freddy Keurntjes for test engine support during test execution and rework (4 hours a day);
- Weekly conference call with the ODS team for status updates;
- Extensive communication on bugfixes and new releases;
- Availability of ODS project members for support, an average of 4 hours a week.
Feedback