www.jrc.ec.europa.eu
Serving society – Stimulating innovation – Supporting legislation
LPIS QA
a.k.a. quality assurance by the MS
Wim Devos
Outline
1. intro
2. quality concepts and standards
3. Abstract Test Suite
4. Executable Test Suite
5. LPIS QA portal
6. screening of the ETS records
7. thresholds
8. results
Until 2010, the EC assessed LPIS performance during audit missions: by investigating OTSC control files and repeating control procedures, LPIS issues emerged.
✓ All aspects and the whole process can be investigated
✗ Only a very small sample is checked and resources are limited
= an example of internal quality control by an external body
Since 2010, the MS report on the observed performance of their LPIS, which adds representativeness and systematic monitoring:
• based on external quality control by an internal body
• a proactive strategy of continuous quality control and reporting
This was a very high priority.
History
Goal of the LPIS QAF
1. Provide a view on the state of the LPIS that is: harmonised, quantitative, unbiased, precise, complete, current
2. This allows for comparison between MS and a pan-European overview
3. and serves as a basis for: planning remedial actions by the MS (self-assessment); considerations about the effect of weaknesses found (audit)
1. Find the failed processes that cause anomalies or defects and their effects:
1. Regulatory blockage (e.g. historical GAC mask)
2. Missed update (i.e. the land has changed)
3. Failed upgrade (e.g. eligibility rules have changed)
4. Incomplete processing (e.g. under a military mask)
5. Erroneous processing (i.e. sloppy job done)
6. Incompatible design (e.g. absence of LC delineation )
2. Analyse findings thoroughly before starting to overhaul a system!!!
3. Implement appropriate remedies
MS Self-Assessment
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
Quality
the totality of characteristics of an entity that bears on its ability to satisfy stated and implied needs
• stated: in the specification (denotation) – what can be measured
• implied: in the users' expectation (connotation) – what is considered fit for purpose
Question: which is the better car: Audi A4 or Suzuki Grand Vitara?
quality assurance (QA): the set of activities whose purpose is to demonstrate that an entity meets all quality requirements (mostly producer’s perspective)
• ISO 9000 series: "Quality Management Systems – Requirements"
quality control (QC): the set of activities or techniques whose purpose is to ensure that all quality requirements are being met (often client’s perspective)
• ISO 2859 & ISO 3951 series: "Sampling procedures" (by attributes / by variables)
non-conformity: non-fulfilment of a specified requirement
• i.e. not compliant with (a part of) the specification
defect: non-fulfilment of an intended usage requirement
• i.e. a non-conformity that is so critical that the intended use is not possible
anomaly: observed non-conformity (registered – confirmed?)
• an instrument to monitor your quality and drive the upkeep
Key definitions
[Diagram: quality framework spanning Member States and the European Commission – management, test procedures, test results]
• Quality Policy – institution, procedures, data
• Quality Assurance – procedures, data
• Quality Control
• Quality Inspection (recurrent) – data
• External Quality Audit
JRC: • documentation • sampling • imagery • 1st + 2nd screening
Testing methodology – ISO 19105: Conformance and testing
1. Prepare/document your LPIS implementation → FC/AS
2. Verify the logical consistency with the EU model → ATS
3. Verify other data quality elements (= values) → ETS
4. Report
[Diagram: conformance testing workflow]
• Application Schema or Feature Catalogue of the implementation under test (ONCE)
• Model Conformance Test: Conformance Statement (ICS) + Abstract Test Suite (ATS) (ONCE)
• Data Conformance Test: Executable Test Suite (ETS) + Additional Information for Testing (YEARLY)
• Conformance Test Report → Analysis of results
Data Quality Elements and Sub-elements (ISO 19113)
• completeness: commission / omission
• logical consistency: conceptual consistency, codelist consistency, format consistency, topological consistency
• positional accuracy: absolute or external accuracy, relative or internal accuracy, gridded data position accuracy
• thematic accuracy: classification correctness, non-quantitative attribute correctness, quantitative attribute accuracy
• temporal accuracy: accuracy of a time measurement, temporal consistency, temporal validity
ATS
ETS
In practice
1. CAP Regulations: CwRS source imagery, LPIS databases, reporting cycle
2. GI components: LCM as core data schema, ATS for consistency check, harmonised ortho-imagery specification (INSPIRE A2), inspection = f(LCCS)
3. Industry standards: ISO 2859 acceptance sampling
7 primary EC Concerns
1. correct quantification of the truly eligible area within the LPIS as a system
2. distribution of ineligible land over the reference parcels
3. categorization of reference parcels regarding ineligible land
4. occurrence of critical defects within a reference parcel
5. proportion of declared area inside a reference parcel
6. effectiveness of update processes regarding the LPIS system
7. relation of LPIS quality issues with error rates observed during the on-the-spot checks
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
Source: application schema or Feature Catalogue
[UML class diagram: RefPArcel_2]
«FeatureType» PhyBlock
+ eligibleLandType: CodeList::ReferenceParcel
+ rpID: CharacterString
+ referenceArea: Float
+/ digitisedArea: Measure
+/ farmerArea: Float
+ effectiveDate: Date
- status: CodeList
- perimeter: Float
«enumeration» EligibleLandCodeType
- arableLand
- permanentPasture
- oliveTrees
- permanentCrop
- semiNaturalGrassland
- nonAgricultural
Name: PhyBlock
Definition: Reference parcel representing a production block, which is a continuous area of agricultural land grouping together a number of neighbouring agricultural parcels cultivated by one or more farmer(s) and delineated by the most stable topographical boundaries.
Feature Operation(s):
Feature Attribute(s):
• inherited from the ReferenceParcel class: M – uniqueID; referenceArea; effectiveDate; status; C – digitizedArea; farmedArea; O – perimeter
• own attributes: M – eligibleLandType; C – dominantLandType
(M = mandatory, C = conditional, O = optional)
Subtype of: ReferenceParcel
LCM discussion: Inside the physical block class we in fact have two different approaches. One is purely equal to the definition of a production block and contains only agricultural land. The other approach should rather be called a 'topographic block', because in this case non-agricultural parcels such as forest, residential, water etc. can exist. Such systems often cover 100% of the national territory. More importantly, one parcel may contain a combination of several types of agricultural/non-agricultural land cover, e.g. a forest block containing small arable or grassland patches. According to a recent LPIS survey [8], 5 of 10 physical block systems follow the latter approach. Therefore, this type of physical block cannot be fully defined by only one land cover attribute. To handle this case we proposed the use of dominantLandType. However, when there are different eligibleLandTypes inside one parcel, the case is much closer to a cadastral parcel than to a production block, and could probably be handled by introducing a sub-parcel class. Nevertheless, the total area of the eligible land inside the parcel shall be stored in the attribute referenceArea.
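To make the feature catalogue entry concrete, below is a minimal sketch (in Python, not the official LCM schema) of the PhyBlock feature type; the attribute names follow the catalogue above, while the typing and the use of Optional fields for conditional/optional attributes are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional


class EligibleLandCodeType(Enum):
    """Code list values taken from the class diagram above."""
    ARABLE_LAND = "arableLand"
    PERMANENT_PASTURE = "permanentPasture"
    OLIVE_TREES = "oliveTrees"
    PERMANENT_CROP = "permanentCrop"
    SEMI_NATURAL_GRASSLAND = "semiNaturalGrassland"
    NON_AGRICULTURAL = "nonAgricultural"


@dataclass
class PhyBlock:
    """Illustrative stand-in for the PhyBlock feature type (subtype of ReferenceParcel)."""
    # mandatory attributes inherited from ReferenceParcel
    rp_id: str                     # rpID: CharacterString
    reference_area: float          # referenceArea: total eligible area inside the parcel
    effective_date: date           # effectiveDate
    status: str                    # status: a CodeList value in the real model
    # mandatory own attribute
    eligible_land_type: EligibleLandCodeType
    # conditional / optional attributes
    digitised_area: Optional[float] = None                      # /digitisedArea (derived)
    farmer_area: Optional[float] = None                         # /farmerArea (derived)
    dominant_land_type: Optional[EligibleLandCodeType] = None   # for "topographic block" systems
    perimeter: Optional[float] = None
```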
ATS structure – Abstract Test Suite structure
Modules and tests: e.g. module A.1.1 can be assigned the 'Conforming' value if one of the tests A.1.1.1.1 OR A.1.1.1.2 OR A.1.1.1.3 OR A.1.1.2.1 is 'Conforming'.
Test purpose: verify the definition of the reference parcel – a reference parcel demarcated by the farmer who cultivates it (manages/executes his tenure rights: ownership, rent, etc.) on a multi-annual basis
Test method: inspect the documentation of the application schema or feature catalogue; verify consistency with the ICS
ATS NOTE: conformant with the farmer's block definition
Test reference: LCM specification
Test type: Basic test
A.1.1 (Basic test)
• A.1.1.1 → A.1.1.1.1, A.1.1.1.2, A.1.1.1.3
• A.1.1.2 → A.1.1.2.1
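A minimal sketch of the OR-aggregation just described: module A.1.1 receives 'Conforming' as soon as any of its leaf tests does. The pass/fail values are invented for illustration; the real ATS evaluation runs through the ATS log, not this snippet.

```python
def module_conforming(leaf_results: dict[str, bool]) -> bool:
    """OR-aggregation: a module conforms if at least one of its leaf tests conforms."""
    return any(leaf_results.values())


# Leaf tests of module A.1.1, as in the hierarchy above (pass/fail values are made up)
a_1_1 = {
    "A.1.1.1.1": False,
    "A.1.1.1.2": True,   # one conforming leaf test is enough
    "A.1.1.1.3": False,
    "A.1.1.2.1": False,
}

print("A.1.1:", "Conforming" if module_conforming(a_1_1) else "Non-conforming")
```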
ATS log (xls-version)
LCM - concept
corresponding national implementation
ATS-log (xml)
maps the core model into each national implementation
≈ dictionary
Scoreboard
Extract from ATS-log
≈ formality
ATS conclusion
LCM-ATS look unimpressive but are essential!
• The ATS scoreboard needs no independent assessment/validation: incorrect mapping will cause poor ETS results
• 4 MS implemented remedial actions based on ATS findings
LCM-ATS has remained stable since 2009; there is an urgent need for an upgrade:
• changed requirements from the reform
• best practices from the MS
An untouched pool of information:
• all questions (e.g. ECoA) on design choices and procedures are (or should be) addressed in the FC/AS and ICS
• allows for one common interface (cf. study JRC, TU Dresden)
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
Sampling
Based on ISO 2859-2 LQ
prepared by the JRC from delivered data
E.g. Flanders 2010:
1. population: 498 489
2. within CwRS: 63 832
3. selected by JRC: 2 400
4. inspected by MS: 800
≈ 0.16% – a very small sample!!!
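A minimal sketch of the pre-selection step illustrated by the Flanders example: a random pre-selection is drawn from the parcels falling within the CwRS zones, and the MS then works through that list until the required number of parcels has been inspected. The dictionary keys and the plain random.sample call are illustrative assumptions; the actual plan follows ISO 2859-2 and the portal's own implementation.

```python
import random


def preselect_parcels(population, cwrs_zone_ids, preselection_size=2400, seed=None):
    """Draw a random pre-selection from the parcels located inside the CwRS zones.

    `population` is an iterable of dicts with illustrative keys 'rp_id' and 'zone_id';
    the real exchange uses GML/XML packages via the LPIS QA portal.
    """
    rng = random.Random(seed)
    in_zones = [p for p in population if p["zone_id"] in cwrs_zone_ids]
    return rng.sample(in_zones, min(preselection_size, len(in_zones)))


# Toy population echoing the Flanders 2010 orders of magnitude
population = [{"rp_id": i, "zone_id": i % 8} for i in range(498_489)]
preselection = preselect_parcels(population, cwrs_zone_ids={0}, seed=42)
to_inspect = preselection[:800]   # the MS inspects parcels from the list until 800 are done
print(len(preselection), len(to_inspect))   # 2400 800
```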
Inspection procedure
• stepwise instructions to ensure all observations are correctly made
• everything defined in the LCM
• equivalence of CAPI and field survey where possible
• observation/measurement separated from assessment & evaluation
Case IV
[Image: LUI boundary on GeoEye imagery (acquired in 2011)]
On the ground truth of the ongoing year, the MS identify the LUI (land under inspection, represented by the RP).
Case IV
[Image: ETS 2011 delineation performed by the MS]
CAPI inspection: agricultural lands are measured
• Grassland (G): agricultural land polygon, area = 9 414 m²
• Hedge (BR): landscape feature polygon, area = 317 m²
Non-agricultural areas are counted: natural bare areas
CAPI benefits from cross-checking (2009 Bing imagery)
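A minimal sketch of how the CAPI measurements above add up to an observed eligible area for the land under inspection: measured agricultural land and landscape feature polygons are summed, non-agricultural areas are only counted. Which feature classes count as eligible is an assumption here; the authoritative rules are in the ETS documentation.

```python
# Illustrative CAPI observation record for the Case IV parcel above
observations = [
    {"label": "G",  "kind": "agricultural_land", "area_m2": 9414},  # grassland, measured
    {"label": "BR", "kind": "landscape_feature", "area_m2": 317},   # hedge, measured
    {"label": "NB", "kind": "non_agricultural",  "area_m2": None},  # natural bare area, only counted
]

ELIGIBLE_KINDS = {"agricultural_land", "landscape_feature"}   # assumption for this sketch

observed_eligible_area = sum(
    o["area_m2"] for o in observations
    if o["kind"] in ELIGIBLE_KINDS and o["area_m2"] is not None
)
non_agricultural_count = sum(1 for o in observations if o["kind"] == "non_agricultural")

print(f"observed eligible area: {observed_eligible_area} m2 "
      f"({non_agricultural_count} non-agricultural feature(s) counted)")
# -> observed eligible area: 9731 m2 (1 non-agricultural feature(s) counted)
```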
Field inspection
Alternative to CAPI
• At GNSS vertices: 2 pictures of the field, 2 pictures of the vertex
• Labour intensive: complementary?
• Invisible boundary
What is assessed at parcel level?
1. Parcels with incorrect Maximum Eligible Area (area non-conforming)
2. Parcels with serious functional problems (critical defects):
1. absence of agriculture
2. invalid perimeter
3. invalid common boundaries
4. incomplete block
5. multi-parcel
6. multi-polygon
3. Reasons for their non-conformity:
1. poor update
2. poor upgrade
3. poor processing
4. incomplete processing
5. poor design
→ 1 standard observation record (≈ control file)
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
What? lpis.jrc.ec.europa.eu: the instrument for managing the LPIS QA operations
Not to be confused with wikiCAP: marswiki.jrc.ec.europa.eu. Needs IE 8 or above!
For the MS (by authorized users, managed by the MS):
• data upload by the MS (proprietary zones, RP population, control records)
• data download by the MS (pre-selection lists)
• linked to the CID portal (for CwRS zones)
• automatic processes run by the JRC (sampling, validation)
For the EC:
• 24/7 monitoring of the status of each MS
• queries on options (e.g. selected CwRS zones)
• work floor contacts
• …
Data exchange process
[Diagram: data exchange between the MS LPIS implementation (Feature Catalogue / Application Schema) and the JRC screening procedures]
• Abstract Test Suite (ATS) → ATS reporting package (XML, PDF)
• Executable Test Suite (ETS) → ETS reporting package (XML, PDF, GML)
• Random sample pre-selection → sample pre-selection package (GML, XML)
• Orthoimagery served via WMS
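A minimal sketch of a completeness check one could run on a reporting package before upload. The expectation of "at least one XML, PDF and GML member per ETS package" and the file name in the example are assumptions; the authoritative package structure is defined by the portal specifications.

```python
import zipfile
from pathlib import Path

EXPECTED_EXTENSIONS = {".xml", ".pdf", ".gml"}   # assumed content of an ETS reporting package


def check_package(path: str) -> dict[str, bool]:
    """Report which of the expected file types are present in the zipped package."""
    with zipfile.ZipFile(path) as zf:
        found = {Path(name).suffix.lower() for name in zf.namelist()}
    return {ext: (ext in found) for ext in EXPECTED_EXTENSIONS}


# Hypothetical usage:
# print(check_package("ETS_reporting_package_2011.zip"))
```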
EC interface
[Screenshot: per-MS status overview, with flags such as OK, incomplete, late, very late]
MS interface
Key Numbers
Infrastructure:
• database size (geodb + portal db): 50 GB
• nr of uploaded files: 2 400; size on disk: 50 GB (zipped)
• nr of packages (annual + discrete): 399
• nr of registered active users: 47
Content – nr of tables, nr of parcels (points / polygons):
• 2010: 24 698 163
• 2011: 71 367 872
• 2012: 69 069 130
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
What? Systematic inspection of the inspection records
1. methodological screening to:
• identify and provide feedback on methodological issues
• "assure" correct implementation by discouraging manipulation
2. quality audit to support or revoke the scores and conclusion (≈ Y/N)
Neither:
1. is a repetition of the inspection by the MS (≠ 2nd opinion)
2. involves calibration or mitigation of the expectations (≈ thresholds)
3. directly yields a "good LPIS" or "bad LPIS" evaluation result
Both only make sense in:
1. presence of ALL required data (full population, "zero state", imagery, neighbouring parcels)
2. absence of manifest manipulation (timely procedure, scoping, zone selection)
Copy-paste and no-effect entries: results become unreliable
Skipping ≠ cherry picking
• parcel should have been skipped, but wasn't
• parcel skipped without reason
incomplete packages
Lands can be added “à la carte”
2010 screening report
Investigation of the full process flow of 10 inspected parcels
2011 screening report
Investigation of:
• package content
• 10 random parcels
2011 screening findings
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
purpose
Thresholds are values that control the decision flow!
Currently, thresholds are set to trigger analysis + reporting.
Scores and findings must be interpreted appropriately. E.g. does a poor QE1 score ≡ "risk to the fund"? Not necessarily.
In general:
1. QE1 and QE5 relate to biases in land cover resp. land use
2. QE2, QE3 and QE4 provide parcel-based information
3. QE6 and QE7 monitor effects of LPIS operations
rationale
QE1 (total area): 2% – the threshold for serious error used by the ECoA for the DAS
QE2 (rate of area-based non-conforming parcels):
• 3%: this threshold difference is specified in Comm. Reg. 2009R1122, art. 55 + 58
• Δ (area observed, area declared): the 5% and 7% thresholds introduce a technical tolerance for smaller parcels
• 1 ha: maximum tolerance of the OTSC methodology
QE3 (causes of non-conformities and defects): 5%, arbitrary – serves an alert function
QE4 (rate of defects): the LQ (in percent non-conforming parcels) is set to 2, as in the ECoA threshold for serious error
QE5 (area declaration rate): informative
QE6 (accumulated change rate): 25%, arbitrary – serves an alert function
QE7 (rate of irregular applications): no significant effect (χ²)
http://marswiki.jrc.ec.europa.eu/wikicap/index.php/GAMMA_4.d#Topic_5.2_Rationale_behind_the_thresholds_.28or_better_the_quality_expectations.29
No threshold is set by DGAgri on LPIS specific criteria!!!
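A minimal sketch of how the thresholds above can drive the "trigger analysis + reporting" decision. The values mirror the slide (QE1 2%, QE3 5%, QE4 LQ of 2% non-conforming parcels, QE6 25%); QE2's parcel-level tolerances and the informative QE5/QE7 are left out, and exceeding a threshold only flags the element for analysis, not a "bad LPIS" verdict.

```python
# Illustrative thresholds (in percent) taken from the rationale above
THRESHOLDS = {
    "QE1": 2.0,    # bias on total eligible area (ECoA DAS serious-error level)
    "QE3": 5.0,    # causes of non-conformities/defects (arbitrary alert level)
    "QE4": 2.0,    # limiting quality: percent non-conforming parcels
    "QE6": 25.0,   # accumulated change rate (arbitrary alert level)
}


def elements_to_analyse(scores: dict[str, float]) -> list[str]:
    """Return the quality elements whose score exceeds the threshold.

    Exceeding a threshold triggers analysis + reporting; it is not, by itself,
    a "risk to the fund" or "bad LPIS" statement.
    """
    return [qe for qe, value in scores.items()
            if qe in THRESHOLDS and value > THRESHOLDS[qe]]


# Hypothetical scoreboard for one LPIS lot
scores = {"QE1": 1.4, "QE3": 6.2, "QE4": 1.0, "QE6": 31.0}
print(elements_to_analyse(scores))   # -> ['QE3', 'QE6']
```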
Beyond analysis… important decisions depend on LPIS quality:
• reduction of OTSC sample rates (abolition of the 4% risk sample)
• launch of LPIS refresh projects and overhauls
• estimation of the financial risks of the system
Challenges:
• are the current thresholds "fit for purpose" for each of these?
• is one single set of thresholds realistic?
Scoreboard (= assessment) ≠ quality statement (evaluation)
Outline
1. intro · 2. quality concepts and standards · 3. Abstract Test Suite · 4. Executable Test Suite · 5. LPIS QA portal · 6. screening of the ETS records · 7. thresholds · 8. results
Information offered by LPIS QA: score, analysis, remedies, screening
ASSESSMENT REPORTS
Key numbers (2010 | 2011 part.)
• reports: 43 for 45 lots | 40 of 43 expected
• independent reports: 31 | 40
• in English (or translations): 18 (excl. 4 UK and 1 IE) | —
• pages (incl. remedial action plan): min 2 / max 15 | min 3 / max 26
Possible methodological issues (2010, out of 31 | 2011 part., out of 40)
• scoping (general or specific): 11 | 6
• land cover vs eligibility: 2 | 0
• understanding causal processes: 7 | 1
• applying acceptance number Ac: 21? | 12?
• mitigating observations: 7 | 5
• ETS v5.1: — | 6
FINDINGS
(2010, out of 31 | 2011 part., out of 40)
• self-assessment "non-conforming": 22 | 20
• # QE thresholds: 3 | 6
• CAPI problems (image quality): 9 | 3
• wysiNwyg: 9 | 3
• need RFV: 4 (+1) | 3
(these 3 problems are unrelated)
• report improvement effect: — | 4
• JRC first opinion: 23/31 (28/43) | 32/40
SOME 2011 OBSERVATIONS & QUOTES
• cadastral parcels: 2 LPIS measure >90% of the sample, 4 LPIS <15%
• report contaminated RP > 0.1 ha ≠ report RP with > 0.1 ha contamination
• "in our AP system, all non-conformities can be attributed to errors from farmers"
• "RP bordering woodlands were skipped due to poor image quality"
• "Art 31 holds 'MS may use GNSS', so ETS field activities cannot demand the use of GNSS"
• "update rates were measured differently for 2010 and 2011, so the rate for 2010 must be 0"
• "we do not regard topographic blocks consisting of 10 or more permanent fields as a risk"
• "cause for non-conformance: 1. Area observed (Aobs) is less than the area recorded (Arec)"
REMEDIAL ACTIONS (2010, out of 31 | 2011 part., out of 40)
• do nothing: 4 | 15
• correct the found non-conformities: 2 | 7
• apply database changes: 14 | 14
• improve farmer's input: 8 | 4
• improve OTSC feedback: 6 | 4
• start/strengthen intergov collaboration: 6 | 10
• start/strengthen periodic refresh: 17 | 15
• continue "acute update": 10 | 5
• do at least 2 of the last 5 above: 20 | 19
• set up a quality system: — | 6
• improve documentation & training: — | 9
• strengthen IT processes: — | 4
• strengthen the organization: — | 5
• substitute VHR with aerial imagery: — | 1
REMEDIAL ACTION PLANS
Driven by (2010 | 2011 part.):
• self-assessment: 18 | 38
• audit findings: 5 | 2
Requested revisions (2010, out of 31 | 2011 part., out of 40):
• account for farmer update: 6 | 0
• certain QE thresholds: 10 | 3
JRC first opinion: 20/31 (24/43) | 32/40 (≠ assessment)
PRELIMINARY 2011 CONCLUSIONS
Clear improvement since 2010 on all points in nearly all MS:
1. scores
• apart from QE6 (change rate), scores calculated in all reports
• 50% of LPIS pass all 6 QE thresholds
2. assessment reports
• much less serious methodological issues (scope, land cover, mitigation)
• better relevance and acceptance of measures and thresholds
• better evidence of understanding the causes behind the scores
3. remedial action plans
• 5 new types of action identified
• nearly all plans tackle several processes
Remaining weaknesses: TO BE ADDRESSED BY THE MS!!!
1. version control: ETS v4.3 / draft v5.0 / v5.1
2. timing: too-late sampling + 3 reports missing
3. acceptance numbers: e.g. a 1% expectation ≠ 8/800 but = 10/800
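A worked illustration of the acceptance-number weakness above: for a sample of 800 parcels, a 1% quality expectation does not translate into an acceptance number of 8 (1% of 800); the slide's example puts it at 10. The sketch below only computes binomial acceptance probabilities to show why the naive 8/800 reading is too strict; it does not reproduce the ISO 2859 tables.

```python
from math import comb


def acceptance_probability(n: int, ac: int, p: float) -> float:
    """P(at most `ac` non-conforming parcels in a sample of `n`) at true rate `p`."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(ac + 1))


n, p = 800, 0.01
for ac in (8, 10):
    print(f"Ac = {ac:2d}: P(accept | p = 1%) = {acceptance_probability(n, ac, p):.2f}")
# Ac = 8 accepts a lot that exactly meets the 1% expectation only about 60% of the time,
# i.e. roughly 4 in 10 such lots would fail by sampling chance alone;
# Ac = 10 raises the acceptance probability to above 80%.
```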
Conclusions of an independent panel
1. The LPIS QA is a unique and important initiative and a due step forward, involving:
• close MS-EC interaction, including digital data exchange
• comprehensive documentation and well-defined guidance
2. The documentation can be improved further
3. The QE thresholds appear reasonable but some need further investigation and political or scientific motivation
4. The instrument can be used for self-assessment; for justification of LPIS-involved risks, the following potential weaknesses need addressing:
• the reference data need to be of appropriate quality
• data must be independent
• parcel shape/size and positional accuracy need further research
[Margin labels assigning follow-up of the panel conclusions to DG AGRI and/or the JRC]
True or False? (True / False / +/-)
• A good LPIS QA score indicates a good LPIS
• One can trust the LPIS QA scores provided by the MS
• LPIS QA scores are not directly indicative of LPIS performance in IACS
• LPIS QA scores are isolated snapshots in time
• It's more important to present good LPIS QA scores than to demonstrate understanding of issues
• Remedial plans should only be based on the analysis from the MS
• Corrective measures are the sole responsibility of the MS
Questions?