iTrack WP3 workshop
Uploaded by trilateral-research
Transcript of iTrack WP3 workshop
iTRACK PRIVACY AND ETHICAL IMPACT ASSESSMENT: WORKSHOP
10/11/2016
iTRACK Project
Agenda for the day
Time            Activity
9.30 – 9.45     Registration and Coffee
9.45 – 11.15    Introduction & Information Flows
11.15 – 11.30   Coffee
11.30 – 12.30   What are Privacy and Ethical Risks?
12.30 – 13.30   Lunch
13.30 – 14.45   Identifying Risks: Group Work
14.45 – 15.00   Coffee
15.00 – 17.00   Risk Likelihood, Solutions, Conclusion
WHAT are we doing today?
Ethics and Privacy Impact Assessment Workshop
WHERE ARE WE NOW?
What? We are developing a human-centred technology that takes into account the actual real-world practices of humanitarian aid workers and provides policies for better protection and a more effective and efficient response.
How? By building an integrated, intelligent, real-time tracking and threat identification system.
Is there a potential problem? The technology could be intrusive, and people could be suspicious of it.
What could it be? A system that improves protection and improves efficiency.
WHY?
iTRACK’s nature
H2020
User expectations
Marketable
WHO for?
End user
Technology developers
Research
WHAT is the output of today?
Deliverable 3.1
Future
Technology
METHODOLOGY
1. Preparation
2. Interviews and mapping of information flows
3. Stakeholder workshop
4. Mapping out the risks and solutions
5. Review of the ethical and privacy assessments by an independent third party
6. Publication
INTERVIEWS
EXAMPLE QUESTIONS…
Please describe the system/technology that you are developing.
Are there alternatives to the technology that are less intrusive?
Could the technology affect vulnerable groups?
What personal information is collected?
Where will the information collected be stored?
Can users decline to use the technology?
TECHNOLOGY AND INFORMATION FLOWS
Understanding the information flows involved in a project is essential to a proper assessment.
System of systems
Information Flows – SOCIAL SENSE (K-NOW)
REMOVED: CONFIDENTIAL
Information Flows – STAFF SENSE (K-NOW)
REMOVED: CONFIDENTIAL
Information Flows – ON BOARD LOCALISATION AND TRACKING IN VEHICLES (TREELOGIC)
REMOVED: CONFIDENTIAL
Information Flows – RFID & GPS SENSORS (TREELOGIC)
REMOVED: CONFIDENTIAL
Information Flows – 360-DEGREE PANORAMA CAMERA (TEKNOVA)
REMOVED: CONFIDENTIAL
Information Flows – SECURE COMMUNICATION (Teleplan)
REMOVED: CONFIDENTIAL
Information Flows – REAL TIME THREAT DETECTION AND SUPPORT (UiA)
REMOVED: CONFIDENTIAL
Information Flows – BACKEND SERVER (TREELOGIC)
REMOVED: CONFIDENTIAL
Information Flows – iTRACK INTEGRATED SYSTEM (INTRASOFT)
REMOVED: CONFIDENTIAL
COFFEE BREAK
RISKS
We need to identify privacy risks
We need to identify ethical risks
Principles that will guide this analysis: ETHICS
Ethics, based on the ECHR, the UDHR and other pieces of legislation
Look out for: harm, invasion of boundaries, invalidity, lack of trust, undermining of human dignity, breaching autonomy…
Principles that will guide this analysis: PRIVACY
There is no overarching definition of the term; however, in iTRACK we take a broad view of privacy that goes beyond traditional notions of personal information.
Principles that will guide this analysis: PRIVACY
Privacy of the person
Privacy of behaviour and action
Privacy of communication
Privacy of data and image
Privacy of thoughts and feelings
Privacy of location and space
Privacy of association
EXAMPLE OF A RISK
There is a risk that iTRACK suffers reputational damage due to the loss of personal and potentially sensitive data, because the server is insecure and a hacker breaks into the iTRACK system.
EXAMPLE OF A RISK
There is a risk that a humanitarian worker may be discriminated against on the grounds of his religion, because he forgets to turn off his tracking device, is traced to a place of worship, and a nosy manager looks at where workers go after work.
EXAMPLE OF A RISK
There is a risk of loss of personal data, and thus a regulatory fine and loss of reputation, because employees do not know about privacy and data protection and leave phones with iTRACK apps lying around with no password.
LUNCH BREAK
GROUP EXERCISE
• Group 1: Social Sense (social media collection) & Staff Sense (tracking, communication & health sensors)
• Group 2: On board localisation and tracking in vehicles
• Group 3: Tracking of assets: RFID
• Group 4: Real time threat detection and support
• Group 5: iTRACK Backend Server
• Group 6: 360-degree panorama camera (position and risk detection)
Group Names
Group 1
Alexander Verbraeck
Hayley Watson
Paul Crompton
Katrina Petersen
Vita Lanfranchi
Group 2
Yan Wang
Moran Naor
Gerardo Glorios
Anne-Laure DUVAL
Group 3
Julia Muraszkiewicz
Christian Baumhauer
Yngvar Sørensen
Tunca Tabaklar
Mehdi Ben Lazreg
Group 4
Jacinto Esteban
Gyöngyi Kovacs
Tina Comes
Anders Rye
Group 5
Heide Lukosch
Christos Pateritsas
Lars Hamre
Hlekiwe Kachali
Group 6
Philipp Schwarz
Sofia Tsekeridou
Lisa Maria Svendsen
Morten Goodwin
DISCUSSION – WHAT RISKS DID YOU IDENTIFY?
Group 1
Group 2
Group 3
Group 4
Group 5
Group 6
COFFEE BREAK
RISK MAP
• The aim of this task is to assess the value of each risk and whether it is acceptable. The decisions are based on the framework of:
– severity of the feared event: what would be the consequence of the feared event happening?
– likelihood of the feared event: how likely is the feared event to occur?
RISK MAP
Rating 1
• Minor severity: the risk poses little potential for loss.
• Minor likelihood: given the current state of technology, the risk is very unlikely to occur.
Rating 2
• Moderate severity: the risk requires significant resources to take place, with significant potential for loss; or the risk requires few resources to take place, with moderate potential for loss.
• Moderate likelihood: given the current state of technology, the risk may occur.
Rating 3
• High severity: the risk requires few resources to take place, with significant potential for loss.
• High likelihood: given the current state of technology, the risk is very likely to occur.
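The 1–3 severity and likelihood ratings above can be combined into a simple risk map. As an illustration only, here is a minimal Python sketch; the multiplication rule and the band thresholds are our assumptions for this sketch, not part of the iTRACK methodology:

```python
# Minimal risk-map sketch combining 1-3 severity and likelihood ratings.
# The score = severity * likelihood rule and the thresholds below are
# illustrative assumptions, not the official iTRACK scoring.

VALID_RATINGS = {1, 2, 3}  # minor, moderate, high

def risk_score(severity: int, likelihood: int) -> int:
    """Return the combined risk score for two 1-3 ratings."""
    if severity not in VALID_RATINGS or likelihood not in VALID_RATINGS:
        raise ValueError("ratings must be 1, 2 or 3")
    return severity * likelihood

def risk_level(severity: int, likelihood: int) -> str:
    """Map a score onto a coarse treatment band (illustrative thresholds)."""
    score = risk_score(severity, likelihood)
    if score >= 6:
        return "must avoid or reduce"
    if score >= 3:
        return "reduce"
    return "may accept"

# Example: a data-loss risk with high severity (3) but low likelihood (1)
# because security safeguards are in place.
print(risk_level(3, 1))  # reduce
```

A two-axis map like this is only a prioritisation aid; the workshop discussion, not the score, decides whether a risk is acceptable.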
EXAMPLE OF A RISK
There is a risk that iTRACK suffers reputational damage due to the loss of personal and potentially sensitive data, because the server is insecure and a hacker breaks into the iTRACK system.
EXAMPLE OF A RISK: LOSS OF PERSONAL DATA
Security safeguards are in place, so the likelihood is low.
The severity is high (personal data loss = data breach = fines and reputational loss = significant impact for the individual and the organisation).
We need to think about solutions.
GROUP EXERCISE
DISCUSSION
Please present two risks and their severity and likelihood.
Group 1
Group 2
Group 3
Group 4
Group 5
Group 6
SOLUTIONS
• Risks with a high severity and likelihood absolutely must be avoided or reduced by implementing measures that reduce both their severity and their likelihood.
• Risks with a high severity but a low likelihood must be avoided or reduced.
• Risks with a low severity but a high likelihood must be reduced by implementing security measures that reduce their likelihood.
• Risks with a low severity and likelihood may be accepted, especially since the treatment of other risks should also lead to their treatment.
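The four treatment rules above can be read as a lookup on coarse severity and likelihood classes. As a sketch only, with the rule text taken from the slide and the "high"/"low" encoding our own assumption:

```python
# Sketch of the risk-treatment rules as a lookup on (severity, likelihood),
# each classed simply as "high" or "low". The encoding is illustrative;
# only the rule text comes from the workshop slide.

TREATMENT_RULES = {
    ("high", "high"): "avoid or reduce both severity and likelihood",
    ("high", "low"): "avoid or reduce",
    ("low", "high"): "reduce likelihood with security measures",
    ("low", "low"): "may accept",
}

def treatment(severity: str, likelihood: str) -> str:
    """Return the treatment rule for 'high'/'low' severity and likelihood."""
    try:
        return TREATMENT_RULES[(severity, likelihood)]
    except KeyError:
        raise ValueError("severity and likelihood must be 'high' or 'low'")

# Example: high-severity, low-likelihood risks still must be treated.
print(treatment("high", "low"))  # avoid or reduce
```

The point of the lookup is that only the low/low cell is ever acceptable as-is; every other combination requires some treatment.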
SOLUTIONS
Risk | Solution(s) | Result (avoided, reduced, minimised or transferred)
Reputational damage due to loss of personal and potentially sensitive data | Add an extra protection layer to the server, e.g. SSL | Minimised
Workers tracked and hurt because a tracking app was left unattended | Ensure devices are password protected | Minimised
GROUP EXERCISE
Risk | Solution(s) | Result (avoided, reduced, minimised or transferred)
DISCUSSION
What solutions did you identify?
Group 1
Group 2
Group 3
Group 4
Group 5
Group 6
NEXT STEPS
EPIA
TRI’s next steps
For tech partners...
For end user…
Other tasks...
THANK YOU
JULIA MURASZKIEWICZ & INGA KROENER