
Being safe around collaborative and versatile robots in shared spaces

COVR Award Evaluator Guide

Version: 2.1 Date 2019-02-06


APPLICATION EVALUATION Thank you for agreeing to take part in the evaluation of COVR FSTP applications.

All eligible applications will be evaluated by independent experts.

The COVR consortium needs to verify that all applications have received a fair evaluation and therefore needs sufficient detail in your evaluation report. The final decision on which applications are granted a COVR

Award is taken by the Steering Committee (SC), which is responsible for ensuring that the final

portfolio of applications is not only the best applications but also balanced according to COVR’s

objectives.

Once this process is completed, all individual applicants will receive an official assessment containing

information on the results of the evaluation, including the evaluation summary report containing the

combined opinion of the experts and the steering committee.

EXPERT EVALUATORS At least three expert evaluators will assess each Award application. Evaluators have been chosen

based on their expertise, either as domain specialists (end-users), technology specialists (e.g. robotics

technology providers), as exploitation specialists, or as safety and/or standardization specialists.

The award evaluation process follows these guiding principles:

Independence: The evaluation must demonstrate impartiality on its merits, irrespective of the

origin or identity of the applicants. Evaluators sign a declaration of conflict of interest saying

that they do not have any interest or benefit in the evaluated applications.

Confidentiality: Evaluators are kept anonymous (their identity is kept unknown to the

applicants) and they also sign a confidentiality declaration committing them not to reveal any details of the application to any third party, either during the evaluation or afterwards.

Fairness: Each application is evaluated by at least three evaluators to ensure broad

experiences and viewpoints are considered for each application.

ELIGIBILITY CRITERIA The COVR consortium is responsible for checking the eligibility of applications. Therefore, all

applications that you will be asked to assess are eligible. Your contribution is to assess their worth.

EVALUATION FORM The application form is divided into five sections: Parts A, B, C, D and E. Only Part E is scored by you, the reviewers; the first parts primarily contain formal information about the applicants and information that can be used at a later stage to ensure the right mix of Award projects, balancing geography, company size, activity type, industry/domain and dominant technologies.

Part E consists of 4 general sections:

SECTION 1 – AWARD WORK

o Question: Award concept and Approach

SECTION 2 – IMPACT AND OUTCOMES

o Question: Outcomes

o Question: Impact

SECTION 3 – TECHNICAL CHALLENGES AND DATA


o Question: Technical challenges

o Question: Novelty

o Question: Data generated in the project

SECTION 4 – PROJECT EXECUTION AND STAFFING

o Question: Milestone 1

o Question: Deliverables for Milestone 1

o Question: Milestone 2

o Question: Deliverables for Milestone 2

o Question: Project Team

o Question: Project Budget

Each of the four sections is weighted 25% of the total score.

For each section, the evaluator will assign a score as described below:

0 Fail: The proposal fails to address the criterion under examination or cannot be judged due to missing or incomplete information.

1 Poor: The criterion is addressed in an inadequate manner, or there are serious inherent weaknesses.

2 Fair: While the proposal broadly addresses the criterion, there are significant weaknesses (e.g. too little concrete information).

3 Good: The proposal addresses the criterion well, although improvements would be necessary.

4 Very good: The proposal addresses the criterion very well, although improvements are still desirable.

5 Excellent: The proposal successfully addresses all relevant aspects of the criterion in question. Any shortcomings are minor.
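As an illustration only, the 0-5 scale above can be captured in a small lookup. The names below (SCORE_SCALE, label_score) are our own and not part of any COVR tooling:

```python
# Hypothetical helper mirroring the 0-5 scoring scale above;
# SCORE_SCALE and label_score are illustrative names, not COVR tooling.
SCORE_SCALE = {
    0: "Fail",
    1: "Poor",
    2: "Fair",
    3: "Good",
    4: "Very good",
    5: "Excellent",
}

def label_score(score: int) -> str:
    """Return the verbal label for an integer section score.

    Fractions are not used in COVR scoring, so any value that is
    not an integer from 0 to 5 is rejected.
    """
    if not isinstance(score, int) or score not in SCORE_SCALE:
        raise ValueError(f"section scores must be integers 0-5, got {score!r}")
    return SCORE_SCALE[score]
```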

If an evaluator requires clarification on any part of the application, they should request this from COVR

partners, who will request more information from an applicant via email with a deadline for the

response. This can only be done once per evaluator for each Award application. Answers will be shared

with all assigned evaluators.

During the reviewing process, evaluators are encouraged to send any questions to the COVR team.


SCORING, RANKING AND SELECTION As all sections are equally weighted 25%, the total score can be calculated as the sum of these scores.

For each evaluator, the scoring interval is 0 – 20 points for an application. Please note that fractions

are not used, as this granularity is deemed unnecessary.

The total score of an application is calculated as the sum of scores from all three reviewers:

Section                                      Reviewer 1   Reviewer 2   Reviewer 3   SUM

Section 1 – Award work                       0-5          0-5          0-5          0-15

Section 2 – Impact and outcomes              0-5          0-5          0-5          0-15

Section 3 – Technical challenges and data    0-5          0-5          0-5          0-15

Section 4 – Project execution and staffing   0-5          0-5          0-5          0-15

Total score                                  0-20         0-20         0-20         0-60

The maximum total score an application can receive is 60 points. All applications from each call are

finally ranked according to their overall score.
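The scoring arithmetic above is straightforward. The following Python sketch (illustrative only: the function names are ours, not part of any COVR system) shows how the 0-20 per-reviewer totals and the 0-60 application total combine, and how applications can then be ranked:

```python
# Illustrative sketch of the COVR Award scoring arithmetic.
# Function names are our own, not part of any COVR tooling.

def reviewer_total(section_scores):
    """Sum one reviewer's four section scores (integers 0-5) into a 0-20 total."""
    assert len(section_scores) == 4, "one score per section"
    assert all(isinstance(s, int) and 0 <= s <= 5 for s in section_scores)
    return sum(section_scores)

def application_total(reviews):
    """Sum the 0-20 totals of all three reviewers into a 0-60 total."""
    assert len(reviews) == 3, "each application has three reviewers"
    return sum(reviewer_total(scores) for scores in reviews)

def rank_applications(totals):
    """Rank application ids by their 0-60 total score, highest first."""
    return sorted(totals, key=totals.get, reverse=True)

# Example: one application scored by three reviewers
reviews = [
    [4, 3, 5, 4],  # Reviewer 1: 16 points
    [3, 3, 4, 4],  # Reviewer 2: 14 points
    [5, 4, 4, 3],  # Reviewer 3: 16 points
]
print(application_total(reviews))  # 46 of the maximum 60 points
```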

The Steering Committee shall decide which applications will be granted an Award. This decision will

be based strongly on the ranking, whilst ensuring the portfolio of selected Awards is balanced as

detailed below. Feedback provided from the COVR consortium team members will also be considered

by the Steering Committee during selection, as key technical knowledge and an overview of the current status of the protocols and Toolkit may be required to achieve the largest impact from the selections.

In particular, the following criteria are used:

Balance between expected outcomes of Award projects to ensure largest possible overall

contribution to Toolkit and protocols

Balance among industry/domains

Balance among dominant technologies used in the Awards

Alignment with the impact goals of COVR

Contributes to the expanded use of cobots in real settings.

Balance across European countries/regions


ACCESSING APPLICATIONS AND USING SURVEYMONKEY APPLY To review the applications, you will have to use the SurveyMonkey Apply platform built for the COVR

project. You will go through the following process:

1. You will receive an email, saying that you have been registered as a Reviewer for the COVR

Award site. This email will contain a link to take you to the platform. At this point you will not

have any applications assigned for review, and therefore can take no actions on the platform.

2. You will receive an email informing that you have been assigned one or more applications for

review. This email will contain a link, which you must follow to access the applications.

3. Once you click “Start” you can see an overview of the applications that have been assigned to

you. For each of the applications you can also click “Start” to begin reviewing the specific

application.

4. Once you start the actual review, you will see a split-screen with the application on the left

and your review on the right. In your review panel you will be able to select scores and

provide feedback for each section.

5. You can easily expand and collapse the different parts of the application, by clicking the

small arrows in the top right corner of each part. This may be useful to keep an overview.

6. Once you have provided scores and comments to all sections, press the “Mark As Complete”

button to finish your review.


EVALUATOR GUIDANCE – HOW TO SCORE THE 4 SECTIONS This guidance is intended as an outline introduction to help you create your feedback for a COVR

Award.

Section 1 – Award work scope and quality This section focuses on the actual work (scope and quality) to be completed by the beneficiaries. The

following type of Award work should be scored high by reviewers:

Work that makes CE-marking, “safety certification” easier, faster or more reliable, including

work on assessment tools and methods.

Work that can be categorized as non-trivial, cutting edge, pushing the state of the art.

Work resulting in new cobots or new cobot applications being developed

Applications involving close collaboration between robot and human will be preferred. Shared

spaces are a minimal requirement.

As evaluator, you should not assess the actual safety of the robot.

Section 2 – Impact and outcomes The impact is measured for beneficiaries, the COVR project and for the overall impact on cobot

deployment in the European Union. The following impacts should be scored high by reviewers:

Impact for beneficiaries

o Increase productivity and/or work environment safety for end-user companies

o Decrease time to market for cobot developers

o Increase sales and/or potential market size for cobot developers

o Increase number of deployed cobots at beneficiaries

o CE-marking of robots/robot installations.

Impact for COVR project

o Good and wide test of toolkit and other COVR elements

o Writing new protocols / editing existing protocols

o Validating existing protocols

o Providing experience and best practice for specific use-case scenarios

o Providing large volumes of data on safety validation procedures

o Accreditation of COVR toolkit and protocols through the safety certification of large

numbers of cobot installations. CE-marking is also excellent.

Impact for cobot deployment in the European Union

o Promote the protection of humans including workers, elderly and vulnerable people

o More cobots are supported in reaching the market

o A wider range of certified cobots, cobot tools and cobot systems becomes available

o More cobot installations

o Is usable for other European companies/organizations or the general public

o More European cobots implies a bigger overall market share

Section 3 – Technical challenges and data These technical challenges and data collected should be scored high by reviewers:

New cobot types, new tools, new applications, new tests are preferred (while keeping the

work realistic / risks within reasonable limits).

Novel technical challenges addressing special cases that currently have no clear safety

validation procedure or best practice.


Developing new methodologies for working with cobots or assessing their safety

Implementation of cobots in new environments or for new types of operations.

Applications with a structured and technical approach to improving the consistency of safety-related measurements.

Large volume of data, validating toolkit, protocols and creating general consensus for the

COVR project.

Section 4 – Project execution and staffing These parameters should be scored high:

Realistic milestones/deliverables – the proposed work looks technically feasible and should fit

within the time frame

Ambitious milestones/deliverables – the proposed milestones show and support the project's ambitions

Staff have deep knowledge of cobots

Staff have deep knowledge of safety and safety validation

The project work is based on one or more real industrial/healthcare cases

Applicants/consortia with SMEs are preferred


HELPFUL TIPS FOR REVIEWERS OF COVR PILOT APPLICATIONS

What is a good application? A good application…

• Has a strong focus towards safety related to humans interacting with robots. The application contains technical challenges that will provide new knowledge or data, useful for pushing the boundaries of safety standardisation, and has the potential to be useful for a real industrial/social need. It is okay for a solution to be at an early stage (TRLs 3-8), as long as it has good potential.

• Communicates its idea(s) in a concrete way without too much ambiguity.

• Can be short and contain spelling errors. You are evaluating the innovative idea and technical feasibility and should be able to “read between the lines” and evaluate the concept behind the voucher application. That said, some voucher applications may be so poorly written that it is simply impossible to evaluate the idea properly, and these should of course be evaluated accordingly.

What makes a good comment? You will be asked to provide your comments twice during the evaluation process. Your first comment will be used by the COVR consortium only. The second comment will be shared with the applicants, regardless of whether the application is accepted for an Award or not.

A good comment for the applicant:

• Justifies the evaluation score provided

• Describes which parts in the application are the strongest and most convincing

• Describes which parts in the application are the weakest and how these could be improved

• Is detailed and has around 5 sentences or more.

• Appreciates that Applicants took the time and trouble to apply and aims to help them improve future applications.

General Tips

Please remember that the comments that you will provide need to be clear and concise. You

need to explain why those evaluation scores have been chosen and state the strong and weak

parts of each application.

Try to be fair on the scores; if it helps, you can read all of your assigned applications before

you start scoring. This will give you a general idea of the overall level.

Please make sure you assess the feasibility of the Milestones and Deliverables to ensure that

the project has a valid chance of achieving its goals within the 9-month timescale of a COVR

Award.

If an application is addressing a field which is on the edge of your expertise, you are encouraged to discuss that with the COVR consortium in order to perhaps reassign the application to another evaluator. The application MAY NOT BE PASSED ON TO ANY THIRD PARTY outside of the COVR Consortium.

Thank you for your effort as a COVR Award evaluator!

With your help, we can identify the most promising and innovative ideas and facilitate cobot

deployment in Europe.


Appendix 1 INTRODUCTION TO THE COVR VISION The need for robots to collaborate with humans on tasks is evident in all sectors of the European market. Collaboration, however, inevitably raises safety issues, and European legislation is very careful

to protect people. Robot systems therefore need “certification”, i.e. to show compliance with the

mandatory Essential Requirements of Safety and Health.

In our experience with end-users, robotics components manufacturers, and system integrators, safety

has become a barrier to the promotion and availability of collaborative robotics technologies in all

domains. This is due to a number of issues, both technical (e.g. some robotic systems can change their behavior

over time) and non-technical (e.g. understanding and correctly applying the current standards).

The EU-funded project “Being safe around collaborative and versatile robots in shared spaces” (COVR)

aims to systematically break down certification barriers and support more widespread use of

collaborative robots (cobots) in a wide range of industries and domains (e.g. manufacturing, logistics,

healthcare, rehabilitation, agriculture).

The COVR Project has 3 main activities:

Develop an online accessible Toolkit to make it easier to find out which standards,

directives and requirements a cobot solution must comply with, and which protocols must

be completed to validate compliance.

Write new protocols for validating cobot safety. Protocols serve as “recipes” for the tests

that must be carried out in order to demonstrate that a cobot solution meets given

requirements.

Run COVR Awards to promote the development and implementation of safe cobot solutions

in Europe, and provide key inputs to the Toolkit and protocols, as well as validating them.

These activities are all completed in a context of deploying more collaborative robots in Europe.

For a detailed explanation of the COVR Toolkit and protocols, see the following sections. For the glossary, see Appendix 2.


INTRODUCTION TO THE COVR TOOLKIT The COVR toolkit is a guided procedure for listing the requirements for determining compliance with

relevant safety directives. It can be used by coboteers with different levels of knowledge about the

processes of safety assessment according to ISO 12100 [1] / ISO 14971 [2]. The main steps of safety

assessment, as specified in ISO 12100 are:

hazard identification,

risk estimation,

risk evaluation,

risk mitigation,

validation of protective measures

The toolkit is intended to assist users carrying out different phases of safety assessment and

evaluation. It provides explanations about the process with the goal of simplifying the task of finding

and interpreting mandatory procedures that are available in published normative materials. It consists

of a graphical-user interface (GUI) which provides a walkthrough of the analytical steps necessary to

derive the safety requirement checklists. Furthermore, the toolkit helps people to identify the

necessary methods for validation of their risk mitigation solutions, by including a collection of

validation protocols (step-by-step guides on how to carry out the required measurements) for selected

example risk mitigation solutions. If necessary, the toolkit can also support novice robotics users in

identifying the preliminary steps for analysing applications and/or systems/components, in order to

separately assess hazards and associated risks.

The COVR toolkit IS NOT intended:

To be a replacement for risk analysis and assessment

To automatically select the risk mitigation solutions to be implemented (the users will be given a list of possible risk mitigation solutions; they will have to choose among them according to their risk assessment and case, or possibly look for other solutions)

To function as any kind of certification

The COVR toolkit will be accessible as a web service and will offer multiple paths through the

information depending on the user’s level of knowledge about the domain, the regulations and

standards, and the safety skills and safety functions used.

INTRODUCTION TO VALIDATION PROTOCOLS Validation is defined as a set of actions to evaluate (i.e. provide evidence through documented real

measurements, “metrics”) that a (set of) safety functions meets a set of target conditions. Safety

validation is the evaluation of whether or not a product, service, or system complies with a defined

operational condition characterized by a given level of risk. Safety validation serves as public evidence

that a product (including functions and algorithms), or system meets a set of safety requirements

agreed by stakeholders.

One of the main objectives of the COVR project is to provide support for the validation of protective

safety measures. This entails assistance for creating the necessary documentation (e.g. checklists for

showing compliance with individual requirements), as well as step-by-step instructions for performing

the validation tests themselves. There is a lot of uncertainty in the robotics community concerning the

validation of systems. One reason is the complexity of robotics installations, as discussed in [4].


Furthermore, some of the standards applicable to robotics safety were written prior to the existence

of collaborative robots, such as ISO 13855.

The COVR consortium has applied two distinct methodologies for identifying protocols. On one

hand, by focusing on the two relatively mature domains of manufacturing and rehabilitation, we have

identified safety functionalities that require verification in the form of measurements that are defined

in the relevant standards. This methodology can be termed a “standards-based” approach. We

consider this approach to be valid as the required measurements for validation are the state of the art

and mandatory for collaborative robotics manufacturers and end-users, when relying on the given

safety skill. In order to investigate domains that do not yet have advanced standards, we used a

“bottom-up” methodology whereby collaborative robotics applications from specific domains are the

starting point for identifying protocols. Under this methodology, we briefly described specific

applications, identifying corresponding risks and hazards, before describing options for the risk

mitigation actions. These risk mitigation measures can be considered a form of safety skill, and

the means to validate their proper function under the appropriate conditions are the validation

protocols we are identifying.


Appendix 2

GLOSSARY

Activity A task that needs to be accomplished within a defined period of time or by a deadline to work towards a work-related goal or deliverable.

Award Responsible Partner (ARP)

The ARP will typically be the geographically nearest partner which can offer the necessary facilities to a given Award project. The beneficiaries will use COVR services primarily at that SSF. The ARP is responsible for communication between COVR partners and beneficiaries, for monitoring project progress, and for communicating any actual or probable problems to the COVR consortium.

Award Agreement

Legal agreement between the ARP and the Award winner. The Agreement regulates the commitments and responsibilities of both sides during the project work.

COVR Award An FSTP-funded project lasting for up to 9 months.

COVR Course A short course given by COVR consortium to a defined public.

COVR Shared Safety Facilities (SSF)

COVR physical site containing test equipment and physical access to services, where coboteers bring their cobot for test.

COVR Toolkit

This is a program developed in the COVR project to help people assess cobot safety by identifying which protocols need to be applied for a specific case/application. It is a single point of access (“one-stop shop”) program that uses a common approach to safety certification, valid across all people, all technologies, and all applications.

COVR Workshop A meeting for presenting and discussing results from COVR to an outside audience

Deliverable Term used in project management to describe a tangible result of a project, used partly for monitoring project activity and progress.

Directive

A legislative act that sets out a goal that all EU countries must achieve. However, it is up to the individual countries to devise their own laws on how to reach these goals. It is the second-highest-ranking legislative act in terms of binding force, after a Regulation.

In COVR we almost exclusively mean “product safety Directive” (so the Machinery Directive, the Medical Device Directive, the Personal protective equipment (PPE) Directive, etc).

Expert evaluator Person assessing COVR Award applications. These evaluators will have expertise in both robotics and safety and will not be employees of the COVR Consortium.

Milestone A tool used in project management to mark significant points along a project timeline e.g. the end of a phase, or a technology/information handover.

Protocol

A predefined procedural method used in the design and implementation of an experiment. A protocol is the collection of all the procedures and the prerequisites that are necessary to carry out safety experiments, together with the instructions for documenting results with respect to verification or validation objectives (verification protocol or validation protocol, respectively).

Steering Committee (SC)

A COVR management instrument consisting of one selected member from each of the partners within the COVR consortium. Its mission is to ensure that COVR is executed in accordance with the agreed work plans, is of high scientific quality, and that the results will be useful to the European robotics community.

Dominant technology

Used as a rough method for COVR to classify the robotic technologies used within an Award, the list of dominant technologies is intended to be industry agnostic and can be applied across many applications in many fields. One example is pick and place technology, where 3D bin picking is suitable in manufacturing, healthcare, agriculture and logistics applications.