Clinical Engineering and Physical Science
Consultation Response on the Standards and Criteria
January 2015, Version 1.1
Standards and Criteria
Introduction
Many thanks to all of you who took the time to respond to the iCEPSS standards consultation. The public
consultation ran for 12 weeks and closed on 14th November 2014. Over 20 bodies were contacted to inform
their members of the consultation.
During the consultation the following materials were made available:
iCEPSS standards and criteria
A letter explaining the consultation process and how to respond
The consultation questions
Responses were received from a variety of stakeholders including professional bodies, institutions and
individuals. There were a total of 42 responses, with 22 received through the online survey, of
which 6 were anonymous. A list is included in Appendix 2 for reference.
Overall, the response to the consultation was very encouraging and supportive. It has further highlighted a
number of key issues that need to be addressed, particularly in relation to the application of the standard
where services may be subject to overlapping standards. Key elements of the responses have been
tabulated in Appendix 1 for information.
The Academy for Healthcare Science is very appreciative of the willingness of other Professional Societies
and Royal Colleges to work with the Academy and to clarify the content and implementation of the
standard.
The Academy along with the Standards Advisory Group has carefully considered all the feedback received.
The following section outlines the specific response of the Academy to the key issues raised in the
Consultation.
1. Are the meaning and purpose of the standards clear?
a. An introduction to the Standard, its objectives, implementation and linkage to other
accreditation and certification standards will be developed. The Key Performance
Indicators (KPIs) or Quality Measures are currently under development and these provide
examples of the evidence that might be supplied to demonstrate compliance with the
standard. These will form the basis of the Self Assessment and Improvement Tool and will
be evaluated through the pilot phase.
2. Are these the appropriate accreditation standards to deliver Clinical Engineering and Physical
Science Services?
a. The Academy believes that these standards are applicable to services with and without
direct patient contact. All services ultimately have an impact on the quality and safety of
patient services and it is the intention of these standards to strengthen that linkage and
focus.
b. The Academy notes the need to ensure the linkage of the iCEPSS Standard to existing
certification standards wherever possible. The Academy is still reviewing the most
appropriate mechanism for implementation; this requirement will be an important
element in that process. It is proposed to develop a position paper on implementation and
linkages to existing standards.
c. The linkage to other international standards, e.g. ISO 55000, will be incorporated into the
Clinical Scientific Service domain and will be reflected in the KPIs where relevant. This
approach provides flexibility in the updating of the standard as local, national and
international standards develop.
d. The Academy will include a standard/criteria relevant to the requirements of Medical
Device legislation. This is acknowledged as an omission in the Standard as drafted.
(Highlighted in KMS)
3. Will implementing an independent peer review assessment against these standards improve
quality in Clinical Engineering and Physical Science services?
The majority view was that the implementation of these standards would improve quality, although
the extent would depend on whether services already had well-developed Quality Management
Systems and associated certification. The focus on the scientific and engineering aspects of the
services must be maintained. The accreditation process must be independent and be based on peer
review. The availability of appropriately skilled and trained peer reviewers would be vital to the
credibility of the Standard. There was a minority view that the process would consume scarce
resources that would be better focussed on service delivery.
4. Are there any standards that require particular attention and why?
Feedback suggests that simplification through the amalgamation of Standards is possible. This will
be considered by the Standards Advisory Group.
There is a general requirement to state which regulatory compliance is required or expected for
specific standards. This will be included in the Knowledge Management System as supporting
evidence to the Key Performance Indicators. These most probably need to be captured in relation
to the scientific and engineering services domain as well as the overarching standards. The KMS will be
developed to provide this background information.
We believe that issues of compliance with the Data Protection Act are covered under MFR10 C3.
We believe that the COSHH regulations are covered under SRM3 C3.
The wording of MFR2 C3 will be strengthened to support ‘quality improvement’. Explicit reference
to the components of a Quality Management System will be considered to strengthen this aspect of
the Standard. Links to the requirements of a QMS will be contained within the Knowledge
Management System. Criteria have been re-ordered to enhance emphasis and linkages.
MFR7 C6 is primarily aimed at governance processes for the management of income from whatever
source, i.e. is there evidence of probity in the management of finances?
We believe that Domain 1 is applicable to all services inasmuch as all services must actively
consider the elements of the standard. As an example, a service may not need to be engaged with
patient consent; however, there needs to be a responsible individual who is aware of these issues
and capable of considering the applicability of organisation-wide policies to a specific service. How
a service links to other professions within an MDT is important to validate the quality of a service.
Domain 4 has been revised to ensure a better linkage between standards and criteria. Issues of
business continuity planning are addressed under MFR10 C9. A requirement to implement a
Medical Device management system has been included in SRM1.
MFR6 C3 requires a strategic approach to service planning and is thus included in the Standard.
WT3 C4 has been reworded.
The feedback regarding CSS1 C3 and the need to capture activity seems fundamental to justifying a
resource. The Standards Advisory Group believe consideration should be given to capturing activity
of any kind as a means of quantifying resource utilization and efficiency.
The WT1 definition has been modified to remove the word ‘recruit’.
SRM3 has been reworded for consistency.
SRM2 C5, SRM3 C6, SRM4 C5 and SRM6 C7 have been removed as a simplification.
The request to change the title of the Clinical Scientific Service domain is noted; it will be changed
to ‘Clinical Scientific or Engineering Service (CSES)’.
MFR1 C8 has been moved to MFR4 for consistency.
5. Could the standards be improved, and if so, how?
The key feedback is to devise a simplified structure that avoids multiple standards for closely
related subject areas. This will be considered carefully by the steering group and tested during the
pilot process.
The majority of the comments related to the need to ensure that the Key Performance Indicators
were explicit about the standard of performance required to meet a Standard/Criteria. We believe
that this is the case and these will be tested further through the pilot phase.
6. Are there any gaps in the standards?
No major gaps were identified in the consultation. The issues raised have been reviewed and in
most cases guidance has been made explicit through the Key Performance Indicators and the
supporting Knowledge base. There are a number of issues which will be addressed as follows:-
a. To ensure that service specific issues are clearly stated in the clinical scientific service
specific domain.
b. The wording and requirements of the standard will be checked against the CQC
fundamental standards.
c. The requirement for system wide learning is noted. This has been incorporated in the KPIs
as an important element of risk and incident management. This element will be
strengthened.
d. A section on R&D and Innovation is included in the standards. The requirement to
adequately disseminate findings is noted and will be added.
e. The issue of business continuity planning and testing is included in the KPIs for relevant
standards and criteria.
7. Are the criteria useful?
A number of respondents felt unable to comment on this question without sight
of the KPIs, the expected evidence and the Knowledge Management System. This is accepted and
further testing will be undertaken through the pilot process. The use of the term ‘systems are in
place’ was seen as too general a statement. It was felt that often these were organisation-wide
systems and not service-specific systems and therefore not relevant. The key issue here is that
individual services need to be seen to be compliant and engaged with organisation wide systems.
The request to use the word ‘monitor’ is noted. The KPIs usually require the service to be auditing
compliance with systems and thus monitoring their service. We believe this area is adequately
addressed but terminology will be reviewed.
8. Are there any criteria that need particular attention and why?
Several responses suggested that a number of the standards are very similar and
that a degree of simplification could be undertaken. This feedback will be considered carefully and
implemented where appropriate.
A number of Standards and Criteria have been revised as follows:-
a. SRM3 and associated criteria have been updated to include a specific reference to
radioactive substances.
b. The wording of PSUE4 C6 has been changed to be less specific in the light of changing
legislation and guidance thus permitting services to demonstrate compliance as
appropriate.
c. The request to add a standard around signage and access points is noted and MFR3 and
related standards will be reviewed. (MFR3 C2 updated).
d. The request to update MFR 9 is noted. This has been implemented in the relevant KPIs and
Example Evidence statements.
e. The Academy believes that MFR3 and MFR4 can be applied appropriately to all services.
The use of organisation wide systems is appropriate provided evidence of compliance and
audit are available.
f. The wording and interpretation of MFR1 C7 will be reviewed. This has been moved to
MFR2 and linked to the Quality Management System.
g. MFR3 C2 is a general requirement for staff to be aware of the parameters within which
they are required to work. We do not believe multiple issues are confused in this construct.
To separate out the elements would produce significant repetition.
h. MFR3 C2. The fact that a facility does not fully meet user requirements does not preclude
accreditation. How the service is dealing with these issues is key to service improvement.
i. MFR4 C4-6. The relationship to the service specific domain will be considered and tested
during the pilot phase.
j. The purpose of WT6 will be clarified.
k. SRM1 C2: we believe this addresses the issue of risk as distinct from incidents. The use of
formal risk management methodologies is covered in the KPIs. This section has been
reordered to clearly separate incident management from risk management.
9. Do you think assessment against these standards using a ‘Self Assessment Tool’ will improve the
quality of the service delivered?
There are really two key comments here. Firstly, these need to be as simple as possible without
undue repetition. Secondly, service improvement will be better served by ensuring that there
are clear statements of service standards. This latter comment is being carefully considered in
relation to the specific scientific and engineering service domain.
Further comments.
1. The Academy accepts the need to ensure consistency with the CQC fundamental standards as
published and to ensure the standards are compatible with the need to deliver Safe, Effective,
Caring, Well Led and Responsive services.
2. The need to define how standards fit together to avoid duplication is accepted and further
discussions will take place with other professional bodies and organisations to clarify these issues
and provide clear guidance to services.
Appendix 1
Are the meaning and purpose of the standards clear?
The text reads clearly but the relationship of the standards to existing
quality and accreditation systems needs to be clarified.
The purpose of the standards is not made explicit in the consultation
document. The SCoR would value seeing a definition of standard purpose
both in terms of the workforce it covers, the services it covers and how it
complements/supports other accreditation standards within the same
service delivery areas. E.g. ISAS, CHKS Oncology Service Accreditation etc.
E.g. is it expected that all Clinical engineering and Physical science services
being delivered within an institution will apply for 1 accreditation or can
elements such as the hearing aid service or medical physics service apply for
discrete accreditation?
A single page summary of the objective of the standard (plain English, no abbreviations), outlining each of the Domains, is required to introduce the document.
‘The standards are clear but often the criteria is ambiguous and repetitive.’
‘The standards seem idealistic and so what degree of compliance are services
going to have to reach, particularly as some standards do not seem readily
auditable? It would be helpful if the criteria are backed up with appropriate
reference material or guidance to address the issues of what was expected. ‘
‘The rationale for the exercise is not adequately explained.’
Some of the standards have the potential to be 'abstract' and more relevant to some MPCE service areas than others. It would be useful to include some illustrative examples of what might be expected as evidence of compliance.
1. Are these the appropriate accreditation standards to deliver Clinical Engineering and Physical Science Services?
Some services are already accredited using alternative standards and to add
a new (and comprehensive) set of standards becomes onerous unless
duplication can be avoided.
For example Radiotherapy Physics has ISO 9000 in place in conjunction with
the Cancer Peer Review standards.
The other consideration is that very often the standards reflect what is
already covered within existing Trust procedures.
It is not made explicit how services that do not directly interface with
patients will achieve the PSUE domain. It is unclear how accreditation will
apply where areas such as informed consent are managed and delivered by
staff other than those in the clinical engineering and physical science work
force. This may apply to many of the more ‘clinical’ standards.
‘The standards cover broad areas, and so would not be seen as being
specific enough for some patient based services involving multidisciplinary
teams and where specific regulatory requirements have to be met.’
‘The focus seems skewed towards local management arrangements within
the department and insufficient amount on assessing the quality of the
scientific services delivered.
Some of the wording makes the criteria more accessible to departments
with direct patient contact. The idea of the client being another department
or trust is not obvious: could be resolved with a glossary of terms. ‘
Not all of the standards are applicable to all services. There should be an option to opt out of non-applicable standards such as consent. I think the idea of a set of standards is an excellent one but they should be sufficiently adaptable that smaller services see the value of subscribing to them. It is clear that the standards have been written with a particular bias around medical physics which may be off-putting to smaller ME/EBME/Clinical Engineering departments who perhaps would have the most to gain from a low cost alternative to a full ISO9000 accreditation.
There is no reference to the MHRA or medical device legislation. We are obliged to meet NHS/PSA/D/2014/006 - Improving medical device incident reporting and learning. This should be referred to specifically Can the Standard explicitly recognize/link to the ISO55000 standard that is particularly relevant to Medical Equipment Management
Needs additional clarification of which standards are not applicable to non-patient-facing services, e.g. radiotherapy physics, diagnostic radiology. At this stage the criteria are fairly general, which they need to be due to services varying considerably within the Clinical Engineering and Physical Sciences banner. The service-specific Key Performance Indicators (KPIs) will be the most important development of the standard. KPIs have the ability to quantify the standard of service required (and hence drive up quality). However a difficulty may be ensuring they align with current local NHS Board/Trust KPIs already in place (these may sometimes differ from national targets). Additionally some services may already have established accreditation schemes in place and therefore will not want to take part in new ones - for example gait/movement analysis labs are accredited through the Clinical Movement Analysis Society (CMAS)
Although, often other standards would be equally appropriate e.g. ISAS or IQIPS. Whilst it is clear that these standards have been considered in the development of these iCEPSS standards it may be confusing to services which standards they should adopt e.g. where Medical Physics includes nuclear medicine services or physiological measurement services, should they not also meet the standards within ISAS or IQIPS respectively? Perhaps where specialism-specific standards exist within Medical Physics Departments, the iCEPSS document should state that other standards should be met, leaving the iCEPSS document for those specialisms which do not have specialism-specific standards. Also, where medical physicists work under other department/directorate umbrellas e.g. oncology, should the iCEPSS standards apply to the medical physicist contribution?
On first impression they seem to be much more strongly-weighted towards management than science – I was anticipating the reverse.
2. Will implementing an independent peer review assessment against these standards improve the quality in Clinical Engineering and Physical Science services?
There may be increases in quality for services not already operating
quality/accreditation systems but for those already operating such systems
the improvements in the clinical service may be hard to quantify.
A clear focus on those services/aspects of service delivered by Clinical
engineering and Physical sciences must be maintained.
The peer review element provides external assurance that these processes
have taken place; the peer review process itself must be sufficiently
robust, well managed and independent, with adequate numbers of
reviewers trained/assessed to an appropriate standard, for it to be
effective and its outcome trusted by both service users and those working
within services.
‘Yes – although it will be dependent on whether it is a voluntary scheme or
compulsory.’
‘In many instances it may not as they are excellent centres anyway. In
smaller departments it will improve the quality as there has been no
requirement to undertake the available audit tools that exist.’
‘For those departments who are not externally accredited already, yes, peer
review and external accreditation (to specific standards) will result in an
improvement in quality – if only through encouraging reflection on the
service offered. For those departments already accredited to an alternative
standard this will be more of a box ticking exercise that will require
additional resource for little benefit.’
‘Any review process that forces a department to consider its service from
the user point of view will lead to the identification of deficiencies, however
minor; yes, independent peer review assessment against these standards
will improve quality.
Some of the criteria refer to generic trust wide issues, outside the control of
an individual department. It would seem unjust to fail a department, which
gives a clinically/scientifically excellent service because of failures in the
upper echelons of the organisation. Also would we as clinical scientists have
enough experience of these higher, trust wide documents to say if they are
fully compliant or not?’
No - it will divert a huge amount of time away from delivering quality scientific services to delivering reams of paperwork and ticking boxes
Making the standards widely applicable to include as many services as possible should be prioritized over making the standards as comprehensive as possible. It is better to get more departments thinking along the lines of quality and safety standards than to ensure all the requirements of any one field are comprehensively met
Any accreditation standard will improve quality for those participating. I'm not sure how peer review would work though. I thought external auditors would assess as ISO9001. Peer review would entail a structure of expenses/time off etc. Also, what would happen to those departments who (for whatever reason) decided not to apply for, or continually failed to improve?
Yes, this is vital as it will open up communication channels with other
services, enable sharing of ideas, and learning from each other. This will make accreditation more than just a badge of quality but a learning experience. It will also ensure the auditor has an understanding of the service. It would be very advantageous where possible if geographically local peers carried out a review. For example, if assessors from Scottish services peer review other Scottish services, South Eastern England services peer review other South Eastern England services, etc., as their services may be designed differently according to geographic location and they may have common local issues on which they may be able to work together.
Not in the case of departments that are already fully covered by other accreditations and certifications
We would welcome this but strongly suggest that a pilot self-assessment project is undertaken to refine the applicability and clarity of the standards.
3. Are there any standards that require particular attention and why?
Domain 2 Management, Facilities and Resources contains three standards
dealing with improvements - 2/8/9. Amalgamation/simplification should be
considered.
A statement of which legislation/professional guidelines etc. apply will be required.
e.g. MFR1 C5 talks about regulatory compliance – will what this is be stated
or user defined? MFR10 currently has no mention of legislation, but data
protection needs to conform to the requirements of the Data Protection Act;
SRM3 does not refer to the COSHH regulations. MFR11 C4 relates to
regulatory compliance for R&D – will what that is be specified?
MFR2 domain: criterion C4 would be better as C2, and strengthened by the
addition of ‘with protected time to enable the role to be effective’ or
something similar.
Although risk and error management are covered in SRM1, criteria are
needed here relating to other key quality management tools i.e. a formal
document control system, deviations policy. I think MFR2 C3 would be
strengthened by adding ‘implementation of quality improvements’ as well
as quality objectives, with QI monitoring left in C6.
Protocol development is mentioned in CSS1 C4 but should cross reference
to the QMS document control system
CSS1 C5 criterion would fit better in this section – accessibility of
documentation/protocols is an overarching attribute of a QMS
In MFR1 C2 ‘required competence’ is vague – is ‘required’ user defined or
specified in guidance/KPI?
MFR 7 C6 – concepts around income generation and charitable donations
seem at variance. One may be system organised and closely regulated as
part of budgetary systems another may be ad hoc. Is this about suitable
handling of monies or use of such monies once generated? Does it refer to
individual donations or charitable purchase of large equipment items? How
does it fit with a business planning scenario that may rely on public
donations e.g. scanner appeals? The SCoR is unsure as to what this criterion is
assessing/assuring.
Some of Domain 1 will not be applicable to those CEPSS which are not
directly patient facing – this will be a challenge when a standard covers
disparate services some of which are support services with no direct patient
contact e.g. Clinical Engineering, and some clinical such as Nuclear
Medicine. There are also elements within the PSUE domain that may not be
the responsibility of the CEPSS but another member of an MDT involved with
the patient, i.e. PSUE4. There is significant overlap with other accreditation
schemes and for patient assurance and appropriate governance of quality it
must be very specific as to what elements are/are not being assessed.
The SRM1 section is muddled and appears ‘light’ compared to the other
domains. It should address firstly having a pro-active approach to risk
management, and secondly a reactive approach to incident/error
management and prevention. It would be improved if the order of SRMC1 &
C2 were reversed, to be followed by ‘Systems in place for systematic
assessment and management of risks associated with the service’ – which
would make the current C9 superfluous. The wording of C4 ‘risks regularly
audited’ requires expansion.
SRMC5 would fit better as the final criteria in this domain.
The wording of the criteria in this domain is not consistent, i.e. C6 & C7 do
not refer to systems but actions.
Several other criteria also relate to incidents and errors, i.e. SRMC6 for H&S
errors, SRM4 C5 for moving and handling, SRM5 C4 for violence and
aggression, etc.
If the SRM1 domain is adequate, is this duplication necessary?
Domain 4: Consideration could be given to an additional standard on business continuity planning for services in case of critical failure or plant shutdown, i.e. SRM7 ‘The service implements and rehearses plans for business continuity in the event of critical failure such as power loss, water supply loss, critical shortage of staffing etc.’
Domain 4 should include a specific reference to how a service complies with the requirements of NHS/PSA/D/2014/006 – Improving medical device incident reporting and learning. There should be reference to the requirement for reporting incidents to the MHRA and to systems for responding to manufacturer field safety notices.
The standard does not explicitly seek to ensure that the service aligns to the strategic needs of the organization. Perhaps this could be added into Domain 1 by adding a standard PSUE6
Some of the standards quite rightly highlight the patient. However, there are some disciplines that do not have as much direct contact with patients as others and the standards must take this into account. [Prescribing and Adverse reactions for example]
I think that MPE or equivalent should be mentioned so a department knows who their MPE is and how to contact them if there are any issues.
WT3 C4: The phrase "which include the involvement of senior managers" seems somewhat superfluous - can the standard be achieved without involving this group? CSS1 C3: I'm not quite sure what would be expected to meet this standard. Does it relate to capturing a request for service support, being able to monitor progress and evidence completion of support? Seems reasonable, but if we get a request to support, say, theatre staff with a radiation protection concern, we just do it - there's no system to capture this activity. To do so would seem unnecessarily bureaucratic.
WT1 includes recruitment but none of the criteria mention recruitment. Recruitment is referred to briefly in the criteria for WT2. Should equality be a criterion for recruitment? Similarly SRM3 covers radiation, hazardous substances and materials but radiation is not mentioned at all in the criteria. There are also many minor possible considerations. There is a great deal of repetition, e.g. every standard starts with "There are defined roles and responsibilities for professional leadership and management to ensure that....." So much repetition leads to a "tick box" mentality. Some criteria are almost identical, e.g. PSUE2 C2 is very similar to PSUE1 C6, and SRM2 C5 wording is identical to SRM1 C3, which covers all incidents - no need to repeat for specific hazards. Some could be combined, e.g. PSUE2 C4 and C5.
Domain 5 primarily refers to "Clinical Scientific Service". It would be better to broaden this to "Clinical Scientific & Engineering Service".
Doing so would remove any misunderstanding that the standards required were applicable mostly to clinical science.
MFR1 C8 refers to 'technology suppliers'. Is this technology (a) used by the service to deliver services or (b) that in the hospital which the service may have a responsibility for maintaining? If (a), perhaps this is best under MFR4. If (b), the service may not have overarching responsibility for selection. I cannot see reference to implementation of safety notices for technology used by the service. Perhaps reference to this could go in Domain 4 or MFR4. Ensuring patient/service user requirements are met is fundamental to service delivery. Should PSUE1 C5 perhaps be PSUE1 C1 or C2?
4. Could the standards be improved, and if so, how?
I think they could be improved with a simplified structure where a core standard deals with the more generic issues (improvement, defined roles & responsibilities, monitoring service delivery etc.).
‘To help improve clarity, a list of examples might be useful.’
‘Sufficient detail within the clinical/specific nuclear medicine section should
be given. Therefore we would like to see some careful thought about the
Clinical Scientific Service domain and the relevant KPIs. By doing this the
proposed scheme will become stronger and could address many of the
issues raised by the BNMS as part of this consultation.
Where there is a requirement to comply with regulations – i.e. you need to
hold an ARSAC certificate(s) or have an MPE – the standard and associated
KPI must be clear this is needed. It must be highlighted so that non-compliance is a show-stopper.’
Yes - they need thinning out considerably, and turning into something that will help our services to patients…
Numbering is a bit weighty and could be simplified. Some examples of evidence sheets could be developed, especially as this is a new concept to some departments. For example: SRM6 C2 – There are systems in place to define, assess and manage general health and safety risks. Examples of evidence: risk assessments, business continuity plans, fire risk assessments, manual handling assessments, COSHH assessments and MSDS.
Some guidance on the type and detail of evidence could be provided for each criterion. Perhaps some guidance to indicate if a criterion is only applicable to a particular discipline could be provided.
The standards are consistent with ISAS and IQIPS although it is noted that the Workforce and Training element has been separated. It is assumed that these criteria will be applicable to a service that is delivering training to its employees; it is not clear whether these also apply to any training provided to staff not employed by the service and whether there is an expectation that such a service would be 'accredited'.
7. Are there any gaps in the standards?
Domain 4 is interesting in that it is very specific about certain hazards
(which could be handled more generically) and omits others, such as the
hazard of high voltage equipment. I think you’ve got to either cover them all
(which is difficult) or make it more generic so the user responds with those
that are relevant.
The document also appears to place less stress on radiation safety than on other issues, despite many departments providing this advice, while other issues (aggression, etc.) are normally dealt with at Trust level.
‘There are a lot of “service” standards and few “development” standards; will iCEPSS support or stifle service development?’
‘The inclusion of a standard within Domain 4 that sets out a requirement for the service to have an audit plan/programme should be considered.
Unless I have missed it, I cannot see any reference to incident reporting, investigation and learning.
All the standards should be checked against the wording in the new CQC fundamental standards that were published on the CQC website on 11 November.
Consideration should be given, probably in Domain 4, to a standard that sets out the requirement to have a system for learning, both from within the service, from the entity within which it operates and from the wider Clinical Engineering and Physical Science community.’
I would like to see a section on R&D to ensure it is coordinated and managed effectively and to ensure findings are disseminated beyond local interest.
Domain 4: Consideration could be given to an additional standard on business continuity planning for services in case of critical failure or plant shutdown, e.g. SRM 7 – ‘The service implements and rehearses plans for business continuity in the event of critical failure such as power loss, water supply loss, critical shortage of staffing, etc.’
There will always be specialism-specific standards that could be created, but I think that the overarching generic standard approach should cover such aspects - but they may not be picked up by audit without a specialism-specific question.
Possibly - I wonder if the role of project management, often led by physics staff in my experience, should be addressed more specifically. This may be covered by service improvement but more detail may be useful.
8. Are the criteria useful?
The biggest question any service provider has, when reading the iCEPSS criteria, is ‘what do we have to do?’ This being the case, it becomes difficult to consult fully on the presented document without seeing the kind of support the ‘Self Assessment Tool’ and ‘Knowledge Management Resource’ will provide.
‘Yes but the Standards all reference the word monitor but the criteria
generally include the words “systems in place.” My comment is that
monitoring may be implicit in this but could it be explicit?
Presumably there is an expectation of auditing and where necessary
corrective action.’
9. Are there any criteria that require particular attention and why?
Criterion SRM3 – ‘The service implements and monitors systems to manage the risks associated with hazardous substances, materials and radiation.’ It is unclear whether ‘radiation’ refers to sealed and unsealed radioactive materials (i.e. those with the potential to be lost or to cause contamination) or if it is intended to cover the generation and use of all ionising and non-ionising radiations. I suspect the former given the COSHH-like bias of the criteria, so a phrase like ‘hazardous substances, materials and radioactive materials’ should be used in the criterion statement (and in points C1-C3, as ‘radiation’ disappears). However, if it is the latter then alternative phrasing should be sought to make this clear; alternatively, separate criteria for the
use of radioactive materials and equipment capable of generating ionising and non-ionising radiations may be appropriate.
‘PSUE C2 – it says “ensure … accessible to all patients”. I do not think that
we can guarantee that – access to services is so dependent upon funding
constraints, for example.
PSUE 3 – not sure at the moment how this can be demonstrated and
measured. Perhaps through questionnaires. Our Trust has a new set of
‘values and beliefs’ along similar lines and, again, how we can meet and
show them is being discussed.
MFR 3 – could perhaps include criteria relating to signage and access points for patients, i.e. can a patient easily find the service and, when there, can they get in easily!
MFR 9 – to also include appraisals as a means for staff to make suggestions
for improvements.’
‘MFR3 & MFR4 could be clarified for small professional groups who do not have involvement at this level.
Clarification is required as to whether individual profession-specific evidence is required or whether following generic Trust policies is enough.’
‘Patient and service user experience
Some standards/criteria are very similar and could be reduced in number
PSUE4 C6 Is it not a routine assumption now, in the NHS, that anonymised
data will be used for teaching, training and research?
Management, Facilities and Resources
MFR1 C7 Monitoring everything for relevance does not seem realistic or
achievable
MFR1 C8 Selection of suppliers is sometimes outside of our control
MFR2 C2 Staff awareness of importance of meeting needs of users should not
be mixed with regulatory requirements
MFR3 C2 This seems unrealistic – we can have systems to monitor our facilities, but we don’t have enough space; does that mean we should not try to achieve accreditation?
MFR4 C4-6 Should these be in the Service section?
MFR7 C5 Management of contractors is done separately from budgets. The
standards should ensure that contractors have Service Level Agreements and
Technical Contracts.
Workforce and training
WT6 This seems to face the wrong way – we are generally given the curricula
and training requirements from external bodies
Safety and Risk Management
Risk management is really important for a quality system and this probably
should be developed more (see ICH Q9) and possibly separated from Health
and Safety.
Clinical Scientific Service
There seem to be few actual service standards, which does not seem right given the detail in the previous Domains.’
9. Do you think self assessment against these standards would help improve the services delivered?
If the standards can be simplified and made more user friendly a self
assessment tool is a great idea.
Definitely.
Yes, definitely. The Institute of Medical Illustrators quality scheme has also started to collect exemplar evidence for the education and support of smaller departments who might struggle to demonstrate best practice.
‘Yes, but there will be a huge resource requirement put upon services to self
assess correctly. Smaller services will not be able to provide this within
their current work capacity and will therefore require additional
resource. Accreditation will not be cost neutral and needs to be stated as
such.’
SAIT will help with benchmarking the service and formulating a plan to then make quality improvements.
A self assessment tool will allow you to identify the gaps between your service and iCEPSS as part of an initial assessment/implementation process. As with the implementation of any QMS, it will depend on where you are starting the process from.
Yes, I would suggest accreditation should not be awarded unless a self assessment has been completed prior to the external assessment and enough
time has elapsed for improvements to be made where self assessment highlighted areas needing attention. This will ensure the external assessor assesses a service that is organized and feels it is in a fit state for accreditation.
It would if there were more detailed guidance providing physics standards in relation to acceptable treatment planning practice, dosimetry in all radiation areas, and the need for external audit of calibrations and processes; but the standards as they are at present, without more detail, would not change practice or unduly test the practice in this hospital as far as I know.
10. Would you be willing to participate in an accreditation scheme based on these standards?
Yes. The caveat we have, though, is that we already have ISO 9001 in place and at times we are questioned as to its relevance and significance. These questions are partly prompted by the fact that it also incurs a cost.
11. Do you have any other comments?
‘The document seems to place more emphasis on the clinical engineering
end of medical physics than the ionising radiation end. This may be
deliberate given existing standards.
recommendations, which is unusual for an IPEM document.
It would be useful to have an impact assessment in terms of the additional
human resource required to set up and operate this standard.’
‘The clear DoH hope is to have a one-size-fits-all approach to quality in Healthcare Science. Having damaged MP&CE training through this approach, it now needs to be prevented from disrupting our approaches to quality.
In his Five Year Forward View, Simon Stevens says: “7. England is too diverse
for a ‘one size fits all’ care model to apply everywhere. But nor is the answer
simply to let ‘a thousand flowers bloom’. Different local health communities
will instead be supported by the NHS’ national leadership to choose from
amongst a small number of radical new care delivery options, and then
given the resources and support to implement them where that makes
sense.”
Let us take this idea and say “MP&CE is too diverse for a ‘one size fits all’
quality model to apply everywhere. Different departments will be
supported by the AHCS and IPEM to choose from appropriate standards
models, and then given the resources and support to implement them”.
The speedy introduction of generic standard systems is unlikely to happen,
with departments stretched already to meet the demands of the day job.’
The SCoR is grateful for the opportunity to provide feedback through the
consultation process and would be willing to provide further collaboration
on the integration of the various sets of accreditation standards across
imaging and radiotherapy services.
‘We are aware that for medical physicists who are working with colleagues
in imaging departments that may be seeking or may have achieved ISAS
accreditation, there would be some overlap between the standards.
This could be a problem if staff/departments feel they are being subject to
double accreditation, but it could conversely be beneficial if relevant
information/evidence from one scheme could be used as evidence for the
other. This could also help with the aligning of accreditation schemes and the work that is being undertaken to achieve this.’

‘Where does iCEPSS fit with departments already externally accredited, for example ISO 9001 Radiotherapy Physics departments? Where does iCEPSS fit with departments that are already part of a peer review process? The current standards and criteria will require significant interpretation as part of the peer review process, which will impact upon the standardization of the accreditation process. Will the standards support or stifle service development?’
I am very much in favour of standards and welcome the opportunity to participate. I do feel, though, that in order for the standard to succeed it needs to move quickly from a voluntary self-assessment tool to an externally audited one. A step of peer review in between might be useful, as proposed, but obtaining time off to do this may present problems unless it is of a reciprocal nature. The issue of non-participation needs to be considered. Whilst I hope all would join in, I suspect that some little empires may choose not to, citing odd reasons. Are there going to be mandates issued about this in the future once things have settled in? I presume that these standards will apply to all suppliers of healthcare science and not just NHS departments? How and what arrangements would be made for their assessment? I think it is important that these standards are seen as a positive step and not ‘just another compliance’. When ‘marketing’ the standards and self assessment tool, people need to be hopeful of attaining a level of accreditation and not feel that the task is too great. I am suggesting that training days, support groups, forums and shared documents, good practice, etc. be considered! Exciting stuff!
I think there is a bit too much focus on budgets, etc. rather than the scientific focus of the service. I'd like to think that departments which have clinical scientists on site have good systems in place, but those that don't should have evidence that they have some clinical science support and optimization.
Frequency of assessment would be my only concern. As mentioned, the CMAS accreditation involves a yearly external assessment and a yearly internal assessment; this is a huge burden on the services, as there is a fair bit of preparation required prior to assessment. I feel one external assessment every two years would be reasonable.
1. This document and the entire project appear to be predicated on the idea that current quality systems do not guarantee that services deliver high quality patient-centred outcomes. In the case of ISO 9000-type services, contacts with senior members of the CSO team have yielded comments such as “ISO 9000 simply requires services to document what they do and then prove that they do what their documentation states, without sufficient emphasis on whether the outcomes are good/bad, right/wrong”. Our view is that this is an outdated and incorrect assessment of what a modern ISO system delivers, and that this now has a clear focus on assessments of customer/service needs and feedback to ensure these are properly met. Of course there will be many small service providers who do not have established quality systems, so perhaps this could be valuable for them.
2. Within our Medical Physics there are four main service areas:
• Radiotherapy Physics
• RRPPS
• Nuclear Medicine
• Clinical Computing and Imaging Sciences
Of these, two are already fully covered by long-established ISO 9000 systems, as follows:
a) RRPPS has its own ISO system, inspected by BSI and covering all major activities.
b) Radiotherapy Physics is fully integrated into the Radiotherapy ISO system, which covers all major activities.
The Nuclear Medicine Service, at least in respect of the imaging activities (which represent most of the service), will be incorporated into the Imaging Service Accreditation Scheme (ISAS, see https://www.isasuk.org/default.shtml), which is being rolled out nationally and is likely to be a major project for the imaging service over the coming years. We have been gradually initiating a project to scope a formalized quality system for the activities of the Clinical Computing and Imaging Sciences team.
3. There appears to be massive overlap between the requirements of this proposed accreditation system and other processes with which we are already actively engaging and expending considerable effort. These include:
• Cancer Peer Review
• NHSLA
• CQC Outcomes
4. Some of the issues suggested for inclusion in the consultation document do not relate specifically to Medical Physics but rather to the Trust as a whole (e.g. managing patient feedback, ensuring standards of privacy, dignity etc. are met, informed consent, staff suggestions, information governance, R&D governance, health and safety, etc.). This is linked to 3 above.
5. The website suggests that linkages to existing ISO-type systems will be important and that there will be support through a self-assessment tool. If this is sensibly configured then accreditation might become available relatively easily to departments with established certifications. This is mentioned in the summary leaflet at http://www.ahcs.ac.uk/wordpress/wp-content/uploads/2014/06/iCEPSS-A4-leaflet.pdf We would suggest that for departments that already have full coverage of other external certifications, the process of gaining accreditation should be as simple as the preparation of an annual statement that these certifications have been maintained.
6. Overall, we need to understand how extensive the work is likely to be for services which already have established quality system certifications, and we need some clarity on the benefits that are likely to follow.
Clinical Engineering and Physical Science covers a diverse range of services, some directly patient-facing, others completely remote. Many services fall under different management structures, with radiotherapy and/or imaging being separated in hospital management structures. These factors complicate any accreditation and the process associated with it. The scheme MUST complement any existing ISO accreditations. Trusts are unlikely to entertain much, if any, additional expense.
Appendix 2
Consultation Respondents
Edwin Claridge – Clinical Scientist
Royal College of Radiologists
Institute of Medical Illustrators
British Nuclear Medicine Society
Engineering Advisory Group – Institute of Physics and Engineering in Medicine
Institute of Physics and Engineering in Medicine – Science, Research & Innovation Council
Glenn Pascoe
John Amoore – Clinical Engineer
Alyte Podvoiskis – Quality Manager
Paul Blackett – Clinical Engineer
Bob Perkins