

Discussion

Average time required to complete the task differed significantly among platforms:

• Doxy.me: The invitation email was very easy to comprehend, and no plug-ins were required. This likely contributed to Doxy.me's low task completion time.

• Vidyo: The invitation email contained multiple links, and plug-ins were required. In addition, Vidyo did not notify the doctor when patients came online, so the investigator informed the doctor each time.

• Polycom: The login screen contained a number of options for joining the telemedicine session. Many participants tried more than one option before clicking the correct one.

• VSee: Participants had to install the VSee desktop application upon clicking the invitation link. Not all participants realized that they were creating an account.

The majority of errors were made during the phase in which participants initiated the session.

• The high level of mental demand for Vidyo and VSee may be due to the laborious process of installing the plug-in and going through a set of steps to initiate the session.

• Frustration levels were very low for Polycom, Vidyo and Doxy.me.

• Analysis of the preference data indicated that participants preferred Doxy.me over Polycom, and Polycom over Vidyo and VSee.

Figure 5

Figure 4

Towards a More Usable Home-based Telemedicine System: Lessons Learned from the Usability Studies of Telemedicine Systems

SRUTHY OROZHIYATHUMANA AGNISARMAN 1, NEIL GRAMOPADHYE 1, KAPIL CHALIL MADATHIL 1, KEVIN SMITH 2, BRINDA FNU 1, BRANDON WELCH 2 & JIMMY MCELLIGOTT 2

Key Findings

• Complicated check-in processes associated with Vidyo, VSee and Polycom contributed to significantly higher mental demand.

• Participants appreciated that Polycom and Doxy.me did not require plug-ins or an application to be installed.

• The software installation process associated with VSee led to significantly higher mental demand and effort.

• The VSee client application had multiple windows (contact list, patient view, doctor's view), which confused participants. In addition, VSee did not have a full-screen button, so participants had to drag and enlarge the window. These factors may have contributed to the significantly higher frustration level associated with VSee.

• The overall satisfaction score was significantly lower for VSee than for Doxy.me, Vidyo and Polycom.

Limitations

• The study was conducted in the Mozilla Firefox browser on the Windows 7 operating system.

• Participants were primarily college students. We anticipate that the differences observed between the platforms would be even more pronounced with older and less educated users.

Results

Method

1. Participants were randomly assigned one of the four telemedicine platforms to begin with.

2. A researcher who acted as the doctor greeted the participant.

3. The “doctor” asked the participant to complete the following tasks:

(1) initiating the telemedicine session, (2) communicating with the doctor, and (3) concluding the telemedicine session.

Introduction

Home-based video telemedicine technologies can support post-discharge, home health and chronic care management (Bryant et al., 2015). However, few studies have evaluated the usability of such home-based video telemedicine systems. It is critical, for real-world applicability and user acceptance, to situate such telemedicine applications within the context-specific needs of the people benefiting from or otherwise affected by the technology. This study investigates the usability issues associated with four commonly used telemedicine software platforms (Doxy.me, Vidyo, VSee and Polycom) using task performance metrics and subjective measures.

[Diagram: Experimental setup — the participant and the researchers in two rooms (Room 1 and Room 2), connected through the telemedicine server.]

• Average time taken to complete the task
• Computer System Usability scale
• Number of errors made
• NASA-TLX test scores
• Final preference questionnaire

19 total participants*
*Recruited from Clemson University

4. The completion of each experimental condition was followed by a retrospective think-aloud session.

5. The participant was then asked to fill out the IBM Computer System Usability questionnaire (Lewis, 1995), the NASA-TLX workload instrument (Hart and Staveland, 1988) and a brief questionnaire on perceived usefulness and ease of use.

Procedure (Time to complete the experiment ~ 75 min.)

Dependent Variables (The usability of software was determined by analyzing these dependent variables)

Analysis

To determine significant differences among the telemedicine software platforms, a repeated-measures one-way ANOVA was carried out at a 95% confidence level. Post-hoc LSD comparisons were used to determine the locus of significant differences. Because of the nonparametric nature of the ranked preference data, the Friedman test and the Wilcoxon signed-rank test were used to identify significant effects for this metric.
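As an illustration, the analysis pipeline described above might look like the following sketch. The platform names come from the poster, but all data values are invented (5 hypothetical participants rather than the study's 19), and the repeated-measures ANOVA is computed from first principles rather than via the authors' statistics package.

```python
import numpy as np
from scipy import stats

platforms = ["Doxy.me", "Vidyo", "VSee", "Polycom"]

# Rows = participants, columns = platforms; completion times in seconds.
# These values are invented for illustration only.
X = np.array([
    [ 62, 140, 170,  95],
    [ 55, 150, 185, 100],
    [ 70, 133, 160,  88],
    [ 58, 160, 190, 110],
    [ 65, 148, 175,  92],
], dtype=float)
n, k = X.shape

# Repeated-measures one-way ANOVA from sums of squares:
# partition total variation into condition, subject, and error terms.
grand = X.mean()
ss_cond = n * ((X.mean(axis=0) - grand) ** 2).sum()
ss_subj = k * ((X.mean(axis=1) - grand) ** 2).sum()
ss_err = ((X - grand) ** 2).sum() - ss_cond - ss_subj
df_cond, df_err = k - 1, (k - 1) * (n - 1)
F = (ss_cond / df_cond) / (ss_err / df_err)
p_anova = stats.f.sf(F, df_cond, df_err)
print(f"F({df_cond}, {df_err}) = {F:.2f}, p = {p_anova:.3g}")

# Friedman test on ranked preference data (1 = most preferred).
ranks = np.array([
    [1, 3, 4, 2],
    [1, 4, 3, 2],
    [2, 3, 4, 1],
    [1, 4, 3, 2],
    [1, 3, 4, 2],
])
chi2, p_friedman = stats.friedmanchisquare(*ranks.T)
print(f"Friedman chi2 = {chi2:.2f}, p = {p_friedman:.4f}")

# Pairwise follow-up with the Wilcoxon signed-rank test.
w, p_pair = stats.wilcoxon(ranks[:, 0], ranks[:, 2])  # Doxy.me vs. VSee
print(f"Doxy.me vs. VSee: p = {p_pair:.4f}")
```

Note that sphericity assessment and the LSD post-hoc comparisons reported in the poster are omitted here for brevity.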

6. This was followed by the presentation of the remaining three software platforms, each followed by a retrospective think-aloud session and the completion of subjective questionnaires.

7. Upon finishing all the tasks, the participant completed a final post-test subjective questionnaire ranking the software platforms in terms of his/her preference (1 = most preferred; 4 = least preferred).
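The NASA-TLX instrument administered in step 5 yields six subscale ratings (0-100). The poster does not say whether raw or weighted TLX scores were used; the sketch below shows the simpler raw ("RTLX") variant, the unweighted mean of the six ratings, with invented values for one hypothetical participant.

```python
# NASA-TLX raw ("RTLX") scoring: unweighted mean of six 0-100 subscales.
# The ratings below are invented for illustration.
subscales = {
    "mental_demand":   70,
    "physical_demand": 10,
    "temporal_demand": 40,
    "performance":     25,   # 0 = perfect, 100 = failure
    "effort":          60,
    "frustration":     55,
}
raw_tlx = sum(subscales.values()) / len(subscales)
print(f"Raw TLX workload = {raw_tlx:.1f}")  # prints "Raw TLX workload = 43.3"
```

Hart and Staveland's original scheme additionally weights each subscale by 15 pairwise-comparison judgments; the raw variant skips that step.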

The study used two laptop computers, each with a built-in web camera. In addition to the printed test reports and other clinical parameters, participants were also provided with headphones, a microphone and a mouse. The experimental setup is shown above.

Apparatus

Participants

• All participants were aged 18+ years (mean age = 25.474, SD = 3.438)

• Compensation was a $20 Amazon gift card, regardless of performance.

• No participant had prior experience with telemedicine systems.

The study used a within-subject experimental design. All participants were exposed to all four software platforms* investigated:

Experimental Design

(1) Doxy.me

(2) Vidyo

(3) VSee

(4) Polycom

*To minimize order effects, each participant received the software platforms in a different order. The doctor sent an invitation through the telemedicine platform to the participant's email. Once participants completed the task on the first platform, they were assigned the remaining three conditions.
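The poster states that each participant received the platforms in a different order but does not say how the orders were generated. One standard scheme for four conditions is a balanced Latin square (a Williams design), in which every condition appears once per position and follows every other condition exactly once. The sketch below is an illustration of that scheme, not the authors' actual procedure.

```python
# Balanced Latin square (Williams design) for an even number of conditions:
# first row interleaves 0, 1, k-1, 2, k-2, ...; later rows shift it mod k.
def balanced_latin_square(conditions):
    k = len(conditions)
    first = [0]
    lo, hi = 1, k - 1
    while len(first) < k:
        first.append(lo); lo += 1
        if len(first) < k:
            first.append(hi); hi -= 1
    return [[conditions[(x + i) % k] for x in first] for i in range(k)]

platforms = ["Doxy.me", "Vidyo", "VSee", "Polycom"]
for i, order in enumerate(balanced_latin_square(platforms), 1):
    print(f"Order {i}: {' -> '.join(order)}")
```

With 19 participants and 4 orders, participants would cycle through the four rows, so each order is used 4-5 times.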

Task Completion Time: The results showed significant differences in the average time taken to complete the telemedicine session, F (1.881, 33.866) = 8.610, p = 0.001. Subsequent post-hoc analysis indicated significant differences between Doxy.me and Vidyo, between Doxy.me and VSee, between Doxy.me and Polycom, and between VSee and Polycom. (Figure 1)

Errors: The mean number of errors made by the participants did not differ significantly among the telemedicine platforms at the p < 0.05 level, F (3, 16) = 2.441, p = 0.102. (Figure 2)

Workload: The results showed significant differences in the workload experienced by the participants, F (1.565, 28.173) = 8.245, p = 0.003. Significant differences were found between Doxy.me and Vidyo, and between Doxy.me and VSee. (Figure 3)

Usability: The results showed significant differences in the overall usability reported by the participants, F (2.054, 36.964) = 11.382, p < 0.001. Significant differences were found between Doxy.me and VSee, between Vidyo (M = 3.936, SD = 0.282) and VSee, and between VSee and Polycom (M = 3.889, SD = 0.441).
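The fractional degrees of freedom reported above (e.g., F (1.881, 33.866) for completion time) imply a sphericity correction. The poster does not name the correction, but the values are consistent with a Greenhouse-Geisser epsilon applied to the uncorrected df of (3, 54) for k = 4 platforms and n = 19 participants; the sketch below reconstructs that arithmetic as an illustration, not as the authors' exact computation.

```python
from scipy.stats import f

k, n = 4, 19                          # platforms, participants
df1, df2 = k - 1, (k - 1) * (n - 1)   # uncorrected df: (3, 54)

# Epsilon implied by the reported numerator df of 1.881.
eps = 1.881 / df1
print(f"epsilon = {eps:.3f}")
print(f"corrected df = ({eps * df1:.3f}, {eps * df2:.3f})")

# p-value for the reported completion-time F under the corrected df.
p = f.sf(8.610, eps * df1, eps * df2)
print(f"p = {p:.4f}")
```

The corrected denominator df comes out near the reported 33.866, and the resulting p-value should agree with the reported p = 0.001 up to rounding.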

Figure 1 Figure 2 Figure 3

1 CLEMSON UNIVERSITY 2 MEDICAL UNIVERSITY OF SOUTH CAROLINA

REFERENCES

Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. Advances in Psychology, 52, 139-183.

Lewis, J. R. (1995). IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction, 7(1), 57-78.

Bryant, P. A., Ibrahim, L. F., Sacks, B., Golshevsky, D., Spagnolo, M., Layley, M., ... & Bryan, D. (2015). Acute medical review by mobile telemedicine for children in hospital-in-the-home: An innovation. Archives of Disease in Childhood, 100(2), 208-209.


DISCLOSURE: Dr. Welch, one of the authors of this poster, is a co-founder and shareholder of Doxy.me. He contributed to the research concept and design but did not participate in data collection or analysis.