Medical Handsfree System - Project Paper


Medical Hands-Free System

Authors

Avraham Levi, Guy Peleg

Supervisors

Ron Sivan, PhD, Yael Einav, PhD

Abstract - Contemporary Human-Computer Interaction (HCI) is based on devices such as the keyboard, mouse, or touchscreen, all of which require manual contact. This imposes a serious limitation on medical equipment and applications where sterility is required, such as in the operating room. Studies report that keyboards and mice harbor more bacteria than lavatories. Our project focuses on applying a hands-free motion detector to surgical procedures. In this setting, the application enables a surgeon to account for surgical items so that none is left inside the patient by mistake. The application uses a hand motion detector made by Leap Motion, Inc. We hope it will solve this issue and, furthermore, introduce the concept of hands-free control to the medical world.

Keywords - Human-computer interaction (HCI), Leap Motion (Leap), Infra-red (IR), Gesture, Image moments, Retained foreign object (RFO).

1. INTRODUCTION

This project addresses the issue of recognizing hand and finger movement. Using a 3D motion controller and a computer vision method known as image moments, we hope to solve the problem of defining hand motion patterns. In practice, we will need to expand the SDK that the manufacturer of the device provides, to enable developers to record and create their own custom-made gestures. With our SDK we will develop an application that, we hope, will help surgeons in the operating room by keeping track of the instruments and materials they use during surgery, so that nothing is forgotten inside the patient. Our purpose is to introduce the possibility of hands-free control of a computer to the medical world, which is usually very conservative and slow in adopting new technologies.

2. THEORY

2.1. Background and related work

2.1.1. Gesture

Gestures are a method of communication using only hand movement, such as the sign language used by the deaf (see Figure 1). A gesture includes movement of body parts (hands, fingers) or of various implements, such as flags, pens, etc.

Figure 1: Hand gestures used in sign language


2.1.2. The Leap Motion controller

The Leap Motion controller [1] is a computer hardware sensor designed to detect hand and finger movement, requiring no contact or touch. The main components of the device are three IR LEDs for illumination and two monochromatic IR cameras. The device is sensitive to hand and finger movements up to a distance of about 1 meter. The LEDs generate a 3D pattern of dots of IR light, and the cameras capture up to 200 frames of data per second. This data is sent via USB to a host computer. The device comes with a software development kit (SDK) capable of recognizing a few simple gestures. The main part of our project will consist of expanding the SDK to enable users to define their own, more complex gestures. In particular, we plan to use that environment to create applications based on customized gestures to control medical equipment used in sterile conditions, where forms of control that require touch are precluded.

Figure 2: The internal components of the Leap Motion controller
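For illustration, here is a minimal sketch of reading frame data through the Leap C++ SDK's listener mechanism; the class name and the way fingertip samples would be buffered are our own choices, not part of the SDK:

```cpp
#include <iostream>
#include "Leap.h"

// A minimal listener that visits fingertip positions on every frame.
// A gesture trace is the sequence of these samples over time.
class GestureListener : public Leap::Listener {
public:
    virtual void onFrame(const Leap::Controller& controller) {
        const Leap::Frame frame = controller.frame();
        for (int i = 0; i < frame.fingers().count(); ++i) {
            Leap::Vector tip = frame.fingers()[i].tipPosition();
            // tip.x, tip.y, tip.z are in millimeters, in device coordinates;
            // here an application would append them to the current trace.
        }
    }
};

int main() {
    Leap::Controller controller;
    GestureListener listener;
    controller.addListener(listener);   // frames arrive on a background thread
    std::cin.get();                     // keep the process alive
    controller.removeListener(listener);
    return 0;
}
```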

2.1.3. Other products

Other products in the 3D motion controller domain are the Kinect [2] and the Intel Perceptual Computing Camera [3]. Compared with the Leap Motion device, the Kinect does not offer finger recognition; its main purpose is to capture the whole body, so it is much slower in capturing hand and finger movement, too slow for gesture recognition. Because the Kinect camera captures the surroundings as well, computing time is much longer. Moreover, the Kinect is much larger and requires more space. The Kinect, however, has better range and can capture the surroundings from a distance.

Another comparable device is the Intel Perceptual Computing Camera. It is available for purchase, although it is rather more expensive than the Leap Motion ($200 for the device and $150 for the Intel SDK). The device is not designed specifically for gesture recognition: it works in visible light (not IR) and therefore depends on ambient lighting. It supports voice recognition and various other technologies that are immaterial to our application.

2.1.4. The health care industry

Research [4] [5] strongly supports that computer keyboards and other input devices are a source of bacteria and cross-contamination that can lead to hospital-acquired infections. Therefore, washable keyboards, mice, TV remotes and mobile products with anti-microbial protection should be put in place, along with proper disinfection protocols, to reduce the risk of infection and cross-contamination.

2.1.5. Surgical inventory count

Oftentimes surgical instruments are accidentally left behind in a surgical cavity, causing, in the worst case, severe infection or even death [6]. These types of events (called retained foreign object, RFO) are considered "never events", namely preventable medical errors. Unlike other medical errors, RFO errors were declared "touch zero" errors: the goal is to reach zero events, since this type of error is considered easy to prevent [7].


Over the history of surgical procedures, a strict surgical inventory count protocol was developed and is now obligatory in all surgical settings around the world. In most surgical settings, two nurses are responsible for counting all surgical sponges, needles and instruments. However, there are some surgical procedures in which the count protocol is not performed on a regular basis. These are "small", "simple" procedures in which no nurse continuously aids the surgeon (e.g., episiotomy). Naturally, where no formal count protocol is performed, or no count at all, the chance of a retained surgical sponge rises dramatically [6]. In these procedures the surgeon has to rely on his or her memory to recall how many items were used or how many items were in the set that was opened (for example, 10 pads, 2 needles and 3 gauzes). To account for all items, the surgeon actually needs to compare the number in memory (e.g., "there were 5 pads in the set and then I opened a package of 5 more, and there were 3 needles in the set") to the number of items found at the end of the procedure. However, keeping these numbers in mind for the whole procedure is a considerable burden, and since short-term memory is so vulnerable, there is a good chance that the surgeon will make a mistake.

In the past 10-15 years, a few technological solutions were developed to support the counting protocol: "SURGICOUNT Medical" and "SmartSponge System" are two examples. These systems require hand contact to operate and are controlled mostly by the nurses responsible for the count. Our solution is designed to help surgeons in cases where no nurse is available to write down (or type) the items' count. Using the Leap Motion device for this purpose will provide the surgeon the means to document the usage of items without the need to write down or type anything. It is, so far, the only solution that lets surgeons document anything themselves during surgery. It is expected to dramatically reduce memory load and promote safer care for patients.

Figure 3: Intra-operative radiograph performed because of an incorrect sponge count in a 54-year-old woman undergoing urethral suspension. The radio-opaque marker (arrow) of a 4 x 4 inch surgical sponge is visible in the pelvis. The sponge was identified.

2.2. Detailed description

2.2.1. Gesture recognition

Gesture recognition refers to the interpretation of human gestures via mathematical algorithms. The IR cameras in the Leap Motion device read the movements of the human hand and communicate the data to a computer, which uses the gesture input to control devices or applications. Using mathematical algorithms, the computer can analyze the captured data and categorize it into one of several predefined patterns.


2.2.2. Preprocessing

For the moment we confine our attention to planar gestures, gestures in which the index finger traces a path in a plane. We assume planar gestures will be easier for humans to reproduce, and we hope they will also be simpler to recognize, while not restricting the repertoire of possible gestures too severely.

Obviously, free hand motion cannot be constrained to be completely planar: planarity will only be approximated. We therefore find, as a first step in interpreting gesture data, the plane whose distance from the captured gesture trace is minimal, using Singular Value Decomposition (see 2.2.3).

2.2.3. Singular value decomposition

Singular Value Decomposition (SVD) is a matrix factorization method with many useful applications in signal processing and statistics.

One application useful for our purpose is fitting planes and lines by orthogonal distance regression [8]. Say we want to find the plane that is as close as possible to a set of $n$ 3-D points $(p_1, \dots, p_n)$ captured by the device. As a first step we subtract the centroid of the points from each point, so that the sought plane passes through the origin of the centered coordinates.

Let the matrix $A$ of size $n \times 3$ hold the points $p_i = (x_i, y_i, z_i)$, one per row:

$$A = \begin{bmatrix} x_1 & y_1 & z_1 \\ \vdots & \vdots & \vdots \\ x_n & y_n & z_n \end{bmatrix}$$

Using the transpose operation we create the $3 \times n$ matrix $A^T$:

$$A^T = \begin{bmatrix} x_1 & \dots & x_n \\ y_1 & \dots & y_n \\ z_1 & \dots & z_n \end{bmatrix}$$

Multiplying $A^T$ by $A$ yields the $3 \times 3$ matrix $B$ (note the order: $A \cdot A^T$ would be $n \times n$):

$$B = A^T \cdot A$$

Solving the eigenvalue equation for matrix $B$:

$$\det(B - \lambda I) = 0$$

This equation is a polynomial of degree 3 and hence has 3 solutions. Since $B$ is symmetric and positive semi-definite, they are real and non-negative; under the conditions of the problem at hand they are expected to be positive and distinct:

$$\lambda_1, \lambda_2, \lambda_3 \in \mathbb{R}^+$$

For each of the $\lambda$ values we compute the corresponding eigenvector:

$$(B - \lambda_i I)\,\bar{u}_i = 0$$

Using the eigenvectors we have calculated, we build the matrix $U$ whose columns are the eigenvectors $u_1, u_2, u_3$:

$$U = \begin{bmatrix} u_{1x} & u_{2x} & u_{3x} \\ u_{1y} & u_{2y} & u_{3y} \\ u_{1z} & u_{2z} & u_{3z} \end{bmatrix}$$


The eigenvector that corresponds to the minimal eigenvalue $\lambda_i$ is the normal of our working plane.
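Below is a minimal sketch of this computation, assuming the Eigen C++ linear-algebra library; the helper name fitPlaneNormal is ours. Note the points are centered on their centroid first, as orthogonal distance regression requires:

```cpp
#include <Eigen/Dense>
#include <vector>

// Orthogonal-distance plane fit: center the points, form B = A^T * A,
// and take the eigenvector of the smallest eigenvalue as the plane normal.
// SelfAdjointEigenSolver returns eigenvalues in increasing order.
Eigen::Vector3d fitPlaneNormal(const std::vector<Eigen::Vector3d>& pts,
                               Eigen::Vector3d& centroid) {
    Eigen::MatrixXd A(pts.size(), 3);
    for (size_t i = 0; i < pts.size(); ++i)
        A.row(i) = pts[i].transpose();
    centroid = A.colwise().mean().transpose();   // the plane passes through here
    A.rowwise() -= centroid.transpose();         // center the points
    Eigen::Matrix3d B = A.transpose() * A;       // 3x3, symmetric
    Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> solver(B);
    return solver.eigenvectors().col(0);         // unit normal of best-fit plane
}
```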

2.2.4. Projecting points onto the new plane

After we have found the plane closest to all the given points, we project each point onto the plane, to obtain a set of $n$ coplanar points. We use the canonical form of the plane equation to compute the signed distance of each point $P_i(x_i, y_i, z_i)$ from the plane:

$$d_i = \frac{Ax_i + By_i + Cz_i + D}{\sqrt{A^2 + B^2 + C^2}}$$

and then move each point along the unit normal $\hat{n}$:

$$\vec{P_i'} = \vec{P_i} - d_i \hat{n}$$

2.2.5. Reducing dimensions

Now that we have moved all points into one plane, we want to reduce the number of coordinates of each point from 3 to 2. Let $\{P_i(x_i, y_i, z_i)\}$ be that set of points, and let $M$ ($Ax + By + Cz + D = 0$) be that plane. In general, the plane $M$ need not be $M_0$ ($z = 0$), the XY plane, and therefore the $z$ component of the points $P_i$ need not vanish. We therefore construct a Cartesian system on plane $M$ by choosing two perpendicular lines on $M$, namely $L_X$ and $L_Y$. Let $L_X$ ($Ax + By + D = 0$) be the intersection line between $M$ and $M_0$. As the origin we choose the point $O(0, -D/B, 0)$ on $L_X$. $L_Y$ will then be the line lying in $M$ that passes through the origin $O$ and is perpendicular to $L_X$. The distances $x'_i$ and $y'_i$ of every point $P_i$ from $L_Y$ and $L_X$ respectively will act as the coordinates of the points for further analysis.

Considering Figure 4, we find the distance $d$ on $M_0$ between the projection of $P_i$ on $M_0$ and $L_X$, which by definition also lies on $M_0$. This distance $d$ and the $z$ coordinate of point $P_i$ form a right-angle triangle whose hypotenuse is the distance of $P_i$ from $L_X$, hence is $y'_i$. Defining point $Q$ as the point where the perpendicular realizing $y'_i$ meets $L_X$, the distance from $Q$ to the origin $O$ is the distance on $M$ from $P_i$ to $L_Y$, and hence is $x'_i$.

Figure 4: Reducing the number of coordinates

Developing the math we get:

$$x'_i = \frac{B^2 x_i - A B y_i - A D}{B\sqrt{A^2 + B^2}}$$

$$y'_i = \sqrt{\frac{(A x_i + B y_i + D)^2}{A^2 + B^2} + z_i^2}$$
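A direct transcription of these two formulas into the sketch's terms might look as follows; toPlaneCoords is our name, and $B \neq 0$ is assumed, as the choice of origin $O(0, -D/B, 0)$ already requires:

```cpp
#include <cmath>
#include <Eigen/Dense>

// Map a point lying on the plane Ax + By + Cz + D = 0 to the in-plane
// coordinates (x', y') along L_X and L_Y, per the two formulas above.
// (C is unused in the formulas but kept for the full plane signature.)
Eigen::Vector2d toPlaneCoords(const Eigen::Vector3d& p,
                              double A, double B, double C, double D) {
    double n2 = A * A + B * B;
    double xPrime = (B * B * p.x() - A * B * p.y() - A * D)
                    / (B * std::sqrt(n2));
    double t = A * p.x() + B * p.y() + D;
    double yPrime = std::sqrt(t * t / n2 + p.z() * p.z());
    return Eigen::Vector2d(xPrime, yPrime);
}
```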


2.2.6. Building the image

Once a planar shape is obtained, we find a bounding rectangle for the points inside. Forming a matrix $M$ with the dimensions of that rectangle, we initialize the matrix according to this formula:

$$M(x, y) = \begin{cases} 1 & \text{if } (x, y) \text{ is a data point} \\ 0 & \text{otherwise} \end{cases}$$

yielding a matrix representing the image.
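A small sketch of this rasterization step follows; the fixed grid resolution is our own simplification (the paper's matrix has the rectangle's own dimensions), which is harmless here since the moments are scale-normalized later:

```cpp
#include <Eigen/Dense>
#include <algorithm>
#include <vector>

// Rasterize the 2-D gesture points into a binary image (1 = data point).
std::vector<std::vector<int>> buildImage(
        const std::vector<Eigen::Vector2d>& pts, int size = 64) {
    // Find the bounding rectangle of the points.
    double minX = 1e30, minY = 1e30, maxX = -1e30, maxY = -1e30;
    for (const auto& p : pts) {
        minX = std::min(minX, p.x()); maxX = std::max(maxX, p.x());
        minY = std::min(minY, p.y()); maxY = std::max(maxY, p.y());
    }
    std::vector<std::vector<int>> img(size, std::vector<int>(size, 0));
    for (const auto& p : pts) {
        int col = (int)((p.x() - minX) / (maxX - minX + 1e-9) * (size - 1));
        int row = (int)((p.y() - minY) / (maxY - minY + 1e-9) * (size - 1));
        img[row][col] = 1;   // mark the cell holding a data point
    }
    return img;
}
```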

2.2.7. Image moments

In order to distinguish between patterns we compute image moments [9] [10]. Image moments, each a real number, are various weighted averages of the image pixel intensities, representing increasing detail of the pixel distribution, such as centroid, area, and information about orientation. Central moments are invariant to translation, and some derived moments are invariant to rotation as well; we limit our attention to those only.

We have used the following mapping function:

$$f(x, y) = \begin{cases} 0 & \text{if the pixel is white} \\ 1 & \text{if the pixel is black} \end{cases}$$

We first calculate the raw moments and the central moments:

$$M_{pq} = \sum_x \sum_y x^p y^q f(x, y)$$

$$\mu_{pq} = \sum_x \sum_y (x - \bar{x})^p (y - \bar{y})^q f(x, y), \quad \text{where } \bar{x} = \frac{M_{10}}{M_{00}} \text{ and } \bar{y} = \frac{M_{01}}{M_{00}}$$

We have chosen to use 9 moments for now, but this number may increase if it turns out that good recognition needs more. Now that we have calculated the moments, we need to make sure they are scale invariant, so we normalize them with the following formula:

$$\eta_{ij} = \frac{\mu_{ij}}{\mu_{00}^{\,1 + \frac{i+j}{2}}}$$

Now we want our moments to be invariant under rotation as well. The best-known such set is Hu's [11] set of invariant moments, also known as the seven moments of Hu:

$$\phi_1 = \eta_{20} + \eta_{02}$$

$$\phi_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2$$

$$\phi_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2$$

$$\phi_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2$$

$$\phi_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})\left[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2\right] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})\left[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2\right]$$

$$\phi_6 = (\eta_{20} - \eta_{02})\left[(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2\right] + 4\eta_{11}(\eta_{30} + \eta_{12})(\eta_{21} + \eta_{03})$$

$$\phi_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})\left[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2\right] - (\eta_{30} - 3\eta_{12})(\eta_{21} + \eta_{03})\left[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2\right]$$


The 9 moments of a gesture are taken as coordinates of the gesture in some abstract 9-dimensional space. It is assumed that the representations of similar gestures will congregate into "clouds" whose extent, in the Euclidean metric, is small compared to the distance between "cloud" centroids. The centroid of each "cloud" is calculated and saved, and with each insertion of new moments the centroid is updated. The use of this data is described in the next section.

2.2.8. Minimum distance algorithm

The minimum distance algorithm is a basic method of comparing our ongoing gesture with the "clouds" of previously computed moments. The idea is to calculate the distance between the ongoing gesture's centroid and the saved centroids; with high probability, the "cloud" at minimum distance from the ongoing gesture represents the same gesture.

We have considered two distance functions. Let $X, Y$ be vectors in a $k$-dimensional space:

$$\text{Euclidean:} \quad \sqrt{\sum_{i=1}^{k} (X_i - Y_i)^2}$$

$$\text{Manhattan:} \quad \sum_{i=1}^{k} |X_i - Y_i|$$

We assume that the number of recorded gestures is finite and fairly low, so we can afford to compute the distance to every centroid. Since the set $S$ of stored centroids is finite, the complexity of this process is low: $O(|S|)$.

The Algorithm:

Let ๐‘† = ๐‘1, โ€ฆ , ๐‘๐‘› be the set of centroids of gesture {๐‘”๐‘–} in feature space.

Let p be the point in representing a new gesture to be recognized in feature space

Compute the distance d(p, ci) Find the minimum distance.

Identify the new gesture as the gesture i with the minimal distance

Return the id
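A sketch of the resulting nearest-centroid classifier, using the Euclidean distance (squared, since the minimizer is the same); the struct and function names are ours:

```cpp
#include <limits>
#include <vector>

// A stored gesture template: its id and its centroid in feature space.
struct GestureCentroid {
    int id;
    std::vector<double> centroid;   // k feature-space coordinates
};

// Classify a new gesture by minimum distance to the saved centroids.
int classify(const std::vector<double>& p,
             const std::vector<GestureCentroid>& gestures) {
    int bestId = -1;
    double bestDist = std::numeric_limits<double>::max();
    for (const auto& g : gestures) {
        double d = 0.0;   // squared Euclidean distance to this centroid
        for (size_t i = 0; i < p.size(); ++i)
            d += (p[i] - g.centroid[i]) * (p[i] - g.centroid[i]);
        if (d < bestDist) { bestDist = d; bestId = g.id; }
    }
    return bestId;
}
```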

2.3. Expected results

The final product could be integrated into any device that requires hand contact, especially for counting procedures in the operating room. This would reduce the number of retained-sponge cases.

Our SDK will give developers around the world the opportunity to implement applications based on custom-made gestures.


3. PRELIMINARY SOFTWARE ENGINEERING DOCUMENTATION

3.1. Requirements (Use Cases)

3.2. GUI

In this section we introduce our UI prototype for the project. The prototype demonstrates the usability of the SDK and of the medical application.

3.2.1. SDK expansion

We have divided the toolkit into four steps that give the user all the functionality needed to create a custom-made gesture. Each step has its own screen with its own purpose, and a wizard gives the user information about the current step.

Step 1 (Figure 5): In this step the user records the custom-made gesture he or she would like to create.

Step 2 (Figure 6): In this step the user trains the computer to recognize the new gesture by recording additional examples, preferably by different people.

Step 3 (Figure 7): In this step the user tests the system by presenting the gesture to see if it is recognized.

Step 4: In this step the user gets information about the new gesture he has just created.

Each screen has the following components (see figures below):

A) Wizard – a UI component that presents the user a sequence of screens leading him through a series of well-defined steps.

B) Gesture grid panel – a real-time panel that tracks the movement of the user's finger and displays it.

C) Coordinates table – a table filled with the point coordinates of each finger position detected in the current frame.

D) Operation buttons – four control buttons that can be clicked to perform an action.

E) Status bar – a status line giving information about the Leap controller connection and about the action being performed.

F) Help button – a control button for giving the user help.


G) Match rate bar – a bar presenting the likelihood that the gesture the user is making matches the custom-made gesture saved in the record step (refers to steps 2+3).


Figure 5: Record Step

Figure 6: Train Step

Figure 7: Exercise Step


3.2.2. Items counting application

We designed our medical application to be simple and intuitive for the surgeons who will eventually use it. We focused on the episiotomy procedure kit, since it demonstrates the usage of our system well.

As we can see in Figure 8 below, there are 4 components:

A) Item panel – this panel includes these elements: gesture icon, item image and counter.

B) Undo panel – this panel shows the gesture icon the surgeon needs to make in order to undo the last gesture action.

C) More panel – this panel shows the gesture icon the surgeon needs to make in order to add a different item to the surgery.

D) Status bar – a status line giving information about the Leap controller connection and about the action being performed.

Using these components, the surgeon makes the gesture corresponding to the item that entered the surgical environment; the counter belonging to this gesture is incremented, and feedback is shown on the screen.

3.3. Program structure โ€“ Architecture, Design (UML diagrams)

3.3.1. Software architecture

The SDK itself will be divided into two main parts. The main part, the core of the program with all the logic, is going to be a C++ based application. The GUI is going to be done with QT or WPF/WinForms and will communicate with the C++ program, which will transfer an event for each gesture that has been made. As for the medical application, the whole application is going to be written with WPF and communicate with the SDK.

The following API is an initial prototype of the interface of our SDK:

Init() – this function initializes the settings of the device.

SetupListener() – this function connects to the infrastructure of the system.

CallBackFunc() – this interface stands in for the user, to be passed as a delegate function to the device.

OnClose() – operations to be done when the application is closed.
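A minimal sketch of how this prototype might look on the C++ side is given below; everything beyond the four functions named above (the callback signature, the class name, the payload of a gesture event) is our assumption:

```cpp
#include <functional>

// Assumed payload: the recognized gesture's id and its match rate.
using CallBackFunc = std::function<void(int gestureId, double matchRate)>;

class MedicalHandsFreeSDK {
public:
    // Initialize the settings of the Leap device.
    bool Init();
    // Connect the recognition pipeline to the device's frame stream and
    // register the user's delegate, invoked on each recognized gesture.
    void SetupListener(CallBackFunc onGesture);
    // Clean-up operations to be done when the application is closed.
    void OnClose();
};
```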


Figure 8: Instruments counting application


SDK Class Diagram

Medical Counter Class Diagram


3.4. Testing plan

3.4.1. Testing plan for the SDK

| Test name | Scenario | Expected result |
|---|---|---|
| Record input data | We record input from the device. | A file with the recorded data appears in the DB. |
| Call SVD function over input data file | The SVD function gets the raw data from the file. | The function returns the normal of the closest plane. |
| Call image moments function | Image moments are extracted from the bitmap matrix. | All moments are recorded to the DB. |
| Record and test 3 different gestures; repeat with different people | We record 3 different gestures and test them. | The system distinguishes between the 3 different gestures. |

3.4.2. Testing plan for the medical application

| Test name | Scenario | Expected result |
|---|---|---|
| Open application | User starts the application. | Application opens in the main screen with no errors. |
| User imitates gesture shown on the screen | The user follows the gesture shape. | The counter of the instrument is increased, and the shape is highlighted. |
| The user chooses to undo the action | User imitates the undo gesture. | The last counter is decreased. |
| Open surgery log | The user presses the log button. | A log of the actions is shown: "you made X, pad 4x4 was incremented by 1". |
| Open surgery inventory report | The user clicks the inventory report button. | A report of all the required equipment is shown: "two pairs of scalpels" etc. |

REFERENCES

[1] Leap Motion Inc., "Leap Motion Specs," 2013. [Online]. Available: https://www.leapmotion.com/product.

[2] Wikipedia, "Kinect." [Online]. Available: http://en.wikipedia.org/wiki/Kinect.

[3] Intel Corp., "Developer Guide for the Intel Perceptual Computing SDK." [Online]. Available: http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk.

[4] A. K. Al-Ghamdi, S. M. A. Abdelmalek, A. M. Ashshi, H. Faidah, H. Shukri and A. A. Jiman-Fatani, "Bacterial contamination of computer keyboards and mice, elevators and shopping carts," African Journal of Microbiology Research, no. 5(23), 2011.

[5] D. Childs, ABC News Medical Unit, "Your Keyboard: Dirtier Than a Toilet," 5 May 2008. [Online]. Available: http://abcnews.go.com/Health/Germs/story?id=4774746.

[6] A. A. Gawande, D. M. Studdert, E. J. Orav, T. A. Brennan and M. J. Zinner, "Risk Factors for Retained Instruments and Sponges after Surgery," The New England Journal of Medicine, 2003.

[7] C. W. Kaiser, S. Friedman, K. P. Spurling, T. Slowick and H. A. Kaiser, "The Retained Surgical Sponge," Annals of Surgery, vol. 224.

[8] Wikipedia, "Singular value decomposition." [Online]. Available: http://en.wikipedia.org/wiki/Singular_value_decomposition.

[9] J. Flusser, "On the independence of rotation moment invariants," Pattern Recognition, no. 33, 1999.

[10] J. Flusser and T. Suk, "Rotation Moment Invariants for Recognition of Symmetric Objects," IEEE Transactions on Image Processing, vol. 15, 2006.

[11] Z. Huang and J. Leng, "Analysis of Hu's Moment Invariants on Image Scaling and Rotation," ECU Publications, 2011.

