
Multidisciplinary Senior Design Conference
Kate Gleason College of Engineering

Rochester Institute of Technology
Rochester, New York 14623

Project Number: P16051

THE ROBOTIC EYE MOTION SIMULATOR

Joshua Long, Mechanical Engineer

Nathan Twichel, Biomedical Engineer

Jordan Blandford, Mechanical Engineer

Amy Zeller, Biomedical Engineer

Peter Cho, Electrical Engineer

ABSTRACT

Contemporary eye trackers have no industry standard; eye tracker manufacturers measure and report their products' statistics against largely proprietary, arbitrary benchmarks. The Chester F. Carlson Center for Imaging Science at RIT challenged the team to create a robotic eye that is accurate, repeatable, and easy to manufacture, with the idea that the device could help form a standard the industry could adopt.

The most important requirement was to simulate quick eye movement (saccade) in one axis with accurate, controllable, and repeatable dynamics. The device also needed to simulate smooth pursuit of an object without jitter and, upon completion of a test, output results such as position and timestamp in a human-readable form. These data can then be compared against the eye tracker's output to evaluate accuracy and to validate or invalidate manufacturers' claims.

NOMENCLATURE

Eye Tracker - A device that tracks human eye movement and estimates gaze position.
Saccade - The quickest eye movement, normally performed when moving between fixation points.
Smooth Pursuit - The slowest eye movement, normally performed when tracking a moving object.

INTRODUCTION

Eye trackers have long been used in psychology research, visual systems research, marketing, and, more recently, as an input device for human-computer interaction. The quality of the data eye trackers output is fundamental to any research based on eye tracking. Data quality can be influenced both by tracker-specific properties such as the camera, the illuminator, sampling frequency, and latency, and by biological properties of participants such as eye color, pupil size, and eye makeup.

Copyright © 2016 Rochester Institute of Technology


There is currently no standardized test method for evaluating the quality of data collected from eye trackers. The lack of a standard may lead to research being based on unreliable data. Different manufacturers measure quality using their own methods, and researchers either measure it again using different methods or simply report whatever numbers the manufacturer provides. To investigate tracker-related issues in isolation, a set of artificial eyes is needed to eliminate biological variance. Ideally, these eye models should perform real eye movements with high repeatability, so that the same movements can be recorded on different eye trackers. The COGAIN (Communication by Gaze Interaction) Association, which is supported by eye tracking researchers, including the MVRL (Multi-Disciplinary Vision Research Lab) at RIT, and eye tracker manufacturers around the world, has been putting effort into developing such a set of artificial eyes. Progress has been made on a static eye model, named OEMI-7, that mimics human eye structure. The next step is to automate the movement of this static eye model.

The objective of this project is to develop a robotic eye that mimics human eye movement in order to standardize eye tracker testing. A previous MSD team succeeded in designing and manufacturing a device that demonstrated the concept of a robotic eye model standard, but its hardware and software achieved only unrepeatable and very slow smooth pursuit. With the concept proven, the current team built on that foundation, replicating the processes and designs that performed as expected and modifying those that were less than ideal. The current MSD team is targeting highly repeatable eye movements, including both smooth pursuit and saccades. Another goal of this project is to make the robotic eye affordable, since eye tracker manufacturers and eye tracking researchers have expressed interest in purchasing these devices.

PROCESS

The design for this project began with a list of customer requirements (see Appendix A). This list was built from the previous team's customer requirements, along with additional measures specific to this project. From the customer requirements, a list of engineering requirements was derived (see Appendix B). The engineering requirements are more specific, and most have numerical target values, while the customer requirements are more subjective, with specifications such as achieving "saccade" or having "accurate and precise positioning." Additional constraints were also considered, including that the device had to be at most the size of a human head.

The most important engineering requirements were that the device achieve the acceleration and velocity of a human saccade while maintaining high accuracy and precision. The ideal positioning values were a resolution of 0.0075 degrees and an accuracy of 0.015 degrees, or double the resolution. These specifications were set based on the claimed accuracy values of popular eye trackers; the goal was for this device to be more accurate than the eye tracker manufacturers claim their eye trackers are. Because a human eye can perform a saccade at an angular velocity of 500 degrees/s and an angular acceleration of 20,000 degrees/s^2, the robotic eye must be this fast as well. Each of these requirements can also be found in Appendix B.

Motors meeting the specifications needed to drive the system at the required velocity and acceleration were difficult to find at an affordable price. The specifications evaluated during motor benchmarking included a torque of 0.05 N-m, a resolution of at most 0.0075 degrees, an accuracy of 0.015 degrees, a maximum speed of at least 500 degrees/s, an acceleration of at least 20,000 degrees/s^2, a mechanical time constant of at most 5.48 ms, and a size small enough that the whole device could be at most the size of a human head. The ideal motor would meet all of these requirements while keeping the total project cost under $2,000. Ultimately the motor chosen was a ClearPath motor from Teknic, shown in Figure 1 below. The ClearPath met the specifications for speed, accuracy, torque, mechanical time constant, and cost, but lacked the resolution and


accuracy to be ideal. The proposed remedy was to improve resolution by using a gear ratio in a belt and pulley system.

Figure 1: ClearPath Motor from Teknic Motors

The belt drive components came from Gates, a manufacturer of timing belts and pulleys, chosen for its reputation for high-quality belt drive products, including belts with a small pitch that are still wide enough to resist significant stretching. The goal was to use parts with the smallest pitch available while keeping a sufficient belt width. The system chosen has a 3 mm pitch and a 15 mm belt width. A 2 mm pitch was available, but the widest belt in that size was 9 mm, and the slightly larger pitch was deemed an acceptable tradeoff for a wider belt. Additionally, the belt selected was fiberglass-reinforced to minimize stretch. The pulleys were selected to obtain a gear ratio of approximately 3:1. However, the smaller pulley (attached to the motor) was recommended to have a pitch diameter of at least 1 inch, so the pulleys chosen had pitch diameters of 1.053 inches and 3.384 inches. These pulleys have 28 and 90 teeth, respectively, which gives a gear ratio of approximately 3.21:1. An idler pulley was added to maintain sufficient belt tension during operation; its position is adjustable so the belt can be easily installed or removed as necessary.
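As a quick sanity check (a sketch of the arithmetic, not the team's tooling), the ratio implied by the tooth counts should agree with the ratio implied by the pitch diameters quoted above:

```python
# Belt-drive ratio check using the figures from the text.
motor_teeth, gimbal_teeth = 28, 90
tooth_ratio = gimbal_teeth / motor_teeth    # ratio from tooth counts, ~3.214:1

d_motor, d_gimbal = 1.053, 3.384            # pitch diameters, inches
diameter_ratio = d_gimbal / d_motor         # should match the tooth ratio

print(round(tooth_ratio, 3), round(diameter_ratio, 3))
```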

The final design of the robotic eye device can be seen in Figure 2 below; Figure 3 is a labeled version of the same design. One difference between the two images is that in Figure 2 the calibration unit is assembled onto the dynamic eye shaft. The calibration unit is 3D printed with four through holes and two center bores: one for the cornea of the OEMI-7 to fit into, and another to embed a laser pointer. The four through holes accept screws that pass through the eye clamp to secure the calibration unit to the dynamic eye. The belt and pulley system is encased by two aluminum sides and two acrylic sides; non-load-bearing acrylic siding was chosen so the system can be inspected visually without disassembly. Further design considerations include a slot for the idler pulley, which allows for tensioning and re-tensioning of the belt, and room for the Arduino motor controller within the enclosure so that wires cannot be accidentally pulled. The only component not featured in Figure 2 or 3 is the power supply, which, due to its size and weight, is placed on the floor or a tabletop near the device and therefore was not incorporated into the CAD model; the device itself is already heavy, and with the added weight of the power supply, not everyone would be able to lift the overall system easily.

Figure 3 highlights the components of the design, such as the yoke (a piece slightly redesigned from the alpha team's version), which allows the vertical angle of the dynamic eye to be controlled. The static eye makes the device compatible with a multitude of trackers, since many trackers need two eyes in order to run. Both eyes do not need to be dynamic, however, as the second eye is normally used only as a reference. The static eye is mounted on a threaded rod so it can be moved up and down in case the device is placed at an angle and both eyes need to be at the same height.


Figure 2: Final System Design

Figure 3: Labeled System Design

RESULTS AND DISCUSSION

The final prototype consists of the Teknic ClearPath motor, the gimbal containing the OEMI-7 model eye, and a belt and pulley system using Gates products, all controlled by an Arduino Uno microcontroller. When operating the motor via our software and testing the assembled device, the system error was higher than theoretically expected.

The ClearPath integrates the motor and its controller into one unit. It provides 12,800 post-quadrature encoder counts per revolution, 6,400 of which are commandable, giving a commandable motor resolution of 0.05625 degrees per count. The motor has an accuracy of +/- 1 post-quadrature encoder count per movement, meaning it can move 0.05625 degrees per count +/- 0.02813 degrees. When properly tensioned, the estimated backlash from the belt and pulley system was 1 to 2 thousandths of a degree, which was negligible in terms of the


total system error. Applying the gear ratio, however, the assembled device would allow for a theoretical resolution of 0.0175 degrees per count +/- 0.0088 degrees on the gimbal shaft assuming all else was perfect.

To control the motor shaft to exact angles, a specific number of square wave pulses needed to be sent from the Arduino Uno to the motor input, according to the pulses-per-revolution setting in the motor's MSP software. For example, with a motor input resolution of 6,400 pulses-per-revolution, the Arduino would pulse the square wave exactly 3,200 times to move the motor shaft 180 degrees, which in turn is 3,200 encoder counts. In the preliminary Arduino code, the Arduino output a constant digital pulse-width-modulation (PWM) square wave at 500 kHz from 0 to 5 V. This output was toggled on and off for time intervals proportional to the number of pulses wanted, e.g., PWM turned on for exactly 2 microseconds to generate 1 pulse. This methodology was not reliable, due to the inherent inaccuracy of using time to send an estimated number of pulses rather than counting each pulse. By sending the wrong number of pulses, the motor shaft would be off-position by that many encoder counts. These errors are cumulative: when the user sends a large number of commands, the motor begins to lose accuracy and develops a significant offset from the expected location. This behavior was tested with a variety of movement scripts to find a relationship between the number of movements and the resulting encoder count error. It was found that after only 4-5 movements, the motor would be 1 or 2 encoder counts off of the commanded position.
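The angle-to-pulses arithmetic above can be sketched as follows (a simplified illustration; `pulses_for` is our name, not the team's code):

```python
PPR = 6400  # pulses-per-revolution configured in the motor's MSP software

def pulses_for(angle_deg, ppr=PPR):
    """Whole pulses the Arduino must send to move the motor shaft by angle_deg."""
    return round(angle_deg / 360 * ppr)

# 180 degrees at 6,400 ppr -> 3,200 pulses, matching the example in the text.
print(pulses_for(180))
```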

To remove the encoder count error, the time-based PWM system was scrapped and the Arduino code was rewritten to send a single square wave pulse per iteration and count how many pulses have been sent. With this method, the digital output is pulsed once per iteration and can be reliably counted instead of being timed. The graphical user interface (GUI) was modified so that the user can command movements down to exactly 0.0175 degrees, or one encoder count. Over a period of 25 tests commanding the motor to a variety of angles from -29.995 to 29.995 degrees and back to 0, the motor showed no encoder count error.
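The rewritten approach can be sketched as a loop that toggles the step line once per iteration and counts each toggle, so the number of pulses sent is exact by construction. Hardware I/O is stubbed out here; `step` stands in for the Arduino digital-write toggle:

```python
def send_pulses(n, step):
    """Send exactly n pulses by counting each one, instead of timing a PWM burst."""
    sent = 0
    while sent < n:
        step()      # one full high/low toggle of the step line
        sent += 1
    return sent

# Stub the hardware: record how many toggles actually happened.
toggles = []
sent = send_pulses(3200, lambda: toggles.append(1))
print(sent, len(toggles))
```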

Another aspect of the system examined was the high-level feedback (HLFB) output of the motor, which communicates to the GUI when a movement has finished. The motor can be set so that the HLFB pulses high when the commanded destination has been reached and stays low otherwise. The software architecture runs a timer from the moment the user sends a movement command until the HLFB pulses that the destination has been reached; this roughly represents how long each movement took. To test this, the exact position of the motor at each millisecond was observed using the oscilloscope feature included in the motor's MSP software. While performing various movements, the time durations observed in the MSP software were compared to the times recorded by the GUI timer.

In Table 1, several movement commands ranging from 0-to-29.995 degree sweeps down to 0-to-0.9975 degree sweeps were performed. Each movement was performed ten times, and the GUI timer results were averaged. The MSP software oscilloscope was used to get the exact duration of each movement from start to finish, and these measurements were extremely repeatable. Ideally, there would be no difference between the two results. The tests were run with speed limits of 900, 500, 100, and 20 rpm, with a constant maximum acceleration of 100,000 rpm/s throughout all of the testing.


Table 1: Tests at 900, 500, 100, and 20 rpm speed limits and 100,000 rpm/s acceleration

Figure 4: Timing error plotted with respect to position in degrees at different speed limits.


After tabulation and analysis of the timing error, shown in Figure 4, it was found that across the different speed settings and movement sweeps the mean error was -1.4 milliseconds. The HLFB output was most accurate at higher speeds; at slower speeds it was less reliable for movement increments under 10 degrees. Overall, the movement durations the GUI reports from the HLFB are fairly reliable with this caveat in mind.

Although the motor speed and acceleration were parameters that could be easily set within the motor's software, the speed and acceleration at the output shaft of the eye needed to be measured. Using a high-speed camera, the robotic eye with the laser calibration unit attached was recorded moving a set distance, with distance markers for reference. The results in Figures 5 through 7 show the position, velocity, and acceleration of the robotic eye.

Figure 5: Angular position with respect to time.

Figure 6: Angular velocity with respect to time.


Figure 7: Angular acceleration with respect to time.

A best-fit curve was fitted to the position data in Figure 5 and then differentiated to obtain velocity and acceleration. A peak velocity of around 360 deg/s and a peak acceleration of around 7500 deg/s^2 were reached. Using these results, the motor parameters were adjusted so that the objectives of 900 deg/s and 7000 deg/s^2 could be reached.
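The differentiate-the-fit step can be illustrated numerically. The trace below is synthetic (a logistic sweep, not the team's camera data), and `np.gradient` stands in for differentiating the fitted curve:

```python
import numpy as np

# Synthetic 0-to-30 degree sweep sampled at 1 kHz (stand-in for the camera trace).
t = np.linspace(0.0, 0.2, 201)
theta = 30.0 / (1.0 + np.exp(-60.0 * (t - 0.1)))

omega = np.gradient(theta, t)    # angular velocity, deg/s
alpha = np.gradient(omega, t)    # angular acceleration, deg/s^2

print(round(omega.max()), round(alpha.max()))
```

For this synthetic sweep the analytic peak velocity is 30·60/4 = 450 deg/s, which the numerical derivative recovers closely; the same chain of differences gives the acceleration peak.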

CONCLUSIONS AND RECOMMENDATIONS

For future teams, Teknic motors are recommended due to the company's location near Rochester and the student discount it provides. Teknic also has a new line of motors, to be released within a year of this project's completion, that will better suit the needs of the robotic eye. These motors should be comparable in price to the motor used in this project, which is much cheaper than the competition's, and will allow for more customization. Along with the new motor, there is a smaller and lighter power supply that may be able to power two motors at the same time. The current power supply required more space than the robotic eye device could hold, so it was relegated to a separate housing, which required time and materials the team could have better allocated elsewhere. These are the items the team wishes could have been changed in the current project and hopes future teams will be able to implement.

ACKNOWLEDGMENTS

We would first like to acknowledge our customers, Jeff Pelz and Dong Wang, from RIT's Chester F. Carlson Center for Imaging Science. We would also like to thank the Kate Gleason College of Engineering for its support with this project, as well as our guide Susan Farnand, the MSD office, and all topic specialists we consulted during this project. The topic specialists with whom we spent considerable time are Stephen Boedo, Mark Kempski, Ferat Sahin, Jan Maneti, Rob Kraynik, Craig Piccarreto, John Bonzo, Abe Amirana, Ed Hanzlik, and Michael Schertzer.


APPENDIX

A. Customer Requirements

B. Engineering Requirements


C. Bill of Materials
