Control of a Robotic Arm over CAN using a touch screen user interface


Description

This is the full version of my Final Year Project for the University of Limerick, completed in March 2012. The Undergraduate Awards uploaded a slimmed-down version, as the limit for the submission made in July 2012 was 20,000 words. The project was developed as a showcase demonstration for Analog Devices: a portable demonstration of a new transceiver developed by Analog. Research around the topic of CAN is outlined in the project, along with other relevant topics.


Department of Electronic & Computer Engineering

Control of a Robotic Arm over CAN

using a touch screen user interface

Student Name: Róisín Howard

Student ID: 0850896

Supervisor: Dr. John Nelson

Course: Bachelor of Engineering in Computer Engineering

Academic Year: 2011/ 2012

March 2012

Submitted by: Róisín Howard

ID No. : 0850896

Project supervisor: Dr. John Nelson


Control of a Robotic Arm

over CAN using a touch

screen user interface

A showcase demonstration

Róisín Howard

30/3/2012

The control of a Robotic Arm over CAN (Controller Area Networking) using a touch screen user interface,

which will be developed using the BlackFin BF548 ezkit running the µClinux operating system.


Abstract

This project was developed as a showcase demonstration for Analog Devices. In this project the author

is looking to provide Analog Devices with a portable demonstration to showcase a new transceiver

developed by Analog. Previously Analog had transceivers for the RS-485 network; this new transceiver is

for CAN and is intended for industrial applications.

The intent is to create a portable showcase demonstration in which a Robotic Arm with servo

motors will be controlled over CAN using a touch screen interface. The capabilities of these new CAN

transceivers will be demonstrated as a result of this project. Precise control of the Robotic Arm will be

possible due to the use of servo motors. A touch screen application will provide the control of the

Robotic Arm as a portable solution.

The benefit of this project to Analog Devices is that they now have a showcase demonstration

for their CAN parts. There is no need for a laptop to set up the project; it can just be powered on as is.

This portable solution can be used at trade shows and on customer visits to demonstrate the capabilities

of Analog’s CAN transceivers. The project proved both challenging and mentally stimulating, and it achieved its objectives in creating a portable showcase demonstration for Analog Devices.



Acknowledgement

I would like to take this opportunity to express my thanks to Analog Devices Incorporated, for giving me

the opportunity to create a showcase demonstration for them. I would like to thank all my colleagues at

Analog for their continued support and encouragement along the way. I would especially like to thank

Hein Marais, Conal Watterson, Colm Ronan and Mairtin Walsh.

I would like to express my thanks to my supervisor, Dr. John Nelson, for his continued support and

valuable advice over the year.

I would like to thank all the technicians and staff of the Electronic and Computer Engineering

department, for always being there to give a helping hand along the way.

Finally I would like to thank my family and friends for providing a sounding board for my ideas, and

making the last four years so unforgettable.


Declaration

I hereby certify that this material is entirely my own work and has not been submitted to any other University or higher education institute, or for any other academic award in this University. Where use has been made of the work of others, it has been fully acknowledged and fully referenced.

Signed: Date:

Róisín Howard


Table of Contents

Abstract ................................................................................................................................................... i

Acknowledgement .................................................................................................................................. ii

Declaration ............................................................................................................................................. iii

List of Nomenclatures ............................................................................................................................ vii

List of Figures ......................................................................................................................................... ix

List of Tables .......................................................................................................................................... xi

Chapter 1 Introduction ............................................................................................................................ 1

1. 1 Analog Devices .............................................................................................................................. 1

1. 2 Aims & Objectives ......................................................................................................................... 1

1. 3 Project Outline .............................................................................................................................. 3

1. 4 Overview ...................................................................................................................................... 4

Chapter 2 Literature Review .................................................................................................................... 7

2. 1 Controller Area Network ............................................................................................................... 7

2. 2 The CAN protocol .......................................................................................................................... 8

2. 2. 2 CANopen ............................................................................................................................. 16

2. 2. 3 CAN Kingdom ...................................................................................................................... 16

2. 2. 4 DeviceNet ............................................................................................................................ 17

2. 3 Embedded Systems ..................................................................................................................... 18

2. 3. 1 UNIX Systems & the C Programming Language .................................................................... 18

2. 3. 2 CAN in real-time embedded systems ................................................................................... 19

2. 3. 3 Microcontrollers and Digital Signal Processors ..................................................................... 21

2. 4 Other In-Vehicle Networks .......................................................................................................... 22

2. 4. 1 FlexRay ................................................................................................................................ 22

2. 4. 2 Local Interconnect Network................................................................................................. 22


2. 4. 3 Motorola Interconnect ........................................................................................................ 23

2. 4. 4 Time-Triggered Protocol ...................................................................................................... 23

2. 5 Time triggered CAN ..................................................................................................................... 24

2. 6 Discussion ................................................................................................................................... 26

Chapter 3 Methodology ......................................................................................................................... 29

3. 1 Inception .................................................................................................................................... 29

3. 2 Elaboration ................................................................................................................................. 30

3. 3 Construction ............................................................................................................................... 30

3. 4 Transition .................................................................................................................................... 31

Chapter 4 Hardware Design & Implementation ..................................................................................... 33

4. 1 Board Layout/ Schematic ............................................................................................................ 33

4. 2 The structure of CAN bus nodes .................................................................................................. 37

4. 3 Choosing a microcontroller ......................................................................................................... 37

4. 4 Setting up the microcontroller functionality ................................................................................ 41

4. 4. 1 Serial Peripheral Interface ................................................................................................... 41

4. 4. 2 Pulse Width Modulation ...................................................................................................... 43

4. 5 Power Control ............................................................................................................................. 44

4. 6 Testing the network .................................................................................................................... 44

4. 7 Setting up the CAN controller ...................................................................................................... 45

4. 7. 1 Power up and resetting the CAN controller .......................................................................... 45

4. 7. 2 The bit rate calculation ........................................................................................................ 46

4. 7. 3 Message transmission & reception ...................................................................................... 47

4. 8 The Servo Motors........................................................................................................................ 47

4. 9 Discussion ................................................................................................................................... 49

Chapter 5 Software Design .................................................................................................................... 51


5. 1 Use cases .................................................................................................................................... 51

5. 2 Graphical User Interface ............................................................................................................. 53

5. 3 Setting up the screen layout ........................................................................................................ 55

5. 4 Sequence diagram ...................................................................................................................... 57

5. 5 Discussion ................................................................................................................................... 58

Chapter 6 Software Implementation ...................................................................................................... 61

6. 1 Setting up the Robotic Arm control program and environment ................................................... 61

6. 2 Setting up the BlackFin ................................................................................................................ 61

6. 2. 1 Building the kernel .............................................................................................................. 62

6. 2. 2 Creating an application ........................................................................................................ 63

6. 2. 3 The state machine ............................................................................................................... 64

6. 2. 4 Threads ............................................................................................................................... 64

6. 2. 5 Utilities ................................................................................................................................ 65

6. 3 Discussion ................................................................................................................................... 65

Chapter 7 Testing .................................................................................................................................. 67

7. 1 Testing the Robotic Arm Control Board ....................................................................................... 67

7. 2 Testing the Touch Screen Application .......................................................................................... 69

7. 3 Test Cases ................................................................................................................................... 72

7. 4 Discussion ................................................................................................................................... 77

Chapter 8 Discussion of Results ............................................................................................................. 79

8. 1 Problems Encountered ................................................................................................................ 81

Chapter 9 Conclusion............................................................................................................................. 83

9. 1 Recommendations ...................................................................................................................... 84

References ............................................................................................................................................ 87

Appendices ........................................................................................................................................... 92


List of Nomenclatures

3D – Three Dimensional

ABS – Anti-lock Braking System

ACK – Acknowledgement

ACU – Airbag control unit

ADI – Analog Devices Incorporated

API – Application Programming Interface

ARM – Advanced RISC Machine (RISC – Reduced

Instruction Set Computer)

BOM – Bill of Materials

BRP – Baud Rate Prescaler

CAL – CAN Application Layer

CAN – Controller Area Network

CCW – Counter Clockwise

CiA – CAN in Automation

CNFx – Configuration Register x

CRC – Cyclic Redundancy Check

CS – Chip Select

CSMA/CR – Carrier Sense Multiple Access/ Collision Resolution

CTS – Clear to Send

CW – Clockwise

DC – Direct Current

DLC – Data Length Code

DLL – Data Link Layer

ECU – Electronic Control Unit

EOF – End of Frame

EXIDE – Extended Identifier Flag

GND – Ground

GNU – GNU’s Not UNIX

GPIO – General Purpose Input/ Output

GPL – General Public License

GPxCON – General Purpose control register for

port x

GUI – Graphical User Interface

IAR – Ingenjörsfirman Anders Rundgren,

(Anders Rundgren Engineering Company)

IDE – Integrated Development Environment

IDE – Identifier Extension Bit

ISO – International Organization for Standardization

JTAG – Joint Test Action Group

LCD – Liquid Crystal Display

LED – Light Emitting Diode

LLC – Logic Link Control

LSB – Least Significant Bit

MAC – Medium Access Control

MISO – Master In Slave Out

MOSI – Master Out Slave In

MSB – Most Significant Bit


NAK – Negative Acknowledgement

NBR – Nominal Bit Rate

NRZ – Non Return to Zero

OD – Object Dictionary

OSC – Oscillator

OSI – Open Systems Interconnection

OST – Oscillator Start-up Timer

PADS – PCB schematic and layout program

PC – Personal Computer

PLL – Phase Lock Loop

PLLCON – PLL control

POSIX – Portable Operating System Interface (for UNIX-based systems)

POWCON – Power control register

PS1/PS2 – Phase Segment 1/2

PWM – Pulse Width Modulation

PWMCON1 – PWM control register 1

PWMxCOM1/2/3 – PWM compare registers

1/2/3 for pair x

PWMxLEN – PWM length registers for pair x

RAM – Random Access Memory

REC – Receive Error Counter

RTR – Remote Transmission Request

RTS – Request to send

SAE – Society of Automotive Engineers

SI – Serial In

SO – Serial Out

SOF – Start of Frame

SPI – Serial Peripheral Interface

SPICLK – SPI clock

SPICON – SPI control register

SPIRX – SPI Receiver Register

SPITX – SPI Transmit Register

TDMA – Time Division Multiple Access

TEC – Transmit Error Counter

TQ – Time Quanta

TTCAN – Time Triggered CAN

UART – Universal Asynchronous Receiver/Transmitter

UML – Unified Modeling Language

UNIX – UNiplexed Information and Computing Service (originally spelled Unics)

VCC – Positive Input Voltage

VIO – Voltage Input/ Output

VISOIN – Isolation Surge Voltage In

VISOOUT – Isolation Surge Voltage Out

XCLKI – External clock in

XCLKO – External clock out


List of Figures

Figure 1-1 The project set-up ................................................................................................................... 2

Figure 1-2 BF548 ..................................................................................................................................... 2

Figure 1-3 The showcase demonstration set-up ...................................................................................... 3

Figure 2-1 The OSI reference model[28] .................................................................................................. 8

Figure 2-2 Layered Architecture of CAN[29] ............................................................................................ 9

Figure 2-3 CAN signals[38] ..................................................................................................................... 11

Figure 2-4 CAN Standard Frame Format[10] .......................................................................................... 12

Figure 2-5 CAN bit timing[40] ................................................................................................................ 13

Figure 2-6 Node Status Transition Diagram[29] ...................................................................................... 15

Figure 2-7 The Linux Kernel[53] ............................................................................................................. 19

Figure 2-8 System matrix in TTCAN [21] ................................................................................................. 25

Figure 4-1 Manufactured board (front) .................................................................................................. 33

Figure 4-2 JTAG connector ..................................................................................................................... 35

Figure 4-3 Manufactured board (back) .................................................................................................. 36

Figure 4-4 The structure of a CAN node[72] ............................................................................................ 37

Figure 4-5 ADuC7026 evaluation board ................................................................................................. 38

Figure 4-6 ADuC7060 evaluation board ................................................................................................. 39

Figure 4-7 ADuC7128 set-up ................................................................................................................... 40

Figure 4-8 The CAN_RESET command ..................................................................................................... 42

Figure 4-9 Servo motor control[75] ........................................................................................................ 48

Figure 5-1 Use Case 1: the main menu ................................................................................................... 51

Figure 5-2 Use Case 2: the control screen .............................................................................................. 52

Figure 5-3 Use case 3: the routines screen ............................................................................................. 52

Figure 5-4 Original application design .................................................................................................... 53


Figure 5-5 The state machine ................................................................................................................. 54

Figure 5-6 Final application design ......................................................................................................... 55

Figure 5-7 Touch screen application sequence diagram ......................................................................... 57

Figure 6-1 BlackFin BF548 ezkit .............................................................................................................. 62

Figure 6-2 State machine sample code ................................................................................................... 64

Figure 7-1 Robotic Arm set-up ................................................................................................................ 68

Figure 7-2 Robotic Arm using DC motors set-up with BlackFin device for testing purposes ..................... 70


List of Tables

Table 4-1 Servo motor CAN commands .................................................................................................. 49

Table 5-1 The button layout ................................................................................................................... 56

Table 7-1 CAN messages ........................................................................................................................ 71

Table 8-1 Microcontroller comparison ................................................................................................... 80


Chapter 1 Introduction

An introduction to the company Analog Devices[1] will be given, along with a discussion of where the idea for this project originated. The aims of the project and its outline will also be discussed. Finally there will be an overview of the topics discussed throughout this report.

1. 1 Analog Devices

Analog Devices Incorporated (ADI) is a semiconductor devices company. Ray Stata and Matthew Lorber

founded Analog Devices in 1965 in Cambridge, Massachusetts. It is the “world leader in high

performance signal processing”.[1] There are offices worldwide. Today, Analog Devices operates

Ireland's largest semiconductor R&D center in Limerick[1], where over 1,000 employees work in both manufacturing and research and development. The author worked

with Analog as a Co-Operative Education Student gaining the knowledge and experience needed to

pursue this project. A new transceiver was being developed for a communication network called CAN

(Controller Area Networking[2]). There was a lot of work going into marketing these new transceivers.

The parts ADM3052 and ADM3053 are the first signal and power isolated CAN transceivers in a single

surface mount package on the market[3]. Analog wanted to demonstrate the capabilities of their new

CAN transceiver to their customers in a portable solution; this led to the need for the development of a

showcase demonstration model.

1. 2 Aims & Objectives

The aim of this project was to create a showcase demonstration for Analog Devices, to show the use of

ADI’s new CAN transceiver, the ADM3053[4]. The ADM3053 is a signal- and power-isolated CAN transceiver

with the latest iCoupler® technology[5] for use in the industrial and automotive segments. Figure 1-1

shows an example of the CAN network which has been developed for this project.

This project required a Robotic Arm[6] with a control board, and a BlackFin BF548[7] processor (Figure 1-2) with an integrated touch screen, communicating over CAN. These two devices

needed to communicate over the CAN bus to showcase the functionality of this CAN transceiver; each

device had to have a CAN transceiver to process the messages being received and transmitted. On the

Robotic Arm node the ADM3053 CAN transceiver was required, and the BlackFin BF548 node contained

an integrated CAN transceiver for this functionality.


Figure 1-1 the project set-up

Figure 1-2 BF548


1. 3 Project Outline

This showcase demonstration entailed the use of servo motors[8] to provide absolute control of the

Robotic Arm. By using the servo motors an accurate home position can be obtained. Once the motor is

given the signal it will remain in that position until the power is removed. Evaluation work was required

on different types of servo motors to see if the servo would be able to move the arm. To control the

Robotic Arm the ADuC7128[9] microcontroller from Analog has been chosen. The MCP2515[10] is the

CAN controller from Microchip which is being used. This is connected to the microcontroller through

the SPI peripherals[11]. The CAN controller then sends the CAN signals to a CAN transceiver, the

ADM3053, ADI’s newest release.
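The MCP2515 is driven by sending instruction bytes over SPI. As a minimal sketch of this interaction, the following builds the byte sequence for a single-register write. The opcode values (RESET, READ, WRITE) are taken from the Microchip MCP2515 datasheet; the helper function itself is illustrative and not taken from the project code.

```c
#include <stddef.h>
#include <stdint.h>

/* MCP2515 SPI instruction opcodes, per the Microchip datasheet. */
#define MCP2515_RESET 0xC0u
#define MCP2515_READ  0x03u
#define MCP2515_WRITE 0x02u

/* Build the SPI byte sequence for a single-register write:
 * instruction byte, register address, then the data byte.
 * Returns the number of bytes placed in buf (buf must hold 3 bytes). */
size_t mcp2515_build_write(uint8_t *buf, uint8_t reg_addr, uint8_t value)
{
    buf[0] = MCP2515_WRITE;
    buf[1] = reg_addr;
    buf[2] = value;
    return 3;
}
```

The resulting bytes would be clocked out with chip select held low for the whole transfer, which is the framing the CAN_RESET command mentioned later also relies on.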

CAN signals can be transmitted across a cable of approximately 25 m to demonstrate the

capabilities of the CAN transceiver. At the other side of the cable is the BlackFin, BF548-ezkit[12],

embedded processor. This kit includes an integrated touch screen, which is being used to control the

Robotic Arm. A control application has been developed and is being run on this device. This kit has CAN

transceivers built onto the board; one of these CAN transceivers is directly connected to the ADM3053

across the 25m cable. The network that has been developed can be seen in Figure 1-3, note only a short

cable is used in this picture.

1-3 The showcase demonstration set-up


The operating system[13] that is being used on the BlackFin is µClinux[14]. A version of this

operating system had to be built with the drivers and interfaces required for the control

application that was developed. Once the correct version of the operating system was on the board,

work got underway in developing the control application.

Servo motors are used to control the movement of this Robotic Arm as they are very accurate

and high precision was required. The PWM (pulse width modulation)[15] in the microcontroller was

used to send the correct signals to the servo motors. Using the touch screen interface the Robotic Arm

is controlled to perform a variety of tasks: a dance routine, picking up an object and moving it to a required position, or simply moving the motors on the Robotic Arm in a clockwise or anticlockwise

direction.
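Servo control of this kind typically uses a PWM frame repeated every 20 ms, with the high-time of the pulse (nominally 1.0 to 2.0 ms) setting the shaft angle. The limits and the 0 to 180 degree range below are assumed defaults for illustration, not the project's calibrated values:

```c
#include <stdint.h>

/* Assumed hobby-servo timing: 20 ms frame, 1000-2000 us pulse width
 * mapped linearly onto 0-180 degrees. Real servos vary at the end stops. */
#define SERVO_MIN_US  1000u
#define SERVO_MAX_US  2000u
#define SERVO_MAX_DEG 180u

/* Map an angle in degrees to the PWM high-time in microseconds,
 * clamping out-of-range requests to the end stop. */
uint32_t servo_pulse_us(uint32_t angle_deg)
{
    if (angle_deg > SERVO_MAX_DEG)
        angle_deg = SERVO_MAX_DEG;
    return SERVO_MIN_US +
           (angle_deg * (SERVO_MAX_US - SERVO_MIN_US)) / SERVO_MAX_DEG;
}
```

On the microcontroller, the returned value would be loaded into the PWM compare register so the hardware holds the pulse width, and the servo holds its position, without further CPU involvement.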

1. 4 Overview

Initially the focus of the report was on the components specified by Analog for use within the design. An in-depth outline of CAN, its advantages and adaptations, will be given in the literature review along

with its use in embedded real time systems. The CAN standard[16] was developed to allow similar

devices to communicate over a network. It is a message based protocol. It was originally designed for

automotive applications[17] but now it is also used in medical equipment and in industrial automation.
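The message-based nature of CAN can be illustrated with a minimal representation of a standard data frame: an 11-bit identifier plus up to eight data bytes. The struct and helper below are invented for illustration only and are not taken from the project code:

```c
#include <stdint.h>
#include <string.h>

/* Illustrative sketch of a standard CAN data frame:
 * an 11-bit identifier and 0-8 bytes of payload. */
struct can_msg {
    uint16_t id;      /* 11-bit identifier, 0x000-0x7FF */
    uint8_t  dlc;     /* data length code, 0-8 */
    uint8_t  data[8];
};

/* Fill in a frame. Returns 0 on success, -1 if the identifier
 * or payload length is out of range for a standard frame. */
int can_msg_init(struct can_msg *m, uint16_t id,
                 const uint8_t *payload, uint8_t len)
{
    if (id > 0x7FFu || len > 8u)
        return -1;
    m->id  = id;
    m->dlc = len;
    memset(m->data, 0, sizeof m->data);
    memcpy(m->data, payload, len);
    return 0;
}
```

A frame like this carries no destination address; every node on the bus receives it and filters on the identifier, which is what makes CAN a message-based rather than address-based protocol.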

This report then demonstrates the inception, elaboration, construction and transition of the

project. A discussion into the model used as a guideline will be given in the methodology, Chapter 3.

Here the different phases applied throughout the project will be discussed and how they impacted the

final outcome.

Various microcontrollers have been evaluated to see which one would provide the best control

for this project; these details will be discussed in the Hardware Design & Implementation section,

Chapter 4. This section will also include the schematic drawn up for this project and how nodes on a

CAN bus should be laid out. An outline will be given of the functionality of the chosen microcontroller and the set-up required for the CAN controller to operate correctly.

A separate chapter will be given for each of the Software Design and the Software Implementation, Chapter 5 and Chapter 6 respectively. These sections will go through how the application was designed

and how the software was implemented in both the robot control board and the BlackFin BF548 ezkit. A


selection of UML diagrams will be given in the Software Design section to outline the operations of the

control application which was developed.

A section on testing will discuss the different testing methods throughout the project and why

certain methods proved more important than others. Unit testing was an important part of this project.

This meant that each unit would be tested before moving on to the next step in the project. Some test

cases used throughout the project will be outlined. This section can be found in Chapter 7.

A microcontroller[18] operates much like a computer; it has memory, a core processor and

programmable input/output peripherals, integrated on a single chip. Microcontrollers are used in

embedded systems. In this project a microcontroller will be used to send control signals to the Robotic

Arm. The Discussion of Results section, Chapter 8, will go through the different microcontrollers

evaluated and why the one that was chosen proved to be the best fit for the project. This section will

also go through how each part of the project was validated before the whole project was put together.

Recommendations of possible future work are included in the Conclusion section, Chapter 9.

This section mentions the important topics discussed throughout the project and why the project was a

success. The recommendations, Section 9.1, provide a taste of the capabilities the Robotic Arm has and of the wide scope there is for this project; the limitation of time prevented these possibilities from being pursued. Some examples of what the Robotic Arm has been seen to do are given; one option is the Robotic Arm playing a game of draughts[19].


Chapter 2 Literature Review

Topics studied throughout the course of the project are discussed here and their relevance to the

project is also mentioned. In-depth detail is given about the CAN protocol and its usefulness in

embedded and Real-Time Systems[20]. Different timing requirements associated with CAN are

mentioned and an outline of some CAN protocols for the application layer is also given. Embedded

systems are mentioned along with the background information needed to use and program embedded

devices. A discussion into real time embedded devices is given and the new adaptation of CAN, Time

Triggered CAN[21] is mentioned for real-time requirements. Finally there is a summary of all the points

mentioned and how they are relevant to the project.

2. 1 Controller Area Network

Controller Area Network, CAN, is a data link layer protocol in communication systems. It was originally

developed for use in the automotive segment, as an embedded control system in passenger cars[17]. It

is a bus standard designed to allow communication between devices such as microcontrollers without having a host computer. CAN is a message-based protocol. The original development of the CAN bus was launched at Robert Bosch GmbH in 1983, and it was officially released at the Society of Automotive Engineers (SAE)[22] congress in 1986 in Detroit, Michigan.[23, 24]

Early on in the specification stage of the new serial bus system engineers from Mercedes-Benz

got involved. Intel also got involved as they would be the main semiconductor vendor for the new parts.

A Professor from Germany called Dr. Wolfhard Lawrenz was hired as a consultant. He gave the new

network protocol the name “Controller Area Network”. Two months ahead of schedule in 1987, Intel

delivered the 82526[25], the first CAN controller chip, the first hardware implementation of the CAN

protocol. Philips Semiconductors were the next company to deliver a CAN controller, the

82C200[26].[23]

The Bosch CAN specification was submitted in the early 1990’s[16]. Early in 1992 an

international users and manufacturers association called CAN in Automation (CiA), was established[27].

The CAN Application Layer (CAL)[23] was specified by the CiA. It was intended that CAL would bridge

the gap between the underlying communication support and the distributed application process. CAL

was application independent, so by itself it was not successful; a suitable profile had to be developed by each user for his/her specific application field.[23, 24]


2. 2 The CAN protocol

The physical layer and part of the data link layer in the OSI model[28] are defined by the CAN protocol.

The OSI Reference Model can be seen below in Figure 2-1. The lower two layers are where the CAN

protocol sits. The CAN protocol is defined in ISO 11898 by the International Organization for Standardization[29-31]. There is also a conformance test defined in ISO 16845 for the CAN protocol[32]. This

conformance test guarantees the interchangeability of CAN chips.

Figure 2-1 The OSI reference model[28]

The physical layer deals with the transmission of bits including bit timing and synchronization.

This network is based on a shared-bus topology; this means that to suppress signal reflections resistors

are used to terminate the bus at each end. CAN is a multi-master data bus. It uses Carrier Sense Multiple Access with Arbitration on Message Priority (CSMA/CD+AMP) to determine which node gets access to the bus. The

communication is asynchronous.[33, 34]

The data link layer (DLL) is split into two; the lower half of the DLL is known as the Medium

Access Control (MAC) and the upper half is known as the Logic Link Control (LLC). The MAC is there so

as to avoid collisions. The different functions of this layer include frame encoding and decoding,


arbitration and error checking. The LLC is there to provide a proper interface to the application

programs running in the upper layers. Figure 2-2 shows the layered architecture of CAN.[24, 29]

Figure 2-2 Layered Architecture of CAN[29]

2. 2. 1. 1 Bit Encoding and Synchronization

Controller Area Networking is a very robust and reliable communication network. Non Return to Zero

bit coding, NRZ, is used in CAN. In NRZ encoding there are two signal levels; bit 0 is represented by a

high level and bit 1 by a low level. The level remains constant over the full time slot. One bit is

represented by one time slot. To avoid a loss of synchronization in the signal due to successive 1’s or 0’s

a technique called bit stuffing is used. In the bit stuffing technique that the CAN protocol utilizes, a complementary bit is inserted after five consecutive bits of equal value. At the receiver these bits have to be un-stuffed so the original data is processed. This technique is also a mechanism for detecting

errors at bit level.[24, 34, 35]

When CAN sends an error flag (discussed further in section 2. 2. 1. 3 ), it will consist of 6 bits of

the same polarity, this will depend on the state of the CAN node. If the node is in the error passive state

the error flag consists of 6 1’s, on the other hand if the node is in the error active state the error flag

consists of 6 0’s. Due to these bit patterns being used to signal errors, these patterns must be avoided


in the data part of the CAN frame. This is another reason for bit stuffing. A bit of the opposite polarity is

transmitted whenever 5 bits of the same polarity are transmitted.[20, 24, 34]
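The stuffing rule above can be sketched in C, the language used throughout this project. This is an illustrative sketch only, not code from the project; the function name and the one-bit-per-byte representation are choices made here for clarity.

```c
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch of CAN bit stuffing: after five consecutive bits of
 * equal value a complementary bit is inserted, and the stuffed bit starts a
 * new run. Bits are held one per byte (0 or 1) for clarity; out must have
 * room for roughly n + n/4 bits. Returns the stuffed length. */
size_t can_bit_stuff(const uint8_t *in, size_t n, uint8_t *out)
{
    size_t j = 0;
    uint8_t last = 2;  /* impossible bit value: the first bit starts a run */
    int run = 0;

    for (size_t i = 0; i < n; i++) {
        out[j++] = in[i];
        if (in[i] == last) {
            run++;
        } else {
            last = in[i];
            run = 1;
        }
        if (run == 5) {                    /* five equal bits: stuff one */
            last = (uint8_t)(1u - last);
            out[j++] = last;
            run = 1;                       /* stuffed bit starts a new run */
        }
    }
    return j;
}
```

The receiver performs the inverse operation, dropping the bit that follows every run of five equal bits; a run of six equal bits can then only be an error flag.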

Bit stuffing increases the length of the CAN frame thus increasing the maximum transmission

time of the CAN frame. The maximum transmission time C_m of a CAN message m containing s_m data bytes is given by the following equation[36]:

C_m = (g + 8s_m + 13 + ⌊(g + 8s_m − 1) / 4⌋) τ_bit        Equation 2-1

For the standard CAN frame format g = 34, and g = 54 for the extended frame format. The floor notation returns the largest integer not greater than its argument. The transmission time for a single bit is given by τ_bit. Using this information, for the standard CAN frame format Equation 2-1 simplifies down to[36, 37]:

C_m = (55 + 10s_m) τ_bit        Equation 2-2

For an extended CAN frame format it simplifies to[36, 37]:

C_m = (80 + 10s_m) τ_bit        Equation 2-3
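Equation 2-1 and its simplifications can be checked with a short C function. This is a sketch written for this discussion; the function names are not from any CAN driver, and the result is returned in bit times so it can be scaled by any τ_bit.

```c
/* Worst-case transmission length of a CAN frame in bit times (Equation 2-1):
 * C_m = g + 8*s_m + 13 + floor((g + 8*s_m - 1) / 4), where s_m is the number
 * of data bytes, g = 34 for the standard frame format and g = 54 for the
 * extended format. Multiply the result by tau_bit to obtain a time. */
unsigned can_frame_bits(unsigned data_bytes, int extended)
{
    unsigned g = extended ? 54u : 34u;
    unsigned payload_bits = 8u * data_bytes;

    return g + payload_bits + 13u + (g + payload_bits - 1u) / 4u;
}

/* Convenience wrapper: worst-case transmission time in microseconds. */
double can_frame_time_us(unsigned data_bytes, int extended, double bitrate_bps)
{
    return can_frame_bits(data_bytes, extended) * 1e6 / bitrate_bps;
}
```

For an 8-byte standard frame this gives 55 + 10 × 8 = 135 bit times, i.e. 270 µs at 500 kbps, in agreement with Equation 2-2.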

CAN Signals

The CAN bus has two signals, CAN_H and CAN_L, a high and a low signal shown in Figure 2-3[38]. Here it

can be seen that the CAN_H signal is the red line, and the CAN_L signal is the blue line. For this project

the differential voltage will be 3.3 V; this is the difference in voltage between the high voltage of CAN_H

and the low voltage of CAN_L. The CAN bus has two states, dominant and recessive. The recessive state

is when both CAN_H and CAN_L float at the middle voltage. The dominant state is when the signals are

at their extremities, i.e. the differential voltage.[34]


Figure 2-3 CAN signals[38]

2. 2. 1. 2 CAN frame formats, bit timing and arbitration

The CAN protocol has two frame formats; a base frame (Figure 2-4), and an extended frame. This allows

the transmission of either an 11-bit or 29-bit message identifier. The 11-bit message ID has priority over

the 29-bit message ID. The CAN protocol also has four types of frames; a data frame, a remote frame,

an error frame and an overload frame. The data frame is used to send information over the network.

Each data frame consists of a Start of Frame bit (SOF) and the 11-bit message identifier for the base frame format, with a further 18-bit identifier extension for the extended frame format. The next part of the frame is the

control bits (RTR, IDE & reserve), a 4-bit data length counter, DLC, the data field, the 15-bit CRC, the

delimiter and ACK and the end of frame which is 7-bits in length.[24, 34, 35]
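The field layout just described can be summarised as a C structure. This mirrors the logical fields only, not the on-wire bit packing, and the struct and function names are this sketch's own rather than those of any particular CAN driver.

```c
#include <stdbool.h>
#include <stdint.h>

/* Logical fields of a CAN data frame as described above. */
struct can_frame_fields {
    uint32_t id;       /* 11-bit base identifier, or 29 bits when ide is set */
    bool     rtr;      /* remote transmission request: set for remote frames */
    bool     ide;      /* identifier extension: set for the extended format  */
    uint8_t  dlc;      /* data length code: number of data bytes, 0..8       */
    uint8_t  data[8];  /* data field: dlc bytes are valid                    */
    uint16_t crc;      /* 15-bit cyclic redundancy check                     */
    bool     ack;      /* acknowledgement slot: dominant when acknowledged   */
};

/* Basic range checks on the identifier and data length code. */
bool can_frame_valid(const struct can_frame_fields *f)
{
    if (f->dlc > 8)
        return false;
    if (!f->ide && f->id > 0x7FFu)        /* 11-bit identifier range */
        return false;
    if (f->ide && f->id > 0x1FFFFFFFu)    /* 29-bit identifier range */
        return false;
    return true;
}
```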

While the CAN bus is idle it remains in the recessive state. The start of frame bit is a dominant

start bit, due to the bus being in the recessive state while idle; any transition from idle to dominant is

considered the start of the frame. The next part of the CAN frame being sent is the message ID, this is

used in the arbitration process so each message ID must be unique to the network. The control bits

consist of a remote transmission request, RTR, which is used to discriminate between data and remote

frames. A dominant value, a zero bit, denotes a data frame and a recessive value, a 1 bit, denotes the

remote frame. Due to the data frame having a dominant value in the RTR field, this message has priority

over the remote frame having the same identifier. The IDE is the identifier bit which allows the choice

of the base frame or extended frame format. It is recommended that these are left as 0, and considered

reserved bits, in CANopen networks. The Data Length Code, DLC, is used to specify how many data

bytes will be transmitted in this frame. The data field is transmitted next and it contains as many data

bytes as specified in the previous field. The 15-bit CRC follows the data field and then the remaining


control bits are the CRC delimiter, the acknowledgement slot and its delimiter. A delimiter is used to

allow the nodes some time to react to the previous bits. Finally the data frame ends with a sequence of

7 consecutive recessive bits which notifies all nodes on the bus of the end of an error-free

transmission.[10, 24, 34]

Nodes in the network are notified of errors on the bus with error frames. There are two fields in

error frames; an error flag followed by a delimiter. There are two types of error flags, an active error

flag and a passive error flag. The active error flag contains six dominant bits while a passive error flag

contains six recessive bits. The active error flag violates the bit stuffing rules or the fixed-format parts of

the frame that is being exchanged; this enforces an error condition which is picked up by all nodes on

the network. Each node that detects an error can transmit an error flag of its own; this means that there can be from 6 to 12 dominant bits on the bus. The delimiter is then made up of 8 recessive bits.

Each node sends recessive bits after the transmission of an error flag and monitors the bus level at the

same time until a recessive bit is detected. When the recessive bit is detected the node sends the 7

remaining recessive bits which completes the error delimiter.[24, 34]

Remote frames are similar to data frames only they contain no data. Overload frames are used

to slow down the operations on the network. They are sent by slow receivers; an extra delay is added between consecutive data and remote frames. The overload frames have a similar format to

the error frames. The frame contains an overload flag followed by a delimiter.[24, 34]

Figure 2-4 CAN Standard Frame Format[10]


Each CAN message has a unique message identifier. This serves for both the purpose of identity

as well as for arbitration and collision avoidance. The CAN bus uses bit-wise arbitration; each message

has a unique priority, i.e. the message ID. Each node senses the network and must wait until the bus is

free before they can begin transmission. The transmission begins with the sending of the message ID.

As the node is sending the message ID it compares the bit being sent with the bit on the bus. If at any

stage this is different the node will stop transmission to avoid collisions with another CAN frame. This

lets the node know if the transmission of the bit was successful or if it was overwritten; writing a 0, the

dominant state, overwrites a 1, the recessive state. Unlike in other networks, instead of producing a jamming sequence when a collision occurs, contention is resolved by priority. The message with the highest priority wins arbitration and its node can transmit. The process is constantly repeated so any

message that lost arbitration in the last cycle will have the chance to send again, the message with the

next highest priority will be transmitted the next time around. This entire process is implemented in

hardware, no software intervention is required. This mechanism is known as collision avoidance rather

than collision detection as collisions do not happen in CAN.[20, 24, 34, 36, 39]
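The arbitration mechanism can be illustrated with a toy simulation in C. The bus behaves as a wired-AND, so a dominant 0 from any node overwrites a recessive 1, and each node backs off as soon as the bus level differs from the bit it sent. This is a sketch for illustration only, limited here to 11-bit identifiers and 32 nodes.

```c
#include <stddef.h>
#include <stdint.h>

/* Toy model of CAN bitwise arbitration over 11-bit identifiers, sent MSB
 * first. Returns the index of the winning node, which for unique identifiers
 * is always the node with the numerically lowest (highest priority) ID. */
size_t can_arbitrate(const uint16_t *ids, size_t n_nodes)
{
    int active[32] = {0};
    for (size_t i = 0; i < n_nodes && i < 32; i++)
        active[i] = 1;

    for (int bit = 10; bit >= 0; bit--) {
        /* The bus level is the AND of all transmitted bits: 0 is dominant. */
        int bus = 1;
        for (size_t i = 0; i < n_nodes; i++)
            if (active[i] && ((ids[i] >> bit) & 1u) == 0)
                bus = 0;

        /* A node that sent recessive but reads dominant loses and backs off. */
        for (size_t i = 0; i < n_nodes; i++)
            if (active[i] && (int)((ids[i] >> bit) & 1u) != bus)
                active[i] = 0;
    }

    for (size_t i = 0; i < n_nodes; i++)
        if (active[i])
            return i;
    return 0;  /* not reached for a non-empty set of unique identifiers */
}
```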

Each CAN controller typically has its own clock. These clocks may drift with respect to other

clocks on the bus. On each message transmission the CAN nodes resynchronize. The node that starts to

transmit first will send a start of frame bit; every node must synchronize on the leading edge of this bit.

Figure 2-5 shows the CAN bit timing structure. There are 4 segments in each bit time; the

synchronization segment, the propagation segment, the phase segment 1 and phase segment 2.

Registers can be set up so as to allow a certain number of time quanta for each segment. Generally the sample point is between the 1st and 2nd phase segments. This will be discussed further in Section 4.7.2, the Bit Rate Calculation.[33, 34, 40]

Figure 2-5 CAN bit timing[40]
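The bit-time arithmetic can be sketched as follows. The structure and field names here are generic assumptions made for this sketch; real CAN controllers expose these quantities through device-specific registers, and the register-level set-up for this project is covered in Section 4.7.2.

```c
/* One CAN bit time = sync_seg (always 1 time quantum) + prop_seg +
 * phase_seg1 + phase_seg2, with the sample point between phase segments
 * 1 and 2. A sketch only; field names vary between CAN controllers. */
struct can_bit_timing {
    unsigned tq_hz;       /* time-quantum rate: controller clock / prescaler */
    unsigned prop_seg;    /* propagation segment, in time quanta             */
    unsigned phase_seg1;  /* phase buffer segment 1, in time quanta          */
    unsigned phase_seg2;  /* phase buffer segment 2, in time quanta          */
};

static unsigned tq_per_bit(const struct can_bit_timing *t)
{
    return 1u + t->prop_seg + t->phase_seg1 + t->phase_seg2;
}

/* Resulting bit rate in bits per second. */
unsigned can_bit_rate(const struct can_bit_timing *t)
{
    return t->tq_hz / tq_per_bit(t);
}

/* Sample point position as a percentage of the bit time. */
unsigned can_sample_point_pct(const struct can_bit_timing *t)
{
    return 100u * (1u + t->prop_seg + t->phase_seg1) / tq_per_bit(t);
}
```

For example, an 8 MHz time-quantum clock with prop_seg = 5, phase_seg1 = 6 and phase_seg2 = 4 gives 16 time quanta per bit, i.e. a 500 kbps bit rate with the sample point at 75% of the bit time.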


2. 2. 1. 3 Detection of errors in CAN and fault tolerance

There are three fundamental approaches to error control in Data Communication. The open-loop

approach is where the receiver does not send feedback to the transmitter about the transmission. The

closed-loop approach involves the feedback about erroneously received frames/packets. A combination

of these two approaches is known as the hybrid approach. The CAN protocol has error checking in both

the physical and data link layers. The error detection is achieved by means of transmitter-based monitoring, bit stuffing, Cyclic Redundancy Check and message frame check.[24, 34]

In the physical layer there are two error detection mechanisms; monitoring and bit stuffing.

When a CAN message is transmitted the station also monitors the bus level. This allows the station to

compare the bits sent to the bits received and thus detect differences; this gives reliable detection of both errors local to the transmitter and global errors. Bit stuffing prevents a loss of synchronization in

messages with consecutive bit values as mentioned in Section 2. 2. 1. 1 . This bit stuffing only occurs

from the SOF to the CRC sequence.[24, 34]

The data link layer has three mechanisms for error detection. The first one is the CRC, Cyclic

Redundancy Check. When a frame is being transmitted a 15-bit wide CRC is added to the end of the

frame by the transmitter. At the receiver the CRC is reevaluated and then compared to the transmitted

one. If the CRC matches, the receiver pulls the acknowledgment bit to the dominant state to confirm

that the frame was received correctly. Every node on the network performs this task. If one node

detects a mismatch the message is destroyed for every node and has to be resent. The reliability of this

network is very high due to the 15-bit checksum covering frames with up to 8 data bytes. The Hamming distance is 6, which means up to 5-bit errors can be detected.[24, 34]
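The CRC computation can be sketched bit-serially in C, using the CAN generator polynomial x^15 + x^14 + x^10 + x^8 + x^7 + x^4 + x^3 + 1 (0x4599). This follows the shift-register procedure described in the CAN specification, but the function itself is illustrative and takes its input one bit per byte for clarity.

```c
#include <stddef.h>
#include <stdint.h>

/* Bit-serial CAN CRC-15 over the protected part of the frame (SOF up to the
 * end of the data field). The register is shifted left once per input bit,
 * and the generator polynomial 0x4599 is XORed in whenever the bit shifted
 * out of position 14 differs from the incoming bit. */
uint16_t can_crc15(const uint8_t *bits, size_t n)
{
    uint16_t crc = 0;

    for (size_t i = 0; i < n; i++) {
        uint16_t crcnxt = (uint16_t)(bits[i] ^ ((crc >> 14) & 1u));
        crc = (uint16_t)((crc << 1) & 0x7FFFu);
        if (crcnxt)
            crc ^= 0x4599u;
    }
    return crc;
}
```

The receiver's check can then be phrased compactly: running the same computation over the protected bits followed by the 15 received CRC bits must leave the register at zero.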

Frame check is where the fields can be checked against their expected values due to the fixed-

format fields in CAN frames. The EOF field, as well as the CRC and ACK delimiters, has to be at the

recessive level. If any illegal bit is detected an error frame is generated. The transmitting node also

checks to see if the ACK bit has been set in the received frame. If this bit is recessive an error is

generated.[24, 34]

A node on the network may not be functioning correctly. Fault confinement is used to prevent

this node from constantly sending corrupt frames and blocking the entire network. This fault

confinement unit supervises the correct operation of the MAC sublayer. If the node becomes

defective it is disconnected from the bus by this unit. To prevent faulty nodes generating incorrect


checksums CAN implements 3 error states; error active, error passive and bus off. The error active and

passive states perform as normal on the network but they handle error conditions differently. The error

active state sends active error flags and the error passive state sends error passive flags. An error

passive state for a node means that errors have already occurred and this node should avoid interfering

with the network operations.[24, 34]

Two counters are used by the fault confinement unit to track the behavior of the node with

respect to the transmission errors. These two counters are the transmission error count (TEC) and the

receive error count (REC). Whenever there is an error detected on the bus the counters are increased

by a given amount, these counters are decreased by one when there are successful exchanges. The

increment value is higher the more severe the error. The nodes that detect the error have a higher

increase than the nodes that reply to an error flag. This means that for the faulty nodes, the counters

increase more quickly than that of the correctly functioning nodes. These counters have thresholds, the

first threshold is 127. If this is exceeded the node is switched to the error passive state. Finally, if the next threshold, 255, is exceeded, the node is switched to bus off. This means that the node shuts down

and can no longer transmit or receive any frame on the network. The node has to be reset and

reconfigured before it can be switched back to the error active state; self-recovery is not possible. Figure 2-6 shows an example of the CAN node’s possible error states.[24, 29, 34]

Figure 2-6 Node Status Transition Diagram[29]
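The counter thresholds and state transitions above can be sketched as a small C state machine. The increment rules themselves (how much TEC or REC grows for each kind of error) are defined by the CAN specification and are left to the caller here; only the 127 and 255 thresholds from the text are modelled, with bus off driven by the transmit error counter.

```c
enum can_error_state { ERROR_ACTIVE, ERROR_PASSIVE, BUS_OFF };

struct can_node_counters {
    unsigned tec;                  /* transmit error counter */
    unsigned rec;                  /* receive error counter  */
    enum can_error_state state;
};

/* Re-evaluate the error state after the caller has adjusted the counters.
 * A node that has reached bus off stays there: self-recovery is not
 * possible, a reset and reconfiguration are required. */
void can_node_update_state(struct can_node_counters *n)
{
    if (n->state == BUS_OFF)
        return;

    if (n->tec > 255)
        n->state = BUS_OFF;
    else if (n->tec > 127 || n->rec > 127)
        n->state = ERROR_PASSIVE;
    else
        n->state = ERROR_ACTIVE;
}
```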


2. 2. 2 CANopen

CANopen[41] is a high level protocol originally developed by Bosch as a CAL-based profile for embedded networking. In the latest specification, CANopen no longer refers explicitly to CAL. The aim of CANopen was to interact with field devices in a standardized way by providing a well-defined and usable set of primitives. CANopen is a communication protocol and

device profile specification for embedded systems used in automation. CANopen is built on top of CAL,

using a subset of CAL services and communication protocols; it provides an implementation of a

distributed control system using the services and protocols of CAL. It does this in such a way that nodes

can range from simple to complex in their functionality without compromising the interoperability

between the nodes in the network.[34, 41]

CANopen defines an object model which describes the behavior of devices which allows devices

from different manufacturers to be interoperable and interchangeable. CANopen is mostly used in

embedded systems. This protocol enables the designer to combine proprietary CAN nodes and

CANopen compliant nodes into one network. CANopen can be tailored towards a specific application; it

can be easily customized or extended. This is because CANopen comes with a lot of mandatory

functionality and the designer can choose which functionality to include. There is only a small amount

of mandatory functionality which has to be included. CANopen is also expandable. Future functionality

is tolerated; manufacturers can implement functionality that is not yet available.[34, 41]

The Device Object Dictionary (OD) is the central concept in CANopen. CANopen basically

standardises the description of device functionality by means of this OD. It is an ordered group of

objects addressed using a 16-bit index. An 8-bit sub-index is defined as well to allow individual elements

of structures of data to be accessed. For every node in the network there exists an OD. The OD contains

all parameters describing the device and its network behaviour. CANopen is defined in the form of

documents describing profiles. CANopen also has a standard description of a device and its

configuration in the form of ASCII files.[34, 41]
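The 16-bit index / 8-bit sub-index addressing can be sketched with a minimal C structure and lookup. Real CANopen stacks additionally store data types, access rights and PDO-mapping information; the entry layout and names below are this sketch's own.

```c
#include <stddef.h>
#include <stdint.h>

/* Minimal object dictionary entry: each object is addressed by a 16-bit
 * index, with an 8-bit sub-index selecting an element within the object. */
struct od_entry {
    uint16_t index;     /* 16-bit object index                */
    uint8_t  subindex;  /* 8-bit sub-index within the object  */
    uint32_t value;     /* object data, simplified to 32 bits */
};

/* Linear lookup of an entry; real stacks index or sort the dictionary. */
const struct od_entry *od_find(const struct od_entry *od, size_t n,
                               uint16_t index, uint8_t subindex)
{
    for (size_t i = 0; i < n; i++)
        if (od[i].index == index && od[i].subindex == subindex)
            return &od[i];
    return NULL;
}
```

Reading a standardised object, such as the device type at index 0x1000 sub-index 0, then becomes a single lookup call.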

2. 2. 3 CAN Kingdom

CAN Kingdom is a communications protocol running on top of CAN. It is designed as a fieldbus. Unlike

other CAN high level protocols, no attempt is made to follow the OSI model. The network is mostly

distributed, as the nodes may run autonomously. However a "King" or master controller is needed to

configure the network. The system designer is fully aware of the capabilities of the nodes, while on the


other hand a node designer needs to know nothing about the other nodes; it is up to the system

designer to activate a node's services or not through runtime configuration.[42, 43]

The “King” or master controller configures which nodes will receive and transmit which

messages. It provides an infrastructure for transmitting fixed format data that runs over a single CAN

packet payload length. Point to point transfer of data streams, through the "Block Transfer" mechanism

is allowed. Runtime mapping of CAN identifiers with an optional second level of indirection is accepted.

The first level, from CAN identifier to "folder" must be mapped by the King. The second level, "folder" to

"document", can optionally be fixed by the node designer. Remapping of messages can occur at

runtime; this means the system can be configured to coexist with nodes on the system which may be

using a different high level protocol like CANopen or DeviceNet. CAN Kingdom will also provide a

mechanism for clock synchronisation. Synchronous, event driven, and “daisy chain” messaging is

provided along with a mechanism to set message filters and a specification for packet bit fields.[42, 43]

2. 2. 4 DeviceNet

DeviceNet[44] is a communication protocol used in the automation industry for the interconnection of

control devices for data exchange. It uses CAN as the backbone technology. An application layer is

defined to cover a range of device profiles. Typical applications of DeviceNet include safety devices,

information exchange, and large I/O control networks.[45]

In Layer 1 (Physical Layer) nodes are distributed along the network by means of a trunkline-

dropline topology. This trunkline-dropline topology allows for ease in wiring and access to the network

from multiple taps. It is possible for nodes to be easily removed and added to increase network

flexibility, decrease troubleshooting time and also reduce production downtime. The communication

power and device power can share the same bus because the physical layer is optically isolated from the

device.[44, 45]

In the Data Link Layer (Layer 2) the CAN bus is used as DeviceNet’s backbone network. This means that minimal bandwidth is required to transmit and package messages. Also, a smaller processor may be selected in the design of the device due to the compact standard CAN frame format and the ease with which the processor can parse the data.[44, 45]


2. 3 Embedded Systems

The communication connection between microcontrollers in an embedded system is known as

embedded networking. Microcontrollers and microprocessors are embedded devices. Embedded

networks often use distributed control. This means that there is no master node. Instead each node

has the intelligence on its own to decide what to do with the data it has. CAN is a multi-master network

which means that there is not just one master node. Messages to be transmitted are broadcast over the

whole network; the node which the frame is meant for will take in the frame, and the others will

discard it. For this reason CAN is useful in Embedded Systems.[34]

Embedded systems can be divided into two main parts. One category contains the “high-

volume electronics” found in everyday consumer products. On the other hand there is the “low-volume

specialty electronics”; these run specific control tasks, in which the microcontrollers or microprocessors used

tend to be more expensive and thus have greater power and memory.[34, 46]

CANopen tends to be used in Embedded Systems. As CANopen is very flexible, the amount of CPU time required for handling the communication is dependent on the functionality

implemented. The C programming language[47] is the most used embedded programming language,

while UNIX systems are used as a host environment for this embedded programming[34, 48, 49].

2. 3. 1 UNIX Systems & the C Programming Language

An operating system is a software environment which provides a virtual machine to abstract the

underlying hardware; it is also known as a resource manager. The operating system sits on the

underlying hardware and provides the user with a simpler environment to work under. The main

component in most computer operating systems is the kernel. It is a bridge between the data processing

done at hardware level and the applications at user level. The primary function of the kernel is to

manage the computer’s resources and to allow other programs to run and use these resources.[13, 50,

51]

The Linux kernel is written in the C programming language, and so are a lot of programs which

make up a Linux distribution. UNIX was developed in Bell labs[52] in 1969 by Ken Thompson and Dennis

Ritchie. They re-wrote UNIX in 1973 in the new programming language called C. Dennis Ritchie developed the C language, deriving it from the B language that Ken Thompson had created. This

language was created to make programming the UNIX operating system easier than programming it in


assembly. The C language is portable which means it can be moved between systems. The program can

be created on one system and compiled on another. C is a low level language which is very powerful

and flexible. Linus Torvalds developed the operating system Linux, a UNIX-like system, beginning in 1991. His kernel was released as Linux 1.0 in 1994, under the GNU GPL license. An illustration of the Linux

kernel is shown below in Figure 2-7.[47, 50, 52, 53]

Figure 2-7 The Linux Kernel[53]

2. 3. 2 CAN in real-time embedded systems

A real time system is one which must deliver a task in a timely manner, within a timing constraint. Not

only does a system need to run a control law with time constraints, it must also schedule

communications. A Real-Time System has to deal with sending and receiving messages according to

deadlines. Timing constraints for real-time systems can be divided into two types; hard and soft. Hard

real time systems are safety critical, if they do not finish within their timing deadline serious

consequences may result. Soft real time systems on the other hand have deadlines but it is not critical

that they meet them. Many embedded systems are hard real-time systems. Deadlines of jobs in an

embedded system are typically derived from the required responsiveness of the sensors and actuators

monitored and controlled by it.[54]

Real-time embedded systems are critical in automobiles. In modern vehicles there could be up

to 80 electronic control units (ECUs). An ECU is an embedded system which deals with the control of one or more subsystems in a motor vehicle. The number of networked ECUs in cars such as BMW and

Mercedes went from around 5 up to 40 from the year 2000 onwards. Nowadays almost all of the cars in


Europe are equipped with at least one CAN bus. There are many examples of ECUs; some common control units are as follows[36, 40, 55]:

• Engine control unit

• Airbag control unit (ACU)

• Door control unit

• Power-train control unit

• Transmission control unit

• Speed control unit

• Brake control module (ABS or ESC)

Some of these ECUs form independent subsystems, but others need to communicate, and this communication is essential. The way for the ECUs to communicate is over an embedded network. In automotive applications high speed networks connecting chassis and power-train ECUs are

typically provided by CAN. These networks are generally 500 kbps. Low speed networks also use CAN,

for example climate control.[36]

Real-time constraints

When messages are sent over the CAN bus in automotive applications they are referred to as signals. A

lot of these signals have real-time constraints linked with them. An example would be a user of the

vehicle pressing the brake pedal. The brake pedal has a switch associated with it and the position of this

switch can be read by the ECU. Information that the brake pedal has been applied is sent over CAN as a

signal from this ECU to other ECU’s associated with the brake pedal. The ECU for controlling the brake

lights receives the signal and switches the brake lights on. This is done within a few tens of milliseconds

of the brake being pressed.[36]

Real-Time Analysis of Ideal CAN

Assume that all messages have deadlines less than or equal to their periods (D_m ≤ T_m). For Real-Time systems the worst case response time R_m of a message is defined as the longest time from the start of

that event until the time at which it is received by the nodes that require it. The message is schedulable


if and only if the worst case response time of a message is less than or equal to its relative deadline

(R_m ≤ D_m). A system is schedulable if and only if all messages in the system are schedulable. The worst case response time of a message is defined by[20, 36, 37, 39, 56]:

R_m = J_m + w_m + C_m        Equation 2-4

The queuing jitter J_m corresponds to the longest time between the initiating event and the message being queued, ready for transmission. w_m is defined as the queuing delay of the message[20, 36, 37, 39, 56]:

w_m^(n+1) = B_m + Σ_{∀k ∈ hp(m)} ⌈(w_m^n + J_k + τ_bit) / T_k⌉ C_k        Equation 2-5

C_m is the maximum transmission time seen in Equation 2-1, and B_m is the worst case blocking time given by the equation[20, 36, 37, 39, 56]:

B_m = max_{∀k ∈ lp(m)} (C_k)        Equation 2-6

The set of messages with lower priority than the message m is given by lp(m); similarly, the set of messages with higher priority than the message m is given by hp(m).[20, 36, 37, 39, 56, 57]
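Equations 2-4 to 2-6 can be evaluated with a short fixed-point iteration in C. This is an illustrative sketch of the analysis, not project code: messages are assumed sorted by priority (index 0 highest, so hp(m) is the set of lower indices and lp(m) the higher ones) and all times are integers in units of τ_bit.

```c
#include <stddef.h>

struct can_msg {
    long C;  /* maximum transmission time, Equation 2-1 */
    long T;  /* period          */
    long J;  /* queuing jitter  */
};

static long ceil_div(long a, long b)
{
    return (a + b - 1) / b;   /* ceiling of a/b for positive a, b */
}

/* Worst case response time R_m (Equation 2-4), or -1 if the queuing delay
 * does not converge within the message's period. */
long can_response_time(const struct can_msg *msg, size_t n, size_t m)
{
    const long tau_bit = 1;

    /* Equation 2-6: blocking by the longest lower-priority frame. */
    long B = 0;
    for (size_t k = m + 1; k < n; k++)
        if (msg[k].C > B)
            B = msg[k].C;

    /* Equation 2-5: iterate w_m until it reaches a fixed point. */
    long w = B, prev = -1;
    while (w != prev) {
        prev = w;
        w = B;
        for (size_t k = 0; k < m; k++)  /* k in hp(m) */
            w += ceil_div(prev + msg[k].J + tau_bit, msg[k].T) * msg[k].C;
        if (w > msg[m].T)
            return -1;
    }

    /* Equation 2-4: R_m = J_m + w_m + C_m. */
    return msg[m].J + w + msg[m].C;
}
```

For three messages, each with C = 10 and T = 100 bit times and no jitter, this gives response times of 20, 30 and 30 bit times: the highest priority message is delayed only by blocking, while each lower priority message also waits behind the higher priority traffic.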

2. 3. 3 Microcontrollers and Digital Signal Processors

A microcontroller is a device designed for a wide range of applications. It integrates memory, I/O, buses and peripherals in addition to a processing unit. Some of the primary architectures in the embedded microcontroller space are ARM, MIPS, and x86, and most vendors have recently started offering versions of ARM as embedded cores. The microcontrollers outlined in this project use an ARM7TDMI core, which views all memory and registers as a single linear array[9].[46]

The Digital Signal Processor (DSP) market is one of the fastest growing sectors of the computational semiconductor market. These processors focus on the very efficient execution of arithmetic operations in tight loop-oriented kernels, which distinguishes them from general-purpose microcontrollers. A DSP contains components such as data memory, program memory, I/O and a computational engine. The BlackFin device outlined in this project has an architecture based upon a 10-stage RISC MCU/DSP pipeline with a mixed 16-/32-bit Instruction Set Architecture designed for optimal code density[58].[46, 59]


2. 4 Other In-Vehicle Networks

There are many alternative in-vehicle networks available, such as FlexRay, Local Interconnect Network (LIN), Motorola Interconnect (MI) and Time-Triggered Protocol (TTP). These other networks may be used as sub-networks of CAN in vehicles. Due to a lack of user experience with these protocols, and their comparatively high cost, it may be desirable for system developers to continue to use CAN where this is practical.

2. 4. 1 FlexRay

FlexRay[60] is a high-speed serial communication system for in-vehicle networks developed by the FlexRay Consortium. It uses both synchronous and asynchronous communication and is designed to be faster than CAN and TTP, but is much more expensive. An Unshielded Twisted Pair (UTP) or Shielded Twisted Pair (STP) cable is used, and it operates at speeds of up to 20 Mbps. The FlexRay bus defines the Physical layer and the Protocol under separate specifications. FlexRay is a fault-tolerant bus and provides deterministic data transmission at baud rates between 500 kbps and 10 Mbps with a 24-bit CRC.[60, 61]

It has both time- and event-triggered behavior. Electronic systems communicate continually in pre-defined time slots on a data bus. The physical layer, whether electrical or optical, uses Non Return to Zero (NRZ) bit encoding for data communication. The use of NRZ encoding ensures compact messages with a minimum number of transitions and high resilience to external disturbance. The network topologies supported range from point-to-point connections, via linear passive buses and passive stars, up to active star topologies. Applications for FlexRay include steer-by-wire and brake-by-wire. However, the bus has its own disadvantages, such as lower operating voltage levels and asymmetry of the edges, which lead to problems in extending the network length. FlexRay may eventually be phased out, with Ethernet taking its place where high-speed data transfers are required in vehicles.[60, 61]

2. 4. 2 Local Interconnect Network

Local Interconnect Network (LIN) is used as an in-vehicle communication and networking serial bus between intelligent sensors and actuators, operating at +12 V. The LIN bus is a small, slow network system used as a cheap sub-network of a CAN bus to integrate intelligent sensor devices or actuators in today's cars. LIN may also be used over the vehicle's battery power-line with a special DC-LIN transceiver. Typical body-electronics applications include air conditioning systems, doors, seats, steering column, climate control, switch panels, intelligent wipers, and sunroof actuators.[61, 62]


The LIN specification covers both the physical layer, the data link layer, and the transmission medium. The maximum communication speed on a LIN bus is 19200 baud, and a termination pull-up resistor of 1 kΩ is required. A maximum cable length of 40 m is possible at this data rate. The LIN bus uses a Master/Slave approach, having one master and one or more slaves. The LIN bus does not need to resolve bus collisions because only one message is allowed on the bus at a time. LIN is not a full replacement for the CAN bus, but it is a good alternative wherever cost is critical and high speed or bandwidth is not required.[61, 62]
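The collision-free property follows from the master's schedule: the single master steps through a fixed table of frame slots, so only one frame is ever solicited at a time. The sketch below is purely illustrative; the frame IDs and slot times are assumptions, not values from the LIN specification.

```c
#include <stddef.h>
#include <stdint.h>

/* One entry of a hypothetical LIN schedule table. */
typedef struct {
    uint8_t  frame_id;   /* which slave response the master solicits  */
    uint16_t slot_ms;    /* time reserved for header plus response    */
} ScheduleEntry;

static const ScheduleEntry schedule[] = {
    { 0x10, 10 },  /* e.g. door switch status   */
    { 0x21, 10 },  /* e.g. seat position sensor */
    { 0x32, 20 },  /* e.g. wiper actuator       */
};

/* Return the frame ID whose slot is active at time t_ms, wrapping
   around one full schedule cycle (40 ms with the table above). */
uint8_t lin_active_frame(uint32_t t_ms)
{
    uint32_t cycle = 0;
    for (size_t i = 0; i < sizeof schedule / sizeof schedule[0]; i++)
        cycle += schedule[i].slot_ms;

    uint32_t t = t_ms % cycle;
    for (size_t i = 0; ; i++) {          /* t < cycle, so this terminates */
        if (t < schedule[i].slot_ms)
            return schedule[i].frame_id;
        t -= schedule[i].slot_ms;
    }
}
```

Because the master alone decides which slot is active, no two nodes ever contend for the bus, which is why LIN needs no arbitration hardware.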

2. 4. 3 Motorola Interconnect

Motorola Interconnect (MI) is a serial communications interface using a single line from one master to as many as 8 slave devices. The MI bus is used to control smart switches, motors, sensors, and actuators. It may also be used as an automotive bus to drive mirrors, seats, window lifts or headlight levelers. The MI interface uses only one wire to send and receive data at a data rate of 20 kbps.[61]

The MI bus utilizes a push/pull sequence to transfer data between the master and the slaves. The master sends a push field to the slave devices connected to the bus. This field contains data, plus the address of one of the slaves. The addressed slave then responds to the received data, transmitting the pull field on the MI bus to the master. The data returned by the slave is typically status bits representing internal or external information. The MI bus has two states: Dominant (state 0), which is represented by a maximum of 0.3 V, and Recessive (state 1), which is represented by +5 V. Each device on the bus has a 10 kΩ pull-up resistor to +5 V. A termination resistor of 600 Ω is used to stabilize the bus.[61]
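The push field described above can be sketched as a packed byte: a 3-bit address is enough to select one of the 8 slaves. The bit layout here is an illustrative assumption only, not the actual MI bus frame format.

```c
#include <stdint.h>

/* Hypothetical model of a master's push field: a slave address
   (0..7, matching the 8-slave limit) plus a small data payload. */
typedef struct {
    uint8_t slave_addr;  /* 0..7 */
    uint8_t data;        /* command for the addressed slave */
} PushField;

/* Pack the push field into one byte: address in the top 3 bits,
   a 5-bit payload below it (assumed layout, for illustration). */
uint8_t mi_pack_push(PushField f)
{
    return (uint8_t)(((f.slave_addr & 0x07u) << 5) | (f.data & 0x1Fu));
}
```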

2. 4. 4 Time-Triggered Protocol

Time-Triggered Protocol (TTP) is a dual-channel time-triggered field bus. Redundant communication is supported by replicating data on both channels. TTP provides autonomous fault-tolerant message transport at known times and with minimal jitter. A TDMA (Time-Division Multiple Access) strategy on replicated communication channels is employed for this purpose. Data communication in TTP is organized in TDMA rounds which are divided into slots. Each node in the communication system has one slot in which it must send frames every round. A frame of between 2 and 240 bytes in length is allocated to a node; each frame can carry several messages. A 24-bit CRC is used to protect the data.[63]

TTP offers fault-tolerant clock synchronization that establishes the global time base without relying on a central time server. All nodes share a common notion of time as a result of the clock


synchronization. Each node measures the difference between the expected and the observed arrival time of a correct message. This allows the node to learn about the difference between the sender's clock and its own. A fault-tolerant average algorithm periodically calculates a correction term for the local clock from these measurements. As a result each clock is kept in synchrony with all the other clocks in the cluster.[63]
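A minimal sketch of such a fault-tolerant average, assuming the node has already measured the clock deviations of its peers: the k largest and k smallest values (suspected faulty clocks) are discarded and the rest are averaged into a correction term. Both the sort and the choice k = 1 are implementation assumptions.

```c
#include <stddef.h>

/* Fault-tolerant average: sort the measured deviations (in place),
   drop the k extreme values at each end, average the remainder.
   Requires n > 2*k. */
double fta_correction(double *dev, size_t n, size_t k)
{
    /* simple insertion sort so the extremes sit at the ends */
    for (size_t i = 1; i < n; i++) {
        double v = dev[i];
        size_t j = i;
        while (j > 0 && dev[j - 1] > v) { dev[j] = dev[j - 1]; j--; }
        dev[j] = v;
    }
    double sum = 0.0;
    for (size_t i = k; i < n - k; i++)   /* drop k from each end */
        sum += dev[i];
    return sum / (double)(n - 2 * k);
}
```

Discarding the extremes is what makes the average fault-tolerant: a single node with a wildly wrong clock cannot drag the whole cluster's correction term with it.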

2. 5 Time triggered CAN

TTCAN is a high-level protocol that sits on top of the CAN data link layer. It allows time-triggered operations as well as event-driven operations to coexist in the same network. TTCAN was introduced by Bosch in 1999. In vehicles, data traffic must usually be both event-triggered and time-triggered. In order not to have to develop two different network systems, ISO defined the TTCAN protocol[30]. The aim was to make CAN suitable for the new needs of the automotive industry. The real-time performance of CAN-based in-vehicle networks increases with the use of TTCAN.[21, 24]

No additional overhead in the CAN frame is needed for TTCAN. A node called the time master keeps the network synchronized by regularly broadcasting a reference message. All participants of the TTCAN network identify the reference message by its identifier. As soon as its SOF is recognized, the local time unit is synchronized. The individual TTCAN participants are configured to know when to send their frames after having received the reference frame.[21, 24]

There are two levels of implementation in TTCAN, Level 1 and Level 2. The basic time-triggered operations over CAN are implemented in Level 1, where the sending of CAN frames between two reference frames is triggered by the local time units. Level 2 enables high-end synchronization by maintaining a global system time across the whole network; to account for time differences between nodes, a time-stamp is included in the reference frame. After receiving the reference frame at least twice, the nodes can resynchronize according to the time-stamp. A synchronization accuracy of 1 µs is achievable as a result.[21, 24]

Transmission of data in TTCAN is organized as a sequence of basic cycles. The reference message is the first message at the beginning of each basic cycle. This is followed by a fixed number of time windows, each configured as one of three types: exclusive windows, arbitration windows, or free windows. An exclusive window is reserved for a predefined message, so collisions cannot occur. These windows are used for safety-critical data, sent


deterministically without jitter. Arbitration windows rely on the non-destructive CAN arbitration scheme to resolve any collisions that occur; these windows are not predefined with any message. The free windows are reserved for future expansion. Figure 2-8 shows an example of a basic cycle containing each of the defined windows.[21, 24]

Figure 2-8 System matrix in TTCAN [21]
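The window types of a basic cycle can be modeled in a few lines of C. The cycle layout, the CAN IDs and the `may_transmit` helper below are illustrative assumptions, not part of the TTCAN specification.

```c
/* Purely illustrative model of one TTCAN basic cycle. The reference
   message opens the cycle; each following window is exclusive (one
   pre-assigned message, collision-free), arbitration (normal CAN
   arbitration resolves contention) or free (reserved for expansion). */
typedef enum { WIN_REFERENCE, WIN_EXCLUSIVE, WIN_ARBITRATION, WIN_FREE } WinType;

typedef struct {
    WinType type;
    int     can_id;   /* reserved message ID; -1 where none is assigned */
} TimeWindow;

static const TimeWindow basic_cycle[] = {
    { WIN_REFERENCE,   0x000 },   /* broadcast by the time master      */
    { WIN_EXCLUSIVE,   0x101 },   /* safety-critical, jitter-free slot */
    { WIN_ARBITRATION, -1    },   /* event-driven traffic competes     */
    { WIN_EXCLUSIVE,   0x102 },
    { WIN_FREE,        -1    },   /* reserved for future expansion     */
};

/* May an ordinary node holding message `id` transmit in window `w`?
   (Reference windows belong to the time master, so they return 0.) */
int may_transmit(const TimeWindow *w, int id)
{
    switch (w->type) {
    case WIN_EXCLUSIVE:   return w->can_id == id;  /* only the owner        */
    case WIN_ARBITRATION: return 1;                /* anyone; CAN arbitrates */
    default:              return 0;                /* reference/free windows */
    }
}
```

This makes the jitter argument concrete: in an exclusive window exactly one sender is ever eligible, so the safety-critical message can never be delayed by arbitration.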

Forestry vehicles and forklifts often use the higher-layer protocol CANopen, so there are plans to add TTCAN-specific objects to the CANopen standard. CANopen networks are also used for separate vehicle add-ons such as firefighting equipment and cranes. CiA members are working on extending the CANopen standard to enable the usage of TTCAN hardware within CANopen networks.[21]

Only small, inexpensive changes to current CAN chips are necessary to implement TTCAN. Transmit and receive triggers are required, along with a counter to manage the cycle time and ensure time-triggered operation. Level 1 can be implemented either in software or in hardware to reduce the burden on the processor. Modified hardware is necessary for Level 2 because the controllers must allow drift correction and calibration of the local time.[24]

Two additional blocks are needed in the modules: the trigger memory and the frame synchronization entity. Time-triggered communication is controlled by the frame synchronization entity, while the trigger memory holds the time marks of the system matrix, linked to the message buffers, in the controller's memory.[24]


2. 6 Discussion

A good knowledge of CAN is needed to implement this project. The Controller Area Network (CAN) protocol was originally introduced for automotive applications but is now also widely used in industrial areas such as automation and instrumentation. CAN is easy to use and provides extensive hardware support for error detection and error recovery. As a consequence of its popularity and widespread use, most modern microcontroller families now have one or more members with on-chip hardware support for this protocol, such as the BlackFin BF548 ezkit used in this project. As a result CAN networks can now be implemented at very low cost.[40, 56, 57]

The most important feature of CAN from the real-time perspective is its predictable behaviour.

CAN provides means for prioritized control of the transmission medium. By using an arbitration

mechanism it is guaranteed that the highest priority message that enters arbitration will be transmitted

first. This makes CAN a very robust and reliable network.[20, 24, 34]

Applicable knowledge of microcontrollers and DSPs is also necessary, as microcontrollers are an integral part of the project; knowledge of their functionality and capabilities is very important. Knowledge of DSPs becomes relevant when work gets underway with the BlackFin device. Since this is a CAN-based system built from embedded microcontrollers, it is also necessary to understand the real-time constraints associated with these devices. The CAN messages sent across the network have real-time constraints, and the corresponding motor movement is needed within these time constraints.[20, 46]

Recently FlexRay and TTP have come to the fore. These protocols have been designed to amend the deficiencies of CAN outlined above. Both feature the ability to communicate over multiple redundant communication channels, integrated bus guardians, and support for hard time-triggered communications. Although either of these protocols may be used as the backbone of automotive networks, CAN still remains an attractive option for low-cost embedded systems.[57]

One of the main limitations of CAN is its lack of support for time-triggered communications. Although CAN was primarily intended to support event-triggered communications between

unsynchronized nodes, time-triggered communication may be enforced. A number of hardware- and

software-based protocol extensions and modifications have been proposed to enable time-triggered

communications on CAN. These extensions tend to rely on the use of a global clock that supports a time


division multiple access (TDMA) message schedule. The time-triggered CAN (TTCAN) protocol uses a

synchronization method to provide time-triggered operation of CAN at the hardware level. This allows

the combination of both event- and time-triggered messages.[24, 57]

Even though CAN has been around for over 20 years, it is still at the beginning of global market penetration. This bus system could continue to grow for a further ten years. The US and Far Eastern car manufacturers have only started to use CAN in the serial production of their vehicles in recent years. In addition, CAN is being used not just in passenger cars but also in domestic appliances and in entertainment.[21]

Higher-layer protocols will see several enhancements concerning approval for different safety-relevant and safety-critical applications; a German professional association has already approved CANopen-Safety. Overall, CAN will remain the backbone network in many vehicles for some years to come, and with CAN being used in industries such as instrumentation and automation, many more CAN networks are likely to be implemented in everyday electronics. The enhancements available for CAN mean that it will become more widely used, and because these enhancements are compatible with the original CAN, it will remain in use.


Chapter 3 Methodology

The project involved two phases, one for each of the CAN nodes on the bus. Each phase was worked on

separately until both were functioning correctly as separate units. There was both a hardware and

software component in this project. A software engineering model called the Rational Unified

Process[64] was used as a guideline for the implementation of this project. This life cycle model has four

phases and was initially developed as a guide to UML (Unified Modeling Language)[65]. UML is a way of visualizing the behaviors and interactions of an object-oriented system. Although UML is generally used for object-oriented systems and the system in this project is a functional one, the diagrams still provide a useful visual aid. Use cases, a sequence diagram and a state diagram, all of which are UML diagrams, are used in Chapter 5, the Software Design chapter. Class diagrams will not be used.

The four phases of this model are Inception, Elaboration, Construction and Transition. In the Inception phase the project plan is put in place and the objectives of the project are outlined. The system is developed and risks are addressed in the Elaboration phase. The Construction phase involves the development, testing and integration of components. Finally, the Transition phase is the deployment of the project: the beta release.[64]

3. 1 Inception

The author was an integral part of the team that developed a previous demonstration with Analog Devices, which showcased the pre-release of the new CAN transceiver, the ADM3053. It had been a success and there was a lot of interest in the part. Analog needed a new and improved demonstration for the Embedded World Exhibition & Conference[66] in Germany in February 2012.

The previous demonstration involved a Robotic Arm with DC motors and a PC application to control their movement[67]. It was a trivial Robotic Arm, which required the GPIO (general purpose input/output) pins to be connected to a motor driver and then to the DC motors. One port was required to turn the motor in a clockwise direction and another port to turn the motor in the opposite direction.

A new demonstration which was very accurate and easily set up was required. Analog wanted something other than a laptop to control a more accurate Robotic Arm, and this is where the idea for this project originated. Research was carried out into the possible motors that could be used in a Robotic Arm. The Arm also had to be easily obtainable, which is


why a Robotic Arm with servo motor control was chosen. Although stepper motors are generally used for Robotic Arms due to their fail-safes and accuracy[68], servo motors also offered adequate performance.

3. 2 Elaboration

Research was done into the many Robotic Arms available; DC motors, servo motors and stepper motors were some of the options. After researching the benefits of each of the motors and finding an easily available Robotic Arm[69] using them, an arm was chosen.

A microcontroller with the functions to operate the servo motors had to be chosen. This required the analysis of various microcontrollers and testing of their functionality until a solution was found. The microcontroller also had to communicate with the CAN controller being used; this was another requirement fulfilled when choosing the microcontroller. Various microcontrollers were evaluated to decide which fitted the project best and gave the correct functionality for each part. A discussion of the microcontroller's required functionality is given in section 4. 3.

There was a touch screen device available from Analog Devices which ran the µClinux[70] operating system and had built-in CAN functionality. The CAN node controlling the movement of the Robotic Arm needed this CAN functionality, and a touch screen interface was required to operate the control application. This is why the BlackFin device was chosen.

3. 3 Construction

The hardware design and implementation of the Robotic Arm control board was the first step in this project. This step also involved some low-level programming to give each of the components its required functionality. First, connecting the CAN controller to the microcontroller was implemented; then controlling the servo motors was put into operation; finally the network was developed and tested. Each of these stages was tested and validated before moving on to the next. This involved a lot of unit testing, design and analysis before each iteration of implementation got underway. Finally, when this CAN node was operational as required in the project outline, the hardware for the board could be developed and manufactured. Chapter 4 deals with the Hardware Design and Implementation of the project.

Software Design and Implementation was needed for the BlackFin device. For this node there were again many iterations in which design, analysis, implementation and testing were done to validate the working solution. Research into this device and the necessary background knowledge had to be gained before any implementation could get underway. The kernel functionality and interfaces had


to be decided upon before building the application. This meant that when the application was developed it could be loaded onto the board and would work correctly, because the correct version of the kernel was running. Finally, the design of the application could be worked on once all the necessary functionality was operational.

3. 4 Transition

The Embedded World Exhibition & Conference[66] in February 2012 was the beta release of this showcase demonstration. It proved to be a success; there was a good response to the part and the demonstration will be used again. Analog are going to use the demonstration at another exhibition in the United States of America in April. They also intend to bring the demonstration on customer visits to demonstrate the use of the part in a quick and portable solution.

A training day in Analog also took place in March, at which the demonstration was used. New graduates and interns will also be able to avail of this project; it will be useful for them to learn the workings of each of the components and how they interact. The main core of the demonstration has been developed by the author, and there is the possibility of adapting other functionality for future demonstrations.


Chapter 4 Hardware Design & Implementation

This chapter reviews the schematics that had to be drawn to manufacture the Robotic Arm control board. A few comments are given on the need for separate voltage supplies, due to the servo motors drawing high current, and on the need for decoupling capacitors and terminating resistors. An outline of how each CAN node has to be set up is given, along with the microcontroller chosen and how the different functionality was implemented. This includes communication between the microcontroller and the CAN controller over the SPI peripherals and the control of the servo motors using the PWM outputs.

4. 1 Board Layout/ Schematic

Before any work could get underway with creating the final board (shown in Figure 4-1 and Figure 4-2), the hardware had to be tested and functioning correctly. Once the hardware was operating as required, the schematic could be finalized. The program PADS[71] was used to create the schematic for this project. The schematics of the microcontroller evaluation boards could be obtained either on the Analog website or by contacting one of the employees responsible for the part. The part could then be saved to the library and drawn into the current project.

Figure 4-1 Manufactured board (front)


The CAN controller, the MCP2515, and the CAN transceiver, the ADM3053, had to be drawn up by creating a new part, setting its package type and naming all the connections. To create a new part for the schematic, first the part package had to be defined. Then the size of the part and the number of pins had to be selected. Next the pins could be labeled in the correct order and their labels inserted. Finally the part decals could be selected and the part saved to the library.

Once the parts were all available in the library, the drawing of the schematic could get underway. The voltages and grounds needed to be connected and the multiple nets tied together. There were different nets for the 5 V sources, the 3.3 V sources and the 6 V source. There were also different ground nets; the ADM3053 had an isolated ground which could not be connected to the main ground net. Then the connections used by all the parts had to be included.

The microcontroller was connected to the CAN controller over the SPI peripherals, so these connections had to be included, the microcontroller being the master and the CAN controller the slave. The MOSI pin on the microcontroller had to be connected to the SI pin on the CAN controller; similarly, the MISO pin on the microcontroller had to be connected to the SO pin on the CAN controller. There is a CLK (clock), a CS (chip select) and an IRQ (interrupt) on both the microcontroller and the CAN controller which had to be connected as well. There isn't a dedicated pin on the microcontroller for the CAN interrupt, so one of the external interrupt pins was used for this. The CAN controller also had a CANRESET pin which needed to be connected to the microcontroller for a software reset. It was set up so that the CAN controller could be reset using either software or hardware. This was done by connecting the CANRESET pin to a GPIO pin on the microcontroller and connecting a 47 µF capacitor to ground on the microcontroller side; the calculation of this value will be given in section 4.7.3. An oscillator had to be connected to the CAN controller to set the clock speed of that device. It was connected between the OSC1 and OSC2 pins, and two capacitors were also connected from these pins to ground.
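As a sketch of the SPI traffic this wiring carries, the fragment below builds the byte sequence the master shifts out on MOSI for an MCP2515 register write. The RESET, READ and WRITE opcodes come from the MCP2515 data sheet; the helper function itself is illustrative and not part of any vendor library.

```c
#include <stddef.h>
#include <stdint.h>

/* MCP2515 SPI instruction opcodes (from the device data sheet). The
   RESET instruction gives the software reset mentioned above, as an
   alternative to pulling the hardware reset pin low. */
enum {
    MCP_RESET = 0xC0,   /* software reset                               */
    MCP_READ  = 0x03,   /* read register: opcode, address, data shifts out */
    MCP_WRITE = 0x02    /* write register: opcode, address, data shifts in */
};

/* Build the bytes shifted out on MOSI (with CS held low) for a
   register write. Returns the number of bytes placed in buf. */
size_t mcp2515_write_cmd(uint8_t *buf, uint8_t reg, uint8_t value)
{
    buf[0] = MCP_WRITE;
    buf[1] = reg;
    buf[2] = value;
    return 3;
}
```

In use, the master asserts CS, clocks these three bytes out over MOSI, and releases CS; the MCP2515 latches the register value on the final clock edge.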

The CAN controller was connected to the CAN transceiver by connecting the RX and TX pins on both devices. Both the controller and the transceiver needed extra resistors and capacitors, whose values could be calculated from the data sheets of both parts. The CAN transceiver needed terminating resistors, required at each end of the bus to avoid reflection of frames when no node on the bus takes a frame in. Bypass capacitors are required for several operating frequencies. These capacitors are connected between VIO and GND1 for VIO, and between VCC


and GND1 for VCC. The VISOIN and VISOOUT capacitors are connected between VISOOUT and GND2, and between VISOIN and GND2. The capacitor values used are 0.1 µF and 10 µF.

The microcontroller also had connections for JTAG and UART support, as well as a reset button and a serial download button, for when the serial port was used to download code instead of the JTAG interface. Shown in Figure 4-2 is the JTAG connector used to download code to the evaluation board and debug the code as it runs. The UART is the serial port; four pins need to be connected here: Serial In (SI), Serial Out (SO), Request To Send (RTS) and Clear To Send (CTS). Because there are two microcontrollers, much of the schematic had to be repeated, and care had to be taken to label which microcontroller was connected where. A 32.768 kHz oscillator was connected between XCLKO and XCLKI, and two 12 pF capacitors were also connected from these pins to ground.

Figure 4-2 JTAG connector

The servo motors needed special connections: a three-pin header for each servo motor. Each servo needed one pin connected to ground, one to 6 V and finally the signal which gave it its position. This signal wire was connected to the PWM pins on the microcontroller. On both microcontrollers the high PWM pins, i.e. PWM1, PWM3 and PWM5, were connected to the six servo motors.
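The position signal is a pulse-width value. As a hedged sketch, using typical hobby-servo timings (roughly a 1.0 to 2.0 ms pulse repeated every 20 ms at 50 Hz; the exact end-points vary by servo and are an assumption here, not measured values for this arm), an angle maps to a pulse width as follows:

```c
#include <stdint.h>

/* Assumed servo timing constants, in microseconds. */
#define PWM_PERIOD_US  20000u   /* 50 Hz frame             */
#define PULSE_MIN_US    1000u   /* ~0 degrees  (assumed)   */
#define PULSE_MAX_US    2000u   /* ~180 degrees (assumed)  */

/* Map an angle in degrees [0, 180] to a pulse width in microseconds,
   clamping out-of-range requests to the end stop. */
uint32_t servo_pulse_us(uint32_t angle_deg)
{
    if (angle_deg > 180u) angle_deg = 180u;
    return PULSE_MIN_US + (angle_deg * (PULSE_MAX_US - PULSE_MIN_US)) / 180u;
}
```

The PWM peripheral is then configured for a 20 ms period, and the high time of each channel is loaded with the value returned here for the joint's requested angle.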

Two regulators were needed. One regulated the power supply down to 3.3 V for the microcontrollers, the CAN controllers and the bus side of the CAN transceivers. The isolated side of the


CAN transceiver needed a 5 V supply, which was regulated down by the second regulator. The 6 V supply for the servo motors has been given two options: a second power supply can be used, connected straight to the servo motors, or the same power supply can be used for the whole board, in which case a jumper can be moved to take the 6 V supply from this common supply.

Once all the required parts were included, the properties of all the parts had to be added to the schematic so that a Bill of Materials (BOM) could be created from it. This information included the part name, manufacturer, manufacturer number, stock code number from Farnell1, the value of the part, the tolerance and the package type. The properties needed depended on what kind of part it was; some parts did not need all of this information.

A copy of the schematic can be seen in Appendix A. This can be compared against Figures 4-1 and 4-2 to see where the components were laid out. In the schematic there is a separate sheet for each microcontroller, one for the CAN controllers and CAN transceivers, one for the regulators used and finally one for the servo motor connectors. Dimensions of the board were given to the layout team so that the board would fit where specified on the Robotic Arm.

Figure 4-3 Manufactured board (back)

1 Farnell is the supplier Analog Devices used to order components from


4. 2 The structure of CAN bus nodes

Each node on the network must contain a microcontroller connected to a CAN controller and then to a CAN transceiver before the node can be connected to the physical CAN bus. This can be seen in Figure 4-4. In Chapter 1, the project set-up diagram, Figure 1-1, shows this same configuration for CAN nodes. CAN is a bus topology, meaning that each node can be connected by a drop cable to the main bus. Up to 127 nodes can be connected on the bus. The maximum bus length is 40 m and the maximum data rate is 1 Mbps; the maximum number of nodes can be increased if the data rate is reduced.

Figure 4-4 The structure of a CAN node[72]

4. 3 Choosing a microcontroller

This project is based around a showcase demonstration for Analog, so the choice of microcontrollers was limited to Analog parts. Within these constraints, implementation work was done with three different microcontrollers to find which suited the scope of the project best. The first microcontroller chosen was the ADuC7026[73]. This microcontroller provided control of the servo motors over PWM channels. Unfortunately the channels were paired for use with H-bridge motors; in each pair one channel was the inversion of the other, and this could not be changed to use each channel separately. The next step was to choose a shift register which could create six signals, one for each of the motors. This proved to be a very challenging task, as an interrupt had to occur every 100 µs to shift out 1 of the 200 bits of each servo signal. The microcontroller was not capable of doing this while also sending and receiving messages over the CAN line, so a new microcontroller had to be decided upon. Figure 4-5 below shows the ADuC7026 evaluation board used.


Figure 4-5 ADuC7026 evaluation board

The ADuC7060[74] and ADuC7128 were the next microcontrollers looked at. Both of these

microcontrollers offered 6-channel PWM functionality; like the ADuC7026, the channels were in

pairs to operate H-bridge motors, but they had the option to set up the PWM in normal mode.

This meant that all 6 servo motors could be controlled at the same time. Unfortunately the ADuC7128

was not available at the time so both microcontroller evaluation boards were ordered. Testing got

underway with the ADuC7060 evaluation board. The one drawback of using this processor was that

the operating voltage was 2.5 V and a 3.3 V processor was required due to the CAN controller and

transceiver needing the same signal level for the CAN signals, i.e. 3.3 V. The CAN signals had to remain

the same across the network so having a 2.5 V microcontroller would cause the signal levels to be

incorrect and errors would occur when sending messages across the CAN network. Level translators

could be used to rectify this problem so work was carried out on the ADuC7060. Figure 4-6 shows an

example of the ADuC7060 evaluation board.


Figure 4-6 ADuC7060 evaluation board

When the ADuC7128 evaluation board became available, work could be carried out on this board

without the need for extra components such as level translators. The ADuC7060 and the ADuC7128 had

similar functionality and very similar source code, so it was worthwhile to program

the ADuC7060 while the other board was being dispatched. Both the signals on the SPI and

the PWM outputs could be tested. Thus the ADuC7128 evaluation board would be easier to program

and all the similar calculations would have been done already.

Similar to the ADuC7026 there was a problem with the PWM outputs. The channels were in

pairs; one low and one high channel. The length of the period, 20ms, could be set up for each pair and

the length of the pulse could be set up for each channel separately; however, when the high channel

went low the low channel automatically went low as well. This meant that there could not be a

longer pulse length on the low channel, therefore two microcontrollers had to be used together to get

six unique outputs on the PWM pins simultaneously.

Finally it was decided that two ADuC7128 microcontrollers would be needed to control the

Robotic Arm. Two ADuC7128 evaluation boards were then set up; each were loaded with similar

versions of the original code, but modified for the three high channels on each microcontroller to


drive three different motors each. Having two microcontrollers meant that two

extra components had to be added. Each node on the CAN network needed a microcontroller

connected to a CAN controller and then connected to a CAN transceiver before the node could be

connected to the bus. This set up can be seen in Figure 4-7. Here there are two microcontroller boards,

the CAN controllers are soldered to the bottom of these boards and the CAN transceivers are connected

then to these controllers. The CAN transceiver evaluation boards are used for this set-up.

Figure 4-7 ADuC7128 set-up


4.4 Setting up the microcontroller functionality

The functionality of each microcontroller has to be understood in great detail before programming can

take place. Different registers are specified for different functional blocks in the microcontroller. The

calculations and equations used to set up registers are explained here. There is a header file called

mcp2515.h where masks are set so that instead of calculating the hex value to be written to each

register, the bits can be set by simply writing for example, SPICON = bit0 + bit1 + bit6 + bit8 + bit9 +

bit12, instead of writing SPICON = 0x1343. In this header file the function prototypes are declared, the

global variables and the robot definitions. These definitions include message IDs, the home positions

for each motor and the PWM frequency value. This header file can be found in Appendix B along with

the rest of the code used in the Robotic Arm control board.

4.4.1 Serial Peripheral Interface

There are 4 GPIO port pins specified for SPI functionality: SPICLK, /CS, MOSI, MISO. These are the 4 pins

that will be connected to the CAN controller. Port 1 is the GPIO port that can be set up to have SPI

functionality. The GP1CON is the control register for this port. It is set up to have a value of

0x11112222. This sets the port to have UART functionality for pins P1.0 – P1.3 and SPI functionality for

pins P1.4 – P1.7. To set up the microcontroller to have the correct SPI functionality the SPICON control

register needs to be initialized.

SPICON is the SPI control register. In this register the SPI enable bit has to be set, along with the

bit to enable master mode, i.e. the microcontroller is set up to be the master, the CAN controller will be

the slave. The serial clock is set to idle low and the phase is set up so that the clock pulses at the end of

each serial bit transfer. Both of these bits are left clear to allow these settings. The LSB first transfer

enable bit is cleared to allow the transmission of the MSB first. Then the master mode bit is set to allow

the user to initiate transfer with a write to SPITX, the transmit register. The SPITX underflow bit is

cleared to allow the previous data to be retransmitted if the register is underflowing. If new data is

coming into the SPIRX register it overwrites the old data, this is done by setting the SPIRX overflow

overwrite enable bit. The slave select input enable bit is set to enable the output. Loopback mode is

disabled along with slave output enable. Finally the continuous transfer enable bit is set. This bit allows

transfer to continue until no valid data is available in the SPITX register. The Chip select is asserted and

remains low for the duration of each 8-bit transfer until the SPITX is empty. The value that is written to

the SPICON as a result of these settings is 0x1343.


The SPI peripherals are set up to have a 125 kHz clock, which is derived from the internal

clock using Equation 4-1 shown below:

fSERIALCLOCK = fUCLK / (2 × (1 + SPIDIV))    Equation 4-1

The core clock is 10.44MHz and the clock for the SPI is 125 kHz so manipulating the formula a value of

40 is obtained; this is changed to hexadecimal format, i.e. 0x28. The SPIDIV register is assigned this value

and the clock is then set up to run at 125 kHz.

Figure 4-8 shows the signals on each pin of the SPI peripherals for a CAN_RESET command. This

is a transmission over the SPI so the MISO, Master In Slave Out line will stay high as it is not in use. The

MOSI, Master Out Slave In is the data out, this is a value of 0xC0 which will be written to the SPITX, the

transmit register, to provide a reset of the CAN controller. The clock can also be seen in this figure,

along with the chip select, CS, which is active low. This means that when this pin is brought low,

transmission can get underway. Further information about setting up the CAN controller can be found

in section 4.7.

Figure 4-8 The CAN_RESET command


4.4.2 Pulse Width Modulation

PWM is short for pulse width modulation; in this project it is used to control the servo motors which are

connected to a Robotic Arm. There is a PWM control register, PWMCON1, which needs to be set up to

initialize the PWM interface. Firstly the bit has to be set to enable the PWM outputs; then a bit is set to

operate the PWM in standard mode. Finally the clock prescaler bits are set so as to divide the internal

clock down by powers of 2, up to 2^8, i.e. 256. A value of 64 is used to divide down the internal clock

to 640,000 Hz. This means bits 6 and 8 are set in the control register. This gives a value of 0x141 to be

written to the PWMCON1 register. This new clock value defines how precise the timing of the PWM

signals will be.

The PWM pins are arranged as three pairs of outputs; there is one register for each pair to set

up the period of the PWM. There are three other registers for each pair to set up the time at which the

output goes high or low. A count value is used for this purpose: when the timer runs out, the waveform

will either go high or low depending on which register value ran out. The length registers, PWM1LEN,

PWM2LEN and PWM3LEN, are set up to have a 20 ms period. A 20 ms period means a 50 Hz clock; to

work out what value goes into the 16-bit length registers, Equation 4-2 is used as follows:

PWMxLEN = PWMclock / PWMfrequency = 640000 / 50 = 12800 = 0x3200    Equation 4-2

Each PWM pair has PWMxCOM1, PWMxCOM2, PWMxCOM3 registers to control the time at

which the lines go high or low (x being 1, 2 or 3 for each pair). Taking pair 1 as an example, this includes

the PWM1 and PWM2 pins. The low-side waveform, PWM2, goes high when the timer runs out in the

PWM1LEN register; it goes low when the count value in PWM1COM3 runs out or when the high-side

waveform goes low. This puts a restriction on the values being sent to the servo motors; as a result two

ADuC7128 microcontrollers are used. The high side of the PWM pins is used on both microcontrollers.

For the high-side waveform, PWM1 goes high when the count value in PWM1COM1 runs out and it goes

low when the count value in PWM1COM2 runs out. The remaining two pairs operate under the same

conditions. Calculating the value to write to these registers is done using Equation 4-2.


4.5 Power Control

A 32.768 kHz oscillator was connected to the XCLKO and XCLKI pins on the microcontroller. The PLL is

set up to have its default conditions, which provides a stable 41.78 MHz internal clock for the system.

To set this up in the code, the PLLKEY1 register must have 0xAA written to it first; then the PLLCON, PLL

control register, can have the data 0x01 written to it. This value sets up the PLL to use its default

configuration and indicates that an external 32 kHz crystal is being used. Finally the register PLLKEY2 needs

to have 0x55 written to it. Any time the PLL control register needs to be written to, the PLLKEY1 and

PLLKEY2 registers must be written with those values before and after the PLLCON.

The ADuC7128 has a choice of operation modes available to it. This has to be set up on start up

so that the microcontroller is in normal operation mode. Similarly to the PLL control register, the power

control register (POWCON) needs to have two registers written to before and after the POWCON is

written to. These registers are POWKEY1 and POWKEY2; the values 0x01 and 0xF4 must be written to

these respectively. The ADuC7128 can be set up to be in active mode, pause mode, sleep mode, nap

mode or stop mode. The internal clock can be changed by using the last two bits in this register as clock

divider bits. In this project it is set up to have a value of 10.4448 MHz, thus the value written to

POWCON is 0x02, also setting up the microcontroller in active mode.

4.6 Testing the network

At each stage of implementation with the microcontrollers mentioned, testing had to be done to make

sure that the CAN lines were working. Using an oscilloscope, the signals on the different pins of the

microcontroller, CAN controller and CAN transceiver could be checked to make sure the correct CAN

messages were being sent through the network. There is also an existing PC application which was

modified and the board for this was connected to the control board for the Robotic Arm. Here the

messages would be displayed on the PC, which helped in the debugging process. Further information on

testing can be found in Chapter 7.


4.7 Setting up the CAN controller

The CAN controller used was the MCP2515 by Microchip[10]. Appendix B contains the source code for

“SPI.c” which has the code for setting up the SPI functionality. The CAN controller communicates with

the microcontroller over SPI. The CAN bit rate has to be set up along with routines to send and receive

the CAN frames. Special functions have to be set up for different types of reads and for writing values

into the required registers from the microcontroller. The Standard CAN frame is used in this project so

this has to be set up in the code. Message transmission and reception buffers have to be set up with

initial values. Everything needed to initialize and operate this device correctly is done by

following the information given in the data sheet for this part.

4.7.1 Power up and resetting the CAN controller

The CAN controller must be initialized before activation. This can only be done in Configuration mode,

which is automatically selected after power-up. The set-up of all the buffers, etc., must be done here.

Once this is complete normal mode can be entered for standard operation.

An oscillator startup timer (OST) is utilized in the CAN controller. The OST holds the controller in

reset to ensure that the oscillator has stabilized before the internal state machine begins to operate. It

maintains reset for the first 128 OSC1 clock cycles after power-up. To allow for this a delay of 192 µs is

set up in the can_reset() function, Appendix B, before anything is initialized in the controller.

As was mentioned previously in section 4.1, the CAN controller can be reset using either

hardware or software. The CANRESET pin is connected to a GPIO pin on the microcontroller which has

an internal pull-up resistor of 100 kΩ. For a correct reset on this device, it must be held in reset for a

minimum of 2 µs. The following Equation 4-3 is used:

τ = RC    Equation 4-3

By equating τ > 2 µs and R = 100 kΩ it works out that C < 50 µF. A value of 47 µF is chosen as the

capacitor value as a result. The capacitor is then connected to the port pin and then to ground allowing

the device to be reset using hardware where possible.


4.7.2 The bit rate calculation

To set up the CAN bit rate, the CAN bit time must first be understood. Each bit time contains a

synchronization segment, a propagation segment and two phase segments. The sample point must lie

between the two phase segments, shown in Figure 2-2. The CAN bit time is measured in time quanta

(TQ). The TQ is calculated as follows, Equation 4-4:

TQ = 2 × (BRP) × TOSC = 2 × (BRP) / FOSC    Equation 4-4

The BRP is the Baud Rate Prescaler which is used to scale the TQ.

NBR = fbit = 1 / tbit    Equation 4-5

The nominal bit rate is the number of bits transmitted per second. This calculation is shown in Equation

4-5. The Nominal Bit Time is made up of the time of each segment, shown in Equation 4-6.

tbit = tSyncSeg + tPropSeg + tPS1 + tPS2    Equation 4-6

Three registers are used to set up the CAN bit rate. The first register, CNF1 (Configuration 1), sets up the

synchronization jump width and the Baud Rate Prescaler. The next register, CNF2, sets up the Bit Time

Length, the sample point configuration and the length of the propagation segment and the phase

segment 1. CNF3 sets up the Start of frame signal, the wake up filter and the length of the phase

segment 2.

The CAN controller is set up to have an 86,000 bps bit rate. CNF1 is set up with a synchronization

jump width of 1TQ and the TQ is set up as 16/FOSC. FOSC is 11.0592 MHz, the crystal which is used with

the board. This means that the TQ = 1.45 µs, from Equation 4-4. CNF2 sets up that the bus lines are

sampled once at the sample point. The length of PS1 is set up to be 3TQ, and the length of the

propagation segment is set up to be 1TQ. CNF3 sets up PS2 to be 3TQ, disables the Wake-up filter and

enables the CLKOUT pin for clock functionality. The tBIT is then calculated using Equation 4-6: tBIT =

8TQ = 11.57 µs. Finally, using Equation 4-5, the Nominal Bit Rate works out to be roughly 86,000 bps.


4.7.3 Message transmission & reception

The CAN controller has three transmit buffers. Prior to sending the Start of Frame bit, the priority of

each of these buffers is compared; the buffer with the highest priority will be sent first. To initiate

transmission first the register has to be written to via the SPI command, then the SPI request to send

command is sent and finally the transmit buffer request to send pin is put low for the particular buffer

that is being transmitted.

There are two receiver buffers available in the CAN controller, RXB0 and RXB1. RXB0 is the

higher priority buffer. It can be set up so that RXB0 messages will roll over and be written to RXB1 if

RXB0 is full.

4.8 The Servo Motors

A servo motor can typically turn between an angle of 0° and 180°. This is sufficient for the

Robotic Arm control. To operate a servo motor a signal, voltage and ground wire must be used. The

servo motor can operate generally between 4.8V - 6V. The signal level for the servo is usually around

3.5V - 5V. This is the level of the signal fed into the servo motors by the microcontroller. This signal

must have a period of 20ms - 30ms; 20ms will be used for this project. The pulse width of the signal

determines the command position of the servo motor. The rough guideline is that 90° is generally

1.5ms, a duty cycle of 7.5%.

By setting up the different PWM outputs to give certain pulse widths through changes to the duty cycle,

it is possible to move the servo motor to the required position. See Figure 4-9 below, which shows some

positions the servo motor would obtain with the specified parameters. The first signal shown in the

diagram is the servo motor moving in a counter clock-wise direction (CCW), the pulse width here is

roughly between 1.7 – 2.0ms. The second signal shows the servo motor moving in a clock-wise direction

(CW); here a pulse width of 0.7 – 1.0ms is roughly needed for this movement. The final signal is of the

servo motor staying in the center position or returning to the center position, this has an approximate

pulse width of 1.5ms.


Figure 4-9 Servo motor control [75]

One must also take into account that these positions may not be accurate, as each

servo motor is different and tests have to be carried out as to where the home position is, especially as

regards the servo motors on the Robotic Arm. The home position on each motor will be chosen

differently resulting in the arm having the shoulder to elbow in an upright position and the rest of the

arm parallel to the table. Also note that the head of the servo motor can be taken off and the center

position changed; be it purposefully or accidentally. Different servo motors may also move in the

opposite direction to that shown in Figure 4-9 for the corresponding pulse width.

As can be seen in Table 4-1, the different PWM values are given for the different servo motors

on the Robotic Arm. The motors are split up into either microcontroller 1 or microcontroller 2. The 3

high-side PWM outputs are used to control the servo motors on the Robotic Arm. The CAN message ID

is different for each motor so as to distinguish which motor will move to the given position determined

by the pulse width. As mentioned above about the pulse width corresponding to a direction, by

increasing the pulse width the servo motor will move in a clockwise direction and in an anti-clockwise

direction if the pulse width is decreased (up, down, open and close are more precise directions for these

motors due to their position in the arm). The home position PWM values are also given in the table

below.

There will be an outline of the different CAN messages implemented by the microcontroller

code in Appendix D. Here it can be seen that the motors can be given a position to move to, move to

the max clockwise position or anti-clockwise position or to move to the home position. The control


application then has the option to use any of these commands. The microcontroller code can also

report the motor positions back to the control application and reply to a version request. All of this

functionality is outlined in Appendix D.

Table 4-1 Servo motor CAN commands

Motor         Micro controlling  Last CAN message  Increasing Duty    Decreasing Duty    Home PWM value (CAN     Home PWM
              the motor          ID byte           Cycle/PWM value    Cycle/PWM value    data, byte 0, byte 1)*  value*

Base          Micro 2            9                 Clockwise          Counter-clockwise  00 08                   0x400 (1024)

Shoulder      Micro 1            A                 Back               Forward            57 07                   0x3D7 (983)

Elbow         Micro 2            B                 Down               Up                 2E 07                   0x3AE (942)

Wrist         Micro 1            C                 Up                 Down               58 07                   0x3D8 (984)

Gripper       Micro 2            D                 Close              Open               00 08                   0x400 (1024)

Wrist Rotate  Micro 1            E                 Clockwise          Counter-clockwise  0E 07                   0x38E (910)

*Micro code uses 0x3C0 initially, then 0x400 (in response to home position commands), for the home position for all motors

4.9 Discussion

The hardware design and implementation involved a lot of research, testing and knowledge obtained

through evaluation of components. The restriction that all components had to be Analog Devices

parts led to evaluating different microcontrollers to find the best fit

for the project. This took some time but proved to be a valuable learning experience. Knowledge

about the workings of microcontrollers and their functionalities was gained. The compatibilities

between components were also valuable pieces of information.

In addition to knowledge of the microcontrollers' many functions and the set-up required to

have these functioning correctly, knowledge about the network was needed. This involved

understanding how the signals propagated over the CAN bus and how the frames had to be sent. An

understanding of the arbitration process and priority of the frames was also needed. The setting up of

the CAN controller helped in understanding how the different messages had to be sent in a certain


order over the SPI interface to get the correct response from the CAN controller. A good understanding

of the MCP2515 data sheet was imperative for this. Another essential part of the hardware design and

implementation was making sure that the same signal level was maintained on the bus (i.e. 3.3V). This

was crucial so as not to incur any unnecessary errors.

Finally knowledge about the operation of servo motors was needed so as to create the correct

pulse width required to operate these at their highest precision. The microcontroller had to be

implemented in such a way as to allow different messages to be sent to the PWM to control the servo

motors, while also sending a report of the motors’ position automatically back to the control program

after a command was sent across the network.

Overall valuable knowledge was obtained about all the hardware components needed in this

project. The knowledge gained here was useful in the implementation of the remainder of the project.

Having the correct information about the network and other components proved to be helpful when

programming the control application.


Chapter 5 Software Design

To begin work on the control application for the Robotic Arm first some use cases had to be outlined.

Use cases are behavioral diagrams of the system. The graphical user interface was the main part in the

software design. The screen had to be laid out in such a way as to implicitly tell the user how to operate

the controls. That is, the use of user-friendly controls and the ergonomics of the design had to be

examined. The different designs used are shown. A state machine was implemented to transition between

the different screens. The layout of the screens is discussed, mentioning the positions of the buttons on

the screens and the co-ordinates that they represent when the screen is touched. Finally there is a

sequence diagram to show the interactions of the application.

5.1 Use cases

There are three use cases for this application. When the application starts up it transitions

automatically into the main menu, here there is a choice to enter the control screen, routines screen or

information screen. This is shown by use case 1, Figure 5-1. Use case 2, shown in Figure 5-2, is when

the control screen is active. The user can choose to move a particular motor in this use case. Finally in

use case 3, for the routines screen, the user can choose to play one of two prerecorded routines or

move the Robotic Arm back to the home position, Figure 5-3.

Figure 5-1 Use Case 1: the main menu


Figure 5-2 Use Case 2: the control screen

Figure 5-3 Use case 3: the routines screen


5.2 Graphical User Interface

The first design chosen for the application can be seen in Figure 5-4. Here four of the five available

screens can be seen. First the welcome screen will be displayed when the application is run; then this

will transition automatically into the main menu screen. Here the user can choose between manually

controlling the robot and playing routines. There is also an information screen which displays some

details about the application. The screens for robot control or playing routines are also shown and in

both of these screens there is the option to transition back to the main menu.

Figure 5-4 Original application design

There is a choice of 6 motors that the user can operate, each having two directions. Once any of

these buttons are pressed the corresponding motor on the Robotic Arm will move in the direction

specified for a certain time. This can be seen in the control screen, where the option to play a routine or

move the arm back to the home position is available. The state machine implemented to transition

between the screens in this application is shown below in Figure 5-5.


Figure 5-5 The state machine

Once the application was tested it was decided that the buttons on the control screen were a bit

too close together. A second design was implemented for this screen to make it more user-friendly.

This screen proved to be more ergonomically designed and was easier to operate as a result. The images for

the new application can be seen in Figure 5-6. In this design the welcome screen has been changed to

suit the Final Year Project demonstrations. The main menu screen remains unchanged and the routines

screen now has an added routine that can be played. In the control screen the buttons have been

spread out over the whole screen; they are now laid out in rows of three, which is more user-friendly.

There is now less chance of pressing the wrong button.


Figure 5-6 Final application design

5.3 Setting up the screen layout

The touch screen on the BlackFin BF548 ezkit has a display of 320 X 260 pixels. The buttons can be

positioned with x values of no greater than 260 and y values of no greater than 320. The button

positions for each screen can be seen in Table 5-1. The corresponding coordinates of a button press

range are also shown. When an event occurs on the touch screen, the coordinates range from 0 – 4096

on both the x and y planes. The table below shows how these coordinates are matched against the

button positions. An application available called event_test allowed button positions to be tested

against the coordinates received from the screen touch event. The range shown in the table is

presented as being from a larger value to a smaller value because the screen coordinates are measured

from left to right like the pixel values. The screen coordinates start at (0, 0), i.e. (x, y), from the bottom

right-hand corner of the screen, whereas the pixel values start at (0, 0) from the top left-hand corner of

the screen. To keep things simple the table is written with values from left to right by convention.


Table 5-1 The button layout

Buttons                 X position  Y position  Width     Height    X coordinate  Y coordinate
                        (pixels)    (pixels)    (pixels)  (pixels)  range         range

Main Menu Screen

Robot Control 80 50 160 50 3000 - 1000 3100 - 2400

Routines 80 120 160 50 3000 - 1000 2100 - 1400

Information 80 190 160 50 3000 - 1000 1100 - 400

Routines Screen

Pick & place routine 30 70 100 50 3500 - 2300 2800 - 2200

Dance Routine 190 170 100 50 1600 - 400 1300 - 800

Home position 190 70 100 50 1600 - 400 2800 - 2200

Back to Main Menu 30 170 100 50 3500 - 2300 1300 - 800

Information Screen

Back to Main Menu 80 220 160 40 3000 - 1000 600 - 300

Robot Control Screen

Gripper open 10 60 80 30 3700 - 2800 2900 - 2600

Gripper closed 10 100 80 30 3700 - 2800 2300 - 2000

Elbow up 10 140 80 30 3700 - 2800 1800 - 1500

Elbow down 10 180 80 30 3700 - 2800 1300 - 1000

Main menu 10 220 80 30 3700 - 2800 800 - 500

Wrist left 120 60 80 30 2500 - 1500 2900 - 2600

Wrist right 120 100 80 30 2500 - 1500 2300 - 2000

Shoulder up 120 140 80 30 2500 - 1500 1800 - 1500

Shoulder down 120 180 80 30 2500 - 1500 1300 - 1000

Home position 120 220 80 30 2500 - 1500 800 - 500

Wrist up 230 60 80 30 1200 - 300 2900 - 2600

Wrist down 230 100 80 30 1200 - 300 2300 - 2000

Base left 230 140 80 30 1200 - 300 1800 - 1500

Base right 230 180 80 30 1200 - 300 1300 - 1000

Routines 230 220 80 30 1200 - 300 800 - 500


5.4 Sequence diagram

Sequence diagrams are useful to show the basic interactions of the system. Figure 5-7 shows the

sequence diagram for the touch screen application. This shows the basic communications between

modules of the application; only one CAN message is shown to be sent from the control screen and

routines screen in this diagram for demonstration purposes.

Figure 5-7 Touch screen application sequence diagram


There can be many CAN messages sent from either of these screens. The user can touch the

screen more than once and move the different motors in the control screen. In the sequence

diagram this is just noted as a general CAN message for display purposes. Similarly, in the routines screen

there can be many CAN messages sent. In the routines screen, once a screen_touched event has been

registered, a sequence of CAN messages is sent to the Robotic Arm as it is a pre-recorded routine.

From the sequence diagram it can be seen that the handling of a touch event depends on which state the system is in. If the system is in the main state and the user touches the screen to call the control state, the main screen registers the touch event and calls the control screen. The control screen then takes over and displays its screen to the user; the system is now in the control state. The routines and info screens behave equivalently.

5. 5 Discussion

UML[65] is a useful tool for describing the system to the user; it gives the user a visual model of the system.

Use cases are behavioral diagrams and are a great way of outlining the functionality needed by the

project. They are a simple way of letting the user know what actions he/she can perform with this

application. Similarly state diagrams are useful to explain to the user the different states and transitions

available in the application. Sequence diagrams are another effective way of conveying how the

application operates. They are a practical way of outlining the interactions to the user with the

graphical notation techniques used by UML.

Knowing the dimensions of the screen in pixels, and the x and y coordinates used in the Linux frame buffer utility, was important to make sure that the pressing of a button lined up with the actual button displayed on the screen. These values had to be precise: a mismatch between them would cause either an incorrect CAN message to be sent, i.e. moving the wrong motor, or a transition to the wrong screen, leaving the user confused and with a lack of faith in the application.

The need for a user friendly application is an important one when developing software. Some key measures of a user friendly application are efficiency, learnability, errors made, memorability, and satisfaction[76]. Is the user able to operate the controls with ease? There is no point in the programmer being the only person able to operate and understand the application. Any person should be able to sit down in front of the board and operate the Robotic Arm controls. They should find the application self-explanatory and not need a manual. The


user should know, from using the application, that they can operate each motor individually in either direction or play a pre-recorded routine. The labels of the buttons therefore have to be accurate and concise, letting the user know that if they press this button "x" will happen.[76]

Is the design of the buttons and screens easily understandable? The layout of the screen can't have the buttons too close together, as this would create ambiguity over which button has been pressed. The application has to be user friendly and the instructions to the user can't be vague or confusing. If the user enters the wrong screen there should be an easy way for them to move back to the previous menu.[76]

Overall the system needs to be user friendly and attractive. The user must want to use this application, and not feel stressed after using it. The user should be able to remember the steps he/she used previously when working with the application and not have to figure out how to use it each time it is powered on. Thus the application should be effective and satisfy the user.[76]



Chapter 6 Software Implementation

There are two different parts to this showcase demonstration: the BlackFin BF548 ezkit, which runs the µClinux operating system, and the Robotic Arm control board, which is programmed in the Windows environment with the IAR embedded workbench. A virtual machine running Linux had to be set up and the BlackFin toolchain downloaded. This chapter goes through the software used in implementing the Robotic Arm control board as well as the software used in implementing the BlackFin BF548 ezkit.

6. 1 Setting up the Robotic Arm control program and environment

The Robotic Arm control program used Windows and the C programming language. The IAR embedded

workbench[77] was used as an IDE to compile and load the software needed for the microcontroller.

The workbench also allowed debugging via the JTAG[78] interface connected to the microcontroller.

In the IAR embedded workbench the part being used could be specified to optimize the program: under the options Analog Devices could be selected, followed by a choice of microcontrollers; this had to be set before code could be downloaded to the board. The workbench proved useful when debugging, as the code could be stepped through while it ran on the microcontroller. For instance, an LED could be seen to flash as the corresponding line of code was stepped over.

6. 2 Setting up the BlackFin

Before any implementation work could get underway for the BlackFin (Figure 6-1), a lot of background work had to be done. Analog Devices have a website with documentation on all of the open source software for the BlackFin devices[79], and a huge amount of knowledge had to be gained before any programming could begin.

Firstly the environment had to be set up correctly. This was done by first installing a virtual machine; VMware[80] was used in this project. Once this was installed, Ubuntu, a Linux distribution, was downloaded. Ubuntu is a Debian-based system, so it was possible to use the "sudo apt-get install" command to install missing packages, for example the gcc compiler. A terminal program[81] also had to be set up on the Linux system so that the new kernel image and the application developed could be downloaded onto the board. The Windows environment could also be used for this; Hyper


Terminal proved useful for downloading the kernel image onto the board and for transferring the application across.

The terminal programs used in Ubuntu were "PuTTY SSH Client" and "CuteCom". CuteCom was used to download the application to the board and the PuTTY SSH Client was used to run applications from the kernel of the board. Once Linux was running on the PC, the BlackFin toolchain could be downloaded and installed, and U-Boot could be set up and downloaded to the board. The next step was then to build a kernel for the BlackFin BF548 ezkit. An outline for setting up the BlackFin BF537 with Ubuntu was available on the internet, which was also a useful guide.[82]

Figure 6-1 BlackFin BF548 ezkit

6. 2. 1 Building the kernel

An image of the kernel[51] for the device could be downloaded from the related BlackFin website[83]. This image was used to build the kernel required for the project. The functionality needed for this board could be chosen in the kernel and application settings; functionality such as the touch screen and the CAN transceivers had to be included in the kernel settings. Each time a setting was changed the kernel had to be rebuilt; each build took over an hour, and on occasion there would be missing libraries or incorrect paths which had to be corrected or included to allow the build to continue. Once the build was complete the image could be loaded onto the board, to either flash or RAM. This image could then be booted and the operating system would then run on


the board. Once all the correct libraries and functionalities were included the board was fully

operational and example applications could be built into the kernel and run when the kernel booted.

Knowledge of how these different applications worked helped when it came to creating the application

for the control of the Robotic Arm. Now that the kernel worked and all the knowledge was there, work

could get underway in creating the application.

6. 2. 2 Creating an application

The C programming language was used to create the application for this project; a simple "Hello World" program was the starting point[84]. There were a few different options for creating the application to control the Robotic Arm over the CAN bus. Existing APIs such as Microwindows and DirectFB provided graphics functionality; however, these would have required more functionality to be added to the kernel. Alternatively, an application could be created simply using functions defined in the Linux frame buffer utilities C file, where many different colours could be defined and displayed on the screen. Section 5.2 shows examples of how the application looked at its different stages.

The functions in this file were used to create a black screen and to add buttons, which were drawn as blue rectangles; text could then be inserted into the centre of these rectangles. Similarly, writing could be displayed on the screen giving the name of the application. Table 5-1 shows the positioning of the buttons on the screen; for writing inserted on the screen, x and y values could likewise be specified to position it. The code for the application can be found in Appendix C.

The SocketCAN[85] utility was used to communicate over CAN. The BF548 has an integrated CAN controller and the BlackFin ezkit has two CAN transceivers built onto the board. This utility is used to send and receive CAN messages across the network. Functions were available to create a socket and then bind one of the CAN transceivers, can0 or can1, to it; throughout this project can0 is used. There were also functions which allowed either a standard or an extended CAN frame to be sent over the network. Once these functions were implemented the CAN bus was operational for the project.

Eclipse[86] could have been used as a development environment to build the application; instead it was built from the terminal. This was done by entering the folder containing the source code and typing "make" to build the application. A makefile[87] also had to be created to build the


application, otherwise the "make" command would not work. The makefile lists the C and header files to be included, gives instructions for building the corresponding object files, and finally names the executable file which becomes the running application on the board. The corresponding makefile for this project can be found in Appendix C with the application source code.

6. 2. 3 The state machine

Originally flags were used to set which screen would be operational, but this meant that four separate flags had to be set and cleared each time a screen was called, giving up to sixteen possible states. To make the code more expandable a state machine was implemented in which the four different screens were defined and a single variable was used to switch between them. This makes the code easily adaptable if an extra screen needs to be added. Sample code can be seen below in Figure 6-2; an example of the implemented state machine can be seen previously in Figure 5-5.

// define the 4 available screens
#define MAIN 0
#define CONTROL 1
#define ROUTINES 2
#define INFO 3

// declare and initialise screen_value
int screen_value = -1;

// set the screen_value to main
screen_value = MAIN;

Figure 6-2 State machine sample code

6. 2. 4 Threads

The kernel often needs to establish processes that sleep or wait for events to occur. A lightweight process, or thread[88], can be used for this task. A single thread was used in this application. The application was set up to poll on the touch screen event. When the touch screen registered an event, the coordinates would be recorded and compared against those of the buttons. The application would then either transition to the next screen or send a CAN message.

The possibility of creating multiple threads was also available. Multithreading would allow the application to dedicate one thread to the touch screen events; certain events would then call another thread to deal with the CAN messages, or another thread to change between screen views of


the application. It was not necessary for the functionality of this application to be multithreaded; everything could be performed using a single thread. Polling on more than one device could also have been set up, but this too was unnecessary for what was required in the demonstration.

6. 2. 5 Utilities

There were options to use existing utilities in this project. SocketCAN, a necessary utility, was used[89]. Functions defined in its lib.c file were used to bind to the socket and send the CAN frame. Instead of re-writing the necessary code, the lib.h and lib.c files were added to the project and the lib.o file created in the build was compiled into the application.

The Linux frame buffer utility was used to create the buttons and text and display them in different colours and positions on the screen. A makefile was used to outline which files would be included in the build and how; this makefile can be found in Appendix C with the application source code.

6. 3 Discussion

A huge amount of knowledge had to be gained in the BlackFin domain before any implementation work

could get underway. Correct operating environments had to be set up and the correct tool chain had to

be installed. Once the environment was set up correctly then knowledge about setting up the kernel

had to be gained and the correct interfaces and functionality had to be set up in the kernel before the

board would operate under the conditions needed for the project. Finally, after a lot of background reading and study of the different applications already available, implementation of the actual control application for the Robotic Arm could get underway.

When designing the control application knowledge about the different utilities available was

necessary to know the capabilities of the application. Realistic goals had to be set when implementing

this application. By understanding the utilities available in µClinux it was possible to ascertain how long

it would take to implement the necessary functions using this utility. Due to time constraints the more graphics-oriented utilities were not implemented, as it was more important to have the basic functionality operational. The graphics could be added at a later stage once all the required functionality was up and running.


Knowledge about the network was also very important and the knowledge gained from the

microcontroller board was valuable when working on the BlackFin. Understanding the SocketCAN utility came much more easily with the background knowledge about CAN gained before implementation. Once it was understood how SocketCAN sends CAN frames over the network, it was possible to modify the messages to suit the Robotic Arm's required movements.

Having the ability to create a state machine for the transitions between control screens in the

application was important and the knowledge gained about how to improve the design was valuable.

This helped the application become more expandable and maintainable and gave the system less chance

of failing by entering an unknown state.


Chapter 7 Testing

The different tests and validation carried out when implementing the Robotic Arm control board are outlined. The JTAG interface is used for debug purposes and proved to be very useful. The touch screen interface also needed a lot of testing and implementation; the steps involved in testing the application are given. The message IDs of the CAN messages for each motor are given along with the message ID for the version request. Finally, some examples of the test cases used throughout the implementation of the project are given.

7. 1 Testing the Robotic Arm Control Board

To test the Robotic Arm control board, which is the microcontroller board, values could be hardcoded into the registers. Then, using the IAR embedded workbench to download and debug the code, it could be stepped through to see if each function was operational. By giving the Robotic Arm different positions in the code, these functions could be stepped through and the arm would move correspondingly. Different functions could also be checked by causing an LED to flash when they were entered. The workbench also allowed variables and registers to be watched; the values moving through them could be seen in a window on the screen, which helped with debugging.

Originally some problems arose with the SPIRX register, the register in the SPI peripheral used to receive the CAN commands from the CAN controller. It turned out that the microcontroller had to do a certain number of reads on this register before the correct value would be visible; otherwise the stale value in the buffer would be used in the code. A special loop was set up in the code in SPI.c (Appendix A) to allow for this incorrect buffer value in all the functions that read SPIRX.

A PC application was available in Analog Devices for the control of a previous Robotic Arm demonstration. By using and updating this PC application it was possible to test both the functionality of the microcontroller and that the CAN network was operational. The CAN transceiver on the PC

application board could be connected to the CAN transceiver on the microcontroller board and CAN

messages could be sent over the CAN bus. A version request sent from the PC application would trigger

the microcontroller to send back the version of the code that it was running. This message would then

be received back on the PC application side and displayed on the screen. Once a version request

message was successfully sent and received by both CAN nodes it would be possible to send CAN

messages to move the motors on the Robotic Arm.


Figure 7-1 shows an example of the equipment used throughout implementation of this project.

The oscilloscope is connected to the SPI peripherals and on the screen the MOSI, MISO, SPICLK and CS

signals can be seen on the four channels. Three motors are operational in the Robotic Arm as only one

microcontroller board is connected. This is why the current is just under 600mA, shown on the power

supply screen. The JTAG interface can also be seen in this image, as the code is currently being debugged and stepped through. The other boards visible at the bottom left of the image are the PC application control boards, which proved useful for testing the network. An SDP board with a

BF527 chip is used. The original board which was bought with the Robotic Arm is not used in this

project. It was used for testing purposes before implementation of the new microcontroller board got

underway. The functionality of the arm was tested with this board and the way signals were sent was

viewed on the oscilloscope.

Figure 7-1 Robotic Arm set-up


7. 2 Testing the Touch Screen Application

There were many small steps performed before a full scale application was built and tested. The first

step was to test the network. SocketCAN was the utility which could be set up in the BlackFin to send

and receive CAN messages. This utility allowed the robot to be connected to the board and through the

terminal existing functions in SocketCAN allowed the robot to be moved manually and version requests

to be sent. SocketCAN had a “cansend” command which allowed messages to be sent across the

network and a “candump” command which could be run in the background and any CAN messages

received would be automatically displayed on the terminal program screen. Once it was possible to

communicate with the robot manually over the terminal the CAN functions could be implemented inside

an application. To do this a socket had to be first created in the application. Then the program had to

bind the CAN transceiver to the socket. There were two transceivers on the board; can0 is used

throughout the project. The CAN frame that had to be sent was then defined and using functions from

SocketCAN the frame could be sent over the network through the can0 transceiver to the

microcontroller board.

The next step was to use the touch screen on the BlackFin BF548 ezkit and to see that the touch event was working. To do this the event_test application was studied and the values at the different positions on the screen were noted, as this application displays the coordinates of where the screen was touched. Once this application worked it was known that the touch screen worked, and an application

could be written to touch the screen and receive a message. Firstly without displaying anything on the

screen an application was written so that if one half of the screen was touched it would display one

message and the other half would display something different. This was done using the printf function

so the message would be displayed on the terminal. Then the functionality to send a version request

message to the Robotic Arm was added, and then to send CAN messages to the motors.

Displaying something on the screen was the next step. An application was written to create a black screen with blue outlined buttons and white writing inside the buttons. When the coordinates touched matched those specified by the application to denote a button, a message was displayed and then a CAN message sent. Finally the application began to take shape, and the multiple states needed to display different screens with different functionalities were possible. When all this was operational the Robotic Arm was connected up again and it was confirmed that each CAN message moved the corresponding motor. Finally, using the PC application, a routine was recorded to perform both a pick and place routine and a dance routine. The CAN messages were recorded on the PC


application and hardcoded into the touch screen application. These routines could then be performed over CAN using the touch screen application.

A Robotic Arm with DC motors was also available from Analog Devices for testing purposes. This was used to test simple CAN messages to and from the robot. Because it used DC motors, its movements were easier to command in the application for initial testing: a "1" in the MSB of the CAN data meant move in one direction and a "0" in the MSB meant move in the other direction, while the rest of the bits in the data told the motor how long to switch on for. While the application was being developed, this proved an efficient method of testing its functionality without complicating the code with the PWM pulse width being sent in the CAN data. Figure 7-2 shows the set-up with the other Robotic Arm.

Figure 7-2 Robotic Arm using DC motors set-up with BlackFin device for testing purposes


The CAN messages that needed to be sent to this Robotic Arm can be seen in Table 7-1. As before, each motor has a specific ID; the data is where the difference occurs. The layout of the CAN frame is shown in the table. The first column gives the command description; the second is the CAN message ID, which is unique to each message; and the EXIDE (extended identifier) flag is 0, as this bit indicates a standard CAN message. The next two sections are the PC data and the robot control data. The PC data is what the PC, in this case the BlackFin BF548 ezkit, sends to the Robotic Arm to make a motor move in a certain direction for a certain length of time. The robot control section is the data that the Robotic Arm sends back in a CAN message; if, for example, a version request is sent from the BlackFin, the Robotic Arm replies with its code version in the data field of the CAN frame. A glossary of terms is given below the table.

Table 7-1 CAN messages

                                            PC                  Robot Control
Description      Std. ID (High Low)  EXIDE  RTR  DLC  Data      RTR  DLC  Data
Version Request  00010000 000        0      1    0    -         0    1    XXXX YYYY
Operate Gripper  00000001 101        0      0    1    ABBB BBBB
Operate Wrist    00000001 100        0      0    1    ABBB BBBB
Operate Elbow    00000001 011        0      0    1    ABBB BBBB
Operate Base     00000001 010        0      0    1    ABBB BBBB
Operate Rotate   00000001 001        0      0    1    ABBB BBBB

Operations:

A: Motor direction: Forward - 0, Reverse - 1

B0-B3: Time quanta to operate for (0-15 quanta)

B4-B7: 000 (unused bits for more time quanta).


Version X.Y

X: Version number (0-15)

Y: Version increment (0-9)

7. 3 Test Cases

This section outlines some test cases which were used over the course of this project; the SPI, PWM, CAN and touch screen tests were among the most important that had to be performed.

Test Case 1: The PWM signals

The purpose of this test is to validate that the PWM outputs have a period of 20ms (50 Hz) and a pulse

width corresponding to what is defined in this test case. This test case requires only the ADuC7128 microcontroller evaluation board; there is no need to connect the Robotic Arm or the BlackFin device. An oscilloscope is needed for this test case.

Inputs:

PWM1COM1 = PWM2COM1 = PWM3COM1 = 0x00

PWM1COM2 = PWM1COM3 = PWM2COM2 = PWM2COM3 = PWM3COM3 = 0x3C0

Expected Outputs:

Pulse width = 1.5ms

Period = 20ms (freq = 50 Hz)

Test set-up:

1. Connect the oscilloscope channels up to the PWM output pins.

2. Power up the 2 ADuC7128 evaluation boards

3. In the robotHome function defined in the motors.c file, each of the outputs is set up to have a

duty cycle of 1.5ms. The hexadecimal value is worked out using Equation 4-2 from section 4.

(i.e. 640000 ÷ (1/1.5ms)). The robotHome function is called automatically on power up of this

device

4. Using the oscilloscope measure the pulse width to confirm 1.5ms and the period to confirm

20ms.

Additional testing:

The source code can be changed by adding different test cases and giving the PWM outputs different

values.


Test Case 2: The SPI signals

The purpose of this test is to validate that the CAN controller is in the correct mode on power up and that it is in normal mode for standard operation. This test case requires only the ADuC7128 microcontroller evaluation board; there is no need to connect the Robotic Arm or the BlackFin device. An oscilloscope is needed for this test case.

Inputs:

N/A

Expected Outputs:

MISO = CANStatus = 0x80 for configuration mode – just after CANReset

MISO = CANStatus = 0x00 for normal mode – once CANNormalMode is called

Test set-up:

1. Connect the oscilloscope channels up to the SPI output pins (MOSI, MISO, SPICLK, /CS).

2. Power up the 2 ADuC7128 evaluation boards

3. Using the debug function in the IAR embedded workbench step through the code, making sure

to call the CANStatus function after CANReset and again after CANNormalMode.

4. Using the oscilloscope read off the MISO values to confirm 0x80 for configuration mode and 0x00 for normal mode. In the IAR embedded workbench the SPIRX register can be read and the

CANStatus variable can also be read. Watch these values change as the mode changes.

Additional testing:

The MOSI line can also be watched on the oscilloscope to validate that the values in SPITX correspond to what is being sent on the pins. The CANReset function will send 0xC0 to reset the CAN

controller. Trigger on this channel to see this value. Similarly the MOSI line can be validated against the

CANNormalMode function and the CANStatus function. The comments beside the code give the value

that is being sent across the SPI, they can also be found in the header file for the MCP2515 (REGS2515.h

given in Appendix B).


Test Case 3: The CAN bus

The purpose of this test is to validate that the network is fully operational: CAN messages should be sent across the bus. This test case requires the ADuC7128 microcontroller evaluation board and the BlackFin device; there is no need to connect the Robotic Arm. The BlackFin device will need to be connected to the PC for this test case.

Inputs:

N/A

Expected Outputs:

Version request acknowledged

“Robotic Arm Connected” displayed on LCD

Test set-up:

1. Power up and connect both boards

2. Open the hyper terminal program and connect to the BlackFin device

3. On power up the BlackFin device will look for a version request to check that the Robotic Arm

control board is connected.

4. The Robotic Arm control board will receive the version request and reply with the version of

code it is running.

5. The terminal will show that the BlackFin sends a version request and it will also show the

received version.

6. The BlackFin LCD display will show the words “Robotic Arm Connected” on reception of a CAN

message to validate that the network is operational also.


Test Case 4: Gripper motor open

The purpose of this test is to validate that both the network is operational and the PWM signals are being sent to the servo motors. The full set-up is required here. The BlackFin device will need to be

connected to the PC for this test case.

Inputs:

N/A

Expected Outputs:

The gripper motor will open

“Robotic Arm Connected” displayed on LCD

Test set-up:

1. Power up the devices

2. Open the hyper terminal program and connect to the BlackFin device

3. The terminal will show that the BlackFin sends a version request and it will also show the

received version from the Robotic Arm control board

4. The BlackFin LCD display will show the words “Robotic Arm Connected” on reception of a CAN

message to validate that the network is operational

5. The application will start. Touch the control button on the main menu.

6. On the terminal it will display “entering control screen”

7. Press the gripper open button

8. The terminal will display “gripper opening”

9. The gripper on the Robotic Arm will open

Additional testing:

This test can be repeated for each of the motors.


Test Case 5: Pick & Place routine

The purpose of this test is to validate that the network is operational, that the PWM signals are being sent to the servo motors, and that the routine plays without fault (i.e. no loss of CAN messages). The full set-up is required here, and the BlackFin device will need to be connected to the PC for this test case.

Inputs:

N/A

Expected Outputs:

The Pick & Place routine will play on a continuous loop

“Robotic Arm Connected” displayed on LCD

Test set-up:

1. Power up the devices

2. Open the hyper terminal program and connect to the BlackFin device

3. The terminal will show that the BlackFin sends a version request and it will also show the

received version from the Robotic Arm control board

4. The BlackFin LCD display will show the words “Robotic Arm Connected” on reception of a CAN

message to validate that the network is operational

5. The application will start. Touch the routines button on the main menu.

6. On the terminal it will display “entering routines screen”

7. Press the Pick & Place button

8. The terminal will display “pick and place activated”

9. The Robotic Arm will proceed to pick up the box, move it to another location, place it there, pick it back up and move it back to the original starting position. This will repeat until the button is pressed again to stop the routine.

10. The LCD now displays “stop routine”. Touch this button. The arm will stop moving.

Additional Testing:

This test can be repeated for the Dance routine also. If the arm is stopped in a position other than the

home position, this test can be done for the home position button.


7. 4 Discussion

The need for unit and component testing is very important. Testing is destructive in nature: the aim is to try to break the product, so that once the tests are complete and the results validated, there is confidence that the product will not fail under the tested conditions. The goal of unit testing is to isolate parts of the project to show that each individual part functions correctly. It is important to test every part of the product to make sure it is not likely to stop working while a user is operating it; such a failure would leave the user with a lack of faith in the product and unlikely to trust the manufacturer again.

In this project it was important to test that all the features of the microcontroller were

functioning correctly, which involved a lot of unit testing. Once each function was operating as required

it was important to test the project as a whole. By sending CAN messages and checking that they were

received and that the Robotic Arm moved it validated the earlier tests as well as validating that the

whole project was functioning correctly. The PWM was working because the Robotic Arm moved, the

SPI was working because the CAN controller was in the correct mode and dealt with the messages it

received correctly, thus telling the PWM to send a signal to the servo motors.

The BlackFin device required the same discipline for testing. The application was built in stages

and each stage was tested before moving on and adding more functionality. It was important that the

network was operational, that the touch screen was registering the event, and that the coordinates

received from this event matched up to the correct button position. It was also important that the graphics were user friendly and did not confuse the user. The directions needed to be self-explanatory, so that any user playing with the device could figure out the controls. The controls needed to make sense: if the button for moving a motor left was pressed, the corresponding robot motor should move left.



Chapter 8 Discussion of Results

This project proved to have a huge learning curve associated with it. A lot of testing was done to

validate the functionality of both the hardware and software throughout. This was due both to the component constraints of the project and to the fact that the background knowledge was being covered as the project progressed.

On the Robotic Arm node, each piece of functionality had to be working correctly before progress was made to the next step. The first thing was to get each functional block operating as desired, i.e. the SPI peripherals and the PWM outputs. One area where implementation work involved evaluating a few different components was the microcontroller, which was an integral part of the project as it ran the Robotic Arm control program.

Because this project was a showcase demonstration for Analog Devices, there was a restriction on the components that could be used: as the microcontroller was an integral part, it had to be an Analog Devices part. Although there is a huge range of devices available from Analog Devices, the parts were not specified with servo-driven Robotic Arms in mind. The Robotic Arm used in this project needed functionality to control six servo motors simultaneously, while most of the microcontrollers studied only had functionality for three.

The CAN transceiver which is the basis for this showcase demonstration is signal and power

isolated. On one side it operated at 5 V and the other side could be either 3.3 V or 5 V. The CAN

controller being used was operational between 3 V and 6 V. Most of the microcontrollers evaluated operated at 3.3 V, which meant all three devices would have to operate at 3.3 V so that the CAN signals would be consistent. Table 8-1 shows a comparison of the microcontrollers

evaluated in this project. As can be seen from the table the microcontroller most suited for this project

was the ADuC7128.


Table 8-1 Microcontroller comparison

Microcontroller / Functionality   ADuC7026        ADuC7060        ADuC7128
Operating voltage                 3.3 V           2.5 V           3.3 V
PWM functionality                 3-channel PWM   6-channel PWM   6-channel PWM
SPI functionality                 Yes             Yes             Yes
UART functionality                Yes             Yes             Yes
JTAG support                      Yes             Yes             Yes

Once the microcontroller was finally decided upon, implementation work could be finished on this CAN node. The signals to the servo motors could be compared with the pulse widths being sent, using an oscilloscope, to validate that they were functioning as desired. Similarly, using an oscilloscope on the SPI peripherals validated that the CAN controller was in the correct state, by sending CAN status commands and comparing the values on the scope with those expected from the MCP2515 data sheet[10]. More information on this can be seen in Chapter 7, Testing.

A thorough understanding of the background material was necessary for the BlackFin node, and all the required environment settings had to be in place before any implementation got underway. Once most of the information was covered it was possible to build the required kernel and create the

application. The application proved to be a success. The control of the Robotic Arm was possible over

CAN using a touch screen interface.

It was possible to move between different states in the application, one state to display the

main menu, another for robot control, another for playing pre-recorded routines and finally one to

display information about the application itself. This gave the user the opportunity to control each

motor separately moving them to their maximum or minimum positions, while also allowing the user to

play routines which were pre-recorded using the PC application.

In these routines it was possible for the user to play a pick and place routine in which the

Robotic Arm picked up a small item, moved it to another position, picked it back up and moved it back

to where it started. This routine is set up on a continuous loop until the user tells it to stop allowing the

pick and place routine to play continuously in the background for a demonstration. The option to play a

dance routine is also available for demonstration purposes in the same manner. Overall this proved to be a successful application which provided control of a Robotic Arm over CAN.


8. 1 Problems Encountered

Throughout the project there were some tricky areas to overcome. One of the first areas where a

problem occurred was with the SPI peripherals on the microcontroller. Initially in the ADuC7026 the SPI

data being received appeared to be incorrect. While studying the SPI lines on the oscilloscope it was noted that the MISO line always showed the correct data byte, which meant the correct data was arriving from the CAN controller but was not being read correctly in the software. This ruled out a hardware problem and meant that an evaluation of the software would be necessary.

Through analysis and debugging it was noted that the SPIRX register was returning a previous value, which always turned out to be 0xFF, as the line was held high while inactive. When the program was being debugged the code could be stepped through. By watching

the SPIRX register it could be seen that the value was 0xFF when the whole function was stepped over.

However when the function was stepped into and each line was executed the correct value would

appear in the register. It was noted that by slowing down the reading of the register the correct data

would be read in from SPIRX.

Consultation with the Application Engineers for the microcontroller in Analog provided no real

solution to the problem or a reason for it. As a result a workaround had to be developed whereby the register would be read until it did not return the value 0xFF. A count limit was set on this loop, and through analysis it was noted that after roughly 7 cycles of reading the register the value would be correctly read

back. This loop had to be inserted into each function that required a read from the SPI peripherals, i.e.

reading the CAN status.

Many small problems occurred when building the kernel such as missing packages or libraries

which were easily resolved. After a laptop failure the virtual machine had to be recovered from the hard drive and set up on another laptop. Unfortunately this did not work

as well as planned. The new builds appeared to be working but when loaded to the BlackFin the images

would not load. A new image of the kernel was downloaded and the building of the kernel started from

scratch. This proved a better solution and the images were successfully loaded to the board.

However, once work got underway to test the functionality it was found that the touch screen

event wasn’t working correctly for one of the applications. This meant that the source files of the kernel

had to be examined down as far as driver level in some cases. After a lot of evaluation and research of

the BlackFin documentation and forums it became apparent that when the new image of the kernel was


downloaded, the corresponding new toolchain had to be installed with it. The toolchain previously in use was not compatible with the new kernel image: with every release of a kernel image a new release of the toolchain occurs as well.


Chapter 9 Conclusion

In conclusion, this project was a success. The project specifications given were met, and engineering design and evaluation were carried out throughout to achieve the desired results. The project proved to be a mobile way of demonstrating the use of ADI’s CAN transceiver, the ADM3053, for industrial and automotive

use. Analog Devices used this working demonstration at the Embedded World Exhibition & Conference in February, which proved to be a success. The demonstration has also been used in a training day at Analog, and it will be used again in the United States in April.

This project established a huge knowledge base for the author. The mixture of hardware and

software design and implementation kept things interesting and mentally stimulating. The ground work

is now in place for this demonstration and Analog Devices can adapt the project as needed. Future Co-

operative Education students can work with this finished product and learn how the product works and

what each of the components is doing. The top level of this project can be easily adapted and expanded

without having to delve into too much detail about the inner workings of the project. This will be a

great learning experience for a Co-operative Education student and provide them with knowledge about

CAN networks, microcontrollers, embedded programming and much more.

This application is also a fun and exciting way to introduce primary and post-primary students to robotics, electronics and software. This is an ever-expanding industry, yet there is a shortage of Engineers graduating in Ireland. Students may be encouraged to pursue a career in Engineering once they see what can be achieved with software and hardware, even in a restricted time frame.

The world of open source software taught valuable lessons about not taking anything for granted. The program will only do what it is told to do by the programmer; nothing happens automatically. For example, Eclipse may generate a makefile for the project automatically, but it is

better to create one separately which fits the purpose intended. The documentation for open source

software isn’t always reliable or available in some situations. A vast amount of background knowledge

of the system and its capabilities is necessary as a result. It is important to make sure that, in the case of

the BlackFin kernel, the toolchain downloaded corresponds to the kernel image being built. With open

source software different individuals can check in and out different versions of the software. No two

versions will be the same. The kernels and toolchains aren’t backward compatible. For the big version


releases new toolchains are released to match. This is an important lesson learned as some time was

spent building a kernel with an incompatible toolchain.

Overall this project was a very valuable learning experience and a success. It proved both

challenging and mentally stimulating. With its mixture of hardware and software it provided valuable experience of both areas, tying in everything studied in college modules. Even though

there was a huge amount of knowledge associated with the project undertaken, there is still so much

more knowledge to be obtained in this area.

9. 1 Recommendations

There is the possibility of creating an application for the BlackFin device using an existing API. Microwindows is a library which can be built into the kernel. This system allows you to bring the

graphical features from Windows to smaller devices and platforms.[90] Different display, mouse, touchscreen and keyboard devices can be added with ease using this architecture. Graphics

applications can be written without having to have knowledge of the underlying graphics hardware, or

use the X Window System. This is the way that Microwindows typically runs on embedded systems.[91]

DirectFB[92] is a similar library which adds graphical power to embedded systems. The Linux

kernel being built for the BlackFin device contains code for both APIs to allow user applications to

access the Linux frame buffer. DirectFB is a small, powerful, flexible and easy to use technology for

accelerated and enhanced graphics support; it was developed for the special requirements of embedded

devices.[92] By using either of these APIs it is possible to create an application with better graphics and

better support for the touchscreen.

The control application has the possibility for increased features in dealing with the Robotic Arm

control. In the microcontroller code there are additional definitions for CAN messages to control the servo motors’ movements. Implementing these in the application would give the user the opportunity to move a motor to a given position. To implement this, the drivers for the keypad on the BlackFin

device will have to be built into the kernel and this event will have to be polled similarly to the touch

screen event.

Another option for increased functionality would be allowing the application to store the CAN data giving the position of each motor. Each time the application sends a CAN message to move a motor, the microcontroller code replies with the position of the motor.


By storing the position of each motor in an array structure, the positions can be used in the control application. With these positions it would be possible to move each motor in increments: the application would know what position the motor was in and could therefore correctly move it by an increment of 0.2 ms.

The BlackFin BF548 ezkit has the option to add functionality such as playing a music file or a video file. A VLC media player can be added; however, this requires a lot of additional libraries. With more time it would have been possible to play a tune when the dance routine was called on the control application. An additional screen could also have been added to display a video of the Robotic Arm performing the pick and place routine. Implementing these features would require a lot of code in the application, although adding the extra screen itself would be trivial, as the state variable provides an easy way to expand the application.

On the microcontroller side, there is the possibility of adding an accelerometer to the hardware. The microcontroller has enough unused port pins to allow this functionality. The accelerometer could be placed on the Robotic Arm itself and its feedback could give a 3D position of the arm, providing more accurate feedback of the robot’s position.

While originally researching Robotic Arms to use in the project the different capabilities of this

Robotic Arm were noted. Previous projects using this Robotic Arm proved interesting and impressive.

One example is where the Robotic Arm is programmed to play chess against a human[93]. This involved

artificial intelligence algorithms and a camera to view the moves made on the board. Another example

is of this Robotic Arm programmed to play tic-tac-toe[94]. Similarly, an artificial intelligence algorithm and a webcam are used. A Masters student likewise programmed the Robotic Arm to play a game of

checkers against a human[19]. A team of Electronic and Computer Engineers have also provided the

control of this Robotic Arm using wireless and wired control[95]. A webcam is used here and the Arm

will pick up a pen and move it.

Although the Robotic Arm in this project does not perform quite such extensive operations, it fits the requirements set out. It performs a pick and place operation and a dance routine to showcase the CAN transceiver’s functionality in the network. By adding the ability to take in readings from a webcam it would be possible to create a more advanced demonstration where the Robotic Arm would play checkers or tic-tac-toe.


Due to the fact that this project is technology based and this is an ever expanding industry the

possibility for expansion and increased functionality is endless. According to Moore’s law transistor

density doubles roughly every 18 months. This means that every 15 years, densities increase by three

orders of magnitude. Technology is ever changing and always growing and becoming more advanced.

This project has a baseline in place and all the ground work is done, so that when technology changes it can be easily adapted and expanded to suit the situation. Extra components and functionality can be added to make the demonstration more advanced and move with the times.


References

[1] ADI. (1995-2012, 20th March 2012). Analog Devices. Available: www.analog.com

[2] CiA. (2001-2011, 5th October 2011). Controller Area Network. Available: http://www.can-

cia.org/index.php?id=can

[3] ADI. (2010, 12th March 2012). ADM3052 / ADM3053: Fully Isolated CAN Transceivers. Available:

http://videos.analog.com/video/products/interface/756419088001/ADM30523-Fully-Isolated-

CAN-Transceivers/

[4] ADI. (2011-2012, 20th March 2012). ADM3053 Data Sheet. Available:

http://www.analog.com/static/imported-files/data_sheets/ADM3053.pdf

[5] ADI. (2010, 10th March 2012). Industry’s First Signal and Power Isolated CAN Transceivers.

Available: http://www.analog.com/static/imported-

files/product_highlights/ADM3052_3_ProductHighlight.pdf

[6] RobotShop. (2003-2012, 5th October 2011). Lynxmotion AL5A 4 Degrees of Freedom Robotic

Arm Combo Kit (with electronics- EU Version). Available:

http://www.robotshop.com/eu/lynxmotion-al5a-4-dof-robotic-arm-combo-kit-electronics-

eu.html

[7] ADI. (2012, 20th March 2012). BF548 Data Sheet. Available:

http://www.analog.com/static/imported-files/data_sheets/ADSP-

BF542_BF544_BF547_BF548_BF54.pdf

[8] Lirtex. (2012, 5th October 2011). Servo Motors – Information, Usage and Control. Available:

http://www.lirtex.com/robotics/servo-motors-information-and-control/

[9] ADI. (2007, 20th March 2012). ADuC7128 Data Sheet. Available:

http://www.analog.com/static/imported-files/data_sheets/ADUC7128_7129.pdf

[10] Microchip. (22nd March 2012). MCP2515 Data Sheet. Available:

http://ww1.microchip.com/downloads/en/DeviceDoc/21801d.pdf

[11] Wikipedia. (2012, 5th October 2011). Serial Peripheral Interface Bus. Available:

http://en.wikipedia.org/wiki/Serial_Peripheral_Interface_Bus

[12] ADI. (2008, 20th March 2012). BF548 ezkit manual. Available:

http://www.analog.com/static/imported-

files/eval_kit_manuals/BF548_EZKIT_Ref.Man_REV1.3.pdf

[13] D. C. Curt Franklin. (2008, 10th March 2012). How Operating Systems Work. Available:

http://computer.howstuffworks.com/operating-system1.htm

[14] D. J. D. a. M. Durrant. (5th October 2011). uClinux Available: http://www.uclinux.org/index.html

[15] M. Barr. (2007, 20th March 2012). Introduction to Pulse Width Modulation (PWM). Available:

http://www.barrgroup.com/Embedded-Systems/How-To/PWM-Pulse-Width-Modulation

[16] R. B. GmbH. (1991, 5th October 2011). CAN specification (Version 2.0 ed.). Available:

http://www.gaw.ru/data/Interface/CAN_BUS.PDF

[17] CiA. (2001-2011, 10th March 2012). CAN in automotive applications. Available: http://www.can-

cia.de/index.php?id=229

[18] M. Brain. (10th March 2012). How Microcontrollers Work. Available:

http://electronics.howstuffworks.com/microcontroller1.htm

[19] YouTube. (2010, 5th October 2011). Lynxmotion Arm Playing Checkers. Available:

http://www.youtube.com/watch?v=_ukHCOfggZ4&feature=related

[20] K. W. H. Tindell, H. ; Wellings, A.J., "Analysing Real-Time Communications: Controller Area

Network (CAN)," Real-Time Systems Symposium, vol. 7, pp. 259 - 263, December 1994 1994.

Page 104: Control of a Robotic Arm over CAN using a touch screen user interface

References

88

[21] CiA. (2001-2011, 10th March 2012). Time-triggered CAN. Available: http://www.can-

cia.org/index.php?id=521

[22] SAE. (2012, 10th March 2012). SAE International. Available: http://www.sae.org/

[23] CiA. (2001-2011, 5th October 2011). CAN History. Available: http://www.can-

cia.org/index.php?id=systemdesign-can-history

[24] R. Zurawski, "Controller Area Network: A Survey," in The Industrial Communication Technology

Handbook, ed United States of America: CRC Press, 2005.

[25] Intel. (1992, 5th October 2011). 82526 Controller Area Network Chip, Data Sheet. Available:

http://pdf1.alldatasheet.com/datasheet-pdf/view/157268/INTEL/AN82526.html

[26] P. Semiconductors. (1992, 12th October 2011). Standalone CAN controller, 82C200 Data Sheet.

Available: http://www.e-lab.de/downloads/DOCs/PCA82C200.pdf

[27] CiA. (2001-2011, 5th October 2011). CAN in Automation. Available: http://www.can-cia.de/

[28] A. S. Tannenbaum, "Introduction: Reference Models," in Computer Networks. vol. Edition 4, ed

New Jersey: Prentice Hall, 2003.

[29] ISO, "ISO 11898-1:2003," in Road vehicles -- Controller area network (CAN) -- Part 1: Data link

layer and physical signalling, ed, 2003.

[30] ISO, "ISO 11898-4:2004," in Road vehicles -- Controller area network (CAN) -- Part 4: Time-

triggered communication, ed, 2004.

[31] ISO, "ISO 11898:1993," in Road vehicles -- Interchange of digital information -- Controller area

network (CAN) for high-speed communication, ed, 1993.

[32] ISO, "ISO 16845:2004," in Road vehicles -- Controller area network (CAN) -- Conformance test

plan, ed, 2004.

[33] CiA. (2001-2011). CAN physical layer. Available: http://www.can-

cia.org/index.php?id=systemdesign-can-physicallayer

[34] O. Pfeiffer, Ayre, A., Keydel, C., Embedded Networking with CAN and CANopen. Germany: RTC

Books, 2003.

[35] CiA. (2001-2011, 10th February 2012). CAN protocol. Available: http://www.can-

cia.org/index.php?id=87

[36] R. Davis, Burns, A., Bril, R., Lukkien, J., "Controller Area Network (CAN) schedulability analysis:

Refuted, revisited and revised," Real-Time Systems, vol. 35, pp. 239 - 272, April 2007 2007.

[37] R. I. Davis, Burns, A., "Robust priority assignment for messages on Controller Area Network

(CAN)," Real-Time Systems, vol. 41, pp. 152 - 180, 2009.

[38] sigy. (2009, 10th March 2012). CAN bus physical layer. Available:

http://canbus.pl/index.php?id=3&lang=en

[39] M. Kumar, A. Verma, and A. Srividya, "Response-Time Modeling of Controller Area Network

(CAN) Distributed Computing and Networking." vol. 5408, V. Garg, R. Wattenhofer, and K.

Kothapalli, Eds., ed: Springer Berlin / Heidelberg, 2009, pp. 163-174.

[40] Wikipedia. (2011, 5th October 2011). CAN bus. Available: http://en.wikipedia.org/wiki/CAN_bus

[41] CiA. (2001-2011, 10th March 2012). CANopen. Available: http://www.can-

cia.de/index.php?id=systemdesign-canopen

[42] L.-B. Fredriksson. (2011, 11th October 2011). CAN Kingdom Rev. 3.01. Available:

http://www.kvaser.com/images/Papers/ck301p.pdf

[43] Wikipedia. (2011, 11th October 2011). CAN Kingdom. Available:

http://en.wikipedia.org/wiki/CAN_Kingdom

[44] CiA. (2001-2011, 10th March 2012). DeviceNet. Available: http://www.can-

cia.de/index.php?id=48

[45] Wikipedia. (2011, 11th November 2011). DeviceNet. Available:

http://en.wikipedia.org/wiki/DeviceNet

Page 105: Control of a Robotic Arm over CAN using a touch screen user interface

References

89

[46] J. A. Fisher, Faraboschi, P., Yound, C., "An Introduction to Embedded Processing," in Embedded

Computing: A VLIW Approach to Architecture, Compilers, and Tools, J. Turley, Ed., ed San

Francisco: Elsevier Inc., 2005, pp. 1 - 43.

[47] B. W. Kernighan, Ritchie, D.M, The C Programming Language. New Jersey: Prentice Hall

Software Series, 1988.

[48] E. De Castro Lopo, Aitken, P., Jones, B.L., Sams Teach yourself C for Linux Programming in 21

Days. United States of America: Sams publishing, 1999.

[49] J. A. Fisher, Faraboschi, P., Yound, C., "Embedded Compiling and Toolchains," in Embedded

Computing: A VLIW Approach to Architecture, Compilers, and Tools, ed San Francisco: Elsevier

Inc., 2005, pp. 290-291.

[50] D. Heffernan, "Title," unpublished|.

[51] Wikipedia. (2011, 12th October 2011). Kernel (computing). Available:

http://en.wikipedia.org/wiki/Kernel_(computing)

[52] L. Technologies. (2002, 5th October 2011). A history of UNIX. Available: http://www.bell-

labs.com/history/unix/

[53] M. T. Jones, IBM. (2007, 10th October 2011). Anatomy of the Linux kernel. Available:

http://www.ibm.com/developerworks/linux/library/l-linux-kernel/

[54] J. W. S. Lui, Real Time Systems. United States of America: Prentice Hall, 2000.

[55] K. H. Johansson, Torngren, M., Nielsen, L., "Vehicle Applications of Controller Area Network,"

2004.

[56] H. Aysan, Dobrin, R., Punnekkat, S., "Fault Tolerant Scheduling on Controller Area Network

(CAN)," 2010 13th IEEE International Symposium on Object/Component/Service-Oriented Real-

Time Distributed Computing Workshops, vol. 4, pp. 226 - 232, May 2010 2010.

[57] M. Short, Pont, M.J., "Fault-Tolerant Time-Triggered Communication Using CAN," IEEE

Transactions On Industrial Informatics, vol. 3, pp. 131 - 142 May 2007 2007.

[58] ADI. (1995-2012, 20th March 2012). Blackfin Processor Architecture Overview. Available:

http://www.analog.com/en/processors-dsp/Blackfin/processors/Blackfin_architecture/fca.html

[59] ADI. (1995-2012, 20th March 2012). A Beginners Guide to Digital Signla Processing. Available:

http://www.analog.com/en/processors-dsp/processors/beginners_guide_to_dsp/fca.html

[60] Wikipedia. (2012, 20th March 2012). FlexRay. Available: http://en.wikipedia.org/wiki/FlexRay

[61] L. Davis. (1998-2012, 10th March 2012). Automotive Buses. Available:

http://www.interfacebus.com/Design_Connector_Automotive.html

[62] Wikipedia. (2012, 20th March 2012). Local Interconnect Network. Available:

http://en.wikipedia.org/wiki/Local_Interconnect_Network

[63] Wikipedia. (2012, 20th March 2012). Time- Triggered Protocol. Available:

http://en.wikipedia.org/wiki/Time-Triggered_Protocol

[64] Wikipedia. (2011, 10th March 2012). IBM Rational Unified Process. Available:

http://en.wikipedia.org/wiki/IBM_Rational_Unified_Process

[65] A. Holub. (2011, 10th February 2012). Allen Holub's UML Quick Reference. Available:

http://www.holub.com/goodies/uml/

[66] (2012, 10th March 2012). Embedded World 2012 Exhibition & Conferrence. Available:

http://www.embedded-world.de/en/

[67] ADI. (2012, 12th March 2012). Isolated Controller Area Network (CAN). Available:

http://videos.analog.com/video/products/interface/969525772001/Isolated-Controller-Area-

Network-CAN/

[68] S. T. Robotics. (2011, 12th October 2011). Why ST robots have stepping motors. Available:

http://www.strobotics.com/stepper.htm

Page 106: Control of a Robotic Arm over CAN using a touch screen user interface

References

90

[69] Lynxmotion. (2010, 5th October 2011). Lynxmotion - AL5A. Available:

http://www.lynxmotion.com/c-124-al5a.aspx

[70] ADI. (1995-2012, 10th February 2012 ). why use uclinux. Available:

http://docs.blackfin.uclinux.org/doku.php?id=why_use_uclinux

[71] M. Graphics. (2012, 20th March 2012). PCB Design Software. Available:

http://www.mentor.com/products/pcb-system-design/design-flows/pads/pads-evaluation

[72] E. E. Herald. (2006, 12th March 2012). Controller Area Network (CAN) interface in embedded

systems: . Available: http://www.eeherald.com/section/design-guide/esmod9.html

[73] ADI. (2005-2011, 5th October 2011). ADuC7026 Data Sheet. Available:

http://www.analog.com/static/imported-

files/data_sheets/ADuC7019_20_21_22_24_25_26_27_28_29.pdf

[74] ADI. (2009-2011, 5th October 2011). ADuC7060 Data Sheet. Available:

http://www.analog.com/static/imported-files/data_sheets/ADuC7060_7061.pdf

[75] R. W. Besinga. (2009). Basic Servo Motor Controlling with Microchip PIC Microcontroller.

Available: http://www.ermicro.com/blog/?p=771

[76] J. Koetsier. (2009, 20th March 2012). User Friendly - How to Know Your Software is Usable.

Available: http://ezinearticles.com/?User-Friendly---How-to-Know-Your-Software-is-Usable&id=1865354

[77] IAR Systems. (2012, 10th March 2012). IAR. Available: http://www.iar.com/

[78] SEGGER. (2009, 10th October 2011). User guide of the JTAG emulators for ARM Cores. Available:

http://supp.iar.com/FilesPublic/UPDINFO/004916/arm/doc/jlinkarm.pdf

[79] ADI. (1995-2012, 20th March 2012). Analog Devices Open Source Software. Available:

http://docs.blackfin.uclinux.org/doku.php

[80] VMware. (2012, 10th October 2011). VMware. Available: http://www.vmware.com/

[81] ADI. (1995-2012, 10th November 2011). terminal programs. Available:

http://docs.blackfin.uclinux.org/doku.php?id=terminal_programs

[82] DokuWiki. (2011, 10th November 2011). blackfin:uclinux_bf537setup_ubuntu. Available:

http://visionlab.uncc.edu/dokuwiki/doku.php?id=blackfin:uclinux_bf537setup_ubuntu

[83] ADI. (1995-2012, 20th March 2012). Blackfin Koop. Available: http://blackfin.uclinux.org/gf/

[84] ADI. (1995-2012, 10th October 2011). Simple Hello World Application Example. Available:

http://docs.blackfin.uclinux.org/doku.php?id=simple_hello_world_application_example

[85] ADI. (1995-2012, 10th March 2012). Blackfin SocketCAN Driver. Available:

http://docs.blackfin.uclinux.org/doku.php?id=linux-kernel:drivers:bfin_can

[86] The Eclipse Foundation. (2012, 10th March 2012). Eclipse. Available: http://www.eclipse.org/

[87] ADI. (1995-2012, 10th February 2012). Using Make. Available:

https://docs.blackfin.uclinux.org/doku.php?id=toolchain:make&s[]=makefile

[88] ADI. (1995-2012, 12th February 2012). Kernel Threads. Available:

http://docs.blackfin.uclinux.org/doku.php?id=linux-kernel:threads

[89] ICP DAS. (2010, 10th October 2011). Linux SocketCAN CAN Bus Manual. Available:

ftp://ftp.icpdas.com/pub/cd/fieldbus_cd/can/pci/piso-can200_400/linux_can_driver/socketcan/linux_socketcan_can_bus_manual.pdf

[90] (2012, 10th March 2012). The Nano-X Window System. Available:

http://www.microwindows.org/

[91] (2012, 10th March 2012). The Microwindows Project. Available:

http://www.microwindows.org/MicrowindowsPaper.html

[92] DirectFB. (2012, 10th March 2012). DirectFB. Available: http://directfb.org/

[93] YouTube. (2010, 5th October 2011). ChessBot Lynxmotion AL5A Progress. Available:

http://www.youtube.com/watch?v=CkGqn5rNzK8&feature=related


[94] YouTube. (2010, 5th October 2011). RoboticArm-TicTacToe. Available:

http://www.youtube.com/watch?v=azuVH6LwjpU&feature=related

[95] YouTube. (2010, 5th October 2011). Robotic Arm Controller. Available:

http://www.youtube.com/watch?v=INAU6wPLikk&feature=related


Appendices

CD contents:

• Robotic Arm control board schematic files

• Robotic Arm control board project code files

• BlackFin control application project code files

• CAN message definitions document

• Project plan document

• Pictures associated with the project

• Video clips of the Robotic Arm performing the routines

• A copy of the report in PDF format

• Data sheets of devices used

• Interim Report

• Interim Presentation

Appendix A: Robotic Arm control board schematic 93 - 97

Appendix B: Robotic Arm control board code 98 - 109

Appendix C: BlackFin control application code 110 - 126

Appendix D: CAN message definitions 127 - 130

Appendix E: Project plan 131

Appendix A

Robotic Arm control board schematic

Appendix B main.c

/* Main.c
 * Name: Roisin Howard
 * ID number: 0850896
 * Start Date: July 2011
 * micro1 code
 */
#include <Analogdevices/ioaduc7128.h> // Include ADuC7128 header file
#include <intrinsics.h>
#include "stdio.h"
#include "mcp2515.h"
#include "REGS2515.h"

char thisStatus = 0;
int base_motor = BASE_HOME;
int shoulder_motor = SHOULDER_HOME;
int elbow_motor = ELBOW_HOME;
int wrist_motor = WRIST_HOME;
int gripper_motor = GRIPPER_HOME;
int wrotate_motor = WRISTROTATE_HOME;

/*
 * Function: main
 * Description: main routine
 * calls the microcontroller initialisation routine
 * and the motor control routine
 */
int main (void)
{
    GP4DAT = 0x04000000;
    Aduc_Init();
    motorControlRoutine();
} // end main

/*
 * Function: InitPowerControlSystem
 * Description: initialise internal clock frequency
 * sets up the PLL control register and
 * the power control register
 */
void InitPowerControlSystem(void)
{
    // Use external 32kHz crystal, PLL default
    PLLKEY1 = 0xAA;
    PLLCON = 0x01;
    PLLKEY2 = 0x55;
    // CPU clock ~ 5.12MHz
    //POWKEY1 = 0x01;
    //POWCON = 0x03;
    //POWKEY2 = 0xF4;
    // CPU clock ~ 10.44 MHz
    POWKEY1 = 0x01;
    POWCON = 0x02;
    POWKEY2 = 0xF4;
}

/*
 * Function: Aduc_Init
 * Description: ADuC7026 initialisation routine
 * calls the routines to set up the power control,
 * the PWM & SPI functionality, sets up the PWM
 * frequency and moves the robotic arm to the home
 * position, then resets the CAN controller and puts
 * it in normal mode
 */
void Aduc_Init(void)
{
    InitPowerControlSystem(); // initialise the core clock set-up
    //init_serial();          // initialise serial port
    InitPWM();                // main initialisation
    SPI_init();               // initialise interrupt-driven SPI
    SetPwmFreq();             // set up the PWM pairs to have a period of 20ms
    robotHome();              // give the robot a home position
    CANReset();               // resets the CAN controller
    CANNormalMode();          // puts the CAN controller in normal mode
}

/*
 * Function: delay
 * Description: delay function
 * allows a variable delay to be set up
 */

void delay (int length)
{
    while (length >= 0)
        length--;
}

/*
 * Function: LedToggle
 * Description: routine to toggle the LED
 * used for debug purposes to toggle the LED
 * when certain functions were entered
 */
void LedToggle(int time)
{
    GP4DAT ^= 0x00040000; // Complement P4.2
    delay(time);
}

spi.c

/*
 * SPI.C:
 * - initialising the SPI peripherals so the ADuC7128 will
 *   communicate with the MCP2515 (CAN controller)
 * - initialising the CAN controller - MCP2515
 *
 * Name: Roisin Howard
 * ID number: 0850896
 * Start Date: July 2011
 */
#include <Analogdevices/ioaduc7128.h>
#include "stdio.h"
#include "mcp2515.h"
#include "REGS2515.h"

unsigned char data;
unsigned char SpiReceiveBuffer;
unsigned char transmit_complete;
unsigned char read;
unsigned char spi_dummy;
unsigned char SpiDataGlobal;
unsigned long counter;
unsigned long SPISTATUS = 0;

/*
 * Function: SPI_init
 * Description: sets up the SPI peripherals.
 * Sets up continuous transfer enable, master mode,
 * SPI initialised, MSB first, clock idles high, &
 * serial clock pulses at the beginning of each transfer,
 * initiate transfer with a write to SPITX,
 * interrupt occurs when SPITX is empty,
 * SPITX transmits 0x00 when empty,
 * SPIRX overwrite enable
 */
void SPI_init(void)
{
    // initialise SPI peripherals (SS, SCLK, MISO, MOSI)
    GP1CON = BIT17 + BIT21 + BIT25 + BIT29;
    // SPI control register
    SPICON = 0x114F;
    // Set divider for SPI clock when HCLK = 41.78MHz
    SPIDIV = 0xA6; // 125kHz
}

/*
 * Function: SPITransmit
 * Description: transmit one byte via SPI
 */
void SPITransmit(unsigned char SpiData)
{
    while ((SPISTA & 0x01));
    SPITX = SpiData;
}

/*
 * Function: CANReset
 * Description: resets the MCP2515
 * - allows a start-up time of greater than 5.8us
 *   (128 oscillation periods)
 *   11.0592MHz crystal => 22.1184M clock cycles
 *   128 x (1/22.1184M) ~ 5.8us
 * - puts it into configuration mode:
 *   sets up the bit rate and the receive buffers
 */
void CANReset()
{
    SPITransmit(CAN_RESET); // 0xC0
    // allow the CAN controller to start up:
    for (int i = 0; i < 200; i++); // ~95us

    /* CAN controller is automatically in configuration mode after reset */

    // Setting up bit rate (86000bps)
    CANWrite(CNF1, (SJW1 + (8-1)));
    CANWrite(CNF2, (BTLMODE + (3-1)*8 + (1-1)));
    CANWrite(CNF3, (SOF_DISABLED + WAKFIL_DISABLED + (3-1)));

    // Setting up interrupts for the receive buffers
    BitModify(CANINTE, RX0IE + RX1IE, G_RXIE_ENABLED);
    CANWrite(RXM0SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXM0SIDL, 0x00);
    CANWrite(RXF0SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXF1SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXM1SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXM1SIDL, 0x00);
    CANWrite(RXF2SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXF3SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXF4SIDH, 1 << (ROBOT_ID + 4));
    CANWrite(RXF5SIDH, 1 << (ROBOT_ID + 4));

    // Disable interrupts for the transmit buffers
    BitModify(CANINTE, MERRE + WAKIE + ERRIE + TX0IE + TX1IE + TX2IE, 0);

    // Reset all interrupt flags
    CANWrite(CANINTF, 0x00);
}

/*
 * Function: CANReadStatus
 * Description: read the status of the CAN controller
 * a read-status command is transmitted,
 * followed by two dummy bytes
 * to allow the slave to send back
 * two bytes of information
 */
char CANReadStatus()
{
    unsigned char spiReadData = 0;
    SPITransmit(CAN_RD_STATUS);
    SPITransmit(0xFF);
    while ((SPISTA & 0x08) != 0x08);
    spiReadData = SPIRX;
    counter = 0;
    SPITransmit(0xFF);
    while ((SPISTA & 0x08) != 0x08);
    spiReadData = SPIRX;
    while (spiReadData == 0xFF)
    {
        counter++;
        spiReadData = SPIRX;
    }
    counter = 0;
    spi_delay();
    return spiReadData;
}

/*
 * Function: CANStatus
 * Description: reads the CAN status
 */
char CANStatus()
{
    char CANStatusByte;
    CANStatusByte = CANRead(CANSTAT);
    spi_delay();
    return CANStatusByte;
}

/*
 * Function: CANConfigMode
 * Description: puts the device into configuration mode
 */
void CANConfigMode()
{
    BitModify(CANCTRL, REQOP, REQOP_CONFIG);
    spi_delay();
}

/*
 * Function: CANNormalMode
 * Description: puts the device into normal mode
 * (standard operation)
 */
void CANNormalMode()
{
    BitModify(CANCTRL, REQOP, REQOP_NORMAL);
    spi_delay();
}

/*
 * Function: CANLoopbackMode
 * Description: puts the device into loopback mode
 */
void CANLoopbackMode()
{
    BitModify(CANCTRL, REQOP, REQOP_LOOPBACK);
    spi_delay();
}

/*
 * Function: CANRead
 * Description: read from the CAN controller
 * a read command is transmitted, followed by the
 * address to be read from and a dummy byte to
 * allow the slave to send back the byte of information
 */
char CANRead(unsigned char address)
{
    char readData = 0;
    counter = 0;
    SPITransmit(CAN_READ); // 0x03
    SPITransmit(address);
    SPITransmit(0xFF);
    while ((SPISTA & 0x08) != 0x08);
    readData = SPIRX;
    while (readData == 0xFF)
    {
        counter++;
        readData = SPIRX;
    }
    counter = 0;
    spi_delay();
    return readData;
}

/*
 * Function: CANWrite
 * Description: write to the CAN controller
 * a write command is transmitted,
 * followed by the address to be written to and the data
 */
void CANWrite(unsigned char address, unsigned char can_data)
{
    SPITransmit(CAN_WRITE); // 0x02
    SPITransmit(address);
    SPITransmit(can_data);
    spi_delay();
}

/*
 * Function: CANRxStatus
 * Description: receive buffer status
 * the CAN message to request an RX status is sent first, then a dummy byte
 * is sent to allow the slave to send back the status of the buffer
 */
char CANRxStatus()
{
    char readData = 0;
    counter = 0;
    SPITransmit(CAN_RX_STATUS);
    SPITransmit(0xFF);
    while ((SPISTA & 0x08) != 0x08);
    readData = SPIRX;
    while (readData == 0xFF)
    {
        counter++;
        readData = SPIRX;
    }
    counter = 0;
    spi_delay();
    return readData;
}

/*
 * Function: BitModify
 * Description: allows an address to be modified
 * sends a CAN_BIT_MODIFY command,
 * followed by the address to modify,
 * the mask and the bit value
 */
void BitModify(unsigned char address, unsigned char mask, unsigned char bit_value)
{
    SPITransmit(CAN_BIT_MODIFY);
    SPITransmit(address);
    SPITransmit(mask);
    SPITransmit(bit_value);
    spi_delay();
}

/*
 * Function: CAN_TransmitMessage
 * Description: transmits the CAN messages
 * reads the CAN status to see if there is a message,
 * then it checks through each transmit buffer to see
 * if it is already in use; if it is empty it is loaded
 * with the message to be transmitted, if it is full
 * the next buffer is checked.
 */
void CAN_TransmitMessage()
{
    int ii = 0;
    unsigned char CANstatus = 0;
    unsigned char bufferStatus = 0;
    CANstatus = CANReadStatus();
    if (!(CANstatus & 0x04)) // check if buffer 0 is in use
    {
        CANWrite(TXB0SIDH, Transmit_SIDH); // if not, load it
        CANWrite(TXB0SIDL, Transmit_SIDL);
        CANWrite(TXB0DLC, Transmit_DLC);
        for (ii = 0; ii < (Transmit_DLC & 0x0F); ii++)
            CANWrite(TXB0D0 + ii, Transmit_Data[ii]);
        BitModify(TXB0CTRL, TXREQ, TXREQ_SET);
    }
    else if (!(CANstatus & 0x10)) // check if buffer 1 is in use
    {
        CANWrite(TXB1SIDH, Transmit_SIDH); // if not, load it
        CANWrite(TXB1SIDL, Transmit_SIDL);
        CANWrite(TXB1DLC, Transmit_DLC);
        for (ii = 0; ii < (Transmit_DLC & 0x0F); ii++)
            CANWrite(TXB1D0 + ii, Transmit_Data[ii]);
        BitModify(TXB1CTRL, TXREQ, TXREQ_SET);
    }
    else if (!(CANstatus & 0x40)) // check if buffer 2 is in use
    {
        CANWrite(TXB2SIDH, Transmit_SIDH); // if not, load it
        CANWrite(TXB2SIDL, Transmit_SIDL);
        CANWrite(TXB2DLC, Transmit_DLC);
        for (ii = 0; ii < (Transmit_DLC & 0x0F); ii++)
            CANWrite(TXB2D0 + ii, Transmit_Data[ii]);
        BitModify(TXB2CTRL, TXREQ, TXREQ_SET);
    }
    else
    {
        bufferStatus = CANRead(TXB0CTRL);
        printf("Buffer status: %x \r\n", bufferStatus & 0x30);
        printf("CAN status: %x \r\n", CANstatus);       // CAN status
        printf("Buffer status: %x \r\n", bufferStatus); // value in buffer
    }
}

/*
 * Function: spi_delay
 * Description: creates a delay
 * allows functions enough of a delay for the CS to deassert itself
 * before the next transmission, to save confusion within the messages
 */
void spi_delay()
{
    for (int i = 0; i < 400; i++);
}

motors.c

/*
 * motors.c
 * - initialises parameters associated with the motors,
 *   i.e. PWM initialisation
 * - message reception from the CAN controller
 *
 * Name: Roisin Howard
 * ID number: 0850896
 * Start Date: July 2011
 */
#include <Analogdevices/ioaduc7128.h>
#include <stdio.h>
#include "mcp2515.h"
#include "REGS2515.h"

unsigned char spi_data[16];
unsigned char spi_index = 0;
unsigned char CAN_interrupt_received = 0;
unsigned char CAN_message_received = 0;
unsigned char Transmit_SIDH;
unsigned char Transmit_SIDL;
unsigned char Transmit_DLC;
unsigned char Transmit_Data[8];

/*
 * Function: InitPWM
 * Description: initialise the relevant registers in the PWM
 */
void InitPWM(void)
{
    // initialise PWM outputs: PWM1-PWM6 (Port 3)
    GP3CON = BIT0 + BIT4 + BIT8 + BIT12 + BIT16 + BIT20;
    // set PWM1-PWM6 as outputs
    GP3DAT = BIT24 + BIT25 + BIT26 + BIT27 + BIT28 + BIT29;
    // PWM control register
    PWMCON1 = 0x00;
    PWMCON1 = BIT0 + BIT6 + BIT8;
}

/*
 * Function: SetPwmFreq
 * Description: set up the PWM period at 20ms (50Hz)
 * UCLK/64 = 10.44M(x4)/64 = 160,000 x 4 = 640,000
 * 640,000/50 = 12,800 => 0x3200 in hex
 */
void SetPwmFreq(void)
{
    PWM1LEN = PWM_FREQVALUE; // 20ms period
    PWM2LEN = PWM_FREQVALUE; // 20ms period
    PWM3LEN = PWM_FREQVALUE; // 20ms period
    return;
}

/*
 * Function: robotHome
 * Description: sets the robot up in the home position
 */
void robotHome()
{
    PWM1COM1 = START;
    PWM1COM2 = WRISTROTATE_HOME; // 1.3ms
    PWM1COM3 = GRIPPER_HOME;     // 1.3ms
    PWM2COM1 = START;
    PWM2COM2 = WRIST_HOME;       // 2.4ms
    PWM2COM3 = ELBOW_HOME;       // 1.6ms
    PWM3COM1 = START;
    PWM3COM2 = SHOULDER_HOME;    // 1.6ms
    PWM3COM3 = BASE_HOME;        // 1.6ms
}

/*
 * Function: positionMotor
 * Description: sets up when the high signal goes low on the PWM high side
 */
void positionMotor(int motor, int position)
{
    switch (motor)
    {
    case 0:
    case 1:
        PWM3COM2 = position;
        delay(2000);
        break;
    case 2:
    case 3:
        PWM2COM2 = position;
        delay(2000);
        break;
    case 4:
    case 5:
        PWM1COM2 = position;
        delay(2000);
        break;
    default:
        break;
    }
}

/*
 * Function: motorControlRoutine
 * Description: function to control motor movement
 */
void motorControlRoutine(void)
{
    unsigned char i = 0;
    unsigned char j = 0;
    unsigned int MessageID = 0;
    unsigned char Receive_SIDH[2];
    unsigned char Receive_SIDL[2];
    unsigned char Receive_DLC[2];
    unsigned char Receive_Data[2][8];
    unsigned char CANstatus;
    unsigned char RXBoffset = 0;
    int motormessage = -1;
    int motorstatus[] = {0, 0, 0, 0, 0, 0};
    /*int motorstatusmessage[] = {MSG_STATUS_BASE, MSG_STATUS_SHOULDER, MSG_STATUS_ELBOW,
                                  MSG_STATUS_WRIST, MSG_STATUS_GRIPPER, MSG_STATUS_WRISTROTATE};*/
    int motormovedmessage[] = {MSG_MOVED_BASE, MSG_MOVED_SHOULDER, MSG_MOVED_ELBOW,
                               MSG_MOVED_WRIST, MSG_MOVED_GRIPPER, MSG_MOVED_WRISTROTATE};
    int motorpos[]  = {BASE_HOME, SHOULDER_HOME, ELBOW_HOME, WRIST_HOME, GRIPPER_HOME, WRISTROTATE_HOME};
    int motormax[]  = {BASE_MAX, SHOULDER_MAX, MOTOR_MAX, MOTOR_MAX, MOTOR_MAX, MOTOR_MAX};
    int motormin[]  = {BASE_MIN, SHOULDER_MIN, MOTOR_MIN, MOTOR_MIN, MOTOR_MIN, MOTOR_MIN};
    int motorgoal[] = {BASE_HOME, SHOULDER_HOME, ELBOW_HOME, WRIST_HOME, GRIPPER_HOME, WRISTROTATE_HOME};
    int newgoal;

    //printf("Robot ID: %u, RA Demo Version %u", robot, major_ver);
    //printf(".%u\r\n", minor_ver);

    while (1)
    {
        //CAN_interrupt_received = 1;
        if (CAN_interrupt_received)
        {
            //printf("CANint\n");
            CAN_interrupt_received = 0;
            CAN_message_received = 0;
            CANstatus = CANReadStatus();

            // Receive buffers 0 and 1
            for (i = 0; i < 2; i++)
            {
                // Add an offset of 0x10 if it is receive buffer 1
                RXBoffset = i * 0x10;
                // Check if this receive buffer is full
                if ((CANstatus & (i+1)) > 0)
                {
                    // Read ID, DLC, data
                    Receive_SIDH[i] = CANRead(RXB0SIDH + RXBoffset);
                    Receive_SIDL[i] = CANRead(RXB0SIDL + RXBoffset);
                    Receive_DLC[i] = CANRead(RXB0DLC + RXBoffset) & 0x0F;
                    for (j = 0; j < Receive_DLC[i]; j++)
                        Receive_Data[i][j] = CANRead(RXB0D0 + RXBoffset + j);
                    spi_delay();
                    CAN_message_received = CAN_message_received | (i+1);
                    // Modify CANINTF bit 0 or 1
                    BitModify(CANINTF, i + 1, 0);
                    // Reset error flag
                    BitModify(EFLG, (i + 1) << 6, 0);
                }
            }

            // Loop through stored messages (one or two)
            for (i = 0; i < 2; i++)
            {
                if ((CAN_message_received & (i + 1)) > 0)
                {
                    MessageID = (Receive_SIDH[i] << 3) + (Receive_SIDL[i] >> 5);
                    //printf("MessageID %x \n\r", MessageID);
                    switch (MessageID & 0x7F)
                    {
                    case 0x08:
                        printf("Version Request Rcd \n\r");
                        Transmit_SIDH = 0x01 | (1 << (ROBOT_ID + 4));
                        Transmit_SIDL = 0x00;
                        Transmit_DLC = 0x08;
                        Transmit_Data[0] = 0x00;
                        Transmit_Data[1] = 0x00;
                        Transmit_Data[2] = 0x00;
                        Transmit_Data[3] = RACODE_MAJORVERSION;
                        Transmit_Data[4] = 0x00;
                        Transmit_Data[5] = 0x00;
                        Transmit_Data[6] = 0x00;
                        Transmit_Data[7] = RACODE_MINORVERSION;
                        CAN_TransmitMessage();
                        break;
                    //case MSG_BASE :
                    case MSG_SHOULDER:
                    //case MSG_ELBOW :
                    case MSG_WRIST:
                    //case MSG_GRIPPER :
                    case MSG_WRISTROTATE:
                        motormessage = (MessageID & 0x7F) - MSG_BASE;
                        if (Receive_Data[i][0] == 10)
                        {
                            if (motorstatus[motormessage] > 0)
                            {
                                motorstatus[motormessage] = 0;
                                Transmit_SIDH = (motormovedmessage[motormessage] >> 3) | (1 << (ROBOT_ID + 4));
                                Transmit_SIDL = (motormovedmessage[motormessage] & 0x07) << 5;
                                Transmit_DLC = 0x02;
                                Transmit_Data[0] = motorpos[motormessage] & 0x7F;
                                Transmit_Data[1] = motorpos[motormessage] >> 7;
                                CAN_TransmitMessage();
                            }
                            else
                            {
                                motorstatus[motormessage] = 1;
                                motorgoal[motormessage] = motormax[motormessage];
                            }
                        }
                        else
                        {
                            if (motorstatus[motormessage] < 0)
                            {
                                motorstatus[motormessage] = 0;
                                Transmit_SIDH = (motormovedmessage[motormessage] >> 3) | (1 << (ROBOT_ID + 4));
                                Transmit_SIDL = (motormovedmessage[motormessage] & 0x07) << 5;
                                Transmit_DLC = 0x02;
                                Transmit_Data[0] = motorpos[motormessage] & 0x7F;
                                Transmit_Data[1] = motorpos[motormessage] >> 7;
                                CAN_TransmitMessage();
                            }
                            else
                            {
                                motorstatus[motormessage] = -1;
                                motorgoal[motormessage] = motormin[motormessage];
                            }
                        }
                        break;
                    //case MSG_POS_BASE :
                    case MSG_POS_SHOULDER:
                    //case MSG_POS_ELBOW :
                    case MSG_POS_WRIST:
                    //case MSG_POS_GRIPPER :
                    case MSG_POS_WRISTROTATE:
                        motormessage = (MessageID & 0x7F) - MSG_POS_BASE;
                        newgoal = Receive_Data[i][0] + (Receive_Data[i][1] << 7);
                        //if ((newgoal < motormax[motormessage]) && (newgoal > motormin[motormessage]))
                        motorgoal[motormessage] = newgoal;
                        if (newgoal < motorpos[motormessage])
                            motorstatus[motormessage] = -1;
                        //if (newgoal > motorpos[motormessage])
                        else
                            motorstatus[motormessage] = 1;
                        break;
                    case ROBOTHOME:
                        //robotHome();
                        for (j = 1; j < 6; j = j + 2)
                        {
                            motorgoal[j] = 0x400;
                            if (motorgoal[j] < motorpos[j])
                                motorstatus[j] = -1;
                            //if (motorgoal[j] > motorpos[j])
                            else
                                motorstatus[j] = 1;
                        }
                        break;
                    default:
                        break;
                    } // end switch
                } // end if
            } // end for (loop through saved messages)
        } // end if (CAN interrupt received)

        for (int i = 1; i < 6; i = i + 1)
        {
            if (motorstatus[i] > 0)
            {
                if ((motorpos[i] < motorgoal[i]) && (motorpos[i] < motormax[i]))
                {
                    positionMotor(i, motorpos[i]++);
                }
                else
                {
                    motorstatus[i] = 0;
                    Transmit_SIDH = (motormovedmessage[i] >> 3) | (1 << (ROBOT_ID + 4));
                    Transmit_SIDL = (motormovedmessage[i] & 0x07) << 5;
                    Transmit_DLC = 0x02;
                    Transmit_Data[0] = motorpos[i] & 0x7F;
                    Transmit_Data[1] = motorpos[i] >> 7;
                    CAN_TransmitMessage();
                }
            }
            if (motorstatus[i] < 0)
            {
                if ((motorpos[i] > motorgoal[i]) && (motorpos[i] > motormin[i]))
                {
                    positionMotor(i, motorpos[i]--);
                }
                else
                {
                    motorstatus[i] = 0;
                    Transmit_SIDH = (motormovedmessage[i] >> 3) | (1 << (ROBOT_ID + 4));
                    Transmit_SIDL = (motormovedmessage[i] & 0x07) << 5;
                    Transmit_DLC = 0x02;
                    Transmit_Data[0] = motorpos[i] & 0x7F;
                    Transmit_Data[1] = motorpos[i] >> 7;
                    CAN_TransmitMessage();
                }
            }
        }

        CANstatus = CANReadStatus();
        if (CANstatus > 0)
            CAN_interrupt_received = 1;
    } // end while(1) loop
} // end motorControlRoutine

mcp2515.h

/*
 * mcp2515.h
 * Name: Roisin Howard
 * ID number: 0850896
 * Start Date: July 2011
 * micro 1
 */
#define RACODE_MAJORVERSION 1
#define RACODE_MINORVERSION 7
#define ROBOT_ID 1

/*
 * Function prototypes
 */
void CANReset();
char CANReadStatus();
void CANConfigMode();
void LedToggle();
void CANNormalMode();
void CANLoopbackMode();
char CANRxStatus();
void CANReadRxBuffer();
void CANWrite(unsigned char address, unsigned char can_data);
char CANRead(unsigned char address);
void CAN_TransmitMessage();
void SPI_init(void);
void init_serial(void);
void SPITransmit(unsigned char SpiData);
char CANStatus();
void BitModify(unsigned char address, unsigned char mask, unsigned char bit_value);
void positionMotor(int motor, int position);
int switchIO(int operation);
void com_initialize(void);
void motorControlRoutine(void);
void InitPowerControlSystem(void);
void InitPWM(void);
void delay(int);
int putchar(int ch);
void Aduc_Init(void);
void robotHome();
int getchar(void);
int write(int file, char *ptr, int len);
void SetPwmFreq(void);
void spi_delay(void);

/*
 * Global variables
 */
extern unsigned char DataReceived;
extern unsigned char ReceiveBuffer[8];
extern unsigned char index;
extern unsigned char DataTransmit;
extern unsigned char spi_index;
extern unsigned char read_index;
extern unsigned char CAN_interrupt_received;
extern unsigned char transmit_complete;
extern unsigned char Transmit_SIDH;
extern unsigned char Transmit_SIDL;
extern unsigned char Transmit_DLC;
extern unsigned char Transmit_Data[8];
extern char read_data[10];
extern unsigned char spi_data[16];
extern unsigned char SpiReceiveBuffer;
extern unsigned char read;
extern unsigned char spi_dummy;
extern unsigned char SpiDataGlobal;
extern int base_motor;
extern int shoulder_motor;
extern int elbow_motor;
extern int wrist_motor;
extern int gripper_motor;
extern int wrotate_motor;

/*
 * Missing definitions from the ADuC7060 header file
 */
__IO_REG32(POWKEY3, 0xFFFF0434, __WRITE);
__IO_REG32(POWKEY4, 0xFFFF043C, __WRITE);
__IO_REG32(POWCON1, 0xFFFF0438, __WRITE);

/*
 * Bit definitions
 */
#define BIT0  0x01
#define BIT1  0x02
#define BIT2  0x04
#define BIT3  0x08
#define BIT4  0x10
#define BIT5  0x20
#define BIT6  0x40
#define BIT7  0x80
#define BIT8  0x100
#define BIT9  0x200
#define BIT10 0x400
#define BIT11 0x800
#define BIT12 0x1000
#define BIT13 0x2000
#define BIT14 0x4000
#define BIT15 0x8000
#define BIT16 0x10000
#define BIT17 0x20000
#define BIT18 0x40000
#define BIT19 0x80000
#define BIT20 0x100000
#define BIT21 0x200000
#define BIT22 0x400000
#define BIT23 0x800000
#define BIT24 0x1000000
#define BIT25 0x2000000
#define BIT26 0x4000000
#define BIT27 0x8000000
#define BIT28 0x10000000
#define BIT29 0x20000000
#define BIT30 0x40000000
#define BIT31 0x80000000
#define BIT32 0x100000000

/*
 * Robotic Arm definitions
 */
#define PWM_FREQVALUE 0x3200 /* set PWM freq to 50Hz; 20ms period */

#define MSG_STOP 0x01 /* Add 0x300 to these for the CAN message ID sent to Robot IDs 1 & 2 */
                      /* Add 0x100 for response by Robot ID 1 */
                      /* Add 0x200 for response by Robot ID 2 */
#define MSG_VREQ 0x08 /* Add 0x300 to these for the CAN message IDs sent to Robot IDs 1 & 2 */
#define MSG_BASE 0x09
#define MSG_SHOULDER 0x0A
#define MSG_ELBOW 0x0B
#define MSG_WRIST 0x0C
#define MSG_GRIPPER 0x0D
#define MSG_WRISTROTATE 0x0E
#define ROBOTHOME 0x0F

/* Commands to send a motor to a given position */
#define MSG_POS_BASE 0x19        /* Add 0x200 for Robot ID 2 */
#define MSG_POS_SHOULDER 0x1A    /* Add 0x100 for Robot ID 1 */
#define MSG_POS_ELBOW 0x1B       /* Add 0x200 for Robot ID 2 */
#define MSG_POS_WRIST 0x1C       /* Add 0x100 for Robot ID 1 */
#define MSG_POS_GRIPPER 0x1D     /* Add 0x200 for Robot ID 2 */
#define MSG_POS_WRISTROTATE 0x1E /* Add 0x100 for Robot ID 1 */

/* MSG_STATUS_X not implemented */
#define MSG_STATUS_BASE 0x29        /* Add 0x200 for Robot ID 2 */
#define MSG_STATUS_SHOULDER 0x2A    /* Add 0x100 for Robot ID 1 */
#define MSG_STATUS_ELBOW 0x2B       /* Add 0x200 for Robot ID 2 */
#define MSG_STATUS_WRIST 0x2C       /* Add 0x100 for Robot ID 1 */
#define MSG_STATUS_GRIPPER 0x2D     /* Add 0x200 for Robot ID 2 */
#define MSG_STATUS_WRISTROTATE 0x2E /* Add 0x100 for Robot ID 1 */

/* Report of motor position after movement/stop */
#define MSG_MOVED_BASE 0x39        /* Add 0x200 for Robot ID 2 */
#define MSG_MOVED_SHOULDER 0x3A    /* Add 0x100 for Robot ID 1 */
#define MSG_MOVED_ELBOW 0x3B       /* Add 0x200 for Robot ID 2 */
#define MSG_MOVED_WRIST 0x3C       /* Add 0x100 for Robot ID 1 */
#define MSG_MOVED_GRIPPER 0x3D     /* Add 0x200 for Robot ID 2 */
#define MSG_MOVED_WRISTROTATE 0x3E /* Add 0x100 for Robot ID 1 */

/* Home positions from LynxTerm */
#define WRISTROTATE_HOME 0x3C0 // (640000/(1/1.5ms))
#define GRIPPER_HOME 0x3C0     // (640000/(1/1.5ms))
#define WRIST_HOME 0x3C0       // (640000/(1/1.5ms))
#define ELBOW_HOME 0x3C0       // (640000/(1/1.5ms))
#define SHOULDER_HOME 0x3C0    // (640000/(1/1.5ms))
#define BASE_HOME 0x3C0        // (640000/(1/1.5ms))

/* Measured max/min positions for Base Rotate and Shoulder */
#define BASE_MAX 0x640
#define BASE_MIN 0x190
#define SHOULDER_MAX 0x4B0
#define SHOULDER_MIN 0x230

/* Max, Min motor positions from LynxTerm */
#define MOTOR_MAX 0x5A0
#define MOTOR_MIN 0x1E0

#define START 0x00


Appendix C hello_world.c

/*
 * Roisin Howard
 * 0850896
 * Analog Devices
 * February 2012
 */
#include <stdio.h>
#include <stdarg.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <stdint.h>
#include <fcntl.h>
#include <dlfcn.h>
#include <alloca.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <linux/can.h>
#include <linux/socket.h>
#include <linux/can/raw.h>
#include <linux/version.h>
#include <linux/input.h>
#include <poll.h>
#include "lib.h"
#include "CANCommands.h"
#include "fbutils.h"

/* function prototypes */
int parse_canframe(char *cs, struct can_frame *cf);
static int asc2nibble(char c);
int can_setup();
int button_setup();
int menu_button_setup();
int routine_button_setup();
int robot_control();
int routines();
int main_menu();
int send_can();
int Get_Touch_Position();

/* global variables */
int fd, rd;
struct input_event ev[64];
char name[256] = "Unknown";
int screen_touched = 0;
unsigned int colorindex;
struct tsdev *ts;
char *tsdevice = NULL;
int x, y, i;
int s; /* CAN raw socket */
int nbytes;
struct sockaddr_can addr;
struct can_frame frame;
struct can_frame recframe;
struct ifreq ifr;
char s_holder[1024];
int CAN_Msg_rcvd = 0;
unsigned int mode = 0;
int counter = 0;
int playing_routine = 0;
int dcounter = 0;
int playing_dance_routine = 0;
int new_can_msg_to_send = 0;
int xcoord = 0;
int ycoord = 0;
int screen_value = -1;

static int palette[] =
{
    0x000000, 0xffe080, 0xffffff, 0xff00ff
};
#define NR_COLORS 4 /* (sizeof (palette) / sizeof (palette [0])) */

struct ts_button
{
    int x, y, w, h;
    char *text;
    int flags;
#define BUTTON_ACTIVE 0x00000001
};

/* [inactive] border fill text [active] border fill text */
static int button_palette[6] =
{
    1, 4, 2,
    1, 5, 0
};

#define NR_BUTTONS 15
static struct ts_button buttons[NR_BUTTONS];
#define NR_MENU_BUTTONS 3
static struct ts_button menubuttons[NR_MENU_BUTTONS];
#define NR_ROUTINE_BUTTONS 4
static struct ts_button routinebuttons[NR_ROUTINE_BUTTONS];
#define NR_INFO_BUTTON 1
static struct ts_button infobutton[NR_INFO_BUTTON];

char *events[EV_MAX + 1] =
{
    [0 ... EV_MAX] = NULL,
    [EV_SYN] = "Sync",
    [EV_KEY] = "Key",
    [EV_REL] = "Relative",
    [EV_ABS] = "Absolute",
    [EV_MSC] = "Misc",
    [EV_LED] = "LED",
    [EV_SND] = "Sound",
    [EV_REP] = "Repeat",
    [EV_FF] = "ForceFeedback",
    [EV_PWR] = "Power",
    [EV_FF_STATUS] = "ForceFeedbackStatus"
};

char *relatives[REL_MAX + 1] =
{
    [0 ... REL_MAX] = NULL,
    [REL_X] = "X",
    [REL_Y] = "Y"
};

char *absolutes[ABS_MAX + 1] =
{
    [0 ... ABS_MAX] = NULL,
    [ABS_X] = "X",
    [ABS_Y] = "Y"
};

static void welcome_screen()
{
    fillrect(0, 0, xres - 1, yres - 1, 0);
    put_string_center(xres/2, 50, "Analog Devices CAN Demonstration", 1);
    put_string_center(xres/2, 100, "Robotic Arm Controller", 2);
    put_string_center(xres/2, 150, "Roisin Howard 0850896", 3);
    put_string_center(xres/2, 200, "Final Year Project 2012", 1);
}

static void button_draw(struct ts_button *button)
{
    int s = (button->flags & BUTTON_ACTIVE) ? 3 : 0;
    rect(button->x, button->y, button->x + button->w - 1,
         button->y + button->h - 1, button_palette[s]);
    fillrect(button->x + 1, button->y + 1, button->x + button->w - 2,
             button->y + button->h - 2, button_palette[s + 1]);
    put_string_center(button->x + button->w / 2, button->y + button->h / 2,
                      button->text, button_palette[s + 2]);
}

static void refresh_menu_screen()
{
    int i;
    fillrect(0, 0, xres - 1, yres - 1, 0);
    put_string_center(xres/2, 15, "Analog Devices CAN Demonstration", 1);
    put_string_center(xres/2, 30, "Robotic Arm Controller", 2);
    for (i = 0; i < NR_MENU_BUTTONS; i++)
        button_draw(&menubuttons[i]);
}

static void refresh_screen()
{
    int i;
    fillrect(0, 0, xres - 1, yres - 1, 0);
    put_string_center(xres/2, 15, "Analog Devices CAN Demonstration", 1);
    put_string_center(xres/2, 30, "Robotic Arm Controller", 2);
    for (i = 0; i < NR_BUTTONS; i++)
        button_draw(&buttons[i]);
}

static void refresh_routine_screen()
{
    int i;
    fillrect(0, 0, xres - 1, yres - 1, 0);
    put_string_center(xres/2, 15, "Analog Devices CAN Demonstration", 1);
    put_string_center(xres/2, 30, "Robotic Arm Controller", 2);
    for (i = 0; i < NR_ROUTINE_BUTTONS; i++)
        button_draw(&routinebuttons[i]);
}

static void info_screen()
{
    fillrect(0, 0, xres - 1, yres - 1, 0);
    put_string_center(xres/2, 20, "Analog Devices CAN Demonstration", 1);
    put_string_center(xres/2, 40, "Robotic Arm Controller", 2);
    put_string_center(xres/2, 60, "Roisin Howard 0850896", 3);
    put_string_center(xres/2, 80, "This is a showcase demonstration for the signal", 2);
    put_string_center(xres/2, 100, "and power isolated CAN transceiver, ADM3053.", 2);
    put_string_center(xres/2, 120, "The robotic arm can be controlled manually by", 2);
    put_string_center(xres/2, 140, "clicking the Robot Control button in the main menu.", 2);
    put_string_center(xres/2, 160, "Pre-recorded routines can also be played back by", 2);
    put_string_center(xres/2, 180, "clicking the Routines button in the main menu.", 2);
    button_draw(&infobutton[0]);
}

/*
 * This function sets up the buttons for the main menu screen
 */
int menu_button_setup()
{
    memset(&menubuttons, 0, sizeof(menubuttons));
    /* set the width of the buttons */
    menubuttons[0].w = xres / 2;
    menubuttons[1].w = xres / 2;
    menubuttons[2].w = xres / 2;
    /* set the height of the buttons */
    menubuttons[0].h = 50;
    menubuttons[1].h = 50;
    menubuttons[2].h = 50;
    /* set the x position on the screen of each button */
    menubuttons[0].x = (2 * xres) / 4 - menubuttons[0].w / 2;
    menubuttons[1].x = (2 * xres) / 4 - menubuttons[0].w / 2;
    menubuttons[2].x = (2 * xres) / 4 - menubuttons[0].w / 2;
    /* set the y position on the screen of each button */
    menubuttons[0].y = 50;
    menubuttons[1].y = 120;
    menubuttons[2].y = 190;
    /* add text to the centre of each button */
    menubuttons[0].text = "Robot Controller";
    menubuttons[1].text = "Routines";
    menubuttons[2].text = "Information";
    /* refresh the screen to display the menu buttons */
    refresh_menu_screen();
}

/*
 * This function sets up the buttons for the robot control screen
 * NB: change this layout to make more space to press a button... more user friendly
 */
int button_setup()
{
    memset(&buttons, 0, sizeof(buttons));
    /* set the width of the buttons */
    buttons[0].w = buttons[1].w = xres / 4;
    buttons[2].w = buttons[3].w = xres / 4;
    buttons[4].w = buttons[5].w = xres / 4;
    buttons[6].w = buttons[7].w = xres / 4;
    buttons[8].w = buttons[9].w = xres / 4;
    buttons[10].w = buttons[11].w = xres / 4;
    buttons[12].w = buttons[13].w = xres / 4;
    buttons[14].w = xres / 4;
    /* set the height of the buttons */
    buttons[0].h = buttons[1].h = 30;
    buttons[2].h = buttons[3].h = 30;
    buttons[4].h = buttons[5].h = 30;
    buttons[6].h = buttons[7].h = 30;
    buttons[8].h = buttons[9].h = 30;
    buttons[10].h = buttons[11].h = 30;
    buttons[12].h = buttons[13].h = 30;
    buttons[14].h = 30;
    /* set the x position on the screen of each button */
    buttons[0].x = xres / 4 - buttons[0].w / 2 - 30;
    buttons[1].x = (2 * xres) / 4 - buttons[0].w / 2;
    buttons[2].x = (3 * xres) / 4 - buttons[0].w / 2 + 30;
    buttons[3].x = xres / 4 - buttons[0].w / 2 - 30;
    buttons[4].x = (2 * xres) / 4 - buttons[0].w / 2;
    buttons[5].x = (3 * xres) / 4 - buttons[0].w / 2 + 30;
    buttons[6].x = xres / 4 - buttons[0].w / 2 - 30;
    buttons[7].x = (2 * xres) / 4 - buttons[0].w / 2;
    buttons[8].x = (3 * xres) / 4 - buttons[0].w / 2 + 30;
    buttons[9].x = xres / 4 - buttons[0].w / 2 - 30;
    buttons[10].x = (2 * xres) / 4 - buttons[0].w / 2;
    buttons[11].x = (3 * xres) / 4 - buttons[0].w / 2 + 30;
    buttons[12].x = xres / 4 - buttons[0].w / 2 - 30;
    buttons[13].x = (2 * xres) / 4 - buttons[0].w / 2;
    buttons[14].x = (3 * xres) / 4 - buttons[0].w / 2 + 30;
    /* set the y position on the screen of each button */
    buttons[0].y = buttons[1].y = buttons[2].y = 60;
    buttons[3].y = buttons[4].y = buttons[5].y = 100;
    buttons[6].y = buttons[7].y = buttons[8].y = 140;
    buttons[9].y = buttons[10].y = buttons[11].y = 180;
    buttons[12].y = buttons[13].y = buttons[14].y = 220;
    /* add text to the centre of each button */
    buttons[0].text = "Open Gripper";
    buttons[1].text = "Wrist left";
    buttons[2].text = "Wrist up";
    buttons[3].text = "Close Gripper";
    buttons[4].text = "Wrist right";
    buttons[5].text = "Wrist down";
    buttons[6].text = "Elbow up";
    buttons[7].text = "Shoulder up";
    buttons[8].text = "Base left";
    buttons[9].text = "Elbow down";
    buttons[10].text = "Shoulder down";
    buttons[11].text = "Base right";
    buttons[12].text = "Main menu";
    buttons[13].text = "Home position";
    buttons[14].text = "Play Routine";
    /* refresh the screen to include the buttons */
    refresh_screen();
}

/*
 * This function sets up the buttons for the routine screen
 */

int routine_button_setup() memset (&routinebuttons, 0, sizeof (routinebuttons)); /* set the width of the button */ routinebuttons [0].w = xres / 3; routinebuttons [1].w = xres / 3; routinebuttons [2].w = xres / 3; routinebuttons [3].w = xres / 3; /* set the height of the button */ routinebuttons [0].h = 50; routinebuttons [1].h = 50; routinebuttons [2].h = 50; routinebuttons [3].h = 50; /* set the x position on the screen of the button */ routinebuttons [0].x = (xres) / 4 - routinebuttons [0].w / 2; routinebuttons [1].x = (3 * xres) / 4 - routinebuttons [0].w / 2; routinebuttons [2].x = (xres) / 4 - routinebuttons [0].w / 2; routinebuttons [3].x = (3 * xres) / 4 - routinebuttons [0].w / 2; /* set the y position on the screen of the button */ routinebuttons [0].y = 70; routinebuttons [1].y = 70; routinebuttons [2].y = 170; routinebuttons [3].y = 170; /* add text to the center of the button */ routinebuttons [0].text = "Pick & Place"; routinebuttons [1].text = "Home position"; routinebuttons [2].text = "Main menu"; routinebuttons [3].text = "Dance"; /* refresh the screen to display menu buttons */ if(playing_routine) routinebuttons [0].text = "Stop routine"; else if(playing_dance_routine) routinebuttons [3].text = "Stop routine"; refresh_routine_screen(); /* * This function sets up the buttons for the information screen * */

int info_button_setup()
{
    memset (&infobutton, 0, sizeof (infobutton));

    /* set the width of the button */
    infobutton [0].w = xres / 2;
    /* set the height of the button */
    infobutton [0].h = 40;
    /* set the x position on the screen of the button */
    infobutton [0].x = (2 * xres) / 4 - infobutton [0].w / 2;
    /* set the y position on the screen of the button */
    infobutton [0].y = 220;
    /* add text to the centre of the button */
    infobutton [0].text = "Back to main menu";

    /* refresh the screen to display the info button */
    info_screen();
}

/*
 * This function checks that the robotic arm is connected.
 * If the CAN message received flag is set, the message
 * "Robot Arm Connected" is displayed on the screen.
 */
int Check_RA_Connected()
{
    if (CAN_Msg_rcvd) {
        CAN_Msg_rcvd = 0;
        put_string_center (xres/2, 45, "Robot Arm Connected", 3);
    }
}

/*
 * This function sets up the CAN network.
 * A CAN socket is created and bound.
 */
int can_setup()
{
    /* open the CAN socket */
    if ((s = socket(PF_CAN, SOCK_RAW, CAN_RAW)) < 0) {
        perror("socket");
        return 1;
    }

    strcpy(ifr.ifr_name, "can0");
    ioctl(s, SIOCGIFINDEX, &ifr);

    addr.can_family = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;

    if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }
}

/*
 * This function checks for a received CAN message.
 */
int Check_For_CAN_Msg()
{
    fd_set rdfs;

    FD_ZERO(&rdfs);
    FD_SET(s, &rdfs);

    printf("Checking for CAN Message...\n");
    CAN_Msg_rcvd = 0;

    socklen_t len = sizeof(addr);
    nbytes = read(s, &recframe, sizeof(recframe));
    if (nbytes < 0) {
        perror("can raw socket read");
        return 1;
    } else {
        printf("Message Received!\n");
        printf("can_received_message : ID %x, DLC %d, DATA %x\n",
               recframe.can_id, recframe.can_dlc, recframe.data[0]); /* first data byte */
        CAN_Msg_rcvd = 1;
    }
}

/*
 * This function calls the button set-up for the main menu and
 * performs the actions on touch screen events for that screen.
 */
int main_menu()
{
    /* draw the buttons & refresh the screen */
    menu_button_setup();
    Get_Touch_Position();

    if (screen_touched) {
        if ((xcoord < 3000) && (xcoord > 1000)) {
            if ((ycoord < 3100) && (ycoord > 2400)) {
                printf("Robot Control activated!\n");
                screen_value = CONTROL;
                button_setup(); /* creates the button size, position and text */
                robot_control();
            } else if ((ycoord < 2100) && (ycoord > 1400)) {
                printf("Pre-recorded Routines!\n");
                screen_value = ROUTINES;
                routine_button_setup();
                routines();
            } else if ((ycoord < 1100) && (ycoord > 400)) {
                printf("Information!\n");
                screen_value = INFO;
                info_button_setup();
                info();
            }
        }
    } /* end if screen_touched */
} /* end main_menu function */

/*
 * This function calls the button set-up for the robot control screen and
 * performs the actions on touch screen events for that screen.
 */
int robot_control()
{
    /* draw the buttons & refresh the screen */
    button_setup();
    Get_Touch_Position();

    if (screen_touched) {
        if ((xcoord < 3700) && (xcoord > 2800)) {
            if ((ycoord < 2900) && (ycoord > 2600)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_GRIPPER_OPEN, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_G_OPEN, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Open Gripper\n");
            } else if ((ycoord < 2300) && (ycoord > 2000)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_GRIPPER_CLOSE, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_G_CLOSE, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Close Gripper\n");
            } else if ((ycoord < 1800) && (ycoord > 1500)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_ELBOW_UP, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_E_UP, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Elbow up \n");
            } else if ((ycoord < 1300) && (ycoord > 1000)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_ELBOW_DOWN, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_E_DOWN, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Elbow down \n");
            } else if ((ycoord < 800) && (ycoord > 500)) {
                screen_value = MAIN;
                main_menu();
            }
        } else if ((xcoord < 2500) && (xcoord > 1500)) {
            if ((ycoord < 2900) && (ycoord > 2600)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_WRISTROTATE_L, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_WR_LEFT, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Wrist left \n");
            } else if ((ycoord < 2300) && (ycoord > 2000)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_WRISTROTATE_R, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_WR_RIGHT, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Wrist right \n");
            } else if ((ycoord < 1800) && (ycoord > 1500)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_SHOULDER_UP, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_S_UP, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Shoulder up \n");
            } else if ((ycoord < 1300) && (ycoord > 1000)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_SHOULDER_DOWN, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_S_DOWN, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Shoulder down \n");
            } else if ((ycoord < 800) && (ycoord > 500)) {
#ifdef NEW_ROBOT
                go_home();
#endif
                printf("Move robot home!\n");
            }
        } else if ((xcoord < 1200) && (xcoord > 300)) {
            if ((ycoord < 2900) && (ycoord > 2600)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_WRIST_UP, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_W_UP, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Wrist up \n");
            } else if ((ycoord < 2300) && (ycoord > 2000)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_WRIST_DOWN, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_W_DOWN, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Wrist down \n");
            } else if ((ycoord < 1800) && (ycoord > 1500)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_BASE_LEFT, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_B_LEFT, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Base Left \n");
            } else if ((ycoord < 1300) && (ycoord > 1000)) {
#ifdef OLD_ROBOT
                parse_canframe(MSG_BASE_RIGHT, &frame);
#endif
#ifdef NEW_ROBOT
                parse_canframe(MSG_B_RIGHT, &frame);
#endif
                new_can_msg_to_send = 1;
                printf("Base right \n");
            } else if ((ycoord < 800) && (ycoord > 500)) {
                printf("Routines Screen!\n");
                screen_value = ROUTINES;
                routine_button_setup();
                routines();
            }
        }
    }
}

/*
 * This function calls the button set-up for the routines screen and
 * performs the actions on touch screen events for that screen.
 */
int routines()
{
    /* draw the buttons & refresh the screen */
    routine_button_setup();
    Get_Touch_Position();

    if (screen_touched) {
        if ((xcoord < 3500) && (xcoord > 2300)) {
            if ((ycoord < 2800) && (ycoord > 2200)) {
                printf("Play routine -> P&P!\n");
#ifdef NEW_ROBOT
                playing_routine = !playing_routine;
                counter = 0;
                if (playing_routine) {
                    printf("Pick and place activated!\n");
                    refresh_routine_screen();
                } else {
                    routinebuttons [0].text = "Pick & Place";
                    put_string_center (xres/4, 250, "Routine Done", 1);
                    refresh_routine_screen();
                }
#endif
            } else if ((ycoord < 1300) && (ycoord > 800)) {
                printf("Back to main menu!\n");
                screen_value = MAIN;
                main_menu();
            }
        } else if ((xcoord < 1500) && (xcoord > 400)) {
            if ((ycoord < 2800) && (ycoord > 2200)) {
                printf("Move robot home!\n");
#ifdef NEW_ROBOT
                go_home();
#endif
            } else if ((ycoord < 1300) && (ycoord > 800)) {
                printf("Play routine -> dance!\n");
#ifdef NEW_ROBOT
                playing_dance_routine = !playing_dance_routine;
                counter = 0;
                if (playing_dance_routine) {
                    printf("Dance Routine activated!\n");
                    refresh_routine_screen();
                } else {
                    routinebuttons [3].text = "Dance";
                    put_string_center (xres/4, 250, "Routine Done", 1);
                    refresh_routine_screen();
                }
#endif
            }
        }
    } // end if (screen_touched)
}

/*
 * This function calls the button set-up for the information screen and
 * performs the actions on touch screen events for that screen.
 */
int info()
{
    info_button_setup();
    Get_Touch_Position();

    if (screen_touched) {
        if ((xcoord < 3000) && (xcoord > 1000)) {
            if ((ycoord < 600) && (ycoord > 300)) {
                screen_value = MAIN;
                main_menu();
            }
        }
    }
}

int Get_Touch_Position()
{
    if (screen_touched) {
        ycoord = 0;
        xcoord = 0;
        screen_touched = 0;
        ev[1].value = 0;
        ev[1].type = 0;
    }

    if (ev[1].code == 0 && ev[1].type == 3) {
        printf("X Coordinate: %d\n", ev[1].value);
        xcoord = ev[1].value;
        if (ycoord > 0) {
            screen_touched = 1;
            printf("Screen touched \n");
        }
        usleep(100);
    }
    if (ev[1].code == 1 && ev[1].type == 3) {
        printf("Y Coordinate: %d\n", ev[1].value);
        ycoord = ev[1].value;
        if (xcoord > 0) {
            screen_touched = 1;
            printf("Screen touched \n");
        }
        usleep(100);
    }
}

/*
 * This function moves the robot to the home position
 */
int go_home()
{
    /* home position */
    parse_canframe("31D#0008", &frame);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();

    parse_canframe("31A#0008", &frame);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();

    parse_canframe("31E#0008", &frame);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();

    parse_canframe("31B#0008", &frame);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();

    parse_canframe("31C#0008", &frame);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();

    parse_canframe("319#0008", &frame);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();

    printf("Home position \n");
}

/*
 * This function sends a version request to the connected robot.
 */
int version_req()
{
    /* send a version request */
    parse_canframe(MSG_VERSION_REQ, &frame);
    printf("Version Request\n");

    /* send the frame */
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
}

/*
 * This function performs the sending of a CAN message
 * and clears the message-to-send flag.
 */
int send_can()
{
    new_can_msg_to_send = 0;
    printf("sending can message\n");

    /* send the frame */
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
}

static int asc2nibble(char c)
{
    if ((c >= '0') && (c <= '9'))
        return c - '0';
    if ((c >= 'A') && (c <= 'F'))
        return c - 'A' + 10;
    if ((c >= 'a') && (c <= 'f'))
        return c - 'a' + 10;
    return 16; // error
}

int parse_canframe(char *cs, struct can_frame *cf)
{
    // documentation: see lib.h
    int i, idx, dlc, len, tmp;

    len = strlen(cs);
    memset(cf, 0, sizeof(*cf)); // init CAN frame, e.g. DLC = 0

    if (len < 4)
        return 1;

    if (!((cs[3] == CANID_DELIM) || (cs[8] == CANID_DELIM)))
        return 1;

    if (cs[8] == CANID_DELIM) { // 8 digits
        idx = 9;
        for (i = 0; i < 8; i++) {
            if ((tmp = asc2nibble(cs[i])) > 0x0F)
                return 1;
            cf->can_id |= (tmp << (7 - i) * 4);
        }
        if (!(cf->can_id & CAN_ERR_FLAG)) // 8 digits but no error frame?
            cf->can_id |= CAN_EFF_FLAG;   // then it is an extended frame
    } else { // 3 digits
        idx = 4;
        for (i = 0; i < 3; i++) {
            if ((tmp = asc2nibble(cs[i])) > 0x0F)
                return 1;
            cf->can_id |= (tmp << (2 - i) * 4);
        }
    }

    if ((cs[idx] == 'R') || (cs[idx] == 'r')) { // RTR frame
        cf->can_id |= CAN_RTR_FLAG;
        return 0;
    }

    for (i = 0, dlc = 0; i < 8; i++) {
        if (cs[idx] == DATA_SEPERATOR) // skip (optional) separator
            idx++;
        if (idx >= len) // end of string => end of data
            break;
        if ((tmp = asc2nibble(cs[idx++])) > 0x0F)
            return 1;
        cf->data[i] = (tmp << 4);
        if ((tmp = asc2nibble(cs[idx++])) > 0x0F)
            return 1;
        cf->data[i] |= tmp;
        dlc++;
    }
    cf->can_dlc = dlc;

    return 0;
}

void fprint_canframe(FILE *stream, struct can_frame *cf, char *eol, int sep)
{
    // documentation: see lib.h
    char buf[sizeof(MAX_CANFRAME) + 1]; // max length

    sprint_canframe(buf, cf, sep);
    fprintf(stream, "%s", buf);
    if (eol)
        fprintf(stream, "%s", eol);
}

void sprint_canframe(char *buf, struct can_frame *cf, int sep)
{
    // documentation: see lib.h
    int i, offset;

    if (cf->can_id & CAN_ERR_FLAG) {
        sprintf(buf, "%08X#", cf->can_id & (CAN_ERR_MASK | CAN_ERR_FLAG));
        offset = 9;
    } else if (cf->can_id & CAN_EFF_FLAG) {
        sprintf(buf, "%08X#", cf->can_id & CAN_EFF_MASK);
        offset = 9;
    } else {
        sprintf(buf, "%03X#", cf->can_id & CAN_SFF_MASK);
        offset = 4;
    }

    if (cf->can_id & CAN_RTR_FLAG) { // there are no ERR frames with RTR
        sprintf(buf + offset, "R");
    } else {
        for (i = 0; i < cf->can_dlc; i++) {
            sprintf(buf + offset, "%02X", cf->data[i]);
            offset += 2;
            if (sep && (i + 1 < cf->can_dlc))
                sprintf(buf + offset++, ".");
        }
    }
}

int Play_Routine(int count_value)
{
    int xx = 0;

    /* send the frame */
    put_string_center (xres/4, 250, "Routine Busy ...", 3);
    parse_canframe(pick_and_place[count_value], &frame);
    printf("can_frame: %s\n", pick_and_place[count_value]);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();
}

int Play_Dance_Routine(int count_value)
{
    int xx = 0;

    /* send the frame */
    put_string_center (xres/4, 250, "Routine Busy ...", 3);
    parse_canframe(dance[count_value], &frame);
    printf("can_frame: %s\n", dance[count_value]);
    if ((nbytes = write(s, &frame, sizeof(frame))) != sizeof(frame)) {
        perror("write");
        return 1;
    }
    Check_For_CAN_Msg();
}

int main(int ac, char **av)
{
    char buffer[128];
    int timeout = 200;     /* timeout in msec */
    struct pollfd pfds[2]; /* poll file descriptors */
    int result;

    if (open_framebuffer()) {
        close_framebuffer();
        printf("framebuffer not opened\n");
        exit(1);
    }

    for (colorindex = 0; colorindex < NR_COLORS; colorindex++)
        setcolor (colorindex, palette [colorindex]);

    /* display a welcome message on screen before start-up */
    welcome_screen();
    sleep(5); /* display the welcome screen for 5 seconds */

    screen_value = MAIN;
    menu_button_setup();
    can_setup(); /* sets up the CAN transceiver (can0) and socket s */

    /* print to the terminal to show the app is running correctly */
    printf("********************************************\n");
    printf("*************Hello, World********************\n");
    printf("*********This is Roisin's Robotic Arm ***********\n");
    printf("*******Showcase Demonstration App! ***********\n");
    printf("********************************************\n");
    printf("********************************************\n");

    pfds[0].fd = open("/dev/input/event1", O_RDONLY);
    if (pfds[0].fd < 0) {
        printf("failed to open /dev/input/event1\n");
        exit (1);
    }
    pfds[0].events = POLLIN;
    ioctl(pfds[0].fd, EVIOCGNAME(sizeof(name)), name);

    /* send a version request to the robotic arm */
    version_req();

#ifdef NEW_ROBOT
    /* check that CAN messages are being received - 2 micros responding */
    Check_For_CAN_Msg();
    Check_For_CAN_Msg();
    /* check that the robotic arm is connected */
    Check_RA_Connected();
    /* send the robot home message */
    go_home();
#endif

    counter = 0;
    dcounter = 0;

    while (1) {
        result = poll (pfds, 1, timeout);
        switch (result) {
        case 0:
            printf ("timeout\n");
            break;
        case -1:
            printf ("poll error \n");
            exit (1);
        default:
            if (pfds[0].revents & POLLIN) {
                rd = read (pfds[0].fd, ev, sizeof(struct input_event) * 64);
                if (rd == 0) {
                    printf ("/dev/input/event1 done\n");
                    exit (0);
                }
            } else {
                rd = 0;
            }

            if (screen_value == MAIN) {
                /* call the main menu function */
                main_menu();
            } else if (screen_value == CONTROL) {
                /* call the robot control screen & routine */
                button_setup(); /* creates the button size, position and text */
                robot_control();
            } else if (screen_value == ROUTINES) {
                /* call the routines screen & routine */
                routine_button_setup();
                routines();
            } else if (screen_value == INFO) {
                /* call the information screen & routine */
                info_button_setup();
                info();
            }
        } /* end case statement */

        if ((new_can_msg_to_send == 1) && (playing_routine == 0))
            send_can();

        if (playing_routine) {
            Play_Routine(counter);
            printf("counter : %d\n", counter);
            if (counter < 67)
                counter++;
            else
                counter = 0;
        }

        if (playing_dance_routine) {
            Play_Dance_Routine(dcounter);
            printf("counter : %d\n", dcounter);
            if (dcounter < 28)
                dcounter++;
            else
                dcounter = 0;
        }
    } /* end while loop */

    close(fd);
} /* end of main function */

CANCommands.h

/*
 * Roisin Howard
 * 0850896
 * Analog Devices
 * February 2012
 */

#ifndef CANCOMMANDS_H
#define CANCOMMANDS_H

//#define OLD_ROBOT
#define NEW_ROBOT

#ifndef EV_SYN
#define EV_SYN 0
#endif

/* state variables */
#define MAIN     0
#define CONTROL  1
#define ROUTINES 2
#define INFO     3

#define CANID_DELIM    '#'
#define DATA_SEPERATOR '.'

#define AF_CAN 29 /* Controller Area Network */
#define PF_CAN AF_CAN

#define BUF_SIZE 512
#define MAX_CANFRAME "12345678#01.23.45.67.89.AB.CD.EF"

#define PWM_FREQVALUE 0x3200 /* set PWM freq to 50Hz; 20ms period */

#define START    0x00
#define MSG_STOP 0x01

/* Add 0x300 to these for the CAN message ID sent to Robot IDs 1 & 2 */
/* Add 0x100 for response by Robot ID 1 */
/* Add 0x200 for response by Robot ID 2 */
#define MSG_VREQ 0x08
#define MSG_VERSION_REQ "388#R"

/* Add 0x300 to these for the CAN message IDs sent to Robot IDs 1 & 2 */
#define MSG_BASE        0x09
#define MSG_SHOULDER    0x0A
#define MSG_ELBOW       0x0B
#define MSG_WRIST       0x0C
#define MSG_GRIPPER     0x0D
#define MSG_WRISTROTATE 0x0E
#define ROBOTHOME       0x0F

/* old robot commands */
#define MSG_GRIPPER_CLOSE "38D#02"
#define MSG_GRIPPER_OPEN  "38D#82"
#define MSG_WRIST_UP      "38C#82"
#define MSG_WRIST_DOWN    "38C#02"
#define MSG_ELBOW_DOWN    "38B#02"
#define MSG_ELBOW_UP      "38B#82"
#define MSG_SHOULDER_DOWN "38A#02"
#define MSG_SHOULDER_UP   "38A#82"
#define MSG_BASE_RIGHT    "389#02"
#define MSG_BASE_LEFT     "389#82"
#define MSG_WRISTROTATE_R "38E#02"
#define MSG_WRISTROTATE_L "38E#82"
#define MSG_ROBOTHOME     "38F#00"

/* Define the robot IDs */
#define ROBOT_ID_1 0x100
#define ROBOT_ID_2 0x200

/* new robot commands */
#define MSG_G_CLOSE  "38D#0A"
#define MSG_G_OPEN   "38D#0F"
#define MSG_W_UP     "38C#0A"
#define MSG_W_DOWN   "38C#0F"
#define MSG_E_DOWN   "38B#0A"
#define MSG_E_UP     "38B#0F"
#define MSG_S_UP     "38A#0A"
#define MSG_S_DOWN   "38A#0F"
#define MSG_B_RIGHT  "389#0A"
#define MSG_B_LEFT   "389#0F"
#define MSG_WR_RIGHT "38E#0A"
#define MSG_WR_LEFT  "38E#0F"

/* Commands to send a motor to a given position */
#define MSG_POS_BASE        0x19 /* Add 0x200 for Robot ID 2 */
#define MSG_POS_SHOULDER    0x1A /* Add 0x100 for Robot ID 1 */
#define MSG_POS_ELBOW       0x1B /* Add 0x200 for Robot ID 2 */
#define MSG_POS_WRIST       0x1C /* Add 0x100 for Robot ID 1 */
#define MSG_POS_GRIPPER     0x1D /* Add 0x200 for Robot ID 2 */
#define MSG_POS_WRISTROTATE 0x1E /* Add 0x100 for Robot ID 1 */

/* MSG_STATUS_X not implemented */
#define MSG_STATUS_BASE        0x29 /* Add 0x200 for Robot ID 2 */
#define MSG_STATUS_SHOULDER    0x2A /* Add 0x100 for Robot ID 1 */
#define MSG_STATUS_ELBOW       0x2B /* Add 0x200 for Robot ID 2 */
#define MSG_STATUS_WRIST       0x2C /* Add 0x100 for Robot ID 1 */
#define MSG_STATUS_GRIPPER     0x2D /* Add 0x200 for Robot ID 2 */
#define MSG_STATUS_WRISTROTATE 0x2E /* Add 0x100 for Robot ID 1 */

/* Report of motor position after movement/stop */
#define MSG_MOVED_BASE        0x39 /* Add 0x200 for Robot ID 2 */
#define MSG_MOVED_SHOULDER    0x3A /* Add 0x100 for Robot ID 1 */
#define MSG_MOVED_ELBOW       0x3B /* Add 0x200 for Robot ID 2 */
#define MSG_MOVED_WRIST       0x3C /* Add 0x100 for Robot ID 1 */
#define MSG_MOVED_GRIPPER     0x3D /* Add 0x200 for Robot ID 2 */
#define MSG_MOVED_WRISTROTATE 0x3E /* Add 0x100 for Robot ID 1 */

/* Home positions from Lynxmotion */
#define WRISTROTATE_HOME 0x3C0 /* (640000/(1/1.5ms)) */
#define GRIPPER_HOME     0x3C0 /* (640000/(1/1.5ms)) */
#define WRIST_HOME       0x3C0 /* (640000/(1/1.5ms)) */
#define ELBOW_HOME       0x3C0 /* (640000/(1/1.5ms)) */
#define SHOULDER_HOME    0x3C0 /* (640000/(1/1.5ms)) */
#define BASE_HOME        0x3C0 /* (640000/(1/1.5ms)) */

/* Measured max/min positions for Base Rotate and Shoulder */
#define BASE_MAX     0x640
#define BASE_MIN     0x190
#define SHOULDER_MAX 0x4B0
#define SHOULDER_MIN 0x230

/* Max/min motor positions from Lynxmotion */
#define MOTOR_MAX 0x5A0
#define MOTOR_MIN 0x1E0

#define ROUTINE_SIZE 10

/* pick and place routine CAN messages */
static char* pick_and_place[68] = {
    "319#430B", "31B#6E05", "31A#2806", "31D#0A05", "31E#0F07", "31C#6304",
    "31A#5F06", "31B#2A07", "31D#3808", "31A#2408", "31A#4708", "319#4B05",
    "31B#4206", "31C#4007", "31E#4C05", "319#430B", "31E#0F07", "31C#6304",
    "31A#5F06", "31B#2A07", "31D#5104", "31A#4708", "319#4B05", "31B#4206",
    "31C#4007", "31E#4C05", "319#430B", "31E#0F07", "31C#6304", "31A#5F06",
    "31B#2A07", "31D#3808", "31A#4708", "319#4B09", "31B#4206", "31C#4007",
    "31E#4C05", "31E#0F07", "31C#6304", "31A#5F06", "31B#2A07", "31D#5104",
    "31A#4708", "319#430B", "31B#4206", "31C#4007", "31E#4C05", "319#4B09",
    "31E#0F07", "31C#6304", "31A#5F06", "31B#2A07", "31D#3808", "31A#4708",
    "319#430B", "31B#4206", "31C#4007", "31E#4C05", "31E#0F07", "31C#6304",
    "31A#5F06", "31B#2A07", "31D#5104", "31A#5707", "31C#5807", "31B#2E07",
    "319#0008", "31E#0E07"
};

/* dance routine CAN messages */
static char* dance[29] = {
    "31B#4104", "31B#6303", "31E#790A", "31E#6503", "31E#5D07", "31D#1B0B",
    "31D#6003", "31D#4A07", "31C#2A0A", "31C#1505", "31C#2B07", "319#3103",
    "319#5807", "319#220C", "31A#4B07", "31A#6905", "31D#3C04", "31D#590A",
    "31D#5807", "31E#470A", "31E#2704", "31E#4007", "31A#0107", "31B#2E07",
    "319#2A07", "31A#3D07", "31B#7304", "31B#2408", "31B#1907"
};

#endif

makefile

CC = bfin-linux-uclibc-gcc
#CFLAGS = -o2

app: hello_world.o fbutils.o font_8x8.o
	$(CC) $(CFLAGS) -o app hello_world.o fbutils.o font_8x8.o -ldl

hello_world.o: hello_world.c lib.h CANCommands.h fbutils.h
	$(CC) $(CFLAGS) -c hello_world.c

fbutils.o: fbutils.c fbutils.h font.h
	$(CC) $(CFLAGS) -c fbutils.c

font_8x8.o: font_8x8.c font.h
	$(CC) $(CFLAGS) -c font_8x8.c

Appendices

Appendix D

CAN message definitions

Version Request

Remote Transmission Request (Sent by Control Application)

Message ID*   Responders          Data
308           Micro 1, Micro 2    None (RTR, DLC is 8)

Response (Sent by Micro)

Message ID*   Responder   Data
108           Micro 1     00 00 00 xx 00 00 00 xx
208           Micro 2     00 00 00 xx 00 00 00 xx

Description:

The control application can send a version request. Both microcontrollers will receive this request and respond with version information.

The response has the major version held in data byte 3 and the minor version held in data byte 7.

*All of the message IDs include the micro ID (i.e. micro 1 or micro 2). This forms the 4 most significant bits in the message ID.

Start/Stop Motor

(Sent by Control Application)

Motor          Used by   Message ID   Data (Forward)   Data (Reverse)
Base           Micro 2   309          0A               0F
Shoulder       Micro 1   30A          0A               0F
Elbow          Micro 2   30B          0A               0F
Wrist          Micro 1   30C          0A               0F
Gripper        Micro 2   30D          0A               0F
Wrist Rotate   Micro 1   30E          0A               0F

Description:

If the relevant motor is stationary and data byte 0 is 0A (10 in decimal), the micro will start increasing the duty cycle (PWM register value) for that motor until the maximum duty cycle is reached (or another command is received).

Similarly, if data byte 0 is 0F (15 in decimal), the PWM value will be decremented until the minimum is reached (or another command is received).

If the micro is moving a motor clockwise/counter-clockwise and another clockwise/counter-clockwise command is received, the motor will stop moving and the relevant micro will send a motor position report.

Position Motor

(Sent by Control Application)

Motor          Used by   Message ID   Data
Base           Micro 2   319          xx xx
Shoulder       Micro 1   31A          xx xx
Elbow          Micro 2   31B          xx xx
Wrist          Micro 1   31C          xx xx
Gripper        Micro 2   31D          xx xx
Wrist Rotate   Micro 1   31E          xx xx

Description:

The relevant micro takes the two data bytes as a value to reach for a given motor's PWM register (i.e. a "goal"). If the current PWM value is less than the goal, the value is incremented until the goal is reached (or the maximum value for that motor). If it is more than the goal, the value is similarly decremented.

When the goal or max/min is reached, the micro will send a motor position report.

Data byte encoding

The data bytes are encoded as follows: data byte 0 (sent first) holds the lower 7 bits, while data byte 1 holds the upper bits (only 5 are needed). An example is shown below:

PWM value in decimal:       1243
PWM value in hexadecimal:   0x4DB
PWM value in binary:        100 1101 1011
Lower seven bits:           101 1011 = 0x5B
Upper bits:                 1001 = 0x09
CAN message data: 5B 09

Go To Home Position

(Sent by Control Application)

Message ID: 0x30F

Description:

Micros 1 and 2 set goals for all motors to go to the home position. When they reach the home position, the position is reported.

Motor Position Report

(Sent by Micro 1/2 to Control Panel App)

Motor          Sender    Message ID   Data
Base           Micro 2   239          xx xx
Shoulder       Micro 1   13A          xx xx
Elbow          Micro 2   23B          xx xx
Wrist          Micro 1   13C          xx xx
Gripper        Micro 2   23D          xx xx
Wrist Rotate   Micro 1   13E          xx xx

Description:

When the micro records a given motor as being stationary, it reports the PWM value (motor position). Data bytes are encoded as above.

Appendix E

Task (Duration): Description

Choose Robotic Arm (2 weeks): A suitable robotic arm, with servo motors, had to be chosen so as to provide absolute control.

Choose a microcontroller - ADuC7026 [32] (1 week): This microcontroller was chosen because it had SPI, UART and PWM functionality.

Development work with ADuC7026 (CAN) (6 weeks): Work got underway with this microcontroller. The MCP2515 CAN controller was connected and SPI commands were set up to communicate. There were problems with the SPI RX buffer: there wasn't enough of a delay before reading the register.

Servo testing (1 week): Testing the functionality of the servo motors, working out the pulse width time and duty cycle and converting it into the value to put into the register on the microcontroller.

Development work with ADuC7026 (PWM) (3 weeks): Setting up the PWM to have a 20 ms period and creating a function to set up the pulse width of each output.

Checking the CAN network (2 weeks): Testing to see if the CAN messages were being sent across the network with an old PC application that was modified for this Robotic Arm.

Choosing a new microcontroller vs shift register (4 weeks): ADuC7060 [33] vs ADuC7128 vs shift register - keeping the original microcontroller and using a shift register to provide control of the 6 servo motors. Implementation work with the shift register was needed to see if this method was feasible.

Development work with ADuC7060 (4 weeks): Transferring the developed code over to a new microcontroller and testing to see if the same implementation was possible.

Development work with ADuC7128 (4 weeks): Transferring the code over to this microcontroller and testing that the previous implementation works.

Setting up the BlackFin BF548 ezkit (10 weeks): Researching the BlackFin, setting up a virtual machine to run a Unix/Linux environment, downloading µClinux and building the kernel around the required interface and drivers.

Develop and test the SPI interface (1 week): Test the functionality of the SPI interface on the ADuC7128.

Develop and test the PWM outputs (1 week): Test that the PWM outputs work accordingly with the servo motors on the Robotic Arm using the ADuC7128.

Update the existing PC application (2 weeks): Update the original application to work with the new robotic arm & test its functionality.

Testing CAN messages (1 week): Get the CAN messages delivering across the CAN bus using the PC application board as the other CAN node.

Schematic & board layout (3 weeks): Draw up the schematic for the Robotic Arm board and get the board manufactured. Testing then has to be performed on this board to make sure everything is still functioning accordingly.

Set up kernel & application (3 weeks): Start research on developing applications for the BlackFin & set up the kernel appropriately.

Design of user interface (2 weeks): Research ergonomic design of touch screens & design the user interface accordingly.

Develop application (6 weeks): Begin coding the application and testing its functionality.

Test CAN network (2 weeks): With both nodes working as required, the next step is to test these nodes together on the CAN network.

Create suitable controls for the demonstration (2 weeks): Add suitable control screens for the servo motors and a screen to play back the pre-recorded routines. An information screen with instructions for the user is also useful for the application.