Department of Electrical & Computer Engineering Technology (ECET)
Division of Engineering, Computer Programming, and Technology (ECPT)
EET 4950
Senior Design Project
Ubityke: An Interactive Baby Monitor
Submitted by:
Alejandro Neira and Eric Hahn
Supervised by
Professor Ali Notash
4/10/2020
Abstract
Alejandro Neira and Eric Hahn propose to develop an interactive baby monitor with a
customizable user interface by which to control the features and capabilities of the device.
These features include, during the first iteration of the product (and are not limited to
these in subsequent models): day and night video monitoring, a microphone,
programmable sounds and playlists, a nightlight, temperature and humidity monitoring,
and a servo-driven, decorative mobile.
The user interface provided during prototype development shall be an application
where the user can: toggle a nightlight, view the camera display, monitor temperature and
humidity, listen to the environment with a microphone, and manually operate the mobile
and audio output, or set them to operate off of a feedback loop initiated when the baby
moves. Future developments could allow for closed-loop operation of certain features
based on temperature and humidity inputs. Data logging and statistics such as a rolling
average of hours baby has rested correlated to environmental conditions are also
possibilities.
Ubityke could provide an economical solution which serves not only the tech-savvy, but
also lower-income families and child-care facilities. Even as a prototype, it is priced
below the costliest baby monitor on the market. If the modules were miniaturized and
integrated onto one PCB, and components were bought in bulk, the cost could be driven
down far enough for Ubityke to become a disruptive technology.
Acknowledgements
Of paramount importance, it is necessary to humbly and gratefully acknowledge our
advisors, Dr. Masood Ejaz, Program Chair and Professor of the Division of Engineering,
Computer Programming, and Technology, and Professor Ali Notash. Both are venerated
as luminaries of academia and industry, and without whose selflessness and generosity, so
many would not be able to realize their full potential and contribute to humanity as they
do. Professor Gerry Reed was also instrumental to this project and engaged in altruistic
service by helping with algorithms and general code structure.
The Ubityke team will be forever grateful for the guidance and the opportunity to
learn mechanical design principles from Carlos Casteleiro. Additionally, without the
OpenCV methods and templates developed by Dr. Adrian Rosebrock from
pyimagesearch this project would not be possible.
This proposal and design team would also like to pay tribute to L3Harris and
Siemens for fostering work environments which nurture personal growth and
development by allowing flexible work schedules and tuition reimbursement.
Table of Contents
Abstract .............................................................................................................................. ii
Acknowledgements .......................................................................................................... iii
Table of Contents ............................................................................................................. iv
List of Figures ................................................................................................................... vi
List of Tables ................................................................................................................. viii
Chapter 1 Introduction...................................................................................................... 1
1.1 Introduction ................................................................................................................ 2
1.2 Motivation .................................................................................................................. 2
1.3 Research ..................................................................................................................... 3
1.3.1 Comparable Products ........................................................................................... 3
1.3.2 Raspberry Pi Capabilities .................................................................................... 4
1.3.3 Python 3 and OpenCV ......................................................................................... 5
1.3.4 Motion Detection and Image Processing ............................................................. 6
1.3.5 Python 3 and Libraries ......................................................................................... 8
1.3.6 HTML, CSS, and JavaScript ............................................................................... 9
Chapter 2 Statement of Work ......................................................................................... 10
2.1 Technology Overview .............................................................................................. 11
2.1.1 Interconnect Diagram ........................................................................................ 11
2.1.2 Engineering Requirements Table ...................................................................... 12
2.1.3 Project Timeline ................................................................................................ 14
2.2 System Architecture ................................................................................................. 14
2.2.1 Raspberry Pi4 v.2 .............................................................................................. 13
2.2.2 OV5647 CMOS Camera.................................................................................... 15
2.2.3 Servo Motor ....................................................................................................... 16
2.2.4 Audio I/O ........................................................................................................... 16
2.2.5 Environmental Sensors ...................................................................................... 17
2.2.6 Nightlight ........................................................................................................... 18
2.2.7 Mobile................................................................................................................ 18
2.2.8 Power Considerations ........................................................................................ 19
2.2.9 Safety Considerations ........................................................................................ 21
2.3 Mechanical Design ................................................................................................... 21
2.3.1 Packaging and Materials.................................................................................... 22
2.3.2 Thermal Management ........................................................................................ 23
2.4 Electrical and Software Interface ............................................................................. 24
2.4.1 Basic Operation – Possible Configurations ....................................................... 24
2.4.2 Programming Environment ............................................................................... 24
2.4.3 User Interface .................................................................................................... 25
2.4.4 Power and Compute Budget .............................................................................. 25
2.5 Success Criterion ...................................................................................................... 26
Chapter 3 Contribution ................................................................................................... 28
3.1 Design Implementation ............................................................................................ 29
3.1.1 Software Development ...................................................................................... 33
3.1.2 User Interface ..................................................................................... 33
3.1.3 Web Framework ................................................................................................ 34
3.1.4 Motion Detection Algorithm ............................................................................. 35
3.1.5 Event Detection Triggering Algorithm ............................................................. 36
3.1.6 Functionality of Standalone Modules ................................................................ 38
3.1.7 Validation of Standalone Modules .................................................................... 40
3.1.8 Module Compatibility With Software Interface ................................................ 41
3.1.9 Module Stress Testing ....................................................................................... 44
3.2 Mechanical Design ................................................................................................... 45
3.2.1 Critical Dimensions ........................................................................................... 46
3.2.2 Solid Models ...................................................................................................... 47
3.2.3 3-D Printing ....................................................................................................... 49
3.2.4 Assembly and Mounting Scheme ...................................................................... 51
3.3 System Integration .................................................................................................... 53
3.3.1 System Functionality ......................................................................................... 55
3.3.2 System Validation ............................................................................................. 55
3.3.3 System Compatibility With Software Interface ................................................ 55
3.3.4 System Stress Testing ........................................................................................ 57
3.4 System Validation and Testing ................................................................................ 57
3.4.1 Daytime Use Cases ............................................................................................ 61
3.4.2 Low-Light Use Cases ........................................................................................ 61
3.4.3 Edge Cases......................................................................................................... 62
3.5 Lessons Learned ....................................................................................................... 63
Chapter 4 Non-Technical Aspects .................................................................................. 65
4.1 Environmental, Health, and Safety Concerns .......................................................... 66
4.2 Ethical Concerns ...................................................................................................... 66
4.3 Social Concerns ........................................................................................................ 66
4.4 Economic Impact ...................................................................................................... 67
4.5 Budget ...................................................................................................................... 67
Chapter 5 Summary and Conclusion ............................................................................. 72
5.1 Summary and Conclusions ....................................................................................... 75
5.2 Suggestions for Future Improvements ..................................................................... 62
References ......................................................................................................................... 77
Appendix A: Equations ................................................................................................... 79
Appendix B: Linux Shell Commands ............................................................................. 80
Appendix C: Python Code............................................................................................... 82
Appendix D: HTML and JavaScript Code .................................................................... 96
Appendix E: Datasheets ................................................................................................ 103
Biography........................................................................................................................ 112
List of Figures
Figure 1-1 Nanit Plus Camera .......................................................................................3
Figure 1-2 JavaScript AJAX Function in file \Ubityke\static\script.js ..........................9
Figure 2-1 Interconnect Diagram.................................................................................11
Figure 2-2 Project Timeline .........................................................................................14
Figure 2-3 Raspberry Pi4 Model B ..............................................................................15
Figure 2-4 OV5647 Pi4 Camera ..................................................................................15
Figure 2-5 360˚ Servo Motor .......................................................................................16
Figure 2-6 USB Microphone .......................................................................................16
Figure 2-7 3W Mini Speaker .......................................................................................17
Figure 2-8 DHT22 Temperature/Humidity Sensor ......................................................18
Figure 2-9 Mobile Frame .............................................................................................19
Figure 2-10 Nightlight Circuit .......................................................................................19
Figure 2-11 Initial Packaging Concept ..........................................................................23
Figure 2-12 CPU and GPU Temperature.......................................................................25
Figure 3-1 Field of View and Lens Selection ..............................................................30
Figure 3-2 Design Implementation ..............................................................................31
Figure 3-3 Servo Command Example .........................................................................32
Figure 3-4 Subsystem Testing Example ......................................................................33
Figure 3-5 Web Framework Overview ........................................................................34
Figure 3-6 LED Current Draw .....................................................................................39
Figure 3-7 Servo Current Draw ...................................................................................39
Figure 3-8 Ubityke.py Flowchart ................................................................................43
Figure 3-9 CPU Stress Test .........................................................................................45
Figure 3-10 Mechanical Design.....................................................................................46
Figure 3-11 Critical Dimensions ...................................................................................47
Figure 3-12 Ubityke Housing.........................................................................................48
Figure 3-13 Ubityke Exploded View .............................................................................49
Figure 3-14 3-D Printer .................................................................................................50
Figure 3-15 Finished Product ........................................................................................50
Figure 3-16 Nightlight Circuit Wiring ...........................................................................51
Figure 3-17 Ubityke Mounting Scheme .........................................................................52
Figure 3-18 Ubityke-Raspberry Pi Interface ..................................................................53
Figure 3-19 Ubityke Circuit Schematic .........................................................................53
Figure 3-20 Integrated System.......................................................................................54
Figure 3-21 System Functionality when Turning Servo On ..........................................55
Figure 3-22 System Compatibility – Raspbian OS........................................................56
Figure 3-23 Trial 1 Time Delta Between Actual and Sensor Readings ........................59
Figure 3-24 Trial 2 Time Delta Between Actual and Sensor Readings ........................60
Figure 3-25 Trial 3 Time Delta Between Actual and Sensor Readings ........................60
Figure 3-26 Night Vision with Mobile Interface ...........................................................61
List of Tables
Table 1-1 Product Comparison Table ..........................................................................4
Table 1-2 Python3 Libraries used in Ubityke ..............................................................8
Table 2-1 Engineering Requirements Table ...............................................................12
Table 2-2 Raspberry Pi Current Draw ........................................................................20
Table 2-3 Success Criterion .......................................................................................27
Table 3-1 Sensor Data Log .........................................................................................35
Table 3-2 Hardware Validation Testing .....................................................................41
Table 3-3 System Validation Testing .........................................................................58
Table 3-4 Lessons Learned.........................................................................................63
Table 4-1 Proposed Bill of Material ...........................................................................68
Table 4-2 Actual Material Consumed ........................................................................69
Table 4-3 Final Configuration Bill of Material ..........................................................70
Chapter 1
Introduction
1.1 Introduction
1.2 Motivation
1.3 Research
Summary
In this chapter, the original concept and motivation for the project are introduced
as well as the problem this project is intended to solve. Market research and
technology feasibility are presented in a cursory manner to give the reader an
understanding of how Ubityke compares to and differs from current technology.
1.1 Introduction
A new life brings monumental joy and even more responsibility. The primary duty of
parents during the first several months of a baby’s life is to ensure the health, safety, and
uninhibited development of the child. A myriad of possibilities exists when it comes to
choosing the right equipment to monitor the baby. Currently, there are devices on the
market which can provide closed-circuit monitoring of the baby, while providing active
two-way communication. Other devices such as walkie-talkie style baby monitors
provide passive audio monitoring of the baby’s environment. A separate lullaby machine
and mobile are typically purchased to soothe the baby. Some smart baby monitors have
built-in temperature and humidity sensors which can activate alarms.
Ubityke integrates all the aforementioned devices into one mechanism and
provides a fully customizable user interface with which to access and control the different
features. A web-based application, and eventually a smartphone application will allow
the user to actively engage the different features or set them to operate off of feedback
from pre-configured settings. For example, if baby is crying, one could push a button and
play baby’s favorite song, or perhaps turn on the nightlight and spin the mobile. What if
the parents are asleep? Ubityke could perhaps be configured to play a lullaby if baby’s
movement is detected over a predetermined period of time or number of camera frames analyzed
by the processor.
1.2 Motivation
The burden of parenthood is arduous enough without the worry of knowing that the right
equipment was purchased to ensure the safety, security, and serenity of a child. The
catalyst for this project arose from my own journey as both a parent and an engineer. My
wife and I purchased several different devices to offer our child an environment of
comfort, safety and delight. We spent countless hours researching and scrutinizing baby
monitors, security cameras, white noise/lullaby machines, and baby mobiles. When our
daughter was finally born, we had an assortment of devices with neither a common
interface, nor any smart capabilities. A device suite which provides the functionality of
all the aforementioned gizmos would not only provide cost benefits, but having them all
integrated with a common user interface would limit the frustration of having to manage a
host of different baby hardware.
1.3 Research
1.3.1 Comparable Products
Research on different types of baby monitoring devices was conducted to gain insight
into which technologies cornered the market share. Most products in the ‘smart’ class
offered the same functionality. Perhaps the most innovative was the Nanit Plus Camera.
The camera was named a “Best Invention of 2018” by TIME Magazine. The manufacturer offers
an entire baby monitoring system with additional wearable technology that will alert the
user when baby has an irregular breathing pattern [1]. The camera has some additional
features not included in traditional baby cameras such as a soft-glow nightlight, two-way
communication, sleep metrics, and hooks for additional sensor connection. The camera
without the wearable breathing sensor package is priced at $299. The entire monitoring
system is priced at $379. Since the components are bought and manufactured in bulk, the
price of the hardware is driven down enough that the company most likely makes profit on the
hardware and software subscription services which allow the user to access saved data.
The product consumes more power than most security cameras, and as such is
configurable with either wall-mounting hardware or a floor stand, since its power
requirements are greater than what a typical USB cable can provide.
Figure 1-1: Nanit Plus Camera
After studying the Nanit Plus Camera, more exhaustive market research was
conducted to aggregate similar products and perform a rudimentary, qualitative cost-
benefit analysis of similar product offerings to see how Ubityke compared to other
devices. Some devices showcased superior optical quality and video resolution through
gimbaling the sensor, while others cut features to make the product more affordable. The
results are tabulated below [2].
Table 1-1: Product Comparison Table
Feature Nanit Plus HelloBaby Infant Optics DXR-8 Ubityke
Video Monitoring
Night Lamp
Microphone
Temperature-Humidity Alarm
Lullabies
Health Analytics
Spinning Mobile
Price $299 $89.99 $165.99 $179.80
1.3.2 Raspberry Pi Capabilities
The versatility and potential of this device lies in the processing power and
interoperability of the Pi4’s Quad core Cortex-A72 (ARM v8) 64-bit SoC. The
architecture of the processor lends itself to processing camera frames, which are basically
intensity maps of the illuminated pixels of the camera sensor. Since the camera will be
the basis for most of the decision-making inputs into Ubityke’s state machine, the
processor must not be bogged down processing the extraordinarily large amounts of data
produced by high-frame-rate, high-resolution cameras when it needs to receive input from
other device peripherals. The large memory, high efficiency video coding and
compression, Wi-Fi capabilities, and compatibility with open-source software
development kits make the Pi4 the incumbent choice to power Ubityke[3].
The Pi4 also comes with 40 GPIO (general purpose input/output) pins with which
to connect peripherals. Additionally, the pins and board traces are more robust with this
board revision, which allows each pin to safely supply 16mA of current[3]. The Raspbian
operating system is a Linux distribution which allows for multiple devices, using multiple
communication protocols to communicate with one another and the microprocessor.
1.3.3 Python3 and OpenCV
A high-level, object-oriented, programming language with an uncomplicated syntax was
needed for the accelerated development of this project. Python supports an abundance of
open-source libraries needed to interface with all of the peripherals of Ubityke[4]. The
image processing portion of the project will require OpenCV, which stands for Open-
Source Computer Vision. This technology was developed in the mid to late 1990’s at
Intel by Gary Bradski and Vadim Pisarevsky to solve computer/camera vision problems.
The developers and the team integrated it into ‘Stanley’, the 2005 DARPA Grand
Challenge winner. OpenCV made it possible to process the enormous amount of sensor
data needed to actuate the vehicle controls. Developing algorithms to solve computer
vision and machine learning problems is a common use for the software. Python
is one of the programming languages OpenCV supports, and with its multitude of open-
source libraries, image processing algorithms for Ubityke will be developed in OpenCV-
Python[5].
A full installation of OpenCV is not necessary for this project. The full install
gives the user access to patented algorithms developed by the contributors to this project.
Since only single-motion detection will be performed, a lighter version will be installed.
This version will be sufficient for our project because as indicated before, facial
recognition and object classification are not requirements for our product.
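For illustration, such a lighter build is typically obtained from PyPI with pip. The package names below are the standard PyPI names (the "contrib" build carries the extra modules, some historically patented); the exact installation method used on the project's Raspberry Pi is not specified here and may differ, e.g., building from source.

```shell
# Lightweight build: core modules only (sufficient for single-motion detection).
pip3 install opencv-python

# Full build with the extra/contrib modules (not required for Ubityke):
# pip3 install opencv-contrib-python
```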
As long as the camera being used by the system has a communication interface
which is compatible with OpenCV, the camera can be imported as an object and
instantiated whenever necessary. The OpenCV methods are highly optimized algorithms
written in C++ and wrapped in Python, which allow the user to apply methods to the camera
object like ‘camera.function(arg1, arg2, argn)’. In fact, the entire OpenCV library can be
used as an object. Some of the common functions in the OpenCV library which will be
used in our project are smoothing and filtering functions, converting color images to
grayscale, and encoding the camera output into a usable file format. A deeper dive into
motion detection and image processing will be discussed in section 1.3.4.
1.3.4 Motion Detection and Image Processing
Motion detection and image processing are the core of Ubityke’s technological
capabilities and will provide the basis for some of the features which set this product
apart from similar offerings. Covered in depth, these topics are far beyond the scope of
our project; however, a rudimentary understanding of some of the simpler methods is
critical for algorithm development and system integration. Graduate and post-doctoral
coursework in mathematics and computer science would be needed to fully understand
the limitations of these methods and to implement them completely in an application. In
the following paragraphs, a superficial explanation of some of the basic methods used by
Ubityke shall be given.
In a basic sense, a camera has the ability to capture and display intensity as a
function of space. The particular camera used in this project has an OV5647 sensor. This
sensor has a 3673.6µm x 2738.4µm active area with a pixel count of 2592 x 1944.
Dividing a one-dimensional sensor length by the number of pixels in an array of the same
dimension will yield the pixel size. In the case of this sensor, the effective ‘pixel pitch’ is
1.4µm x 1.4µm[6]. This resolution allows for the easy integration of simple image
processing and motion detection methods.
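The pixel-pitch arithmetic above can be verified directly from the quoted figures:

```python
# Worked check of the pixel pitch using the OV5647 figures quoted above.
active_width_um, active_height_um = 3673.6, 2738.4   # active sensor area
pixels_x, pixels_y = 2592, 1944                      # pixel counts

pitch_x = active_width_um / pixels_x    # ~1.417 um per pixel
pitch_y = active_height_um / pixels_y   # ~1.409 um per pixel

print(round(pitch_x, 1), round(pitch_y, 1))  # 1.4 1.4
```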
One of the first steps in any image processing project is the establishment of a
background. If a suitable background is not established, no matter how sophisticated the
algorithm, there will be no way for a computer to distinguish one image from another.
This basic concept is why humans cannot see most stars in the daytime: a dark
background is needed for the brain to distinguish the faint light of the stars from the
brightness of the sky behind them. This same methodology is how computer vision works. A
background, or an accumulation of images in the static part of multiple camera frames is
needed to compare to subsequent frames in order to detect motion. In the scope of this
project, consider an image of a sleeping baby. If a camera has a frame rate of
30 FPS (frames per second), the camera takes 30 pictures of a sleeping baby in one second.
These frames are used to construct what is called a background model. The background
model can consist of however many frames the programmer chooses to use in the
algorithm. Next, consider an image of a baby kicking and screaming. The baby’s legs
would be elevated and perhaps the arms would be in a different position. The light
reflected off of the baby in both the sleeping and kicking frames will be captured by the
sensor, but the intensity of the light would be different as it fell on different pixels in the
two situations. The pixel intensities of the two frames would be read in by the computer
as two-dimensional arrays and the intensity values of the two frames would be compared.
The difference in intensity values of the image being compared to the background image,
if numerically greater in magnitude than a predetermined threshold, translates to motion
being found in the frame. This is the basis for the single motion detection methods used to
trigger the spinning mobile or lullabies. The threshold of the absolute difference between
background and subsequent frames can be set to whatever the programmer desires. For
our application, the threshold shall be set higher than in some motion detection situations
like security cameras; it would be undesirable for shadows dancing across a bedroom, or
the baby simply rolling over, to set off the device[5].
The next helpful methods used in the motion detection algorithms are called
‘morphological transformations.’ These methods are typically used on binary (black and
white) images. In our case, simple motion detection is performed on grayscale images by
comparing pixel intensities (either present, or not present near a boundary or transition)
between a subsequent frame and the background frame. Two of the most common of
these transformations are erosion and dilation[5].
The erosion method iteratively and gradually destroys, or ‘erodes,’ the boundary
between the foreground of an image and an object. An area of the image is sampled, and
only if all the pixel intensities under that area match the intensity of the original image do
those pixels keep a value of 1. The rest of the pixels are given a value of 0 and
discarded when building the new foreground object. This method is useful for detaching
connected objects in the image or reducing blur and noise. This method allows contours
to be drawn more crisply on binary images after they have been converted from color[5].
After eroding an image to remove the white noise and construct a crisper
foreground, the dilation method is typically used either to restore the size of an
object shrunk by the removal of noisy boundary pixels, or to rejoin parts of an object
disconnected during the erosion process. The result of these two methods is an image of
similar size with better contrast, which lends itself to easier contouring. These
methods are used in our project to avoid false positives such as the device acting on
passing shadows or glinting sunlight[5].
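For illustration, the sketch below implements binary erosion and dilation directly with NumPy rather than calling OpenCV's cv2.erode/cv2.dilate; the 3×3 kernel size and the test image are assumptions for the example, not values from the Ubityke code:

```python
import numpy as np

def erode(img, k=3):
    """Binary erosion: a pixel stays 1 only if every pixel under the
    k x k neighborhood is 1, so stray noise pixels are discarded."""
    pad = k // 2
    padded = np.pad(img, pad, mode='constant')
    out = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            out[r, c] = padded[r:r + k, c:c + k].min()
    return out

def dilate(img, k=3):
    """Binary dilation: a pixel becomes 1 if any neighbor is 1,
    restoring the size of objects shrunk by erosion."""
    pad = k // 2
    padded = np.pad(img, pad, mode='constant')
    out = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            out[r, c] = padded[r:r + k, c:c + k].max()
    return out

img = np.zeros((7, 7), dtype=np.uint8)
img[1:6, 1:6] = 1              # a solid 5x5 foreground object
img[0, 6] = 1                  # a single pixel of salt noise
opened = dilate(erode(img))    # erosion then dilation removes the noise
print(opened[0, 6], opened[3, 3])  # 0 1
```

The erosion-then-dilation sequence (an 'opening') removes the isolated noise pixel while leaving the main object essentially intact, which is the behavior described above.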
Contours are typically drawn after the image has been eroded and dilated.
Drawing a contour essentially joins all points along the boundary of an object which
have the same binned pixel intensities with a curve. Contour detection is the type of
detection used in the motion detection portion of our system. Finding the area of the
motion regions and comparing it to our initial thresholds is the trigger used to
inform the system that 'motion' has occurred. If motion has not occurred, the program
continues iterating over the frames. Inside the motion detection loop, the Ubityke
system starts a timer when motion is detected, and if motion is consistently
detected for a predetermined amount of time, an event is triggered; otherwise the
timer continues to run while the motion detection algorithm iterates over each
frame[5].
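The hold-and-reset timer logic described above can be sketched as a small state machine. This simplified illustration counts consecutive frames instead of wall-clock time, and the function and parameter names are ours, not the Ubityke implementation:

```python
def run_trigger_loop(frames_motion, hold_frames=3):
    """Fire an event only after motion persists for hold_frames
    consecutive frames; reset the counter when motion drops out.

    frames_motion: iterable of booleans, one per analyzed frame.
    Returns the frame indices at which events were triggered.
    """
    events = []
    consecutive = 0                      # stands in for the running timer
    for i, motion in enumerate(frames_motion):
        if motion:
            consecutive += 1
            if consecutive == hold_frames:   # threshold time reached
                events.append(i)
        else:
            consecutive = 0              # motion stopped: restart the timer
    return events

# Two frames of brief motion are ignored; sustained motion triggers once.
print(run_trigger_loop([True, True, False, True, True, True, True]))  # [5]
```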
1.3.5 Python 3 and Libraries
During the proposal phase, our research made Python an obvious choice due to the large
number of open-source libraries available for it. Our main file, Ubityke.py, is a
Python 3.5.8 program that not only performs all the logic behind our features, but also
deploys a web server through Flask and serves the Ubityke application. The following
table shows all the libraries that are integrated in the final product.
Table 1-2: Python3 Libraries used in Ubityke

Library           | Definition                                        | Use
OpenCV            | Open Source Computer Vision library               | Detect motion for Threshold Triggered Mode (TTM)
Flask             | Micro web framework                               | Host the user interface
GPIO              | Class to control the GPIO pins on a Raspberry Pi  | Control the LEDs and servo motor
SQLite3           | Lightweight disk-based database management system | Store the sensor values
OMXPlayer Wrapper | Python wrapper for a command-line media player    | Play the lullabies
1.3.6 HTML, CSS, and JavaScript
The Ubityke team proposed a web user interface to control the different features. Most
currently popular web interfaces consist of HTML, CSS, and JavaScript. The combination
of all three is what makes the user interface fluid as well as cross-platform, allowing
the interface to wrap to any screen size so that it remains fully functional whether the
features are controlled through a mobile device, desktop computer, or laptop.

When the Flask app is executed, it ultimately renders an HTML page with the web
streaming video of the camera. We modified the HTML code and divided the user
interface into sections. CSS stands for Cascading Style Sheets, which dictate how the
HTML elements are to be displayed.

JavaScript is the mastermind behind the fluidity of the user interface, since it
allows the client to send an HTTP request to the server, which returns our action, by
simply pressing a button without refreshing the whole page. This is done through AJAX
(Asynchronous JavaScript and XML) by creating a JavaScript event that detects when a
certain button is pushed and executes a GET request which returns an action. For
example, Figure 1-2 shows how we use an AJAX function, turnOnServo, which when
triggered sends a GET request to our server that sets ?servo=on, turning the servo on.
This GET request is later detected in our Python code, which executes a command.
Figure 1-2: JavaScript AJAX Function in file \Ubityke\static\script.js
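On the server side, the query string carried by such a GET request has to be parsed and dispatched to a device action. The sketch below shows the idea using only the Python standard library; the action strings and parameter handling are illustrative, not the actual Ubityke Flask routes:

```python
from urllib.parse import urlparse, parse_qs

def handle_request(url):
    """Map a query string such as '?servo=on' to a device action.

    The action names here are hypothetical stand-ins for the real
    hardware calls (servo PWM, LED GPIO writes, etc.)."""
    params = parse_qs(urlparse(url).query)
    if params.get('servo') == ['on']:
        return 'servo: spinning mobile started'
    if params.get('servo') == ['off']:
        return 'servo: spinning mobile stopped'
    if params.get('nightlight') == ['on']:
        return 'nightlight: LEDs on'
    return 'no action'

print(handle_request('/?servo=on'))   # servo: spinning mobile started
print(handle_request('/?servo=off'))  # servo: spinning mobile stopped
```

In the real application this dispatch lives inside a Flask route, with `request.args` playing the role of `parse_qs` here.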
Chapter 2
Statement of Work
2.1 Technology Overview
2.2 System Architecture
2.3 Mechanical Design
2.4 Electrical and Software Interface
2.5 Success Criterion
Summary
In this chapter, a brief statement of work is given, along with a conceptual
overview of the system architecture. The mechanical design, along with the
electrical and software interface, and main component blocks will be described at
a high level as they pertain to the project. The code base, web framework, and
success criterion are also laid out in a cursory manner.
2.1 Technology Overview
The Ubityke team has proposed to design and build a baby monitor which will detect
motion from the baby and allow the user to initiate an action from the device suite either
manually, or from a preconfigured state. Furthermore, the device may be configured to
be used in a manual and passive mode wherein the user chooses how to interact with the
device by switching on a nightlight, listening to the baby via microphone,
activating a spinning mobile, or launching customizable sounds, lullabies, and/or playlists.
Temperature and humidity sensors are also integrated into the device and are configured
to display and/or log data if the user so chooses.
An example of a possible configuration could be to set the camera to constantly
analyze frames and turn on the mobile and play baby’s favorite song if motion is detected
above a certain threshold, and then shut off the music and mobile when movement is no
longer detected above the threshold.
Additionally, a web-based application will be developed to allow the user to
customize and update device configurations, monitor the baby, and support requests from
multiple devices connected to the same local network.
2.1.1 Interconnect Diagram
Figure 2-1: Interconnect Diagram
(The diagram shows the Raspberry Pi 4 (4GB RAM) connected to: the Picam (f/2, 67° field
of view, vis/IR capability), a servo motor driving the decorative mobile, a 3W stereo
speaker, a USB microphone, temperature/humidity sensors, nightlight LEDs, and a 5V 2.5A
DC power supply.)
2.1.2 Engineering Requirements Table
Table 2-1: Engineering Requirements Table
Block: Microprocessor
Component: Raspberry Pi 4 v.2
Specifications: Quad-core 64-bit ARM processor; 40 GPIO pins; A/V functionality; 4GB RAM
Justification and Verification: A RISC architecture and more RAM provide image processing capability, data storage, and ease of peripheral integration; Wi-Fi allows for a remote user interface. Test connectivity speeds and I/O voltage levels under load.
Responsibility: Alejandro Neira to procure, validate, integrate, and deploy

Block: Camera
Component: Picam with vis/IR f/2 lens
Specifications: OV5647 vis/IR camera with IR cut filter; 30 FPS at 1080p; 3.3V input; M12 thread adapter
Justification and Verification: A day/night camera is needed for constant viewing and image processing; adjustable resolution and frame rate maximize compute resources. Test different frame rates while processing data under load to gauge video quality over Wi-Fi during computations; choose the frame rate with the best-fit performance for the system.
Responsibility: Eric Hahn to procure, validate, integrate, and deploy

Block: Environmental Sensing
Component: Temperature/humidity sensor
Specifications: Detects humidity from 5% to 95% RH (±2% accuracy over the linear range) and temperature from -30 to 90 °C (±1 °C accuracy over the linear range)
Justification and Verification: Temperature and humidity readings are required inputs for the user interface. Measure temperature and humidity levels and compare them to a calibrated temperature and humidity monitor.
Responsibility: Eric Hahn to procure components, validate operation, and integrate

Block: Audio (Microphone)
Component: USB microphone
Specifications: Driverless USB microphone
Justification and Verification: Used to monitor sounds in the vicinity of the baby.
Responsibility: Eric Hahn to procure components, validate operation, and integrate

Block: Audio (Speaker)
Component: 3W speaker with 3.5mm audio interface
Justification and Verification: Needed to deliver music and other audio to the baby. Test speaker audio output and input from the microphone.
Responsibility: Eric Hahn to procure, validate, integrate, and deploy

Block: Mobile
Component: Servo motor
Specifications: 3.3V or 5V servo motor; PWM controlled, variable speed
Justification and Verification: Motor which rotates the mobile; takes a software PWM signal and spins under load at rated speed.
Responsibility: Eric Hahn to design, procure, validate, integrate, and deploy

Block: Nightlight
Component: LED(s)
Specifications: 20mA; brightness > 2 lumens
Justification and Verification: Nightlight. Apply 3V at 20mA and measure flux with a radiometer.
Responsibility: Eric Hahn to procure components, validate operation, and integrate

Block: Power
Component: DC power supply
Specifications: 5V, 2.5A
Justification and Verification: Separate power supply to drive peripherals. Measure current draw under full load.
Responsibility: Eric Hahn to procure, validate, generate code, integrate, and deploy

Block: User Interface and Web Framework
Component: GUI and web environment
Specifications: User interface capable of executing all functionality of the system through a web framework
Justification and Verification: Allows the user to interact with the baby through the system. Test individual components as well as the integrated system through the user interface.
Responsibility: Alejandro Neira to develop the programming environment, user interface, and web framework, and deploy
2.1.3 Project Timeline
Figure 2-2: Project Timeline
2.2 System Architecture
2.2.1 Raspberry Pi4 Model B
The Raspberry Pi4 can be configured to use either audio signals, video signals, both audio
and video signals, or temperature and humidity thresholds to activate different functions
of the device. Using ‘while-True’ logic, the processor will scan the peripherals for
activation conditions based on current state values in a closed-loop configuration. If an
activation condition is met, the peripheral will activate/de-activate and behave according
to pre-determined configurations. The Raspberry Pi4 may also be used in a passive state
allowing the user to activate any and all devices manually through an API.
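The closed-loop scan described above can be sketched as follows. The callables stand in for real peripheral reads, and the loop is bounded purely for illustration; the actual device would run indefinitely. Names and the temperature limit are assumptions for the example:

```python
def scan_peripherals(read_motion, read_temp, temp_limit=30.0, max_cycles=100):
    """Poll activation conditions each cycle and return the actions taken.

    read_motion / read_temp are callables standing in for the real
    peripheral reads on the Pi."""
    actions = []
    for _ in range(max_cycles):          # 'while True' bounded for the sketch
        if read_motion():
            actions.append('start mobile')
        if read_temp() > temp_limit:
            actions.append('raise temperature alarm')
        if actions:
            break                        # an activation condition was met
    return actions

# Two quiet cycles, then motion plus an over-temperature reading.
readings = iter([False, False, True])
temps = iter([22.0, 22.5, 31.0])
print(scan_peripherals(lambda: next(readings), lambda: next(temps)))
```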
Figure 2-3: Raspberry Pi4 Model B
2.2.2 OV5647 Color QSXGA Camera
The Omnivision 5647 5MP camera module is compatible with the Raspberry Pi 4 and
comes equipped with a QSXGA (Quad Super Extended Graphics Array) color CMOS
sensor capable of outputting 1080p video at 30 frames per second. The camera
interfaces to the Pi via the Camera Serial Interface. The camera has a motorized IR cut
filter which filters infrared light during the day and switches open at
night to allow the sensor to collect more light, thus providing a dynamic contrast of sorts.
Higher resolutions can be achieved at lower frame rates; however, for Ubityke's purposes,
lower frame rates could render the image processing algorithm ineffective,
causing latency in activating peripherals. Since the device is not performing facial
recognition, the higher resolution is not needed[6]. The camera is furnished with an M12
lens mount; therefore, the design team has chosen to use the 1.8mm focal length, NoIR,
f/2 micro video lens which comes standard with the camera module, providing an 85°
field of view at an approximately one to three-meter range with acceptable resolution[6].
Figure 2-4: OV5647 Pi4 Camera
2.2.3 Servo Motor
The servo motor will be used to spin a decorative mobile attached to the shaft of the
motor. The baby mobile must be exceptionally rigid and light due to safety concerns for
the baby. Therefore, a Pi-compatible, continuous rotation servo motor will take a
software-generated PWM signal and spin the carousel which holds the mobile
decorations. The stall torque of the motor is 3 kg·cm with a 3.3V or 5V input, so a
relatively small diameter wheel mounted to the shaft could hold a couple of
pounds before stalling. The baby mobile will weigh a few ounces at most, so this
servo motor is acceptable using a rough estimate of measure[7].
Figure 2-5: 360˚ Servo Motor
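The software-generated PWM signal mentioned above is usually specified to a GPIO library as a duty-cycle percentage. Assuming typical hobby-servo timing (a 50 Hz frame with 1.0–2.0 ms pulses and a roughly 1.5 ms neutral point; the exact FS5103R values should be taken from its datasheet), the conversion is:

```python
def servo_duty_cycle(pulse_ms, frequency_hz=50):
    """Convert a servo pulse width to the PWM duty-cycle percentage
    a software PWM library expects (period = 1000 / frequency ms)."""
    period_ms = 1000.0 / frequency_hz
    return 100.0 * pulse_ms / period_ms

# For a continuous-rotation servo, ~1.5 ms is 'stop'; shorter or longer
# pulses spin the shaft in opposite directions at increasing speed.
print(servo_duty_cycle(1.5))  # 7.5  -> stationary
print(servo_duty_cycle(2.0))  # 10.0 -> full speed in one direction
```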
2.2.4 Audio I/O
A driverless USB microphone will be used to monitor the audio in the vicinity of the
baby. The versatility of the USB microphone and the fact that it can operate without
extraneous communication protocols such as I2S or SPI make this microphone attractive
for the design. Since the requirement is not dependent on speech recognition fidelity, the
dynamic range of the device is not crucial as long as the audio can be successfully
recorded[8].
Figure 2-6: Sunfounder USB Microphone
In order to soothe the baby, audio shall be delivered through a simple 3W, USB-powered
mini laptop speaker. Because the device can be powered by the USB port on the
Raspberry Pi, the speaker provides a SWaP (size, weight, and power) advantage.
Additionally, the 3.5mm audio jack eliminates the need for extraneous communication
protocols and resource-heavy audio codecs. By relieving the compute burden of DAC
and encoding schemes, the processor will operate more efficiently and run at a cooler
temperature. Image processing is remarkably resource heavy, and every marginal gain in
the system design which contributes to overall efficiency is not to be underestimated.
The package is shown below in Figure 2-7[10].
Figure 2-7: 3W Mini Hamburger Speaker
2.2.5 Environmental Sensors
Among the many features of Ubityke are temperature and humidity sensing. The device
will be equipped with the DHT22, a low-cost temperature and humidity sensor capable of
detecting humidity from 5% to 95% RH with ±2% accuracy over a linear range, and
detecting temperature from -30 to 90 °C with ±1 °C accuracy over a linear range. The
environmental sensors are able to output data once every two seconds, or 0.5Hz. The
sensors will have the ability, as do most of the peripherals in the system, to be operated in
active or passive mode. The temperature and humidity can simply be monitored by
accessing the user interface, or thresholds can be set so that these readings act as inputs
to drive another part of the system, such as sending an alarm or turning on the mobile.
In the future, there is the possibility of logging the temperature and/or humidity over time and
mapping it to the baby's resting patterns to develop optimized environmental conditions
for the baby. The DHT22 sensor is shown below[11].
Figure 2-8: DHT22 Temperature/Humidity Sensor
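Omitting the hardware read itself, the threshold comparison that could let these readings drive another part of the system might look like the following sketch. The comfort ranges are illustrative assumptions, not values from the report:

```python
def environment_alerts(temp_c, humidity_pct,
                       temp_range=(18.0, 26.0), humidity_range=(30.0, 60.0)):
    """Compare one DHT22-style reading against configurable thresholds
    and return any alert strings (the default ranges are illustrative)."""
    alerts = []
    if not temp_range[0] <= temp_c <= temp_range[1]:
        alerts.append(f'temperature out of range: {temp_c:.1f} C')
    if not humidity_range[0] <= humidity_pct <= humidity_range[1]:
        alerts.append(f'humidity out of range: {humidity_pct:.1f}%')
    return alerts

print(environment_alerts(22.5, 45.0))  # [] -> comfortable, no action
print(environment_alerts(29.0, 75.0)) # both readings out of range
```

In the device, a non-empty result would be the activation condition that sends an alarm or starts the mobile.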
2.2.6 Nightlight
A soft white nightlight will be provided by the device via 3V, 20mA white
LEDs with a 10° viewing angle. The LEDs will be mounted to the top of the package to
deliver a pleasant glow on the ceiling. This mounting scheme saves space and eliminates
the need for extra material to diffuse the light, as would be required if the LEDs were
mounted in the field of view of the baby. Additionally, the infrared portion of the LED
emission spectrum could wash out camera frames in dark settings, or interfere with the
photoresistors in the camera assembly, if the LEDs happened to be in the field of view
of the camera[12].
2.2.7 Spinning Mobile
The baby could be soothed by rotating a mobile attached to the shaft of the servo motor
about its axis. Weight is obviously the chief requirement for this block of the device.
Since the servo motor is in the micro category, the maximum weight based on a rough
estimate of measure concerning stall torque should only be an ounce or two. The mobile
shall be a 3-D printed carousel ring with notches from which to hang either paper or
thin textile decorations. This custom feature is to be built with SLA and can be seen in
the rendering below.
Figure 2-9: Mobile Frame
2.2.8 Power Considerations
The Raspberry Pi 4 is capable of sourcing 2.5A from the USB power supply; however, if the
system is ever configured to have all devices set to passive mode and, for instance, the
servo, camera, and LEDs are active at the same time, the inrush current of these devices,
coupled with the Pi constantly executing processes in the background, could cause the
device to malfunction and shut off or have intermittent connectivity[3]. Each GPIO pin is
capable of providing approximately 16mA. For our design, we chose, for instance, to
place the LEDs for the nightlight in parallel to hedge against any failures. An example of
this strategy is demonstrated below.
V = I × R    (2.1)

I = V / R
Consider the nightlight circuit below in figure 2-10.
Figure 2-10: Nightlight Circuit
Since the LEDs are connected in parallel, 5V from the GPIO pin appear across
each of the four branches. Additionally, the voltage drop at the junction of an LED is ≈
.707V. Therefore...
I_LED = (V_GPIO − V_junction) / R

I_LED = (5 − 0.707) / 330

I_LED = 0.013009 A, or ≈13 mA
Since the forward operating current limit for the LEDs is 20mA, and the current
through each branch is the same regardless of a series or parallel connection, the
330Ω current-limiting resistor is sufficient for our application. To save on
overall current draw, the decision was made to place all the LEDs in parallel so as not
to exceed the overall power budget. The LED output at a few milliamps is sufficient to
provide a warm glow for the baby in the dark.
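The branch-current calculation above, restated in Python with the report's values:

```python
def led_branch_current(v_gpio=5.0, v_junction=0.707, resistance=330.0):
    """Current through one LED branch: (V_GPIO - V_junction) / R.
    Defaults are the values used in the nightlight analysis above."""
    return (v_gpio - v_junction) / resistance

current_a = led_branch_current()
print(round(current_a * 1000, 1), 'mA')  # 13.0 mA, under the 20 mA LED limit
```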
The other power-hungry device in the system is the servo motor. The servo motor
requires a 3.3VDC input. Extensive testing and documentation support the claim that one
ampere can be sourced from the 3.3V rail[3]. The FS5103R continuous rotation servo
motor will be spinning a carousel which weighs ≈148g. There are ≈28.3g in one ounce;
therefore, the servo will need to rotate a load of 5.23oz. The weight of this load is well
within the capabilities of the servo's stall torque specification of 41.74 in-oz[7].
The rest of the peripherals do not require current from the GPIO header, and can
therefore be included in the current draw of the Raspberry Pi itself, which with all four
cores running at max speed is ≈1.3A. The table below shows the current draw of the
Raspberry Pi under different conditions[14].
Table 2-2: Raspberry Pi4 Current Draw
Based on this research and analysis, the Ubityke team concluded that the standard
5V 2.5A power supply recommended by the vendor is sufficient for our application.
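The servo load check above, restated as a short calculation. The 2-inch mounting radius used in the torque comparison is an assumption for illustration, not a dimension from the report:

```python
CAROUSEL_GRAMS = 148.0
GRAMS_PER_OUNCE = 28.3          # conversion value used in the report
STALL_TORQUE_IN_OZ = 41.74      # FS5103R specification cited above

load_oz = CAROUSEL_GRAMS / GRAMS_PER_OUNCE
print(round(load_oz, 2), 'oz')  # 5.23 oz

# Even hung at an assumed 2-inch radius, the torque demand (weight x radius)
# stays well under the stall torque specification.
print(load_oz * 2 < STALL_TORQUE_IN_OZ)  # True
```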
2.2.9 Safety Considerations
The safety of the baby is the prime concern; therefore, meticulous care has to be taken
with the mechanical design, specifically how the device will be attached, enclosed, and
mounted. Some manufacturers of similar products include standard mounting hardware
consisting of wall anchors and self-tapping screws, even though the devices weigh
less than 10oz. Ubityke, as innovative as it may be, is still a prototype. However, if this
design is ever to be productized, considerations must be taken to ensure that proper
assembly and mounting techniques allow for ease of use with safety as the primary concern.

For demonstration and testing purposes, the product will be mounted to a standard
camera tripod with a built-in locking mechanism. A stainless-steel post will be used to
extend the camera laterally from the tripod mount, interfacing with the mounting
features on the Ubityke enclosure. The device will be elevated to a height sufficient to
monitor the region of interest without allowing the baby to interfere with it. Mounting
schemes need to evolve with age, however, because as soon as the baby is able to pull up
on the crib, or climb, the device could become a safety risk. This foresight pertains to
possible future additions and improvements related to the productization of this device
if the need arises.

Additionally, electrical considerations such as heat dissipation, static discharge, and
wiring schemes that could present potential hazards, along with a multitude of other
factors, are addressed in certifications given by regulatory bodies such as
Underwriters Laboratories or "Conformité Européenne" (French for "European
Conformity") for devices produced and sold in the E.U. that deal with health, safety, and
environmental standards[13].
2.3 Mechanical Design
Ubityke is a remarkable device in the sense that it combines features from several devices
integrated into one package. Typical mechanical design principles start with a rough
sketch, or 'cartoon' as it is sometimes referred to in industry. During the cartoon stage,
all the devices are usually imported into CAD software to assess how the system will
function when all the devices are integrated. Several analyses can be performed on the
model before fabricating parts, many of which are outside the scope of our project.
However, if the device has dual-use capabilities, such as being used in both a
commercial and an aerospace/defense application, finite element analysis may be
performed. Finite element analysis is a computer-based numerical method that can
predict how the device will react to physical forces found in nature such as heat, shock,
vibrations, etc.

Most of the items in the system are protected by trademark and patent law, so
finding SolidWorks models of these components online to import into our package design
adds a level of complexity to the mechanical design task, which could lead to
modifications that detract from the aesthetic value of the final product.
2.3.1 Packaging and Materials
Ubityke will be housed in a 3-D printed package produced using SLA (stereolithography)
techniques. This photochemical process builds a part from a vat of liquid photopolymer
resin which is selectively exposed to ultraviolet radiation, causing the compounds to
chemically bond layer by layer to form a three-dimensional solid. Upon researching past
projects and learning about the inaccuracy and difficulty of using the Valencia 3-D
printers, the team decided to source a professional to help with the mechanical design.
Carlos Casteleiro is a multi-patented, multi-disciplined engineer with a high-end
industrial machine shop who was generous enough to volunteer his time to help the team
by allowing us to use his SolidWorks capabilities and machine shop with a high-end 3-D
printer[15].

The team will measure all critical dimensions of all of the components in the
system after they have been verified during breadboard testing. Timing is critical in this
phase, because if the components aren't validated before the package is designed, and too
much time is taken, any changes in the design could push the project out too far to
complete. The need for a package that is both robust and light is what drove us to use
SLA materials. The box needs to accommodate all the components without generating
an environment susceptible to heat and mechanical perturbations. The initial design idea
is shown below.
Figure 2-11: Initial Packaging Concept (callouts: servo shaft with lock ring, nightlight, environmental sensors, speaker, mobile, camera)
2.3.2 Thermal Management
One of the primary drivers in any system design is how to construct a device such that it
operates under optimal thermal conditions. Excess energy in the form of heat can lead to
noise, interference, and degradation in performance. Additionally, if several devices are
enclosed in a box and are simultaneously giving off heat, the box essentially becomes an
oven. Computers deal with heat by using fans and active liquid cooling; some
passive cooling techniques involve heatsinks and thermal compounds to transfer heat from
the devices into the local atmosphere. In the case of Ubityke, the use of active cooling
would be prohibitive, because the noise from fans or a water pump would
interfere with the microphone. There is no way around the noise from the servo, but in
order to hedge, the team elected to place part of the servo outside of the case so the audible
noise from the motor could escape.
One technique for cooling an enclosed environment is ventilation. There
will be sufficient ventilation in the package to allow the warm air generated inside
the box to be pulled out into the atmosphere by the temperature gradient between
the inside and outside of the box. Most of the heat generated by the device will
come from the Raspberry Pi itself. To combat this, the Pi will be placed into an
aluminum heatsink casing with thermal gap pads attached to all of the processors and
ICs in order to transfer the heat from the discrete components into the larger thermal
mass of the heatsink. The fins of the heatsink case will be exposed to the outside
environment in order to transfer the heat from the case into the air.
2.4 Electrical and Software Interface
2.4.1 Basic Operation – Possible Configurations
Ubityke can run in two possible configurations, depending on how the end user intends
to operate the device:

Manual Mode: In this mode, the user interface functions as the only control of
each feature. The end user is able to turn features on and off as they please, and is
always still able to monitor the baby through live video while interacting with the UI.

Threshold Triggered Mode (TTM): In TTM we take advantage of OpenCV to
detect motion, injecting our functionality when motion has been detected for a set
number of seconds in order to trigger features. This enables Ubityke to detect a baby in
motion past a set threshold time and trigger the mobile, so it spins and distracts the baby.
2.4.2 Programming Environment
One of the main attractive features of programming on a Raspberry Pi is that it comes
with a variety of programming environments that are ready to use. Python is one of the
environments that comes pre-installed, and it is a powerful language that is easy to use,
read, and write. It also allows us to connect to the real world by communicating with
hardware via the GPIO pins on the Pi. For that reason, plus its efficiency for
development, the team decided on Python as the main programming language that wraps
all the modules together. The user interface is a web application powered by Flask and
coded in HTML, CSS, and JavaScript, which allows the end user to control all the
features of Ubityke through a clean interface that is simple to use and cross-platform, so
you can easily monitor your baby with a desktop or any mobile device.

Flask is a micro web framework written in Python that allows us to build web
applications. Last semester, the proposed environment to handle the user interface was
an Apache server and PHP, but after multiple unsuccessful trials of connecting our
software with our hardware, the team decided to continue researching to find a bridge
between our code and our hardware. That is when we came across Flask, which was
voted the most popular web framework in the Python Developers Survey 2018[17].
After learning about these new technologies, the decision was made to adopt Flask,
since it is able to wrap around HTML, CSS, and JavaScript code while being far more
lightweight than running a full Apache server.
2.4.3 User Interface
Testing every module via shell commands allowed us to verify that the hardware was
behaving as intended. In order to control each module, and to know whether the device
was running, we developed an interactive user interface that allows you to visualize the
different features available while keeping a feed of your baby as the centerpiece of the
user interface.
2.4.4 Power and Compute Budget
As discussed in section 2.3.2, the design took thermal management into account with the
heatsink as well as the airflow holes around the case. It was still important to monitor the
power and computation required under no stress and full stress. Using the iOS application
PiHelper, we were able to connect to our Pi, and the application helped us visualize the
GPU and CPU temperature[18].

Figure 2-12: CPU and GPU Temperature

As discussed in 3.1.9 Module Stress Testing, CPU throttling doesn't happen until the
CPU reaches around 80°C, and Figure 2-12 shows that the temperature while running
Ubityke reached 67°C, putting us in the safe zone.
2.5 Success Criterion
The requirements for the Ubityke system are not quantitative or deterministic due to the
prototypical nature of the project. The pass/fail criteria shall be established by whether or
not the device operates as intended within a rough estimate of measure. As a system, the
specifications and limitations of the features will be governed by the specifications of the
underlying components. For example, if the camera is capable of capturing 1080p video
at 30 frames per second, the team will not use this component specification in the
overarching system specifications. Rather, the specifications of the components flow up
into the overall success of a particular system feature, such as whether or not the mobile
starts spinning if the motion of the baby is consistent for a predetermined period of time.
Such a system feature works as intended because the proper component was
selected for the task based on the engineering requirements of the project. Therefore, a
high-level success criterion will be based on a binary result of whether or not the device
works as intended.
During the integration and test phase of the project, a sample size of runs will be
determined by a reasonable confidence level for a project of this nature. Because of the
limited amount of time allotted to test the project coupled with the high reliability of the
underlying components, a simple pass/fail criterion shall be established and perhaps
satisfied based on the outcome of trials during the integration and test phase.
Table 2-3: Success Criterion
Feature                              | Description                                                                  | Pass/Fail | Trials
Nightlight                           | All LEDs switch on and off based on user input                               | >90%      | 50
Mobile                               | Mobile spins at a constant speed and stops based on user input               | >90%      | 50
Lullabies                            | Audio is started and stopped based on user input                             | >90%      | 50
Microphone                           | Distinguishable audio recorded and played back from immediate vicinity of baby | >90%    | 50
Threshold Triggered Event (Mobile)   | Mobile starts and stops after predetermined threshold conditions are met     | >90%      | 50
Threshold Triggered Event (Lullabies)| Audio playback starts and stops after predetermined threshold conditions are met | >90%  | 50
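The >90% criterion in Table 2-3 can be applied programmatically as each feature's trials are tallied; a minimal helper (our own sketch, not part of the Ubityke code) might look like:

```python
def feature_passes(successes, trials=50, required_rate=0.90):
    """Apply the Table 2-3 criterion: a feature passes when strictly
    more than 90% of its trials succeed."""
    return successes / trials > required_rate

print(feature_passes(48))  # True  (96% success rate)
print(feature_passes(44))  # False (88% success rate)
```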
Chapter 3
Contribution
3.1 Design Implementation
3.2 Mechanical Design
3.3 System Integration
3.4 System Validation and Test
3.5 Lessons Learned
Summary
This chapter highlights the contribution to the project. Herein, the design
implementation is discussed in detail, providing charts, diagrams, test results, and
lessons learned throughout the entire project phase. The engineering disciplines
employed are mechanical, electrical, optical, software, and systems.
3.1 Design Implementation
The transition from a thought experiment to feasibility studies, initial requirements,
high-level system architecture, and component research for project compatibility was not
only intuitive, but straightforward to implement as a project schedule with actionable
items and concrete metrics. Some of the greatest modern-day inventions leveraged
pre-existing technologies, integrated into a system which took advantage of symbiotic
relationships between the underlying components and subsystems. Consider LiDAR
(Light Detection and Ranging) systems: when designing these products, engineers and
scientists are given a list of requirements such as range, field of view, resolution, and
accuracy. The device oftentimes consists of commercially developed lasers, detectors,
and digital time-of-flight counting ICs. The devices are integrated into a package, and
algorithms are developed to process the data from the sensor. Ubityke is much like this
in the sense that we are using proven technology and components, integrating them into
a system for a particular purpose, and then developing algorithms and software to obtain
the desired results.

At the heart of Ubityke are sensors which take input from the environment,
process the data, and produce an output based on predetermined settings and
configurations. The common baby monitor consists of a camera, microphone,
transmitter, and receiver. Some of these cameras have additional features such as
auto-focus and gimbaling. The Ubityke camera is a low-cost, high-performance device
capable of capturing 1080p HD video at 30 frames per second. This camera is the
primary sensor in the suite: it provides constant monitoring of the baby and can also be
used as a threshold trigger. When Ubityke is configured to run in TTM (Threshold
Triggered Mode), a motion detection algorithm monitors an aggregate of frames, and if
continuous motion from the baby is detected for a predetermined amount of time, the
microprocessor sends a command to another device, such as the servo-driven mobile, to
activate and soothe the baby. Accomplishing this required a holistic approach to basic
classical optics, a little research into computer vision, and a little luck that most baby
cribs are built with standard dimensions of 28" wide and 52" long. In implementing the
primary feature of the camera, the team chose a camera capable of day/night monitoring
by selecting a device with built-in photoresistors that trigger IR LED lamps in low light,
and a motorized IR cut glass filter that slides in front of the camera sensor in the daylight
so the images aren't tinted red by infrared light hitting the sensor.
Since the baby monitor camera’s only job is to view the baby, the camera lens was
chosen to capture only the required view of the environment. Since motion detection is
the underlying method by which this subsystem functions, a lens with too wide a viewing
angle could capture a human entering the room, or perhaps the family pet, leading to false
positives. The region of interest is inside the perimeter of the crib, so a lens with a
narrower field of view was chosen for this application. An example of how the team
arrived at this decision is shown below.
Consider the Ubityke camera configuration in figure 3-1, where y is the working
distance of the lens (distance from the object plane to the front surface of the lens) and x
is half the distance of the horizontal field of view. It stands that…
tan θ = x / y                                                  (3.1)

FOV° = 2 · arctan( FOV_Horizontal / (2 · Working Distance) )   (3.2)
Figure 3-1: Field of View and Lens Selection
[Figure 3-1 depicts the Raspberry Pi 4, Pi camera, and servo module above the crib, with lens focal length f = 3.6 mm, half-angle θ, half field of view x, and working distance y measured to the object plane.]
For the case of the standard crib, to maintain a horizontal field of view of 1 meter with the sensor mounted a meter away, the angular field of view would have to be 53°. If the Ubityke is mounted approximately 3’3” above the crib, the roughly two-foot-wide crib should be easy to see at high resolution. Unless baby is pulling themselves up and climbing on the rails of the crib, this is an acceptable choice of lens based on working distance.
Additionally, the lens is specified to have a 67° field of view when used with a Raspberry
Pi camera. This means that a sensor height of 1.3m or 4’3” would be well within the
manufacturer’s specifications and physical capabilities of the lens when baby gets a bit
older and starts pulling up.
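The lens-selection math in equations 3.1 and 3.2 can be checked with a short script. This is a sketch using only the dimensions quoted above (1 m field of view, 1 m and 1.3 m working distances, 67° lens, 28" crib):

```python
import math

def angular_fov_deg(horizontal_fov_m: float, working_distance_m: float) -> float:
    """Angular field of view (equation 3.2) from horizontal FOV and working distance."""
    return math.degrees(2 * math.atan(horizontal_fov_m / (2 * working_distance_m)))

# A 1 m horizontal field of view with the sensor mounted 1 m away
print(round(angular_fov_deg(1.0, 1.0), 1))   # 53.1 degrees, matching the text

# With the 67-degree lens at a working distance of 1.3 m, the visible width is
visible_width = 2 * 1.3 * math.tan(math.radians(67 / 2))
print(round(visible_width, 2))   # 1.72 m, comfortably wider than the 0.71 m (28") crib
```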
A top-down system level design approach to the project such as just described was
carried out for each subsystem and component. The figure below shows the logical
progression of how the system requirements were flowed down from the system level, to
the subsystem and component level, and then finally validated through testing.
Figure 3-2: Design Implementation
System Requirements Defined
• Features
• Operational Modes
• HW/SW Compatibility

Subsystems and Components Chosen
• Engineering Analysis on Individual Components
• Component Interoperability Validated

Device Functionality
• Component Verification
• Component Validation with SW/Algorithms
• Component Stress Testing

System Integration and Test
Each feature of the system was developed in the same fashion. Typically, each
device was validated by carrying out a series of functional tests consisting of writing
Linux shell scripts to communicate with the device and configuring the device to operate
as intended in the Ubityke system. For example, the FS5103R is a tried-and-true servo motor used in countless robotics applications [7]; therefore, it was not necessary to characterize the motor, but only to verify, to first order, that its stall torque was sufficient to hold the load of the baby mobile carousel. Also, to what degree could the
rotational speed of the servo be deterministically controlled? Communications to the
motor were established through the Raspberry Pi using commands from the Linux
command line that enabled the GPIO pins of the Raspberry Pi to serve as power, ground,
and data inputs into the servo. The method for communicating with the servo took
advantage of an open-source library which allowed the user to enter the modulation time
or rotational speed of the servo through an argument passed to the particular function
which controlled the servo.
# These commands rotate the servo at approximately 15 rpm
# Must be in the PIGPIO directory
sudo pigpiod
# rotate the servo at ~15 rpm by setting output to GPIO 4 and a
# servo pulsewidth of 1525 µs
pigs s 4 1525
# stop the servo using a pulsewidth of zero
pigs s 4 0
Figure 3-3: Servo Command Example
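The first-order stall-torque check mentioned above can be sketched as follows. The mobile's mass, the effective arm radius, and the servo's stall torque are illustrative assumptions here, not the team's measured values:

```python
# First-order check: does the servo's stall torque exceed the load torque of the
# mobile carousel? All numeric values below are illustrative assumptions.
G = 9.81                       # gravitational acceleration, m/s^2

stall_torque_kgcm = 3.0        # assumed stall torque for a hobby servo, kg*cm
stall_torque_nm = stall_torque_kgcm * G / 100   # convert kg*cm -> N*m

mobile_mass_kg = 0.15          # assumed mass of the hanging mobile
arm_radius_m = 0.05            # assumed effective radius of the load

load_torque_nm = mobile_mass_kg * G * arm_radius_m   # worst-case gravity torque
print(load_torque_nm < stall_torque_nm)  # True -> servo holds the load to first order
```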
After each device was validated individually, components were integrated into
subsystems and breadboarded to be validated as a system. Once the subsystem was
validated, it was exercised for a prolonged period of time to simulate agitation over a
portion of the lifespan of the whole system. Finally, the subsystems were integrated into
the final package and tested with the user interface.
Figure 3-4: Subsystem Testing Example
3.1.1 Software Development
Software development for Ubityke started back in October of 2019, during the research phase. We knew we were going to use Python to speak to the hardware, and we concluded that Flask would serve as the bridge between our hardware and software. It turned out to be a great solution because it allowed us to integrate front-end languages we already knew into an easy-to-use interface. Best of all, it is built on Python, so we are able to run calculations and more complex algorithms while presenting the results through a web-server user interface. OpenCV, which provides Python bindings, integrates easily into our code and offers a rich set of libraries. This allowed us to create our own algorithm, called Threshold Triggered Mode, that is integrated into our user interface.
3.1.2 User Interface
The team proposed an easy-to-use, interactive user interface that would be cross-platform, whether you want to control the device through a desktop with an extra screen, an iPad, or an Android phone. Regardless of what platform the end user has, the user interface will always load since we are hosting our own local web server; all you need is a network connection and a device with a web browser. The user interface is built on the HTML5 UP template, which uses CSS and JavaScript to adapt to any size device.
3.1.3 Web Framework
The team had originally proposed Apache and PHP as the web framework, which would have eventually worked, but it was taking too much time and PHP was producing too many permission errors. We learned that there was a much more efficient web framework available that integrates easily when creating a web interface, which is exactly what we promised: Flask. It is a microframework that allowed us to run our own web server and publish our user interface by building a Flask application. Figure 3-5 shows how we combine the data coming in from the sensors, store it in our local SQLite database, and then use Flask to publish our front-end user interface.
Figure 3-5: Web Framework Overview
The temperature and humidity sensor uses a Python script to collect data and store it in our own local database, powered by SQLite, a lightweight database management system. Our table requires three fields: the exact time and date of each reading, a numeric variable temp to hold the temperature in °C (which we convert to Fahrenheit for display), and a numeric variable hum to hold the humidity. A table was created in SQLite to store the values from the sensors:
sensorsData.db:
Table: DHT_data
Table 3-1: Sensor Data Log
timestamp (DATETIME) | temp (NUMERIC) | hum (NUMERIC)
‘now’                | 28             | 44
The script in logDHT.py has a configurable sample period, set in seconds, which tells the script how often it should collect data from the sensors and store it; the default value is 60 seconds. The script grabs the data from the sensors and inserts the values into our DHT_data table so we can later retrieve them for display in the user interface.
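A minimal sketch of the logging flow just described, using Python's built-in sqlite3 module. The table mirrors DHT_data, while the sensor read is stubbed out, since the real logDHT.py reads the DHT hardware:

```python
import sqlite3

def c_to_f(temp_c: float) -> float:
    """Convert Celsius to Fahrenheit for display in the user interface."""
    return temp_c * 9 / 5 + 32

def read_sensor():
    # Stub standing in for the DHT temperature/humidity hardware read in logDHT.py
    return 28.0, 44.0

def log_once(conn: sqlite3.Connection) -> None:
    temp, hum = read_sensor()
    conn.execute("INSERT INTO DHT_data VALUES (datetime('now'), ?, ?)", (temp, hum))
    conn.commit()

conn = sqlite3.connect(":memory:")  # sensorsData.db in the real system
conn.execute(
    "CREATE TABLE IF NOT EXISTS DHT_data (timestamp DATETIME, temp NUMERIC, hum NUMERIC)"
)
log_once(conn)  # the real script repeats this every 60 seconds by default

row = conn.execute("SELECT temp, hum FROM DHT_data").fetchone()
print(row, round(c_to_f(row[0]), 1))  # (28.0, 44.0) 82.4
```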
3.1.4 Motion Detection Algorithm
Using cameras and sensors for motion detection and computer vision has been common practice for decades. Thanks to overwhelming support from community contributors, most of the methods for motion detection, facial recognition, and object classification have made their way into do-it-yourself projects that can be accomplished with little to no coding experience. OpenCV for Python provides a free library and examples from which to draw and practice.
One such online resource, pyimagesearch.com, run by Dr. Adrian Rosebrock, has been a wellspring of content. Embedded in the source code for Ubityke is the motion detection function developed by Dr. Rosebrock. In a general sense, it is a standard open-source motion detection function that builds the method with the following steps:
1. Develop a background model with which to compare incoming frames. This is done by taking a weighted average of pixel intensities over a predetermined number of frames. These pixel intensities are read into memory and stored as two-dimensional arrays using Python's numpy library. This array can be assigned to a variable such as bg, for background.
2. Next, the incoming frames must be read in as two-dimensional arrays and compared
to the background model. The absolute difference function will compute the
difference between the image and the background model and then be compared to
a predetermined binary threshold value.
3. OpenCV methods such as erosion and dilation are then applied to the thresholded image to remove noise such as shadows, blur, and other artifacts, after the color image has been converted to grayscale. It is much easier to find contrast in a grayscale image than in one with many hues.
4. After the image has been processed using the OpenCV methods described in step 3, another OpenCV method, findContours, is applied to find bounding shapes or other forms in the image. In the case of the algorithm used in Ubityke, each subsequent frame is looped over to check for contours that exceed the threshold declared in the function. If contours are found, a bounding box is drawn around the area containing the motion.
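The first two steps above can be sketched with numpy alone; in the real system, OpenCV routines such as accumulateWeighted and absdiff do the equivalent work on camera frames. The tiny 4×4 "frames" here are stand-ins:

```python
import numpy as np

ALPHA = 0.5        # weight for the running-average background model (step 1)
THRESHOLD = 50     # binary threshold on pixel-intensity change (step 2)

def update_background(bg, frame):
    """Step 1: weighted running average of pixel intensities."""
    return (1 - ALPHA) * bg + ALPHA * frame

def motion_mask(bg, frame):
    """Step 2: absolute difference against the background, then binary threshold."""
    delta = np.abs(frame.astype(float) - bg)
    return delta >= THRESHOLD

bg = np.zeros((4, 4))              # background model built from initial frames
bg = update_background(bg, np.zeros((4, 4)))  # a still scene leaves bg unchanged

still = np.zeros((4, 4))           # a frame identical to the background
moving = still.copy()
moving[1:3, 1:3] = 255             # a bright patch, standing in for the baby moving

print(motion_mask(bg, still).any())   # False: no motion detected
print(motion_mask(bg, moving).any())  # True: change exceeds the threshold
```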
This motion detection algorithm is the basis for the threshold triggered mode of the system. The system uses a while-true loop so the program can constantly iterate over frames and search for motion [15]. One of the most critical parameters in the detection algorithm is the threshold argument: an integer that sets the binary level at which the program defines motion as a sufficient change between the background frame and subsequent frames. If the threshold is set too low, the sensor could trigger off of shadows sweeping across the room, or off the baby merely rolling around.
The threshold level had its final adjustment when the system was integrated into the
package and the sensors were exposed to all the heat and noise of neighboring components
and the environment.
3.1.5 Event Detection Triggering Algorithm
Perhaps the most sizable difference between Ubityke and other products in the same
marketspace is the intelligence built into the system. Not only can all of the features of the
device be controlled from anywhere in the user’s home, but the device can be set to
TTM (threshold triggered mode), which allows the user to configure the device to take input from the baby and, based on a pre-configured setting, execute an action. For example, if mom or dad has had a 12-hour shift at the factory and needs to doze off for a bit, Ubityke can be configured to turn the spinning mobile on, or to play a lullaby, if the baby is uncomfortable enough to move consistently for a predetermined amount of time. This is not a license to shirk parental responsibilities, but it can be a game-changer when the parent needs sleep or wants to sleep train the baby.
The event triggering algorithm is fairly straightforward; the most sophisticated pieces of the system software are the motion detection algorithm and the web framework. The event triggering algorithm is merely a timer with some if-else statements embedded within the while-true logic of the motion detection loop. A brief synopsis of the algorithm is given below:
1. Determine the amount of continuous motion after which you want to soothe the baby with the mobile or lullabies, and call that the threshold time.
2. Once motion is detected, start a timer which is linked to the timestamp of the
frame with motion.
3. A variable called elapsed time will be created which is the difference between
the start of the timer and the current time.
4. If the elapsed time is greater than or equal to the threshold time, an event shall
be triggered until motion ceases, and the elapsed time is less than the threshold
time.
5. The algorithm continues to loop over the frames while detecting motion and
compares the elapsed time to the threshold time.
6. If there is motion for any time less than the threshold time, the timer shall
continue to run and compare the elapsed time to the threshold time without
triggering an event.
7. If motion is not detected, the timer will not start again until motion is detected
in the next group of frames.
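The steps above can be condensed into a small sketch. The class name EventTrigger and the injected clock are illustrative, not the team's actual code; the clock is injected so the logic can be exercised without waiting in real time:

```python
import time

class EventTrigger:
    """Fire an event once continuous motion lasts at least threshold_time seconds."""

    def __init__(self, threshold_time: float, clock=time.monotonic):
        self.threshold_time = threshold_time
        self.clock = clock
        self.start_time = None  # timestamp of the first frame with motion

    def update(self, motion_detected: bool) -> bool:
        if not motion_detected:
            self.start_time = None          # step 7: timer resets when motion stops
            return False
        if self.start_time is None:
            self.start_time = self.clock()  # step 2: motion starts the timer
        elapsed_time = self.clock() - self.start_time   # step 3
        return elapsed_time >= self.threshold_time      # steps 4-6

# Simulated clock that advances one "second" per frame
t = [0]
trigger = EventTrigger(threshold_time=3, clock=lambda: t[0])
for frame in range(5):
    t[0] = frame
    print(frame, trigger.update(motion_detected=True))
# frames 0-2 print False (elapsed < threshold); frames 3-4 print True (event fires)
```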
The algorithm was developed with the help of Professor Gerry Reed who was kind
enough to devote a series of Friday afternoons to help troubleshoot and provide
housekeeping guidance to the team during software development. The camera was
breadboarded, and LEDs were used in place of the servo. Once the algorithm worked and
the team was able to demonstrate that the LEDs would turn on and off according to the
threshold condition, the confidence of the team was high enough to start integrating the rest
of the components into the package. Since the triggered event is basically setting a GPIO
pin on the Raspberry Pi high, the LEDs theoretically could be replaced by the servo motor
since the data line on the servo just needs to have a condition written to it with a pulsewidth
argument. This was verified shortly after when the team integrated the system.
3.1.6 Functionality of Standalone Modules
Ubityke consists of individual devices integrated into a system to perform tasks given by
input from the environment. This system is modeled much like a human and makes use of
physical senses. The camera is the eyes of the system which can see in the dark if necessary.
The temperature and humidity sensor is the skin or touch sense. Audio inputs and outputs
are the ears and mouth of the system. The detection and triggering algorithms are what give the device a bit of intelligence. Each of these sensors and transducers has to be tested as an individual component before being integrated into the system. The third level of figure 3-2 gives an overview of how the individual devices are tested and screened for functionality.
During the proposal phase of the project, the team had some lofty ideas on how to
verify and validate components used in the system. The engineering requirements table
gives an ideal view as to how the components should be screened and validated before
placing them into the system. However, in the interest of time, practicality, and resources, some of the testing and verification methods were condensed or skipped altogether, given the team's confidence in the commercial maturity of the components and the engineering analysis and research done during the design phase. For example, the LEDs which make up the
nightlight could have been tested with a photodetector or radiometer to measure the output
flux. The manufacturer already performed this test and included it in the datasheet given
in Appendix E. What was more beneficial and practical was to characterize the devices
according to how they would be used in the system. For example, the current draw of the
devices was just as important if not more important than whether or not they were
performing according to the manufacturer's specifications. Device certifications and standards set forth by regulatory bodies made up a large portion of the team's confidence factor. We did, however, characterize the devices.
Figure 3-6: LED Current Draw
Figure 3-7: Servo Current Draw
The current draw of the individual devices served as input to the power budget, ensuring we wouldn't have inadvertent system shutdowns given the initial calculations performed when the team purchased the power supply. The camera is powered over a ribbon cable that carries both power and data signals, and nowhere on the Raspberry Pi schematic was there a defined test point where current could be measured, or derived from a voltage and an equivalent resistance. We had to rely on the
manufacturer’s specifications and tribal knowledge from the maker community to secure
our confidence level. The camera functionality was tested in both day and night
conditions and was found to provide an adequate field of view, frame rate, and resolution.
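The power-budget bookkeeping described above can be sketched as a simple summation. The per-device currents, the 3 A supply rating, and the 80% design margin below are illustrative placeholders, not the values measured in figures 3-6 and 3-7:

```python
# Sum per-device current draw and compare against the supply rating.
# All numeric values are illustrative assumptions, not measured data.
SUPPLY_RATING_A = 3.0   # e.g., a 5 V / 3 A supply
MARGIN = 0.8            # budget only 80% of the rating to avoid brown-outs

draw_a = {
    "raspberry_pi_and_camera": 1.2,
    "servo_stall": 0.7,
    "led_nightlight": 0.08,
    "temp_humidity_sensor": 0.003,
    "speaker_and_mic": 0.2,
}

total = sum(draw_a.values())
print(round(total, 3), total <= SUPPLY_RATING_A * MARGIN)  # 2.183 True -> within budget
```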
Besides testing the physical characteristics of the devices, Linux shell commands had to be developed in order to communicate with each device, both as a standalone component and together as a block. These commands may be found in Appendix B. The following section discusses how the individual modules were validated.
3.1.7 Validation of Standalone Modules
The reliable operating parameters of the individual components of which Ubityke is comprised, listed in the datasheets of Appendix E, were chosen specifically to work within the team's chosen system architecture. Validating these components at the device level would be costly and duplicative. Since the system has no hard deterministic requirements beyond its intended functionality, the design team chose a qualitative battery of tests that would provide a confidence level suitable to proceed with system integration.
As described earlier in section 3.1.6, each device was characterized according to
how it would behave in the system. After deciding that the components were in fact right
for our application, a sample size of runs was chosen which the team was comfortable with.
Statistical methods such as confidence intervals and probability densities would not be
valid with a sample size of one, therefore, the team trusted the manufacturing abilities of
the component vendors and derived the test plan shown in table 3-2.
Table 3-2: Hardware Validation Testing
The USB microphone, which failed testing, also proved the most difficult device to characterize. There was no real traceability to a datasheet from the original manufacturer, and most third-party sellers listed no dynamic range in any technical documentation, just physical dimensions. Additionally, the product got terrible reviews, but the design team had to make a hard cutover in the system architecture due to a bug in the Raspberry Pi operating system that wouldn't allow volume adjustment in the built-in DAC mixer, so this device had to serve as a placeholder in order to keep the system functionality consistent with the statement of work.
The rest of the modules had almost perfect scores during the validation phase due to
the high reliability and low risk of the technology used in the components. LEDs, simple
audio speakers, camera sensors, and plug and play USB devices have been around for
decades and manufacturing processes have become so robust, that some of the devices
aren’t even tested after manufacturing.
3.1.8 Module Compatibility with Software Interface
The modular environment we adopted when we started developing for Ubityke allowed us to test each feature independently; after a successful run, the feature would then be wrapped into our Flask application. This continued when developing the classes that control each feature. Four main classes were written to control the following features:
Component | Specification | Pass/Fail (≥90%) | Result | Trials
Camera (Daylight) | Record, playback, and stream video upon command with resolution adequate to support all OpenCV algorithms | Pass | 90% | 25
Camera (Dark) | Record, playback, and stream video upon command with resolution adequate to support all OpenCV algorithms | Pass | 90% | 25
Mobile (Servo) | Servo spins at the appropriate speed when activated | Pass | 100% | 25
Nightlight (LEDs) | LEDs illuminate and shut off upon command | Pass | 100% | 25
Audio-Out (Speaker) | Speaker delivers audio at the specified level upon command | Pass | 100% | 25
Audio-In (Mic) | Microphone captures audio in the immediate vicinity with minimal distortion | Fail | 76% | 25
Temp/Humidity | Temperature and humidity levels are read to within ±2°C | Pass | 100% | 25
Class Alert:
sendSMS(to): Function that takes a US phone number as the parameter and sends
a SMS text message that will be used to send an alert when the temperature
threshold is met.
Class ServoController:
servo_on(val): Function that will turn the servo on using val as the parameter
which is the pulse width modulation value.
servo_off(): Function that will turn the servo off with a pulse width modulation
value of 0.
Class Timer:
start(secs): Function that will start a timer that ends at secs seconds.
tick(): Function that checks if there is a timer running. Returns true if timer has
expired.
stop_timer(): Sets the variable timer_running to false.
Class Player:
player_on(path, duration): Function that first kills any instance of omxplayer in
case a lullaby is playing, and then initiates a player with the two parameters given,
full directory path to the song and the duration of the song in seconds.
player_off(): Kills any instance of the player.
These classes allowed the Ubityke team to debug efficiently because we tested each class in the main function of its own file and made sure it worked before implementing it in the software. This also let us instantiate objects from each class wherever they were needed. These classes are used both in the manual modes and during Threshold Triggered Mode (TTM).
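As an illustration, the Timer class described above might look like the sketch below. It is reconstructed from the function descriptions, not taken from the team's actual source:

```python
import time

class Timer:
    """Sketch of the Timer class: start(secs), tick(), stop_timer()."""

    def __init__(self):
        self.timer_running = False
        self._deadline = 0.0

    def start(self, secs: float) -> None:
        """Start a timer that ends secs seconds from now."""
        self._deadline = time.monotonic() + secs
        self.timer_running = True

    def tick(self) -> bool:
        """Return True if a timer is running and has expired."""
        return self.timer_running and time.monotonic() >= self._deadline

    def stop_timer(self) -> None:
        """Set timer_running to False."""
        self.timer_running = False

t = Timer()
t.start(0)        # expires immediately, for demonstration
print(t.tick())   # True
t.stop_timer()
print(t.tick())   # False
```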
Figure 3-8 traces the logic when Ubityke.py runs. Once it reaches the last stage, it renders the user interface with the values it has gathered and acts on the end user's commands, whether the features are controlled manually or through TTM.
Figure 3-8: Ubityke.py Flowchart
3.1.9 Module Stress Testing
Oftentimes during product development, devices undergo what is referred to as stress testing, lifetime testing, or accelerated lifetime testing. The purpose of these tests is to deterministically assess if and when the device will fail. It would be cumbersome and time-consuming to build a system, assemble it, and then rely on a live test case to validate the operation, much less the lifetime, of the system. Product testing and test engineering is a science in itself and is beyond the scope of this project, although a member of the Ubityke team has unique experience in the arena of product development and test engineering.
One primary concern with the device is CPU throttling causing latency, the delay of data transfer after a command for transmission has been executed. In the instance of Ubityke, the camera is the most critical sensor in the stack because it is the eyes of the system; the user has to have a reliable video feed at all times. If this feature fails, the system is not functioning as intended as a baby monitor. All devices have physical limitations, and if the Raspberry Pi CPU temperature exceeds 80°C, the processor will be throttled, causing latency. This directly affects the video feed because the processor not only has to process all of the camera frames during the motion detection routines, but also output to a web server using multi-threading. To mitigate this risk, a Raspberry Pi with a four-core CPU was chosen. Additionally, thermal putty was applied to the Raspberry Pi's ICs to draw out the heat and transfer it to a thermal mass. A stress test was
performed on a bare Raspberry Pi with no active or passive cooling, nor any other means of heat exchange. After ten solid minutes of constant motion, the CPU temperature exceeded 80°C (176°F). The fact that the IC doesn't melt is impressive. However, this is an undesirable result, and it revealed our first limitation of the device. If the device is used to sleep train a baby, allowing the caretakers to be out of the room for a period of time while the baby uses the distraction of the mobile to self-soothe, there is a hard limit on how long the baby can move constantly before the video feed is no longer reliable and real-time. Perhaps a more efficient algorithm or subroutine could be developed that would consume less processing power, but for the Ubityke team this appears to be a hard limit. The plot in figure 3-9 displays approximately 25,000 data points taken over the 10-minute stress test. A CPU temperature logging loop was inserted into the main routine to monitor and record the data while the motion detection algorithm constantly taxed the CPU.
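The logging loop reads the CPU temperature from the kernel's thermal interface; on a Raspberry Pi this is typically /sys/class/thermal/thermal_zone0/temp, which reports millidegrees Celsius. The parsing and throttle check can be sketched as follows (the sample reading is illustrative, not a value from the team's data set):

```python
THROTTLE_LIMIT_C = 80.0  # the Raspberry Pi throttles the CPU above this temperature

def parse_cpu_temp(raw: str) -> float:
    """Convert the kernel's millidegree-Celsius string to degrees Celsius."""
    return int(raw.strip()) / 1000.0

def is_throttling(temp_c: float) -> bool:
    return temp_c >= THROTTLE_LIMIT_C

# On a Pi, raw would come from open("/sys/class/thermal/thermal_zone0/temp").read()
sample = "81234\n"   # illustrative reading during constant motion
temp = parse_cpu_temp(sample)
print(temp, is_throttling(temp))  # 81.234 True -> the video feed begins to lag
```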
Figure 3-9: CPU Stress Test
3.2 Mechanical Design
The mechanical design was perhaps the most challenging portion of the project aside from the software development. Neither of us had any experience with designing in Solidworks, 3-D printing, or fixturing. Luckily, one of the Ubityke team members has worked in industry for quite some time and has been exposed to mechanical design principles, renderings, and mechanical dimensioning and tolerancing.
enough to employ the help of industry veteran Carlos Casteleiro, who is a multi-disciplined,
multi-patented, generous and humble engineer who gave freely of his time, Solidworks
license, and 3-D printer to teach us and help with the project. He told us he would only do
what we asked him to do regarding the design of the product. For example, if the team told
him we wanted a 3” diameter hole somewhere in the package, he would teach us how to
place it there in Solidworks. If we didn’t tell him to model all the components with all of
the wires and connectors inside the box, he wouldn’t.
The design kicked off with a meeting at Carlos’ shop. We sat down and showed
him our proposal and discussed how we envisioned the final product. He made a few
suggestions, but ultimately the team was tied to the components which were already chosen
and validated. We needed to make them fit into a quasi-small package that could be
mounted and adjusted. Carlos returned with a pair of analog Vernier calipers and told us
to start measuring critical dimensions.
3.2.1 Critical Dimensions
All of the components’ critical dimensions had to be measured in order to build a model in
Solidworks that would be useful and realistic when the time came to 3-D print the housing.
Most of the devices used in the system are patent and copyright protected, so finding existing 3-D models online to import into our design was difficult. A few devices had models online that could be imported into our model, but most did not. For those that did not, we measured all the physical dimensions and used them to create renderings in Solidworks.
Figure 3-10: Mechanical Design
Figure 3-11: Critical Dimensions
The diameter of the lens mount shown in figure 3-10 had a physical specification from the manufacturer; however, there are tolerances associated with all dimensions. No two pieces are typically made exactly the same, especially mass-produced parts such as the plastic molded M12 lens for the Raspberry Pi camera. If the team had just taken the lens-mount diameter from the datasheet and used it for the opening in the housing, and the part was at the high end of the tolerance limit, there is a good chance it wouldn't fit through the housing. This is why all critical dimensions on all parts must be measured before committing time and resources to 3-D printing. This is even more critical in designs involving metal: it's easier to modify plastic than stainless steel or aluminum.
3.2.2 Solid Models
Once all the components that were intended to fit inside the housing were measured, the
housing itself was next to be designed. The team tried to cover all precarious areas of
operation such as heat removal through ventilation, component mounting, and housing
mounting. We knew initially that the real value in the project was in the device itself and
not in a fancy floor stand to hold the device, so the team decided that the housing needed a
feature that would allow it to be bolted to a cylindrical rod that could be attached to a base
or a tripod. After several hours of deliberation, a final configuration was decided upon. The topics included which side of the housing should hold the speaker; how far the servo motor should protrude from the top of the housing so as not to interfere with the mobile carousel; the need to mount the temperature sensor on the outside of the housing so it would not be influenced by the heat inside the box; mounting the LEDs on top of the housing so their light would not need diffusing; how to run power to the device without the cable getting caught in the carousel; and a myriad of other topics.
Figure 3-12: Ubityke Housing
Figure 3-13: Ubityke Exploded View
Fortunately for the team, Carlos' 3-D printer can import Solidworks files and print them directly, without an intermediate .stl file, which can sometimes become corrupted or inaccurate on import. Additionally, the team saw some of the problems other students were having with the 3-D printer at Valencia and was grateful to have the resources we did.
3.2.3 3-D Printing
The housing for Ubityke was 3-D printed from a high-temperature SLA resin, a photopolymer that is in a liquid state when ejected from the print nozzle and cures into a solid when exposed to UV light [15]. The team essentially loaded the Solidworks file into the print queue and pressed go. This was no ordinary hacker-maker 3-D printer, but a $50,000 industrial-grade printer that prints the material on a printed support piece, which is then separated from the product in a bath of sodium hydroxide.
Figure 3-14: 3-D Printer
Figure 3-15: Finished Product
3.2.4 Assembly and Mounting Scheme
Most of the effort during the mechanical design phase went solely toward having all the components fit securely and function inside the housing. Very little, if any, attention was paid to the design from an assembly standpoint. Due to the compressed schedule, 'flying the plane while building the wing' as it is sometimes called in industry, the team relied on the tolerancing considerations used in the mechanical design, and on the fact that the housing could be modified if necessary, in order to finish the project on time.
During the design of the housing Carlos kept true to his word. He only helped us
model the components we asked him to. We made a mistake in not including the connectors
and cables during the initial mechanical design. This led to clearance problems and some
components not fitting in their intended locations as flush as we had hoped for. More of
this will be explained in detail later in the lessons learned section. After some modifications
to the housing and some creative soldering the components were eventually integrated into
the housing.
Figure 3-16: Nightlight Circuit Wiring
The LED circuit comprised four LEDs connected in parallel. To achieve this, all the anodes were wired together and all the cathodes were wired together, with common bias-voltage and ground lines. The lid of the housing allowed the LEDs to sit flush, which was aesthetically pleasing but posed problems during assembly. Because the bases of the LED domes sat flush with the top of the housing lid, the wire and heat shrink could not be fed through the holes, which were designed with enough clearance for the LED leads only. This forced the team to solder and heat-shrink the circuit in mid-air using a vise and spring clips.
The device is intended to be mounted to a floor stand. A floor stand in the simplest
sense is a pole fixed to a base. For testing and demonstration purposes a tripod was used
to provide an adequate base for the device and the 1” diameter pole extending from the
tripod was used to affix the device's mounting feature. The adjustment features were not
elegant, but they provided the functionality needed to position and aim the device
adequately.
Figure 3-17: Ubityke Mounting Scheme
- ¼-20 captive screw with wing-nut attachment provides pitch adjustment of the device.
- #6 threaded bolt and through-hole configuration allows for ≈ 270˚ yaw adjustment of the device.
- U-bracket with locking features provides height adjustment.
3.3 System Integration
After each module passed the validation and verification stages, the system was assembled
by fastening all of the components into the housing and connecting the inputs to their
respective GPIO pins on the Raspberry Pi. The system schematic and pin mapping
illustrate how all the modules were interconnected.
Figure 3-18: Ubityke-Raspberry Pi Interface
Figure 3-19: Ubityke Circuit Schematic
[Pin-out diagram: 40-pin Raspberry Pi GPIO header with connections labeled Servo Power, Servo Data, Servo Ground, LED Current, LED Ground, Temp/Hum Power, Temp/Hum Data, and Temp/Hum Ground.]
By following the pin-out and the circuit schematic, the team was able to connect
the device fairly easily. The team's lack of experience in mechanical design hindered the
assembly process, as the housing had to be modified to resolve some clearance issues. The
speaker had an odd shape: an orb with a sweeping radius that was difficult to measure, and
its opening did not offer sufficient clearance. Since the walls of the housing were only 1/8"
thick, the interface between the speaker and the housing wall was a line fit. The housing
had to be opened up to let the speaker fit, and the remaining gap was filled with an
encapsulant. Aside from the speaker clearance issue, the rest of the integration went
smoothly.
Figure 3-20: Integrated System
Once all the devices were properly seated and connected within the system, the
team attached it to the tripod, verified the functionality of all the modules, and ran some
quick field-of-view tests to ensure that the camera lens was in the correct position.
Equations 3.1 and 3.2 were used to verify the camera's field of view, giving the team the
confidence needed to proceed with top-level system functionality, validation, and testing.
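Equations 3.1 and 3.2 are referenced above; the standard pinhole relation on which such field-of-view checks are typically based can be sketched as follows (the sensor dimension and focal length in the example are illustrative assumptions, not the measured values from the test):

```python
import math

def fov_deg(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """Angular field of view (degrees) for a pinhole model:
    FOV = 2 * atan(d / (2 * f)), with sensor dimension d and focal length f."""
    return math.degrees(2 * math.atan(sensor_dim_mm / (2 * focal_length_mm)))

# Illustrative only: a 3.6 mm sensor width behind a 1.8 mm lens gives roughly a
# 90-degree horizontal field of view (assumed numbers, not the report's values).
horizontal_fov = fov_deg(3.6, 1.8)
```

Comparing the computed angle against the span the camera actually covers at a known distance is a quick sanity check of the lens position.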
3.3.1 System Functionality
The user interface was designed around the video feed since the safety of the baby is the
number 1 priority. In order for the user interface to be fluid, every time someone clicked a
button to control a feature instead of having to load the whole page we would ideally want
to click the button and activate the action right away without having to refresh the web page
and interrupt the fluidity of controlling the different features. We used AJAX, or
Asynchronous JavaScript and XML to deliver the request to the webserver via the button
without the need to refresh the whole page by sending the URL GET request and getting a
response automatically just by clicking the button. When you click the button Turn Servo
ON, it sends an AJAX request through the URL via a GET Method which sends the request
to our Flask webserver which calls for the servo_on() function in the ServoController Class
which causes the servo to rotate. This action is illustrated in Figure 3-21.
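A minimal sketch of this round trip, assuming a Flask app and a stand-in ServoController (the route path, return payload, and controller internals are assumptions; only the class and function names follow the report):

```python
from flask import Flask

app = Flask(__name__)

class ServoController:
    """Stand-in for the report's servo wrapper; the real version drives a PWM pin."""
    def __init__(self):
        self.running = False

    def servo_on(self):
        self.running = True  # real implementation would start the servo rotating

servo = ServoController()

@app.route("/servo/on")  # the AJAX GET request lands here without a page reload
def turn_servo_on():
    servo.servo_on()
    return {"servo": "on"}  # small JSON reply for the page's JavaScript callback
```

On the page, the Turn Servo ON button would issue the request with something like `fetch('/servo/on')`, leaving the video feed untouched.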
Figure 3-21: System Functionality when Turning Servo On
3.3.2 System Validation
System validation followed the same course of action as in 3.1.7, with each module tested
while integrated into the system. All use cases were analyzed and run according to the test
plan defined in our outline and measured against all functional parameters. The same
methods used to characterize the system were used to qualify it. Functionality and
repeatability were treated as go/no-go events.
3.3.3 System Compatibility with Software Interface
The versatility of a Linux operating system allowed us to interact with different libraries
and programs that were either pre-installed or installed through the command line in
Raspbian OS. The Raspberry Pi supports a multitude of operating systems, but for Ubityke
we needed one that offered a command line, a proper file system capable of hosting a web
server, and compatibility with all of Ubityke's features. Raspbian is a Debian-based
operating system, and it handled all the features without a problem. Figure 3-22 shows the
configuration of the operating system that the Raspberry Pi was running during
development and testing of Ubityke.
Figure 3-22: System Compatibility – Raspbian OS
This configuration allowed us to run Python 3.7.3, which was compatible with all
the libraries and frameworks we had to install for the features to function. It even allowed
us to run heavy frameworks such as an Apache server with PHP and MySQL, the stack
originally discussed during the proposal period of the project. During the first weeks of
development, however, the team realized there was a problem: Apache with PHP and
MySQL is meant for resource-heavy applications and requires permissions that created
compatibility problems between our software and hardware. After more research, the team
discovered a more efficient solution: Flask with SQLite3.
Apache was going to be our web framework, PHP our software-hardware bridge,
and MySQL our database management system. These three frameworks were replaced by
two that handle both the web framework and the software-hardware bridge, since they are
Python libraries, making the process more efficient while keeping the system compatible.
A top reason developers choose Flask and SQLite3 is how lightweight they are; given our
limited resources and the problems we were having with PHP, this was exactly the solution
we were looking for. After implementing it, we were able to integrate the hardware and
software seamlessly through the web user interface.
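The replacement stack needs no separate database server; the logging side can be sketched with the standard-library sqlite3 module (the table and column names here are assumptions, not necessarily those used in logDHT.py):

```python
import sqlite3
from datetime import datetime

def init_db(path=":memory:"):
    """Open the database and create the readings table if it does not exist."""
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS readings (
                        logged_at   TEXT,
                        temperature REAL,
                        humidity    REAL)""")
    return conn

def log_reading(conn, temperature, humidity, when=None):
    """Insert one sensor sample, timestamped so readings can be ordered by date."""
    when = when or datetime.now().isoformat(timespec="seconds")
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)",
                 (when, temperature, humidity))
    conn.commit()

def latest_reading(conn):
    """Return the most recent (timestamp, temperature, humidity) row."""
    return conn.execute(
        "SELECT logged_at, temperature, humidity FROM readings "
        "ORDER BY logged_at DESC LIMIT 1").fetchone()
```

Because SQLite stores the database in a single file, the Flask process and the logging script can share it without running any additional service.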
3.3.4 System Stress Testing
After integrating all of the components into the housing a similar battery of stress tests was
conducted as in section 3.1.9. The results were on par with what was discovered when
agitating all of the components during the module stress testing. Most of the components
were large-scale, high-reliability components such as LEDs and servo motors whose mean
time between failures is exceedingly high. Even though the Raspberry Pi was inside the
housing, the ICs still had thermal putty between themselves and the aluminum heat sink.
The ventilation provided a path for the heat to escape but did not mitigate the threat of the
CPU temperature running away. The same latency was seen when viewing the camera
output in the web browser. According to the Raspberry Pi documentation, once the CPU
temperature reaches the throttling point, the device protects itself by dropping processes
from each of the four cores [3].
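On Raspbian, the CPU temperature is exposed in millidegrees via /sys/class/thermal/thermal_zone0/temp; a small helper for watching the throttling point might look like the sketch below (the 80 °C limit is the Pi 4's documented soft throttle point, used here as an assumption):

```python
THERMAL_FILE = "/sys/class/thermal/thermal_zone0/temp"  # value in millidegrees C

def parse_millideg(text: str) -> float:
    """Convert a sysfs reading such as '55017' (plus newline) to degrees Celsius."""
    return int(text.strip()) / 1000.0

def is_throttling_risk(temp_c: float, limit_c: float = 80.0) -> bool:
    """True when the CPU is at or above the assumed soft-throttle temperature."""
    return temp_c >= limit_c

def read_cpu_temp(path: str = THERMAL_FILE) -> float:
    """Read the current CPU temperature from sysfs (Raspberry Pi / Linux only)."""
    with open(path) as f:
        return parse_millideg(f.read())
```

Polling this value during a stress run makes it easy to correlate observed latency with approaches to the throttle point.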
The LEDs and the servo were toggled 25 times successively, just as they were in the
hardware validation testing. The temperature sensor was constantly reading and logging
data, so confidence in this device was already high going into system validation. Calling
the lullabies to be played through the speaker was not a high-risk feature, so the 100%
passing mark was not surprising. Only the microphone, for which the team did not have
time to fully develop the software wrapper, failed the testing. The device's quality was
poor as a standalone module, and its performance only degraded when placed into the
housing, even when tested using Linux shell commands.
3.4 System Validation and Testing
Testing module by module is one thing, but integrating everything into one system and
verifying that each module behaves properly was certainly a challenge. After months of
developing and integrating each module, the team created a table (Table 3-3) to quantify
the effectiveness of each feature.
Table 3-3: System Validation Testing

Component | Specification | Pass/Fail (≥ 90%) | Result | Trials
User Interface (PC) | User interface fully loads in desktop view; all modules are visible and the end user can activate any of them | Pass | 100% | 25
User Interface (Mobile) | User interface fully loads in mobile view; all modules are visible and the end user can activate any of them | Pass | 100% | 25
Web Framework | Flask is enabled and can render index.html (UI) via the IP address given to the Raspberry Pi | Pass | 100% | 25
Data Logging | SQLite3 database table can store and retrieve temperature and humidity values organized by date | Pass | 100% | 25
Toggle Mode | End user is able to toggle each feature inside the user interface | Pass | 100% | 25
Threshold Triggered Mode | When enabled via the user interface, the mobile is triggered automatically when constant motion is detected for 10 seconds | Pass | 96% | 25

An important feature that was promised was the ability to operate the different
features from different systems. The code was written so that it does not matter what type
of device requests our user interface; the page wraps around the size of the device and
automatically adjusts the different modules in the interface. The user interface was tested
with different operating systems such as Windows, macOS, iOS, iPadOS, and Android.
After several trials of switching between the different environments and testing the user
interface, it worked in every one of them. To initialize the user interface, the files
Ubityke.py and logDHT.py must run simultaneously. The first initializes the user interface
and the latter is the script that logs the sensor data in a database. A bash script, named
runme, was created to start both Python scripts at the same time and open a web browser
page with the user interface; it was used to initialize Ubityke for all testing and real-life
purposes.
To test the web framework, we ran the runme script; if the user interface was
presented, the web framework had successfully deployed and rendered index.html. After
numerous successful trials it was given a 100% Pass result. For the data logging to receive
its score, the SQLite3 framework is started by the
runme file, through the logDHT.py script, which makes the connection with the hardware
sensors and logs the data in the database table. We tested this in two ways:
1. When the user interface loaded, the temperature and humidity could be viewed,
with the last sensor reading shown right under them.
2. The command line was used to enter SQLite3, load the database, and view the
tables containing the last sensor reading.
Comparing the current date and time with the last sensor reading allowed us to verify that
the script was successfully storing current data. The following three trials were set up to
get an average time difference between the current time and the last sensor reading.
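The comparison in each trial reduces to a timestamp subtraction; a sketch using the stdlib datetime module (the 12-hour time format shown in the trial figures is assumed):

```python
from datetime import datetime

TIME_FMT = "%I:%M:%S %p"  # 12-hour format, e.g. "5:27:00 PM"

def delta_minutes(last_reading: str, current: str) -> float:
    """Minutes elapsed between the last logged sensor reading and the current
    time (both within the same day, as in the trials)."""
    delta = (datetime.strptime(current, TIME_FMT)
             - datetime.strptime(last_reading, TIME_FMT))
    return delta.total_seconds() / 60.0
```

For instance, delta_minutes("5:27:00 PM", "5:30:00 PM") gives 3.0 minutes, the first data point of Trial 1.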
Figure 3-23: Trial 1 Time Delta Between Actual and Sensor Readings
[Bar chart: time differences of 3, 5, 2, and 8 minutes between the last sensor reading and the current time at four sample points (5:30 PM, 5:45 PM, 8:30 PM, and 9:00 PM).]
Figure 3-24: Trial 2 Time Delta Between Actual and Sensor Readings
[Bar chart: time differences of 1, 5, 5, and 6 minutes between the last sensor reading and the current time at four sample points (1:30 PM, 2:30 PM, 5:30 PM, and 7:00 PM).]
Figure 3-25: Trial 3 Time Delta Between Actual and Sensor Readings
[Bar chart: time differences of 3, 10, 7, and 8 minutes between the last sensor reading and the current time at four sample points (10:00 AM, 10:45 AM, 11:30 AM, and 1:00 PM).]
The average of all three trials was a difference of 5.25 minutes. Since this falls within our
definition of accurate data, it passes the test with 100% accuracy.
3.4.1 Daytime Use Cases
As expected, the daytime use cases executed without any trouble. The camera was able to
capture the correct field of view and detect movements. The algorithms ran smoothly:
OpenCV detected the motion, and when we triggered TTM mode, the correct behavior
followed. To prove the consistency of the algorithm the team developed, we ran 25 trials
to see whether TTM mode would continue to work. It failed only once, for a 96% success
rate.
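The TTM behavior can be illustrated with a simplified running-average motion trigger; this NumPy sketch is a stand-in for the report's OpenCV pipeline (the weight and threshold values are arbitrary assumptions, and real frames would come from the camera):

```python
import numpy as np

class MotionTrigger:
    """Fires after `hold_frames` consecutive frames contain motion (TTM-style)."""

    def __init__(self, alpha=0.1, pixel_thresh=25, area_thresh=50, hold_frames=10):
        self.alpha = alpha                # running-average weight for the background
        self.pixel_thresh = pixel_thresh  # per-pixel difference counted as motion
        self.area_thresh = area_thresh    # moving pixels needed to call a frame "motion"
        self.hold_frames = hold_frames    # consecutive motion frames before triggering
        self.background = None
        self.streak = 0

    def update(self, frame):
        """Feed one grayscale frame; return True when the trigger fires."""
        frame = frame.astype(np.float32)
        if self.background is None:
            self.background = frame.copy()
        moving = np.abs(frame - self.background) > self.pixel_thresh
        # Fold the new frame into the background (cv2.accumulateWeighted analogue).
        self.background = (1 - self.alpha) * self.background + self.alpha * frame
        self.streak = self.streak + 1 if int(moving.sum()) >= self.area_thresh else 0
        return self.streak >= self.hold_frames
```

At an assumed 25 frames per second, hold_frames would be set near 250 to approximate the 10-second constant-motion condition in Table 3-3.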
3.4.2 Low-Light Use Cases
During the proposal, the team wanted to include a way to monitor the baby even when the
room is completely dark. With the hardware we chose and its integration with the software,
we were able to deploy a Night Vision feature that engages automatically when the room
becomes dark; the night vision filter kicks in and the camera view remains clearly visible.
Figure 3-26 shows the mobile user interface with night vision on.
Figure 3-26: Night Vision with Mobile Interface
With night vision we duplicated the same trials as the daytime cases and again achieved
96% accuracy, with 24 of the 25 trials passing.
3.4.3 Edge Cases
There was significant risk associated with some of the edge cases pertaining to the motion
detection algorithm used in triggering different events. One concern the team had which
Professor Notash alluded to was the case where another moving object occupied the same
frame as the region of interest. Consider the situation where the camera’s horizontal field
is wider than the crib, and the family dog comes into the room and starts sniffing the crib.
Would the algorithm focus on the object which had more contours throughout the
accumulated frames? Or perhaps the overall area or bounding box of the motion contained
within the entire camera frame would dictate priority. The Ubityke team got lucky during
a round of testing and discovered a portion of the answer. During the stress testing of the
CPU, it was discovered while waving at the camera to constantly activate the motion
detection algorithm that the bounding box drawn around the moving object was drawn
around the hand that was closest to the camera. This is by no means a quantitative result,
merely an observation. The best hypothesis the team could formulate was that the intensity
of the light reflected off of the hand closest to the camera triggered the motion detection
algorithm.
Without fully understanding how the OV5647 sensor sets the clip-level threshold
for detecting varying intensities, the team’s hypothesis remained untested. Similar cameras
with sensor arrays such as the OV5647 have a standard intensity threshold for each pixel,
and the intensities are binned according to how the software prioritizes values for each
frame. Though fascinating, the time needed to fully research and digest this was beyond
the scope of the project. However, different objects with varying reflectivity were held at
a distance behind the object closest to the camera, and the motion detection algorithm
behaved the same way. A piece of white paper was held at a distance behind the waving
hand in the camera frame and the motion detection algorithm still drew a bounding box
around the hand even though a piece of white paper is much more reflective than human
skin in the visible and near-infrared wavelengths at close distance.
3.5 Lessons Learned
Throughout the design and project phases of this journey, the team gained valuable
experience and insight into many branches of engineering beyond their own respective
fields of study. Not only was the sheer technical aspect of the project difficult, but the
complexity of a compressed timeline, coupled with learning how to work as a team to
deliver a project on deadline, forced the team to grow both technically and emotionally.
The team had to learn to rely on each other to meet deadlines and close out action items.
The journey was not easy by any means, but the end result was far more rewarding than
anything either of the team members could have hoped for. Hours could be spent dissecting
all that was learned during each phase of the project, but the most heavily weighted lessons
are captured in Table 3-4.
Table 3-4: Lessons Learned

Project Discipline | Description | Payoff
Mechanical | Always model products with all connectors and cables | Clearance issues and mechanical obstructions alleviated; easier assembly and fewer modifications
Electrical | Spend more time researching communication protocol compatibility with different hardware during the design phase | Code base and device communication are less complex due to common component libraries and package handlers
Software | Place more emphasis on knowing the functionality and limitations of methods before choosing hardware components | Knowing exactly what the code is supposed to accomplish should drive the hardware selection, allowing for more robust and efficient code
Software | Understand to what magnitude different algorithms and routines consume CPU resources | Processes could have been offloaded to the GPU, or multithreading could save on compute budget, reducing the risk of latency
Planning | Have backup hardware with the same communication protocol in case the first-choice component fails | A different USB microphone could have been integrated without detracting from the overall functionality of the system
Project Management | Know how individuals like to work and play to their strengths to bring out the best in team members | Less micromanagement and more open-mindedness lends itself to a healthier, more collaborative, and more productive work environment

Understanding how each of the devices would be wrapped in the main program
would have been helpful at the onset of the project phase. Perhaps during the proposal
phase, more research could have been done on how easy it would be to run each of the
devices in Python 3. Most of the devices used in the system are typically used individually
with a Raspberry Pi. Simple Linux shell scripts will suffice for DIY or hacker projects
with a single component or threshold-triggered event. In the case of Ubityke, where there
were multiple dependencies at different hardware and software
layers, it would have behooved the team to read deeper into forums about device
compatibility with Python 3 and the Raspberry Pi. The team could have avoided the
microphone debacle had we known from the start that the DAC mixer was broken in the
OS. Additionally, a separate Raspberry Pi or other microprocessor could have been used
in parallel to share the burden of pushing parsed camera frames over the web interface.
The motion detection algorithm could also have been modified to average fewer frames
when building the background model. Perhaps even a different detection method, such as
an object tracker, could have been used as a trigger.
Despite all the challenges, the team is proud of their work and lessons learned
through this process. One of the greatest lessons of all is that nothing is impossible, and
anything within reason can be accomplished with the right combination of research,
analysis, planning, and time.
Chapter 4
Non-Technical Aspects
4.1 Environmental, Health, and Safety Concerns
4.2 Ethical Concerns
4.3 Social Concerns
4.4 Economic Impact
4.5 Budget
Summary
This chapter outlines basic non-technical aspects of the project such as
environmental, health, and safety concerns along with social and ethical
considerations. Economic impact of the product is discussed with regards to
affordability across the socio-economic spectrum. Finally, a bill of materials is
presented for the project budget.
4.1 Environmental, Health, and Safety Concerns
The Ubityke team, as engineers, product designers, and fellows trying to enrich and
improve the lives of those around them, must be aware of several non-technical aspects of
designing and building a product. The health, safety, and welfare of the population are
interwoven in the design and deployment of any product. Certain regulatory bodies enact
policy that serves the interests of those whose lives will be impacted by a given consumer
product.
All of the materials used in the product must be safe, and free of harmful
chemicals. If any hazardous chemicals exist anywhere in the manufacturing or packaging
of the product, there must be sufficient warning on the product to comply with the
governing body that oversees that particular consumer product. Additionally, when
building and testing the product, care must be taken when disposing of hazardous
materials in the testing and manufacturing process. Solvents, used batteries, oily wastes,
and used epoxies must be disposed of properly according to environmental standards.
Finally, the design of the product must incorporate the highest safety standards to
ensure that the device is structurally sound, and when mounted, the integrity of the device
isn’t compromised. This risk can be partially mitigated in the design process by choosing
a sound mechanical design with the proper hardware and mounting techniques.
4.2 Ethical Concerns
When designing, building, testing, and documenting Ubityke, the team shall be mindful of
ethical concerns. These include, but are not limited to: proper documentation throughout
the entirety of the project giving credit to sources wherever possible, having integrity and
upholding canons from any inclusive societies related to the project, being honest and
forthright in all aspects of time and material contributions to the project, as well as giving
credit to any individuals who help in the deployment of the product.
4.3 Social Concerns
Ubityke has the potential to be a commercial product. As such, close attention must be
paid to the social implications of having a product such as this on the market. Since the
device will have to connect to a web server via Wi-Fi, and perhaps store camera frames
and personal information on that server, precautions have to be taken to ensure that if the
product is deployed outside of a test environment, security protocols are followed to
defend against packet sniffing software, port-jacking, and other cyber-security attacks.
Additionally, personal information, such as logon and geolocation information, must be
safeguarded. Cell phones in particular are subject to attack, since they are granted access
to the web server to send data to and receive data from the device.
4.4 Economic Impact
One of the primary drivers for the versatility and ease-of-use of Ubityke is the economic
impact the device could have on the public. Typically, devices such as these are a couple
of hundred dollars. This price point could make these devices cost-prohibitive for certain
socio-economic portions of the general public. Consider a family who receives subsidies,
in whatever form those may take. The primary concern of these families is to ensure the
safety and development of the child. If a family can barely afford to pay their bills and
provide food and clothing for their loved ones, a device like this could allow a single parent
working more than one job to get more sleep, with the peace of mind of knowing that the
baby can be soothed at 3:00 in the morning if it wakes up. Just a few extra hours of sleep
a night has the potential to recalibrate someone's whole outlook on life.
Since Ubityke in the prototype phase is near the price point of several competitive
devices, the commercialization of the product could allow for the price of the components
and materials to be driven down to the point that the device could cost a few dollars to
manufacture. Separate modules could be integrated into a SoC architecture using
firmware methods to operate the device rather than hardware.
4.5 Budget
After researching the project and the different options available to deliver the final
product, the design was divided up into blocks to perform different functions. Those
blocks were either individual pieces of hardware, commercial off the shelf components,
or integrated modules that lent themselves to compatibility with the Raspberry Pi. An
initial bill of materials fell out of this analysis and is displayed in Table 4-1.
Table 4-1: Proposed Bill of Materials
Qty. | Part Name | Vendor Part Number | Description | Vendor | Cost | Extended Cost
1 | Raspberry Pi 4 v.2 | Raspberry Pi 4 Model B/4GB | Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz | PiShop.us | $55.00 | $55.00
1 | Power Supply | n/a | Raspberry Pi 15.3W USB-C power supply | PiShop.us | $9.99 | $9.99
1 | Heatsink/Fan Module | n/a | Heat sink, single cooling fan, RAM heatsink set | amazon.com | $9.99 | $9.99
1 | Camera | B0154 | Arducam NOIR 8MP Sony IMX219 camera module with motorized IR-cut filter, M12 mount, LS1820 lens for Raspberry Pi | arducam.com | $64.99 | $64.99
1 | Lens | 64-106 | 1.9mm FL, no IR-cut filter, f/2, micro video lens | edmundoptics.com | $49.50 | $49.50
1 | Temp/Humidity Sensor | DHT22 | Digital temperature and humidity sensor | adafruit.com | $9.95 | $9.95
1 | Microphone | 107990055 | 6-Mic Circular Array Kit for Raspberry Pi | seeedstudio.com | $39.90 | $39.90
1 | Audio Amplifier | ADA3346 | I2S 3W stereo speaker bonnet | thepihut.com | $16.14 | $16.14
1 | Speaker | ADA1314 | 3" dia., 3W, 4Ω speaker | thepihut.com | $2.58 | $2.58
1 | Servo Driver | PCA9685 | 16-channel, 12-bit PWM/servo driver, I2C interface | adafruit.com | $14.95 | $14.95
1 | Servo Motor | FS5103R | Continuous rotation servo motor | adafruit.com | $11.95 | $11.95
1 | Nightlight LED | COM-00531 | 3V 20mA LED | sparkfun.com | $0.95 | $0.95
1 | Misc. Hardware | n/a | n/a | n/a | $10.00 | $10.00
1 | 3-D Printing Supplies | PLA-1.75-C-B | ZIRO 3D printer filament, carbon fiber PLA, 1.75mm, 0.8 kg spool, black | amazon.com | $27.99 | $27.99
Total: $323.88
The initial bill of materials often does not reflect the final product configuration. Items can
get damaged and scrapped, and improperly specified components can lead to purchasing
the incorrect part. The next two tables illustrate the differences between the actual cost of
goods consumed and the cost of goods used in the final product configuration, if one were
to purchase the components and build the device themselves.
Table 4-2: Actual Material Consumed
Qty. | Part Name | Vendor Part Number | Description | Vendor | Cost | Extended Cost
2 | Raspberry Pi 4 v.2 | Raspberry Pi 4 Model B/4GB | Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz | PiShop.us | $55.00 | $110.00
1 | Power Supply | n/a | Raspberry Pi 15.3W USB-C power supply | PiShop.us | $9.99 | $9.99
1 | Heatsink/Fan Module | n/a | Heat sink, single cooling fan, RAM heatsink set | amazon.com | $9.99 | $9.99
1 | Camera | B0154 | Arducam NOIR 8MP Sony IMX219 camera module with motorized IR-cut filter, M12 mount, LS1820 lens for Raspberry Pi | arducam.com | $64.99 | $64.99
1 | Camera | B0151 | Arducam NOIR IR-Cut 5MP OV5647 camera module with motorized IR-cut filter, M12 mount, f/1.8 lens for Raspberry Pi | arducam.com | $25.99 | $25.99
1 | Speaker | 201120-Gsa109070T | 3W mini stereo laptop speaker with 3.5mm audio jack | amazon.com | $9.99 | $9.99
1 | Temp/Humidity Sensor | DHT22 | Digital temperature and humidity sensor | adafruit.com | $9.95 | $9.95
1 | Microphone | 107990055 | 6-Mic Circular Array Kit for Raspberry Pi | seeedstudio.com | $39.90 | $39.90
1 | Audio Amplifier | ADA3346 | I2S 3W stereo speaker bonnet | thepihut.com | $16.14 | $16.14
1 | Speaker | ADA1314 | 3" dia., 3W, 4Ω speaker | thepihut.com | $2.58 | $2.58
1 | Microphone | n/a | Driverless mini USB microphone | sunfounder.com | $6.99 | $6.99
1 | Servo Driver | PCA9685 | 16-channel, 12-bit PWM/servo driver, I2C interface | adafruit.com | $14.95 | $14.95
1 | Servo Motor | FS5103R | Continuous rotation servo motor | adafruit.com | $11.95 | $11.95
1 | 3-D Printing Supplies/Hardware | PLA-1.75-C-B | ZIRO 3D printer filament, PLA, 1.75mm, 0.8 kg spool, blue | amazon.com | $30.00 | $30.00
10 | Nightlight LED | COM-00531 | 3V 20mA LED | sparkfun.com | $0.95 | $9.95
Total: $373.36
Table 4-3: Final Configuration Bill of Materials
Qty. | Part Name | Vendor Part Number | Description | Vendor | Cost | Extended Cost
1 | Raspberry Pi 4 v.2 | Raspberry Pi 4 Model B/4GB | Broadcom BCM2711, quad-core Cortex-A72 (ARM v8) 64-bit SoC @ 1.5 GHz | PiShop.us | $55.00 | $55.00
1 | Power Supply | n/a | Raspberry Pi 15.3W USB-C power supply | PiShop.us | $9.99 | $9.99
1 | Heatsink Module | 15733 | Heat sink, single cooling fan, RAM heatsink set | sparkfun.com | $9.99 | $9.99
1 | Camera | B0151 | Arducam NOIR IR-Cut 5MP OV5647 camera module with motorized IR-cut filter, M12 mount, f/1.8 lens for Raspberry Pi | arducam.com | $25.99 | $25.99
1 | Speaker | 201120-Gsa109070T | 3W mini stereo laptop speaker with 3.5mm audio jack | amazon.com | $9.99 | $9.99
1 | Temp/Humidity Sensor | DHT22 | Digital temperature and humidity sensor | adafruit.com | $9.95 | $9.95
1 | Microphone | n/a | Driverless mini USB microphone | sunfounder.com | $6.99 | $6.99
1 | Servo Motor | FS5103R | Continuous rotation servo motor | adafruit.com | $11.95 | $11.95
1 | Miscellaneous Hardware | n/a | Screws, nuts, washers | amazon.com | $30.00 | $30.00
10 | Nightlight LED | COM-00531 | 3V 20mA LED | sparkfun.com | $0.95 | $9.95
Total: $179.80
There was quite a bit of disparity between the proposed bill of material, the actual
bill of material, and the final product configuration bill of material as tables 4-1 through
4-3 have illustrated. The main reasons for this were the steep learning curve during the
project after the modules were validated, a broken DAC driver in the latest Raspbian
distribution which rendered our I2S devices (microphone array, audio amplifier, and
speakers) useless, and the extra Raspberry Pi purchase so both team members could code
and troubleshoot in parallel. Additionally, the camera module which was originally
purchased was an 8MP camera without IR LEDs. The LED modules which were
compatible with the 5MP version of the Raspberry Pi were sized differently and therefore
weren’t compatible with our original purchase. Luckily, a second camera was donated to
the team, which came with the IR LEDs as part of the package. The servo driver was also
not needed: during the research portion of the project, the team was unaware that a software
PWM signal could be generated to drive the servo motor without an external driver, which
would have been redundant and would have eaten into the power and compute budgets.
As in most R&D projects, it is highly improbable to predict every detractor from the
intended outcome. The Ubityke team members have firsthand, real-world experience with
industry projects running over budget in labor, materials, or both. The fact that the
proposed budget came within 15% of the actual goods consumed is a win in the eyes of
the design team. Perhaps the biggest highlight is that the final cost of the prototype comes
in at a competitive price compared to the rest of the devices researched and tabulated in
Table 1-1.
Chapter 5
Summary and Conclusion
5.1 Summary and Conclusions
5.2 Suggestions for Future Improvements
Summary
This chapter summarizes, in a general way, how the design informed the end
product and the steps and engineering practices that ultimately led to a final
product configuration. Finally, a theoretical dialogue of future improvements to
the product is introduced.
5.1 Summary and Conclusions
The Ubityke interactive baby monitor concept, presented in its infancy during the
Fall 2019 Senior Design Proposal session, was brought to fruition through careful
design, analysis, project planning, execution, and teamwork. The project turned out
orders of magnitude better than the team could have hoped for. There was a time of
reckoning at the beginning of the semester, while the team was learning how to format
the SD card to load the operating system onto the Raspberry Pi, when the full weight
of what we had signed up for hit us. Both team members were forced outside of their respective
comfort zones and were required to stretch their minds and learn new skills in order to
deliver on the project. There were optics people learning how to program from the Linux
terminal and computer engineers doing mechanical design work. The experience was
spectacularly transformative, at times frustrating, but in the end overwhelmingly
rewarding.
The main features of Ubityke were specified as day and night video monitoring of
the baby, a nightlight, customizable lullabies, temperature and humidity sensing, a servo-
actuated mobile, and a microphone to monitor the environment; all provided through a
sleek user interface which could be accessed through a PC or smartphone. In addition to
the main features, a subset of features lent itself to a machine-learning platform which has
the potential to be used for sleep training methods and health analytics. At the simplest
level, the user could toggle all of the features and carve out more time for themselves by
entertaining and monitoring the baby. This is not a license to shirk parental
responsibilities, but perhaps one would like to hit the snooze button one more time in the
morning, or finish a conversation with their loved ones, without having to go back into the
baby's room to wind the mobile up every time it stops.
At a more sophisticated level, the motion detection and event triggering
algorithms could possibly provide the basis for sleep training the baby. An additional
feature was added to the system mid-semester, following a meeting
with our advisor, Professor Notash. He suggested that the temperature and humidity
sensor serve as the input to an alert system. The Ubityke team responded to
the request with a gas-gauge dashboard embedded in the user interface that indicated
when the temperature and/or humidity was too high. Furthermore, our software engineer,
Alejandro Neira, took the request a step further and created a real-time alert system that
sent an SMS message to the user’s phone when the temperature and humidity conditions
exceeded the threshold.
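The decision behind that alert is a one-line threshold check after a Celsius-to-Fahrenheit conversion. A minimal sketch (the 70 °F default matches the tempThresh constant in the Appendix C source; the function name is ours, not the project's):

```python
def check_temp_alert(temp_c, thresh_f=70.0):
    """Convert a DHT22 Celsius reading to Fahrenheit and decide
    whether an over-temperature alert should fire.

    Returns (temp_f, alert) where alert is True when the reading
    exceeds the Fahrenheit threshold.
    """
    temp_f = temp_c * 1.8 + 32.0
    return temp_f, temp_f > thresh_f
```

In the prototype, a True result is what triggers the SMS path through the Alert class shown in Appendix C.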
The biggest challenge was undoubtedly the broken DAC mixer in the Linux
distribution which rendered our I2S devices useless. The intent was to have an audio
amplifier and a condenser microphone array using the I2S bus to transmit and receive
audio over the web through the Raspberry Pi. After several hours of troubleshooting and
research on forums and vendor websites, it was concluded that the team had to find
another way to accomplish this goal. The solution wasn’t as elegant, but it was functional
and saved on power and compute resources in the long run. The only drawback was that the
microphone had to be a driverless USB microphone with poor audio quality. The dynamic
range was terrible, and there was no documentation or support for the device. The team
was too far into the project to do a hard system design cut-over, so it did the best it
could by using a loop to record and then play back the audio after a predetermined
amount of time. The feature worked from the Linux shell but could not be wrapped into
the main code block. Debugging the rest of the features and building a
main program which wrapped all of the features including the algorithms ate up the rest
of the development time. The COVID-19 outbreak didn’t help facilitate the product
development either as social distancing measures forced the team to work remotely on the
project without the resources of the campus laboratories. Luckily the team had the basics
at home like digital volt-ohm meters, soldering iron, and other tools.
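The record-then-play workaround described above can be sketched as a thin wrapper around the shell commands listed in Appendix B. This is a sketch under assumptions: the function names are ours, and the run parameter is injectable purely so the control flow can be exercised without audio hardware attached.

```python
import subprocess


def build_record_cmd(seconds, out_file, device="plughw:1,0"):
    # mirrors the arecord invocation from Appendix B
    return ["arecord", "-D", device, "-d", str(seconds), out_file]


def build_play_cmd(wav_file):
    # omxplayer handles local WAV playback on the Pi
    return ["omxplayer", wav_file]


def record_then_play(seconds=20, wav="loop.wav", run=subprocess.run):
    """Record for `seconds`, then immediately play the capture back.

    `run` defaults to subprocess.run; pass a stub to test the
    sequencing off-hardware.
    """
    run(build_record_cmd(seconds, wav), check=True)
    run(build_play_cmd(wav), check=True)
```

Looping calls to record_then_play is essentially what the team ran from the shell; wrapping it into the threaded main program is where the integration stalled.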
To reiterate, the team accomplished 90% of the proposed work with the added
features of data logging and SMS alerts. The project schedule was followed practically to
the day. The project came within 15% of the intended budget. Most importantly the team
learned how to trust and count on each other. Because of this experience, both members
will have an added advantage in the workplace, having worked on a team in which neither
could control the final outcome outside the sphere of his own influence.
Additionally, the team will most likely pursue a Kickstarter campaign to
gather funding to productize Ubityke. A random survey of 10 men and 10 women was
taken in a Whole Foods outside of Detroit, Michigan where shoppers were presented with
the already working features of Ubityke and the $200 price point and asked whether or
not they would be interested in such a product. Most, if not all, of the subjects asked when
it would be available.
5.2 Suggestions for Future Improvements
The design and architecture of Ubityke allows for scalability and has the hooks for big
data logging, health analytics, and user updates and notifications, as well as the option to
add more sensors and logic to provide multiple threshold conditions for especially
complex monitoring and soothing. Since the sensor data will be stored on a web server,
rest patterns could be correlated with temperature and humidity data, as well as which
soothing techniques work most effectively to allow the baby to ease back to rest.
This product, if researched by more seasoned computer scientists, could be a
precursor to different types of AI and machine learning in the arena of child development
and perhaps even general health.
Specific to this iteration of the product, enhancements in mechanical design, code
optimization, aesthetics, optics, and electronics could be accomplished
through more careful analysis and planning. The learning curve was steep due to the tight
schedule and basically having to work with the design put in place during the proposal
phase of the project. A second attempt would address the major issues such as packaging,
heat management, software package managers, and overall memory management
through code optimization. A more holistic approach to the mechanical design which
included modeling all components with all the connectors and the wires in the solid
model would alleviate the fit and obstruction issues. A design which took into account
ease of assembly would have made the device easier to put together and service. To
mitigate the risk of individual module device libraries conflicting with different versions
of operating systems, two paths forward are abundantly clear. The first would be to
ensure that all the peripherals used the same communication protocol. The second path
would be to have a microcontroller and/or microprocessor which had no operating
system. The Linux base alleviated some conflicts but allowed package managers full
control of the kernel, which caused difficulties. The ultimate solution would be to use an
FPGA or ASIC which only performed the operations necessary for the device to function.
To optimize the code, some of the algorithms and methods could be modified to save on
compute, such as detecting motion without drawing a bounding box around the contoured
areas. Drawing the boxes around motion in each frame takes compute power. Each
floating-point operation consumes power and dissipates heat. The fewer floating-point
operations a processor has to perform, the less heat will develop in the processor cores.
Perhaps a processor with more cores would have been suitable and could have handled
the constant parsing of frames and reduced latency in the live video feed. A more robust
method for logging and fetching temperature and humidity data could also streamline the
user experience. Timestamps and frame tagging during the motion detection sequence
could provide the basis for correlating the baby’s rest patterns to temperature, humidity,
and perhaps even sound. All this data could be stored on a remote server and downloaded
into useful charts and graphs for health analytics. A more sophisticated codebase could
also facilitate quicker fetch and execute cycles in the system when taking in and
outputting data.
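The bounding-box-free motion test suggested above can be sketched as a simple frame-differencing check: threshold the delta against the background model and count changed pixels, skipping the contour extraction and rectangle drawing entirely. The tVal default mirrors the threshold in the Appendix C detector, while min_changed and the function name are assumptions of ours.

```python
import numpy as np


def motion_present(background, frame, t_val=25, min_changed=50):
    """Cheap yes/no motion test on two same-sized grayscale frames.

    Thresholds the absolute frame delta and counts changed pixels,
    avoiding the per-frame contour and bounding-box work that costs
    compute in the full detector.
    """
    delta = np.abs(background.astype(np.int16) - frame.astype(np.int16))
    return int((delta > t_val).sum()) >= min_changed
```

The UI would lose the red motion rectangle, but the TTM servo logic only ever consumes the boolean, so this trade saves floating-point work (and therefore heat) without changing behavior.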
References
[1] “Plus,” Nanit. [Online]. Available: https://www.nanit.com/products/nanit-plus.
[Accessed: 02-Nov-2019].
[2] “The 10 Best Baby Monitors to Buy 2019,” LittleOneMag, 23-Apr-2019. [Online].
Available: https://littleonemag.com/best-baby-monitors/. [Accessed: 03-Nov-2019].
[3] “Raspberry Pi 4 Model B specifications,” Raspberry Pi. [Online]. Available:
https://www.raspberrypi.org/products/raspberry-pi-4-model-b/specifications. [Accessed:
03-Nov-2019].
[4] “What is Python? Executive Summary,” Python.org. [Online]. Available:
https://www.python.org/doc/essays/blurb/. [Accessed: 03-Nov-2019].
[5] “Introduction to OpenCV-Python Tutorials,” OpenCV. [Online]. Available:
https://docs.opencv.org/master/d0/de3/tutorial_py_intro.html. [Accessed: 03-Nov-2019].
[6] “Raspberry Pi Camera Board - Night Vision ‘IR-CUT’ (5MP),” The Pi Hut. [Online].
Available: https://thepihut.com/products/raspberry-pi-night-vision-camera-ir-cut.
[Accessed: 04-Mar-2020].
[7] “Continuous Rotation Servo,” Adafruit Industries. [Online]. Available:
https://www.adafruit.com/product/154. [Accessed: 03-Nov-2019].
[8] “To use USB mini microphone on Raspbian,” Wiki. [Online]. Available:
http://wiki.sunfounder.cc/index.php?title=To_use_USB_mini_microphone_on_Raspbian.
[Accessed: 05-Mar-2020].
[9] “DHT22 temperature-humidity sensor + extras,” Adafruit Industries. [Online]. Available:
https://www.adafruit.com/product/385. [Accessed: 03-Nov-2019].
[10] “Hamburger Mini Speaker,” SparkFun Electronics. [Online]. Available:
https://www.sparkfun.com/products/14023. [Accessed: 05-Mar-2020].
[11] “YSL-R547W2C-A13 datasheet,” SparkFun Electronics. [Online]. Available:
https://www.sparkfun.com/datasheets/Components/YSL-R547W2C-A13.pdf. [Accessed:
03-Nov-2019].
[12] “What is CE Marking?,” CE Marking Nordic. [Online]. Available:
http://www.cemarkingnordic.se/pdf/english/what_is_ce_marking.pdf. [Accessed:
03-Nov-2019].
[13] “Download Raspbian for Raspberry Pi,” Raspberry Pi Foundation. [Online]. Available:
https://www.raspberrypi.org/downloads/raspbian/. [Accessed: 03-Nov-2019].
[14] J. Geerling, “Power Consumption Benchmarks,” Power Consumption Benchmarks |
Raspberry Pi Dramble. [Online]. Available:
https://www.pidramble.com/wiki/benchmarks/power-consumption. [Accessed: 05-Mar-
2020].
[15] I. Gibson and P. J. Bártolo, “History of Stereolithography,” in Stereolithography:
Materials, Processes, and Applications. Springer, 2011, pp. 41–43.
[16] A. Rosebrock, “OpenCV - Stream video to web browser/HTML
page,” PyImageSearch, 14-Feb-2020. [Online]. Available:
https://www.pyimagesearch.com/2019/09/02/opencv-stream-video-to-web-browser-html-
page/. [Accessed: 18-Mar-2020].
[17] “Python Developers Survey 2018 Results,” JetBrains. [Online]. Available:
https://www.jetbrains.com/research/python-developers-survey-2018/. [Accessed:
21-Mar-2020].
[18] Q. Sha, PiHelper (Version 1.5) [Mobile application software], 2018. [Online].
Available: https://apps.apple.com/us/app/pihelper/id1369930932.
Appendix A
Equations
V = IR                                                          (2.1)

tan θ = x / y                                                   (3.1)

FOV° = 2 · arctan( FOV_Horizontal / (2 · Working Distance) )    (3.2)
Appendix B
Linux Shell Commands
LED Script
# These commands, run from the python3 shell, turn the LED cluster on and off
import RPi.GPIO as GPIO
GPIO.setmode(GPIO.BCM)
# use GPIO 17 to bias the LEDs
GPIO.setup(17, GPIO.OUT)
# turn on the LED
GPIO.output(17, True)
# turn off the LED
GPIO.output(17, False)
Audio Output Script
# the omxplayer command will play local files or streams
# start the stream with the volume low
omxplayer http://ice1.somafm.com:80/groovesalad-128-mp3 --vol -2000
Microphone Script
# set the hardware device with plughw to the microphone input
# -d <seconds> is the time to record; save the capture as a WAV file
arecord -D plughw:1,0 -d 20 test.wav
Servo Script
# These commands rotate the servo at approximately 15 rpm
# the pigpio daemon must be running first
sudo pigpiod
# rotate the servo by setting GPIO 4 to a pulse width of 1525 microseconds
pigs s 4 1525
# stop the servo using a pulse width of zero
pigs s 4 0
Temperature and Humidity Sensor Script
# code base was already developed; credit goes to Emmet @ pimylifeup.com
# the libraries were common and the methods were altered to reflect
# the configuration of this project
import Adafruit_DHT

# define the sensor from the Adafruit library used in the project
DHT_SENSOR = Adafruit_DHT.DHT22
# define the GPIO pin on the Raspberry Pi which will serve as the data line
DHT_PIN = 22

# constant loop that probes the sensor to read the temperature and humidity
# from the predefined sensor and pin; read_retry keeps trying until data arrives
while True:
    humidity, temperature = Adafruit_DHT.read_retry(DHT_SENSOR, DHT_PIN)
    # only print useful sensor data to the terminal with a precision of
    # one decimal place; if an error exists, raise a flag
    if humidity is not None and temperature is not None:
        print("Temp = {0:0.1f}*C Humidity = {1:0.1f}%".format(temperature, humidity))
    else:
        print("Failed to grab data from sensor")
Appendix C
Python Code
Ubityke System Source Code
'''
This is the main file to run Ubityke - An Ultimate Baby Monitor
A senior design project by Alejandro Neira and Eric Hahn
Department of Electronics Engineering Technology
Division of Engineering, Computer Programming, & Technology
Valencia College - West Campus
Spring Semester 2020

To run Ubityke please run the file 'runme' and a browser window will open
with the User Interface.

Thank you to Dr. Adrian Rosebrock for the Motion Detection Algorithm for OpenCV
Another huge thank you to our Professor Jerry Reed for the huge help developing
TTM Mode and debugging our code
'''
# import the necessary packages for:
# OpenCV, Flask, Servo Controller, Timers, SMS alerts, GPIO, SQLite3, OMXPlayer
from pyimagesearch.motion_detection import SingleMotionDetector
from imutils.video import VideoStream
from flask import Response
from flask import Flask, request
from flask import render_template

# import the classes for the timers and servo to control the mobile
from timer_class import Timer
from servo_controller import ServoController
from subprocess import call

# import the classes for the lullabies and the alerts
from player import Player
from smstest import Alert

import threading
import argparse
import datetime
import imutils
import time
import cv2
import RPi.GPIO as GPIO

# import framework for database
import sqlite3

# import classes for the player for playing wav files
from omxplayer import OMXPlayer
from time import sleep

# set the local IP for the Raspberry Pi for the UI
IP = '192.168.0.16'
PHONE_ALERTS = '5616014813'

# temperature ALERT THRESHOLD
# if the temperature in the room reaches this temp, an alert will be sent
tempThresh = 70

# set Threshold Triggered Mode OFF as default
ttmEnabled = 0

# LED GPIO pins
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
GPIO.setup(17, GPIO.OUT)

# servo motor GPIO pins
GPIO.setmode(GPIO.BCM)
GPIO.setup(4, GPIO.OUT)
pwm = GPIO.PWM(4, 25)
#pwm.start(0)

# instantiate the Timer class
# creating two timers to control the logic for the mobile
servo_timer = Timer()
stopServo_timer = Timer()

# instantiate the ServoController class where we can use the following functions:
#   servo_on(pwm_value)  [to start the servo at a certain speed]
#   servo_off()          [to turn the servo motor off]
servo_controller = ServoController(pwm)

# instantiate a Player class for the lullabies
newPlayer = Player()

# instantiate an Alert class to send SMS texts
newAlert = Alert()

# defining the initial state of motion for the logic behind
# Threshold Triggered Motion (TTM)
motion_state = False

# initialize the output frame and a lock used to ensure thread-safe
# exchanges of the output frames (OpenCV)
outputFrame = None
lock = threading.Lock()

# initialize a flask object
app = Flask(__name__)

# initialize the video stream and allow the camera sensor to warmup
vs = VideoStream(src=0).start()
time.sleep(2.0)


@app.route('/', methods=['GET'])
def index():
    global ttmEnabled, PHONE_ALERTS
    # connection to the SQLite3 database that stores the temp and humidity data
    conn = sqlite3.connect('sensorsData.db')
    curs = conn.cursor()
    # grab the latest readable entry in the database to be displayed
    # on the website with the gauge
    for row in curs.execute("SELECT * FROM DHT_data WHERE temp>0 ORDER BY timestamp DESC LIMIT 1"):
        time = str(row[0])
        temp = str(row[1])
        hum = str(row[2])
    conn.close()

    # the sensor reports the temperature in Celsius, so convert to Fahrenheit
    temp = float(temp)
    temp = float((temp * 1.8) + 32)

    # set the alert message
    if temp > tempThresh:
        tempAlert = "TEMPERATURE IS TOO HIGH!! Please check your baby!"
        newAlert.sendSMS(PHONE_ALERTS)
    else:
        tempAlert = "Temperature is ok"

    # set up the arguments that are used in the URL to grab the user's action
    led = request.args.get('led')
    servo = request.args.get('servo')
    play = request.args.get('play')
    ttm = request.args.get('ttm')

    # if they select the LED button, set that GPIO pin HIGH
    if led == "on":
        GPIO.output(17, GPIO.HIGH)
    elif led == "off":
        GPIO.output(17, GPIO.LOW)

    # control the servo motor in manual mode
    if servo == "on":
        servo_controller.servo_on(15)
    elif servo == "off":
        servo_controller.servo_off()

    # control the lullabies
    if play == "twinkle":
        newPlayer.player_on("lullabies/twinkle.wav", 37)
    elif play == "wheels":
        newPlayer.player_on("lullabies/wheels.wav", 30)
    elif play == "abc":
        newPlayer.player_on("lullabies/abcsong.wav", 35)
    elif play == "bingo":
        newPlayer.player_on("lullabies/bingo.wav", 45)
    elif play == "5monkeys":
        newPlayer.player_on("lullabies/5littlemonkeys.wav", 45)
    elif play == "row":
        newPlayer.player_on("lullabies/rowyourboat.wav", 50)
    elif play == "off":
        newPlayer.player_off()

    ttmAlert = "DISABLED"
    if ttm == "enabled":
        ttmEnabled = 1
        ttmAlert = "**ENABLED**"

    # dictionary that holds the values that are sent to the HTML Flask page
    templateData = {
        'time': time,
        'temp': temp,
        'hum': hum,
        'tempAlert': tempAlert,
        'ttmAlert': ttmAlert
    }
    return render_template("index.html", **templateData)


def detect_motion(frameCount):
    # grab global references to the video stream, output frame, and
    # lock variables
    global vs, outputFrame, lock, timerRunning, motion_state, ttmEnabled
    # initialize the motion detector and the total number of frames
    # read thus far
    md = SingleMotionDetector(accumWeight=0.1)
    total = 0

    # loop over frames from the video stream
    while True:
        # read the next frame from the video stream, resize it,
        # convert the frame to grayscale, and blur it
        frame = vs.read()
        frame = imutils.resize(frame, width=400)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)

        # grab the current timestamp and draw it on the frame
        timestamp = datetime.datetime.now()
        cv2.putText(frame, timestamp.strftime(
            " %A %d %B %Y %I:%M:%S%p"), (10, frame.shape[0] - 10),
            cv2.FONT_HERSHEY_SIMPLEX, 0.35, (0, 0, 255), 1)

        # if the total number of frames has reached a sufficient
        # number to construct a reasonable background model, then
        # continue to process the frame
        if total > frameCount:
            # detect motion in the image
            motion = md.detect(gray)
            #servo_timer.start(10)

            # check to see if motion was found in the frame
            if motion is not None:
                # unpack the tuple and draw the box surrounding the
                # "motion area" on the output frame
                (thresh, (minX, minY, maxX, maxY)) = motion
                cv2.rectangle(frame, (minX, minY), (maxX, maxY),
                    (0, 0, 255), 2)
                motionDetected = True
                #pi.set_mode(17, pigpio.OUTPUT)  # GPIO 17 as output
            else:
                motionDetected = False

            if ttmEnabled == 1:
                #print("ttmEnabled ON")
                if motionDetected == True and motion_state == False:
                    # motion detected, no previous motion
                    motion_state = True
                    #print("Starting servo timer")
                    servo_timer.start(5)

                onTimerRunning = servo_timer.tick()
                #print("onTimerRunning: " + str(onTimerRunning))
                if onTimerRunning == False and motionDetected == True:
                    #print("Starting servo")
                    servo_controller.servo_on(15)
                    #GPIO.cleanup()

                if motionDetected == False and motion_state == True:
                    motion_state = False
                    #print("Starting STOPservo timer")
                    stopServo_timer.start(5)

                offTimerRunning = stopServo_timer.tick()
                if offTimerRunning == False and motion_state == False:
                    #print("Stopping servo")
                    servo_controller.servo_off()
                    #pwm.stop()
                    #GPIO.cleanup()

        # update the background model and increment the total number
        # of frames read thus far
        md.update(gray)
        total += 1

        # acquire the lock, set the output frame, and release the lock
        with lock:
            outputFrame = frame.copy()


def generate():
    # grab global references to the output frame and lock variables
    global outputFrame, lock
    # loop over frames from the output stream
    while True:
        # wait until the lock is acquired
        with lock:
            # check if the output frame is available, otherwise skip
            # the iteration of the loop
            if outputFrame is None:
                continue
            # encode the frame in JPEG format
            (flag, encodedImage) = cv2.imencode(".jpg", outputFrame)
            # ensure the frame was successfully encoded
            if not flag:
                continue
        # yield the output frame in the byte format
        yield(b'--frame\r\n' b'Content-Type: image/jpeg\r\n\r\n' +
            bytearray(encodedImage) + b'\r\n')


@app.route("/video_feed")
def video_feed():
    # return the response generated along with the specific media
    # type (mime type)
    return Response(generate(),
        mimetype="multipart/x-mixed-replace; boundary=frame")


# check to see if this is the main thread of execution
if __name__ == '__main__':
    # construct the argument parser and parse command line arguments
    '''ap = argparse.ArgumentParser()
    ap.add_argument("-i", "--ip", type=str, required=True,
        help="ip address of the device")
    ap.add_argument("-o", "--port", type=int, required=True,
        help="ephemeral port number of the server (1024 to 65535)")
    ap.add_argument("-f", "--frame-count", type=int, default=32,
        help="# of frames used to construct the background model")
    args = vars(ap.parse_args())'''

    # start a thread that will perform motion detection
    t = threading.Thread(target=detect_motion, args=(32,))
    t.daemon = True
    t.start()

    # start the flask app
    app.run(host=str(IP), port="8000", debug=True,
        threaded=True, use_reloader=False)

# release the video stream pointer
vs.stop()
Classes
Player Class:
from omxplayer.player import OMXPlayer
from pathlib import Path
from time import sleep
import os


class Player:
    def __init__(self):
        self.path = ""
        self.duration = 0

    def player_on(self, path, duration):
        command = "sudo killall -s 9 omxplayer.bin"
        os.system(command)
        self.path = path
        self.duration = duration
        VIDEO_PATH = Path(self.path)
        player = OMXPlayer(VIDEO_PATH)
        sleep(self.duration)
        player.quit()

    def player_off(self):
        command = "sudo killall -s 9 omxplayer.bin"
        os.system(command)


def main():
    myPlayer = Player()
    myPlayer.player_on("smile30.wav", 30)


if __name__ == "__main__":
    main()
Timer Class:
import time


# timer class
class Timer:
    def __init__(self):
        self.num_seconds = 0
        self.timer_running = False
        self.start_time = 0
        self.secs_remaining = 0

    # start the timer
    def start(self, secs):
        self.num_seconds = secs
        self.timer_running = True
        self.start_time = time.time()
        self.secs_remaining = secs

    # run the timer; returns False once the timer has expired
    # call this repeatedly in your loop
    def tick(self):
        if self.timer_running:
            #self.secs_remaining = self.secs_remaining - (time.time() - self.start_time)
            #print("secs=" + str(self.secs_remaining))
            if time.time() - self.start_time >= self.num_seconds:
                self.timer_running = False
        return self.timer_running

    # stop the timer early
    def stop_timer(self):
        self.timer_running = False


def main():
    # instantiate a timer -- you could have more than one
    test_timer = Timer()
    print("starting timer")
    # start the timer for 5 seconds
    test_timer.start(5)
    # simulate other processing
    for i in range(200):
        # tick the timer and see if it has expired yet
        result = test_timer.tick()
        if result == False:
            print("done!")
            break
        else:
            print(".")
        # kill time for test
        time.sleep(.1)
    print("finished")


if __name__ == '__main__':
    main()
Servo Class:
import time


class ServoController:
    def __init__(self, pwm):
        self.pwm = pwm
        self.pwm_value = 0
        self.running_state = False  # off

    def servo_on(self, val):
        self.pwm_value = val
        if self.running_state == False:
            self.running_state = True
            # code to set servo speed here
            #print("turning on " + str(self.pwm_value))
            self.pwm.start(self.pwm_value)
            time.sleep(5)
            #self.pwm.stop()

    def servo_off(self):
        self.pwm_value = 0
        if self.running_state == True:
            #print("turning off")
            self.running_state = False
            # code to stop servo here
            self.pwm.stop()


def main():
    myServo = ServoController()
    myServo.servo_on(100)
    myServo.servo_on(150)
    myServo.servo_off()
    myServo.servo_off()
    myServo.servo_on(200)
    myServo.servo_off()


if __name__ == "__main__":
    main()
SMS Alert Class:
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart


class Alert:
    def __init__(self):
        self.path = ""
        self.duration = 0

    def sendSMS(self, to):
        email = "[email protected]"
        pas = "lasershow1"
        sms_gateway = to + '@txt.att.net'
        smtp = "smtp.gmail.com"
        port = 587
        # this will start our email server
        server = smtplib.SMTP(smtp, port)
        server.starttls()
        server.login(email, pas)
        # now we use the MIME module to structure our message
        msg = MIMEMultipart()
        msg['From'] = email
        msg['To'] = sms_gateway
        msg['Subject'] = "Ubityke - ALERT\n"
        body = "Ubityke: An Alert has been detected! Visit the Ubityke User Interface for information\n"
        msg.attach(MIMEText(body, 'plain'))
        sms = msg.as_string()
        server.sendmail(email, sms_gateway, sms)
        # lastly quit the server
        server.quit()


def main():
    newAlert = Alert()
    newAlert.sendSMS('5616014813')


if __name__ == "__main__":
    main()
Temp Log Class:
import time
import sqlite3
import Adafruit_DHT

dbname = 'sensorsData.db'
sampleFreq = 60  # time in seconds ==> sample each 1 min


# get data from DHT sensor
def getDHTdata():
    DHT22Sensor = Adafruit_DHT.DHT22
    DHTpin = 22
    hum, temp = Adafruit_DHT.read_retry(DHT22Sensor, DHTpin)
    if hum is not None and temp is not None:
        hum = round(hum)
        temp = round(temp, 1)
    return temp, hum


# log sensor data on database
def logData(temp, hum):
    conn = sqlite3.connect(dbname)
    curs = conn.cursor()
    curs.execute("INSERT INTO DHT_data values(datetime('now', 'localtime'), (?), (?))",
                 (temp, hum))
    conn.commit()
    conn.close()


# main function
def main():
    while True:
        temp, hum = getDHTdata()
        logData(temp, hum)
        time.sleep(sampleFreq)


# ------------ execute program
main()
Motion Detection Algorithm
# These algorithms were developed by Dr. Adrian Rosebrock of Pyimagesearch.com
# Some parameters and arguments were modified to work with our application
# The Ubityke team is grateful for the support

# import the necessary packages
import numpy as np
import imutils
import cv2


class SingleMotionDetector:
    def __init__(self, accumWeight=0.5):
        # store the accumulated weight factor
        self.accumWeight = accumWeight
        # initialize the background model
        self.bg = None

    def update(self, image):
        # if the background model is None, initialize it
        if self.bg is None:
            self.bg = image.copy().astype("float")
            return
        # update the background model by accumulating the weighted average
        cv2.accumulateWeighted(image, self.bg, self.accumWeight)

    def detect(self, image, tVal=25):
        # compute the absolute difference between the background model
        # and the image passed in, then threshold the delta image
        delta = cv2.absdiff(self.bg.astype("uint8"), image)
        thresh = cv2.threshold(delta, tVal, 255, cv2.THRESH_BINARY)[1]

        # perform a series of erosions and dilations to remove small blobs
        thresh = cv2.erode(thresh, None, iterations=2)
        thresh = cv2.dilate(thresh, None, iterations=2)

        # find contours in the thresholded image and initialize the
        # minimum and maximum bounding box regions for motion
        cnts = cv2.findContours(thresh.copy(), cv2.RETR_EXTERNAL,
            cv2.CHAIN_APPROX_SIMPLE)
        cnts = imutils.grab_contours(cnts)
        (minX, minY) = (np.inf, np.inf)
        (maxX, maxY) = (-np.inf, -np.inf)

        # if no contours were found, return None
        if len(cnts) == 0:
            return None

        # otherwise, loop over the contours
        for c in cnts:
            # compute the bounding box of the contour and use it to
            # update the minimum and maximum bounding box regions
            (x, y, w, h) = cv2.boundingRect(c)
            (minX, minY) = (min(minX, x), min(minY, y))
            (maxX, maxY) = (max(maxX, x + w), max(maxY, y + h))

        # otherwise, return a tuple of the thresholded image along
        # with the bounding box
        return (thresh, (minX, minY, maxX, maxY))
Appendix D
HTML and JavaScript Code
User Interface JavaScript Code
$(document).ready(function() {
    $('#turnOnBtn').on('click', function(e) {
        $.ajax({
            url: '/?led=on',
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
    $('#turnOffBtn').on('click', function(e) {
        $.ajax({
            url: '/?led=off',
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
    $('#turnOnServo').on('click', function(e) {
        $.ajax({
            url: '/?servo=on',
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
    $('#turnOffServo').on('click', function(e) {
        $.ajax({
            url: '/?servo=off',
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
    $('#turnOnTTM').on('click', function(e) {
        $.ajax({
            url: '/?ttm=enabled',
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
    $('#turnOffTTM').on('click', function(e) {
        $.ajax({
            url: '/?ttm=disabled',
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
    $('#playTwinkle').on('click', function(e) {
        let status;
        if ($(this).text() == 'Turn On') {
            $(this).text('Turn Off');
            $(this).removeClass().addClass('btn btn-block btn-light');
            status = 'twinkle';
        } else {
            $(this).text('Turn On');
            $(this).removeClass().addClass('btn btn-block btn-dark');
            status = 'off';
        }
        $.ajax({
            url: '/?play=' + status,
            method: 'GET',
            success: function(result) {
                console.log(result);
            }
        });
        e.preventDefault();
    });
});
User Interface HTML Code
<!DOCTYPE HTML> <!-- Story by HTML5 UP html5up.net | @ajlkn Free for personal and commercial use under the CCA 3.0 license (html5up.net/license)
- 98 -
--> <html> <head> <title>Ubityke - Ultimate Baby Monitor</title> <meta charset="utf-8" /> <meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no" /> <link rel="stylesheet" href=" url_for('static', filename='assets/css/main.css') " /> <noscript><link rel="stylesheet" href=" url_for('static', filename='assets/css/noscript.css') " /></noscript> <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script> <script src="url_for('static', filename='script.js')"></script> <script src="url_for('static', filename='raphael-2.1.4.min.js')"></script> <script src="url_for('static', filename='justgage.js')"></script> <script> var g1, g2; document.addEventListener("DOMContentLoaded", function(event) g1 = new JustGage( id: "g1", value: temp, valueFontColor: "black", min: 0, max: 115, title: "Temperature", label: "Fahrenheit" ); g2 = new JustGage( id: "g2", value: hum, valueFontColor: "black", min: 0, max: 100, title: "Humidity", label: "%" ); ); </script> </head> <body class="is-preload"> <!-- Wrapper --> <div id="wrapper" class="divided"> <section class="banner style1 color5 orient-left content-align-left image-position-center "> <div class="content"> <img src="url_for('static', filename='ubinew.png')" width="230" height="73"> <h4>Control Nightlamp</h4> <ul class="actions"> <li><button type="button" class="button primary" id="turnOnBtn">Turn On</button></li> <li><button type="button" class="button" id="turnOffBtn">Turn Off</button></li> </ul>
- 99 -
<h4>Control Spinning Mobile</h4> <ul class="actions"> <li><button type="button" class="button primary" id="turnOnServo">Turn On</button></li> <li><button type="button" class="button" id="turnOffServo">Turn Off</button></li> </ul> <h4>Threshold Triggered Mode (TTM)</h4> <ul class="actions"> <li><button class="button primary" onclick="window.location.href = '?ttm=enabled';">Turn TTM ON</button></li> <li><button onclick="window.location.href = '?ttm=disabled';">Turn TTM OFF</button></li> </ul> </div> <div class="image"> <img src=" url_for('video_feed') " alt="" /> </div> </section> <section class="spotlight style1 orient-right content-align-center image-position-center onscroll-image-fade-in" id="first"> <div class="content"> <h3>TTM Mode Status: <b></b>ttmAlert</b></h3><br /> <div class="box"><h3>tempAlert</h3></div> <div id="g1"></div> <div id="g2"></div> <br /> <h4> Last Sensors Reading: time <a href="/">(Refresh)</a></h4> <p>Data is being collected and stored in a database every 60 seconds. The data above displays the latest room temperature/humidity where Ubityke was placed.</p> </div> </section> <!-- Gallery --> <section class="wrapper style1 align-center"> <div class="inner"> <h2>Play Lullabies</h2> <p>Ubityke allows you to stream lullabies directly to your baby! Simply select a lullaby from one of our favorites and your baby will be enjoying it in no time.</p> <a href="?play=off#play">TURN OFF LULLABIES</a> </div> <!-- Gallery -->
				<div class="gallery style2 medium lightbox onscroll-fade-in">
					<article>
						<a href="?play=twinkle#play" class="image" id="play">
							<img src="{{ url_for('static', filename='assets/images/twinkle.png') }}" alt="" />
						</a>
						<div class="caption">
							<h3>Click To Play</h3>
						</div>
					</article>
					<article>
						<a href="?play=bingo#play" class="image">
							<img src="{{ url_for('static', filename='assets/images/bingo.png') }}" alt="" />
						</a>
						<div class="caption">
							<h3>Click To Play</h3>
						</div>
					</article>
					<article>
						<a href="?play=wheels#play" class="image">
							<img src="{{ url_for('static', filename='assets/images/wheels.png') }}" alt="" />
						</a>
						<div class="caption">
							<h3>Click To Play</h3>
						</div>
					</article>
					<article>
						<a href="?play=abc#play" class="image">
							<img src="{{ url_for('static', filename='assets/images/abc.png') }}" alt="" />
						</a>
						<div class="caption">
							<h3>Click To Play</h3>
						</div>
					</article>
					<article>
						<a href="?play=5monkeys#play" class="image">
							<img src="{{ url_for('static', filename='assets/images/5littlemonkeys.png') }}" alt="" />
						</a>
						<div class="caption">
							<h3>Click To Play</h3>
						</div>
					</article>
					<article>
						<a href="?play=row#play" class="image">
							<img src="{{ url_for('static', filename='assets/images/rowyourboat.png') }}" alt="" />
						</a>
						<div class="caption">
							<h3>Click To Play</h3>
						</div>
					</article>
				</div>
			</section>
			<!-- Footer -->
			<footer class="wrapper style1 align-center">
				<div class="inner">
					A senior design project by Alejandro Neira and Eric Hahn.
					<p>© Ubityke 2020. Design: HTML5 UP.</p>
				</div>
			</footer>
		</div>
		<script src="static/assets/js/jquery.min.js"></script>
		<script src="static/assets/js/jquery.scrollex.min.js"></script>
		<script src="static/assets/js/jquery.scrolly.min.js"></script>
		<script src="static/assets/js/browser.min.js"></script>
		<script src="static/assets/js/breakpoints.min.js"></script>
		<script src="static/assets/js/util.js"></script>
		<script src="static/assets/js/main.js"></script>
	</body>
</html>
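The lullaby thumbnails and TTM buttons in the template above control the backend entirely through query strings (`?play=...`, `?ttm=...`). As a rough illustration of how the Flask view could interpret those parameters before re-rendering the page, here is a minimal, self-contained sketch; the function name `dispatch_controls` and the returned action tuples are hypothetical stand-ins for illustration, not the actual Ubityke backend code.

```python
# Hypothetical sketch: mapping the template's query-string controls to
# device actions. In the real app this logic would live inside the Flask
# route handler, reading flask.request.args.

# Song keys matching the ?play=... links in the gallery; "off" stops playback.
LULLABIES = {"twinkle", "bingo", "wheels", "abc", "5monkeys", "row", "off"}

def dispatch_controls(args):
    """Translate query parameters into a list of (action, value) tuples.

    args: dict-like mapping of query parameters, e.g. {"ttm": "enabled"}
    or {"play": "twinkle"}. Unknown or absent parameters are ignored.
    """
    actions = []

    # Threshold Triggered Mode buttons navigate to ?ttm=enabled / ?ttm=disabled.
    ttm = args.get("ttm")
    if ttm in ("enabled", "disabled"):
        actions.append(("ttm", ttm == "enabled"))

    # Lullaby gallery links navigate to ?play=<song>#play.
    song = args.get("play")
    if song in LULLABIES:
        actions.append(("stop_audio", None) if song == "off" else ("play", song))

    return actions
```

A real handler would then act on each tuple (toggle the motion-feedback loop, start or stop audio playback) and finish with `render_template(...)`, passing `temp`, `hum`, `time`, `ttmAlert`, and `tempAlert` so Jinja2 can fill the placeholders used in the page.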
Appendix E
Datasheets
Biography
Alejandro Neira
Graduation – Spring 2020 (3.3 GPA)
Two years of experience working with a digitization team at Siemens Energy
Offered a full-time position at Siemens Energy after graduation to lead a front-end development team
Hobbies include traveling, volunteering, and promoting STEM careers at local schools
Eric Hahn
Graduation – Spring 2020 (3.69 GPA)
13 years working in the photonics industry
Graduated from Valencia College in 2009 with an A.S. in Lasers and Photonics (4.0 GPA)
Individual contributor in multiple technologies including tunable quantum cascade laser systems, infrared countermeasures, laser rangefinders, and 3-D scanning LiDAR