Journal of Automation, Mobile Robotics & Intelligent Systems VOLUME 6, N° 2 2012
Articles 17
Human-machine Interface for Presentation Robot
Jiri Krejsa, Vit Ondrousek
Submitted 27th June 2011; accepted 27th September 2011
Abstract:
The purpose of a mobile presentation robot is to interact with users and provide them with the required information. This paper presents the approach used in the highest software layer of the autonomous mobile robot Advee, which is used commercially for presentation purposes. The requirements for the layer are discussed together with the available means of the robot. The particular building blocks and the overall structure of the module are shown, and the key features are described in detail together with the behavior definition. The module serves successfully in a variable environment represented by users of widely varying computer literacy, as demonstrated during the verification tests.
Keywords: human-robot interface, mobile robot, presentation robot
1. Introduction
With mobile robots nowadays appearing outside research laboratories more than ever, human-robot interface (HRI) related issues attract great attention [1, 2]. Most of the research is focused on improving certain means of communication, especially voice dialog (sound used both as input – natural language understanding – and output – robot voice) [3, 4] and the utilization of computer vision (human face recognition, recognition of higher-level facial features, gesture recognition, etc.). Various means of communication can be combined to increase the level of interaction. In [5] speech recognition is combined with user localization based on two processed microphone signals to obtain the location of the sound source, which can be further utilized in the response of the robot. Face recognition is accompanied by gesture recognition in [6], bringing a wider variety of possible inputs to the robot. The work of Perzanowski [7] is an example of multimodal HRI, combining speech and gesture recognition with external input from a Personal Digital Assistant (PDA).
This paper focuses on the HRI of the presentation robot Advee, whose purpose is to serve as an autonomous mobile source of information accessible to people of different computer literacy. Such a requirement brings the necessity of combining all the robot's means to obtain redundancy in the communication channels, so that less computer-literate people can still get the message while more literate users are not repelled.
The paper is organized as follows. After a short introduction of the robot itself, the requirements for the HRI layer are summarized together with the available means (HRI-related inputs and outputs). The proposed structure of the system is discussed with attention to the key modules. The verification made during the tests, together with a discussion, concludes the paper.
2. Presentation robot
The purpose of a presentation robot is to present certain information to users in an interactive manner. The following paragraphs describe the design of the human-robot interface module of the robot Advee. As the HRI is only a part of the robot system, a short introduction of the robot itself is given first.
The presentation robot Advee is a 160 cm tall, 80 kg wheeled robot with Ackermann steering. Its rear wheels are driven by a single DC actuator through a mechanical differential. The robot is powered by an 8-cell LiFePO4 accumulator giving it approximately 8 hours of operation. The localization of the robot is handled by fusing the robot motion controller commands, odometry readings and the readings of a scanner of infrared beacons placed in known fixed locations. The fusion is performed by an extended Kalman filter [8].
The software operated on the robot can be divided into three layers:
• the low level provides interaction with hardware devices,
• the middle level implements robot state estimation and path planning,
• the high level handles the user interaction.
Fig. 1. Presentation robot Advee
The robot is equipped with a wide range of sensors and outputs, serving as the primary resources for the HRI layer. The particular sources are described in detail below. The size of the robot and the access to its main communication devices are illustrated in Fig. 1.
3. HRI module
3.1. Module requirements
In order to design the HRI module properly, the requirements must be stated first. The overall goal is to create an interface that is friendly and universal (usable by a variety of users), while the underlying engine is robust, modular and flexible. The requirements are listed in order of importance in Tab. 1.
Tab. 1. HRI requirements

Requirement      Description
redundancy       information should be transferred to the user in all possible ways in parallel (e.g. both visual and sound)
robustness       the HRI module must be capable of operation even when a minor hardware failure occurs (e.g. the camera fails)
flexibility      simple addition of a new feature, sensor, etc.
parametrization  the overall behavior can be easily changed; certain features can be disabled/enabled upon request
adaptability     the HRI should adapt to the user's abilities, e.g. longer timeouts for slower users, etc.
modularity       several programmers can work independently on the project; particular modules can be tested independently
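The parametrization and adaptability requirements above could, for instance, be met with a simple layered configuration. The following is a minimal sketch of that idea; the configuration keys, feature names and values are hypothetical illustrations, not the actual Advee settings.

```python
import copy

# Hypothetical HRI parametrization sketch (not the actual Advee config):
# features can be enabled/disabled and per-block timeouts tuned without
# touching the block code itself.

DEFAULT_CONFIG = {
    "features": {"map": True, "games": True, "video": True},
    "timeouts": {"selector": 30.0, "map": 60.0},  # seconds
}

def make_config(overrides=None):
    """Return a config dict with event-specific overrides applied over defaults."""
    cfg = copy.deepcopy(DEFAULT_CONFIG)
    for section, values in (overrides or {}).items():
        cfg.setdefault(section, {}).update(values)
    return cfg

# Example: disable games and lengthen timeouts for slower users
cfg = make_config({"features": {"games": False},
                   "timeouts": {"selector": 60.0}})
```

An override-over-defaults scheme like this keeps the per-event tuning (longer timeouts for seniors, fewer features for VIP events) separate from the block implementations.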
3.2. Robot means
Based on the requirements, what are the means of the robot? We can divide the means into two basic categories: inputs and outputs, where inputs represent the information flowing from the user to the robot and outputs represent the robot's reaction. Some of the means are bidirectional, e.g. the touch screen acts both as input and output. The overview of Advee's means is shown in Tab. 2 and the physical location of the means on the robot is shown in Fig. 2.

Tab. 2. Robot HRI means

Type / device                                        Details
Inputs
Screen (capacitive touch screen)                     robust touchscreen returning touch coordinates
Voice (microphone, soundcard)                        incoming sound can be recorded and further processed
Vision (CCD camera)                                  instant flow of images, further processed (face detection, etc.)
Lower level (bumpers, proximity sensors, odometry)   data from the lower level of robot control (distances from obstacles, current location, etc.)
Outputs
Screen (LCD screen)                                  hi-resolution screen to present visual information
Voice (soundcard, amp, speakers)                     modulated voice, currently prerecorded
Print (printer)                                      thermal printer, output up to 112 mm wide
Motion (motion actuators)                            safe motion in a given area, towards a given goal

Fig. 2. Robot means physical location
3.3. HRI building blocks
The key idea in designing the HRI module is to separate the interaction into independent blocks that are sequentially activated by the main HRI internal engine. Only a single block is active at a time. The overall behavior of the robot can then be defined in a single routine that controls the activation of the blocks. Each block uses different means of the robot; however, in most cases the majority of the means are used.
Fig. 3. HRI building blocks
To enable access to the means, the particular means are encapsulated into so-called sources. Programmatically, the blocks and sources are represented by classes. Each block is inherited from the BaseBlock class and each source is inherited from the BaseSource class. BaseBlock contains links to all the source instances, therefore any block inherited from it can access all the sources. As a source can serve only a single block, a mechanism that assigns and releases the sources is implemented. Sources are assigned prior to block activation and released once the block is finished.
The currently implemented sources and blocks are shown in Fig. 3, with Map as an example of an active block having access to all the sources. The sources directly correspond to the means of the robot described in the previous section.
The sources encapsulate the access methods and events generated by the corresponding devices. The key sources are:
Screen: shows the graphics and generates events when a touch is detected.
Vision: processes the camera images with a cascade of boosted classifiers based on Haar-like features for frontal face detection.
SoundIn: processes the incoming sound to detect simple patterns (no speech recognition yet).
SoundOut: plays prerecorded sound files of modulated voice. As the sounds are prerecorded, a variety of recordings is available for each type of utterance to avoid repeating the same phrases.
Printer: thermal printer with 112 mm wide output.
LowLevel: encapsulates the methods and events of the lower-level software layers, in particular the motion of the robot, its estimated position (given by the extended Kalman filter based localization technique), the motion planner outputs, proximity sensor calls, hardware monitor calls (battery status, temperatures, etc.), etc.
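The phrase-variation idea behind SoundOut can be sketched as follows. This is an illustrative Python analogue only (the actual module is C#); the phrase types and file names are hypothetical.

```python
import random

# Illustrative sketch of the SoundOut variety idea (not Advee's code):
# several prerecorded variants exist for each phrase type, and one is
# picked at random while avoiding an immediate repeat of the last one.

PHRASES = {
    "greeting": ["greet_01.wav", "greet_02.wav", "greet_03.wav"],
    "goodbye": ["bye_01.wav", "bye_02.wav"],
}

class SoundOut:
    def __init__(self, phrases):
        self.phrases = phrases
        self.last_played = {}       # phrase type -> last file played

    def pick(self, phrase_type):
        """Pick a variant, avoiding the previously played one if possible."""
        variants = self.phrases[phrase_type]
        candidates = [v for v in variants
                      if v != self.last_played.get(phrase_type)]
        choice = random.choice(candidates or variants)
        self.last_played[phrase_type] = choice
        return choice

out = SoundOut(PHRASES)
first = out.pick("greeting")
second = out.pick("greeting")   # guaranteed to differ from `first`
```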
Each block represents a single operation in the human-robot interaction. The key blocks are:
Motion – the block that is active when the robot is moving and not interacting with people. The motion can end for a number of reasons: people are detected in the surroundings (a combination of proximity sensor analysis, face detection in the camera images, etc.), somebody touches the screen, etc.
Catcher – the block that verifies whether there is a person the robot can talk to. The block serves as a decision maker in uncertain situations (low confidence in face detection, etc.).
Selector – the block that serves as a menu allowing the user to select from several options, usually activating another block. The options are shown graphically as buttons on the screen with both pictograms and text. The selection is accompanied by a voice explanation of the options.
Custom blocks – blocks responsible for a certain feature of the data to present. The names are self-explanatory, for example:
Map shows an interactive map of the surroundings with the position of the robot indicated in the map.
Video runs a video; interruption is allowed.
Games enables users to play games, usually followed by the printing of a reward in the case of victory in the game.
Print handles printing, guiding the user through the print process, checking whether the user has withdrawn the print, etc.
Within each block a finite state machine is implemented, handling the behavior of the block, in particular the responses to user inputs (screen taps, incoming sounds, etc.) and internal events (end of the output sound, changes in lower-level data, etc.).
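Such a per-block state machine can be sketched as a transition table. The states and events below are hypothetical illustrations, not the actual Advee implementation:

```python
# Minimal sketch of a per-block finite state machine (states and events
# are hypothetical, not the actual Advee implementation).

TRANSITIONS = {
    # (current state, event) -> next state
    ("idle", "screen_tap"): "showing_menu",
    ("showing_menu", "screen_tap"): "playing_sound",
    ("playing_sound", "sound_done"): "showing_menu",
    ("showing_menu", "timeout"): "done",
}

class BlockStateMachine:
    def __init__(self, initial="idle"):
        self.state = initial

    def handle(self, event):
        """Advance the state if a transition exists; ignore the event otherwise."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

fsm = BlockStateMachine()
fsm.handle("screen_tap")    # user taps the screen -> "showing_menu"
fsm.handle("timeout")       # no input in time -> "done", block finishes
```

A table-driven machine like this also makes the adaptability requirement easy to serve: the timeout events can simply be raised later for slower users.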
3.4. HRI behavior definition
The whole behavior of the robot is determined by the sequence of blocks. The sequence depends partially on the user and partially on the behavior definition. The HRI engine works in the simple cycle shown in Fig. 4. Once the appropriate block is selected, all sources are assigned to the block, the block is activated and starts to operate. Within the block, usually a finite state machine is implemented, managing the incoming events and producing the required outputs.
Fig. 4. HRI engine routine

Fig. 5. Behavior signal flow
Once the block is done, the sources are released and the next block is selected. Each block can finish in a regular manner, or when an emergency event occurs (HW failure, etc.). In such a case the emergency is handled separately before the main cycle is entered again.
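The engine cycle of Fig. 4 can be sketched roughly as follows. This is a simplified Python illustration of the select–assign–activate–release loop under assumed interfaces; the method and block names are hypothetical, not the actual C# implementation.

```python
# Rough sketch of the HRI engine cycle from Fig. 4 (hypothetical
# interfaces, not the actual Advee C# code): select a block, assign all
# sources to it, run it, release the sources, and handle emergencies
# outside the regular cycle.

class EngineSketch:
    def __init__(self, blocks, sources):
        self.blocks = blocks      # name -> callable returning (next_block, emergency)
        self.sources = sources    # source name -> current owner (None = free)

    def run(self, start, max_cycles=10):
        trace, block = [], start
        for _ in range(max_cycles):
            if block is None:
                break
            for s in self.sources:        # assign all sources to the block
                self.sources[s] = block
            next_block, emergency = self.blocks[block]()
            for s in self.sources:        # release sources once finished
                self.sources[s] = None
            trace.append(block)
            if emergency:                 # emergencies handled separately
                trace.append("handle_emergency")
            block = next_block
        return trace

blocks = {
    "motion": lambda: ("catcher", False),
    "catcher": lambda: ("selector", False),
    "selector": lambda: (None, False),    # interaction ends
}
engine = EngineSketch(blocks, sources={"screen": None, "sound": None})
print(engine.run("motion"))   # ['motion', 'catcher', 'selector']
```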
The behavior can be easily visualized with a flow chart; an example of the signal flow is shown in Fig. 5. The flow can be arbitrarily modified, with certain features (blocks or block sequences) omitted or added.
3.5. Implementation details
High level layer of the robot software runs on Win-
dows 7 OS. The HRI module is implemented in C# using
Microsoft Visual Studio environment. Computationally
demanding routines (e.g. image processing) are written
in C and C++.
The predecessor of all source classes – the BaseSource class – implements virtual methods for the essential operations with the sources (startup and cleanup). The predecessor of all block classes – the BaseBlock class – implements virtual event responses for all possible source events, therefore any descendant block can respond to such events. Virtual methods for block activation and deactivation are also implemented in the BaseBlock class, taking care of assigning the sources to the currently active block and releasing the sources when the block is finished. The termination of block processing is announced with the BlockDone virtual event, overridden in particular blocks.
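The class structure described above can be sketched in a Python analogue as follows. Only BaseSource, BaseBlock and BlockDone are names from the paper; the concrete subclass, event names and method bodies are illustrative assumptions (the original is C#, where these would be virtual methods and events).

```python
# Python analogue of the described C# class structure. BaseSource,
# BaseBlock and BlockDone are names from the paper; everything else
# (event names, the SelectorBlock subclass) is illustrative only.

class BaseSource:
    def startup(self):          # virtual: device-specific initialization
        pass
    def cleanup(self):          # virtual: device-specific shutdown
        pass

class BaseBlock:
    def __init__(self, sources):
        self.sources = sources  # links to all source instances
        self.done = False

    # virtual event responses a descendant block may override
    def on_screen_touch(self, x, y):
        pass
    def on_sound_in(self, pattern):
        pass

    def activate(self):         # take ownership of the sources
        for src in self.sources:
            src.startup()

    def deactivate(self):       # release the sources when finished
        for src in self.sources:
            src.cleanup()

    def block_done(self):       # BlockDone: announces block termination
        self.done = True

class SelectorBlock(BaseBlock):
    """Illustrative descendant: finishes on the first screen touch."""
    def on_screen_touch(self, x, y):
        self.block_done()
```

Because BaseBlock already declares a (no-op) response for every source event, a descendant block only overrides the events it actually cares about.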
The lower layers of the software run on a separate computer with a Linux OS. Communication with the lower layers (position estimation, motion, path planning, etc.) is based on the Lightweight Communications and Marshalling (LCM) library [9] and is fully encapsulated in the SrcLowLevel source.
4. Verification
The HRI was initially verified using a released version of the HRI together with a simulator of the lower layers of the robot. During these tests the HRI was presented mainly to students, and the basic settings were tuned, such as the timeouts of particular blocks, the understandability of the graphics, etc. Once the basic setup was done, tests were performed using the real robot on a number of people from various backgrounds, and data in the form of questionnaires were collected afterwards together with the camera recordings.
Based on the collected data the HRI was further modified, with emphasis on the redundancy of given information; e.g. all voice outputs (the talk of the robot) are accompanied by text of the same meaning on the screen. Further tests led to minor modifications: rephrasing the speech output and changing the voice modulation for further clarification. During the use of the robot all the user inputs are logged, and the collected data are used for further analysis and improvements.
In order to determine whether the HRI module works according to the requirements, several quantities can be extracted from the data collected during the interactions. The values of these quantities obtained at several events with different types of users can be compared, giving an indication of the HRI quality (ideally the quantities should not vary). While there could be numerous such indicators related to specific features (e.g. time spent with the map, win rate in games, etc.), only a few give a general impression of the success of the HRI. These indicators are described below.
ATI – average time of interaction with a single user [s]: the interval between the start of the interaction and its end (the user left the robot). A longer ATI means a single user spent more time with the robot; it is only an indirect indicator of how well the interaction went.
CSR – user catching success rate [-]: the ratio between successful catches of users (the interaction starts) and failures (the user walks away). A lower CSR means that people are afraid of the robot, do not know how to start an interaction, etc.
TIP – interaction timeout percentage [%]: the percentage of interactions ended with a timeout in an arbitrary block other than the main menu. A higher percentage means that users left the robot during the interaction.
PPE – persons per event [-]: the total number of interactions divided by the total number of users. If the PPE is higher than 1, then some users used the robot repeatedly.
BPU – blocks per user [-]: the number of blocks activated per user in a single interaction. A higher BPU means that the user exploited more options during the interaction.
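As an illustration, most of these indicators could be computed from the logged interaction data along these lines. The log format below is a hypothetical sketch, not the actual Advee logging schema.

```python
# Hypothetical computation of the HRI indicators from interaction logs.
# The log format (list of dicts) is illustrative, not Advee's schema.

def hri_indicators(interactions, users, catch_attempts):
    """interactions: dicts with 'duration' [s], 'timed_out' (bool) and
    'blocks' (count of activated blocks).
    users / catch_attempts: per-event totals."""
    n = len(interactions)
    ati = sum(i["duration"] for i in interactions) / n     # mean time [s]
    tip = 100.0 * sum(i["timed_out"] for i in interactions) / n  # [%]
    ppe = n / users                  # >1 means some users returned
    csr = n / catch_attempts         # started interactions per attempt
    bpu = sum(i["blocks"] for i in interactions) / n
    return {"ATI": ati, "CSR": csr, "TIP": tip, "PPE": ppe, "BPU": bpu}

log = [
    {"duration": 60, "timed_out": False, "blocks": 5},
    {"duration": 90, "timed_out": True,  "blocks": 7},
]
print(hri_indicators(log, users=2, catch_attempts=4))
# {'ATI': 75.0, 'CSR': 0.5, 'TIP': 50.0, 'PPE': 1.0, 'BPU': 6.0}
```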
Data collected at several events with different audiences are listed in Tab. 3. The particular events were:
E1 – VIP audience
E2 – general audience
E3 – university students
E4 – seniors
Tab. 3. HRI indicators results

Indicator   E1        E2        E3        E4
ATI [s]     48±11     73±19     85±21     96±71
CSR [-]     0.89      0.75      0.97      0.18
TIP [%]     21        13        14        8
PPE [-]     1.12      1.12      1.51      1.05
BPU [-]     5.2±1.2   7.3±2.1   8.7±2.4   3.2±1.9
The results clearly show that the type of users does have an impact on the values of the indicators; however, the HRI did not fail at any of the events. The very low user catching success rate at event E4 deserves further analysis. From the questionnaires given to the users we can state that the reason for avoiding the robot was not the HRI (the users knew what to do), but a general reluctance towards the robot itself.

Fig. 6. HRI in action
5. Conclusion and future work
The described human-robot interface module was successfully designed and implemented on the mobile robot Advee. The HRI is now fully operational and used in the commercial application of the robot, with over 200 hours of operation in the interaction mode with a variety of audiences.
While the HRI exhibits no problems, the parametrization and flexibility need improvements, which can probably be achieved only through the design of a special scripting language for behavior description. Currently ongoing research is focused on extending the recognition capabilities (age, gender) of the image processing unit. Gesture recognition was rejected, as the cost of explaining to the user how to use the gestures outweighs the potential benefits.
Acknowledgements
Published results were acquired with the support of the CAS under the research plan AV0Z20760514 and with the support of the OPEC, project number CZ.1.07/2.3.00/09.0162. The authors would like to thank Petr Schreiber from Bender Robotics for his work on the HRI coding.
Authors
Jiri Krejsa* – Institute of Thermomechanics AS CR v.v.i., Brno branch, Technická 2, Brno, 616 69, Czech Republic, krejsa@fme.vutbr.cz
Vit Ondrousek – Brno University of Technology, Faculty of Mechanical Engineering, Brno, 616 69, Czech Republic, ondrousek@fme.vutbr.cz
*Corresponding author
References
[1] M. W. Kadous, R. K.-M. Sheh, C. Sammut, "Effective user interface design for rescue robotics". In: ACM Conference on Human-Robot Interaction, Salt Lake City, UT, USA: ACM Press, 2006.
[2] K. Dautenhahn, M. Walters et al., "How may I serve you? A robot companion approaching a seated person in a helping context". In: ACM Conference on Human-Robot Interaction (HRI), Salt Lake City, UT, USA: ACM Press, 2006.
[3] M. Skubic, D. Perzanowski et al., "Spatial language for human-robot dialogs", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 34, issue 2, 2004, pp. 154-167.
[4] V. Kulyukin, "Human-Robot Interaction Through Gesture-Free Spoken Dialogue", Autonomous Robots, vol. 16, no. 3, 2004, pp. 239-257.
[5] K. Park, S. Lee et al., "Human-robot interface using robust speech recognition and user localization based on noise separation device". In: RO-MAN 2009, pp. 328-333.
[6] M. Chang, J. Chou, "A friendly and intelligent human-robot interface system based on human face and hand gesture". In: AIM 2009, pp. 1856-1861.
[7] D. Perzanowski, A. C. Schultz et al., "Building a multimodal human-robot interface", IEEE Intelligent Systems, vol. 16, no. 1, 2001, pp. 16-21.
[8] J. Krejsa, S. Věchet, "Odometry-free mobile robot localization using bearing only beacons". In: Proceedings of EPE-PEMC 2010 Conference, Macedonia, pp. T5/40-T5/45.
[9] A. S. Huang, E. Olson et al., "LCM: Lightweight Communications and Marshalling". In: IROS 2010, Taipei, Taiwan, pp. 4057-4062.
 
According To Apa Formatting Guidelines Your
According To Apa Formatting Guidelines YourAccording To Apa Formatting Guidelines Your
According To Apa Formatting Guidelines Your
 
Sample Abstract - Photos
Sample Abstract - PhotosSample Abstract - Photos
Sample Abstract - Photos
 
Critical Essay
Critical EssayCritical Essay
Critical Essay
 

Recently uploaded

Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxAvyJaneVismanos
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfadityarao40181
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxUnboundStockton
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 

Recently uploaded (20)

Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
Final demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptxFinal demo Grade 9 for demo Plan dessert.pptx
Final demo Grade 9 for demo Plan dessert.pptx
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdf
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
Blooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docxBlooming Together_ Growing a Community Garden Worksheet.docx
Blooming Together_ Growing a Community Garden Worksheet.docx
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 
Staff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSDStaff of Color (SOC) Retention Efforts DDSD
Staff of Color (SOC) Retention Efforts DDSD
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 

Human-Machine Interface For Presentation Robot

Journal of Automation, Mobile Robotics & Intelligent Systems, Volume 6, No. 2, 2012

Human-machine Interface for Presentation Robot
Jiri Krejsa, Vit Ondrousek
Submitted 27th June 2011; accepted 27th September 2011

Abstract:
The purpose of a mobile presentation robot is to interact with users, providing the required information. This paper presents the approach used in the highest software layer of the autonomous mobile robot Advee, used commercially for presentation purposes. The requirements for the layer are discussed together with the available means of the robot. The particular building blocks and overall structure of the module are shown, and key features are described in detail together with the behavior definition. As proved during verification tests, the module serves successfully in a variable environment represented by people of widely varying computer literacy.

Keywords: human-robot interface, mobile robot, presentation robot

1. Introduction
With mobile robots nowadays appearing outside research laboratories more than ever, human-robot interface (HRI) related issues attract great attention [1, 2]. Most of the research is focused on improving certain means of communication, especially voice dialog (sound used both as input – natural language understanding – and output – robot voice) [3, 4] and the utilization of computer vision (human face recognition, recognition of higher face-related features, gesture recognition, etc.). Various means of communication can be combined to increase the level of interaction. In [5] speech recognition is combined with user localization using two processed microphone signals to obtain the location of the sound source, which can be further utilized in the response of the robot. Face recognition is accompanied by gesture recognition in [6], bringing a wider variety of possible inputs to the robot.
The work of Perzanowski [7] is an example of multimodal HRI, combining speech and gesture recognition with external input from a Personal Digital Assistant (PDA).
This paper is focused on the HRI of the presentation robot Advee, whose purpose is to serve as an autonomous mobile source of information, transferable to people of different computer literacy. Such a requirement brings the necessity of combining all the robot's means to obtain redundancy in the communication channels, so that less computer-literate people can still get the message while more literate users are not repelled.
The paper is organized as follows. After a short introduction of the robot itself, the requirements for the HRI layer are summarized together with the available means (HRI-related inputs and outputs). The proposed structure of the system is discussed with attention to the key modules. The verification performed during the tests, together with a discussion, concludes the paper.

2. Presentation robot
The purpose of a presentation robot is to present certain information to users in an interactive manner. The following paragraphs describe the design of the human-robot interface module for the robot Advee. As the HRI is only a part of the robot system, a short introduction of the robot itself follows.
Presentation robot Advee is a 160 cm high, 80 kg wheeled robot with Ackermann steering. Its rear wheels are driven by a single DC actuator through a mechanical differential. The robot is powered by an 8-cell LiFePO4 battery giving it approximately 8 hours of operation. Localization of the robot is handled by fusing robot motion controller commands, odometry readings and readings from a scanner of infrared beacons placed at known fixed locations. The fusion is performed by an extended Kalman filter [8].
The software operated on the robot can be divided into three layers:
· low level – provides interaction with hardware devices
· middle level – implements robot state estimation and path planning
· high level – handles the user interaction

Fig. 1. Presentation robot Advee
The robot is equipped with a wide range of sensors and outputs, serving as the primary resources for the HRI layer. The particular sources are described in detail below. The size of the robot and access to its main communication devices are illustrated in Fig. 1.

3. HRI module
3.1. Module requirements
In order to design the HRI module properly, the requirements must be stated first. The overall goal is to create an interface that is friendly and universal (usable by a variety of users), while the underlying engine is robust, modular and flexible. The requirements are listed in order of importance in Tab. 1.

Tab. 1. HRI requirements
· redundancy – information should be transferred to the user in all possible ways in parallel (e.g. both visual and sound)
· robustness – the HRI module must be capable of operation even when a minor hardware failure occurs (e.g. the camera fails)
· flexibility – simple addition of a new feature, sensor, etc.
· parametrization – overall behavior can be easily changed; certain features can be disabled/enabled upon request
· adaptability – the HRI should adapt to the user's abilities, e.g. longer timeouts for slower users
· modularity – several programmers can work independently on the project; particular modules can be tested independently

3.2. Robot means
Based on the requirements, what are the means of the robot? The means fall into two basic categories: inputs and outputs, where inputs represent information from the user to the robot and outputs represent the robot's reaction. Some of the means are bidirectional, e.g. the touch screen acts both as input and output. An overview of Advee's means is given in Tab. 2, and the physical location of the means on the robot is shown in Fig. 2.

Tab. 2. Robot HRI means
Inputs:
· Screen (capacitive touch screen) – robust touchscreen returning touch coordinates
· Voice (microphone, soundcard) – incoming sound can be recorded and further processed
· Vision (CCD camera) – constant flow of images, further processed (face detection, etc.)
· Lower level (bumpers, proximity sensors, odometry) – data from the lower level of robot control (distances from obstacles, current location, etc.)
Outputs:
· Screen (LCD screen) – high-resolution screen to present visual information
· Voice (soundcard, amplifier, speakers) – modulated voice, currently prerecorded
· Print (printer) – thermal printer up to 112 mm wide
· Motion (motion actuators) – safe motion in a given area, towards a given goal

Fig. 2. Robot means physical location

3.3. HRI building blocks
The key idea in designing the HRI module is to separate the interaction into independent blocks that are sequentially activated by the main HRI internal engine. Only a single block is active at a time. The overall behavior of the robot can then be defined in a single routine that controls block activation. Each block uses different means of the robot; however, in most cases the majority of the means are used.

Fig. 3. HRI building blocks
To enable access to the means, the particular means are encapsulated into so-called sources. Programmatically, the blocks and sources are represented by classes. Each block inherits from the BaseBlock class and each source inherits from the BaseSource class. BaseBlock contains links to all the source instances; therefore any block inheriting from it can access all the sources. As a source can serve a single block only, a mechanism that assigns and releases sources is implemented: sources are assigned prior to block activation and released once the block is finished. The currently implemented sources and blocks are shown in Fig. 3, with Map as an example of an active block having access to all the sources.
The sources directly depend on the means of the robot from the previous chapter. They encapsulate the access methods and the events generated by the corresponding devices. The key sources are:
Screen: shows the graphics and generates events when a touch is detected.
Vision: processes the camera images with a cascade of boosted classifiers based on Haar-like features for frontal face detection.
SoundIn: processes the incoming sound to detect simple patterns (no speech recognition yet).
SoundOut: plays prerecorded sound files of modulated voice. As the sounds are prerecorded, a variety of recordings is available for each type of sound, to avoid repeating the same phrases.
Printer: thermal printer with 112 mm wide output.
LowLevel: encapsulates the methods and events of the lower software layers – in particular the motion of the robot, its estimated position (given by the extended Kalman filter based localization technique), the motion planner outputs, proximity sensor calls, hardware monitor calls (battery status, temperatures, etc.), etc.
Each block represents a single operation in human-robot interaction.
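The source/block ownership scheme described above can be sketched as follows. The robot's implementation is in C#; this is only an illustrative Python sketch of the pattern, and the method names (assign, release, activate, deactivate) are chosen for illustration:

```python
class BaseSource:
    """Wrapper for a robot device ('source'); a source serves one block at a time."""
    def __init__(self, name):
        self.name = name
        self.owner = None              # block currently holding this source

    def assign(self, block):
        if self.owner is not None:     # sources cannot be shared between blocks
            raise RuntimeError(f"source {self.name} is already assigned")
        self.owner = block

    def release(self):
        self.owner = None


class BaseBlock:
    """Base class for interaction blocks; holds links to all source instances."""
    def __init__(self, sources):
        self.sources = sources         # dict: name -> BaseSource instance
        self.done = False

    def activate(self):
        # sources are assigned prior to block activation
        for src in self.sources.values():
            src.assign(self)
        self.done = False

    def deactivate(self):
        # sources are released once the block is finished
        for src in self.sources.values():
            src.release()

    def process(self):
        """Overridden by concrete blocks (Motion, Catcher, Selector, ...)."""
        self.done = True
```

A concrete block such as Map would subclass BaseBlock and react to events from self.sources while it holds them.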
The key blocks are:
Motion – active when the robot is moving and not interacting with people. Motion can end for a number of reasons: people are detected nearby (a combination of proximity sensor analysis, face detection in the camera images, etc.), somebody touches the screen, etc.
Catcher – verifies whether there is a person the robot can talk to. The block serves as a decision maker in uncertain situations (low confidence in face detection, etc.).
Selector – serves as a menu allowing the user to select from several options, usually activating another block. The options are shown graphically as buttons on the screen with both pictograms and text. The selection is accompanied by a voice explanation of the options.
Custom blocks – responsible for particular features of the data to present. Their names are self-explanatory, for example:
Map shows an interactive map of the surroundings with the robot's position indicated.
Video runs a video; interruption is allowed.
Games lets users play games, usually followed by printing a reward in case of victory.
Print handles printing: guiding the user through the print process, checking whether the user withdrew the print, etc.
Within each block a finite state machine is implemented, handling the behavior of the block – in particular the responses to user inputs (screen taps, incoming sounds, etc.) and internal events (end of the output sound, changes in lower level data, etc.).

3.4. HRI behavior definition
The whole behavior of the robot is determined by the sequence of blocks. The sequence depends partially on the user and partially on the behavior definition. The HRI engine works in the simple cycle shown in Fig. 4. Once the appropriate block is selected, all sources are assigned to the block, and the block is activated and starts to operate. Within the block, the finite state machine manages the incoming events and produces the required outputs.
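The block-internal state machine can be sketched as a simple transition table. This is an illustrative Python sketch only; the states and events below are hypothetical, not the actual states of Advee's blocks:

```python
def make_block_fsm():
    """Minimal finite state machine for a hypothetical Selector-like block:
    it reacts to user inputs (screen taps) and internal events (sound finished,
    timeout). Unknown (state, event) pairs leave the state unchanged."""
    transitions = {
        ("idle", "screen_tap"): "show_options",
        ("show_options", "option_chosen"): "play_voice",
        ("show_options", "timeout"): "done",
        ("play_voice", "sound_finished"): "show_options",
    }
    state = ["idle"]

    def handle(event):
        state[0] = transitions.get((state[0], event), state[0])
        return state[0]

    return handle
```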
Fig. 4. HRI engine routine (select block → assign sources → activate block → release sources, with a separate branch for emergency handling)
Fig. 5. Behavior signal flow (block tree: Motion, Catcher and the main Selector branching into language, video, game, map, schedule and print blocks)
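The engine cycle of Fig. 4 can be sketched as follows. Again, this is an illustrative Python sketch of the described loop (the actual engine is written in C#), and all names are hypothetical:

```python
def hri_engine_cycle(select_block, emergency, handle_emergency, running):
    """One HRI engine per Fig. 4: pick the next block, run it to completion,
    then branch on emergency before the next iteration.

    select_block: returns the next block per the behavior definition
    emergency / handle_emergency: emergency detection and handling hooks
    running: loop condition (e.g. robot is switched on)
    """
    while running():
        block = select_block()
        block.activate()               # assigns all sources to the block
        while not block.done:
            block.process()            # block's internal state machine runs
        block.deactivate()             # releases the sources
        if emergency():                # HW failure etc., handled outside blocks
            handle_emergency()
```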
Once the block is done, the sources are released and the next block is selected. Each block can finish in a regular manner, or when an emergency event occurs (hardware failure, etc.). In such a case the emergency is handled separately before the main cycle is entered again.
The behavior can be easily visualized with a flow chart; an example of the signal flow is shown in Fig. 5. The flow can be arbitrarily modified, with certain features (blocks or block sequences) omitted or added.

3.5. Implementation details
The high-level layer of the robot software runs on the Windows 7 OS. The HRI module is implemented in C# using the Microsoft Visual Studio environment. Computationally demanding routines (e.g. image processing) are written in C and C++.
The predecessor of all source classes – the BaseSource class – implements virtual methods for the essential operations with the sources (startup and cleanup). The predecessor of all block classes – the BaseBlock class – implements virtual event responses for all possible source events, so any descendant block can respond to such events. Virtual methods for block activation and deactivation are also implemented in the BaseBlock class, taking care of assigning the sources to the currently active block and releasing them when the block is finished. The termination of block processing is announced with the BlockDone virtual event, overridden in particular blocks.
The lower layers of the software run on a separate computer with a Linux OS. Communication with the lower layers (position estimation, motion, path planning, etc.) is based on the Lightweight Communication and Marshalling (LCM) library [9] and is fully encapsulated in the SrcLowLevel source.

4. Verification
The HRI was initially verified using a released version of the HRI together with a simulator of the lower layers of the robot.
During these tests the HRI was presented mainly to students, and basic settings were tuned, such as the timeouts for particular blocks and the understandability of the graphics. Once the basic setup was done, tests were performed using the real robot on a number of people from various backgrounds, and data in the form of questionnaires were collected afterwards, together with camera recordings.
Based on the collected data the HRI was further modified, with emphasis on the redundancy of the presented information; e.g. all voice outputs (the robot's talk) are accompanied by text of the same meaning on the screen. Further tests led to minor modifications: rephrasing the speech output and changing the voice modulation for further clarity. During the use of the robot all user inputs are logged, and the collected data are used for further analysis and improvements.
In order to determine whether the HRI module works according to the requirements, several quantities can be extracted from the data collected during the interactions. The values obtained at several events with different types of users can then be compared, giving an indication of HRI quality (ideally the quantities should not vary). While there could be numerous such indicators with respect to particular features (e.g. time spent with the map, win rate in games, etc.), only a few give a general impression of the success of the HRI. Those indicators are described below.
ATI – average time of interaction with a single user [s]: the interval between the start of the interaction and its end (the user left the robot). A longer ATI means a single user spent more time with the robot; it is only an indirect indicator of how well the interaction went.
CSR – user catching success rate [-]: the ratio between successful catches of a user (an interaction starts) and failures (the user walks away). A lower CSR means that people are afraid of the robot, do not know how to start an interaction, etc.
TIP – interaction timeout percentage [%]: the percentage of interactions ended with a timeout in an arbitrary block other than the main menu. A higher percentage means users left the robot during the interaction.
PPE – persons per event [-]: the total number of interactions divided by the total number of users. If PPE is higher than 1, some users used the robot repeatedly.
BPU – blocks per user [-]: the number of blocks activated per user in a single interaction. A higher BPU means the user explored more options during the interaction.
Data collected at several events with different audiences are listed in Tab. 3. The particular events were:
E1 – VIP audience
E2 – general audience
E3 – university students
E4 – seniors

Tab. 3. HRI indicators results
        E1        E2        E3        E4
ATI     48±11     73±19     85±21     96±71
CSR     0.89      0.75      0.97      0.18
TIP     21        13        14        8
PPE     1.12      1.12      1.51      1.05
BPU     5.2±1.2   7.3±2.1   8.7±2.4   3.2±1.9

The results clearly show that the type of user does have an impact on the indicator values; at none of the events, however, did the HRI fail. The very low user catching success rate at event E4 deserves further analysis. From the questionnaires given to the users we can state that the reason users avoided the robot was not the HRI (users knew what to do), but a general reluctance towards the robot itself.

Fig. 6. HRI in action

5. Conclusion and future work
The described human-robot interface module was successfully designed and implemented in the mobile robot Advee. The HRI is now fully operational and used in the commercial application of the robot, with over 200 hours of operation in interaction mode with a variety of audiences. While the HRI exhibits no problems, the parametrization and flexibility need improvements, which can probably be achieved only through the design of a special scripting language for behavior description. Currently ongoing research is focused on extending the recognition capabilities (age, gender) of the image processing unit. Gesture recognition was rejected, as the cost of explaining to the user how to use the gestures outweighs the potential benefits.

Acknowledgements
Published results were acquired with the support of the CAS under the research plan AV0Z20760514 and with the support of the OPEC, project number CZ.1.07/2.3.00/09.0162. The authors would like to thank Petr Schreiber from Bender Robotics for his work on the HRI coding.

Authors
Jiri Krejsa* – Institute of Thermomechanics AS CR v.v.i., Brno branch, Technická 2, Brno, 616 69, Czech Republic, krejsa@fme.vutbr.cz
Vit Ondrousek – Brno University of Technology, Faculty of Mechanical Engineering, Brno, 616 69, Czech Republic, ondrousek@fme.vutbr.cz
*Corresponding author

References
[1] M. W. Kadous, R. K.-M. Sheh, C. Sammut, "Effective user interface design for rescue robotics". In: ACM Conference on Human-Robot Interaction, Salt Lake City, UT, USA: ACM Press, 2006.
[2] K. Dautenhahn, M. Walters et al., "How may I serve you? A robot companion approaching a seated person in a helping context". In: ACM Conference on Human-Robot Interaction (HRI), Salt Lake City, UT, USA: ACM Press, 2006.
[3] M. Skubic, D. Perzanowski et al., "Spatial language for human-robot dialogs", IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 34, no. 2, 2004, pp. 154-167.
[4] V. Kulyukin, "Human-Robot Interaction Through Gesture-Free Spoken Dialogue", Autonomous Robots, vol. 16, no. 3, 2004, pp. 239-257.
[5] K. Park, S. Lee et al., "Human-robot interface using robust speech recognition and user localization based on noise separation device". In: RO-MAN 2009, pp. 328-333.
[6] M. Chang, J. Chou, "A friendly and intelligent human-robot interface system based on human face and hand gesture". In: AIM 2009, pp. 1856-1861.
[7] D. Perzanowski, A. C. Schultz et al., "Building a multimodal human-robot interface", IEEE Intelligent Systems, vol. 16, no. 1, 2001, pp. 16-21.
[8] J. Krejsa, S. Věchet, "Odometry-free mobile robot localization using bearing only beacons". In: Proceedings of EPE-PEMC 2010 Conference, Macedonia, pp. T5/40-T5/45.
[9] A. S. Huang, E. Olson et al., "LCM: Lightweight Communications and Marshalling". In: IROS 2010, Taipei, Taiwan, pp. 4057-4062.