
2010 Fifth International Conference on Systems and Networks Communications (ICSNC), Nice, France, August 22-27, 2010

Testbed Environment for Wireless Sensor and Actuator Networks

Atanasko Atanasov, Andrea Kulakov, Vladimir Trajkovic and Danco Davcev Computer Science Department

Faculty of Electrical Engineering and Information Technologies Skopje, Macedonia

e-mail: {atanasko.atanasov, kulak, trvlado, etfdav}@feit.ukim.edu.mk

Abstract - Wireless Sensor and Actuator Networks (WSANs) define collaborative operation between sensors and actuators, enabling distributed sensing of a physical phenomenon. In this paper, an integrated testbed consisting of a wireless sensor network, a server station and a robot system is proposed. Our main contribution is the design and development of the server- and actuator-tier software modules and the communication protocols between them. We provide new applications in which actuators are part of the network and perform actions based on the information gathered by sensors, and we describe several test scenarios for the potential use of the system. We envision that WSANs will be an integral part of systems such as battlefield surveillance, nuclear, biological or chemical attack detection, home automation, and environmental monitoring. Our platform offers reusability, modularity and simplicity to the user. Furthermore, it is built from cheap, off-the-shelf components.

Keywords - Wireless Sensor and Actuator Networks; mobility; monitoring; path planning; navigation

I. INTRODUCTION

A network of robots (actuators, or actors) and sensors consists of a collection of sensors distributed over some area, forming an ad-hoc network, and a collection of mobile robots that can interact with the sensor network. Each sensor is equipped with limited memory and processing capabilities, multiple sensing modalities and communication capabilities. Sensor networks extend the sensing capabilities of the robots and allow them to act in response to events outside their perception range. Mobile robots, in turn, extend sensor networks through their ability to move and interact with the environment. Sensor networks provide a way of examining environments of interest by delivering numerous small snapshots over time; fusing these snapshots produces a coherent picture of the scene. But such networks face big challenges: they cannot take an active role in manipulating and interacting with their environment, nor can they physically reconfigure themselves to enable more efficient area coverage, in-depth examination of targets, reliable wireless connectivity or dynamic protection against inclement environmental developments. By incorporating intelligent, mobile robots directly into sensor networks, all these shortcomings can be addressed. These hybrid systems provide interaction with

the environment in a new, dynamic and decentralized way and allow the development of new solutions to classical problems such as localization and navigation. The research challenges for coordination and communication in these systems are discussed in [1]. With the idea of eliminating human intervention from wireless sensor networks (WSNs), Wireless Sensor and Actuator Networks (WSANs) have received growing attention from the research community in the past few years [2].

The set of capabilities provided by robot-sensor networks matches up well with those needed to build an effective search and rescue system, for example by guiding mobile actuators to locate survivors. In fire detection applications, sensors can relay the exact origin and intensity of the fire to water-sprinkler actuators that will extinguish the fire before it spreads.

Similarly, motion, acoustic, or light sensors in a building can detect the presence of intruders and command cameras or other instrumentation to track them. Alternatively, mobile actuators can be moved to the area where the intruder has been detected to obtain high-resolution images and to prompt or block the intruder [3].

Section 2 of the paper presents related work, while Section 3 gives the architecture of the system. The design and implementation of the platform is presented in Section 4, and the test scenarios and results are given in Section 5. Section 6 concludes the paper.

II. RELATED WORK

Robot-sensor networks have evolved from work both in WSNs and in mobile robotics, particularly on autonomous robot teams. The robot-sensor systems therefore need to address the technical challenges posed in both fields, along with novel challenges unique to hybrid robot-sensor networks, suggesting a huge problem space open for exploration. Such systems address problems that are mainly categorized as:

• Localization and mapping

• Communication and routing

• Path planning

• Target tracking

2010 Fifth International Conference on Systems and Networks Communications

978-0-7695-4145-7/10 $26.00 © 2010 IEEE

DOI 10.1109/ICSNC.2010.8


• Standardization of hardware services/interfaces

• Asymmetric wireless broadcast

Our work mostly focuses on the first three, exploring how such systems can provide useful and robust behaviors with minimal hardware requirements or dependence on favorable environmental conditions.

Many papers relate to this topic. Melodia et al. proposed a mathematical model to optimally assign tasks to actuators and control their motion based on the characteristics of the events [4]; they also suggest a location management scheme. Sedighian et al. give a solution for communication between multiple actuators in an environment with obstacles, exploiting accessible sensor nodes to indirectly route packets between actuators [5]. Buckl et al. presented an approach that profits from the benefits of SOA implementations, such as Web service interfaces and IP-compatible addressing, while the implementation can still be done on resource-constrained devices such as a sensor node; the idea is to use a data-centric processing paradigm at the device level and a gateway that mediates between the Web service world and the embedded-device world [6]. In [7], the next generation of SANETs (Sensor Actuator Networks) is introduced: large-scale autonomous systems deployed by multiple infrastructure providers and running multiple applications, providing ubiquitous services collaboratively to both stationary and mobile users. A wireless sensory control for mobile robot navigation, where the processing for the robot trajectory is moved to a PC, is presented in [8].

As discussed in the previous paragraph, there are many open research challenges in enabling real-time communication and coordination in SANETs, especially due to resource constraints and scalability issues. Although a few recent papers are specifically concerned with coordination and communication problems in SANETs, the literature on the subject is very limited, and there is a lack of implementations and test platforms.

We developed a prototype platform that can be used as a testbed to perform real-world sensor-actuator experiments with mobile sensors and actuators. It differs from the testbeds given in [18], [19] and [20] because it addresses sensor-actuator coordination and all related communication problems (which are not the same as those of a pure WSN). It can be used in both indoor and outdoor environments. The platform offers a good starting point for further scientific experiments and research in the area of WSANs. For example, the drawbacks of the DSSR (Data-correlation based Sensor Selection Routing) protocol for context-aware services in SANETs are resolved in [9], but only through simulation studies.

III. ARCHITECTURE OF THE SYSTEM

The architecture of the system is built around seven requirements:

• Easily accessible and cheap hardware

• Semi-autonomous robot

• Automatic robot navigation to a goal

• Indoor localization without use of GPS

• Wireless communication between robot and the sensors

• Robust and flexible routing algorithm for the sensor measurement data packets

• Control management center for administration of the system, including visualization of sensors readings, manual control of robot, alert management, robotic camera processing etc.

Figure 1. General architecture of the system

Figure 1 shows that the system has three general components: a WSN, a Server station, and a Robot system. They are interconnected with standard communication

Figure 2. Software deployment diagram.


protocols. Figure 2 shows where exactly the particular software components are deployed. Our main contribution is in the design and development of the server- and actuator-tier software modules and the communication protocols between them, but not in the sensor tier, where commercial software and protocols are used.

The wireless sensor network (WSN) consists of MICAz (MPR2400) sensor nodes with the MTS310CA sensor board developed by Crossbow [11]. They use an IEEE 802.15.4 compliant ZigBee radio-frequency transceiver. The sensor board is equipped with an acoustic sensor, light, magnetic-field and temperature sensors, and an accelerometer. We used the MIB600CA interface board, which serves as an Ethernet gateway for the mote network and as a mote-programming board. The base mote is mounted on this board. One standard PC configuration acts as the server station; the WSN interface board and the PC are connected over a LAN. The software that we have implemented for the PC is discussed in the next section.

The PC (server station) is equipped with a WLAN adapter that is used for the connection with the robot system. The robot system consists of a Robosapiens V1 robot [13]. Robosapiens is a battery-operated, remote-controlled robotic toy. It has an infrared receiver through which we control the robot automatically from a Pocket PC. This device is fixed on the back of the robot, with its infrared diode directed into the infrared receiver of the robot; in this way, software installed on the Pocket PC can send infrared codes to the robot. We reverse-engineered all the command codes of the remote control and used them in the Pocket PC software to send codes programmatically through the IR port. In effect, we replaced the remote control with a Pocket PC and a software simulator. The software installed on the Pocket PC can therefore control the motion of the robot, making it an independent robot equipped with its own processing unit and camera, capable of communicating wirelessly with other devices around it. A wireless LAN connection is created between the server PC and the Pocket PC (HP iPAQ 4150) [12]. This connects the robot to the server for exchanging synchronization data, sending commands from the PC to the robot, camera streaming, etc. How this wireless channel is implemented and used is described in detail later.

A mobile phone (Sony Ericsson K750 [14]) is plugged into the server PC with a USB cable. With a SIM card installed, this phone behaves like a GSM modem capable of sending and receiving SMS messages. The server software we have developed has a module that uses this functionality to send or receive messages synchronously or asynchronously through the gateway (GSM modem). The details of the implementation are discussed later.

IV. DESIGN AND IMPLEMENTATION OF THE SYSTEM

The wireless network deployment is composed of three distinct software tiers:

• Mote tier, where XMesh [10] resides: the software that runs on the cloud of sensor nodes forming a mesh network. XMesh provides the networking algorithms required to form a reliable communication backbone that connects all the nodes within the mesh cloud to the server.

• Server tier: an always-on facility that handles translation and buffering on the bridge between the wireless motes and Internet clients. XServe [10] is the server-tier application, which can run on a PC.

• Client tier: provides the user visualization software and a graphical interface for managing the network. XMesh can be interfaced to custom client software as well.

XMesh is a full-featured multi-hop, ad-hoc mesh networking protocol for wireless networks [10]. The nodes are capable of hopping radio messages to a base station, where they are passed to a PC. The hopping provides improved radio coverage and improved reliability. XMesh also provides a networking service that is both self-organizing and self-healing: it can route data from nodes to a base station (upstream) or downstream to individual nodes. The XMesh software library uses the TinyOS operating system that runs on the MICAz motes. The sensor and data-acquisition boards that we are using are supported by XMesh-enabled applications; we used the XMTS310 application.
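XMesh's actual routing is proprietary and based on link-quality estimation; the following Python sketch only illustrates the general idea of upstream, parent-based multi-hop delivery in a mesh. The node ids, topology and the hop-count metric are our own assumptions, not Crossbow's algorithm.

```python
from collections import deque

BASE = 0  # node id of the base-station mote

def choose_parents(links):
    """Build a shortest-path tree toward the base station: every node
    adopts as parent the neighbor through which it was first reached
    (BFS on hop count)."""
    parent = {BASE: None}
    queue = deque([BASE])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, []):
            if neighbor not in parent:
                parent[neighbor] = node
                queue.append(neighbor)
    return parent

def route_upstream(parent, source):
    """Follow parent pointers from a sensing node up to the base station."""
    path = [source]
    while parent[path[-1]] is not None:
        path.append(parent[path[-1]])
    return path

# Symmetric radio links between four motes (adjacency list).
links = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
parent = choose_parents(links)
print(route_upstream(parent, 3))  # → [3, 1, 0]
```

A self-healing network would re-run the parent selection whenever link quality changes; here a single BFS pass stands in for that process.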

XServe serves as the primary gateway between wireless mesh networks and enterprise applications interacting with the mesh (Figure 3). XServe connects to the XMesh-enabled application on the motes through a base station mote running the XMeshBase application. At its core, XServe provides services to route data to and from the mesh network, with higher-level services to parse, transform and process data as it flows between the mesh and the outside application. The user can interact with XServe through a terminal interface, and applications can access the network directly through a powerful XML-RPC command interface.

We have developed software (ClientGUI) that uses this XML-RPC command interface and asynchronously receives sensor measurement data packets (Figure 3). Moreover, it can send commands downstream to the motes through the XServe module called XCommand, which allows us to get, set or reset parameters of the sensor boards. It is important to mention that we used the DL (Datalong) protocol for acquiring real sensor data instead of the other option, the more complex SF (Serial Forwarder) protocol.
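XServe's real XML-RPC method names are not reproduced in the paper, so the sketch below stands up a local stand-in server with a hypothetical `get_reading()` call to show the client/server pattern a GUI module would use; everything here, including the port and method name, is an illustrative assumption.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def get_reading(node_id):
    # Stand-in for a query the gateway would answer with the latest
    # parsed measurement of the given mote.
    readings = {1: {"light": 512, "temp": 23}}
    return readings.get(node_id, {})

# Throwaway local server playing the role of the gateway's RPC endpoint.
server = SimpleXMLRPCServer(("127.0.0.1", 8085), logRequests=False)
server.register_function(get_reading)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: what a GUI module would do to pull sensor data on demand.
proxy = ServerProxy("http://127.0.0.1:8085")
result = proxy.get_reading(1)
print(result)  # → {'light': 512, 'temp': 23}
server.shutdown()
```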

The application that acts as a bridge between the robot system and the wireless sensor network is deployed on the server PC and is described here in more detail. It is a Windows Forms application (C#) that uses .NET Framework 3.5 [15]. It is made up of five modules (Figure 3):


• Live Control Module

• SMS Service

• Sensor Monitoring

• Alert Manager

• Robotic Live Camera

The first module, Live Control, allows the user to interact with the robot directly.

Figure 3. Modular design of the system

The main functionalities are: remote manual control of the robot's motions, sending text messages (written in simplified English) to the robot's brain, and voice control and recognition (using the Microsoft Speech API, SAPI [16]). Robosapiens V1 can perform 67 unique functions, including four different ways of walking and two different ways of turning. We used this limited set of infrared codes to create a simple robot language, so that the robot will understand a text message such as: "Hello, go straight 10 meters, after that turn left 90 degrees, then go back 5 steps, wait 2 minutes and start a dance, do this at 12:23 h". The robot parses this message and transforms it into a sequence of atomic movement commands combined with pauses in between. For this purpose, we use a robot grammar, in which the set of terms, their combinations and the rules are defined. The same grammar is used for the voice control functionality.
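The paper's C# grammar is not reproduced, but the parsing step it describes can be sketched as a toy phrase table in Python: free text is reduced to (action, argument) pairs that could then be mapped onto IR codes. The phrase patterns and action names are our own simplification, not the full robot grammar.

```python
import re

# Phrase → atomic action table (a small, made-up subset of the grammar).
RULES = [
    (r"go straight (\d+) meters", "FORWARD"),
    (r"turn left (\d+) degrees",  "TURN_LEFT"),
    (r"go back (\d+) steps",      "BACKWARD"),
    (r"wait (\d+) minutes",       "PAUSE"),
    (r"start a dance",            "DANCE"),
]

def parse_message(text):
    """Scan the message and emit (action, argument) pairs in the order
    the phrases appear, ready to be mapped onto IR command codes."""
    found = []
    for pattern, action in RULES:
        for m in re.finditer(pattern, text.lower()):
            arg = int(m.group(1)) if m.groups() else None
            found.append((m.start(), action, arg))
    return [(action, arg) for _, action, arg in sorted(found)]

msg = ("Hello, go straight 10 meters, after that turn left 90 degrees, "
       "then go back 5 steps, wait 2 minutes and start a dance")
print(parse_message(msg))
```

A real grammar would also reject malformed input and handle scheduling clauses such as "do this at 12:23 h"; the sketch only covers phrase-to-command translation.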

The SMS Service module provides the functionality to send and receive SMS messages. The service creates a connection to the GSM modem; other modules use this service by registering handlers for incoming and outgoing SMS messages. The administrator and the system can thus communicate by SMS. The content of an incoming message can be anything that is defined in the robot grammar, so we can send batches of commands to the robot at any time. When a message is received at the GSM modem, the handlers in the system (the software at the PC level) dispatch the message directly to the robot's brain (the Pocket PC) through the WLAN. The software on this device has the required parsers and the grammar.

With this approach, if we replace the Pocket PC with an embedded GSM modem, the implementation does not need to be migrated from the PC to the Pocket PC. The SMS Service is also used by the Alert Manager module for sending alerts in the form of SMS notifications.

Sensor Monitoring is the most important module (Figure 4). It provides connectivity between the robot (actuator) system and the wireless sensor network. Sensor nodes and the robot's position are represented on the interface, which depicts the environment in which the WSN and the robot are installed. The lines between the node symbols on the screen show the topology structure and the parent/child links between motes. The module supports live monitoring of the sensor measurements; it tracks changes of the network topology and displays this information in a gradient of colors on the map.

Figure 4. GUI of the Sensor Monitoring module

Furthermore, we can monitor the position of the actuator (Robosapiens V1) in the system. Obstacles in the environment can be defined on this map by simple drawing on the canvas. We set the initial positions of the sensor motes, as well as the robot's position and orientation. After this, the server station synchronizes with the robot system; this process involves sending the environment parameters (obstacle positions, sensor locations, etc.) to the robot's brain. The robot is now aware of the environment at a higher level and, using these data, can make better decisions for the mission it is conducting.

The Alert Manager module enables the management of alarms related to wireless sensor network events. The module supports creating, editing and deleting multiple alarm entries. One alarm entry is dedicated to one mote and one sensor (e.g., light), with a threshold value and alarm conditions defined. If the condition is satisfied (e.g., the threshold value of the light intensity is exceeded), an alarm is fired. The action can be: 1) sending the robot to inspect the location of the mote for which the alarm was defined; or 2) sending an SMS notification to a predefined SIM number (e.g., the administrator's mobile phone number). Alarms can also be given a period of validity. Figure 5 describes the activities of the Alert Manager process: it checks all the alerts in memory and, if a required condition is reached, the corresponding action is performed.
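The evaluation loop described above can be sketched as follows: each alert binds one mote/sensor pair to a threshold condition, a validity period and an action. The field names and action strings are illustrative assumptions, not the module's actual C# data model.

```python
import operator

# Supported threshold conditions.
OPS = {">": operator.gt, "<": operator.lt}

def check_alerts(alerts, reading, now):
    """Return the actions of every alert whose condition the reading
    satisfies inside the alert's validity period."""
    fired = []
    for a in alerts:
        if a["mote"] != reading["mote"] or a["sensor"] != reading["sensor"]:
            continue  # alert is bound to a different mote/sensor pair
        if not (a["valid_from"] <= now <= a["valid_to"]):
            continue  # outside the alert's period of validity
        if OPS[a["condition"]](reading["value"], a["threshold"]):
            fired.append(a["action"])
    return fired

alerts = [
    {"mote": 1, "sensor": "light", "condition": ">", "threshold": 800,
     "valid_from": 0, "valid_to": 24, "action": "SEND_ROBOT"},
    {"mote": 1, "sensor": "light", "condition": ">", "threshold": 900,
     "valid_from": 0, "valid_to": 24, "action": "SEND_SMS"},
]
reading = {"mote": 1, "sensor": "light", "value": 850}
print(check_alerts(alerts, reading, now=12))  # → ['SEND_ROBOT']
```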

Figure 5. Activity diagram for Alert Manager Module

Figure 6 shows the GUI of the Alert Manager module, with four already created alerts visible.

It is advantageous for the network to possess reliable and complete end-to-end connectivity. However, even when the network is not fully connected, mobile robots may act as conduits of information, either by positioning themselves tactically to fill connectivity gaps, or by distributing information as they physically travel around the network space.

This strategy also enables the replacement of failed nodes and dynamic modification of the network topology, providing not only greater network connectivity but also improved area coverage [4]. This is where the alerts come in: alarms can monitor the parent values of the motes. If a change is detected, it means that RF links have been re-routed because of changes in the environment and bad signal between motes, and the action will be an actuator intervention.

Figure 6. GUI of the Alert Manager module

The Robot Live Camera module streams video wirelessly from the robot's camera (plugged into the Pocket PC) to the server PC. Additional functionality is implemented for motion detection and for recording video in MP4 format while motion lasts. The videos are stored on the server PC.
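The paper does not specify the motion-detection algorithm; a common minimal approach, sketched below under that assumption, is frame differencing: two consecutive grayscale frames are compared pixel-wise and motion is flagged when enough pixels changed. Both thresholds are arbitrary illustrative choices.

```python
def motion_detected(prev, curr, pixel_delta=30, min_changed=4):
    """Flag motion if at least `min_changed` pixels differ by more than
    `pixel_delta` intensity levels between two flattened grayscale frames."""
    changed = sum(1 for p, c in zip(prev, curr) if abs(p - c) > pixel_delta)
    return changed >= min_changed

still = [100] * 16             # 4x4 frame, flattened, uniform gray
moved = [100] * 8 + [200] * 8  # lower half of the frame changed

print(motion_detected(still, still))  # → False
print(motion_detected(still, moved))  # → True
```

Recording would then be toggled on while `motion_detected` keeps returning True for incoming frames.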

V. TEST SCENARIOS

In WSAN applications, the performed actions serve the purpose of enhancing the operation of the sensor network by enabling or extending its monitoring capability. For example, mobile actuators can accurately deploy sensors; pick up data from the sensors when in close range, buffer it, and drop it off at wired access points; perform energy harvesting; or enhance the localization capabilities of the sensors.

We tested the system in an indoor environment. The action that we tested was the robot reaching a sensor location in an environment with obstacles. Because the range of the MICAz motes is approximately 20-30 meters, we were able to deploy the testbed in a few rooms. Four MICAz motes and one Robosapiens V1 were used. The configuration of the environment (walls and other obstacle positions) was manually set by the user and sent to the robot. After this synchronization, we sent an SMS text message to the robot with the content "Go to sensor number 1". The robot processed this SMS message, and the result of the parsing was a burst of raw IR codes sent to the Robosapiens system. After a few seconds the robot started to carry out the order by making a path plan to avoid the obstacles (Figure 7). This action of visually detecting the mote was successfully performed in all experiments. The localization was implemented with the velocity motion model [17]. Additionally, we can program the robot to perform some action after arriving at the sensor 1 position, for example video recording or physical maneuvers (depending on the robot's capabilities). We repeated the same experiment without explicitly sending an SMS order, by instead creating an alert with a light threshold for sensor 1 and the action "send robot to inspect the place".


We manually covered the light sensor with a hand; after about 10 seconds the sensor detected the event and the data was transmitted to the actuator (robot). The robot processed the data and started moving toward the sensor.
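The localization uses the velocity motion model of [17]. Its deterministic core propagates the robot's pose along a circular arc given a translational velocity v and rotational velocity w over a time step dt; the noise terms of the full probabilistic model are omitted in this sketch.

```python
import math

def velocity_motion_model(pose, v, w, dt):
    """Propagate pose (x, y, theta) under constant velocities for dt
    seconds: a circular arc of radius v/w, or a straight line if w ≈ 0."""
    x, y, theta = pose
    if abs(w) < 1e-9:  # straight-line limit as w → 0
        return (x + v * dt * math.cos(theta),
                y + v * dt * math.sin(theta),
                theta)
    r = v / w  # signed radius of the arc
    return (x - r * math.sin(theta) + r * math.sin(theta + w * dt),
            y + r * math.cos(theta) - r * math.cos(theta + w * dt),
            theta + w * dt)

# Quarter circle: v = 1 m/s, w = pi/2 rad/s for 1 s from the origin.
x, y, theta = velocity_motion_model((0.0, 0.0, 0.0), 1.0, math.pi / 2, 1.0)
print(round(x, 3), round(y, 3), round(theta, 3))  # → 0.637 0.637 1.571
```

Dead-reckoning with this model accumulates error over time, which is one motivation for the better localization services mentioned as future work.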

Figure 7. Test scenario: the robot performs path planning to reach the sensor location

VI. CONCLUSION AND FUTURE WORK

We presented an integrated testbed of a wireless sensor-actuator network built from cheap, off-the-shelf hardware components.

We implemented a semi-autonomous robot-actuator that is aware of events in its environment and acts according to them. The XMesh software that runs in the WSN provides the necessary robustness and scalability for systems of this kind.

We also developed a flexible language (simplified English) for communication with the actuator in the system, and a control center that manages the system and processes the data produced by the sensor network.

Our main contribution is in the design and development of the server and actuator tier software modules and the communication protocols between them.

In future work, we will focus on implementing better localization services for the actuator, and we suggest incorporating more advanced processing of the sensor data. Learning an initial model of the environment (to detect time-related changes) is one promising machine-learning approach: after training, any changes relative to the learned normal model are treated as anomalies, possibly caused by an intruder in the system.

REFERENCES

[1] I. F. Akyildiz and I. H. Kasimoglu, "Wireless Sensor and Actor Networks: Research Challenges," Ad Hoc Networks (Elsevier), vol. 2, no. 4, pp. 351-367, Oct. 2004.

[2] A. Zamanifar, M. Sharifi, and O. Kashefi, "A Hybrid Approach to Actor-Actor Connectivity Restoration in Wireless Sensor and Actor Networks," IEEE Eighth International Conference on Networks, DOI 10.1109/ICN.2009.37, pp. 76-81, 2009.

[3] A. A. Abbasi, M. Younis, and K. Akkaya, "Movement-Assisted Connectivity Restoration in Wireless Sensor and Actor Networks," IEEE Transactions on Parallel and Distributed Systems, vol. 20, no. 9, pp. 1366-1379, 2009.

[4] T. Melodia, D. Pompili, and I. F. Akyildiz, "Handling Mobility in Wireless Sensor and Actor Networks," IEEE Transactions on Mobile Computing, vol. 9, no. 2, pp. 160-173, 2010.

[5] S. Sedighian, M. Sharifi, S. V. Azhari, and H. Momeni, "Service Requirements for Actor-Actor Coordination through Sensor Nodes in Wireless Sensor Actor Networks," Innovations in Information Technology '08, IEEE, Al Ain, UAE, pp. 475-479, 2008.

[6] C. Buckl, S. Sommer, A. Scholz, A. Knoll, A. Kemper, J. Heuer, and A. Schmitt, "Services to the Field: An Approach for Resource Constrained Sensor/Actor Networks," 2009 International Conference on Advanced Information Networking and Applications Workshops, pp. 476-481, 2009.

[7] R. Elt, M. Eltoweissy, and M. Youssef, "Towards Evolving Sensor Actor Networks," IEEE INFOCOM 2008, IEEE Conference on Computer Communications Workshops, pp. 1-6, 2008.

[8] M. Popa, M. Marcu, and A. S. Popa, "Wireless Sensory Control for Mobile Robot Navigation," Intelligent Systems and Informatics (SISY '09), DOI 10.1109/SISY.2009.5291164, pp. 197-201, 2009.

[9] B. Koo, J. Won, S. Park, and H. Eom, "PAAR: A Routing Protocol for Context-Aware Services in Wireless Sensor-Actuator Networks," Asian Himalayas International Conference on Internet 2009, pp. 5-7, 2009.

[10] Crossbow Technology, XMesh and XServe User's Manual, Revisions D and E, April 2007.

[11] Crossbow Technology, http://www.xbow.com (last accessed March 2010).

[12] HP iPAQ h4150/h4155 Quick Specification, http://h18000.www1.hp.com/products/quickspecs/11751_na/11751_na.pdf (last accessed April 2010).

[13] WowWee, overview of Robosapiens V1, http://www.wowwee.com/en/products/toys/robots/robotics/robosapiens/robosapien (last accessed April 2010).

[14] Sony Ericsson official product overview, http://www.sonyericsson.com/cws/products/mobilephones/overview/k750i (last accessed March 2010).

[15] D. Chappell, Chappell & Associates - Microsoft, "Introducing the .NET Framework 3.5," pp. 1-32, September 2007.

[16] Microsoft Speech API (SAPI) documentation, http://msdn.microsoft.com/en-us/library/ms723627(VS.85).aspx (last accessed March 2010).

[17] S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics, MIT Press, 2005.

[18] L. Tytgat, B. Jooris, P. De Mil, B. Latré, I. Moerman, and P. Demeester, "Demo Abstract: WiLab, a Real-Life Wireless Sensor Testbed with Environment Emulation," http://ewsn09.v6testbed.net/files/1569171512.pdf (last accessed May 2010), 2009.

[19] G. Werner-Allen, P. Swieskowski, and M. Welsh, "MoteLab: A Wireless Sensor Network Testbed," 4th International Conference on Information Processing in Sensor Networks (IPSN 2005), pp. 483-488, April 2005.

[20] V. Handziski, A. Köpke, A. Willig, and A. Wolisz, "TWIST: A Scalable and Reconfigurable Testbed for Wireless Indoor Experiments with Sensor Networks," Proceedings of the 2nd International Workshop on Multi-Hop Ad Hoc Networks: From Theory to Reality (REALMAN '06), Florence, Italy, pp. 63-70, May 2006.
