CYBER PHYSICAL DESTINATION OPERATING PANEL

Ogienoyevbede Raymond

Master of Science Thesis
Supervisors: Dr. Sébastien Lafond, Jukka Salmikuukka
Advisor: Aki Parviainen

Embedded Systems Laboratory
Department of Information Technologies
Åbo Akademi University

May 2014



ABSTRACT

Destination Control Systems (DCS) are the foundation of modern elevator control systems. They incorporate technology and usability into the intelligence of vertical transportation. Interfacing the DCS to elevator users is a touch device known as the Destination Operating Panel (DOP), which acts as a sophisticated yet simplified human device interface. DOPs exist in different configurations and with different functionalities. Regardless of how they are designed, they all share the core functionality of controlling people traffic through one or several elevators to which they are connected.

To enhance the experience of DOP users and the independent functioning of a DOP, the DOP functionalities can be extended across the wide range of technologies it can support, such as cyber physical system technology. New functionalities are added to DOPs incrementally, guided by research into user experience and highly efficient technologies. Considering the human sensory interface with a DOP as a cyber physical system, and how it affects usability and access control indications, this work researches and implements new DOP functionalities: a) haptic feedback, b) vision enhancement with ambient light detection, c) audibility system control driven by a sound sensor in the DOP surroundings, and d) visual signalization of the DOP state with added design allure. Sensors and actuators are researched, tested and fitted to the DOP to increase its functionality and transform it from a Destination Operating Panel into a Cyber Physical Destination Operating Panel.

Keywords: DCS, DOP, haptic feedback, ambient light detection, sound detection, semiotic light, sensors, actuators.


ACKNOWLEDGMENT

There are a number of people without whom this thesis would not have been a success and to whom I would like to express my deepest appreciation. To my family, who continue to encourage, inspire and support me in my quest to find and realize my full potential and to contribute to my field of expertise and the world around me, I say a special thanks. To Jukka Salmikuukka and Aki Parviainen, who acted as a link between my theoretical education and practical working field and gave me the opportunity to enrich myself with the vast knowledge and experience at their disposal, I am grateful. To Jari Tulilahti, Teppo Nurminen, Jarno Argillander, Klaus Mäkelä and Marika Vainio, who made available their expertise in electronics, mechanics, embedded systems, embedded programming and more, as much as was needed in the development and writing of this thesis: I appreciate all contributions.


CONTENTS

Abstract i

Acknowledgment ii

Contents iii

List of Figures vi

Glossary viii

1 Introduction 1
  1.1 Overview 1
  1.2 Goal of Thesis 3
  1.3 Thesis structure 4

2 Touch Device 5
  2.1 Hardware Design 5
    2.1.1 Mechanics 5
    2.1.2 Electronics 6
  2.2 Software Design 7
    2.2.1 Application Structure 7
    2.2.2 Device Operating System 7
    2.2.3 Device Application Software 9
    2.2.4 Serial Communication with RS485 Protocol 11
    2.2.5 Serial Communication with RS232 Protocol 18

3 Device Board 20
  3.1 Hardware Design 21
    3.1.1 Mechanics 21
    3.1.2 Electronics 22
  3.2 Software Design 25
    3.2.1 Application Structure 25
    3.2.2 Software Layers 26


4 Automatic Volume Control 30
  4.1 Hardware Design 30
    4.1.1 Mechanics 31
    4.1.2 Electronics 31
  4.2 Software Design 33
    4.2.1 Application Structure 33
    4.2.2 Linux Kernel requirement 33
    4.2.3 Electret Microphone 36
    4.2.4 Volume Levels 37

5 Haptic Feedback System 41
  5.1 Hardware Design 41
    5.1.1 Mechanics 41
    5.1.2 Electronics 45
  5.2 Software Design 47
    5.2.1 Application Structure 47
    5.2.2 Haptic Patterns 49

6 Semiotic Lighting System 51
  6.1 Hardware Design 51
    6.1.1 Mechanics 51
    6.1.2 Electronics 54
  6.2 Software Design 55
    6.2.1 Application Structure 55
    6.2.2 Semiotic Signalization 57

7 Automatic Backlight Control 60
  7.1 Hardware Design 60
    7.1.1 Mechanics 61
    7.1.2 Electronics 62
  7.2 Software Design 64
    7.2.1 Application Structure 64
    7.2.2 Brightness Levels 65

8 Conclusions and Future work 69
  8.1 Conclusions 69
  8.2 Future work 70

Bibliography 72

A Appendix 76
  A.1 Serial IC pin mapping 76
  A.2 Communication Structure 76
  A.3 Device board pseudo-example 77


  A.4 Calibration pseudo-example 82
  A.5 Linux Kernel requirement 84


LIST OF FIGURES

2.1  Touch device mechanics assembly. 6
2.2  Touch device electronics assembly. 6
2.3  Touch device application structure. 7
2.4  Touch device kernel structure [1]. 8
2.5  Touch device software structure. 10
2.6  Touch device serial communication channel [2]. 11
2.7  Touch device UART definition. 12
2.8  Touch device transmit/receive select pins mapping. 12
2.9  Touch device serial IC transmit/receive select GPIO declaration. 13
2.10 Capturing the touch event. 17
2.11 Serial communication application structure. 17
2.12 RS232 Serial communication. 18

3.1  Touch device and Device board interface. 20
3.2  Device board mechanics. 21
3.3  Device board circuitry. 22
3.4  Device board Prototype 1. 23
3.5  Device board Prototype 2. 23
3.6  Device board Printed Circuit Board Diagram. 24
3.7  Device board Printed Circuit Board. 25
3.8  Device board application structure. 26
3.9  Communication sequence of backlight and volume control data to the main board. 28
3.10 Communication sequence of back light and volume control data from the device board. 28

4.1  Microphone assembly. 31
4.2  Automatic volume control circuitry. 32
4.3  Linux kernel configuration. 34
4.4  Automatic volume control application structure. 37

5.1  Haptic motor (9.5G) bottom left placement. 42
5.2  Haptic motor (14.3G) bottom left placement. 42
5.3  Haptic motor (13G) bottom left placement. 43


5.4  Haptic motor assembly. 44
5.5  Vibration isolation layer. 45
5.6  Haptic feedback system circuitry. 46
5.7  Haptic feedback system application structure. 48
5.8  Haptic feedback system application structure for slider UI. 49

6.1  Light assembly with acrylic. 52
6.2  Light assembly on back panel. 53
6.3  Diffused LED light. 53
6.4  Semiotic lighting system circuitry. 54
6.5  Semiotic lighting system application structure. 56

7.1  Positions of photosensitive component (LDR). 61
7.2  Automatic backlight control electronics circuitry. 63
7.3  Automatic backlight control application structure. 65

A.1  Serial communication structure [3]. 77
A.2  Terminal communication structure [4]. 77
A.3  Kernel requirement for USB audio. 84
A.4  USB 1.0 device support for USB 2.0 port. 85


GLOSSARY

ADC Analog to Digital Converter

DCS Destination Control System

DOP Destination Operating Panel

ERM Eccentric Rotating Mass

GPIO General Purpose Input/Output

GUI Graphical User Interface

IC Integrated Circuit

I2C Inter-Integrated Circuit

LCS Landing Call Station

LDR Light Dependent Resistor

LED Light Emitting Diode

LRA Linear Resonant Actuator

PCBA Printed Circuit Board Assembly

RS232 Recommended Standard 232

RS485 Recommended Standard 485

SoC System on Chip

UART Universal Asynchronous Receiver/Transmitter

USB Universal Serial Bus


1 INTRODUCTION

1.1 Overview

The elevator world has grown rapidly over the years, and in the age of information and silicon technology the rate of growth has accelerated further. Elevators are now smarter and more intuitive. Above all, the aim of an efficient elevator system is to ensure effortless people flow from current location to destination while providing a safe and comfortable journey.

Among the new technologies introduced to advance elevator systems is the Destination Control System (DCS). A DCS modifies the traditional way of using the elevator, which involves making a landing call at the Landing Call Station (LCS) and/or selecting a floor inside the elevator with the Car Operation Panel (COP). With the growing per-building population, the increasing height of buildings and the increasing number of landing stations in a building, a single elevator ride can take twice or thrice the needed time if there are many landing station stops along the journey, and that has become a point of concern. The idea of a DCS is to group together people heading to the same, or at least nearby, floors, thereby reducing the number of stops made before arriving at the last stop.
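The stop reduction idea can be illustrated with a toy example (hypothetical, not any real DCS dispatch algorithm): when passengers sharing a destination are grouped into the same car, the number of stops collapses from one per passenger to one per distinct destination floor.

```c
#include <stdlib.h>

/* Toy illustration only, not a real DCS dispatch algorithm:
 * when passengers sharing a destination ride together, the car
 * stops once per distinct destination floor. */
static int cmp_int(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

/* Returns the number of stops needed when requests are grouped,
 * i.e. the number of distinct destination floors. */
int grouped_stops(int *floors, int n) {
    qsort(floors, n, sizeof(int), cmp_int);
    int stops = 0;
    for (int i = 0; i < n; i++)
        if (i == 0 || floors[i] != floors[i - 1])
            stops++;  /* first passenger heading to this floor */
    return stops;
}
```

For six passengers requesting floors {12, 7, 12, 7, 3, 12}, grouping yields three stops instead of six individual rides.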

The DCS has a chain of operations which starts from the Destination Operating Panel (DOP). The DOP is implemented as a touch device where user inputs are collected. Other operations include a series of communications between the DOP and the elevator controllers, ending back at the DOP where the DCS outputs are presented to the user. An efficient DCS is capable of: a) handling people traffic at any rate, which means handling several hundreds of user inputs within a short period, b) making clever people flow pattern decisions and c) communicating such decisions to the users. Its efficiency during a busy period should be no different than during a low traffic period. The ongoing development of a more efficient DCS will ensure that the ever increasing building population and building size will not bring any new or unforeseen challenges.

From the user point of view, the most important part of the DCS is the DOP. The DOP takes the user requests to the DCS and gives back the corresponding DCS decisions. This communication, whether input or output, is made possible with sensors and actuators. For a satisfied user, the sensors and actuators must operate at their highest efficiency when communicating input to the DCS and output to users respectively. Different types of sensors and actuators perform similar operations, each with its own advantage in some specific feature. The features of a DOP influence the kind and type of sensors and actuators that will be incorporated into it.

Another perspective is user satisfaction and expectation from a DOP. This borders on the biological responses to various types of stimulation and the psychological reactions to communicating with an inanimate object, and therefore needs a generic investigation. The term cyber physical system suits these user needs, as it bridges the boundary between the human world and the cyber world. The need for users to interact with a more humane DOP is met by implementing the DOP as a cyber physical DOP. There are a limited number of sense organs that can be stimulated by actuators, but an endless number of ways they can be stimulated. There are numerous physical conditions that can be sensed with a sensor, and the possible user inputs through a sensing device are limited only by the inputs the device can accept. Adding sensors and actuators to a DOP will result in: a) a cyber physical DOP capable of humane interaction with users, b) an independently functioning DOP, and c) a DOP with high operational efficiency and reliability.

From a large pool of sensors and actuators and the possible usability scenarios of the DOP, this work researches how best the DCS can communicate with the user through a sensor- and actuator-enhanced DOP. Leading the sensor research is the Light Dependent Resistor (LDR), which varies its resistance according to the intensity of light falling on its photosensitive surface. This ability can be harnessed for automatic control of the DOP back lighting with reference to the environmental lighting condition in the vicinity of the DOP. Another sensor is the microphone, chosen for its sensitivity to sound. A microphone can serve as a means to estimate the sound level in the vicinity of the DOP. With such information, the volume level of the DOP can be automatically adjusted to an acceptable level that does not create any form of discomfort for the user. Among the actuators, leading the research is the Eccentric Rotating Mass (ERM) motor for haptic feedback. The ERM motor is a vibrator that responds to input from the user as a means of acknowledging a valid input; it is a tactile way for the DOP to acknowledge being touched. Also serving the haptic feedback functionality is the Linear Resonant Actuator (LRA) motor. It differs from the ERM motor in that its vibration is generated not by a rotating mass but by a coil driving a mass. A further output actuator is the Light Emitting Diode (LED), implemented as a semiotic device. The LED can stimulate the user's sense of sight in different ways, such as alerting to danger, warning, giving direction or confirming input.

A cyber physical DOP as part of the DCS can be enhanced and optimized as much as technology permits. Serving as a human interface device, the cyber physical DOP relies on many sensors and actuators to communicate with its users. These sensors and actuators have to work in line with the cyber physical DOP's core functionality in order to provide user satisfaction. As part of this thesis, the incorporation of sensors and actuators into the DOP is pushed further with functionalities such as automatically controlling the back lighting system of the DOP with an LDR, automatically adjusting the DOP volume with reference to the vicinity sound level detected with a microphone, giving users haptic feedback with an ERM motor and an LRA motor, and semiotic lighting with LEDs.

1.2 Goal of Thesis

This thesis aims at exploring the possibilities that sensors and actuators provide in the field of human interface design. Communication with inanimate objects is generally fostered by sensors and actuators of various types. These sensors and actuators are engineered for specific purposes, and this thesis researches: a) how well various sensors can receive signals and instructions from their users and environment and b) how well actuators can be used to communicate the device actions to its users. The thesis embraces the psycho-physical, physiological and neurological foundations of human interface design and also provides a means of first hand observation by creating prototypes that feature other enabling components such as software, electronics design and mechanics design, toward the goal of developing a cyber physical DOP.

This thesis will be completed with a prototype device board supporting the listed cyber physical features. It will be created with the accompanying software architecture and code and will be used to demonstrate the functionalities of the connected sensors and actuators.

1.3 Thesis structure

The thesis is structured to present the practical design and implementation of the new functionalities provided by each sensor or actuator, followed by testing, demonstration and analysis of the implemented functionalities. The chapters are structured as follows:

Chapter 2 gives a brief description of the touch device DOP which will be modified into a cyber physical touch device DOP, explaining its hardware and software structure.

Chapter 3 introduces the device board, which handles the communications between the touch device and the connected sensors and actuators. It is the device used for the modification of a DOP into a cyber physical DOP.

Chapter 4 explains how the automatic volume control functionality is implemented from the hardware and software perspective.

Chapter 5 explains how the haptic feedback system is implemented. Hardware and software descriptions are explained in detail.

Chapter 6 explores the semiotic capabilities of the LED, starting with the hardware structure and moving into the software implementation.

Chapter 7 gives an insight into the operation of the automatic back lighting control through ambient light detection.

Chapter 8 summarizes the work and sheds some light on the future of the cyber physical DOP.


2 TOUCH DEVICE

A touch device is a DOP using a touch responsive screen as its input device. It is the input terminal of a DCS, used to request an elevator ride, and also serves as an output point for the DCS to respond back to the user. The cyber physical DOP functionalities demonstrated in this work are built into the touch device.

The touch device used for this work is the touch screen destination operating panel built as the human interface device to enable an easier, user friendly interaction with a DCS. Incorporated with the DCS, the device enables easy use of the elevators to which it is connected, a better organized lobby in the building where it is installed, orderly elevator boarding and uncrowded elevators, fewer unnecessary elevator stops and a shorter time to destination for elevator users.

Built as a cyber physical system in this work, the touch device can interact with the user and the environment to serve its functionalities. This work targets the device-environment interaction and aims at expanding the cyber physical functionalities with existing technologies.

2.1 Hardware Design

The touch device hardware is a single unit consisting of the mechanics and electronics architecture. Figure 2.1 shows the basic layout of the touch device assembly.

2.1.1 Mechanics

A touch device has two basic mechanics assemblies: the pedestal and the wall mounted assembly. Both are similar in design except for the mounting point fittings. The wall mounted assembly attaches to the wall, while the pedestal assembly includes an extended stand at the point of fitting to enable it to stand alone.


[Figure: front and side views of the assembly, labeling the mounting, touch screen and electronics housing.]

Figure 2.1: Touch device mechanics assembly.

2.1.2 Electronics

The electronics housing of the mechanics assembly contains the electronic circuitry of the touch device. This includes a Printed Circuit Board Assembly (PCBA), wiring and electronic components such as a speaker.

[Figure: block diagram of the main board (SoC with RS485, USB, I2C and GPIO interfaces, power controller, touch controller, audio controller) and the attached components (proximity sensor, speaker, touch display, keyboard).]

Figure 2.2: Touch device electronics assembly.

The PCBA in the touch device has different functionalities such as the touch controller, power controller and the process controller (main controller) with a System on Chip (SoC) architecture. The main board has input ports, processing elements and output ports, which defines it as an embedded system. The input/output ports for communication include the serial ports Inter-Integrated Circuit (I2C), RS232, half duplex RS485 and a Universal Serial Bus (USB); an audio port; general purpose digital input/output ports; analog input/output ports; and a keyboard port. Other electronic components attached to the main board are the touch-screen display, power port, speaker, keyboard and proximity sensors. Figure 2.2 shows a block diagram of the touch device PCBA and components assembly.

2.2 Software Design

The touch device is controlled with embedded software: a customized embedded Linux operating system based on the Linux kernel 2.6.35 [5]. On top of this operating system, the touch device application software is implemented.

2.2.1 Application Structure

The Linux kernel resides as the low level structure handling communication and interfacing with the hardware devices [6][7]. Next to the kernel is the embedded Linux operating system, serving as a connection layer between the kernel and the touch device application [8]. Figure 2.3 shows the touch device application layers and their placement relative to one another.

[Figure: layer diagram showing, from bottom to top, the hardware (input devices, graphics devices, device board, sensors, actuators), the kernel space (ARM architecture support, drivers, file system, memory and power management), the operating system (libraries, shell, service management, touch device communication system) and the application software (touch device controller, communication plugin, audio player, GUI).]

Figure 2.3: Touch device application structure.

2.2.2 Device Operating System

The touch device operation and communication with the hardware layer, implemented through the Linux kernel 2.6.35, uses a monolithic kernel. This implies that the entire operating system works in the kernel space: a memory area reserved for the execution of kernel processes such as kernel extensions and device drivers. Figure 2.4 shows the inner structure of a monolithic kernel. Functional components such as device drivers and middleware are integrated into the operating system along with the system kernel. The monolithic kernel nature of the touch device means that the device operating system is an integration of the kernel and system files [1]. The device operating system brings together the kernel and application software and ensures their proper operation.

[Figure: monolithic kernel structure with higher-level software (applications) on top; the kernel containing file I/O, memory management and process management together with I/O, memory and interrupt drivers; and the hardware below.]

Figure 2.4: Touch device kernel structure [1].

The touch device is based on an ARM architecture. Cross compiling Linux for the ARM architecture can be done with the Sourcery CodeBench Lite arm-none-eabi toolchain [9]. For this work, the application used is the Sourcery CodeBench Lite 2013.05-23 IA32 GNU/Linux installer containing the binary package arm-2013.05-23-arm-none-eabi.bin. Table 2.1 gives the basic steps for cross compiling the Linux kernel for this ARM architecture.


Command                                                 Operation performed

chmod +x arm-2013.05-23-arm-none-eabi.bin               Changes the file mode bits to make the file executable.
./arm-2013.05-23-arm-none-eabi.bin                      Executes the file.
sudo apt-get install ia32-libs ia32-libs-gtk            Installs the required 32 bit binaries on a 64 bit system.
sudo dpkg-reconfigure -plow dash                        Reconfigures dash (answer NO to the question asked, so that bash is used as /bin/sh).

Run to create the ARM configuration:
make clean                                              Deletes any previous configuration.
export ARCH=arm                                         Sets ARM as the target architecture.
export CROSS_COMPILE=/home/....../bin/arm-none-eabi-    Sets the arm-none-eabi binaries as the cross compiler.
make gconfig                                            Opens the kernel configuration to make any needed changes.
make DEVICE_defconfig                                   Makes the kernel configuration file for the ARM architecture (executed only when changes are made to the kernel).

Making a custom kernel image:
sudo apt-get install uboot-mkimage                      Installs the U-Boot image maker.
make uImage                                             Makes a kernel U-Boot image.

Table 2.1: Cross compiling Linux for the ARM architecture [10][11].

2.2.3 Device Application Software

The touch device application software resides at a layer above the system software and interacts with the rest of the software architecture through the operating system. The application software structure is illustrated in detail in Figure 2.5, showing its inner layers and communication paths.

The functionalities of the touch device are implemented through the application software. These functionalities are needed for the everyday usage of the touch device.


There are different layers of the application software, starting with the Graphical User Interface (GUI) layer, which serves as the user interaction layer for inputs and outputs. Next to the GUI is the touch device controller layer, which handles communications and regulates system operation.

[Figure: application software layers; the Graphical User Interface on top of the Touch Device Controller, which sits on the Touch Device Utility layer (communication plugin, audio player, system manager).]

Figure 2.5: Touch device software structure.

At a lower layer of operation is the touch device utility layer, responsible for operations such as communication with the elevator controller, audio playback, input/output interactions with hardware, main board resource management, communication with the device board and other system level management operations. The touch device controller also serves as a communication path between the GUI and the system utilities in the touch device utility layer, for example synchronizing the GUI activities with the audio player, transferring data from the communication plugin to the GUI and relating other system management activities to the GUI. The application software architecture and structure are optimized towards fulfilling the expected functional and non-functional requirements of the touch device.


2.2.4 Serial Communication with RS485 Protocol

Among the serial communication interfaces present on the touch device main board is the Recommended Standard 485 (RS485), implemented with a MAX3430 Integrated Circuit (IC). This serial interface is used in this work for the communication between the main board and the device board (see Chapter 3).

TIA-485-A, also known as ANSI/TIA/EIA-485, TIA/EIA-485, EIA-485 or RS485, is a standard defining the electrical characteristics and specification of drivers and receivers used for balanced digital multidrop communication. It is usually implemented using an RS485 driver IC connected to a Universal Asynchronous Receiver/Transmitter (UART) [12]. Implementing it involves understanding the protocol used for the device to communicate over the RS485 bus, implementing a version of the protocol in software and connecting an RS485 bus interface IC to a UART in both communicating devices [13].
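On the software side, a UART behind an RS485 transceiver is typically opened and configured from user space with the POSIX termios API. The sketch below is illustrative only: the device node name and baud rate are assumptions (on i.MX53 boards UART2 commonly appears as /dev/ttymxc1), not values stated in this thesis.

```c
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

/* Open a serial device and configure it for raw 8-bit communication.
 * Returns a file descriptor, or -1 on failure. Device node and baud
 * rate are illustrative assumptions. */
int open_serial(const char *dev) {
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;
    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) { close(fd); return -1; }
    cfmakeraw(&tio);                 /* raw bytes, no echo or line editing */
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tio.c_cflag |= CLOCAL | CREAD;   /* ignore modem lines, enable receiver */
    if (tcsetattr(fd, TCSANOW, &tio) != 0) { close(fd); return -1; }
    return fd;
}
```

A protocol implementation would then read and write its frames on the returned descriptor, switching the transceiver direction between operations.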

The RS485 operation requires the transmission of an electrical signal between two RS485 ICs in a circuit architecture that can contain a varying number of ICs, as illustrated in Figure 2.6. The communication can be full duplex or half duplex. In full duplex communication, two serial communication channels are available and can be used for simultaneous receiving and transmitting, while in half duplex only one communication channel is available, so the link can either receive or transmit but not both simultaneously [13][14][15]. Two way communication between the touch device and the device board is not possible in this system because a half duplex communication is implemented in the serial communication architecture; simultaneous reading and writing between both circuits, which would increase system efficiency, is therefore not obtainable.

Figure 2.6: Touch device serial communication channel [2].


The required connections of the RS485 integrated circuit are the 3.3 volt input voltage, ground, the transmit and receive select pins, the transmit signal point, the receive signal point and two cable connection points to other ICs. The pins, labels and functions are grouped in Appendix A, Table A.1. Transmitting or receiving through the RS485 requires that the transmit and receive select pins are set high for transmit and low for receive respectively.
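The select pin discipline above can be sketched from user space. This is a minimal illustration assuming the pins have already been exported through the Linux sysfs GPIO interface; the GPIO numbers 205 (transmit select) and 204 (receive select) are the ones computed for this board later in the section, and the helper function is illustrative, not part of the thesis software.

```c
#include <stdio.h>

/* Illustrative sketch: drive the RS485 transmit/receive select pins
 * through the Linux sysfs GPIO interface. Assumes the pins have been
 * exported as gpio205 (transmit select) and gpio204 (receive select).
 * Returns 0 on success, -1 if the value file cannot be written. */
static int gpio_write(int gpio, int value) {
    char path[64];
    snprintf(path, sizeof(path), "/sys/class/gpio/gpio%d/value", gpio);
    FILE *f = fopen(path, "w");
    if (f == NULL)
        return -1;
    fprintf(f, "%d", value ? 1 : 0);
    fclose(f);
    return 0;
}

/* Both select pins high to transmit, both low to receive. */
int rs485_select_transmit(void) {
    return (gpio_write(205, 1) == 0 && gpio_write(204, 1) == 0) ? 0 : -1;
}

int rs485_select_receive(void) {
    return (gpio_write(205, 0) == 0 && gpio_write(204, 0) == 0) ? 0 : -1;
}
```

A transmit operation would call rs485_select_transmit() before writing to the UART and rs485_select_receive() afterwards to hand the bus back.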

The main board runs an embedded Linux which implements RS485 as described in the serial protocol library. To use serial communication on the embedded Linux, a serial protocol is defined according to the serial.h file provided by the Linux distribution. The RS485 IC connects to a UART which is part of the microcontroller, and through the UART serial communication is interfaced with the other components of the microcontroller. On the touch device, the MAX3430 is connected to UART2. UART2 is located in the microcontroller EIM port, and the RS485 transmit and receive pins are connected to the EIM_D26 and EIM_D27 pins of UART2 respectively [16], as indicated in Figure 2.7. These connections are defined in the microcontroller_loco.c file.

/* UART2 ACS */
MX53_PAD_EIM_D26__UART2_TXD_MUX,
MX53_PAD_EIM_D27__UART2_RXD_MUX,

Figure 2.7: Touch device UART definition.

To configure the main board to transmit or receive information through the serial port, the MAX3430 transmit and receive select pins are connected to microcontroller control pins. The transmit pin is connected to GPIO_18 and the receive pin to GPIO_17, as indicated in Figure 2.8. At the low level, the pin mappings are defined in the iomux-53.h file [17].

#define MX53_PAD_GPIO_17__GPIO_7_12 IOMUX_PAD(0x6D0, 0x340, 1, 0x0, 0, NO_PAD_CTRL)
#define MX53_PAD_GPIO_18__GPIO_7_13 IOMUX_PAD(0x6D4, 0x344, 1, 0x0, 0, NO_PAD_CTRL)

Figure 2.8: Touch device transmit/receive select pins mapping.

For the pins to be accessible by the touch application program in user space, they are identified by a numeric pin value computed as follows:


GPIO pin number = ((x − 1) × 32) + y

where x is the number of the port on which the GPIO pin is located on the IC and y is the pin number of the GPIO pin within that port.

Therefore: transmit select GPIO pin = ((7 − 1) × 32) + 13 = 205

receive select GPIO pin = ((7 − 1) × 32) + 12 = 204

With the obtained values, the touch application program references and uses the GPIO pins for controlling the touch device serial communication [17][16][18]. Figure 2.9 shows the shell commands for exporting the GPIO pins needed for serial communication.

# RS485 Receive enable pin
echo 204 > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio204/direction
echo 0 > /sys/class/gpio/gpio204/value
# RS485 Transmit enable pin
echo 205 > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio205/direction
echo 1 > /sys/class/gpio/gpio205/value

Figure 2.9: Touch device serial IC transmit/receive select GPIO declaration.

The serial port has different modes of operation in the embedded Linux environment: transmit only mode, receive only mode and transmit/receive mode. Transmit only mode requires that both the transmit and receive select pins are set to 1, while for receive only mode both are set to 0. For transmit/receive mode, the transmit select pin is set to 1 and the receive select pin to 0. Unlike the transmit select pin, the receive select pin is connected to an inverting gate within the RS485 IC. Because of this, setting both select pins to 1 means the receiver effectively sees a 0 and is therefore disabled, while setting both select pins to 0 means the receiver sees a 1 and is enabled. When the transmit select pin is 1 and the receive select pin is 0, both the driver and the receiver are enabled, giving transmit/receive mode.

Irrespective of the mode to which the serial port is set, a quick test of the serial communication is done by sending serial data through the command line in the sequence described in Table 2.2. A successful transmission is obtained in both transmit only mode and transmit/receive mode, while a failed transmission is obtained in receive only mode.

root@DEVICE:/# stty -F /dev/ttymxc1 speed 9600 cs8 -cstopb -parenb -crtscts
root@DEVICE:/# echo 205 > /sys/class/gpio/export
root@DEVICE:/# echo out > /sys/class/gpio/gpio205/direction
root@DEVICE:/# echo 1 > /sys/class/gpio/gpio205/value
root@DEVICE:/# echo -en '\x12 \x02' > /dev/ttymxc1
root@DEVICE:/# echo -n ^A^B > /dev/ttymxc1
root@DEVICE:/# echo 'A' > /dev/ttymxc1
root@DEVICE:/# echo "test" > /dev/ttymxc1
root@DEVICE:/# cat /proc/tty/driver/ttymxc

Table 2.2: Serial channel test.

The use of serial communication in the application software abides by specific rules, which start with configuring the serial port as specified by the serial protocol for the embedded Linux. The port is configured with the flags specified by the serial.h file for serial communication through RS485 and by termios.h for terminal input/output communication [19]. Appendix A Figure A.1 shows the serial protocol and Appendix A Figure A.2 shows the terminal configuration protocol required for the serial communication configuration.

Using the flags and configuring the variables correctly, serial communication is achieved. Setting up the configuration with the serial.h file involves enabling the settings that are needed, such as SER_RS485_ENABLED, which allows the use of the RS485 port. The termios configuration, on the other hand, is more involved and requires thorough reading of the flag documentation in order to enable the needed flags. The flag settings include:

Flag and function:

i.flag (input flags) disabled:
BRKINT: do not let a BREAK flush the input and output queues.
ICRNL: do not translate carriage return to newline on input.

Table 2.4: Terminal flags [20]


INLCR: do not translate newline to carriage return on input.
PARMRK: read a character with a parity error or framing error as \0.
INPCK: disable input parity checking.
ISTRIP: do not strip the eighth bit.
IXON: disable XON/XOFF flow control on output.
IXOFF: disable XON/XOFF flow control on input.
IXANY: allow only the START character to restart output.

i.flag enabled:
IGNPAR: ignore framing errors and parity errors.
IGNBRK: ignore BREAK condition on input.

o.flag (output flags) disabled:
OCRNL: do not map carriage return to newline on output.
ONLCR: do not map newline to carriage return on output.
ONLRET: output carriage return.
ONOCR: output carriage return at column 0.
OFILL: do not send fill characters for a delay; use a timed delay instead.
OLCUC: do not map lowercase characters to uppercase on output.
OPOST: disable implementation-defined output processing.

l.flag (local flags) disabled:
ECHO: do not echo input characters.
ECHONL: do not echo newline.
ICANON: disable canonical mode.
IEXTEN: disable implementation-defined input processing.
ISIG: do not generate signals for the INTR, QUIT, SUSP or DSUSP characters.
ECHOK: prevent the KILL character from erasing the current line.
ECHOCTL: prevent terminal special characters other than TAB, NL, START and STOP from being echoed as ^X.
ECHOPRT: prevent characters from being printed as they are erased.
ECHOE: prevent the ERASE character from erasing the preceding input character, and WERASE from erasing the preceding word.

Table 2.5: Terminal flags, continued [20]


ECHOKE: KILL is echoed by erasing each character on the line, as specified by ECHOE and ECHOPRT.

c.flag (control flags) disabled:
CSIZE: clear the character size mask.
PARENB: disable parity generation on output and parity checking on input.
CRTSCTS: disable RTS/CTS (hardware) flow control.
CBAUD: disable the baud speed mask (4+1 bits).
CSTOPB: set one stop bit.

c.flag enabled:
CLOCAL: ignore modem control lines.
CREAD: enable the receiver.
CS8: set the character size mask to 8 bits.
B9600: baud rate 9600.

cc.flag (control characters) settings:
VMIN: minimum number of characters for a noncanonical read (MIN).
VTIME: timeout in deciseconds for a noncanonical read (TIME).

Table 2.6: Terminal flags, continued [20]
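The flag settings in Tables 2.4 to 2.6 amount to a raw 9600-8N1 terminal configuration. A minimal sketch of how they might be applied with termios follows; it builds the configuration in memory without touching a real device, and the function name is an assumption for illustration:

```c
#include <termios.h>
#include <string.h>

/* Build the raw 9600-8N1 termios configuration described by the
   flag tables: only IGNPAR/IGNBRK enabled on input, all output
   processing and local (echo/canonical) processing off. */
static struct termios make_rs485_termios(void)
{
    struct termios tio;
    memset(&tio, 0, sizeof(tio));

    tio.c_iflag = IGNPAR | IGNBRK;        /* the only enabled input flags   */
    tio.c_oflag = 0;                      /* all output processing disabled */
    tio.c_lflag = 0;                      /* raw mode: no echo, no signals  */
    tio.c_cflag = CLOCAL | CREAD | CS8;   /* local line, receiver on, 8 bits */
    cfsetispeed(&tio, B9600);             /* 9600 baud in both directions   */
    cfsetospeed(&tio, B9600);

    tio.c_cc[VMIN]  = 1;                  /* block until one byte arrives   */
    tio.c_cc[VTIME] = 0;                  /* no inter-byte timeout          */
    return tio;
}
```

In real code the result would be applied to an opened port with tcsetattr(fd, TCSANOW, &tio), typically after a tcflush(fd, TCIOFLUSH).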

Serial communication is obtained after thorough configuration, testing and implementation in the existing touch device application software. Each input to the touch device requires a complete process cycle, from receiving the data from the input device to processing it. Each output data item is then sent through the serial port to the device board. Due to resource sharing and the resource requirements of the serial communication process, when multiple input data items are received or sent through the serial port they are queued while other touch device processes are executed. This halts the touch device processes until the serial data transfer is completed, resulting in touch device behaviour that is not synchronized with the GUI. Figure 2.10 illustrates the transfer of touch input to the device board.


Figure 2.10: Capturing the touch event (touch inputs, then valid touch inputs, then the signal to the device board).

An alternative way of running the serial communication process alongside the rest of the touch device processes is to run the RS485 serial communication on a separate thread. Running a serial communication thread within a class of the existing touch device application software delivered an improvement in the overall performance of the touch device and better execution of the serial communication process together with the other application processes. System halting and loss of synchronization with the GUI were eliminated. This was not a perfect solution, as running the thread within the touch device application class occasionally resulted in bottlenecked access to the GUI communication, since both share the same touch device controller. An improved solution is a serial communication thread running in its own class with a dedicated device controller.

Figure 2.11: Serial communication application structure (application software: touch device utility, graphical user interface and touch device controller with comm. plugin, audio player and system manager, plus a dedicated device board controller).


The application software architecture of Figure 2.11 ensures better communication and data transfer between the touch device processes and prevents resource jamming. By running a dedicated thread and using thread and serial terminal functions such as mutex locking and unlocking and terminal buffer flushing, the serial communication process achieves increased performance. The serial communication process is not executed alone in the touch device; it also transfers data to and receives data from other touch device processes. The inter-process connections and communications are implemented with signal and slot connections for coordinated and synchronized touch device processes.

2.2.5 Serial Communication with RS232 Protocol

RS232 is a standard that defines the electrical characteristics and timing of signals. The RS232 port present on the main board is used mainly for monitoring the processes running on the main board with a computer. It can interface with a computer RS232 serial port or with a USB port through an RS232-to-USB converter. To monitor the processes and remotely control the main board, a serial communication tool such as minicom is used. The tool is installed and configured with the correct access port, and if required a password is used to access the destination main board. On successful interfacing, as shown in Figure 2.12, the main board can be issued shell commands from the terminal either to perform a specific function or to monitor an ongoing process.

Figure 2.12: RS232 serial communication (main board RS232 port connected through a UC-232A converter to a PC USB port).

A quick setup of minicom on a Linux machine is performed by first installing minicom with sudo apt-get install minicom and then launching the configuration with sudo minicom. The serial port setting is configured by changing the serial port value to the serial (USB) port to which the cable is connected, such as /dev/ttyUSB0. The configuration is saved as dfc before exiting.

Before initializing communication, the user is added to the dialout group with the command sudo adduser USERNAME dialout. Executing sudo chmod a+rw /dev/ttyUSB0 enables read/write data transfer on the port. Afterward, the serial port access to the main board is displayed with a login prompt such as DEVICE login:. Useful diagnostic commands for monitoring the progress of the main board processes are cat /var/log/Xsession.log, dmesg and less /var/log/Xsession.log.


3 DEVICE BOARD

The device board is an embedded system [21] designed as a general purpose microcontroller board to interface and handle the communication between the touch device main board and a varying number of external devices, such as the sensors and actuators used for the enhancement of the touch device DOP. As illustrated in Figure 3.1, it is the backbone of the transformation of the normal DOP into a cyber physical DOP.

Figure 3.1: Touch device and device board interface (main board: SoC with RS485 and USB; device board: microcontroller with RS485 and GPIO; connected devices: LDR, LED, ERM and MIC).

At the core of the board is a microcontroller capable of accepting both analog and digital signals and of outputting analog (PWM) and digital signals [22]. Connected to other electronic components, the device board acts as a base station for different types of sensors, such as a photo sensor (LDR), and actuators, such as an ERM motor. It is in charge of their functionality, either giving operation commands to them or accepting data input from them. Its core functions include accepting signals from all connected sensors, processing those signals and communicating the processed signals to the touch device main board. On the reverse path, it listens to the touch device main board for instructions, reads those instructions and performs the specified actions, such as the operation pattern of an actuator connected to its output terminal.

The device board is a combination of several electronic circuits that are controlled by a microcontroller. Each sub-circuit is in charge of the functionality of a specific sensor or actuator. A detailed explanation of each individual sub-circuit and the sensor or actuator it controls is presented in chapter four for automatic volume control, chapter five for the haptic feedback system, chapter six for the semiotic lighting system and chapter seven for automatic backlight control.

3.1 Hardware Design

The hardware design is made to fit into the existing design of the touch device with little or no modification. It is the integration of new components into the normal DOP to provide cyber physical system functionality, and it is therefore designed in harmony with the normal DOP (the touch device DOP).

3.1.1 Mechanics

The mechanics deal with how the device board fits into the current structure of the touch device used for prototyping. The device board is a printed circuit board with dimensions 35.0 mm x 59.5 mm, which fits into the 50 mm x 70 mm board compartment of the touch device.

Figure 3.2: Device board mechanics (device board mounted behind the front panel).

The device board fits into the space behind the front panel (as shown in Figure 3.2), through which the communication and power supply cables are connected.


3.1.2 Electronics

The electronic board is a microcontroller circuit consisting of various electronic components such as resistors, transistors and capacitors. It is made up of sub-circuits dedicated to specific operations, such as controlling the semiotic lighting system. At the center of its operation is the ATmega328 microcontroller [23][24]. The microcontroller works with a 16 MHz clock oscillator grounded with 22 pF capacitors at both terminals. A voltage supply of 5 V is used for normal operation as well as for resetting it, through input pins 7 and 1 respectively. A 10 kΩ resistor is connected between the reset pin and the 5 V supply to prevent unscheduled resets during normal operation. For proper analog data analysis, the microcontroller analog voltage input and analog reference voltage (pins 20 and 21) are both connected to the 5 V operating voltage and to a low pass filter circuit. The serial communication hardware on the device board is provided by a MAX3430 IC connected to pins 2, 3 and 4 for receiving data, transmitting data and controlling the IC chip respectively. The other pins of the microcontroller are used for connecting its sub-circuitry.

Figure 3.3: Device board circuitry.

The device board was developed in several stages and first had the Arduino Nano [25] as its core microcontroller. Starting with several breadboard prototypes of each sub-circuit, the first device board prototype was developed with all four functionalities (automatic volume control, haptic feedback system, semiotic lighting system and automatic backlight control), with the circuit diagram shown in Figure 3.3.

Figure 3.4: Device board Prototype 1.

On the first prototype (Figure 3.4), integration testing with the touch device, performance testing, design modification from a mechanics and usability perspective, performance improvement modification and stability testing were performed for each sub-circuit. The results were used to enhance its design, resulting in the development of prototype 2 (Figure 3.5). Among the modifications are the size ratio, power surge protection, replacement of the Arduino Nano with the ATmega328 microcontroller, an increased number of connectable devices and its design pattern.

Figure 3.5: Device board Prototype 2.

On the successful testing and approval of the second prototype (Figure 3.5), the device board was developed into a printed circuit board. The printed circuit board version enabled the use of surface mounted components and also the possibility of placing components on both sides of the board with less risk of short-circuiting the components due to its size. For the development of the printed circuit board, a separate layout and component placement diagram was used.

Figure 3.6: Device board Printed Circuit Board Diagram.

With the layout diagram (Figure 3.6), the component placement diagrams and the lists of surface mounted and through hole components, the printed circuit board version of the device board was developed. Similar tests were performed on it to verify its performance and conformance to its predecessor's architecture and mode of operation. The microcontroller in the printed circuit board device board (Figure 3.7) connects to the board through a socket to allow easy removal and reconnection whenever reprogramming is needed.

Figure 3.7: Device board Printed Circuit Board.

3.2 Software Design

The microcontroller software architecture focuses on its communication with the touch device and with the connected sensors and actuators. Input and output operations are determined by the instructions given by the main board, in such a way that the input from the connected sensors is not analyzed until the main board needs it. In the same way, there is no output to the actuators until the main board instructs it.

3.2.1 Application Structure

The application is composed of three major layers needed for the device board to execute the required functions. At the top of its major functions is the communication with the touch device main board. The serial communication layer handles this by making sure that all data transferred from the main board is received and that all output data to the main board is transmitted successfully. Data received or sent ends up at or comes from the data processing layer respectively. This layer makes sure that instructions from the main board are processed and communicated to the destination layer. It also handles the data from other layers that needs to be communicated to the main board.


The sensors and actuators either send or receive instructions as they interact with the environment. To monitor such processes, a dedicated layer known as the input/output layer interacts with the sensors and actuators and with the data processing layer. Instructions from the main board describing the actuator operation sequences are processed by the data processing layer and communicated to the input/output layer, which in turn relays them to the target actuator. Likewise, sensor input that is to be communicated to the main board is first received by the input/output layer. It is transferred to the data processing layer and, after processing, onward through the serial communication layer to the main board. Appendix A sections A.3 and A.4 show an implementation of the application structure and the procedural execution in the microcontroller.

Figure 3.8: Device board application structure (application software layers: serial communication, data processing and input/output control).

As shown in Figure 3.8, the serial communication layer reads and writes serial data between the touch device main board and the data processing layer. The input/output layer controls the sensors and actuators, either reading from or writing to them as instructed by the data processing layer.

3.2.2 Software Layers

At the top of the stack is the serial communication layer. The microcontroller has an inbuilt serial communication port which can be used by connecting the transmission signal coming from the RS485 IC to Tx, the receive signal to Rx, and the transmit and receive select pins to a single GPIO digital pin [26][27], thereby completing the serial communication path from the main board to the device board [28][29].

To determine whether the device will be transmitting or receiving data, the transmit and receive select pins are set high or low through the GPIO digital pin [30][31]. The transmit and receive select pins are connected to the same digital control pin on the microcontroller. Within the RS485 chip the signal to the receive select pin is inverted, so setting the digital control pin HIGH is received as 1 on the transmit select pin and 0 on the receive select pin, putting the microcontroller in transmission mode [32, 33]. Conversely, setting the digital control pin LOW puts the microcontroller in receiving mode. Unlike the main board running embedded Linux, the device board microcontroller cannot be set to transmit/receive mode (see chapter 2).

Another option provided by the software API is the creation of a software serial port [34] on general purpose pins, defining which pins act as the transmit Tx, receive Rx and transmit/receive select pins. This is more flexible, as it is not bound to the predefined pins connected to the serial hardware interface, and multiple serial interfaces can be defined for the same microcontroller. Its capability is limited, though, because the pins are not hardware accelerated for serial communication the way the hardware Rx and Tx terminals are.

The data processing layer handles the processing of sensor data before it is communicated to the main board. The time needed to transmit data through the serial interface from the device board to the main board depends on the size of the data, which is directly determined by the number of bytes to be transmitted. To eliminate delays in transmitting or receiving data due to the frequency at which data is exchanged, the data size is limited to one byte. The encoding of outgoing data into the one byte format, and the decoding of received data from it, is done at the data processing layer. One byte of data is received from the serial communication layer and decoded; the decoded data is executed or transferred to the input/output layer. In the reverse direction, sensor data is received from the input/output layer and encoded into a one byte value that is transferred to the serial communication layer for onward transmission to the main board.

The most important task of the data processing layer is to time [35] the operation of the microcontroller so that it stays in sync with the touch device communication sequence. When data is received, processed and responded to, this must happen in a constant time sequence and quickly. Take for example the communication of the automatic backlight control and automatic volume control data to the main board, as illustrated in Figure 3.9.

Figure 3.9: Communication sequence of backlight and volume control data to the main board (timer 1 runs for 5 s up to the request H; timer 2 spans the 40 ms response window before the serial data is read).

The main board requests the data every 5 s by sending a coded H value to the device board and expects a response within 40 ms, at which point it starts reading the available serial data or reports a failed read. Within the 40 ms interval, the device board reads the LDR (see chapter 7) and electret microphone values (see chapter 4), computes their combined value and encodes it into one byte. The device board performs this task within 30 ms and then transmits for another 20 ms, thereby starting transmission 10 ms before the main board is ready to read and continuing 10 ms after the main board read is assumed complete, in order to allow time for a retry in case of a failed read. This is illustrated in Figure 3.10.

Figure 3.10: Communication sequence of backlight and volume control data from the device board (decode H, microphone read with an error-read retry, LDR read, then compute and encode; the individual steps range from a few microseconds up to about 15 ms).


The final layer, closest to the physical devices, is the input/output layer. It mainly controls and checks the state of the input/output pins to which the physical devices are connected. These pins are mostly analog and digital pins, and the input/output layer uses library functions to read from and write to them. Sensor pins are always read from and are mostly connected to the analog pins, while actuator pins are written to and are mostly connected to digital pins or pulse width modulated digital pins. The actions of the input/output layer are controlled by the instructions from the data processing layer. The data processing layer instructs the input/output layer to read from sensors when the state of the physical environment is needed, and the data read is returned to the data processing layer. Actuators are controlled by the input/output layer as determined by the data processing layer instructions. This includes setting the microcontroller pin to which an actuator is connected either HIGH for active functionality or LOW for no functionality. In the case of a pulse width modulated digital pin, the input/output layer sets it to a value between 0 and 255 as instructed by the data processing layer.

Together with other programming elements and libraries, the collaboration of these layers ensures that data from the environment is successfully communicated to the touch device main board, and that instructions from the main board regarding the state of the sensors and actuators are communicated to the device board in a timely manner.


4 AUTOMATIC VOLUME CONTROL

The volume of the touch device is controlled by the value selected in the volume control interface. This value determines the loudness of the touch device speaker, and the loudness is directly proportional to the energy consumed by the volume control system. Instead of a constant loud state with high energy consumption or a constant low state with inaudible sound, the volume can be set through automatic volume control. With it, the energy consumed and the audibility of the system can be balanced to suit the current environmental situation. In a noisy room, the volume can be increased to enable users to hear the touch device announcements, and in a quiet room the volume is reduced so that a too loud device does not create discomfort for its users.

Regulating the volume of the touch device automatically can be achieved by using the surrounding sound to determine the required volume level. Using a USB microphone and an electret microphone, this work researches and demonstrates the automatic volume control functionality of the touch device and how it helps not only to make the touch device volume system self-regulating but also to reduce the need for human attention in controlling the volume system in various situations.

4.1 Hardware Design

Part of the output system of the touch device is the audio system, which notifies a user of the action taken by the device in response to a given input. For this output to serve its purpose, the sound produced must be loud enough to be heard but not so loud that it irritates the user. A microphone must therefore be present in the system to achieve this functionality. To enhance the detection of the surrounding sound, the microphone must be placed strategically on the touch device assembly, in a location that is in the direct path of the surrounding sound. Proper microphone placement and a well-designed electronic circuit for sound processing are required for the automatic volume control hardware design.

4.1.1 Mechanics

The location of the microphone on the touch device mechanics is very important: as shown in Figure 4.1, it determines whether or not the microphone will detect the actual noise level in the surroundings of the device. The microphone is placed around the sides of the touch device because of the glass covered front surface and the proximity of the back panel to the mounting wall.

Figure 4.1: Microphone assembly (the microphone can sit on the right, left, top or bottom side of the panel).

The microphone can be placed on any of the four sides depending on the installation position of the DOP. Testing conducted with the microphone on different sides did not yield any significant differences, as detection is mainly affected by the direction from which the surrounding noise is generated.

4.1.2 Electronics

The first requirement for automatic volume control is to be able to retrieve the surrounding volume, which is done with a microphone. The main board contains a USB 2.0 port that is used to connect a USB microphone.

Another channel for detecting the surrounding sound is an electret microphone connected through a sound filter circuit, with the output of the circuit fed into the analog pin of a microcontroller [36]. The analog pin is connected to an ADC, which converts the analog signal into a digital format that can be processed by the microcontroller.

Figure 4.2: Automatic volume control circuitry.

From the electret microphone to the microcontroller are electronic components such as resistors, an IC and capacitors. The complete circuitry is shown in Figure 4.2. The IC amplifies the sound received through the microphone before it is transmitted to the microcontroller.


4.2 Software Design

Input from the microphone contains the data needed to determine the required volume level of the touch device. The USB microphone first needs to be detectable by the Linux kernel, and an audio driver is needed to translate the audio input.

For the electret microphone, the audio data from the surroundings is first processed by the microcontroller and then transmitted through the serial port to the main board, which uses the data to set the volume of the system. These complete processes are needed to complement the hardware structure with the automatic volume control functionality.

4.2.1 Application Structure

The automatic volume control can be implemented with either a USB microphone or an electret microphone. With a USB microphone, the application structure is designed around the main board software alone, including its kernel. With an electret microphone, the application structure has two parts: first the input retrieval and processing by the device board, and then the volume setting by the main board.

4.2.2 Linux Kernel Requirements

Connecting a USB microphone to the main board is not enough for it to be accessible and useful [37]. The main board runs embedded Linux, and the Linux kernel should be able to detect and register the presence of the microphone. For the USB microphone to be detectable and usable, its driver should be enabled within the Linux kernel [38][39]. Appendix A, Figure A.3, shows the driver that needs to be enabled.

Enabling the USB driver in the Linux kernel setup makes the USB microphone detectable and usable. However, the basic USB settings of Linux kernel 2.6.35 only enable communication between USB 1.0 devices on a USB 1.0 port or between USB 2.0 devices on a USB 2.0 port. The touch device main board contains a USB 2.0 port, and if the USB microphone is a USB 1.0 device, it will fail with the error "not enough bandwidth". More kernel support can be obtained by enabling the Transaction Translator feature of the Linux kernel (see Appendix A, Figure A.4), which enables communication between a USB 1.0 device and a USB 2.0 port [40].

For the USB port and Linux kernel to support audio functionality, the settings shown in Figure 4.3 should be made in the Linux kernel. These settings enable communication between any type of USB device on any type of USB port.

Figure 4.3: Linux kernel configuration.

Building a custom kernel to meet a specific requirement, in this case the ability to detect and use a USB 1.0 device on a USB 2.0 port, requires complete troubleshooting to verify that the built kernel meets the requirement and that the other supported features of the kernel are still operational. With the new configuration, a new Linux kernel is compiled (see Chapter 2, Table 2.1). A good starting point for troubleshooting, as shown in Table 4.1, is to check that the Linux version in use is the newly compiled version.

root@DEVICE:/# uname -a
root@DEVICE:/# cat /proc/version

Table 4.1: Checking the Linux version [41][42].

With the correct Linux version in use, the next troubleshooting phase is to check the sound feature of the kernel, as shown in Table 4.2. This starts with checking the sound card detected by the kernel (that is, checking whether it detects the USB microphone).

root@DEVICE:/# cat /proc/asound/cards
root@DEVICE:/# cat /proc/asound/pcm

Table 4.2: Check connected sound card [41][42].

The /proc/asound directory will only exist after the first ALSA module (the USB microphone) is inserted. If the ALSA module is connected and there is no /proc/asound, it simply means the 'snd' module (sound driver) was not loaded properly [43][44]. A simple restart of the kernel fixes such a problem. To view the device nodes available for playback (speaker) and recording (USB microphone) when the play and record commands are invoked, the commands in Table 4.3 are used.

root@DEVICE:/# ls /dev/snd
root@DEVICE:/# lsusb --verbose | less
root@DEVICE:/# arecord --list-devices
root@DEVICE:/# aplay -l

Table 4.3: Check playing and recording devices [41][42].

The output of the commands shows the list of connected devices that can be used for sound recording and playback. The inclusion of the USB device in the list confirms the proper functioning of the custom Linux kernel created for sound support. A quick review of all the devices connected to the main board and detected by the new kernel can be performed with the command in Table 4.4. This is important in case the new kernel can no longer detect a previously detected device.

root@DEVICE:/# cat /proc/bus/input/devices

Table 4.4: Check detected device [41][42].

The devices listed in the command output can be compared to the device list configured for the kernel in order to ascertain the presence of all devices needed for the proper operation of the touch device. Another means of checking whether the sound configuration is functional is through the ALSA system with the command in Table 4.5, which confirms that the ALSA system has registered the sound device.

root@DEVICE:/# alsamixer

Table 4.5: ALSA sound device [41][42].

If all the expected devices are active in the new kernel, the next troubleshooting step is to perform a sample recording. The command in Table 4.6 records five seconds of audio with the USB microphone.


root@DEVICE:/# arecord -D plughw:1,0 -r 8000 -f S16_LE -c 1 -d 5 sample.wav

Table 4.6: Sample recording [41][42].

The arecord command invokes the ALSA recording program within the kernel, which records from the sound input hardware registered as 1,0. The sampling rate is 8000 Hz for one channel, recorded in the S16_LE format (Signed 16-bit Little Endian) and stored as the file sample.wav. The recorded sample.wav file can be played with the aplay command in Table 4.7.

root@DEVICE:/# aplay -D plughw:0 sample.wav

Table 4.7: Sample playback [41][42].

The aplay command invokes the ALSA audio player, which plays the audio file (sample.wav) through the sound output hardware registered as 0 (usually a speaker).

4.2.3 Electret Microphone

Using a USB microphone is not an efficient way of detecting the noise level in the vicinity of the touch device because of the resources demanded from the kernel for real-time analysis of the audio input. The electret microphone's ability to detect sound, combined with the sound filtration and amplification circuit (Figure 4.2), provides another means to automatically control the volume of the touch device. The noise level of the touch device's surroundings is detected, processed and sent as an analog signal to the microcontroller. The microcontroller's analog pin is connected to an ADC, which converts the signal into a digital format the microcontroller can understand [45][46]. The microcontroller software not only receives a digital form of the analog input but also encrypts it for transfer to the main board through its serial port. Before the volume data is communicated to the main board, the main board sends a request for it (see Chapter 3, Figure 3.9). The request initiates the input read from the analog pin (ADC), the encryption of the data and its transmission to the main board (see Chapter 3, Figure 3.10).


Figure 4.4: Automatic volume control application structure (device board: ADC read, pre-qualification of a valid MIC value, and MIC input request and reply over serial; main board: system wake-up, volume control timer and device volume setting).

Figure 4.4 illustrates the electret microphone communication sequence with the main board. After receiving a reply from the device board containing the volume data, the main board decrypts it and sets the volume accordingly. The volume data contains an instruction to set the touch device volume level to 2, 4, 6, 8 or 10.

4.2.4 Volume Levels

The volume level is controlled by the main board software through a custom audio driver. The normal volume level ranges from 0 to 10, but the volume level delivered by the automatic volume control takes one of the following values:

Volume value    Volume level
10              high
8               semi-high
6               mid
4               semi-low
2               low

Table 4.8: Selected volume levels.

The reading of the electret microphone is done in stages. Due to interference on the analog port of the microcontroller, the initial values read from the microphone are inconsistent and unreliable. The first stage therefore involves reading the microphone for 10 ms and discarding the values. In the second stage, the microphone is read for another 10 ms or more, depending on how long the noise detection is required, and the maximum value is stored [47]. Finally, the maximum value, which represents the noise level in the surroundings, is compared with the volume table (a predefined noise threshold table) and the value representing the volume level to be transmitted is selected, from 1 (lowest) to 5 (highest), as described in Table 4.9. The MIC value represents the electret microphone values obtained through the ADC.

MIC value        Volume table    Volume level
>0 and <60       1               low
>60 and <150     2               semi-low
>150 and <300    3               mid
>300 and <600    4               semi-full
>600             5               full

Table 4.9: MIC - ADC volume level equivalent.
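The two-stage read and the threshold lookup of Table 4.9 can be sketched in C. This is a minimal illustration rather than the thesis firmware: the function names are invented, and the ADC samples are passed in as an array so the logic can run anywhere (the real firmware would poll the ADC instead).

```c
#include <assert.h>

/* Illustrative sketch: the first SETTLE_SAMPLES readings are discarded
 * (interference on the analog port), the maximum of the next
 * MEASURE_SAMPLES readings is kept, and the maximum is mapped to a
 * volume level using the thresholds of Table 4.9. */

#define SETTLE_SAMPLES  10
#define MEASURE_SAMPLES 10

/* Map a raw MIC/ADC maximum to the 1..5 volume level of Table 4.9. */
static int mic_to_level(int mic_max)
{
    if (mic_max > 600) return 5;   /* full      */
    if (mic_max > 300) return 4;   /* semi-full */
    if (mic_max > 150) return 3;   /* mid       */
    if (mic_max > 60)  return 2;   /* semi-low  */
    return 1;                      /* low       */
}

/* samples[] stands in for successive ADC reads of the electret mic. */
static int measure_noise_level(const int *samples)
{
    int i, max = 0;
    /* stage 1 (indices 0..SETTLE_SAMPLES-1) is simply skipped;
     * stage 2 keeps the maximum over the measurement window. */
    for (i = SETTLE_SAMPLES; i < SETTLE_SAMPLES + MEASURE_SAMPLES; i++)
        if (samples[i] > max)
            max = samples[i];
    return mic_to_level(max);
}
```

Note that the settle samples are discarded by design, so even large interference spikes in the first window do not affect the reported level.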

The microphone value and the automatic backlight control value read from the LDR (see Chapter 7) are transmitted to the main board at the same time. When the main board requests the status of the environment, the reply includes both the noise level and the lighting condition of the environment. This approach lets the main board adjust itself to all environmental changes at once and keeps the monitoring system synchronized; in this case the environment is checked every 5 seconds. To reduce the data size transmitted in the 20 ms transmission window every 5 seconds, the environmental data is encrypted as shown in Table 4.10.


MIC level    LDR level    No.    Alphabet
5            5            1      A
5            4            2      B
5            3            3      C
5            2            4      D
5            1            5      E
4            5            6      F
4            4            7      G
4            3            8      H
4            2            9      I
4            1            10     J
3            5            11     K
3            4            12     L
3            3            13     M
3            2            14     N
3            1            15     O
2            5            16     P
2            4            17     Q
2            3            18     R
2            2            19     S
2            1            20     T
1            5            21     U
1            4            22     V
1            3            23     W
1            2            24     X
1            1            25     Y

Table 4.10: Encryption table of electret microphone and LDR values.

The microphone and LDR values are grouped from 1 (lowest) to 5 (highest). They are read separately within a short time interval, and each value read is instantly grouped according to the predefined thresholds. The group equivalents of both values are encrypted together with the equation below:

MICandLDRcontrolValue = a + (b * (c + d))

where:

a = MIC_VAL * LDR_VAL
b = MIC_VAL - 1
c = MIC_VAL - LDR_VAL
d = abs(MIC_VAL - 5)

The obtained result is transmitted to the main board through the serial port. A result such as 1 is transmitted as the one-byte char A and represents volume level 5 and backlight level 5; the volume level is therefore set to the maximum of 10 and the backlight to the maximum of 128 (see Chapter 7).
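As an illustrative sketch (hypothetical function names, not the thesis code), the grouped MIC and LDR levels can be packed into and recovered from a single byte following the ordering of Table 4.10, where 'A' stands for the highest MIC and LDR levels:

```c
#include <assert.h>

/* Pack the grouped MIC and LDR levels (each 1..5) into one byte,
 * following the ordering of Table 4.10: 'A' = (MIC 5, LDR 5),
 * 'Y' = (MIC 1, LDR 1). Names are illustrative. */
static char encode_env(int mic_level, int ldr_level)
{
    int index = (5 - mic_level) * 5 + (5 - ldr_level);   /* 0..24 */
    return (char)('A' + index);
}

/* Inverse mapping used on the main board; returns the two levels
 * packed as tens/units for easy inspection (e.g. 33 = MIC 3, LDR 3). */
static int decode_env(char c)
{
    int index = c - 'A';
    int mic = 5 - index / 5;
    int ldr = 5 - index % 5;
    return mic * 10 + ldr;
}

/* Volume levels 1..5 map onto the device volume values 2, 4, 6, 8, 10
 * of Table 4.8. */
static int level_to_volume(int level)
{
    return level * 2;
}
```

Packing both readings into one printable byte keeps the periodic 20 ms transmission to a single character per 5-second poll.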

An automatic volume control system for a standalone device such as the touch device (DOP) is a technical feature that greatly improves the independent operation of the DOP. It promotes energy saving and user comfort. Also relevant to its independent functioning is that an operations manager, such as a building manager, no longer needs to spend attention and time manually regulating the volume level to suit the noise level of the surroundings.


5 HAPTIC FEEDBACK SYSTEM

Haptic feedback takes advantage of the sense of touch to acknowledge user input. The principle of haptic feedback correlates directly with the tactile response of users, that is, the natural feeling of being touched. Haptic feedback in this work focuses on giving users the feeling of being touched through the use of actuators and the synchronization of the feedback with the user's GUI touch input. The haptic feedback system consists of hardware and software components.

5.1 Hardware Design

The haptic feedback is generated on the surface of the touch device, and the surface, which is made of glass, is responsive to user touch. Generating feedback on such a surface is challenging, as the feedback cannot be localized with haptic actuators such as eccentric rotating mass (ERM) motors and linear resonant actuators (LRA) because of the rigid glass surface. With these actuators, however, the needed haptic feedback can be provided by vibrating the whole touch screen surface. The actuators are connected to a circuit controlled by a microcontroller according to the touch events from the touch screen.

5.1.1 Mechanics

Touch device haptic feedback operation is centered on the haptic actuator(s) attached to the back of the Human Interface Device [48]. The actuators are attached firmly, either directly on the surface or in an actuator shaft behind the touch screen. During operation, the haptic actuator performance can be evaluated by its vibration amplitude in G, which describes how hard it vibrates, and its vibration frequency in revolutions per minute (rpm), which describes how often the vibration effects are made.

An ERM motor with 9.5 G vibration amplitude and 6700 rpm [49] placed behind the touch screen gives enough vibration feedback when the screen is touched. The strength of the feedback at different points on the touch screen surface is affected by the location of the ERM motor behind the screen. Following the default location of the haptic actuator assembly behind the touch screen (shown in Figure 5.1) and placing the ERM motor on one side of the screen gives strong vibration feedback when the part of the screen close to the ERM motor is touched, and less towards the opposite end.

Figure 5.1: Haptic motor (9.5G) bottom left placement.

Experimenting further, an attempt to even out the feedback over the whole screen with a different ERM motor of 14.3 G and 6600 rpm [50], placed at the bottom left position behind the touch device (shown in Figure 5.2), yielded the same result: a higher concentration of the feedback effect at the bottom left corner of the touch device and less at the opposite end.

Figure 5.2: Haptic motor (14.3G) bottom left placement.


Another ERM motor with a higher force of 13 G (greater than 9.5 G) for stronger vibration, but a lower rotation speed of 5400 rpm for a reduced vibration rate [51], attached at the same bottom left end behind the touch screen (shown in Figure 5.3), produced stronger feedback, but it was not well distributed across the front surface of the screen as intended. A higher haptic effect is obtained at points close to the ERM motor and less as the contact point moves towards the opposite end.

Figure 5.3: Haptic motor (13G) bottom left placement.

Another major mechanical impediment is the sound produced during vibration, due to the resonance effect of the rotating ERM motor and the touch device assembly. The vibrating edges of the ERM motor casing (highlighted in Figure 5.4), which receive a high level of vibration through their direct contact with the ERM motor, vibrate into the empty chassis of the touch device assembly and create a loud and discomforting sound.


Figure 5.4: Haptic motor assembly (vibrating edges of the ERM motor casing highlighted).

A preliminary step of creating firm contact between the ERM motor casing and the chassis of the touch device proved positive, as a significant reduction in the resonant sound was observed. However, placing the ERM assembly at the center of the touch device only moved the concentrated haptic point to the center of the touch screen, with weaker feedback towards the edges.

There is still the need to eliminate the vibration sound completely and to distribute the haptic effect evenly over the front surface of the touch device screen. To that effect, several mechanical arrangements and actuator placements can be implemented. The single actuator with strong force and rpm is replaced with multiple actuators of less force and higher rpm, such as a 1.7 G, 11000 rpm actuator [52]. The touch screen connected to the touch device is placed in a vibration isolation layer between the touch screen and the touch device assembly.


Figure 5.5: Vibration isolation layer (insulator and insulator frame around the touch screen).

Placing the touch screen in the vibration isolation layer described in Figure 5.5 localizes the vibration of the actuators to the screen only, and not the whole device, thus improving the vibration feedback. In addition, using multiple actuators with less force, attached directly to the surface behind the touch screen, eliminated the sound caused by the resonating actuator assembly.

5.1.2 Electronics

The major component in the sub-circuitry that handles the haptic feedback is the haptic actuator (an eccentric rotating mass motor or a linear resonant actuator), which generates the vibration effect that is translated into a tactile feeling. The haptic actuator works with an input voltage between 2 V and 3.3 V. It is connected to a pull-up transistor and other circuit protection components such as rectifier diodes, ceramic capacitors and resistors [53][54]. The transistor in turn is connected to the microcontroller, which controls it through its gate pin. Details are presented in Figure 5.6.


Figure 5.6: Haptic feedback system circuitry.

The microcontroller controls the transistor by setting the GPIO pin, to which the transistor gate pin is connected, either HIGH or LOW, as determined by the software running in the microcontroller. The software instruction is sent from the touch device when it receives a touch input.


5.2 Software Design

The software design focuses on how the haptic actuator is controlled with respect to the user input and how the user input is synchronized with the haptic feedback. Having the actuator vibrate constantly would not simulate the required touch-input-to-feedback response; hence the software plays a vital role in the proper use of the actuator in the haptic feedback system.

5.2.1 Application Structure

The haptic feedback control process is implemented through serial communication between the touch device main board and the device board. Software is implemented at both ends: the main board receives the user input signal from the touch screen and transmits it over the serial link, while the device board receives the serial message and uses it to activate the haptic actuator.

The haptic feedback cycle begins at the touch device screen, where users send commands to the touch device. The main board software receives the touch input and transmits it through the serial communication port to the device board microcontroller. The first implementation transmitted an activation signal for the haptic actuator when the touch screen received an input signal and, after a set time in milliseconds, a deactivation signal. Due to buffering delay, the signals were delayed and the activation-deactivation process was not synchronized with the user touch input. The second implementation transmits only an activation signal from the main board when it receives a user input, and the device board itself deactivates the haptic actuator after a set time in milliseconds. The tested set times were 500 ms, which is too long to simulate a tactile feedback; 60 ms, which is too short considering the time a user needs to deliver an input command and remove the contact from the touch screen; and 120 ms, which is short enough (an adequate tactile feeling) for the vibration effect produced with the 9.5 G, 14.3 G and 13 G ERM motors [55]. Implementing the same sequence with an LRA did not yield a positive result, as the actuator needs much more time to reach the maximum vibration strength that can be felt by the user.


Figure 5.7: Haptic feedback system application structure (main board: GUI input validation and input signal transmission over serial; device board: haptic activation on the received input signal and deactivation on a millisecond timer).

After many experiments with the transmission speed, timing and synchronization of the user input with the haptic feedback, a clear application structure was designed, as shown in Figure 5.7. The structure represents the haptic feedback system needed for the button elements of the GUI. The cycle starts with the user's input through the GUI, which is received by the touch device and analysed to determine whether it is valid. Valid inputs are inputs registered by the GUI elements, not those from the surrounding touch-sensitive areas of the device touch screen. On receiving a valid input from the GUI, the touch device sends an encrypted one-byte message to the device board through the serial communication port. The other half of the haptic feedback system is carried out in the device board, where the application software decrypts the received serial data and activates the haptic actuator through the GPIO pin in the input/output control. At the same time, a timer is set to run for 120 ms; when it times out, it deactivates the haptic actuator, thereby creating a tactile effect for the user.
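The device-board side of this button cycle can be sketched as a small state machine in C. This is an illustration under stated assumptions: the GPIO pin and the millisecond clock are simulated with plain variables (on the real board they would map to hardware registers and a hardware timer), and all names are invented.

```c
#include <assert.h>

/* Sketch of the button-element haptic cycle on the device board:
 * a valid input byte activates the actuator, and a 120 ms timeout
 * deactivates it. */

#define HAPTIC_PULSE_MS 120UL

static int haptic_gpio = 0;                /* 1 = actuator driven HIGH */
static unsigned long haptic_off_at_ms = 0;

/* Called when the serial byte from the main board decrypts to a
 * valid button input. */
static void haptic_on_input(unsigned long now_ms)
{
    haptic_gpio = 1;                       /* activate the actuator */
    haptic_off_at_ms = now_ms + HAPTIC_PULSE_MS;
}

/* Polled from the main loop: releases the actuator once the
 * 120 ms pulse has elapsed. */
static void haptic_tick(unsigned long now_ms)
{
    if (haptic_gpio && now_ms >= haptic_off_at_ms)
        haptic_gpio = 0;
}
```

Keeping the timeout on the device board, rather than sending a separate deactivation byte, is what avoids the buffering-delay desynchronization described above.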

Another GUI element is the slider, which works with a scrolling gesture, in contrast to the button element, which works like a click. The scrolling nature of the slider means that the user's contact with the touch screen is established for longer than the 120 ms needed by the button element, so a deactivation signal after 120 ms cannot be produced by a timer on the device board. A different application structure (Figure 5.8) is created to keep the haptic actuator active as long as the user contact is established and to deactivate it immediately when the contact is no longer established.


Figure 5.8: Haptic feedback system application structure for the slider UI (main board: validation of slider touched and slider released events; device board: haptic activation on slider touch and deactivation on slider release).

Like the button element, the signal starts from the touch screen where the user enters a command. But instead of only filtering out invalid input, the main board application software also checks the type of input received from the GUI slider. Touching the slider generates a validated one-byte encrypted message that differs from the one generated when the user releases the slider. The messages are sent to the device microcontroller through the serial communication port. Both touch and release signals are received, depending on the event at the GUI layer, and each event results in the microcontroller taking a different action after decrypting the received serial data. If the data contains a touch event, the microcontroller invokes the activate-haptic command and activates the haptic actuator. If the decrypted data is a release event, the microcontroller invokes the deactivate-haptic command, which deactivates the haptic actuator. With this structure, the haptic feedback remains active as long as the user maintains contact with the touch screen and stops immediately when the contact is broken.
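The slider variant can be sketched as an event handler in which the actuator simply follows the decrypted touch/release events instead of a timer. The one-byte event codes below are invented for illustration; the thesis does not give the actual encoding.

```c
#include <assert.h>

/* Sketch of the slider haptic handling on the device board.
 * EV_* byte values are illustrative assumptions. */

#define EV_SLIDER_TOUCHED  'T'
#define EV_SLIDER_RELEASED 'R'

static int slider_haptic_gpio = 0;   /* 1 = actuator driven HIGH */

static void slider_handle_event(char ev)
{
    if (ev == EV_SLIDER_TOUCHED)
        slider_haptic_gpio = 1;      /* vibrate while contact persists */
    else if (ev == EV_SLIDER_RELEASED)
        slider_haptic_gpio = 0;      /* stop as soon as contact breaks */
}
```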

A third type of haptic feedback on the slider GUI, similar to the button application structure, is the activation of the haptic actuator when a user slides to a different check point along the slider indicating a building floor number. In this case, the actuator is deactivated after 120 ms to complete the haptic feedback cycle.

5.2.2 Haptic Patterns

Apart from the general tactile haptic pattern, whereby the haptic device is activated and deactivated after 120 ms in the pattern "activate-120ms-deactivate", other haptic patterns can be used to convey other types of touch instructions to the user and to indicate the user's current location within the touch device application environment. A tested implementation is a pattern initiated by a serial input that activates the haptic device and deactivates it after 50 ms. The haptic feedback is short, but it does not end there: the haptic device is activated again after a 20 ms break and deactivated after another 50 ms. This leads to the pattern "activate-50ms-deactivate-20ms-activate-50ms-deactivate" in a 120 ms period. This pattern gives a slightly different touch feeling, and it is used to indicate navigation back and forth between touch device application pages.

The haptic feedback system is about communicating through the sensation of touch. Loud sounds and weak feedback defeat the purpose of the system. Users are more responsive to feedback generated with an actuator of over 8 G force, but are irritated by the sound it produces. For a large touch screen like the touch device (DOP), it is a compromise between the audible noise generated during haptic feedback and the haptic impact received. Nevertheless, the haptic feedback system tested positively in usability testing, as it simulates an interaction with a more humane object, a validation of user input and the satisfaction that the device knows the user is waiting for a response.


6 SEMIOTIC LIGHTING SYSTEM

The sense of sight has proven to be a medium for accepting instructions from the surrounding environment. This work takes that into consideration and attempts to communicate through sight with light. Semiotics is a form of communication that uses not sound but signs and symbols, such as a lighting system. The touch device semiotic lighting system consists of light sources attached to the device, with which it communicates its status and its responses to received inputs.

6.1 Hardware Design

The semiotic lighting system uses Light Emitting Diodes (LEDs) as its source of light. This work features a strip of LEDs, each capable of displaying three colors (red, green and blue). The RGB LEDs are placed on the touch device to enhance visibility at different proximity levels. In conjunction with a light control circuitry, the semiotic light communicates different information to a touch DOP user.

6.1.1 Mechanics

The semiotic lighting system is intended to fit the current touch device mechanics, and any change should be kept to the barest minimum. The focus of the mechanical design is to find a means to disperse the light to a defined distance and a means to conduct away the heat produced by the LEDs in order to extend their lifespan.

The first design featured the LED strip on all four sides of the touch device, covered with a protective acrylic material. Light dispersion through a completely transparent acrylic material at close range is not diffused; instead, each LED on the strip creates a hot spot. This pattern is not a good means of semiotic light display and has a low usability rating due to the intense light from each individual hot spot. Another setback of the design is the poor heat-conducting property of the material from which the sides of the touch device are made and to which the LED strip was attached. The design therefore required extensive modification, as it provided no means of LED light diffusion or heat removal.

Acrylic materials are transparent enough to let light through them, and depending on the density of the material's film coating, the intensity of the light passing through can be varied. For this intensity adjustment with dense acrylic material to be possible, there is a required minimum distance between the material and the source of light. The compact nature of the touch device leaves no room for this minimum acrylic distance, so having the material too close to the light source produces hot spots regardless of its density. Attempts to circumvent this effect led to the use of a very dense acrylic material [56]. The hot spots were eliminated, but the acrylic material was too dense to allow light visibility beyond a few centimeters.

The visibility distance of the LED light is crucial to its semiotic purpose. The LED light will be present among other lighting in the surroundings where the touch device is mounted and should be intense enough to be visible irrespective of the brightness of other light sources. A visibility of only a few centimeters therefore renders it invisible among even the faintest surrounding light. The first test, made with a fully transparent acrylic material, shows in Figure 6.1 how the hot spots reduce its usability and design quality.

Figure 6.1: Light assembly with acrylic (hot spots visible).

The touch device contains an aluminium back panel, which is a good conductor of heat and a good base for attaching the LED strip in order to conduct away the heat produced by the LEDs. Placing the LED strip behind the touch device also means that the light rays are bounced off the mounting wall, eliminating the hot spots that were visible to the user with the acrylic material and creating a uniform diffused light. The mechanical design takes advantage of these features, and the appropriate changes were made to reflect it, as shown in Figure 6.2.

Figure 6.2: Light assembly on the aluminium back panel.

The LED light rays bounced off the mounting wall create a diffused light output (Figure 6.3), though not an evenly spread one, due to the curved nature of the touch device back panel. The aim of placing the LEDs at the back of the touch device is fulfilled: the LEDs have a heat sink, and their light is displayed as a uniform light with no hot spots. On the other hand, a completely spread diffused light still needs to be achieved. Means such as deflecting the LED light rays over a longer distance with a prism and other reflective surfaces did not yield the desired result. The distance between the LEDs and the mounting wall is very small, limiting the reflective materials that can be used to extend the reach of the light rays.

Figure 6.3: Diffused LED light.


6.1.2 Electronics

The LED strip contains red, green and blue LEDs, each powered by a 12 V power source. The LEDs are controlled through the microcontroller; they are therefore not directly connected to the 12 V supply and ground in a closed circuit, but through a transistor that is switched by the microcontroller. The microcontroller controls each transistor through a GPIO pin connected to the transistor's gate pin. Each LED color has a transistor switching circuit connected to the microcontroller through a resistor (see Figure 6.4), and the program running in the microcontroller switches the LEDs on and off as instructed by the touch device.

Figure 6.4: Semiotic lighting system circuitry.


Most LEDs have a voltage level between 5 V and 24 V, depending on the type used. The voltage and current levels determine the brightness of a given LED, and this should be checked and matched to the required brightness level. For this work, 12 V LEDs are used together with other electronic components such as resistors, transistors and a microcontroller. Another means of controlling the brightness of the LEDs, irrespective of the power supply, is to connect them to a pulse-width-modulated GPIO pin of a microcontroller. A value in the range 0-255 can then be set on the pin, with the set value corresponding to the LED brightness.
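The PWM brightness mapping can be sketched as a simple scaling from the 0-255 value to the HIGH portion of one PWM period. The helper name is illustrative:

```c
#include <assert.h>

/* Scale a 0-255 brightness value to the HIGH (on) portion of one PWM
 * period, as done by a pulse-width-modulated GPIO pin: 0 means the
 * LED is never driven, 255 means it is driven for the whole period. */
static unsigned long pwm_high_time_us(int brightness, unsigned long period_us)
{
    if (brightness < 0)   brightness = 0;    /* clamp to valid range */
    if (brightness > 255) brightness = 255;
    return period_us * (unsigned long)brightness / 255UL;
}
```

Because the LED is switched faster than the eye can follow, the average on-time is perceived as brightness, independently of the supply voltage.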

6.2 Software Design

The semiotic lighting system is controlled by the application software responsible for powering the LEDs on and off. One or more LEDs can be on at any time, and which LED is on or off depends on the message to be communicated to the user. The pattern of the LED state, such as steady on, fast blinking or slow blinking, is also controlled by the application software, which is structured to communicate the intended message to the user.

6.2.1 Application Structure

The control of the LEDs is determined by the data the touch device main board sends to the device board. The data that communicate a semiotic pattern to the device board can be generated from user input on the touch screen or from the state of the touch device. The touch device used for this work is capable of accepting a user's request for an elevator. When combined with the semiotic lighting, the data sent to the device board can communicate to the user relevant information concerning the call made to an elevator. Such user input data for semiotic lighting could include a separate light mapped to each individual elevator, separate lights for accepted and rejected elevator calls, and separate lights for different elevator service errors. The touch device state can also be communicated to a user through semiotic light before an elevator call is made, with each state represented by a different light color. States representing the touch device status could include: a) the touch device is locked and elevator calls cannot be made with it, b) the touch device is open and ready for use, c) the touch device is sleeping and will wake up for use if needed, d) the touch device is out of service and needs maintenance and e) the touch device is in fire mode.


The states of the touch device are not only determined by the internal application structure of the device but can also be triggered by external events connected to the touch device through a state control structure. Structures such as a building fire alarm system can be used to trigger the fire mode of the touch device, and an access control system can be used to trigger a locked or unlocked state of the touch device.

The instruction to the device board, whether triggered by user input or by the touch device status, is communicated over the serial link. The main board receives an input signal from the touch screen or the touch device system manager and encodes the signal into a single byte that is sent to the device board.

Figure 6.5: Semiotic lighting system application structure.

The device board, on the other hand, receives the main board data, decodes it and executes a semiotic sequence matching the information to be conveyed to the user. The sequence determines which LEDs are switched on or off through the transistor gates connected to the microcontroller GPIO pins. The application structure is illustrated in Figure 6.5.

For testing purposes, the main board implements a timer sequence sending different data to the device board. The transmitted data trigger the powering of the LEDs. This sequence shows the functioning pattern of the LEDs and simulates user input from the touch screen. To simulate system management input, the fire mode data and lock state data are sent periodically, triggering a fluctuating red light and a constant red light respectively.


6.2.2 Semiotic Signalization

The ultimate goal of a semiotic lighting system is to communicate any type of information to a user using the different colors and light combinations available to the system. There are universal interpretations of lights and their combinations, such as a blinking red light denoting danger and a constant green light indicating a properly functioning device. The semiotic lighting system takes advantage of these existing light meanings and also generates other light combinations that can easily communicate the intended information to the user.

In order to create a semiotic lighting system, the number of available light colors and the possible combination sequences need to be established. The touch device used for this work features LEDs that emit three basic colors, namely red, green and blue. These can be combined to obtain seven different colors: red, green, blue, lemon, violet, cyan and white. With seven colors established for the semiotic lighting communication, the next phase involves the sequence of displaying these colors to the user in order to communicate the right information.

At the top of the test listing for this work is the fluctuating red light, used to symbolize fire mode and communicate a state of danger to the user. Basic testing shows that an intermittent 100 ms on/off blinking of the red LED is interpreted as a sign of danger not only in the vicinity of the device but from any point where the light can be seen. Non-users who see the blinking light also understand the danger message it is sending.

Other basic tests of the semiotic lighting system include a constant red light when the touch device is locked with a locking key and a constant green light when the key is used to unlock it. These are easily interpreted as the touch device being unusable and usable respectively. A flash of green light indicates that the user input on the touch screen has been accepted, and a constant blue light indicates that the user has established contact with the touch screen but has not yet registered an input call.

A new semiotic lighting system should be designed to be easily understandable, and if needed a basic explanation can be given to a user. A complex semiotic lighting system that needs constant explanation of its meaning will fail to attain its semiotic goal of communicating information to a user. With respect to elevator call giving, a possible semiotic lighting system that might need basic explanation would be the allocation of a different light color to each elevator, displaying that color when the elevator is allocated to a user.

Possible lighting signalizations representing the state of a device are given in Table 6.1.

LED ON THE FRONT PANEL

Color    Meaning
RED      Device locked
BLUE     Device open and in sleep mode
GREEN    Device open and ready for use

Table 6.1: Front panel light pattern.

Possible lighting signalizations representing the responses of a device are listed in Table 6.2.

LED ON THE BACK PANEL (wall splash)

Color    Meaning
RED      Elevator A allocated to user
BLUE     Elevator B allocated to user
GREEN    Elevator C allocated to user
LEMON    Elevator D allocated to user
VIOLET   Elevator E allocated to user
CYAN     Elevator F allocated to user
WHITE    Elevator H allocated to user

Table 6.2: Back panel light pattern.

Using LEDs connected to a pulse width modulated GPIO pin, pulsating LED characteristics can be implemented in the signalization sequence. For example, sleep mode can be indicated not with a constant blue light but with a heartbeat rhythm (pulsating) blue light displayed by the back and front panel LEDs. A pulsating red light on the back and front panel LEDs, indicating that the DOP is locked and sleeping, can be used to distinguish this information from a constant red LED that only indicates a locked device.

Human response to light variation, both in color and intensity, is very swift, and even a slight change is noticeable. This medium, combined with a well structured semiotic light guide, can be used to communicate any instruction to a user. The semiotic lighting system received a positive usability rating, as basic light signalizations such as red for locked, green for good and blinking red for danger are understood instantly. LEDs as actuators for a cyber physical system provide an avenue for soundless communication.


7 AUTOMATIC BACKLIGHT CONTROL

The visibility of the content displayed on the touch device screen is greatly affected by the brightness of its backlight. The backlight is one of the most energy consuming components of the touch device display system, and its energy consumption is directly proportional to its brightness. Reducing the screen brightness will therefore result in a measurable saving in the touch device's total energy consumption. Varying the touch screen brightness has to be well matched with the lighting conditions around the touch device, as the surrounding light also affects the final brightness perceived by a user. Usually a dark room requires less backlight brightness for the displayed content to be visible, while a fully lit room requires maximum backlight brightness to make the content visible to a user.

This work targets the control of the backlight by observing the lighting conditions around the touch device. A photosensitive electronic component, the Light Dependent Resistor (LDR), which has the property of varying its resistance according to the light intensity falling on its photosensitive surface, is used to determine the controlling property of the touch device backlight. Together with other electronic components, an automatic backlight control system is designed for the touch device used for prototyping in this work.

7.1 Hardware Design

The touch device used for testing in this work has a sensitive touch screen responsible for displaying information to the users. The touch screen also has a backlight system incorporated into the main board. This work focuses on using a photosensitive electronic component to determine the backlight brightness in place of the existing manual backlight brightness selection system.


7.1.1 Mechanics

Determining the right brightness to which the touch screen backlight will be set depends on the lighting conditions in the vicinity of the touch device. The target is to even out the final light perceived by the touch device user, which consists of the backlight brightness and the vicinity lighting. To capture the vicinity lighting conditions as perceived by a user standing in front of the touch device, the LDR should be located at a point where the light rays falling on its photosensitive surface arrive at the same angle as the user's line of sight.

The position of the touch device relative to the surrounding light source also affects where the LDR is placed on the touch device. Positions that result in shadows cast over the LDR, or any obstacle preventing the light received by the LDR from matching the light observed by the user, should be avoided. With the touch device used for testing in this work, the positions tested for the LDR are at the top, at the center and at the bottom of the front surface of the device. These positions are shown in Figure 7.1.

Figure 7.1: Positions of photosensitive component (LDR): top, center and bottom of the front panel.

Placing the LDR at the top position of the touch device front panel puts the LDR in a good light detection position. The tilted shape of the touch device angles the LDR towards a ceiling mounted light source, with no obstruction along the path of light. The center position also gave a positive result, with no significant difference from the top position, but with a rare risk of a tall user obstructing the light path. Unlike the top and center positions, the bottom position did not yield a good result due to its closeness to the user and the high potential of shadows being cast over it. The path of light rays to it can easily be disrupted by any user, and the tilted touch device places the LDR much closer to the user than the other positions, making proper light detection impossible.

7.1.2 Electronics

Part of the device board is the sub-circuitry (Figure 7.2) that handles the automatic backlight control functionality. This part of the device board is responsible for collecting data on the lighting conditions around the touch device and for communicating such data to the main board. The main board in turn sets the touch device backlight brightness level based on the data received.

The major component in the automatic backlight control sub-circuitry is the photosensitive component (LDR). Connected to the sub-circuitry are other electronic components such as resistors and capacitors needed for circuit protection [57]. The LDR readings are fed into the microcontroller through an analog pin connected to an Analog to Digital Converter (ADC). The ADC is needed to convert the analog data from the LDR into a digital form understandable by the microcontroller.

A closer look at the LDR shows a photosensitive compound, cadmium sulphide, aligned in a track shape across the surface of the LDR. On either side of the track are metal film contacts needed for connecting the LDR. One metal film contact is connected to a 5 V power source and the other to the analog pin of the microcontroller. Depending on the intensity of light on the photosensitive surface of the LDR, the voltage from the LDR's 5 V pin to the microcontroller analog pin varies from 0 to 5 volts. Maximum resistance, resulting in no voltage conduction (0 V) through the LDR, is obtained in a dark environment, while fully lit surroundings give a 5 V reading at the microcontroller pin. The microcontroller ADC converts the 0 - 5 V range of the LDR into a value between 0 and 1023, which is analysed for the touch device backlight control.


Figure 7.2: Automatic backlight control electronics circuitry.

The complementary circuitry for controlling the touch device backlight is located on the touch device main board. As part of a larger SoC circuitry, the backlight control system is controlled by a GPIO pin of the SoC processor. The control pin provides a pulse width modulated signal to the backlight LED through a Schmitt-trigger inverter. The Schmitt-trigger inverter is a comparator circuit that converts analog signals to digital signals; it retains its converted digital output value until there is a sufficient increase or decrease in the analog input value to trigger a change in the output. Together, the main board and device board automatic backlight control systems can read the surrounding light value and trigger a change in the touch device backlight brightness.

7.2 Software Design

As important as the electronics of the automatic backlight control is the supporting software infrastructure needed to fully implement its autonomous behaviour. The retrieval of the LDR value at the time the backlight is to be adjusted, the communication of such values to the main board and other supporting functionality needed for the complete execution of automatic backlight control are implemented in software.

7.2.1 Application Structure

Implementing a software structure for the automatic backlight control system, on top of a hardware structure divided into two complementary units, requires two separate but interacting software platforms, each controlling its own hardware and functions. As explained earlier, the LDR is connected to the device board microcontroller ADC through an analog pin, which is responsible for obtaining the surrounding light brightness falling on the touch device. On the device board, an application structure is designed to handle the data retrieval from the ADC. The device board software also analyses the obtained data to determine whether it is relevant enough for the brightness of the touch device backlight to be changed. Relevant data is encoded and communicated to the main board, also by the device board software.

After the data for the control of the touch device backlight is sent, the application structure implemented on the main board (shown in Figure 7.3) is responsible for decoding the received data and adjusting the device backlight according to the received instruction.


Figure 7.3: Automatic backlight control application structure.

For the touch device used for testing and prototyping the automatic backlight system, the main board application software requests the LDR value at a set time interval (see chapter 3, Figure 3.9) and immediately after waking from sleep mode. With the set time interval, the time needed to adjust the screen brightness after the surrounding light changes is known (a maximum of 5 seconds). In sleep mode the screen brightness level is set to zero, so requesting the LDR value on waking up enables the touch device backlight to be set to the appropriate brightness level from the zero brightness state. The microcontroller software features a structure that not only receives a digital format of the analog input but also encodes it to be transferred to the main board through its serial port. The request for LDR data from the main board initiates the input read from the analog pin (ADC), the encoding of the data and its transmission to the main board (see chapter 3, Figure 3.10).

7.2.2 Brightness Levels

The backlight adjustment is controlled through a Schmitt-trigger inverter, which results in a non-linear adjustment of the backlight brightness with respect to the brightness value, because the Schmitt-trigger inverter's trigger level is also non-linear. The brightness value of the backlight ranges from 0 to 255; the highest brightness level, full brightness, is set with the brightness value 128, while the backlight is turned off by setting the brightness value to 255.

Between the brightness value 128 for full brightness and 255 for no brightness, the remaining brightness values can be grouped into the following brightness levels: low brightness, semi-low brightness, mid brightness and semi-full brightness. Table 7.1 shows the discernible brightness values and the corresponding brightness levels.

Brightness values               Brightness level

255, 241                        No brightness
240, 239, 238, 237, 234, 230    Low
229, 220, 215, 210, 201         semi-low
200, 185, 180, 171              mid
170, 155, 140, 130              semi-full
129, 128, 125, 0                Full

Table 7.1: Backlight brightness levels.

From the obtained brightness values and levels, five major discernible values were selected for the automatic backlight control, as listed in Table 7.2. These values are used to set the touch device backlight brightness according to the value obtained from the LDR.

Brightness value    Brightness level

230                 Low
210                 semi-low
180                 mid
150                 semi-full
128                 Full

Table 7.2: Selected brightness levels.

The highest LDR reading of the surrounding light is compared against a predefined brightness threshold table, and the value representing the brightness level to be transmitted is selected, from 1 (lowest) to 5 (highest). The LDR value represents the LDR reading obtained through the ADC.

LDR value        Brightness table    Brightness level

>0 and <300      1                   Low
>300 and <500    2                   semi-low
>500 and <600    3                   mid
>600 and <700    4                   semi-full
>700             5                   Full

Table 7.3: LDR ADC brightness level equivalent.

The automatic backlight control value and the microphone value read from the electret microphone (see chapter 4) are transmitted to the main board at the same time. When the main board requests the status of the environment, the returned status includes both the noise level and the lighting conditions of the environment. This allows the main board to react to all environmental changes at once and to synchronize the monitoring cycle, which in this case runs every 5 seconds. To reduce the data size transmitted in the 20 ms transmission slot every 5 seconds, the environmental data is encoded (see chapter 4, section 4.2.4) before transmission through the serial port to the main board.

The most important benefit of the automatic backlight control system is the achievable energy saving. Using the selected brightness levels, energy measurements showed that the effect of the brightness value on the energy consumption is linear. The width of the pulse width modulated signal directly controls the portion of the time the backlight dissipates a "constant" amount of power. Roughly, if the pulse width modulator value is changed by one, a change of 2 mA is observed in the current drawn from the 24 V supply, corresponding to 48 mW of power dissipation. As an example, the difference in power dissipation between the pulse width modulator (brightness) values 230 and 180 is calculated with the equation:

Power_diff = (value_1 − value_2) × (ratio × I × V)

where:
value_1 and value_2 are the two brightness values being compared,
ratio is the rate of change of consumed current per unit change in brightness value,
I is the unit of circuit current and
V is the circuit voltage.

Therefore:

Power_diff = (230 − 180) × (2 × 1 mA × 24 V) = 50 × 48 mW = 2.4 W

Depending on the surrounding lighting, a corresponding brightness level is selected. For each brightness level transition, the power saved is calculated below:

Power_diff = (150 − 128) × (2 × 1 mA × 24 V) = 22 × 48 mW = 1.06 W
Power_diff = (180 − 150) × (2 × 1 mA × 24 V) = 30 × 48 mW = 1.44 W
Power_diff = (210 − 180) × (2 × 1 mA × 24 V) = 30 × 48 mW = 1.44 W
Power_diff = (230 − 210) × (2 × 1 mA × 24 V) = 20 × 48 mW = 0.96 W

Using the automatic backlight control, an average power saving of 1.23 W is obtained. Power saving is essential to all electronic devices and is a technical feature that must be present in all cyber physical systems, as it is crucial to their long term operation.


8 CONCLUSIONS AND FUTURE WORK

This work has ventured into the world of cyber physical systems and investigated, with the touch device, some of the functionalities such a system is capable of executing.

8.1 Conclusions

The field of cyber physical systems is constantly growing and holds a future where a device's interaction with its environment is compulsory for it to be useful in the environment where it is located. It must be capable of independently communicating back and forth with the environment. This work has demonstrated the incorporation of cyber physical functionalities into a non cyber physical system and researched the benefits of adding cyber physical capabilities to a DOP (touch device) and to devices in general. As part of a DCS, a cyber physical DOP is more efficient, both as an independent device and as part of an elevator traffic control system, than a non-cyber physical DOP.

Devices that were created without cyber physical capabilities can be improved by developing a device board to act as an interface between the device and the physical environment. An obvious example is the device board developed in this work to serve as an interface between the touch device and the environment. The device board uses a microcontroller to process data retrieved by sensors from the environment, communicates such data to the touch device and controls the actuators connected to it according to the instructions received from the touch device.

Starting with the energy saving methods and the advantages gained when devices can communicate with their immediate environment and vary their workload to user satisfaction, features such as automatic backlight and volume control, semiotic lighting and haptic feedback have proven to be relevant functionalities. They can help improve the acceptability of future devices. Features such as automatic backlight and volume control not only help to reduce the energy consumed by devices but also reduce the devices' dependence on humans, such as the need to manually regulate the device volume in different noise situations or to observe the surrounding brightness in order to adjust the device backlight accordingly. This prevents: a) too loud a device in a silent environment, b) too low a volume in a noisy environment, c) too bright a display in a poorly lit environment and d) a poorly lit display overshadowed by a bright environment. Features such as semiotic lighting and haptic feedback also greatly increase the usability of devices, as they engage multiple sense organs of a user and give the perception of a two-way, more humane interaction with a machine.

As the name suggests, cyber and physical will be the trend in technological development until the barrier between the physical world and the cyber world is minimized or completely eliminated. This work has given insight and means for future devices and non cyber physical devices to embrace their cyber physical potential and become part of this technological trend.

8.2 Future work

The functionalities researched and demonstrated are still a growing area in the field of cyber physical system development, and the future holds better and more easily implementable solutions. The tactile or haptic feedback system, implemented here with an eccentric rotating mass motor, could also be implemented with a linear resonant actuator (LRA) or a piezo actuator [58]. The research into those actuators showed that the current technology does not support their usage due to the size of the touch device: available LRAs do not deliver enough force for the needed haptic feedback, and the power requirements of off-the-shelf piezo actuators are not supported by the touch device circuitry. Future advancement in cyber physical system technology can provide readily available LRAs and piezo actuators for better haptic feedback. Other haptic feedback systems that do not require a motor, such as TeslaTouch [59], which uses electrovibration for haptic feedback, show promising innovation in machine tactile technology.

The technology for semiotic lighting is already quite advanced, and a future task for this work would be to develop a semiotic lighting system that can be universally accepted in the elevator signalization domain. The possibility of implementing a lighting system with smaller and more intense lights, with better allure to the user's eyes, should also be investigated. This can be implemented with diffusers, through diffraction and with pulse width modulated signals.

Using the USB microphone for this work posed a scenario whereby the implementation would disrupt the normal functionality of the touch device due to the absence of dedicated processing resources for real time audio analysis and processing. The absence of a dedicated digital signal processor can be compensated for by a software equivalent, with all sound processing handled in software. Creating such software was out of the scope of this work and would make a good future research topic. The electret microphone works as designed but with reduced functionality due to its limitations and sound sensitivity. Further prototyping could use components such as a condenser microphone and an advanced pre-processing circuit [60] to enhance the sensitivity of the automatic volume control system to the surrounding sound level. All embedded systems have software structures (embedded software); by understanding existing libraries [61] and developing dedicated ones [62], dedicated software structures could be developed for the domain of cyber physical technology.

There are further enhancements that could be applied to this work. Apart from those listed above, the serial communication could be improved to support simultaneous two-way communication for reading and writing, in order to increase the efficiency of data transfer and the response time to environmental changes. The features investigated worked as intended, with room for improvement as technology advances.


BIBLIOGRAPHY

[1] Noergaard T. Embedded operating systems - part 1: Process implementation. Computer, 2013. URL: http://edn.com/design/systems-design/4411129/Embedded-Operating-Systems---Part-1--Process-implementation.

[2] Maxim Integrated. Max3430, 2013. URL: http://www.maximintegrated.com/datasheet/index.mvp/id/3735.

[3] cplusplus.com. ioctl and tiocsrs485 problem, 2013. URL: http://www.cplusplus.com/forum/unices/48693/.

[4] mkssoftware.com. Data structure containing terminal information, 2013. URL:https://www.mkssoftware.com/docs/man5/struct_termios.5.asp.

[5] Linux Kernel Organization. The linux kernel archives, November 2013. URL:https://www.kernel.org/.

[6] Kroah-Hartman G. LINUX KERNEL IN A NUTSHELL. O’Reilly Media Inc.,United States of America., December 2006.

[7] Zhang A. Understanding the configuration and usage of i.MX53 MUX IO pins for a customizing board - blog archive, December 2013. URL: https://community.freescale.com/thread/258624.

[8] Corbet J., Rubini A., Kroah-Hartman G. LINUX DEVICE DRIVERS. O'Reilly Media Inc., United States of America, third edition, January 2005.

[9] Mentor Graphics. Embedded software, November 2013. URL: http://www.mentor.com/embedded-software/sourcery-tools/sourcery-codebench/editions/lite-edition/.

[10] Mentor Graphics Inc. Sourcery CodeBench Lite 2013.05-23: Getting Started.Computer, 2013.

[11] explainshell.com. URL: http://explainshell.com/.

[12] Acme Systems srl. Rs485 lines, January 2014. URL: http://www.acmesystems.it/rs485.


[13] Langer M. Rs485 on embedded linux boards, January 2014. URL: http://armbedded.eu/node/322.

[14] Michael R. Serial Programming Guide for POSIX Operating Systems. Computer,1999. URL: http://www.cmrr.umn.edu/~strupp/serial.html.

[15] Vernon C. The linux serial programming howto, September 1999. URL: http://www.lafn.org/~dave/linux/Serial-Programming-HOWTO.txt.

[16] Freescale Semiconductor Inc. i.MX53 Multimedia Applications Processor Ref-erence Manual, 2012.

[17] Freescale Semiconductor Inc. i.MX53 System Development User’s Guide, 2011.

[18] Landley R. Linux Documentation: GPIO Interfaces, 2013. URL: https://www.kernel.org/doc/Documentation/gpio/gpio-legacy.txt.

[19] Kerrisk M. Linux Programmers Manual, 2013. URL: http://man7.org/linux/man-pages/man3/errno.3.html,http://man7.org/linux/man-pages/man2/fcntl.2.html,http://man7.org/linux/man-pages/man2/open.2.html.

[20] die.net. termios(3)-Linux man page. URL: http://linux.die.net/man/3/termios.

[21] Marwedel P. Michael E. Embedded System Design 2.0: Rationale Behind aTextbook Revision. WESE, Taipei, Taiwan, 2011.

[22] Atmel Corporation. 8-bit Microcontroller with 4/8/16/32K Bytes In-System Pro-grammable Flash, 2009.

[23] circuitstoday.com. Introduction to atmega32 (avr series) 8bit micro-controller. Computer, 2014. URL: http://www.circuitstoday.com/atmega32-avr-microcontroller-an-introduction.

[24] mouser.com. Atmel atmega328 8-bit avr mcus, January 2014. URL: http://mouser.com/new/atmel/atmelatmega328/.

[25] arduino.cc. Arduino nano, January 2014. URL: http://arduino.cc/en/Main/arduinoBoardNano.

[26] Axelson J. Designing rs-485 circuits. Computer, 2013. URL: http://www.lvr.com/rs-485_circuits.htm.

[27] BB Electronics. Basics of the rs-485 standard. Computer, 2013.

[28] Arduino. Reference: Serial, 2013. URL: http://arduino.cc/en/Reference/Serial.

[29] Bies L. Rs485 serial information. Computer, 2013. URL: http://www.lammertbies.nl/comm/info/RS-485.html.

73

Page 83: CYBER PHYSICAL DESTINATION OPERATING PANEL

[30] Moxa Inc. The secrets of rs-485 half-duplex communication. Computer, 2009.

[31] Maxim Integrated Products Inc. Guidelines for Proper Wiring of an RS-485(TIA/EIA-485-A) Network, 2012.

[32] Moxa Inc. The basics of rs-232/422/485. Computer, 2013.

[33] Acme Systems. Rs485 basic hardware interface. Computer, 2013. URL: http://www.acmesystems.it/?id=28.

[34] arduino.cc. Softwareserial library, January 2014. URL: http://arduino.cc/en/Reference/SoftwareSerial.

[35] Ford M. How to write timings and delays in arduino, February 2014.

[36] Patarroyo J. Sound sensors, February 2014. URL: http://wiki.wiring.co/wiki/Sound_Sensors.

[37] Tranter J. The Linux Sound HOWTO. Jeff Tranter, July 2001. URL: http://www.tldp.org/HOWTO/Sound-HOWTO/.

[38] Axelson J. Access usb devices from a linux usb embedded host. Computer,2013. URL: http://www.lvr.com/access_usb_devices_from_linux.htm.

[39] Audacity Team. Usb mic on linux, December 2013. URL: http://wiki.audacityteam.org/wiki/USB_mic_on_Linux.

[40] 2013. URL: http://forums.gentoo.org/viewtopic-t-920704-start-0.html.

[41] Sessink V. Alsa-sound-mini-HOWTO. Valentijn Sessink, November 1999. URL:http://www.tldp.org/HOWTO/Alsa-sound-6.html.

[42] die.net. Linux Documentation, 2013. URL: http://linux.die.net/man/.

[43] ALSA-project.org. Advanced linux sound architecture (alsa), December 2013.URL: http://www.alsa-project.org/.

[44] Gentoo Team. Gentoo wiki archives: Howto_compile_kernel_with_alsa, Decem-ber 2013. URL: http://www.gentoo-wiki.info/HOWTO_Compile_Kernel_with_ALSA.

[45] Bloke D. Sound pressure sensor for arduino based on zx-soundboard, February 2014. URL: http://www.instructables.com/id/Sound-Pressure-sensor-for-Arduino-based-on-ZX-soun/all/?lang=es.

[46] Using a microphone with an arduino, 2014. URL: http://electronics.stackexchange.com/questions/36795/using-a-microphone-with-an-arduino.

74

Page 84: CYBER PHYSICAL DESTINATION OPERATING PANEL

[47] Tigoe. Sound in to a microntroller, February 2014. URL: http://www.tigoe.com/pcomp/code/arduinowiring/208/.

[48] Texas Instruments Incorporated. Haptics solutions for erm and lra actuators.Computer, 2013.

[49] Precision Microdrives. Product Data Sheet Uni Vibe 8mm Vibration Motor -20mm Type, 2013.

[50] Precision Microdrives. Product Data Sheet Uni Vibe 12mm Vibration Motor -15mm Type, 2013.

[51] Precision Microdrives. Product Data Sheet Uni Vibe 24mm Vibration Motor -13mm Type, 2013.

[52] Precision Microdrives. Product Data Sheet Pico Vibe 12mm Vibration Motor -3.4mm Type, 2013.

[53] Hewes J. Transistor circuits, January 2014. URL: http://electronicsclub.info/transistorcircuits.htm.

[54] Texas Instruments Incorporated. Haptics: Touch feedback that really moves you.Computer, 2012.

[55] Rao S. High-definition haptics: Feel the difference Computer, 2012.

[56] CEDES Safety & Automation AG. LED-Display Edge CAN & RS485, 2013.

[57] circuitstoday.com. Ldr-light dependent resistors. Computer, 2014. URL: http://www.circuitstoday.com/ldr-light-dependent-resistors.

[58] Blankenship T. Tactile feedback solutions using piezoelectric actuators. Com-puter, January 2011. URL: http://www.maximintegrated.com/app-notes/index.mvp/id/4706.

[59] Bau O. Teslatouch:electrovibration for touch surfaces. Computer, 2010. URL:http://www.olivierbau.com/teslatouch.php.

[60] Arduino. Language reference, January 2014. URL: http://arduino.cc/en/Reference/HomePage.

[61] freeengineer.org. Learn unix. URL: http://freeengineer.org/.

[62] cplusplus.com. URL: http://www.cplusplus.com/.


A APPENDIX

A.1 Serial IC pin mapping

Serial communication requires an IC for hardware interfacing. The IC pin mapping is listed in Table A.1.

Pin  Label  Function
1    RO     Receiver output to the UART buffer.
2    !RE    Receive select pin (active low).
3    DE     Transmit select pin.
4    DI     Transmit input from the UART buffer.
5    GND    Ground.
6    A      Interface A to the other RS485 IC.
7    B      Interface B to the other RS485 IC.
8    Vcc    3.3 volts input.

Table A.1: Serial IC pin mapping [2].

A.2 Communication Structure

The dedicated library structures in Figure A.1 and Figure A.2 are used to configure serial communication in software.

struct serial_rs485 {
    __u32 flags;                 /* RS485 feature flags */
#define SER_RS485_ENABLED        (1 << 0) /* If enabled */
#define SER_RS485_RTS_ON_SEND    (1 << 1) /* Logical level for RTS pin when sending */
#define SER_RS485_RTS_AFTER_SEND (1 << 2) /* Logical level for RTS pin after sent */
#define SER_RS485_RX_DURING_TX   (1 << 4)
    __u32 delay_rts_before_send; /* Delay before send (milliseconds) */
    __u32 delay_rts_after_send;  /* Delay after send (milliseconds) */
    __u32 padding[5];            /* Memory is cheap, new structs are a royal PITA .. */
};

Figure A.1: Serial communication structure [3]

struct termios {
    tcflag_t c_iflag;    /* input mode flags */
    tcflag_t c_oflag;    /* output mode flags */
    tcflag_t c_cflag;    /* control mode flags */
    tcflag_t c_lflag;    /* local mode flags */
    cc_t     c_line;     /* line discipline */
    cc_t     c_cc[NCCS]; /* control characters */
    speed_t  c_ispeed;   /* input speed */
    speed_t  c_ospeed;   /* output speed */
#define _HAVE_STRUCT_TERMIOS_C_ISPEED 1
#define _HAVE_STRUCT_TERMIOS_C_OSPEED 1
};

Figure A.2: Terminal communication structure [4]

A.3 Device board pseudo-example

Sensor input handling, serial communication with the main board, and the actuator control sequence are implemented with the following code sample.

//TIMING
// the interval in mS
#define interval 20
unsigned long time = 0;
unsigned long time2 = 0;
#define mic_interval 10
unsigned long mic_time;
unsigned long mic_time2;
#define err_interval 10

//COMMUNICATION
#define RS485Transmit HIGH
#define RS485Receive LOW
#define HSerialTxControl 2 // RS485 direction control
int HSReceived;

//HAPTIC FEEDBACK
#define HAPTIC 13  // haptic feedback actuator control
#define HAPTIC2 12 // haptic feedback actuator control
#define HAPTIC3 8  // haptic feedback actuator control
#define HAPTIC4 7  // haptic feedback actuator control

//DESIGN LIGHTING
//back
#define LIGHTBLUE 6
#define LIGHTRED 5
#define LIGHTGREEN 3
//front
#define LIGHTBLUE2 9
#define LIGHTRED2 10
#define LIGHTGREEN2 11
boolean fireset;

//LIGHT SENSOR
int LDRPIN = 0;
int LDRPIN_IN = 0;
int LDR_VAL = 0;

//VOLUME SENSOR
int MICPIN = 3;
int MICPIN_IN = 0;
int maxValue;
int MIC_VAL = 0;

void setup() /****** SETUP: RUNS ONCE ******/
{
  // Start the built-in serial port, probably to Serial Monitor
  Serial.begin(9600);
  pinMode(HSerialTxControl, OUTPUT);
  pinMode(HAPTIC, OUTPUT);
  pinMode(HAPTIC2, OUTPUT);
  pinMode(HAPTIC3, OUTPUT);
  pinMode(HAPTIC4, OUTPUT);
  pinMode(LIGHTRED, OUTPUT);
  pinMode(LIGHTGREEN, OUTPUT);
  pinMode(LIGHTBLUE, OUTPUT);
  pinMode(LIGHTRED2, OUTPUT);
  pinMode(LIGHTGREEN2, OUTPUT);
  pinMode(LIGHTBLUE2, OUTPUT);
  maxValue = 0; mic_time = 0;
  mic_time2 = 0; fireset = true;
  digitalWrite(HSerialTxControl, RS485Receive);
}//--(end setup )---

void loop() /****** LOOP: RUNS CONSTANTLY ******/
{
  if (Serial.available() > 0){
    HSReceived = Serial.read();

    //GREEN LED, LOCK OPEN, FIREMODE OFF
    if(HSReceived == 'B'){
      if(fireset) fireset = !fireset;
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, HIGH);
      digitalWrite(LIGHTGREEN, HIGH); digitalWrite(LIGHTGREEN2, HIGH);
      digitalWrite(LIGHTBLUE, LOW); digitalWrite(LIGHTBLUE2, LOW);
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
    //LED OFF, READ MIC, READ LDR, SEND BOTH
    else if(HSReceived == 'H'){
      digitalWrite(HSerialTxControl, RS485Transmit);
      // sample the mic: 10 ms settling read (err_interval),
      // then 10 ms peak search (mic_interval)
      mic_time = millis();
      while(mic_time2 < (mic_time + err_interval)){
        mic_time2 = millis();
        MICPIN_IN = analogRead(MICPIN);
      }
      mic_time = millis(); mic_time2 = 0;
      while(mic_time2 < (mic_time + mic_interval)){
        mic_time2 = millis();
        MICPIN_IN = analogRead(MICPIN);
        if (maxValue < MICPIN_IN) {
          maxValue = MICPIN_IN;
        }
      }
      if(maxValue < 60) MIC_VAL = 1;
      else if(maxValue >= 60 && maxValue < 150) MIC_VAL = 2;
      else if(maxValue >= 150 && maxValue < 300) MIC_VAL = 3;
      else if(maxValue >= 300 && maxValue < 600) MIC_VAL = 4;
      else MIC_VAL = 5;

      //read LDR instantly
      LDRPIN_IN = analogRead(LDRPIN);
      if(LDRPIN_IN < 300) LDR_VAL = 1;
      else if(LDRPIN_IN >= 300 && LDRPIN_IN < 500) LDR_VAL = 2;
      else if(LDRPIN_IN >= 500 && LDRPIN_IN < 600) LDR_VAL = 3;
      else if(LDRPIN_IN >= 600 && LDRPIN_IN < 700) LDR_VAL = 4;
      else LDR_VAL = 5;

      //pre-calculations
      int a = MIC_VAL * LDR_VAL;
      int b = MIC_VAL - 1;
      int c = MIC_VAL - LDR_VAL;
      int d = abs(MIC_VAL - 5);
      int mic_ldr_contr = a + (b * (c + d));

      //transmit the combined value for 20 ms
      time = millis();
      while(time2 < (time + interval)){
        time2 = millis();
        Serial.write(mic_ldr_contr);
      }
      time2 = 0; mic_time2 = 0; maxValue = 0;
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
    }
    //LOCKED MODE
    else if(HSReceived == 'I'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, HIGH); digitalWrite(LIGHTRED2, HIGH);
      digitalWrite(LIGHTGREEN, LOW); digitalWrite(LIGHTGREEN2, LOW);
      digitalWrite(LIGHTBLUE, LOW); digitalWrite(LIGHTBLUE2, LOW);
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
    //FIREMAN MODE
    else if(HSReceived == 'J'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      if (fireset){
        digitalWrite(LIGHTRED, HIGH); digitalWrite(LIGHTRED2, HIGH);
        digitalWrite(LIGHTGREEN, LOW); digitalWrite(LIGHTGREEN2, LOW);
        digitalWrite(LIGHTBLUE, LOW); digitalWrite(LIGHTBLUE2, LOW);
        digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
        digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
        fireset = !fireset;
      }
      else{
        digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, LOW);
        fireset = !fireset;
      }
    }
    //SLEEP MODE
    else if(HSReceived == 'K'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, LOW);
      digitalWrite(LIGHTGREEN, LOW); digitalWrite(LIGHTGREEN2, LOW);
      digitalWrite(LIGHTBLUE, HIGH); digitalWrite(LIGHTBLUE2, HIGH);
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
    //BUTTON ON / OFF
    else if(HSReceived == 'Z'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, LOW);
      digitalWrite(LIGHTGREEN, HIGH); digitalWrite(LIGHTGREEN2, HIGH);
      digitalWrite(LIGHTBLUE, LOW); digitalWrite(LIGHTBLUE2, LOW);
      digitalWrite(HAPTIC, HIGH); digitalWrite(HAPTIC2, HIGH);
      digitalWrite(HAPTIC3, HIGH); digitalWrite(HAPTIC4, HIGH);
      delay(120); // 120 ms haptic pulse
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
    //CUBE ON / OFF
    else if(HSReceived == 'Y'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, LOW);
      digitalWrite(LIGHTGREEN, HIGH); digitalWrite(LIGHTGREEN2, HIGH);
      digitalWrite(LIGHTBLUE, LOW); digitalWrite(LIGHTBLUE2, LOW);
      digitalWrite(HAPTIC, HIGH); digitalWrite(HAPTIC2, HIGH);
      digitalWrite(HAPTIC3, HIGH); digitalWrite(HAPTIC4, HIGH);
      delay(120); // 120 ms haptic pulse
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
    //SLIDER ON
    else if(HSReceived == 'X'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, LOW);
      digitalWrite(LIGHTGREEN, LOW); digitalWrite(LIGHTGREEN2, LOW);
      digitalWrite(LIGHTBLUE, HIGH); digitalWrite(LIGHTBLUE2, HIGH);
      digitalWrite(HAPTIC, HIGH); digitalWrite(HAPTIC2, HIGH);
      digitalWrite(HAPTIC3, HIGH); digitalWrite(HAPTIC4, HIGH);
    }
    //SLIDER OFF
    else if(HSReceived == 'W'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, LOW); digitalWrite(LIGHTRED2, LOW);
      digitalWrite(LIGHTGREEN, HIGH); digitalWrite(LIGHTGREEN2, HIGH);
      digitalWrite(LIGHTBLUE, LOW); digitalWrite(LIGHTBLUE2, LOW);
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
    //OTHER BUTTONS
    else if(HSReceived == 'V'){
      digitalWrite(HSerialTxControl, RS485Receive); // Init Transceiver for receiving
      digitalWrite(LIGHTRED, HIGH); digitalWrite(LIGHTRED2, HIGH);
      digitalWrite(LIGHTGREEN, LOW); digitalWrite(LIGHTGREEN2, LOW);
      digitalWrite(LIGHTBLUE, HIGH); digitalWrite(LIGHTBLUE2, HIGH);
      digitalWrite(HAPTIC, HIGH); digitalWrite(HAPTIC2, HIGH);
      digitalWrite(HAPTIC3, HIGH); digitalWrite(HAPTIC4, HIGH);
      delay(120); // 120 ms haptic pulse
      digitalWrite(HAPTIC, LOW); digitalWrite(HAPTIC2, LOW);
      digitalWrite(HAPTIC3, LOW); digitalWrite(HAPTIC4, LOW);
    }
  } // end of if read loop
}//--(end main loop )---

A.4 Calibration pseudo-example

Sensor readings vary slightly from one system to another and from one sensor to another, so the sensors used should first be calibrated before they are combined with the complete unit. The calibration can be achieved with the following sample code.

// the interval in mS
#define interval 2000
unsigned long time;
unsigned long time2;
#define mic_interval 100
unsigned long mic_time;
unsigned long mic_time2;

//LIGHT SENSOR
int LDRPIN = 0;
int LDRPIN_IN = 0;
int LDR_VAL = 0;

//VOLUME SENSOR
int MICPIN = 3;
int MICPIN_IN = 0;
int maxValue;
int minValue;
int MIC_VAL = 0;

void setup() {
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
  Serial.println("calibrating MIC n LDR");
  maxValue = 0; minValue = 1024;
  time = 0; time2 = 0;
  mic_time = 0; mic_time2 = 0;
}

// the loop routine runs over and over again forever:
void loop() {
  //delay(30000);
  Serial.println("start value");
  Serial.println(maxValue);
  Serial.println(minValue);
  Serial.println("===========");

  // read mic for mic_interval (100 ms)
  mic_time = millis();
  MICPIN_IN = analogRead(MICPIN);
  Serial.println("mid value: ");
  Serial.println(MICPIN_IN);
  MICPIN_IN = analogRead(MICPIN);
  Serial.println("mid value2: ");
  Serial.println(MICPIN_IN);
  while(mic_time2 < (mic_time + mic_interval)){
    mic_time2 = millis();
    MICPIN_IN = analogRead(MICPIN);
    Serial.println(MICPIN_IN);
    if (maxValue < MICPIN_IN) {
      maxValue = MICPIN_IN;
    }
    if (minValue > MICPIN_IN) {
      minValue = MICPIN_IN;
    }
  }
  Serial.println("===========");
  Serial.println(maxValue);
  Serial.println(minValue);
  Serial.println("===========");

  maxValue = (maxValue + minValue) / 2; // midpoint of the sampled window
  Serial.println("read value");
  Serial.println(maxValue);
  if(maxValue < 60) MIC_VAL = 1;
  else if(maxValue >= 60 && maxValue < 150) MIC_VAL = 2;
  else if(maxValue >= 150 && maxValue < 300) MIC_VAL = 3;
  else if(maxValue >= 300 && maxValue < 600) MIC_VAL = 4;
  else MIC_VAL = 5;

  //read LDR instantly
  LDRPIN_IN = analogRead(LDRPIN);
  if(LDRPIN_IN < 300) LDR_VAL = 1;
  else if(LDRPIN_IN >= 300 && LDRPIN_IN < 500) LDR_VAL = 2;
  else if(LDRPIN_IN >= 500 && LDRPIN_IN < 600) LDR_VAL = 3;
  else if(LDRPIN_IN >= 600 && LDRPIN_IN < 700) LDR_VAL = 4;
  else LDR_VAL = 5;

  //pre-calculations
  int a = MIC_VAL * LDR_VAL;
  int b = MIC_VAL - 1;
  int c = MIC_VAL - LDR_VAL;
  int d = abs(MIC_VAL - 5);
  int mic_ldr_contr = a + (b * (c + d));

  delay(3000);
  time = millis();
  while(time2 < (time + interval)){
    time2 = millis();
    Serial.println(mic_ldr_contr);
  }
  time2 = 0; mic_time2 = 0; maxValue = 0; minValue = 1024;
}

A.5 Linux Kernel requirement

For a USB microphone to operate with a Linux kernel, the options shown in Figure A.3 and Figure A.4 must be enabled.

Figure A.3: Kernel requirement for USB audio.

Figure A.4: USB 1.0 device support for USB 2.0 port.
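The options in the figures correspond to kernel configuration symbols. A sketch of the relevant entries using mainline Kconfig names; the exact symbols should be verified against the target kernel's menuconfig, as names shift between kernel versions:

```
# ALSA core and the USB audio class driver
# (Device Drivers -> Sound card support)
CONFIG_SOUND=y
CONFIG_SND=y
CONFIG_SND_USB_AUDIO=y

# Full-speed (USB 1.x) device support behind a USB 2.0 port:
# the EHCI host controller hands low/full-speed devices to a
# companion controller, so a companion driver such as OHCI
# must also be enabled.
CONFIG_USB_EHCI_HCD=y
CONFIG_USB_OHCI_HCD=y
```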
