
Introduction

The introduction of software-based laboratory information management systems (LIMS) has helped manage many aspects of the laboratory, from the movement of samples and people to the experiments being carried out. These management systems have been developed further to interact directly with devices in the laboratory, automating certain experiments (for example, through auto-samplers). Online systems allow remote management and control of experiments through web browsers and, where possible, through handheld devices such as mobile phones [1]. Remote control and review can benefit the experimenter in several ways: increasing safety where the experimental environment is unsafe to remain in, improving results where small changes in the environment greatly affect the measurements, and saving time through early detection of failures.

Web Tools, Mashups and Automated Lasers

Stephen Wilson & Jeremy Frey*

School of Chemistry, University of Southampton, Highfield, Southampton, SO17 1BJ, UK; [email protected]

Acknowledgments

Stephen Wilson thanks the EPSRC for a studentship. Stephen would also like to thank Professor Jeremy Frey and the research group for their support in the project.

References

1. J.M. Robinson, J.G. Frey, A.D. Reynolds, B.V. Bedi, and A.J. Stanford-Clark. From the laboratory to the mobile phone: Middleware for laboratory data acquisition using the publish/subscribe model. In e-Science 2005, 2005.

2. Andy J. Stanford-Clark. Integrating monitoring and telemetry devices as part of enterprise information resources. WebSphere MQ Integrator, March 2002.

3. Microsoft MSDN. Publish Subscribe. [web page] http://msdn.microsoft.com/en-us/library/ms978603.aspx. [Accessed 29 September 2009].

4. MQ - Telemetry Transport. MQTT.org. [web page] http://mqtt.org/. [Accessed 29 September 2009].

5. Simile. Simile. [web page] http://simile.mit.edu/. [Accessed 29 September 2009].

Fig. 1. The interaction between producers and consumers using the message broker

Message Brokering

In this project we have developed software to control and review an experiment that studies liquid interfaces. The data flow, of both captured data and control messages, is handled through a centralised message broker. Large results files are transferred through a parallel, higher-throughput system, preventing the message broker from becoming overloaded. This broker, called the Microbroker, is part of the IBM WebSphere [2] software suite. The Microbroker follows the publish/subscribe [3] architecture, controlling the flow of messages using MQ Telemetry Transport (MQTT) [4]. A data producer acquires data from a source and publishes it to the Microbroker; a data consumer receives messages on the topics to which it subscribes. The message broker decouples each piece of software from the overall system: should a new data source be introduced, no changes to the existing software are required, only a new topic becomes available. A new data consumer simply subscribes to the topics it is interested in, rather than requiring changes to all the existing data producers. This is shown in Fig. 1.
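The topic-based decoupling described above can be illustrated with a minimal in-process sketch (Python here for brevity; the project itself used clients talking to the Microbroker over MQTT, and the class and method names below are illustrative rather than any real broker API):

```python
# Minimal illustration of topic-based publish/subscribe decoupling.
# All names are hypothetical; a real deployment would use an MQTT client
# library talking to a broker such as the IBM Microbroker.

class Broker:
    """Routes each published message to every subscriber of its topic."""

    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Producers need no knowledge of consumers: they only name a topic.
        for callback in self.subscribers.get(topic, []):
            callback(topic, message)


broker = Broker()
received = []

# A consumer subscribes only to the topics it cares about.
broker.subscribe("lab/shg/temperature", lambda t, m: received.append((t, m)))

# A producer publishes without knowing who (if anyone) is listening.
broker.publish("lab/shg/temperature", "21.4 C")
broker.publish("lab/shg/humidity", "45 %")  # no subscriber: dropped

print(received)  # [('lab/shg/temperature', '21.4 C')]
```

Adding a new producer or consumer touches only the broker's topic namespace, never the other clients, which is the property the text relies on.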

Web tools

The development of Web 2.0 and XML has led to a number of tools and services that are now available as open-source software, many of which can be used in the research domain. Examples range from well-known websites such as YouTube and Flickr to more research-oriented tools such as CiteULike, Connotea or Delicious. These tools scale across domains because they expose an application programming interface (API): a predefined vocabulary for interacting with the service, and the means by which they are linked to the message broker. Although these tools are useful on their own, many can be combined to generate a more descriptive view of the data; such a combination of web services and tools is known as a mashup. In this work we have looked at tools provided by the Simile group [5], such as graphing and faceted-searching solutions, as well as services provided by Google.

The Liquid Interface Experiment – Control

In this project an experiment to study the liquid interface by second harmonic generation (SHG) was automated, using the Microbroker to control the flow of messages both to the apparatus and from the data sources. The SHG experiment consists of a number of components, each of which can be controlled through RS-232, and a data acquisition card. The main parts of the experimental set-up are a laser source, a shutter, two polarisation rotators (input and output) and a sample diluter; the set-up is shown in Fig. 2. Simple Perl scripts subscribe to a control channel of the message broker; when a message is received it is converted to the device-specific command and sent through RS-232 to the device. Because the message broker software handles all of the communication protocol, each script was approximately twenty lines of code. At present security is handled through restricted network access, although a dedicated security system is planned for the future. There are also hardware-based safety mechanisms to prevent injury and damage.
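The control scripts share a common shape: subscribe, translate, forward. A sketch of that translation step (in Python rather than the original Perl; the command table, message format and serial stand-in below are all invented for illustration):

```python
# Sketch of a broker-to-device control script: a message arriving on the
# control topic is translated into a device-specific RS-232 command.
# The command bytes and message vocabulary are hypothetical; the real
# scripts were short Perl programs using the Microbroker client.

# Hypothetical mapping from broker messages to shutter commands.
SHUTTER_COMMANDS = {
    "open": b"SH1\r",   # device-specific "open shutter" command
    "close": b"SH0\r",  # device-specific "close shutter" command
}

def handle_control_message(message, serial_port):
    """Translate a broker control message and write it to the device."""
    command = SHUTTER_COMMANDS.get(message.strip().lower())
    if command is None:
        return False  # unknown message: ignore rather than guess
    serial_port.write(command)
    return True

# Stand-in for a serial port (e.g. pyserial's Serial) so the sketch runs
# without hardware attached.
class FakeSerial:
    def __init__(self):
        self.written = []

    def write(self, data):
        self.written.append(data)

port = FakeSerial()
handle_control_message("open", port)  # translated and forwarded
handle_control_message("warp", port)  # unknown message, ignored
print(port.written)  # [b'SH1\r']
```

Keeping the script to a lookup plus a write is what makes each one only around twenty lines: the broker library handles all transport concerns.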

Fig. 2. The SHG laser apparatus showing the components of the system and how they communicate with the message broker

The Liquid Interface Experiment – Mashup review

The experimental review mashup website was developed in PHP and uses a number of services available via the web from other developers. To allow faster searching of experiments, the Simile Timeline software is used. Timeline provides an overview of all the experiments carried out; the timeline can be dragged with the mouse to move through time. The granularity of the timeline (such as hour, day or year) can be customised and combined to speed up searching. Experiments within the timeline can be colour-coded to distinguish between certain nodes. Once an experiment has been selected, a number of tools generate a view of the experiment in its current state. The experimental metadata is displayed in tabular form using the Google Docs API; each cell can be referenced by a single URI, allowing faster abstraction of the metadata. The experimental results are plotted using the Google Chart API, which generates a plot from a single URL. The plot can also provide mouse-over functionality to display information about each point, and embedded links within the plot lead to further data about each data point. The environmental data associated with the experiment, such as temperature, humidity and light levels, can also be reviewed. If the experiment is still running, the most recent values are shown in a tabular view. If the experiment has completed, the Simile Timeplot software is used to generate an interactive plot of this environmental data. Timeplot reads CSV and XML data files and generates a time plot of the data, again with mouse-over functionality. Fig. 3 shows a number of screenshots from the review website.
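The "single URL generates a plot" idea can be sketched as follows (the data values are invented; `cht`, `chs` and `chd` are standard parameters of the image-based Google Chart API of that era, though the exact URL a production site builds would carry more options):

```python
# Build a Google Chart API URL that renders a list of results as a
# line chart. The numbers below are made-up sample data.
from urllib.parse import urlencode

def chart_url(values, width=400, height=200):
    """Return a single URL that renders the data as a line chart."""
    params = {
        "cht": "lc",                                     # line chart
        "chs": f"{width}x{height}",                      # image size
        "chd": "t:" + ",".join(str(v) for v in values),  # text-encoded data
    }
    return "https://chart.googleapis.com/chart?" + urlencode(params)

url = chart_url([10, 35, 60, 42])
print(url)
```

Because the whole plot is addressed by one URL, it can be embedded in the PHP review page as an ordinary `<img>` tag, with no server-side plotting code.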

Fig. 3. Various parts of the SHG experimental mashup review website

Conclusions

The use of the Microbroker as a standardised communication layer has allowed rapid development of software for both handling acquired data and sending control messages. As each component is required to interact only with the broker, the system can be set up as a distributed deployment. This allows machine-demanding processes, such as data acquisition, to run on a remote dedicated server, reducing the load on the other machines. It has also allowed faster integration of new components: adding a new control device required only a new topic. The use of parallel transfer systems for larger data files has prevented the broker from becoming overloaded.

It has been shown that a number of web tools and services can be used in the research domain to provide a detailed view of experimental data. The ease of use of these tools is greatly increased by a well-documented API. When combined, these tools can provide more detail on the data than when used alone. The Simile tools are all developed with inter-connectivity in mind and are therefore easy to integrate with each other.

[Fig. 1 diagram labels: Message Broker; Data Producers; Data Consumers – Result Modelling, Result Review, Experiment Timeline, Data Point Review, Environmental Conditions.]

[Fig. 2 diagram labels: Message Broker; Sample; Laser source; Laser shutter; Polarisation rotator (input and output); Detector; Dual syringe diluter; Database.]