


Also in this issue:

Keynote
Unconventional Computation
by Susan Stepney

Research and Innovation
Self-Organizing P2P Systems Inspired by Ant Colonies
by Carlo Mastroianni

Simulation and Assessment of Vehicle Control Network Systems
by Alexander Hanzlik and Erwin Kristen

ERCIM NEWS
European Research Consortium for Informatics and Mathematics
www.ercim.eu

Number 85, April 2011

Special theme: Unconventional Computing Paradigms


ERCIM NEWS 85 April 2011

ERCIM News is the magazine of ERCIM. Published quarterly, it reports on joint actions of the ERCIM partners, and aims to reflect the contribution made by ERCIM to the European Community in Information Technology and Applied Mathematics. Through short articles and news items, it provides a forum for the exchange of information between the institutes and also with the wider scientific community. This issue has a circulation of about 9,000 copies. The printed version of ERCIM News has a production cost of €8 per copy. Subscription is currently available free of charge.

ERCIM News is published by ERCIM EEIG
BP 93, F-06902 Sophia Antipolis Cedex, France
Tel: +33 4 9238 5010, E-mail: [email protected]
Director: Jérôme Chailloux
ISSN 0926-4981

Editorial Board:

Central editor:

Peter Kunz, ERCIM office ([email protected])

Local Editors:

Austria: Erwin Schoitsch, ([email protected])

Belgium: Benoît Michel ([email protected])

Denmark: Jiri Srba ([email protected])

Czech Republic: Michal Haindl ([email protected])

France: Bernard Hidoine ([email protected])

Germany: Michael Krapp ([email protected])

Greece: Eleni Orphanoudakis ([email protected])

Hungary: Erzsébet Csuhaj-Varjú ([email protected])

Ireland: Dimitri Perrin ([email protected])

Italy: Carol Peters ([email protected])

Luxembourg: Patrik Hitzelberger ([email protected])

Norway: Truls Gjestland ([email protected])

Poland: Hung Son Nguyen ([email protected])

Portugal: Paulo Ferreira ([email protected])

Spain: Christophe Joubert ([email protected])

Sweden: Kersti Hedman ([email protected])

Switzerland: Harry Rudin ([email protected])

The Netherlands: Annette Kik ([email protected])

United Kingdom: Martin Prime ([email protected])

W3C: Marie-Claire Forgue ([email protected])

Contributions

Contributions must be submitted to the local editor of your country.

Copyright Notice

All authors, as identified in each article, retain copyright of their work.

Advertising

For current advertising rates and conditions, see http://ercim-news.ercim.eu/ or contact [email protected]

ERCIM News online edition

The online edition is published at http://ercim-news.ercim.eu/

Subscription

Subscribe to ERCIM News by sending email to [email protected] or by filling out the form at the ERCIM News website: http://ercim-news.ercim.eu/

Next issue

July 2011, Special theme: “ICT for Cultural Heritage”

Editorial Information

Keynote

Unconventional Computation

The mathematician Stanislaw Ulam is famously quoted as saying that using a term like nonlinear science is like referring to the bulk of zoology as the study of non-elephant animals [1]. Unconventional, or non-classical, computation is a similar term: the study of non-Turing computation. But how big is its associated field? Whereas zoology has the advantage of studying many kinds of non-elephants, non-classical computer scientists are currently restricted to studying a few relatively primitive, or dimly seen, or merely hypothesised, devices.

The elephant in the room here is the classical Turing machine, with its 70-plus years of detailed study. Some maintain that there are only elephants, and that the rest of us are like the blind men in the Indian parable, claiming we have found snakes, and ropes, and walls instead. However, I prefer the reversed parable, illustrated with a cartoon of several blind men groping at said snake, and rope, and wall, and confidently declaring, “yes, it’s an elephant!”

The classical Turing machine was developed as an abstraction of how human “computers”, clerks following predefined and prescriptive rules, calculated various mathematical tables. The abstraction has proved extremely powerful, yet, as an abstraction, it omits certain features.

Physics

Despite being formulated post-relativity and post-quantum mechanics, classical Turing computation is essentially based in classical physics. This is understandable given the source of its abstraction, but today’s nano-scale computer components are embodied in the quantum world. Considerable effort is taken to make them classical again. The study of quantum computing and quantum communication is taking quantum weirdness seriously, and building devices simply impossible to achieve in the classical Turing model. For example, John von Neumann noted that “Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin” [2], yet quantum systems are inherently probabilistic. And the features of General Relativity, where observers, and different parts of the computational device, may experience radically different proper times, have barely been addressed.

Parallelism

Weird physics isn’t the only property of the real world not addressed by the classical model. The real world is massively parallel, yet in the classical Turing model parallel computers are no more powerful than sequential ones. A single human clerk, following rules, can calculate exactly the same numerical tables as a whole army of clerks. Of course, the army can do it faster, but in the Turing model, this advantage is swallowed up by simple space versus time tradeoffs. When we are considering real-time embedded computer systems, interacting with the real world, issues of time become crucial, and parallelism has to be used to meet hard deadlines. One reason parallelism is still a relatively niche study is the enormous success in manufacturing computers: Moore’s law [3] has enabled sequential machine performance to grow fast enough to cater to our growing everyday needs, and only specialists have needed to go parallel. But recently, Moore’s law has taken a new direction: instead of individual chips getting faster, they are going multi-core. If Moore’s law continues in this direction, we will soon have kilo-core chips, and we now have to take this everyday parallelism seriously. This will benefit not just our everyday devices, but also massively parallel cluster-based computation, up to vastly parallel Avogadro-scale computation.

Interaction

The classical Turing model is “ballistic”: we give the clerk a set of instructions, shut them in a room, and wait for the answer. Interactive computing is a “guided” model: there is a continual interactive feedback loop between the computation and the real world. The computation here is an open system of closely coupled reciprocal causation, constantly receiving input and changing its behaviour based on that input, constantly providing output and affecting its environment, and hence its own subsequent inputs, through that output.
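The “ballistic” versus “guided” distinction can be sketched in a few lines of code. The toy thermostat environment below, and all its names and dynamics, are illustrative assumptions, not something from the article:

```python
# A sketch contrasting the two models with a toy thermostat;
# the environment and its numbers are purely illustrative.

def ballistic(program, data):
    # Classical Turing-style: all input is supplied up front,
    # and the answer appears only when the program halts.
    return program(data)

def guided(controller, environment, state, n_steps):
    # Interactive model: each output (action) changes the environment,
    # and hence the computation's own subsequent inputs.
    for _ in range(n_steps):
        action = controller(state)          # output based on current input
        state = environment(state, action)  # environment reacts to output
    return state

# Toy environment: the room cools by 0.5 degrees per step;
# heating adds 1.5 degrees.
def thermostat(temp):
    return "heat" if temp < 20.0 else "idle"

def room(temp, action):
    return temp + (1.5 if action == "heat" else 0.0) - 0.5

closed_answer = ballistic(lambda x: sum(range(x + 1)), 10)  # a closed computation
final_temp = guided(thermostat, room, state=12.0, n_steps=50)
# The feedback loop settles the temperature close to the 20-degree setpoint.
```

The ballistic call has no further contact with its environment once started, whereas the guided loop never produces a final answer at all: its behaviour is the ongoing interaction itself.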

Susan Stepney, Department of Computer Science, and Centre for Complex Systems Analysis, University of York.

[1] Campbell, Farmer, Crutchfield, Jen. “Experimental Mathematics: the role of computation in nonlinear science”. Comms. ACM 28(4):374-84, 1985

[2] John von Neumann. “Various Techniques Used in Connection With Random Digits.” In Householder et al., Monte Carlo Method, National Bureau of Standards, 1951

[3] Gordon E. Moore. “Cramming more components onto integrated circuits”. Electronics. 38(8):114-7, 1965

Nature

The Turing model was inspired by a single paradigm: the actions of human clerks. Unconventional computation can be inspired by the whole of wider nature. We can look to physics (from general relativity and quantum mechanics, to annealing processes, analogue computing, and on to fully embodied computation), to chemistry (reaction-diffusion systems, complex chemical reactions, DNA binding), and to biology (bacteria, flocks, social insects, evolution, growth and self-assembly, immune systems, neural systems), to mention just a few.

Even if it were to turn out that the theoretical power of unconventional computers is no more than classical, the practical power has enormous potential. New problems require new perspectives, new formulations, new conceptual frameworks, new mechanisms, new approaches. Unconventional computation offers these novel opportunities.

Susan Stepney


Contents

2 Editorial Information

KEYNOTE

2 Unconventional Computation
by Susan Stepney, Department of Computer Science, and Centre for Complex Systems Analysis, University of York

JOINT ERCIM ACTIONS

6 ERCIM “Alain Bensoussan” Fellowship Programme Announcement

7 AXES - A New ERCIM-managed Project

7 Nominate a promising young researcher for the 2011 ERCIM Cor Baayen Award!

8 fet11 - The European Future Technologies Conference and Exhibition

9 ERCIM Opens to Multiple Members per Country

9 ERCIM MUSCLE Working Group at the World Congress 2011

SPECIAL THEME

This special theme section on “Unconventional Computing Paradigms” has been coordinated by Jiri Vala, National University of Ireland Maynooth and Giancarlo Mauri, Università di Milano-Bicocca

Introduction to the Special Theme

10 Quantum Computing
by Jiri Vala

11 Molecular and Cellular Computing
by Giancarlo Mauri

Quantum Computing

12 Quantum Information and the Emergence of Quantum Engineering
by Thomas Busch

13 Topological Quantum Computation: The Quantum Knot in the Handkerchief
by Joost Slingerland

14 Algorithms via Quantum Random Walks
by Michael Mc Gettrick

15 Unentangled Quantum Proofs and their Applications
by Ashley Montanaro

16 Position-Based Quantum Cryptography
by Harry Buhrman, Serge Fehr and Christian Schaffner

18 High-Speed Quantum Key Distribution and Beyond
by Martin Stierle and Christoph Pacher

Molecular and Cellular Computing

20 Membrane Computing – Theory and Applications
by Marian Gheorghe

21 On Synthesizing Replicating Metabolic Systems
by Giuditta Franco and Vincenzo Manca

22 Semantics, Causality and Mobility in Membrane Computing
by Oana Agrigoroaiei, Bogdan Aman and Gabriel Ciobanu

23 From Energy-based to Quantum (inspired) P systems
by Alberto Leporati

25 The Spanish Network on Biomolecular and Biocellular Computing: Bio-inspired Natural Computing in Spain
by Mario de Jesús Pérez Jiménez, Alfonso Ortega de la Puente and José M. Sempere

26 Expanding Formal Language and Automata Theory: Bio-inspired Computational Models
by Erzsébet Csuhaj-Varjú and György Vaszil


27 Capturing Biological Frequency Control of Circadian Clocks by Reaction System Modularization
by Thomas Hinze, Christian Bodenstein, Ines Heiland, Stefan Schuster

29 Engineering with Biological Systems
by Fernando Arroyo, Sandra Gómez and Pedro Marijuán

30 Artificial Wet Neuronal Networks from Compartmentalised Excitable Chemical Media
by Gerd Gruenert, Peter Dittrich and Klaus-Peter Zauner

33 New Robustness Paradigms: from Nature to Computing
by Giancarlo Mauri and Ion Petre

34 Towards a Chemistry-Inspired Middleware to Program the Internet of Services
by Jean-Louis Pazat, Thierry Priol and Cédric Tedeschi

35 Self-Organising Adaptive Structures: The Shifter Experience
by Carlos E. Cuesta, J. Santiago Pérez-Sotelo and Sascha Ossowski

37 Alchemy of Services
by Claudia Di Napoli, Maurizio Giordano and Zsolt Németh

38 BACTOCOM: Bacterial Computing with Engineered Populations
by Martyn Amos and the BACTOCOM consortium

40 Bio-Sensors: Modelling and Simulation of Biologically Sensitive Field-Effect-Transistors
by Alena Bulyha, Clemens Heitzinger and Norbert J Mauser

RESEARCH AND INNOVATION

This section features news about research activities and innovative developments from European research institutes

42 Self-Organizing P2P Systems Inspired by Ant Colonies
by Carlo Mastroianni

43 Dialogue-Assisted Natural Language Understanding for a Better Web
by Mihály Héder

45 Content-Based Tile Retrieval System
by Pavel Vácha and Michal Haindl

46 Intelligent Design of Multi-Device Service Front-Ends with the Support of Task Models
by Fabio Paternò, Carmen Santoro and Lucio Davide Spano

47 Training Crisis Managers in Strategic Decision-Making
by Amedeo Cesta, Gabriella Cortellessa, Riccardo De Benedictis, and Keith Strickland

48 Security and Resilience in Cognitive Radio Networks
by Vangelis Angelakis, Ioannis Askoxylakis, Scott Fowler, David Gundlegård, Apostolos Traganitis and Di Yuan

50 Smart Cameras for Cities of the Future: Simultaneous Counting of Pedestrians and Bikes
by Ahmed Nabil Belbachir, Norbert Brändle and Stephan Schraml

51 Simulation of Urban Traffic including Bicycles
by Jelena Vasic and Heather J. Ruskin

53 Simulation and Assessment of Vehicle Control Network Systems
by Alexander Hanzlik and Erwin Kristen

55 Filling the Gap between ICT and HydroMeteo Research Communities: the DRIHMS Project
by Alfonso Quarati, Nicola Rebora, Michael Schiffers

56 The SOCIETIES Project: Combining Pervasive Computing with Social Networks
by Kevin Doolin, Pierfranco Ferronato and Stefania Marrara

EVENTS

58 German-Austrian W3C Office at DFKI

58 Euro-India cooperation in Future Internet Research Experiments
by Sathya Rao

60 DL.org Workshop on Digital Libraries, Open Access and Interoperability Strategies
by Stephanie Parker, Giuseppina Vullo

61 Announcements

IN BRIEF

63 German Climate and Environment Innovation Prize for Researchers at Fraunhofer SCAI

63 E-services for the Elderly at Home

63 Van Wijngaarden Award 2011 for Éva Tardos and John Butcher

63 CWI Spin-off Company VectorWise sold to Ingres


Joint ERCIM Actions


ERCIM “Alain Bensoussan” Fellowship Programme Announcement

Meet us at fet11 in Budapest 4-6 May - www.fet11.eu


AXES - A New ERCIM-managed Project

“Access to Audiovisual Archives” (AXES) is the name of a new large-scale integrating project (IP) that will investigate innovative solutions for audiovisual content exploration. The project was launched by a kick-off meeting on January 13-14, 2011 at Katholieke Universiteit Leuven (Belgium).

The goal of AXES is to develop tools that provide various types of user with new engaging ways to interact with audiovisual libraries, helping them discover, browse, navigate, search and enrich archives. In particular, apart from a search-oriented scheme, the project will explore how suggestions for audiovisual content exploration can be generated via a myriad of information trails crossing the archive. This will be approached from three perspectives (or axes): users, content, and technology.

Innovative indexing techniques will be developed in close cooperation with a number of user communities through tailored use cases and validation stages. Rather than just starting new investments in technical solutions, the co-development will investigate innovative paradigms of use and novel navigation and search facilities. Targeted users are media professionals, educators, students, amateur researchers and home users.

Based on an existing Open Source service platform for digital libraries, novel navigation and search functionalities will be offered via interfaces tuned to user profiles and workflow. For this purpose, AXES will develop tools for content analysis deploying weakly supervised classification methods.

Information in scripts, audio tracks, wikis or blogs will be used for the cross-modal detection of people, places, events, etc., and for link generation between audiovisual content. Users will be engaged in the annotation process: with the support of selection and feedback tools, they will enable the gradual improvement of tagging performance. AXES technology will open up audiovisual digital libraries, increasing their cultural value and their exposure to the European public and academia at large.

The consortium is a perfect match to the multi-disciplinary nature of the project, with professional content owners, academic and industrial experts in audiovisual analysis, retrieval, and user studies, and partners experienced in system integration and project management. The project comprises thirteen partners including the ERCIM members Fraunhofer Institute for Intelligent Analysis and Information Systems IAIS, INRIA, Katholieke Universiteit Leuven, and Dublin City University. The project started on 1 January 2011 and is managed by ERCIM.

Link: http://www.axes-project.eu/

Please contact:
Philippe Rohou, AXES project coordinator, ERCIM Office
E-mail: [email protected]

Tinne Tuytelaars, AXES scientific coordinator, KU Leuven, Belgium
E-mail: [email protected]


Nominate a promising young researcher for the 2011 ERCIM Cor Baayen Award!

Deadline for nominations: 30 April 2011

Each year, ERCIM awards a promising young researcher in computer science and applied mathematics with the Cor Baayen Award. The winner of the award receives €5,000, together with an award certificate. The award will be presented at a ceremony during the 2011 ERCIM fall meetings.

Nominees must have carried out their work in one of the ‘ERCIM countries’ (ie, Austria, Belgium, Czech Republic, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Luxembourg, Norway, Poland, Portugal, Spain, Sweden, Switzerland, The Netherlands and the United Kingdom). Nominees must have been awarded their PhD (or equivalent) after 30 April 2008.

Nominations should be made by a staff member of the university or research institute where the nominee is undertaking research. Self nominations are not accepted.

More information: http://www.ercim.eu/activity/cor-baayen-award



fet11 - The European Future Technologies Conference and Exhibition
Budapest, 4-6 May 2011

The European Future Technologies Conference and Exhibition 2011 is the second edition of a new forum dedicated to frontier research in information and communication technologies.

Following the first FET conference held in 2009 in Prague, fet11 is designed to be highly interactive, engaging a broad and multi-disciplinary community. It will involve key policy makers, and features a mix of a panel discussion, keynote speakers, scientific sessions, and poster sessions.

Keynotes

Seven visionary keynotes will be presented:
• Josh Bongard: “How Evolution Shapes the Way Roboticists Think”
• Jean-Philippe Bouchaud: “The endogenous dynamics of markets: price impact and feedback loops”
• Rodney Douglas: “Constructive Cortical Computation”
• Artur Ekert: “Is the age of computation yet to begin?”
• John Pendry: “The Science of Invisibility”
• Gábor Prószéky: “The (hopefully near) future of human language technologies”
• Claire Tomlin: “Mathematical models to help understand developmental biology and cancer”

FET Flagship Pilots

Neelie Kroes, Vice-President of the European Commission, will officially launch six FET Flagship pilot actions which will prepare for FET Flagships. FET Flagships are science-driven, large-scale research partnerships, envisioned to run for at least 10 years starting in 2013.

Exhibits

A hands-on exhibition featuring 28 exhibits will run throughout, in parallel to the conference, showcasing the latest research developments in future and emerging information technologies including flying robots, graphene-based devices, and many others.

A Science Café will be open during the whole conference and will offer a casual meeting place for discussions and ignite-style short presentations.

More than a traditional scientific conference, fet11 is thus a unique intellectual event which aims at creating an atmosphere of excitement for the opportunities presented by FET-type research in Europe through its distinctive content and structure.

Opportunity for Young Researchers

fet11 offers a unique opportunity for doctoral students and young researchers:
• find out what the current state of thinking is from leaders across a broad range of future information science and technology
• share and discuss your ideas in one of our Science Café discussions with leading scientists from around the world
• take your chance to tell leading politicians what you think about the future of science in Europe

Students (limited to 200), journalists and science bloggers can attend fet11 free of charge.

fet11 is organised by the European Commission Future and Emerging Technologies program, ERCIM and SZTAKI.

More information: http://www.fet11.eu


ERCIM Opens to Multiple Members per Country

The ERCIM consortium is currently re-organising into a two-tier structure: a non-profit association (AISBL) open to any suitable institution based on criteria of excellence, and the current European Economic Interest Group (EEIG) composed of a few members which will act as the host of W3C Europe and the ERCIM office. The association will be able to drive the ERCIM mission without the current restriction of one member organisation per country.

The creation of the ERCIM AISBL is progressing rapidly and it is expected that the first general assembly consisting of all current ERCIM Members will be held in Trondheim on 9-10 June 2011.

Why participate in ERCIM?

ERCIM is a European-wide open network of centres of excellence in ICT and is internationally recognized as a representative organisation in its field, providing access to all major ICT research groups in Europe.

ERCIM has a track record of successful initiatives promoting ICT research and cooperation in Europe and beyond. Among these are:
• the Cor Baayen Award, awarded each year to a promising young researcher in computer science and applied mathematics. The award consists of a cheque for 5000 Euro together with an award certificate.
• Alain Bensoussan Fellowship Program, supported by the EC, offers fellowships for PhD holders from all over the world hosted in leading European Research Institutes.
• ERCIM News, the quarterly magazine of ERCIM with a print distribution of over 9000 copies, as well as free online access. Through short articles and news items on research and innovation, it provides a forum for the exchange of information between the institutes and also with the wider scientific community.
• Cooperation with professional bodies in domains of interest, such as the European Mathematical Society, the European Telecommunications Standards Institute (ETSI), the European Science Foundation (ESF)
• Organisation of strategic seminars on innovation
• White papers and reports for decision-making bodies such as the EC, eg Cooperations in Grid research between EU and South East Asian countries; Internet in the Mediterranean Region; Tunisian Research in IT; HPCN in the southern Mediterranean Region; Evaluations for the World Bank’s INFODEV Programme
• Organisation of strategic workshops, eg Beyond-The-Horizon, EU-Mediterranean co-operations, EU-US National Science Foundation strategic workshops
• Research project management support: ERCIM has successfully applied for and managed numerous leading ICT research projects in the range of European Framework Programmes
• Working Groups: ERCIM runs working groups on focussed topics ranging from Computing and Statistics to Software Evolution, preparing the way for excellent research in new domains.

Organisations interested in joining ERCIM are invited to contact the ERCIM office to express their interest. Entry criteria and the process for joining are being finalised and it should be possible to formulate official applications for ERCIM membership from June 2011.

Link: http://www.ercim.eu

Please contact: ERCIM office, E-mail: [email protected]

ERCIM MUSCLE Working Group at the World Congress 2011
New York City, USA, 30 August - 3 September 2011

The ERCIM Working Group on Multimedia Understanding through Semantics, Computation and Learning (MUSCLE) is involved in the scientific support and organization of the World Congress “The frontiers in intelligent data and signal analysis”, DSA. The congress features three international conferences plus a number of tutorials, workshops and an industrial exhibition.

The congress combines three international conferences:
• International Conference on Machine Learning and Data Mining, MLDM,
• Industrial Conference on Data Mining, ICDM, and
• International Conference on Mass Data Analysis of Images and Signals in Medicine, Biotechnology, Chemistry and Food Industry, MDA.

A broad range of applications are presented at this congress, including marketing, medical and life-sciences, industry, and agriculture. The World Congress thus responds to all novel aspects of data analysis. Leading scientists from research and industry will be presenting their work and discussing novel ideas. The World Congress is also a platform for companies, decision makers from industry and market, and for networking. Next to the scientific program, interested companies and representatives from industry have the possibility to present their projects on intelligent data and signal analysis.

The main sponsor and organizer of DSA 2011 is Ibai, the Institute of Computer Vision and Applied Computer Sciences, based in Leipzig, Germany and New York City, USA, and member of the ERCIM MUSCLE Working Group. Several MUSCLE members have been collaborating in the congress organization, in the framework of MUSCLE’s training and scientific dissemination initiatives.

More information:
http://www.worldcongressdsa.com/
http://wiki.ercim.eu/wg/MUSCLE/


Quantum Computing
by Jiri Vala

We are living at the beginning of a quantum revolution which will no doubt have a profound impact on all aspects of our lives. Quantum systems, which were once restricted to academic research, are now becoming a technological reality. This inevitable evolution is particularly well documented in the context of information and communication technologies where the first quantum devices are becoming available commercially.

This special theme of ERCIM News brings quantum information science and technology closer to its readers via three invited and three contributed articles which explore various aspects of this fascinating field of research and its exciting applications.

Quantum computation, which is naturally a main focus of this issue, represents a highly unconventional and also extremely powerful computation paradigm. Quantum computation can provide immense computing power that reaches beyond the capabilities of any conventional computer. This power derives from the exotic properties of quantum systems, from quantum ‘weirdness’ which has no analog in the world of classical mechanics and thus no analog in the world we can perceive directly by our senses. This ‘weirdness’ reflects the counterintuitive facts that distinct quantum states of a system can coexist in what is called a quantum superposition and can also be entangled by strong correlations which have no classical analog.
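The two properties named here can be made concrete with a little arithmetic. The pure-Python sketch below (amplitudes as plain lists, all names illustrative, not from the article) checks entanglement via the standard determinant condition for two-qubit product states:

```python
from math import sqrt, isclose

# Illustrative sketch of superposition and entanglement with bare amplitudes.

# A qubit in equal superposition of |0> and |1>: both outcomes coexist
# until measurement, each occurring with probability |amplitude|^2.
plus = [1 / sqrt(2), 1 / sqrt(2)]
probs = [abs(a) ** 2 for a in plus]   # each outcome: probability ~0.5

# A two-qubit Bell state (|00> + |11>)/sqrt(2), with amplitudes ordered
# as [|00>, |01>, |10>, |11>].
bell = [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]

def is_product_state(amp):
    # Two qubits are unentangled exactly when the 2x2 amplitude matrix
    # [[a00, a01], [a10, a11]] has zero determinant, i.e. the state
    # factors into two independent single-qubit states.
    return isclose(amp[0] * amp[3] - amp[1] * amp[2], 0.0, abs_tol=1e-12)

assert is_product_state([1.0, 0.0, 0.0, 0.0])  # |00>: no correlations
assert not is_product_state(bell)  # entangled: correlations with no classical analog
```

The Bell state fails the factoring test: measuring either qubit instantly fixes the outcome of the other, which is the strong correlation the text refers to.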

An introduction to quantum information and related quantum engineering challenges is provided in the invited article by Thomas Busch from University College Cork, who explores the essential principles which are common to all quantum information processing.

The realization of quantum computing faces highly complex challenges, the most important one being protection of quantum information against errors. Engineering approaches have been developed allowing us to achieve fault-tolerant quantum computation in principle. They are based on concatenated quantum error correction schemes, and as such, require a large amount of physical resources and extreme control over each component of the quantum computing process. An attractive alternative, topological quantum computation, has recently emerged which enables fault-tolerance to be built into quantum computing hardware.
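The redundancy idea behind such error-correction schemes, and the resource overhead they imply, can be sketched with the classical ancestor of these codes. This is an illustrative analogy only, not the quantum construction itself:

```python
import random

# Illustrative analogy: the classical 3-bit repetition code. Real quantum
# codes protect superpositions without reading them out, but the resource
# overhead (several physical units per logical unit) is visible even here.

def encode(bit):
    # One logical bit is stored redundantly in three physical bits.
    return [bit, bit, bit]

def noisy_channel(bits, p_flip, rng):
    # Each physical bit flips independently with probability p_flip.
    return [b ^ 1 if rng.random() < p_flip else b for b in bits]

def decode(bits):
    # Majority vote: correct as long as at most one bit flipped.
    return 1 if sum(bits) >= 2 else 0

rng = random.Random(1)
p, trials = 0.05, 10_000
failures = sum(
    decode(noisy_channel(encode(0), p, rng)) != 0 for _ in range(trials)
)
logical_rate = failures / trials
# The logical error rate (roughly 3*p**2) falls far below the physical
# rate p. Concatenating -- re-encoding each physical bit with the same
# code -- suppresses errors further at triple the cost each level,
# which is why fault tolerance demands so many physical resources.
```

The same trade-off drives the schemes mentioned above: each level of concatenation buys exponentially better protection at multiplicative hardware cost.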

Topological quantum computation is presented in the invited article by Joost Slingerland from National University of Ireland, Maynooth. It explains how the quantum ‘weirdness’ mentioned above can be even more exotic in certain two-dimensional quantum systems which permit quasiparticles called anyons, and how knotting trajectories of these anyons can be exploited for fault-tolerant quantum computation.

Efficient quantum algorithms are key to harvesting the enormous power of quantum computing. Indeed, quantum information science was greatly accelerated by Peter Shor’s discovery of a quantum algorithm for large number factorization which provides an exponential speedup of computation compared to any conventional (classical) algorithm. Tremendous progress in the development of quantum algorithms has been made since then. The development of new algorithms is indeed closely related to the study of novel paradigms for quantum computation.

Quantum random walks and their role in the development of quantum algorithms are presented in the invited article by Michael Mc Gettrick from National University of Ireland, Galway. The detailed understanding of quantum random walks, including for example the effect of memory on their controllability, is an important step towards new quantum algorithms.

The role of entanglement in the context of quantum algorithms and complexity is addressed in the article by Ashley Montanaro from the University of Cambridge, who investigates unentangled quantum proofs in Merlin-Arthur games and their applications for efficiently solving hard computational problems. Interesting results in this area have been obtained, for example, for the 3-SAT problem.

Important developments have taken place in other areas of quantum information processing, particularly in quantum communication and cryptography. While the realization of quantum computing as the ultimate quantum information processor is challenging, quantum cryptography and its application to quantum key distribution already provide commercially available quantum security.

New quantum cryptographic methods are explored in the article by Harry Buhrman, Serge Fehr, and Christian Schaffner from CWI. Quantum cryptography is also the topic of the article by Martin Stierle and Christoph Pacher from the Austrian Institute of Technology, which focuses on high-speed quantum key distribution and presents important recent developments towards commercial quantum cryptography.

Please contact:
Jiri Vala
National University of Ireland Maynooth / IUA
E-mail: [email protected]

Introduction to the special theme: Unconventional Computing Paradigms



Molecular and Cellular Computing
by Giancarlo Mauri

In 1994, a pioneering experiment by Leonard Adleman showed that DNA could be used to solve mathematical problems by exploiting its capability to store and process information and using molecular biology tools to perform arithmetic or logic operations on the encoded data. Since then, the interest of researchers in the area of (bio-)molecular computing has been growing continuously, taking the discipline well beyond the original idea of using biological molecules as fundamental components of computing devices.

Within molecular computing, much work has been done on the study of theoretical models of DNA computation and their properties, such as Tom Head's splicing systems, together with experimental work. Convergences with nanosciences, nanoengineering and synthetic biology have also been explored and exploited.

One of the most significant achievements of molecular computing, for example, has been its contribution to the understanding of self-assembly, which is among the key concepts in nanosciences.

The next step has been the passage from the molecular level to the cellular level. The living cell is perhaps the most outstanding self-organized, hierarchical, adaptable, economic and robust information processing system that we know, and cellular processes such as gene regulatory networks, protein-protein interaction networks, biological transport networks and signalling pathways can be studied from the point of view of information processing. This knowledge will also enable us to harness the cell as a "nano-bot", which can be programmed to carry out specific tasks such as targeted drug delivery, housekeeping of chemical factories and coordination of bio-film scaffolding and self-assembling.

The following contributions on the theme of molecular and cellular computing describe some of the projects carried out in Europe, and fall into three groups:

• The first group of papers mainly concerns membrane computing, a new unconventional computing model that abstracts from the structure and functionality of the living cell, illustrating both the theoretical basis of the models introduced and applications in various fields. M. Gheorghe gives an overview of the field, then G. Franco and V. Manca describe a research project that aims to synthesize a minimal cell, starting from a model based on membrane systems. The third paper (Agrigoroaiei, Aman and Ciobanu) discusses operational semantics, and the notions of causality and mobility in membrane systems. Possible connections with quantum computing are discussed by Leporati. The next paper describes the goals and activities of a Spanish network on Biomolecular and Bio-cellular Computing, which is not limited to membrane systems, but includes networks of bioinspired processors, synthetic biology, computational biology and bioinformatics. The last paper in this group (Csuhaj-Varjú and Vaszil) is of a theoretical nature, and relates membrane systems to automata, focusing in particular on the distributed organization of components of living systems.

• The papers in the second group focus on biological information processing based on chemical reactions (Hinze, Bodenstein, Heiland, and Schuster), the engineering of biological systems with a given behaviour (Arroyo, Gómez and Marijuán), and molecular information processors (Gruenert, Dittrich and Zauner). Chemical reactions and the robustness of living systems with respect to the possible failure of their components are the inspiration for architectural ideas for robust distributed systems (Mauri and Petre), middleware (Pazat, Priol and Tedeschi), self-adapting systems (Cuesta, Pérez-Sotelo and Ossowski), and services (Di Napoli, Giordano and Németh).

• Finally, the two closing papers describe projects for the development of a computing device based on bacterial colonies (Amos and the BACTOCOM consortium) and for modelling and simulating biosensors, with possible applications to nanomedicine.

Please contact:
Giancarlo Mauri
University of Milano-Bicocca
E-mail: [email protected]



Progress in micro- and nanotechnologies has been a key element in the highly successful quest to satisfy the need for ever more powerful information processing and communication devices during the last decades. This route of progress, however, has a principal limit, as smaller structures are necessarily made from fewer and fewer particles, and industry is rapidly progressing towards a regime wherein transistors will be made from only a single electron. One can easily see that achieving this goal will lead to a fundamental barrier: due to the indivisibility of electrons, the current miniaturisation processes will have reached their final step, and new ideas and technologies for enhancing computational power will have to be developed. At the same time, reaching this barrier will also require the development of a completely new set of engineering techniques, since single particles no longer behave according to the laws of classical mechanics, but instead have to be described quantum mechanically.

As a result, researching the physics of single particles on the quantum scale is of significant importance and has, in recent years, led to a wealth of knowledge and technologies that have underpinned and revolutionised our fundamental understanding of the world. A particularly successful approach has been the application of concepts of information theory to the theory of quantum mechanics, which has led to the development of so-called quantum information devices. The existence of additional resources, such as entanglement, in the quantum regime means that these new devices can have qualities superior to classical devices and solve problems for which no classical algorithms are known. The theoretical work on these issues has been accompanied by tremendous progress in experimental possibilities, and controlled coherent evolution and high-fidelity measurements of single particles are today feasible in several synthetic quantum systems. The area is, however, still in its infancy, and the development of new ideas for quantum engineering techniques is of great importance for future advances in information and communication technologies. While quantum devices initially seemed merely unavoidable, they have turned out to be a highly desirable part of the technological landscape of the future.

Isolating single quantum particles in such a way that they can be deterministically engineered is a very difficult task, due to the often unavoidable environmental interactions or thermal noise. Identifying appropriate systems is therefore of great importance, and a very active strand of current research is to test and characterise different candidates for their suitability. One area in which systems satisfying the basic conditions of good isolation, low noise and high controllability can be found is that of ultracold atoms, and various experimental systems have been built in recent years that allow one to simulate Hamiltonians of importance in quantum information. However, many other systems, ranging from man-made quantum dots in solid state devices to exotic low-dimensional electronic structures, are under heavy investigation as well.

Two groups at University College Cork, one experimental and one theoretical, work in this area using ultracold atomic systems. One of their aims is to explore different methods of building interfaces that connect the quantum mechanical world of single atoms to our macroscopic one. A promising method is to guide light to the ultracold atom using an optical fibre, have it interact with the atom, and guide it back to the detector. For this we are using a commercial-grade optical fibre, which in the interaction region is stretched to a diameter smaller than one micrometer. The light then no longer fits into the fibre and instead travels on the outside, where it can be accessed by the ultracold atoms. While this technology is still in its infancy, the large flexibility of such a setup makes it a very promising tool for future developments.

The basic entity in a quantum computer is, just as in a classical computer, the bit, now called a quantum bit or qubit. However, while classically a bit can only have two possible states, 0 and 1, a quantum bit can be in any possible combination of these two at the same time. This allows for a powerful parallelism in the computational process, which can be helpful in speeding up algorithms. One common example is the database search, in which one piece of information has to be identified among a large number of elements: instead of sequentially looking at each entry of the database, as required classically, quantum mechanics allows one to look at all elements in one operation.
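The database-search speedup described above is Grover's algorithm, and for a small database it can be simulated directly with a statevector (a minimal numerical sketch; the database size and marked entry are arbitrarily chosen):

```python
import numpy as np

# Grover search over a toy "database" of N = 16 entries: start in the
# uniform superposition, then repeat oracle + diffusion about the mean
# roughly pi/4 * sqrt(N) times. Classically, ~N/2 lookups are needed
# on average; here ~sqrt(N) oracle calls suffice.

N, marked = 16, 11
state = np.full(N, 1 / np.sqrt(N))          # uniform superposition

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # 3 for N = 16
for _ in range(iterations):
    state[marked] *= -1                      # oracle: flip the marked sign
    state = 2 * state.mean() - state         # diffusion: 2|s><s| - I

prob = state[marked] ** 2
print(round(prob, 3))                        # high (~0.96) after 3 iterations
```

The quadratic speedup shows up in the iteration count: doubling the database size only multiplies the number of oracle calls by about √2.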

While a full-scale quantum computer might still be a long way off, almost all fundamental entities and processes have by now been demonstrated. In fact, the first devices using quantum technologies, such as cryptographic systems and random number generators, are already commercially available. Research has started on scaling up the number of quantum bits to carry out more complex processes, as well as on improving the stability of the individual qubits. This has helped to significantly increase our understanding of two of the basic properties of quantum mechanics, coherence and entanglement, and is leading the way to the exploration of more complex systems and technologies in the future. Since we are only at the beginning of this journey, the future is likely to hold many unexpected surprises.

Please contact:
Thomas Busch
Department of Physics, University College Cork
E-mail: [email protected]



Quantum Information and the Emergence of Quantum Engineering
by Thomas Busch

New technologies require new thinking and new materials.


Quantum computers of sufficient size and reliability would not only cause a revolution in data security, but should have many more exciting applications, such as the design of special materials and chemicals. Nevertheless, progress in the fabrication of quantum computers has been slow because, in most approaches to quantum information processing, the quantum systems employed to store and manipulate information are very sensitive to noise, either from the environment or due to small errors in controlling the computer's components. This causes these computers to effectively lose their memory - we say there is decoherence of the stored quantum information. Quantum error correction software can in principle solve this problem, but this requires much more reliable basic components than those currently available.

Topological quantum computation (TQC) aims to eliminate decoherence at the hardware level, by encoding quantum information in non-local properties of a suitable quantum system, which cannot easily be disturbed by noise - this is somewhat analogous to the old memory aid of knotting a handkerchief. The information storage and manipulation in topological quantum computers makes use of anyons. Anyons are particles with the property that the quantum state of the system transforms in a non-trivial way when one of these particles encircles another. As the anyons move around, encircling one another in different ways, their 'world-lines' become tangled up and form knots. To visualize this, we can think of each anyon as connected to a thread, with these threads forming a braid as the anyons at their ends are moved around (Figure 1). Systems with sufficiently interesting anyons can reliably store a number of qubits which grows linearly with the number of anyons. The quantum states which make up the qubits can be obtained from some initial state, and also manipulated further for calculation, by 'braiding' the anyons in different ways. Crucially, as long as the temperature is low enough and the anyons are kept well-separated, there is essentially no other way to disturb the stored information, which makes this computation scheme extremely stable against decoherence.

Anyons can only exist in systems which are effectively two-dimensional, that is, they live in a single plane. Important examples of such systems are two-dimensional electron gases (2DEGs), which can be created at the interfaces between various semiconducting materials, and graphene, a two-dimensional form of carbon which recently earned Andre Geim and Konstantin Novoselov the 2010 Nobel Prize in physics. The most advanced experiments on anyons to date employ 2DEGs at very low temperatures and very strong magnetic fields. Under these conditions the fractional quantum Hall effect occurs - the electrons form incompressible liquids, and the smallest possible holes (2D 'bubbles') in some of these liquids are expected to be anyons useful for quantum computation. Establishing the statistics properties of these holes experimentally is non-trivial, precisely because these properties are impervious to local probing.

Groups in Maynooth and in the wider Dublin area work on many aspects of planar physics and its application to TQC:
• general mathematical underpinnings of the theory of anyons
• construction and study of exactly solvable microscopic models in which anyons emerge


Topological Quantum Computation: the Quantum Knot in the Handkerchief
by Joost Slingerland

Future quantum computers may use exotic materials which host entirely new types of physical particle, called anyons. Computations in such Topological Quantum Computers would be done by exchanging the anyons, creating 'quantum knots' in the process. This idea has spawned an international effort encompassing pure mathematicians, theoretical and experimental physicists and material scientists. The first electronic devices which are supposed to detect and manipulate anyons have recently been constructed in two-dimensional electron systems, with promising results.

Figure 1: Two anyons in a plane. One encircles the other and the figure shows the braid corresponding to that process.

Figure 2: Schematic of a double point contact interferometer, a way to detect and manipulate anyons.


• numerical study of realistic models, for example for fractional quantum Hall systems
• interpretations and predictions for concrete experiments.

An important line of research has focused on interferometers like the one pictured in Figure 2. Teams of experimental physicists at Bell Labs, Princeton, Harvard, the University of Chicago and the Weizmann Institute have already constructed devices similar to this, and promising evidence for the existence of non-Abelian anyons has been emerging. Microsoft has also poured significant scientific and financial resources into this line of research through its TQC research station, 'Microsoft Station Q'.

The figure shows electric current carried by anyons, flowing along the edges of a 2DEG in the fractional quantum Hall regime (seen from above). In two places, the edges are pushed together using contacts shown as gray rectangles. At these two point contacts, anyons can quantum tunnel from the bottom edge to the top. Current which flows from the bottom to the top thus comes from a quantum superposition of amplitudes to take either the green or the red path. This causes oscillations in the current when the size of the region enclosed by the paths is varied. The gray disc in the figure represents a contact which may be used to vary the number of anyons inside the interferometer. The number of anyons inside the interferometric region can dramatically affect the current; in the very simplest model, the presence of an odd number of anyons makes the current oscillations disappear completely, while an even number of anyons has no such effect. Observing these two behaviors in the same system would be a dramatic signature of the presence of non-Abelian anyons (the kind of anyons needed for TQC). Devices such as this are likely to provide the first smoking-gun detection of non-Abelian anyons, and also to provide an important tool for manipulating the state of future anyonic quantum computers.

Links:
http://www.thphys.nuim.ie/daqist/
http://www.thphys.nuim.ie/staff/joost/PI/intro.html
http://stationq.ucsb.edu/

Please contact:
Joost Slingerland
Department of Mathematical Physics
National University of Ireland Maynooth
E-mail: [email protected]



Algorithms via Quantum Random Walks
by Michael Mc Gettrick

We are working on features displayed by a number of new scenarios in Quantum Random Walks (QRW). It is expected that many features of QRWs (eg hitting time, localization) will be useful in developing quantum algorithms. Amongst other topics, we look at "memory effects" in walks, simulating a 2-dimensional walk with a 2-state coin, and quantum fairness in lazy QRWs.

In classical Markov models, one way of extending the system is to take into account the "history" of the previously occupied states. The simplest extension (looking "back" one step) has been studied by us in the corresponding QRW. We get different probability distributions compared to the QRW without history (or, as we call it, memory). In particular, for certain initial states of the quantum walker, the phenomenon of localization is observed at the origin: in this case, even after an infinite number of steps, the probability of finding the walker at the origin is greater than 50%. This does not happen in either "standard" QRWs or indeed in classical random walks. The techniques we employ are both rigorous mathematical calculation and simulation by "running" a finite number of steps of the walk on a computer. We have found the simulation often suggests features of the walk that we can then try to prove. This work is all carried out in the De Brun Centre for Computational Algebra at the National University of Ireland, Galway.

In a further project, we look at QRWs with memory on the Cayley graph of an Abelian group. This is a collaboration with Claas Roever (Galway), Mike Batty (Durham) and Jaroslaw Miszczak (Polish Academy of Sciences). We have had some success here with analysing the simple case of the QRW on a cycle (which is a Cayley graph of the cyclic group). When we add one-step memory to this, it becomes a QRW on a Cayley graph of the dihedral group. The aim is now to understand the general picture of re-writing a QRW with memory as one without memory on a different graph. One of the reasons we carry this out is that we believe putting memory into the process gives us a type of control over the probabilistic outcomes that could be useful in directing algorithms.

Figure 1: A plot of the probability distributions in a Quantum Random Walk with memory on the straight line (purple). For comparison, the classical walk (blue) and Quantum Walk without memory (red) are also shown. The number of steps in this walk is 40. The walk with memory shows the feature of localization, with over 50% of the probability at the origin.

Work in collaboration with Carlo Di Franco and Thomas Busch at University College Cork has focused on how we can decrease the resources needed to construct QRWs in two dimensions. Taking the typical square lattice as an example, we have successfully constructed the Grover walk without using a "2-dimensional coin". Our walk uses simply a 2-state coin, but alternates the moves up/down and left/right to get 2-dimensional behaviour. The Grover walk comes from the same ideas as used in the famous Grover algorithm for searching an unordered list. Because our walk uses a lower-dimensional coin, we expect it to be easier to implement in a physical scenario. We hope soon to be able to extend the idea of an alternating walk to other models.

Finally, along with research students David Dolphin and Aine Ni Dhonnacha at NUI Galway, we have looked at the idea of lazy QRWs, and have phenomenological results in this area. Our idea has been to use the 3-state qutrit to model the 3 possibilities in a lazy QRW (move left, move right, stay put). A further online quantum walk simulator has been built (see URLs at the end) to simulate QRWs on the cycle.

Links:
http://www.maths.nuigalway.ie/~gettrick/research/
http://walk.to/quantum

Please contact:
Michael Mc Gettrick
National University of Ireland, Galway
E-mail: [email protected]


The unintuitive features of quantum mechanics have delighted and confounded physicists in equal measure for over a century. A central aspect of quantum theory is the super-strong quantum correlation known as entanglement. This "spooky action at a distance" enables many of the most well-known and surprising examples of quantum effects, such as quantum teleportation, and is considered to be an essential component of the exponential speed-ups that can be achieved by quantum computers over standard ("classical") computers. However, recent work at the Centre for Quantum Information and Foundations at the University of Cambridge, in collaboration with the University of Washington, has shown that the absence of entanglement can also lead to significant computational advantages.

This work takes place in the setting of so-called quantum Merlin-Arthur games. In this framework, a prover (Merlin) sends a computationally limited verifier (Arthur) a quantum state, which Arthur uses as a resource ("proof") to enable him to solve a computationally challenging problem. We think of Merlin as being all-powerful, but not to be trusted: Arthur must check the proof to ensure that Merlin does not trick him into answering incorrectly. This is a model for problems which cannot necessarily be solved efficiently, but whose solutions can be verified efficiently by a quantum computer. Many natural problems in quantum physics turn out to fall into this category.

A natural generalisation of this framework is to give Arthur access to more than one prover, but to separate the provers and not allow them to communicate. If Arthur receives two or more unentangled quantum states as proofs, he might be able to use them to solve problems which he could not solve with access to just one quantum state. The reason for this is that Arthur may be able to validate the proofs against each other to check for errors or malicious modifications by the provers. By contrast with the classical setting, where multiple proofs can simply be concatenated together, the fact that separate quantum proofs are unentangled is crucial to enable this cross-checking.

Unentangled Quantum Proofs and their Applications
by Ashley Montanaro

Entanglement is a key resource for quantum computers. However, the lack of entanglement can also be a powerful concept in quantum computation.

Figure 1: Multiple unentangled quantum proofs can be simulated by only two proofs.

Our work centres on the notoriously hard boolean satisfiability problem 3-SAT, for which no classical (or indeed quantum) algorithms are known that run in time subexponential in the length of the input. Surprisingly, we show that 3-SAT can in fact be solved efficiently by a quantum computer, as long as it is given access to two unentangled proofs, the length of each of which is approximately the square root of the size of the input. As classical computers appear to require proofs approximately as long as the original input to solve 3-SAT efficiently, this implies a significant quantum advantage.
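For readers unfamiliar with 3-SAT: it asks whether a boolean formula written as clauses of three literals has a satisfying assignment. The only general classical approach known is essentially exhaustive search over all 2^n assignments, as in this sketch (the example formula is my own, chosen for illustration):

```python
from itertools import product

# A literal is (variable_index, negated); a clause is a list of 3
# literals, satisfied when at least one literal is true.
clauses = [
    [(0, False), (1, False), (2, True)],   # x0 or x1 or not x2
    [(0, True), (1, False), (2, False)],   # not x0 or x1 or x2
    [(0, True), (1, True), (2, True)],     # not x0 or not x1 or not x2
]
n = 3

def satisfies(assignment):
    """True if every clause contains at least one true literal."""
    return all(any(assignment[v] != neg for v, neg in clause)
               for clause in clauses)

# Brute force: check all 2^n candidate assignments.
solutions = [a for a in product([False, True], repeat=n) if satisfies(a)]
print(len(solutions) > 0)   # True: this small formula is satisfiable
```

The exponential blow-up of this search with n is exactly the hardness that makes the square-root-length quantum proofs described above so surprising.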

It was previously known that 3-SAT can be solved efficiently, given access to a large number of unentangled quantum proofs. In order to use this prior work to prove our new result, we develop and analyse an efficient quantum algorithm that certifies that the state of a large number of quantum systems is unentangled, given only two copies of the quantum state to be tested. This result then allows Arthur, given two unentangled proofs, to simulate multiple unentangled proofs, and to use the previously known protocol to solve 3-SAT.

On the flip side, our result implies obstructions to classical simulation of two-prover unentangled quantum proof systems. If such systems could be efficiently simulated classically, this would imply classical algorithms for 3-SAT that run in less than exponential time, which are considered very unlikely to exist. It turns out that these proof systems are connected to many central tasks in quantum information theory. For example, our work implies computational hardness of estimating the capacity of quantum communication channels, detecting bipartite entanglement, and estimating ground-state energies of certain classes of quantum system. Thus the apparently arcane framework of quantum proof systems with multiple unentangled provers is in fact closely related to other aspects of quantum information theory, and is even connected to optimisation problems from outside the field.

While entanglement is seen as the basis of quantum computational power, our work implies that the absence of entanglement can also be desirable. Future work aims to determine the precise computational power of multiple quantum proofs, and also to investigate intriguing connections to the power of entanglement in quantum information transmission.

Link:
http://arxiv.org/abs/1001.0017

Please contact:
Ashley Montanaro
Centre for Quantum Information and Foundations, Centre for Mathematical Sciences, Cambridge
Tel: +44 1223 760383
E-mail: [email protected]

Position-Based Quantum Cryptography
by Harry Buhrman, Serge Fehr and Christian Schaffner

Position-based cryptography offers new cryptographic methods ensuring that certain tasks can only be performed at a particular geographical position. For example, a country could send a message in such a way that it can only be deciphered at the location of its embassy in another country. Using classical communication, such tasks are impossible to perform. At CWI, we investigate whether position-based cryptography can be achieved if players are allowed to use quantum communication.

Quantum cryptography makes use of the quantum-mechanical behavior of nature for the design and analysis of cryptographic schemes. Its aim is to design cryptographic schemes whose security is guaranteed solely by the laws of nature. This is in sharp contrast to most standard cryptographic schemes, which can in principle be broken, i.e., when given sufficient computing power. From a theoretical point of view, quantum cryptography offers a beautiful interplay between the mathematics of adversarial behaviour and quantum information theory.

The best-known application of quantum cryptography is quantum key distribution (QKD). QKD allows two distant parties to communicate securely in a way that cannot be eavesdropped on. The technical requirements to perform QKD protocols are well within reach of today's technology; in fact, QKD devices are currently produced and sold commercially. QKD requires the two involved parties to know and trust each other. In general, however, cryptography is concerned with scenarios where the involved parties want to exchange data without trusting each other. In order to distinguish themselves from an attacker, legitimate players commonly use some form of credential, for example, a digital secret key or biometric data such as a fingerprint. The goal of position-based cryptography is to let the geographical position of a person act as its only credential for accessing secured data and services. This has the important advantage that no digital keys need to be distributed and locally stored, which is often the bottleneck in standard cryptographic solutions and offers opportunities for attacks.

Position-based cryptography has a number of interesting applications. For example, it enables secure communication over an insecure channel without any pre-shared key, with the guarantee that only a party at a specific location can learn the content of the conversation; think of a military commander who wants to communicate with a base which is surrounded by enemy territory, or a country that wants to send instructions to an embassy in a foreign country. Another application is authenticity verification, where position-based cryptography enables users to verify that a received message originates from a particular geographical position and was not modified during the transmission.



In 2009, it was proven by our collaborators from the University of California in Los Angeles (UCLA) that position-based cryptography is impossible in the classical (non-quantum) world in the setting where colluding opponents control the whole space which is not occupied by honest players. In our latest research article, we investigate whether the impossibility of position-based cryptography can be overcome if we allow the players to use quantum communication.

The outcome of our theoretical investigation demonstrates that the possibility of doing secure position-based cryptography depends on the opponents' capability of sharing entangled quantum states. On the one hand, we show that if the opponents cannot share any entangled quantum state, then secure position-based cryptography is possible. We present a scheme which allows a player, Alice, to convince the other participants in the protocol that she is at a particular geographical position. In contrast, colluding opponents who are not at this position and do not share any entangled quantum state will be detected lying if they claim to be there. Our scheme is very simple and can be implemented with today's QKD hardware. More advanced applications (as outlined above) can be based on our scheme.
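The timing skeleton underlying such position-verification schemes can be sketched classically (this is my own illustrative code, not the CWI protocol; the quantum scheme adds an unclonable quantum challenge precisely because, as noted above, two colluding cheaters can defeat the purely classical version): two verifiers at known positions time a challenge-response round trip to the claimed position and accept only if both measured times are consistent with the speed-of-light bound.

```python
# One-dimensional position verification by timing (normalised units).
# V0 sits at x = 0 and V1 at x = d; an honest prover at the claimed
# position answers both verifiers exactly at the light-speed bound,
# while a single prover anywhere else is too slow for at least one.

C = 1.0          # signal propagation speed
d = 10.0         # distance between the two verifiers
claimed = 4.0    # position the prover claims to occupy

def round_trip(verifier_x, prover_x):
    """Challenge + response travel time between verifier and prover."""
    return 2 * abs(verifier_x - prover_x) / C

def verify(claimed, actual):
    """Accept iff both round trips match the claimed position's bound."""
    ok0 = round_trip(0.0, actual) <= 2 * claimed / C + 1e-9
    ok1 = round_trip(d, actual) <= 2 * (d - claimed) / C + 1e-9
    return ok0 and ok1

print(verify(claimed, actual=4.0))   # True: honest prover at the claimed spot
print(verify(claimed, actual=6.0))   # False: a single cheater elsewhere is late
```

A cheater closer to one verifier necessarily ends up farther from the other, which is why a *single* dishonest party always fails one of the two bounds; only collusion breaks the classical scheme.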

On the other hand, we also show that if the opponents are able to share a huge entangled quantum state, then any positioning scheme can be broken and no position-based cryptography is possible at all. In fact, our result shows how colluding opponents can use their entangled state to instantaneously and non-locally perform the honest player's operations, and are therefore able to make it appear as if they were at the claimed position.

Our results raise various interesting research questions. For example, storing and handling large quantum states is a formidable technical challenge. Is secure position-based cryptography therefore possible in the realistic setting where opponents can only handle a limited amount of entangled quantum states? Our investigation has already sparked several follow-up works, and first results indicate that there are schemes which remain secure in this bounded-entanglement setting.

Link:
http://homepages.cwi.nl/~schaffne/positionbasedqcrypto.php

Please contact:
Christian Schaffner
CWI Amsterdam, The Netherlands
Tel: +31 20 592 4236
E-mail: [email protected]

Figure 1: Quantum protocol for position verification in one dimension. (Picture by C. Schaffner with images from Svilen Milev and Victor Zuydweg.)




Quantum Key Distribution (QKD) uses the properties of individual particles of light (photons) to establish a digital key between two communication partners. Based on the principles of quantum physics, the information carried by a photon cannot be extracted unnoticed by the legitimate partners, as any extraction of information requires a measurement that modifies the properties of the photon. Additionally, the principles of quantum physics dictate that an eavesdropper cannot produce an identical copy of a photon. Therefore, any attempt at interception will be detected by both parties, and it is proven that the distribution of digital keys is absolutely secure (ie Information Theoretic Secure (ITS)) against eavesdropping. Consequently, messages encrypted with these keys using One-Time Pad encryption can withstand attacks from arbitrarily powerful computers (even quantum computers). In addition, secret keys generated with QKD can be used for ITS message authentication. Thus QKD gives us a tool for absolutely secure communication within the digital world.
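One-Time Pad encryption with a QKD-supplied key is simple enough to show in full. In this generic sketch the operating-system random generator merely stands in for a QKD link:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each data byte with a secret key byte.

    The information-theoretic security guarantee holds only if the key is
    truly random, at least as long as the message, and never reused --
    which is exactly what a QKD link is meant to supply.
    """
    assert len(key) >= len(data), "OTP needs one key byte per message byte"
    return bytes(d ^ k for d, k in zip(data, key))

key = secrets.token_bytes(32)            # stand-in for a QKD-generated key
ciphertext = otp_xor(b"attack at dawn", key)
assert otp_xor(ciphertext, key) == b"attack at dawn"   # XOR is self-inverse
```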

How QKD technology evolved – a bit of history

In 1984, Charles Bennett and Gilles Brassard proposed the use of quantum communication for cryptographic purposes. Over the following years it was proven that even an adversary with unlimited computing resources cannot break quantum key distribution, something that cannot be achieved with classical means alone. The technology of current products corresponds to the state of development of 2000-2003. These products are usually operated in point-to-point connections and have strong restrictions on key generation rate and distance.

Meanwhile, research groups around the world are working on a second generation of QKD link devices, and new laboratory prototypes have emerged. These belong to a variety of different classes of systems, the common denominator being much better key generation rates and distances. Briefly, the second generation systems include:
• discrete variable systems, based on weak coherent pulses and decoy states
• discrete variable systems, based on quantum entanglement
• continuous variable systems, and
• distributed phase reference systems.
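All of these systems trace back to Bennett and Brassard's 1984 protocol. Its key-sifting step can be illustrated with a toy simulation; this is our own idealised sketch (no channel noise, no eavesdropper, no error correction or privacy amplification):

```python
import random

def bb84_sift(n_photons=1000, seed=1):
    """Toy BB84 key sifting (idealised: noiseless, no eavesdropper).

    Alice encodes random bits in randomly chosen bases; Bob measures each
    photon in a random basis of his own.  They publicly compare bases
    (never bits) and keep only the positions where the bases matched --
    on average half of them, since a mismatched basis yields a random bit.
    """
    rng = random.Random(seed)
    alice_bits  = [rng.randrange(2) for _ in range(n_photons)]
    alice_bases = [rng.randrange(2) for _ in range(n_photons)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [rng.randrange(2) for _ in range(n_photons)]
    key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
           if ab == bb]
    return key

sifted = bb84_sift()
print(len(sifted))   # roughly half of the 1000 positions survive sifting
```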

The AIT Austrian Institute of Technology has designed and implemented mature entanglement-based systems and has developed the software for setting up the largest quantum network in Europe so far (six nodes and eight links).

High-Speed Quantum Key Distribution and Beyond
by Martin Stierle and Christoph Pacher

The design and implementation of mature entanglement-based systems, as well as the development of the software for setting up the largest quantum network in Europe, are the hallmarks of the Quantum Key Distribution developments at the AIT Austrian Institute of Technology.

Figure 1: Entangled photon source.

Both the AIT link systems and the SECOQC (Development of a Global Network for Secure Communication based on Quantum Cryptography) network were developed during the European 6th Framework Programme, and were deployed in Vienna in 2008. Research and development were carried out in cooperation with Prof. Zeilinger's group at the Institute of Experimental Physics, University of Vienna, and IQOQI of the Austrian Academy of Sciences. Recent research results (related to finite-key analysis) demonstrate that the system is particularly suitable for long-haul communication, as a consequence of its ability to generate a secure key even from a small sample of measured signals.

QKD – the research agenda

Even with telecom's recent switch from copper to fibre – and fibre is a prerequisite for single-photon communication – QKD is far from entering our everyday life. However, the worldwide research and development agenda is straightforward. What QKD needs for better market acceptance is:
• longer distances
• reduced cost per unit, especially for access to the private customer market
• integration into the telecom world and respective standardization, and
• satellite key distribution to overcome trans-continental distance restrictions.

Austrian Institute of Technology – our approach

Based on our assessment of the key factors required for real market acceptance, AIT is concentrating on the development of two specific QKD link systems. The first will be a backbone QKD link system which can be used in parallel with the telecoms' DWDM (Dense Wavelength Division Multiplexing) networks. We expect the new system to operate over a distance of 150 km and to push the metropolitan-range key generation rate up to between 300 Mb/s and 1 Gb/s.

The second QKD link system will be a low-cost, low key generation rate PON (Passive Optical Network) system. In this project a one-to-many quantum communication concept, instead of point-to-point concepts, will be implemented in order to reduce costs significantly.

In addition, AIT is chairing the ETSI Industry Specification Group on Quantum Key Distribution. AIT is also involved in analyses of QKD satellite systems, which will be needed to overcome distance restrictions, especially with respect to intercontinental key distribution.

How might the future look? The quantum communication world!

Not only military but also civil applications are in use. Whenever you want secure and private communication, even over long time periods, you might use QKD and new ways of encryption. This will be true for banks and governments as well as the whole e-health community.

Telecom exchanges will work as trusted repeater stations and form the backbone of the key distribution infrastructure. PONs will extend the reach to private customers, eg medical doctors. Thus, quantum channel networks will be offered like VPNs. In addition to these more local structures, worldwide key exchange (WWKE) will be offered by one or more wholesale key exchange satellite or telecom operators.

Links:
http://www.ait.ac.at/departments/safety-security/business-units/optical-quantum-technology/
https://sqt.ait.ac.at/software/
https://sqt.ait.ac.at/
HiPANQ: http://bit.ly/LQuNet
http://truq.at (only in German)

Please contact:
Martin Stierle
Austrian Institute of Technology / AARIT, Austria
Tel: +43 664 235 1847
E-mail: [email protected]

Figure 2: Entangled photon QKD device prototype.


Membrane Computing – Theory and Applications
by Marian Gheorghe

Membrane computing is an unconventional computing model that abstracts from the structure and functionality of the living cell. Developments of this computational paradigm cover both the study of the theoretical basis of the models introduced and applications in various fields.

The interaction between computer science and biology is currently intensifying, with positive consequences for both fields. Biology has been a rich source of ideas for creating novel approaches, methods, algorithms and techniques for solving complex computational problems, for generating new concepts and theories that help specify and analyse various systems, and for providing a new basis for the theory of computation itself. On the other hand, complex and sophisticated computer science approaches have been utilized in modelling various biological systems, by providing a rich set of methods and techniques to specify and analyse such systems.

Membrane computing emerged as a branch of natural computing by the end of the 1990s. The models belonging to this computational paradigm, called membrane (or P) systems, have been conceived as hierarchically distributed devices inspired by the structure of living cells. Some of these models have also been abstracted from the structure and functionality of more complex biological entities, like tissues, organs and populations of cells. Membrane computing started off as a new unconventional computational paradigm generating a broad range of fundamental research in theoretical computer science. In recent years, membrane systems have been used as modelling vehicles for various biological systems, as specification languages in graphics, and for describing a large variety of parallel and concurrent systems. For a comprehensive presentation of this research field, The Oxford Handbook of Membrane Computing (see link 1) is recommended; for an almost exhaustive list of publications in this field, the website at link 2 can be consulted. These lines of research have also been followed by our group at the University of Sheffield.

We have been working on developing various types of membrane systems inspired by phenomena from the world of biology, including: intra- and inter-cellular signalling mechanisms (captured by P systems with travelling objects); communication processes occurring in populations of cells or more complex organisms (population P systems); and concerted actions of individual entities in populations of bacteria (quorum sensing P systems). As with other types of membrane systems, researchers have investigated various aspects of these models, including their computational capabilities, complexity aspects and relationships with other classes of computational models, including various process algebras, Petri nets, cellular automata, X-machines and grammar systems.

Establishing such relationships has allowed us to transfer from these models to membrane systems a rich set of methods and tools to analyse and validate the consistency of such systems.

We have been interested in utilising P systems as specification languages for various natural, artificial or engineered systems. In this respect a specific class of membrane system has been introduced and studied for application in graphics. Another class has been introduced in connection with synchronizing the components of a system running in parallel. Special attention has been paid to systems with stochastic behaviour, especially from biology. For such systems, we have not only introduced a specification language, but have also produced simulation tools and utilized a number of state-of-the-art tools for system analysis. The power and effectiveness of stochastic simulators have been complemented by a fine-grained analysis of various properties of the simulated systems using a tool called Daikon (see link 3). The properties extracted from the simulations have then been validated through verification techniques using PRISM, a probabilistic model checker (see link 4 and Figure 1). More details regarding these applications can be found at link 5.

Figure 1: Invariant checking with PRISM.

We are actively cooperating with many of the research groups working in membrane computing. We have been working with the group at the University of Verona (Vincenzo Manca) on the dynamical properties of P systems; with the groups at the Universities of Paris XII (Sergey Verlan) and Metz (Maurice Margenstern) on synchronization problems in membrane systems; with Erzsebet Csuhaj-Varju and Gyorgy Vaszil at the Computer and Automation Institute of the Hungarian Academy of Sciences on membrane systems inspired by social networks; with Natalio Krasnogor at the University of Nottingham on using membrane systems as a modelling tool in synthetic biology; with Mario Perez-Jimenez and his group at the University of Seville on developing adequate tools for simulating membrane systems; and with Giancarlo Mauri at the University of Milan-Bicocca on classes of systems with stochastic behaviour. Further research plans for our group include the development of programmability capabilities for these systems, associated with sound verification and testing methods.

Links:
1. http://ukcatalogue.oup.com/product/9780199556670.do
2. http://ppage.psystems.eu/
3. http://groups.csail.mit.edu/pag/daikon/download/doc/daikon_manual_html/daikon_8.html
4. http://www.prismmodelchecker.org/
5. http://staffwww.dcs.shef.ac.uk/people/M.Gheorghe/

Please contact:
Marian Gheorghe
Department of Computer Science, University of Sheffield, United Kingdom
Tel: +44 114 222 1843
E-mail: [email protected]


On Synthesizing Replicating Metabolic Systems
by Giuditta Franco and Vincenzo Manca

A research project at the University of Verona concerns the synthesis of a minimal cell by means of a P system, which is a distributed rewriting system inspired by the structure and function of the biological cell. We aim to define a dynamical system which exhibits a steady metabolic evolution, resulting in self-maintenance and self-reproduction. Metabolic P systems, MP systems for short, represent a specific class of P systems which is particularly promising for modelling a minimal cell in discrete terms.

The idea of synthesizing a 'minimal cell' with mathematical and engineering tools is not new in the literature: there is a recent trend in synthetic biology towards building a synthetic endomembrane structure, whose compartments (usually formed by liposomes) contain the minimal components required to perform the basic functions of a biological cell (essentially self-maintenance and self-reproduction). A self-replicating metabolic system requires a synchronization of its internal dynamics in such a way that the metabolic activity is maintained while the system exchanges matter with the environment, grows, and replicates its own membrane structure together with the contained metabolic processes. We aim to model such a dynamical system by using a P system, a computational model inspired by the cell.

This research may be framed within the context of membrane computing models, an area of molecular and cellular computing, and aims to build a self-replicating metabolic membrane system where molecular and cellular peculiarities are represented in symbolic and algorithmic terms. It was started at the beginning of 2011, and its purpose is to develop an understanding of the general rules governing the synchronization of genomic duplication and membrane reproduction. This is known as a "top-down" approach to synthesizing minimal forms of life starting from general principles of structure and function organization (matter conservation, anabolism and catabolism, species distribution, enzymatic control, autopoiesis). In principle, this approach has an experimental counterpart, and it can be classified as a constructivist approach to scientific knowledge, in the spirit of Feynman's famous motto "What I cannot create, I do not understand".

The techniques employed in this project are a combination of membrane computing and metabolic P systems, statistical regression and correlation methods, numerical approximation, and time series analysis techniques. The research is intended to have a mathematical/symbolic and computational orientation, and the main expected product is a Metabolic P model of a minimal cell. The research project will be conducted at the Department of Computer Science of the University of Verona, and involves both authors together with some members of the MNC group.

Regarding future activities of this research, we have plenty of ideas and dreams; we are limited only by our imagination. Once we have a self-replicating MP system which exchanges matter with the environment to maintain its internal metabolic dynamics, both the role of energy in such an exchange and a form of adaptation to the environment could be studied, by analyzing the reactions of the system to different (even energetic) stimuli. Receptivity and reactivity should be investigated to better understand the robustness of the single cell and of cell networks. Communication and interaction among (synthetic and/or real) cells is a crucial task to model morphogenesis (eg embryogenesis) and tissue organization. From the viewpoint of a tissue system, the process of mitosis of each single cell is limited in time; single healthy cells do not live forever but tissues do, while new cells arise and old ones die. Tissues stay alive within certain boundaries (density, dimension), while single cells produce new cells and eventually die. The cellular and tissue systems are tightly interrelated but have quite different dynamics and function.

The study of phenomena such as cell differentiation and speciation is fundamental for the understanding of many processes of biomedical interest. For example, embryonic cells are interesting as they have unlimited replicative power and the ability to generate any type of tissue, a property they share with stem cells. On the other hand, uncontrolled proliferation and differentiation of stem cells often indicates the presence of cancer. Cell migration can also be involved in such processes, and through our research we expect to find a way to represent both molecular and cellular migration in the context of P systems. In this framework a sensible place to start is by modelling the concept of the biological gradient, which might be achieved by means of nested membrane localization. Finally, Darwinian (and other) evolution of synthetic cells could give interesting insights into several controversial issues in population genetics and evolution theories (such as the importance of chance in evolutionary transformations, known as genetic drift).

Other institutes involved in this research project are the partners in a recent European FET project called MINIE (MINImal Eukaryotic cells), now under evaluation: the "Politecnica" and "Complutense" Universities of Madrid and the University of Zaragoza, all in Spain; the University of Milano-Bicocca, Italy; the University of Bucharest, Romania; the University of Leiden, The Netherlands; and Malardalen University, Sweden.

Please contact:
Giuditta Franco and Vincenzo Manca
University of Verona, Italy
Tel: +39 045 802 7045/7981
E-mail: [email protected], [email protected]

Research group at the University of Verona.

Semantics, Causality and Mobility in Membrane Computing
by Oana Agrigoroaiei, Bogdan Aman and Gabriel Ciobanu

A research group at the Alexandru Ioan Cuza University of Iași, Romania, affiliated to the Romanian Academy of Sciences, conducts research in the area of membrane computing on topics such as operational semantics of membrane systems, causality, object-based and rule-based event structures, as well as several aspects of mobile membranes, including computability, complexity and links to other existing formalisms.

Membrane computing is a well-established and successful research field which belongs to the more general area of natural computing. Membrane computing was initiated by Gh. Păun in 1998, and in 2003 it was identified by the Thomson Institute for Scientific Information as a "fast emerging research front in computer science". Membrane computing deals with parallel and nondeterministic computing models inspired by cell biology. Membrane systems are complex hierarchical structures whose functioning relies on a flow of materials and information, involving the parallel application of rules, communication between membranes and membrane dissolution. A "computation" is performed in the following way: starting from an initial structure, the system evolves by applying the rules in a nondeterministic and maximally parallel manner. The maximally parallel way of using the rules means that in each step we apply a maximal multiset of rules, namely a multiset of rules such that no further rule can be added to it. A halting configuration is reached when no rule is applicable.
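The maximally parallel step can be made concrete with a small multiset-rewriting sketch. This is our own illustrative code for a single membrane, with `Counter` playing the multiset; a real P system chooses among all maximal rule multisets nondeterministically:

```python
from collections import Counter
import random

def maximal_step(objects, rules, seed=0):
    """One maximally parallel step of a single-membrane P system (sketch).

    `objects` is a multiset (Counter); each rule is a pair (lhs, rhs) of
    multisets.  Rules are applied greedily in random order until no rule
    fits the remaining objects, which yields *a* maximal multiset of rule
    applications.  Products are kept aside so they cannot be consumed
    within the same step.
    """
    rng = random.Random(seed)
    remaining, produced = Counter(objects), Counter()
    applied = True
    while applied:
        applied = False
        for lhs, rhs in rng.sample(rules, len(rules)):
            if all(remaining[o] >= n for o, n in lhs.items()):
                remaining -= lhs          # consume the left-hand side
                produced += rhs           # products appear in the next step
                applied = True
    return remaining + produced

rules = [(Counter("a"), Counter("c")), (Counter("ab"), Counter("d"))]
print(maximal_step(Counter("aab"), rules))   # e.g. Counter({'c': 1, 'd': 1})
```

On the multiset {a, a, b} no rule is applicable afterwards, which is exactly the maximality condition; the step halts when no rule can be added.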

Operational Semantics

We define an operational semantics for membrane systems, and prove that the semantics for static and dynamic resource allocation are equivalent. We consider some alternative modes of evolution, based on either minimal parallelism, maximizing the quantities of consumed resources, or maximizing the number of applied rules. For systems with a complex hierarchical structure, we present a reduction to a system with a single membrane (and additional rules). Such a reduction is achieved by encoding the semantic constraints of the hierarchical system within rules using promoters and inhibitors in the system consisting of just one membrane. This reduction is subsequently used as a technical tool to solve problems for complex systems by reducing them to simpler cases.



Causality

We construct systems with evolution running in reverse by inverting the direction of the rules themselves. We underline the similarity between the two modes of evolution (direct and reverse), and propose how both of them can be used during computations. We relate membrane systems to event structures by considering events to be related to rule applications (in one case) and to resources (in another case). The corresponding event structures represent formal tools underlying causal relations in the evolution of a parallel and nondeterministic system. We prove that the causal relation given by the resource-based event structure is directly related to the way membrane rules can be linked together by a newly introduced causal relation.
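The rule-inversion idea is simple to state in code. This is a simplified sketch of our own; the actual reverse semantics studied by the group is considerably more refined:

```python
from collections import Counter

def invert_rules(rules):
    """Invert each rewriting rule u -> v into v -> u, so that applying
    the inverted rules runs the evolution backwards."""
    return [(rhs, lhs) for lhs, rhs in rules]

# Forward: ab -> c turns the multiset {a, b} into {c} ...
rules = [(Counter("ab"), Counter("c"))]
after = Counter("c")
# ... and the inverted rule c -> ab takes it back:
lhs, rhs = invert_rules(rules)[0]
before = after - lhs + rhs
assert before == Counter("ab")
```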

Mobility

Mobile membranes represent a parallel and nondeterministic computing model inspired by cells and their movements, in which mobility is given by specific endocytosis and exocytosis rules. We have studied the computability power and complexity of mobile membranes, and established some links to process algebra. We have introduced various membrane systems in which mobility is the key issue. The main mobility rules describing the evolution of the structure are endocytosis (moving an elementary membrane inside a neighbouring membrane) and exocytosis (moving an elementary membrane outside the membrane where it is placed); other mobility rules are given by pinocytosis and phagocytosis. We related systems of mobile membranes to some existing process calculi (mobile ambients, pi-calculus, brane calculi) and to Petri nets. We define some concepts inspired by process calculi in the framework of (mobile) membrane computing. To obtain more interesting and accurate models, we add timers, inspired by cell lifetimes, to membranes, and relate these membrane systems with timers to timed mobile ambients. By describing biological systems using mobile membranes, we can provide precise answers to many interesting biological questions. Using software tools and process calculi bisimulations, (potentially infinite) system behaviour can be investigated, and this information could be interesting from a biological viewpoint.

A standard approach in membrane computing is to look for membrane systems with a minimal set of ingredients which are powerful enough to achieve the full computability power of Turing machines. We have investigated the computability power and efficiency of our formalisms by using formal languages, register machines and other ingredients of computability theory. For certain systems of mobile membranes, we prove that three membranes are enough to obtain the same computational power as a Turing machine. Moreover, using systems of mobile membranes, we obtained polynomial solutions of two NP-complete problems. More details can be found on our web pages.

Link:
http://iit.iit.tuiasi.ro/TR/

Please contact:
Gabriel Ciobanu
A.I. Cuza University of Iași, Romania
E-mail: [email protected], [email protected]
http://profs.info.uaic.ro/~gabriel/

This book presents new ideas and results in membrane computing. The table of contents is available at http://profs.info.uaic.ro/~gabriel/contents.pdf. For more information, please contact the author.

From Energy-Based to Quantum (Inspired) P Systems
by Alberto Leporati

Taking into account the amount of energy used and/or manipulated during computations may yield formal computational models which are closer to physical reality.

Membrane systems (also known as P systems) were introduced in 1998 by Gheorghe Păun as a class of distributed and parallel computing devices, inspired by the structure and function of living cells. The basic model consists of a hierarchical structure composed of several membranes embedded into a main membrane called the skin. Membranes divide the Euclidean space into regions that contain objects and evolution rules. Using these rules, the objects may evolve and/or move from a region to a neighbouring one. Every computation starts from an initial configuration of the system, in which multisets of objects are scattered among the regions. At each computation step some rules are selected from the pool of active rules, according to a predefined strategy, and are applied; the computation terminates when no evolution rule can be selected. Usually, the result of a computation is the multiset of objects contained in an output membrane or emitted from the skin of the system.

At the Department of Informatics, Systems and Communication (DISCo) of the University of Milano-Bicocca we have been working on P systems since their invention. In particular, we study the computational power of several variants of P systems by using techniques from the theory of computational complexity. When I joined the group in 2004, I had just completed my Ph.D. in Computer Science with a thesis on complexity issues related to classical and quantum circuits. It was therefore very natural for me to enter the exciting world of P systems by defining two models that take account of the amount of energy exchanged with the environment and/or manipulated during computations: energy-based P systems and UREM (Unit Rules and Energy assigned to Membranes) P systems.

In energy-based P systems, a given amount of energy is associated with each object. Moreover, instances of a special symbol are used to denote free energy units occurring inside the regions of the system. These energy units can be used to move or transform objects, by rules that satisfy the principle of energy conservation. The transformation is performed by taking/releasing an appropriate amount of free energy units from/to the region where the rule is applied. My colleagues and I have studied several computational properties of energy-based P systems, trying to characterize their computational power. We have shown that they are able to simulate any reversible and conservative circuit composed of Fredkin gates, and we have proved that they are universal – that is, they have computational power equivalent to that of Turing machines – provided that a form of local priorities is assigned to the rules of the system. Without such priorities, the computational power decreases and is no more than that of Vector Addition Systems. Characterizing the precise computational power of energy-based P systems is still an open problem; however, the results obtained so far indicate that these systems can be used as a modelling tool, much like Petri nets.
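The energy-conservation mechanism of a single rule can be sketched in a few lines. This is a toy sketch of our own, restricted to one region, with the string `"e"` as the free-energy symbol; the article's rules may also send the produced object into an inner membrane, which we omit here:

```python
from collections import Counter

E = "e"  # special symbol: one free energy unit

def apply_energy_rule(region, consumed_obj, energy_units, produced_obj):
    """Apply one energy-based rule, e.g. a e^2 -> b (illustrative sketch).

    Transforming an object takes free energy units from the region when
    energy_units > 0, or releases them back when energy_units < 0, so the
    total energy of the system is conserved.  Returns the updated region,
    or None if the rule is not applicable.
    """
    need = Counter({consumed_obj: 1})
    if energy_units > 0:
        need[E] = energy_units                # rule consumes free energy
    if any(region[o] < n for o, n in need.items()):
        return None
    out = region - need
    out[produced_obj] += 1
    if energy_units < 0:
        out[E] += -energy_units               # rule releases free energy
    return out

# A region with one a, two c and three free energy units:
region = Counter({"a": 1, "c": 2, E: 3})
print(apply_energy_rule(region, "a", 2, "b"))   # a e^2 -> b consumes two units
```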

On the other hand, in UREM P systems a non-negative value of energy is assigned to each membrane. The rules are of two possible kinds: (in_i: a, Δe, b) and (out_i: a, Δe, b), controlling the objects trying to enter and leave membrane i, respectively. When one of these rules is applied, the object a on which it operates is transformed into b while crossing the membrane; simultaneously, the amount of energy associated with the membrane is changed by adding the quantity Δe, which may be positive, zero or negative. These P systems have also undergone an extensive study concerning their computational power. In this case, by using a form of local priorities we reach the computational power of Turing machines, whereas by avoiding priorities the computational power decreases. This time, however, we obtain a characterization of PsMAT, an interesting class of matrix grammars which is not so easy to obtain with other computational models.
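The effect of a single UREM rule is mechanical enough to show directly. In this sketch of our own, multisets are plain dicts and we add the natural constraint that the membrane's energy stays non-negative:

```python
def apply_urem_rule(outside, inside, energy_i, kind, a, delta_e, b):
    """Apply a UREM rule (in_i: a, de, b) or (out_i: a, de, b) -- sketch.

    The object a crosses membrane i, turning into b; the membrane's
    energy value changes by delta_e but must remain non-negative.
    Multisets are plain dicts {object: count}.  Returns the new energy
    value, or None if the rule is not applicable.
    """
    src, dst = (outside, inside) if kind == "in" else (inside, outside)
    if src.get(a, 0) < 1 or energy_i + delta_e < 0:
        return None                           # rule not applicable
    src[a] -= 1                               # a leaves the source region ...
    dst[b] = dst.get(b, 0) + 1                # ... and arrives as b
    return energy_i + delta_e

outside, inside = {"a": 1}, {}
e = apply_urem_rule(outside, inside, 5, "in", "a", -2, "b")
assert e == 3 and inside == {"b": 1}
```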

Another interesting aspect of UREM P systems is that they have allowed us to define a quantum-inspired variant of P systems. Instead of merely translating each rule of UREM P systems into a unitary matrix, we decided to propose a completely new computation device based on creation and annihilation, which are among the most elementary operations that can be conceived in physics: adding or removing a quantum of energy to/from a quantum system. Studying these P systems will certainly yield interesting observations both from the computational point of view and from the quantum physics side. The first results we have obtained are that – differently from the classical version – these systems do not need priorities to be universal; moreover, the use of creation and annihilation allows the definition of linear (but non-unitary) operators that make it possible to efficiently solve NP-complete problems. The latter aspect should be further investigated, since this result is in contrast with what can be done with "traditional" quantum computers.

Please contact:
Alberto Leporati
Dipartimento di Informatica, Sistemistica e Comunicazione
Università degli Studi di Milano – Bicocca
E-mail: [email protected]

Figure 1: On the left: an example of an energy-based P system. The skin membrane contains one copy of object a, two copies of c and three energy units. The rule ae² → (b, 1) adds two energy units to a copy of a, produces a copy of object b and sends it to the (until now empty) region enclosed by membrane 1. On the right: illustration of the effect of rule (in_i: a, Δe, b) on membrane i. The object a crosses the membrane, becoming b; simultaneously, the energy assigned to the membrane passes from e_i to e_i + Δe.



Redbiocom is composed of the Bio-inspired Computing and Complex Systems group at Autonoma University of Madrid (UAM), the Computational Biology and Bioinformatics group at Balearic Islands University (UIB), the Decision Modelling, Computation and Simulation group at the University of Lleida (Ull), the Natural Computing group at the Polytechnic University of Madrid (UPM-I), the Laboratorio de Inteligencia Artificial (LIA) at Universidad Politécnica de Madrid (UPM-II), the Computation Models and Formal Languages group at the Polytechnic University of Valencia (UPV), and the Natural Computing group at the University of Seville (USe).

The main research areas of Redbiocom are Membrane Computing and P Systems, Networks of Bio-inspired Processors, DNA Computing and Synthetic Biology, and Computational Biology and Bioinformatics.

Current research projects of the consortium in these areas are:
• the "Alignment of metabolic networks modelled as Petri nets and P systems" (UIB), where researchers try to generalize the comparison algorithms developed for Petri nets to P systems, exploiting the relationship between both models
• the construction of a simple computer using bacteria rather than silicon (UPM-II), within the European project BACTOCOM
• the simulation of biological processes and the design of biomolecular information processing devices (UPM-II)
• the development of a complete platform for natural computation software, its application to concrete linguistic problems, and the proposal of techniques in the field of evolutionary computation that hybridize different disciplines (evolutionary computation, formal languages, data structures in the theory of algebraic complexity, automatic learning) and make it possible to provide automatic (genetic) programming tools for natural computation (UAM)
• the full development of a computational model based on bio-inspired operations over strings organized as a network of bio-inspired processors (UAM, UPM-I and UPV)
• the development of a multi-compartmental, stochastic and discrete modelling framework based on P systems for the study of systems biology models ranging from bacterial colonies to ecosystems (USe). From a synthetic biology perspective, this project aims at developing models of synthetic gene regulatory networks inducing desirable phenotypes in multicellular bacterial systems. These models are used as blueprints to assess the viability of the different possible designs prior to their implementation in the wet lab
• a systems and synthetic biology approach to the modelling and implementation of biological systems with prefixed behaviour (USe and Prof. Gheorghe Paun, initiator of the Membrane Computing discipline). Researchers associated with this project participate as instructors in the team that represents the University of Seville in the International Genetically Engineered Machines competition (iGEM), the premier undergraduate Synthetic Biology competition
• the production of a simulation tool, based on the Membrane Computing paradigm, that helps experts to study different ecological systems (Ull).

Currently, the joint efforts of the research groups in the network have been oriented towards organizing the First International School on Biomolecular and Biocellular Computing (ISBBC), which will be held in Osuna (Spain) from 5 to 7 September 2011. The school will be organized around three topics: Membrane Computing, Networks of Bio-inspired Processors, and Synthetic Biology and DNA Computing.

Links:
http://www.redbiocom.es
http://www.p-lingua.org/wiki/index.php/Main_Page
First International School on Biomolecular and Biocellular Computing:
http://www.redbiocom.es/ISBBC/ISBBC11

Please contact:
Mario de Jesús Pérez Jiménez
University of Seville, Spain
E-mail: [email protected]

Alfonso Ortega de la Puente
Autonoma University of Madrid
E-mail: [email protected]

José M. Sempere
Polytechnic University of Valencia / SpaRCIM, Spain
E-mail: [email protected]

The Spanish Network on Biomolecular and Biocellular Computing: Bio-inspired Natural Computing in Spain
by Mario de Jesús Pérez Jiménez, Alfonso Ortega de la Puente and José M. Sempere

The Spanish Network on Biomolecular and Biocellular Computing (Redbiocom) is a consortium of seven Spanish research groups whose research activities focus on the bio-inspired approach to Natural Computing. The Network was founded in 2009 and was funded by the Spanish Ministry of Science and Innovation under the Complementary Action TIN2008-04487-E/TIN.

Figure 1: main research areas of Redbiocom.


One major characteristic of living systems is the distributed organization of their components. For this reason, our investigations concentrate on distributed bio-inspired computational models. We enhance the tools and techniques provided by formal language and automata theory with novel elements to obtain computational models which could help to provide a better understanding and a more precise description of the theoretical principles, the driving forces behind natural processes.

We are interested in both classical and unconventional characteristics of the obtained computational paradigms: their computational, descriptional, and communication complexity, their robustness, the patterns and dynamics of their behaviour, and the dependence of these properties on the structure, organization and functioning of the system.

In recent years, a significant amount of research has been devoted to membrane systems or P systems, constructs abstracted from the architecture and development of the living cell, first introduced by Gheorghe Paun in 1998. The basic variant of these systems consists of structures of hierarchically embedded membranes that delimit regions which contain objects. The objects correspond to molecules or chemical elements which evolve according to evolution rules, or move from region to region according to communication rules.

Re-considering the concept of automata, we have defined P automata: purely communicating accepting P systems which combine characteristics of both classical automata and distributed natural systems. P automata resemble classical automata in the sense that in each computational step they receive input from their environment, and this input influences their operation. What makes them different from classical automata is that the workspace they can use for computation is provided by the objects of the already consumed input. The objects which enter the system become, in a sense, part of the description of the machine; that is, the input, the object of computation, and the device which executes the computation cannot be separated. This is a feature that can usually also be observed if we interpret natural processes as "computations". Thus, in this sense, P automata can be considered a type of "natural automata".
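The defining property just described — the consumed input itself becoming the machine's workspace — can be illustrated with a toy acceptor. The sketch below only conveys the flavour of P automata; the capacity rule and all names are invented for illustration, not a faithful formal definition.

```python
# Toy "P automaton"-flavoured acceptor (illustrative sketch only): at each
# step the machine imports a multiset of objects from its environment, and
# its usable workspace is exactly the multiset of objects consumed so far —
# the input read and the machine's memory are one and the same.
from collections import Counter

def run(input_steps, accept):
    workspace = Counter("s")  # initial skin content: one start object 's'
    for imported in input_steps:
        # made-up capacity rule for the sketch: the machine cannot import
        # more objects than its current workspace can "host"
        if sum(imported.values()) > sum(workspace.values()):
            return False
        workspace += imported  # the consumed input becomes new workspace
    return accept(workspace)

steps = [Counter("a"), Counter("ab"), Counter("abb")]
print(run(steps, accept=lambda w: w["a"] == w["b"]))  # True
```

Note how the workspace grows with every imported multiset, so later steps can accept larger inputs — a crude analogue of the growing-workspace behaviour described above.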

We have characterized several classical language classes in terms of P automata. In cooperation with Oscar H. Ibarra, we determined variants which are alternatives to a Turing machine. One of these gives a "natural" representation of the class of context-sensitive languages, well known in classical formal language theory. In joint work with Jürgen Dassow, we have also provided a natural extension of the notion of regular languages and that of finite automata to the case of infinite alphabets.

In the future, we plan to perform investigations in the theory of P automata in several directions, to make the resemblances and the differences between classical automata and P automata more explicit.

Another important focus of our research is the study of networks of bio-inspired language processors. These constructs consist of bio-inspired rewriting systems (language processors) which operate on sets or multisets of words and are located at the nodes of a virtual graph. The processors use their operations to rewrite the collection of words at the nodes and then redistribute the


Special theme: Unconventional Computing Paradigms

Expanding Formal Language and Automata Theory: Bio-inspired Computational Models
by Erzsébet Csuhaj-Varjú and György Vaszil

One of the goals of research in bio-inspired computing is to create new and powerful computational paradigms based on the study of the structure and behaviour of living systems. The investigations can be based on different mathematical disciplines. The Theoretical Computer Science Research Group of SZTAKI aims to expand the classic fields of automata and formal languages to explore the limits of syntactical approaches in describing natural systems.

Figure 1: A network of bio-inspired language processors. The nodes process multisets of strings and communicate with each other.


resulting strings according to the communication protocol assigned to the system.

Due to their possible practical relevance, constructs based on operations inspired by the behaviour or properties of DNA strands are particularly important. One of the first models was the test tube systems based on splicing (our joint work with Lila Kari and Gheorghe Paun). Through this research we have shown that several other variants of these constructs offer alternatives to Turing machines even with minimal size: networks with components based on an operation that mimics the Watson-Crick complementarity phenomenon of DNA strands (in cooperation with Arto Salomaa), or with components based on splicing and filters inspired by the laboratory technique called gel electrophoresis (in cooperation with Sergey Verlan).

Recently, we have been studying networks of evolutionary processors, where the operations are formal language theoretic counterparts of point mutations in the evolution of the genome, i.e. insertion, deletion, and substitution. Our main future goals in the area include the exploration of the "limits" of these simple, basic operations, and the identification of the cornerstones and borderlines between decidability and undecidability. In cooperation with Artiom Alhazov, Carlos Martín-Vide, Victor Mitrana and Yurii Rogozhin, we have demonstrated that these distributed systems of very simple computing devices may possess the computational power of Turing machines, even with networks and components of very small size.
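The three point-mutation operations named above have straightforward formal-language counterparts. A minimal sketch (the helper names are ours, not from a specific formalization):

```python
# String counterparts of the three point mutations used by networks of
# evolutionary processors: insertion, deletion and substitution of a
# single symbol at position i.
def insert(word, i, symbol):
    return word[:i] + symbol + word[i:]

def delete(word, i):
    return word[:i] + word[i + 1:]

def substitute(word, i, symbol):
    return word[:i] + symbol + word[i + 1:]

# A node of the network applies its operation to the words it holds, then
# the resulting words are redistributed along the edges via filters.
w = "ab"
w = insert(w, 1, "c")      # "acb"
w = substitute(w, 2, "a")  # "aca"
w = delete(w, 0)           # "ca"
print(w)  # ca
```

In the actual model each node applies only one operation type and words travel between nodes subject to input/output filters; the point here is just how elementary the individual operations are.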

Via properties of changing string collections, in the framework of networks of bio-inspired language processors, population dynamics, behavioural patterns and motifs can also be studied; thus new directions for language-theoretic approaches can be opened. In the case of networks with deterministic Lindenmayer systems as components (models of developing systems), initial results have been obtained (in cooperation with Arto Salomaa). We plan to explore these topics in the future.
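The deterministic Lindenmayer systems mentioned as components rewrite every symbol of a word in parallel. A minimal example, using Lindenmayer's classic "algae" system:

```python
# One derivation step of a deterministic 0L system (D0L system): every
# symbol of the word is rewritten simultaneously by its production rule.
def d0l_step(word, rules):
    return "".join(rules.get(symbol, symbol) for symbol in word)

# Lindenmayer's algae system: A -> AB, B -> A
rules = {"A": "AB", "B": "A"}
word = "A"
for _ in range(4):
    word = d0l_step(word, rules)
print(word)  # ABAABABA
```

The word lengths grow along the Fibonacci sequence (1, 2, 3, 5, 8, ...), a simple illustration of how parallel rewriting models development.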

The group is in close cooperation with members of the European Molecular Computing Consortium; the investigations are supported by the Hungarian Scientific Research Fund, OTKA, Grant no. K75952.

Links:
P automata: http://www.scholarpedia.org/article/P_automata

P systems webpage: http://ppage.psystems.eu

Theoretical Computer Science Research Group: www.sztaki.hu/tcs

Please contact:
Erzsébet Csuhaj-Varjú
SZTAKI, Hungary
Tel: +36 1 279 6139
E-mail: [email protected]

György Vaszil
SZTAKI, Hungary
Tel: +36 1 279 6121
E-mail: [email protected]


Capturing Biological Frequency Control of Circadian Clocks by Reaction System Modularization
by Thomas Hinze, Christian Bodenstein, Ines Heiland and Stefan Schuster

Exploration of chronobiological systems is emerging as a growing research field within bioinformatics, focusing on various applications in medicine and agriculture. From a systems biology perspective, the question arises as to whether biological control systems for regulation of oscillatory signals utilize similar mechanisms to their technical counterparts. If so, modelling approaches adopted from building blocks can help to identify general components for frequency control in circadian clocks and can provide insight into mechanisms of clock synchronization to external stimuli like the daily rhythm of sunlight and darkness. Within the Research Initiative in Systems Biology funded by the German Ministry of Education and Research, we develop new methods to cover temporal aspects of biological information processing employing oscillatory signals.

Oscillatory signals play a major role in triggering time-dependent processes like melatonin and serotonin segregation to initiate tiredness and alertness. Biochemical core oscillators are simple devices that generate continuously running clock signals by periodically alternating substrate concentrations. Therefore, astonishingly small reaction networks comprising at least one feedback loop suffice to keep the system running. It is likely that numerous evolutionary origins led to oscillatory reaction networks in biological systems, while a number of independent attempts led to the technical construction of single clocks or clock generators. The situation becomes more complicated if several core oscillators start to interact in concert with environmental rhythmicities. The resulting biological systems are commonly driven to achieve synchronous behaviour, resulting in an evolutionary advantage for exploitation of environmental resources. Correspondingly, technical clock synchronization is frequently motivated by the need to follow a global time.

The way in which clock synchronization occurs differs between computer systems and biological systems. While in distributed computer systems algorithmic approaches predominate, biological systems adjust their clocks more gradually. This gradual adjustment




Figure 1: General scheme of a pure chemical frequency control system based on the concept of phase-locked loops (PLL). The upper part shows the coupling structure of the three essential modules: core oscillator, signal comparator, and low pass filter. Each module can be represented by numerous reaction networks. For instance, complex formation suffices for acting as a signal comparator, while a signalling cascade exemplifies a low pass filter. Different types of core oscillators complete the control circuit.

might include sequences of dedicated modifications in molecular structures or compartmental dynamics. Typically, its description is either based on reaction-diffusion kinetics or employs more abstract analytic techniques adopted from general systems theory. What these state-of-the-art approaches in systems biology have in common is that they both exploit systems of differential equations derived from kinetic laws of the underlying set of reactions and transportation processes. Coping with the complexity of those monolithic models is a challenging and error-prone task. Additionally, the modelling often coincides with some incomplete or imprecise information about the desired network topology and its kinetics. To overcome these insufficiencies, we suggest a specific concept of reaction system modularization inspired by engineering.

Our concept is based on the assumption that 'structure follows function'. Although there are many strategies to achieve a certain network function, the pool of sufficient network candidates can be divided into compositions of a low number of elementary functional units called modules. This term is not new in systems biology when considering recurrent motifs conserved in metabolic, cell signalling, and gene regulatory networks. We extend its notion in terms of information processing: in this context, a module is able to fulfil an elementary computational task. Here, the spatio-temporal course of substrate concentrations along with molecular and compartmental structures acts as data carrier. Beyond logic and arithmetic functions carried out by the module's steady-state behaviour, simple buffer and delay elements contribute to a collection of biochemical modules, each of which comprises a few molecular species and reactions. When combining modules to form reaction networks capable of a more complex functionality, we permit shared molecular species among distinct modules. This way of coupling enables compact network topologies in accordance with observations from in-vivo studies. Moreover, there is no need for further interface specifications. In most cases, the module behaviour can be captured by chemical counterparts of transfer functions, significantly reducing the number of distinct parameters while keeping the relevant characteristics of the entire network.

A case study of our modularization concept addresses biological frequency control of circadian clocks: biochemical regulatory circuits resembling phase-locked loops. Corresponding circuits comprise three modules:
• a core oscillator whose frequency is controlled to adapt to an external stimulus. The intensity and periodicity of environmental light, converted into specific proteins by a photo cascade, represents a typical stimulus
• a signal comparator responsible for determining the deviation between the signals produced by the core oscillator and by the external stimulus. It carries out an arithmetic task. Interestingly, a single complex formation conducting a mathematical multiplication succeeds here, whereas more sophisticated mechanisms could be involved instead



• a biochemical low pass filter providing a global feedback loop. It converts the signal comparator output into a delayed and damped tuning signal which is used by the core oscillator to adjust its frequency. A signalling cascade gives an example. Its behaviour can be specified by a Bode diagram, which depicts the intensity of signal weakening subject to different frequencies.

The core oscillator must be able to vary its frequency according to the tuning signal produced by the low pass filter. There are numerous types of core oscillators found in circadian clocks. The majority are of the Goodwin type, a cyclic gene regulatory network composed of mutually activating and inhibiting gene expressions. The most effective way to influence its frequency is modification of protein degradation rates. Furthermore, core oscillators can be of post-translational type, exploiting a cyclic scheme of protein phosphorylation, complex formation, or decomposition. Here, the involved catalysts affect the frequency. The third type of core oscillators includes compartmental dynamics, advantageously modelled using membrane systems combining a representation of dynamical structures with tracing of their spatio-temporal behaviour.
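The three-module scheme described above can be mimicked numerically. The following is our own toy analogue of a phase-locked loop, not the authors' reaction-network model: all constants are invented, the comparator is a plain product of the two signals (as in the complex-formation analogy), and the low pass filter is first order.

```python
# Toy phase-locked-loop sketch mirroring the three modules above:
# an adjustable core oscillator, a comparator that multiplies the two
# signals, and a first-order low-pass filter whose output tunes the
# oscillator frequency. All rate constants are invented for illustration.
import math

dt = 0.01
freq, target = 1.0, 1.1   # free-running vs. stimulus frequency (Hz)
gain, tau = 1.0, 2.0      # loop gain and filter time constant (made up)
phase, filt = 0.0, 0.0

for step in range(200_000):                         # 2000 s of model time
    t = step * dt
    stimulus = math.sin(2 * math.pi * target * t)   # external rhythm
    core = math.cos(2 * math.pi * phase)            # core oscillator output
    error = stimulus * core                         # comparator: a product
    filt += dt * (error - filt) / tau               # low-pass filtering
    freq_now = freq + gain * filt                   # tuned frequency
    phase += dt * freq_now                          # oscillator advances

print(round(freq_now, 2))  # ≈ 1.1: the oscillator has locked to the stimulus
```

The product of the two near-sinusoidal signals contains a slow component proportional to the phase error; the filter removes the fast sum-frequency component and the remaining slow signal steers the oscillator frequency toward the stimulus, just as the tuning signal does in the biochemical circuit.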

Link:
http://pinguin.biologie.uni-jena.de/bioinformatik/Forschung/index.html#Project_2

Please contact:
Thomas Hinze
Friedrich Schiller University Jena, Germany
Tel: +49 3641 9 46463
E-mail: [email protected]

Engineering with Biological Systems
by Fernando Arroyo, Sandra Gómez and Pedro Marijuán

Within the Natural Computing Group GCN we address two working perspectives: on the one hand, we propose new bioinspired computational models and architectures, and on the other, we propose computational techniques and user-friendly tools to support the advancements in synthetic and systems biology. An in-depth reflection on the distinctive nature of biological information is necessary in both directions; our group has actually established several research lines and projects in the "informational" confluence of the two working perspectives.

From the point of view of bioinspired computational models, we intend to develop mathematical and computational models abstracted from the information processing that is present in all living cellular systems. The emphasis is at the cellular level, and the modelling methods capture the internal structure of the cell and the cellular interactions in an appropriate way such that they allow for the description of what are essentially massively parallel computational systems. We study the problem-solving capabilities inherent in the living cell, which are realized by maintaining a special, "meaningful" relationship with the internal/external environment, integrating the self-reproduction processes within the information flow of incoming and outgoing signals. The designs encoded by natural systems are optimized by evolution to provide solutions to physical problems posed by both individual development and continued viability across many environments and evolution.

From a computational point of view, the living cell may be viewed as a DNA-based molecular computer, endowed with an enzyme-protein operating system, which is coded into the DNA memory bank. We are attracted by the new possibilities for computing offered by these self-organizing, interlinked, and evolvable molecular systems: practically unbounded memory and a huge parallelism which might be used for solving intractable problems in a reasonable time. Furthermore, these molecules participate in very complex networks in cellular subsystems, including regulatory networks for gene expression, intracellular metabolic networks, and both intra- and intercellular communication networks. Subsequently, the new biocomputing ideas extracted from these core subsystems have potential to be applied not only to biological-cellular instances, but also to the modelling of interactive, self-organizing processes in many other fields: population interactions, ecological trophic networks, industrial ecosystems and companies, virtual economies, social and cultural dynamics, etc. We are sure the results derived from the advancement of the bioinformational research programme will open new avenues for information science and technology, and will be conducive to a paradigm shift in the way bioinspired technologies are conceived and applied.

From the point of view of the computational techniques and tools to support synthetic and systems biology, we aim at the realization of simulations and tools that are user-friendly and can support the study, analysis, and extraction of further information in an intelligent and meaningful way. Both molecular dynamics and cellular systemic interactions will be considered. With regard to molecular dynamics, we specifically attempt an analysis of prion propagation and protein aggregation phenomena (based on altered processes of molecular recognition), with the aim of helping researchers in their studies of the detection and treatment of neurodegenerative disorders. With regard to cell system interactions, we attempt a consistent approach to the multiple varieties of information in the living cell, also by starting out from the conceptualization of molecular recognition phenomena. Subsequently, an elementary approach to the "informational architectures" behind cellular complexity may be charted. In the interplay of the different informational architectures, two crucial cellular subsystems should be highlighted: on the one side, the transcriptional regulatory network, and on the other, the cellular signalling system in charge of the interrelationship with the environment, and how the topological governing of




this network is deployed by that signalling system. Models and simulations of this basic interaction will be applied to Mycobacterium tuberculosis signalling in living tissues (in cooperation with an ongoing medical research project). More generally, as suggested above, the embodiment of functional protein agents and the peculiar handling of DNA sequences along the evolutionary process suggest a parallel with the von Neumann scheme of modern computers, including the cellular capability to "rewrite the DNA rules" along ontogenetic development. In these theoretical and applied research areas we collaborate with biomolecular research groups at the Complutense University and the Biological Research Center of Madrid, and with the Bioinformation Group of the Aragon Institute of Health Sciences of Spain, respectively.

Links:
http://www.gcn.upm.es
http://sites.google.com/site/pedrocmarijuan/

Please contact:
Fernando Arroyo Montoro
Universidad Politécnica de Madrid, Spain
E-mail: [email protected], [email protected]

Artificial Wet Neuronal Networks from Compartmentalised Excitable Chemical Media
by Gerd Gruenert, Peter Dittrich and Klaus-Peter Zauner

Following the analogy of biological neurons, the NEUNEU (Artificial Wet Neuronal Networks from Compartmentalized Excitable Chemical Media) project aims at using droplets of a chemical reaction medium to process information. In this EU project, the excitable chemical Belousov-Zhabotinsky medium is packaged into small lipid-coated droplets which form the elementary components of a liquid information processing medium. Droplets communicate through chemical signals, and excitation propagates from droplet to droplet. Among the future applications of molecular information processors is their potential for fine-grained control of chemical reaction processes.

Our modern society and technology are very much dependent on smart automatic control devices, from cell phones and electronic driver assistance systems in cars to computer-controlled manufacturing processes. In the area of chemical production processes, however, the level of control achieved so far is crude compared with what is seen in nature. Living cells craft molecules one by one, with quality control directly at the assembly stage of each molecule. In contrast, chemical technology controls reaction conditions only at the level of bulk materials, and subsequently the desired product needs to be extracted from the mixture of substances formed in the reaction process. As a consequence, the complexity of molecules and the concomitant functionality of materials that can be manufactured today fall far short of those seen in nature.

Attempts to achieve increasingly finer control of chemical production processes currently face the challenge of bridging from electronic control technology to the surfaces that mediate chemical reactions. In a different approach, the NEUNEU project aims to include computational capabilities directly in the media themselves, migrating the control from a conventional computer outside the system of interest directly into the system itself, and eliminating the need for a complicated interface.

Another interesting application of molecular information technology is the potential for smart drugs, capable of

Figure 1: From left: (A) Belousov-Zhabotinsky activation waves propagating in 1 to 10 millimetre sized lipid-coated droplets in oil (Credit: Gorecki Lab, Warsaw). (B) Manually aligned droplets of BZ-reaction medium (Credit: Josephine Corsi, Univ. of Southampton).



performing elementary logical decisions to determine whether the right position in space and time is reached to release certain drugs. This would not only make the use of drugs more convenient for the patient but also lead to more effective and reliable medications.

Before information technology can be tightly interwoven with chemical and biochemical systems, however, it will be necessary to find out where the concepts of conventional information processing can be mapped into the chemical realm and where new paradigms need to be developed. The NEUNEU project decided to explore this path with a minimal system that exhibits the properties (signal transmission, signal gain, self-repair) desirable for a molecular information processing architecture that harbours the potential to scale to applications.

To explore the concept of performing computations with excitable media, we use the Belousov-Zhabotinsky (BZ) medium. The BZ reaction was discovered in the 1950s to be an oscillating reaction, existing far from thermodynamic equilibrium. Though there are variations, the BZ mixture can be prepared from potassium bromide, malonic acid, sodium bromate, sulfuric acid and ferroin as redox indicator.

When the system is oscillating, the indicator continuously switches between its oxidized and reduced forms, changing its colour between blue and red or between red and white, as seen in Figure 1. Stirring the system leads the whole mixture to exhibit similar behaviour in all parts of the test tube. In contrast, when the mixture is left relatively undisturbed in a flat petri dish, the BZ medium generates beautiful two-dimensional wave patterns.

Because we want to control the signals that are transmitted through the chemical medium, continuous excitations are not desirable in this project. Here, the BZ medium is mixed in proportions that do not allow the spontaneous start of an oscillation. Instead, the excitation needs to be triggered from the outside (eg, by a silver wire) or by an already existing excitation wave. Hence we call the state of the medium "excitable" or "sub-excitatory". When the medium is excitable, a wave propagates equally in every direction, so the state of a droplet can be simplified by assuming a well-stirred reactor for each droplet.

Alternatively, using the "sub-excitatory" condition, we can also include logic elements inside single large drops. In this case, waves do not spread in every direction, but preserve their shape and direction, as illustrated in Figure 3A.

When we drop water containing BZ medium into oil, the phases do not merge and droplets form automatically. We are going to use microfluidics to produce droplets of smaller volumes, with higher precision, and in large quantities (Figure 2). Wrapping up the BZ reaction in these small droplets has the advantage of discretizing the otherwise continuously propagating excitations into precise units.

Lipid molecules contained in the oil phase self-assemble at the surface of the droplets and stabilize the droplets against merging. Living cells are also

Figure 2a, 2b: Examples of microfluidic structures for generation and handling of droplets.

Figure 2c: Generation of a droplet within a microfluidic structure (not visible).




covered with lipids, but in this case the lipids assemble between two aqueous compartments and therefore form the double layer typical of biomembranes. The amphiphilic lipid molecules assembling at the border between the oil and the "water with BZ" phase produce a lipid monolayer in our system.

When two of the droplets come into contact, the lipid monolayers coating each droplet can form a double layer similar to a biological membrane. As a consequence, it is possible to insert membrane channel proteins at the interface between two droplets to facilitate the exchange of chemical compounds and, consequently, the propagation of the BZ medium excitation from droplet to droplet.

By varying the coupling between droplets, the droplet radii, and the BZ medium compositions, we will generate droplets with a wide variety of properties, such as excitability, oscillation phase length or refractory times. Controlling these properties should allow us to engineer droplet networks with computational capabilities, as for example shown in Figure 3B.

The NEUNEU project has been funded by the EU Seventh Framework Programme for three years (FP7, ICT, FET, see link below) since February 2010. It brings together biochemical experimentalists as well as theoretical computer scientists. In the teams of Maurits de Planque and Klaus-Peter Zauner (University of Southampton) and in Jerzy Gorecki's lab (Institute of Physical Chemistry, Polish Academy of Sciences, Warsaw), the physics and chemistry of small lipid droplets and of the Belousov-Zhabotinsky reaction are investigated, both theoretically and practically. The theoretical foundations, implications and possibilities of droplet-based computing devices are elaborated in the research groups of Andy Adamatzky (University of the West of England, Bristol) and Peter Dittrich (Bio Systems Analysis Group, Friedrich Schiller University Jena).

For further recent activities in European BioChem IT research, see the COBRA initiative (link below).

Links:
http://www.neu-n.eu/
http://uncomp.uwe.ac.uk/holley/sim/sim.html
http://www.cobra-project.eu
http://cordis.europa.eu/fp7/ict/programme/fet_en.html

Please contact:
Peter Dittrich
Friedrich Schiller University Jena
Tel.: +49 3641 946460
E-mail: [email protected]

Figure 3a (left): Simulation of directed waves in sub-excitable medium within a network of droplets with two different inputs a, b and output c (image by Julian Holley, University of the West of England; for animation see link below). Figure 3b: Stochastic 3-D simulation of a droplet network counting the number of activated inputs (image by Gerd Gruenert, FSU Jena).


Computing resources in modern society, especially in Europe, are widely available and range from fixed resources (such as desktop computers, servers, game consoles, infrastructure systems), to mobile resources (such as portable computers, smartphones, car navigation systems), to virtual resources (also called cloud-based resources). Computer science is taking notice of these new technological developments and has started adapting its central computing paradigms from centralized, pre-programmed solutions to decentralized, emerging solutions able to take advantage of the variety of resources that may be available during computations.

The goal of this new approach is to make possible environments where computing resources of many types are smoothly integrated into on-going computations, even if each resource is only sporadically available. An example of such an environment is that of a future smart transportation system, where each car has its own goals (such as reaching its destination given a number of constraints: some compulsory intermediate route points, a maximum duration of the trip, a maximum energy consumption, and others), and it interacts with its environment to reach its goal. The interaction may involve local communication with other cars to gather and distribute information about the traffic situation (such as traffic jams, accidents, alternative routes) and with the transportation infrastructure itself on aspects such as road repairs, speed limits and weather predictions. Collective decentralized decisions may be taken ad-hoc in this environment, such as collective actions to avoid imminent collisions or growing traffic jams, involving changes of the constraints, which in turn will affect the goals of each car and the decisions they make subsequently. The system has no central server to oversee the computation; rather, it is self-organizing and attempts to self-optimize on the basis of brief ad-hoc interactions between the actors and on decentralized decision making.

Robustness in decentralized systems with a self-organizing, emerging architecture cannot be achieved through top-down structural designs that include, as in “classical computing”, well-specified control and pre-programmed responses to various (fixed) failure scenarios, typically based on a fixed network architecture. Instead, the focus is on dynamic activation of computing resources, dynamic decision making, and self-adaptation, all emerging from processors of weak (non-universal) computing power and of sporadic availability. Biology serves here as an ideal source of inspiration. Biological robustness is typically implemented through mechanisms such as multiple feedback loops, alternative pathways and function redundancy, taking advantage of an environment where many different types of particle are constantly present. Cellular robustness manifests through self-repair mechanisms, the ability to mount efficient responses to changes in the environment, tolerance for broad parameter spaces, multiple response pathways, etc. We work on taking these lessons from computational systems biology and applying them to the design of novel computing paradigms based on large numbers of processors, each having limited power and sporadic availability. Indeed, emerging computing platforms, such as Internet-based computing or cloud computing, offer a perspective where high computing power can be made available on demand. We aim to expand beyond the typical distributed computing paradigm, where the network architecture and the communication mechanisms are fixed and each processor is capable of universal computations, towards a paradigm where processors of limited power are able to engage in ad-hoc communication and cooperation with any available partner, and where the computing power engaged in a specific task can be increased or decreased dynamically depending on the evolution of the computation. The inspiration again comes from biology, where enzymes, catalysts, transcription factors, etc. can be made available on demand in large quantities, and then released for other tasks once they are no longer needed.

This research is carried out in collaboration with Diego Liberati (CNR, Italy), Alessandro Villa (University of Lausanne, Switzerland), Tatiana Guy (Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic), and Jüri Vain (Tallinn University of Technology). Other parts of the project focus on bio-inspired cooperation and interaction for decentralized decision making, brain-inspired evolvable networks for decision making, and on applications to robot swarm systems. The combined, multi-disciplinary expertise of the consortium members, spanning computer science, biomedicine, information systems, and neuroscience, makes the enterprise possible.

Please contact:
Giancarlo Mauri
Università di Milano-Bicocca, Italy
E-mail: [email protected]
http://www.bio.disco.unimib.it/~mauri/

Ion Petre
Åbo Akademi University, Turku, Finland
E-mail: [email protected]
http://www.users.abo.fi/ipetre/

New Robustness Paradigms: from Nature to Computing
by Giancarlo Mauri and Ion Petre

In the intracellular and the intercellular environment we observe architectural designs and dynamical interactions that are very different to those in human-designed systems: particles that are partly transported and partly move chaotically; fierce competition for resources; a vital need to respond fast, yet efficiently, to external stimuli; etc. Many details about the system-level motifs responsible for robustness, performance, and efficiency in living cells have been discovered in the last decade within computational systems biology. We are currently working on applying the lessons learned from the robust organization, functioning, and communication strategies of living cells to computing, and on integrating them into the design of a novel computing paradigm.


The World Wide Web, as we have used it in the last two decades, i.e., as a communication platform, seems to have come to an end. Applications have invaded the virtual communication world. We are no longer just surfing; we are accessing a service through an application. This shift of paradigm is experienced, for instance, when you connect your smart phone to your favourite application store. The Internet as a computing platform has become an ecosystem comprising a myriad of heterogeneous and autonomous services encapsulating IT components such as storage space, computing power, software or sensors. To be able to leverage such computing power, one needs to compose, coordinate and adapt such services dynamically, according to users' requests. This virtual ecosystem shares some similarities with chemistry concepts: users, services and pieces of information can be seen as molecules interacting freely in the great WWW chemical solution, sometimes meeting to react and produce new information.

The chemical model naturally expresses dynamics, parallelism and decentralization. Within this model, a program is seen as a solution of interacting molecules, reacting together according to some chemical laws (or rules). These reactions take place in an implicitly parallel, non-deterministic, autonomous and distributed manner. As a simple example, let us consider the problem of finding the maximum value among a set of integers. Imagine a set of molecules representing these integers in a solution. Now, introduce a rule specifying how two integers that meet should react: they are consumed in one reaction that produces a new molecule with the highest value of the two initial integers. Step by step, in parallel, reactions will make the solution converge to its inertia: only one molecule, with the highest value, remains in the solution. The HOCL language (Higher-Order Chemical Language), which the team developed in 2007, provides a unified vision of chemical programs and data: every virtual entity is a molecule floating in the solution and interacting with others. In particular, rules can modify other rules: programs modifying programs. This higher-order property gives a means to express dynamic coordination and adaptation. An HOCL interpreter has been developed that will be the core component of

Towards a Chemistry-Inspired Middleware to Program the Internet of Services
by Jean-Louis Pazat, Thierry Priol and Cédric Tedeschi

The Internet of Services has emerged as a global computing platform, where a myriad of services and users are interacting autonomously. Being able to compose, coordinate and adapt such services on a large scale constitutes a new challenge in the distributed computing area. Nature-inspired models have recently regained momentum in this context, exhibiting properties that might assist with programming such a platform. Two years ago, at the IRISA/INRIA Rennes Bretagne Atlantique Research Centre, in Brittany, France, the Myriads team started a research activity aimed at developing the next generation of middleware systems able to fully leverage such computational capabilities, based on a chemical model.

The Higher-Order Chemical Language developed by the “Myriads” team provides a unified vision of chemical programs and data.


our on-going work: the development of a middleware prototype exploiting HOCL properties.
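The max-finding example described earlier can be sketched as naive multiset rewriting in a few lines. This is an illustrative simulation of the chemical execution model, not HOCL itself: molecules are picked non-deterministically, react, and the product is released back into the solution until it becomes inert:

```python
import random

def chemical_max(solution, rule):
    """Repeatedly apply a two-molecule reaction rule to a multiset
    until only one molecule is left (the solution is then inert)."""
    solution = list(solution)
    while len(solution) > 1:
        # pick two molecules non-deterministically and let them react
        i, j = random.sample(range(len(solution)), 2)
        product = rule(solution[i], solution[j])
        # both reactants are consumed; the product is released
        solution = [m for k, m in enumerate(solution) if k not in (i, j)]
        solution.append(product)
    return solution

# rule: two integers react and produce the larger of the two
print(chemical_max([4, 17, 3, 8], max))  # [17]
```

Whatever order the reactions fire in, the inert solution always contains exactly the maximum, which is what makes the model naturally parallel and decentralized.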

Our project consists of two main research tracks. In the first, following a top-down approach, we leverage HOCL properties to express the decentralized and dynamic coordination and adaptation of services. A composition of services is usually expressed through a workflow, i.e., the specification of data and control dependencies between services to achieve a given global task. Based on the chemical representation of the workflow, services equipped with an HOCL engine coordinate themselves autonomously at runtime, without any central coordinator or human intervention. We have recently shown that any complex workflow pattern can be expressed and executed in a decentralized fashion, by defining the relevant HOCL rules, composing them and distributing them among the engines of participating services. Services are similarly capable of self-adaptation to the actual conditions of the platforms. These areas of research are currently being concretized by the development of the upper layer of our chemical middleware. In this series of works, the team collaborates in Europe with CNR in Naples, Italy and SZTAKI in Budapest, Hungary. Two of the team's PhD students, Héctor Fernandez and Chen Wang, are focusing on these aspects.

The second research track is bottom-up and aims at making the whole a reality, by constructing the low layer of this software. Here, the main challenge is to implement a distributed chemical machine, capable of making the needed reactions happen even when molecules are distributed over the whole Web, i.e., on an infinitely large and unreliable platform. This raises many new theoretical and practical challenges, such as searching efficiently for several reactives atomically, synchronizing the reactives at large scale, or detecting the state of inertia in a distributed fashion. Marko Obrovac, a PhD student in the team, works in this track, in collaboration with the LIP6 laboratory, hosted by the University Pierre et Marie Curie, in Paris.

Altogether, these building blocks will compose the next generation of middleware systems, able to fully leverage the Internet of Services, thanks to a programming model able to express its internal complex behaviour.

Link:
Myriads team: http://www.irisa.fr/myriads/

Please contact:
Cédric Tedeschi
INRIA, France
Tel: +33 4 99 84 73 05
E-mail: [email protected]


Agreement Technologies (AT) and Agreement Computing refer to methods, techniques and tools for the development and deployment of next-generation intelligent information and software systems. These terms describe certain large-scale open distributed systems whose semi-autonomous elements interact with each other based on the notion of agreement – where these elements, often called agents, negotiate to reach both their individual and system-wide goals.

We have developed the Shifter approach as part of the AT sandbox; it provides a system with the capability of self-adaptation, ie the ability to adapt its behaviour to changes in the environment. This is a very relevant and powerful feature, but unfortunately it is also very complex and difficult to obtain by usual computational means. Just like agreements themselves, adaptations cannot be based on a closed predefined strategy – they must be emergent. As a result, the architecture of the system must itself be emergent, so we conclude that it defines a self-organizing system.

In the context of AT, the central idea is to use intelligent-based technologies to provide powerful features to conventional applications. Therefore the approach starts with a classic AI-based technology, that of multi-agent systems (MAS). However, agents are only the basic foundation – the system is set up to hide these details from the final user. In fact, the functionality is presented in the form of services.

The management of MAS is essentially a coordination problem, and several researchers have provided an enhanced approach to that problem including the notion of organizations. These are essentially structured groups of agents, which join together to achieve or maintain some long-term goal by applying a set of coordination mechanisms. In this respect, a core feature in open systems is the ability to change between coordination strategies on-the-fly, so as to successfully adapt the behaviour of the system and its elements to changing situations.

The Shifter approach takes inspiration from biological systems – and in particular from cellular systems – to go beyond that setting. The core of the comparison is to consider any agent (our shifter) as a stem cell: a powerful

Self-Organising Adaptive Structures: the Shifter Experience
by Carlos E. Cuesta, J. Santiago Pérez-Sotelo and Sascha Ossowski

Self-Adaptation (ie the ability to react to changes in the environment) is becoming a basic architectural concern for complex computational systems. However, it is very difficult to provide this feature by conventional means. Inspired by the behaviour of biological systems (and stem cells in particular) we define a new approach to provide multi-agent systems with a dynamic structure, effectively transforming them into self-organizing architectures.


computational unit, able to perform a concrete functionality once it has been instructed to do so. Just like a stem cell, in the beginning an agent is a neutral module, capable of transforming into whatever the system requires; once it has acquired a function, it turns into a conventional cell.

We are specifically interested in the emergence of structures – ie the equivalent of the formation of biological tissues. Just as stem cells acquire their functions by contacting the cells of a certain tissue, agents learn by interacting with other agents.

Figure 1 shows the lifecycle of our self-organizing structures. The cycle begins with a single agent, capable of performing certain interactions and with the potential of exporting some services. This agent reaches the system, and initially it does not belong to any organization. However, it finds a number of predefined controls (limiting what it is able to do) and protocols (enabling it to interact with other entities). These elements “guide” the agent's interaction and enable it to maintain structured conversations with other agents, composing informal groups of agents.

At some point, an external change occurs, and the system must react with an adaptive behaviour. Of course, this is the functionality that must trigger the formation of our self-organizing structures. To be able to achieve the desired reaction, the system is provided with a number of adaptation patterns. These are neither closed strategies nor full descriptions of a reactive behaviour, but partial definitions of elements and relationships, which include enough information for an agent to learn how to perform a certain fine-grained, but relevant, behaviour. Therefore, under the influence of an adaptation pattern, certain agents within the group acquire specific functions, and begin to form an actual structure: this is what we call an initiative.

The analogy is particularly relevant here: if the process continues with no additional control, cells continue growing and evolve into cancer – however, metastasis is avoided by reacting to existing inhibitors. Similarly, when the initiative is already fulfilling all the required functions – every service is being served by some agent – our inhibitors transform the initiative into a full organization.
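The lifecycle just described can be condensed into a small sketch. The function name, the role names and the representation of agents and patterns below are hypothetical illustrations, not the Shifter implementation: neutral agents acquire roles listed by an adaptation pattern, and the inhibitor step promotes the initiative to an organization only once every service is served:

```python
# Illustrative sketch of the initiative lifecycle: neutral ("stem-cell")
# agents acquire roles from an adaptation pattern; when every required
# service is covered, the inhibitor freezes the initiative into an
# organization. All names here are hypothetical.
def form_organization(agents, pattern):
    initiative = {}
    for service in pattern:        # the adaptation pattern lists the roles
        if not agents:
            return None            # not enough agents: still an initiative
        agent = agents.pop()       # a neutral agent acquires this role
        initiative[service] = agent
    return initiative              # inhibitor fires: full organization

org = form_organization(["a1", "a2", "a3"], ["sense", "plan"])
print(org)  # {'sense': 'a3', 'plan': 'a2'}
```

Returning `None` for an under-staffed group mirrors the distinction in the text between an informal initiative and a full organization.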

These organizations are adaptive by formation, and are themselves able to evolve and participate in larger agreements – which could trigger the formation of yet another composite organization. In summary, the process guarantees that the result is indeed an adaptive agreement architecture.

Ongoing research exploits and develops this infrastructure by providing new adaptation patterns and examining their consequences in real case studies.

Links:
Agreement Technologies (CONSOLIDER Project): http://www.agreement-technologies.org
COST Action IC0801 (Agreement Technologies): http://www.agreeement-technologies.eu

Please contact:
Carlos E. Cuesta, J. Santiago Pérez-Sotelo, Sascha Ossowski
Rey Juan Carlos University, Madrid, Spain
E-mail: [email protected], [email protected], [email protected]

Figure 1: Lifecycle of a self-organizing structure – from a single agent to a full organization.


Alchemy of Services
by Claudia Di Napoli, Maurizio Giordano and Zsolt Németh

What do chemical reactions and service compositions have in common? Not much at first sight. But if we imagine the composition of services as a self-evolving autonomic process then they show notable similarities to the way chemical reactions occur. Taking reactions as a metaphor, there is an ongoing effort to discover the power of chemical modelling studied within the framework of a service composition problem.

The Internet of Services (IoS) will change the way enterprises do business, since it will embrace not just information and content but also services and real-world objects. In alignment with the paramount importance of the service-oriented computing paradigm in shaping the future, the “Software Services and Systems Network” (S-Cube, FP7/2007-2013 under grant agreement 215483) is aimed at establishing a unified, multidisciplinary research community and shaping the software-service-based Internet which will underpin the whole of our future society. S-Cube integrates three research communities: Software Engineering, Grid/Cloud Computing and Business Process Management, attempting to integrate research traditionally fragmented into three separate layers: service infrastructure, service composition & coordination, and business process management. The overall emphasis is on novel service functionalities, especially adaptation and pro-active techniques, methodologies for engineering and adapting service-based systems, and end-to-end quality provisioning and SLA compliance.

According to the S-Cube vision, the service-oriented computing paradigm enables service providers to dynamically integrate services in a large-scale and heterogeneous environment, whereas service requesters can adjust and compose services according to their actual needs. Thus, Service Based Applications (SBAs) are composed of a number of possibly independent services, available in a network, which perform the desired functionalities. It is likely that several service providers can provide the same functionality under different conditions referring to non-functional characteristics of a provided service, like price, time to deliver, and so on. These may change in time depending on provider policies, and as such they cannot be advertised together with the service description nor planned in the composition design. It therefore becomes necessary to organize compositions of services on demand in response to dynamic requirements and circumstances.

Within the S-Cube research activity we modelled the problem of selecting the services necessary for delivering an SBA required by a user as a nature metaphor. In our scenario an SBA request is an abstract workflow (AW) specifying the functionality of each component, the dependence constraints among them, and some user-preferred non-functional characteristics of the whole composition. Providers able to provide the required AW functionality issue service offers that specify a service instance together with the values of the non-functional characteristics it can provide the service with. Several offers for the same functionality can be available, and additional offers that should be taken into account can become available during the selection process.

We envision a chemical solution in a flask where both activities of the AW (service requests) and all service offers

Figure 1: The chemical-based service instantiation process.


The main objective of the ongoing BACTOCOM project is to build a simple computing device, using bacteria rather than silicon. Microbes may be thought of as biological "micro-machines" that process information about their own state and the world around them. By sensing their environment, certain bacteria are able to move in response to chemical signals, allowing them to seek out food, for example. They can also communicate with other bacteria by leaving chemical trails or by directly exchanging genetic information. We focus on this latter mechanism.

Parts of the internal "program" of a bacterial cell (encoded by its genes and the connections between them) may be "reprogrammed" in order to persuade it to perform human-defined tasks. By introducing artificial "circuits" made up of genetic components, we may add new behaviours or modify existing


BACTOCOM: Bacterial Computing with Engineered Populations
by Martyn Amos and the BACTOCOM consortium

Various natural computing paradigms inspired by biological processes, such as artificial neural networks, genetic algorithms and ant colony algorithms, have proved to be very effective and successful. However, we can go one step further: instead of developing computing systems inspired by biological processes, we can now directly use biological substrates and biological processes to encode, store and manipulate information. Early implementations of bio-computing used DNA as a miniaturised “storage medium”, but we go beyond this, to harness the “biological nanotechnology” available inside every living cell.

are molecules. Aggregation constraints and non-functional user requirements on services are expressed as chemical properties, i.e. chemical reaction rules. We use the chemical analogy as a modelling paradigm, and our approach differs significantly from computational chemistry, which faithfully simulates chemical reactions. The chemical metaphor can be expressed in the gamma-calculus, a declarative formalism developed for capturing the notion of concurrent, indeterministic multiset rewriting. We implemented our model in the Higher Order Chemical Language (HOCL), a derivative of the gamma-calculus.

The proposed approach allows the capture of service selection in the context of SBAs in terms of local reaction rules that yield aggregated molecules. Once an aggregation of services is computed, the chemical solution reaches an inert state, but it can be re-activated when new offers are available or already existing offers are changed. New offers are new molecules that may form a new aggregation or alter existing molecules, so that the system can exhibit an adaptive behaviour. Similarly, control can be added in the form of molecules, something similar to catalysts and inhibitors. All depends on how the service offers and needs are translated into chemical properties (chemical aggressiveness).

This model does not rely on any centralized and/or coordinated control, or on any predefined sequences or patterns. The process just follows some (high-level) policies in the same way that reactions obey the general laws of nature. With this greater flexibility and dynamicity we expect that the behaviour of molecules, together with the overall constraints of the aggregation, may enable adaptation of the system to different configurations that are not planned in advance. Applying the chemical metaphor to the problem of selecting service instances allows the service selection process to be modelled as an “evolving” and always-running mechanism that can adapt to environmental changes as they occur, so providing adaptability to changes in non-functional characteristics.
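The selection-as-reaction idea can be sketched with a minimal multiset model. The "lower price wins" rule and the tuple encoding of offers are illustrative assumptions (any non-functional criterion could play this role): offers for the same functionality react pairwise, the preferred one survives, and adding a new offer simply re-activates the inert solution:

```python
# Minimal sketch of chemical service selection: each offer is a molecule
# (functionality, price); offers for the same functionality react and the
# cheaper one survives. The price criterion is an illustrative assumption.
def react(solution):
    """Apply 'keep the cheaper offer' reactions until the solution is inert."""
    best = {}
    for functionality, price in solution:
        if functionality not in best or price < best[functionality]:
            best[functionality] = price
    return [(f, p) for f, p in best.items()]   # the inert solution

offers = [("storage", 5), ("storage", 3), ("compute", 7)]
print(react(offers))                           # [('storage', 3), ('compute', 7)]

# a new, better offer arriving later re-activates the solution:
print(react(react(offers) + [("compute", 4)])) # [('storage', 3), ('compute', 4)]
```

The second call illustrates the re-activation described above: the inert aggregation is not final, it is simply the state reached until new molecules arrive.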

Our aim is to further investigate the proposed approach in order to answer the following questions: is it possible to model the problem of finding an optimal solution to the service composition problem just as a chemical reaction would reach minimum energy? Is it possible to establish the chemical properties so that the composition procedure evolves into a well-defined and desired state? We envision that a chemical-based approach for selecting services to form an SBA is a viable approach to answering these questions. Furthermore, the chemical-based instantiation process may run concurrently with a workflow enactment engine, thus providing the possibility to interleave the selection process with the enactment. In this way it is possible to consider new offers, as soon as they appear in the system, that may produce sub-parts of instantiated workflows (partially instantiated workflows), providing missing service instances that were not available before the execution started.

Link:
http://www.s-cube-network.eu/

Please contact:
Claudia Di Napoli, Maurizio Giordano
Istituto di Cibernetica “E. Caianiello” - CNR, Italy
E-mail: [email protected], [email protected]

Zsolt Németh
SZTAKI, Hungary
E-mail: [email protected]


functionality within the cell. Existing examples of this include a bacterial oscillator, which causes the cells to periodically flash, and cell-based pollution detectors that can spot arsenic in drinking water. The potential for bio-engineering is huge, but the process itself is made difficult by the noisy, "messy" nature of the underlying material. Bacteria are hard to engineer, as they rarely conform to the traditional model of a computer or device with well-defined components laid out in a fixed design.

We intend to use the inherent randomness of natural processes to our advantage, by harnessing it as a framework for biological engineering. We start with a large number of simple DNA-based components taken from a well-understood toolbox, which may be pieced together inside the cell to form new genetic programs. A population of bacteria then absorbs these components, which may (or may not) affect the behaviour of the cells. Crucially, the core of our bacterial computer is made up of engineered E. coli microbes that can detect how well they are performing, according to some external measure, such as how well they can flash in time with light pulses.

We engineer a set of computational plasmids: circular strands of DNA representing "components", which may be combined within the cell to form a simple logical circuit. These components may be exchanged between individual bacteria via the process of conjugation: the transfer of genetic material via direct cell-cell contact. By introducing large numbers of these computational plasmids, we initialize the system. Over time, the bacteria integrate the components into their genomes, thus "building" logical circuits. The "output" of these circuits is measurable, and by defining "success" in terms of correlation with a desired signal profile, we allow successful components to flourish via a process of selection. By controlling the desired signal profile, we direct the “evolution” of the population towards novel, robust computational structures.

By performing massively-parallel bacterial random search, we will quickly obtain functional devices without "top-down" engineering. There are many potential benefits to this work, from both a biological and a computing perspective. By "evolving" new functional structures, we gain insight into biological systems. This, in turn, may suggest new methods for silicon-based computing, in the way that both evolution and the brain have already done. In building these new bio-devices, we offer a new type of programmable, microscopic information processor that will find applications in areas as diverse as environmental sensing and clean-up, medical diagnostics and therapeutics, energy and security.

The BACTOCOM project is a three-year programme, funded by the European Commission FP7 FET Proactive initiative CHEM-IT (Bio-chemistry-based Information Technology). It began in February 2010, and is coordinated by Martyn Amos of Manchester Metropolitan University, UK. Other partners include Université d'Evry Val d'Essonne-Genopole®-CNRS (France), Universidad Politécnica de Madrid and University of Cantabria (Spain), Interuniversitair Micro-Elektronika Centrum (IMEC, Belgium), and Charité - Universitätsmedizin Berlin and the Technical University Munich (Germany).

Link:
BACTOCOM project website: http://www.bactocom.eu

Please contact:
Martyn Amos
Novel Computation Group, Manchester Metropolitan University
E-mail: [email protected]


Recent experiments have shown the possibility of detecting bio-molecules in a liquid by the effect of their intrinsic charge on the conductance of a semiconducting nano-wire transducer. Such sensor devices are fast, cheap and label-free. Figure 1 shows a schematic Bio-FET, which consists of a semiconducting transducer separated by an insulator layer (typically silicon dioxide or nitride) from the biological recognition element (receptors or probe molecules) immobilized on the surface surrounding the transducer, consisting of antibodies, proteins, enzymes, nucleic acids or living biological systems. The transducer is immersed in an electrolyte (an aqueous solution of NaCl) that contains the analyte, i.e. the target molecules that we want to detect, such as DNAs, antigens or proteins. The principle of Bio-FETs is actually simple: when the analyte bio-molecules bind to the surface receptors, the charge distribution at the surface changes, which modulates the electrostatic potential in the semiconductor and thus its conductance, which can be easily measured. Since the device detects a specific substance, the change of the conductance yields the concentration in a rather direct way.

The modelling of such Bio-FET sensors must take into account the electrostatics and geometry of the liquid, the probe and the target molecules in the boundary layer, the binding efficiency of the probes and targets, the electrostatics and conductance of the semiconducting transducer, and the device geometry. Note that the bio-molecular and the nano-electronic parts define very different length scales; they have to be considered separately and then coupled in a self-consistent manner.

As a mathematical model for the transport of electrons and holes inside the transducer we use the drift-diffusion equations coupled to the Poisson equation. The standard continuum model is the mean-field Poisson equation, in which the free charges are treated as points, included using Boltzmann statistics. The classical Poisson–Boltzmann theory is successfully used to study the planar electric double layer and the electrolyte bulk.
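In standard notation, the model just described takes the following schematic form (our transcription, using common device-physics conventions; the authors' exact formulation may differ):

```latex
% Poisson equation for the electrostatic potential \psi (C = doping profile)
-\nabla\cdot\bigl(\varepsilon\,\nabla\psi\bigr) = q\,(p - n + C)
% stationary drift-diffusion currents for electrons (n) and holes (p)
\mathbf{J}_n = q\,\bigl(D_n\,\nabla n - \mu_n\, n\,\nabla\psi\bigr), \qquad
\mathbf{J}_p = -\,q\,\bigl(D_p\,\nabla p + \mu_p\, p\,\nabla\psi\bigr)
% Poisson--Boltzmann model in the electrolyte: point-like free ions
-\nabla\cdot\bigl(\varepsilon\,\nabla\psi\bigr)
  = \sum_i z_i\, e\, c_i^{\infty}\, \exp\!\bigl(-z_i e \psi / k_B T\bigr)
```

Here the Boltzmann factor expresses the ion concentrations in terms of the local potential, which is what makes the electrolyte model a mean-field one.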

In our multiscale problem the original Poisson equation is replaced by a homogenized problem consisting of the Poisson equation with two interface conditions at the surface. The values that determine the interface conditions are easily evaluated from the simulations. The boundary conditions for bio-sensors include the selectively adsorbing surface with probe molecules or purely reflective (inert) walls.

A key aspect of the modelling is the calculation of the charge distribution in bio-functionalized surface layers. Both the electrostatic interactions and the hard-sphere collisions are sensitive to ion size. Since the electrolyte is usually buffered, it is essential to consider not only mixed-size, but also mixed-valence ionic systems. Therefore the radius and valence of the ions are specified as input parameters.

Since the applied voltage between electrodes and the bulk concentrations of


Special theme: Unconventional Computing Paradigms

Bio-Sensors: Modelling and Simulation of Biologically Sensitive Field-Effect-Transistors
by Alena Bulyha, Clemens Heitzinger and Norbert J Mauser

We present modelling and simulations of Bio-FETs (Field-Effect-Transistors), a complex multi-scale system where a semiconductor device is coupled to a bio-sensitive layer that detects bio-molecules such as DNA in a liquid. Our mathematical modelling yields qualitative understanding of important properties of Bio-FETs and helps to provide high-performance algorithms for predictive simulations.

Figure 1: Schema of a Bio-FET sensor.

Page 41: Number 85, April 2011 ERCIM NEWSpubdb.ait.ac.at/files/PubDat_AIT_130339.pdf · based computation, up to vastly parallel Avogadro-scale computation. Interaction The classical Turing

the ions are controlled in experiments, their influence on the ionic concentrations also has to be investigated. The Metropolis Monte Carlo (MMC) algorithm in the constant-voltage ensemble is the appropriate numerical method. We have extended it to investigate the ionic concentrations near groups of charged objects with various geometries and sizes, representing, for instance, DNA strands in different orientations.
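The core of any Metropolis scheme can be sketched in a few lines. The following toy model (ours, not the authors' constant-voltage-ensemble code) moves point ions in a 1D box that interact only with a charged wall through an assumed screened potential; the Metropolis acceptance rule is the one generic ingredient:

```python
import math
import random

def metropolis_ions(n_ions=50, length=10.0, surface_charge=-2.0,
                    beta=1.0, steps=20000, seed=1):
    """Toy 1D Metropolis Monte Carlo: unit-charge ions in [0, length],
    attracted to a charged wall at x=0 via an assumed screened potential."""
    rng = random.Random(seed)
    xs = [rng.uniform(0.0, length) for _ in range(n_ions)]

    def energy_of(x):
        # interaction with the wall only; pairwise ion-ion terms omitted
        return surface_charge * math.exp(-x)

    for _ in range(steps):
        i = rng.randrange(n_ions)
        old = xs[i]
        new = min(length, max(0.0, old + rng.uniform(-0.5, 0.5)))
        d_e = energy_of(new) - energy_of(old)
        # Metropolis criterion: accept downhill moves always,
        # uphill moves with probability exp(-beta * dE)
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            xs[i] = new
    return xs

xs = metropolis_ions()
near_wall = sum(1 for x in xs if x < 1.0)  # ions accumulated near the surface
```

In the real application the energy also contains ion-ion interactions and the external potential fixed by the applied voltage, but the acceptance rule is unchanged.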

Another crucial aspect of the modelling is the simulation of the transport of target molecules in the analyte solution to the active sensor area. There are several mechanisms:
a) Diffusion.
b) Convection; in addition, a pump is essential for fast response times.
c) Migration: the movement of charged particles in response to a local electric field. Its contribution to the total flux is proportional to the charge of the ions, the ion concentration, the diffusion coefficient, and the strength of the electric field gradient, which is induced by the ions and by the applied external potential.

The internal electrical potential is obtained via the Poisson equation. The velocity field is modelled by the Navier-Stokes equations. Note that diffusivity depends in general on the type (size and shape) of the molecules.
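Written as a single flux, the three mechanisms combine into a Nernst-Planck-type expression (again our schematic transcription, not necessarily the authors' exact equations):

```latex
\mathbf{J}_i =
  \underbrace{-\,D_i\,\nabla c_i}_{\text{diffusion}}
  \;+\; \underbrace{c_i\,\mathbf{v}}_{\text{convection}}
  \;-\; \underbrace{\frac{z_i e}{k_B T}\, D_i\, c_i\,\nabla\phi}_{\text{migration}},
\qquad
\partial_t c_i + \nabla\cdot\mathbf{J}_i = R_i
```

with the velocity field v supplied by the Navier-Stokes equations, the potential phi by the Poisson equation, and R_i collecting the reaction terms near the functionalized surface.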

All these processes are taken into account in the algorithms for the diffusion-convection-migration (DCM) problem with chemical reactions near the functionalized surface.

The whole simulation consists of different steps as follows:

1) The DCM problem is solved to find the concentration of analyte molecules at every point in space and time.

2) The target molecules that have reached the functionalized surface take part in chemical reactions described by ordinary differential equations for specific and non-specific binding. By solving them we find the distribution as well as the binding efficiency of captured molecules. The density of probe-target complexes for varying densities of probe molecules is shown in Figure 2 as a function of time.

3) Then the MMC is applied to find the ionic concentrations near the surface and within the intermolecular space. First the simulation is performed with the surface functionalized with probe molecules of known density, and then the MMC is applied to the probe-target complexes with the same density.

4) For coupling the atomistic and continuum models, microscopic charge distributions from the MMC solver are recalculated to macroscopic surface charge densities and macroscopic dipole moment densities. These are weighted by the binding efficiency and used as interface conditions in the homogenized Poisson equation.

5) Finally the Poisson equation provides the voltage for the MMC and the external potential for the migration model.

A self-consistent loop between the micro- and macroscopic simulations provides the basis for the quantitative description of Bio-FETs and their predictive simulation.
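The structure of such a micro/macro loop can be sketched as a damped fixed-point iteration. The two functions below are toy scalar stand-ins for the homogenized Poisson solve and the MMC solve (the names, linear responses and damping factor are our illustration, not the authors' solvers):

```python
def macro_potential(surface_charge):
    """Stand-in for the macroscopic (homogenized Poisson) solve:
    potential as a response to the surface charge."""
    return 0.5 * surface_charge + 0.1

def micro_surface_charge(potential):
    """Stand-in for the microscopic (MMC) solve:
    surface charge as a response to the potential."""
    return -0.8 * potential + 1.0

def self_consistent_loop(tol=1e-10, damping=0.5, max_iter=200):
    """Iterate micro and macro solvers until the coupled state stops changing."""
    charge = 0.0
    for it in range(max_iter):
        potential = macro_potential(charge)
        new_charge = micro_surface_charge(potential)
        if abs(new_charge - charge) < tol:
            return charge, potential, it
        # damped update stabilizes the coupling between the two scales
        charge = charge + damping * (new_charge - charge)
    raise RuntimeError("self-consistent loop did not converge")

charge, potential, iterations = self_consistent_loop()
```

In the real system each "function call" is a full PDE or Monte Carlo simulation, but the convergence logic is the same.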

Acknowledgement: this work was supported by the Austrian Science Fund (FWF, projects W8 and P20871), the Viennese Fund for Technology and Research (WWTF, project MA09-28) and the European Commission (project MEST-CT-2005-021122 “DEASE”).

Link: http://www.wpi.ac.at

Please contact:
Norbert J Mauser
Wolfgang Pauli Institut c/o Univ. Wien
E-mail: [email protected]

Clemens Heitzinger
WPI and DAMTP, Univ. Cambridge
E-mail: [email protected]


Figure 2: Simulation results of the reaction-diffusion problem: density of probe-target complexes (x10^12, 1/cm^2) as a function of time (s), obtained at different probe densities (4x10^12, 3x10^12 and 2x10^12 molecules per cm^2). A higher probe density (green lines) leads to faster binding at the beginning of the simulated process and to a decrease of the total analyte concentration in the solution (not shown here). This results in a lower binding efficiency of captured analyte at higher probe density. Thus, at t=1400s the binding efficiency will be 48%, 50%, and 52% for probe densities of 4x10^12 (green), 3x10^12 (red) and 2x10^12 (blue) molecules per cm^2, respectively.
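The qualitative behaviour in Figure 2 can be reproduced with a minimal Langmuir-type binding model with analyte depletion (rate constants and the explicit Euler integration are our illustrative choices, not the authors' solver):

```python
def simulate_binding(probe_density, analyte0=1.0, k_on=2.0, k_off=0.05,
                     dt=0.01, t_end=20.0):
    """Integrate db/dt = k_on * c * (p - b) - k_off * b, where b is the
    density of probe-target complexes, p the probe density, and the free
    analyte c = analyte0 - b is depleted as complexes form."""
    b = 0.0
    t = 0.0
    while t < t_end:
        c = max(analyte0 - b, 0.0)
        b += dt * (k_on * c * (probe_density - b) - k_off * b)
        t += dt
    return b

low = simulate_binding(probe_density=0.5)
high = simulate_binding(probe_density=1.0)
```

As in the figure, a higher probe density captures more analyte in absolute terms, but because the shared analyte pool is depleted, the fraction of occupied probes (the binding efficiency) comes out lower.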



Research and Innovation

Self-Organizing P2P Systems Inspired by Ant Colonies
by Carlo Mastroianni

We show how bio-inspired algorithms can be used to enhance the flexibility and efficiency of peer-to-peer computer networks and present Self-Chord, a peer-to-peer system in which resources are distributed, in an adaptive and semantic-aware fashion, by means of statistical algorithms inspired by the behavior of ant colonies. This strategy enables the efficient execution of complex queries, improves the load balance among peers, and strengthens the system's ability to self-organize and rapidly adapt to environmental changes.

Modern peer-to-peer (P2P) systems are based on a structured overlay, meaning that peers are organized in accordance with a predefined logical structure – ring, multi-dimensional grid, tree – and resources are assigned to peers with a precise strategy. This allows discovery requests to be driven, exploiting P2P interconnections, directly to the peers that store the desired resources. This type of organization is adopted by the most popular P2P platforms (Kademlia, BitTorrent, Freenet), and recently also by Cloud companies for their distributed storage systems (for example, the Dynamo system adopted by Amazon).

Structured P2P systems are generally based on Distributed Hash Tables (DHTs), whose purpose is to assign keys to resources, and indexes to peers, by means of hash functions. Each peer is responsible for a portion of the key space, and is required to manage the keys belonging to this portion. However, the use of hash functions has an important drawback: since similar resources are mapped to completely different keys, they are generally assigned to peers that are located far from each other in the overlay. This hinders the efficient execution of queries for resources with similar characteristics, or “range” queries.
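The contrast can be seen directly in a few lines. Below, SHA-1 stands in for whatever hash a concrete DHT uses, and the "semantic" key is simply a resource attribute (memory size) scaled into the key space; the attribute names and scaling are ours, for illustration only:

```python
import hashlib

def dht_key(name, bits=16):
    """Hash-based key: similar names map to unrelated points of the key space."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest[:2], "big") % (2 ** bits)

def semantic_key(mem_gb, max_mem=64, bits=16):
    """Semantic key: the attribute value itself, scaled into the key space,
    so similar resources land on the same or neighbouring peers."""
    return min(mem_gb, max_mem) * (2 ** bits - 1) // max_mem

# two nearly identical resources
k1, k2 = dht_key("server-mem-16GB"), dht_key("server-mem-17GB")
s1, s2 = semantic_key(16), semantic_key(17)
```

The hashed keys k1 and k2 are unrelated points in [0, 2^16), while the semantic keys s1 and s2 differ by only 1/64 of the key space, which is what makes range queries cheap.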

At the ICAR-CNR Institute, in collaboration with the Politecnico di Torino, we have devised a new technique to implement structured P2P systems without using DHT functions. Instead of being computed with a hash function, resource keys may be given a semantic meaning, for example the value of a resource attribute. In this way, the keys of similar resources can be assigned to the same or neighbouring peers, thus enabling the efficient execution of range queries. The technique is based on the statistical operations of mobile agents whose behaviour is inspired by ant colonies, and follows the swarm intelligence paradigm. Ants hop from peer to peer, pick/drop resource keys, and sort them on the underlying P2P structure. Sorting of keys ensures their prompt discovery.

Ants Swarming in a Ring: the Self-Chord System

This ant-inspired approach has been adopted in the Self-Chord P2P system. Self-Chord uses the ring-shaped overlay of Chord, the forefather of structured P2P systems, to establish P2P interconnections, but distributes resource keys among peers using the operations of ant-inspired mobile agents. The underlying principle of this mechanism is illustrated in Figure 1. In this example, keys can be assigned integer values in the range [0..99], and the key space is toroidal, so that key 0 is the successor of key 99. Each peer computes its centroid as the key value that minimizes the sum of the distances between itself and the keys stored in the local area. In the example, an ant arrives at the peer with centroid C=30, and picks key 55, because its value is dissimilar to the centroid. Actually, the pick operation is subject to a Bernoulli trial, whose success probability grows with the distance between key and centroid. The agent carrying the picked key exploits the long-distance links of the underlying Chord structure, hops to a region of the ring where peers are supposed to have centroid values close to the key, and then tries to drop the key in this new region. Note that these operations assume that keys and centroids are already sorted in the ring. The power of the ant algorithm is that, by making this assumption, the keys will actually become ordered, even when starting from a completely disordered network. In stable conditions, the sorting of keys guarantees their logarithmic discovery. The base principles for this behavior are the self-organizing
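The centroid computation and the Bernoulli pick trial can be sketched as follows (a toy single-ant round on the [0..99] toroidal key space; the probability formula and the peer's key set are our illustrative choices):

```python
import random

KEYSPACE = 100

def ring_distance(a, b):
    """Distance on the toroidal key space [0..99]."""
    d = abs(a - b) % KEYSPACE
    return min(d, KEYSPACE - d)

def centroid(keys):
    """Key value minimizing the summed ring distance to the stored keys."""
    return min(range(KEYSPACE),
               key=lambda c: sum(ring_distance(c, k) for k in keys))

def pick_probability(key, c, scale=10.0):
    """Bernoulli success probability: grows with distance from the centroid."""
    d = ring_distance(key, c)
    return d / (d + scale)

rng = random.Random(7)
peer_keys = [28, 29, 30, 31, 55]   # key 55 is the outlier, as in Figure 1
c = centroid(peer_keys)            # -> 30, matching the C=30 example
p = pick_probability(55, c)        # high: 55 is far from the centroid
picked = rng.random() < p          # the Bernoulli trial itself
```

A picked key would then be carried, via the Chord long-distance links, towards peers whose centroids are close to its value, and a symmetric drop trial (probability decreasing with distance) would deposit it there.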

Figure 1: Example of a Self-Chord operation. A mobile agent arrives at a peer with centroid C=30, picks key 55, and then hops to a region of the ring where peers are supposed to have centroids close to 55.



nature and the positive feedback mechanism of ant algorithms.

In addition to serving range queries, this approach has further advantages:
- the range of values that can be assigned to keys is arbitrary
- the keys are fairly balanced among peers. In DHT-based networks, this feature is hindered by the non-uniform distribution of resources and corresponding keys
- the network traffic is stable because it only depends on the frequency of ant movements. In DHT-based networks, the traffic is strongly affected by the frequency of peer connections and reconnections.

Ants Swarming in Multi-Dimensional and Hierarchical Structures

The ant approach can be adapted to any kind of overlay. In particular, we are analysing its application on multi-dimensional and hierarchical structures. A multi-dimensional overlay fits very well to situations in which resources are described by multiple and independent attributes: for example, computers may be described by their amount of memory, CPU speed, number of cores, etc. The ant-inspired approach allows resources to be described by multi-dimensional keys, corresponding to the attribute values, and keys to be sorted over the multi-dimensional structure. This enables the efficient execution of multi-attribute range queries, a sophisticated functionality that is very challenging in DHT-based systems.

The second case under study is when resources are described by hierarchical keys: this may be the case of distributed XML databases, in which documents are built in accordance with a hierarchical schema. For this kind of application, an appropriate overlay is that of Pastry. In Pastry, P2P interconnections are organized in a logical tree, and they allow agents to jump between peers whose indexes have a common prefix of any specified length. Mapping XML instances to this structure means that it will be possible to efficiently serve complex XML queries.

Links:
http://self-chord.icar.cnr.it
http://en.wikipedia.org/wiki/Distributed_hash_table
http://en.wikipedia.org/wiki/Swarm_intelligence

Please contact:
Carlo Mastroianni
ICAR-CNR, Italy
Tel: +39 0984 831725
E-mail: [email protected]

Dialogue-Assisted Natural Language Understanding for a Better Web
by Mihály Héder

By studying and improving the process of digital content creation, we can develop more intelligent applications. One promising step ahead is to acquire a machine representation of content as early as possible - while the user is still typing and is available to answer questions.

The Internet triggered a major revolution in everyday communication. The basic goal has not changed - to reach other people and to share our ideas - but everything else is different.

Computers, as mediators, have been incorporated into communication channels, and users turn directly to them, and indirectly to other users, when they need information. Clearly, the better computers understand the content, the more useful and convenient they will become. However, to achieve some sort of understanding, they first need at least 1) an adequate machine representation of the content and 2) background knowledge. In this article, I will present a means of achieving the former.

In general, the life cycle of digital content can be divided into three stages: 1) the creation and formatting of content, carried out by the user; 2) the attempt to create a machine representation of the content, carried out by indexers, information extractors and text-mining software (typical components of a search engine); and finally, 3) the consumption of the content by other users (see Figure 1).

We have to face the fact that the second step of this process is very hard to carry out in a fully automated manner, as it would require unassisted natural language understanding to achieve the best results. Fortunately, when the content is yet to be written, we have an alternative option.

In the frame of a project at SZTAKI, we try to merge the first two phases of the content creation process detailed above. Our experimental software processes content on the fly while the user is still typing, and tries to ask him/her relevant questions in everyday language. The aim of these questions is to clarify what the machine representation of the text should be (see Figure 2). While there is a multitude of semantic annotator tools, we think that the dialogue-assisted input method makes our solution rather unique. This same property is also the key to enabling lay users to create semantic annotations.

Our system consists of a rich text editor built around TinyMCE, which is capable of visually annotating the content and presenting questions/suggestions to the user. On the server side, we have a UIMA-compatible text processing system which relies on various UIMA Analysis Engines. One of these is Magyarlanc, a tokenizer, lemmatizer and part-of-speech (POS) analyser for Hungarian. We also have a language recognizer, an English POS tagger, a named entity recognizer, and Hitec3, a hierarchical document categorizer, integrated.

We have two applications of this system and we plan to develop more. One is a complaint letter analyser, which is being developed with the help of a corpus of 888 letters written to the Hungarian Ministry of Justice. These letters normally tackle diverse issues and in many cases they are unclear and unstructured, so their autonomous processing as such has not been effective. We were able to improve the quality of their processing with semi-autonomous matching of scripts and frames, using the support of dialogues. For more information, see the links at the end of this article.

The other application is the “Sztakipedia” project, which aims to develop a Wikipedia editor. The tool will have an intuitive web interface with a rich text editor, which supports the user with suggestions to improve the quality of the edited article.

Some of the vital aspects of this content creation process are already working. With the help of a Hitec3 instance, which was trained on the entire content of the Hungarian Wikipedia articles, the software is able to suggest categories. It can also suggest “InfoBoxes”, which are table-like descriptions of the properties of common entities like “people” and “cities”. Furthermore, the software can offer a number of links leading to other Wiki articles by the analysis of words

Figure 1: The average life cycle of digital content.

Figure 2: The proposed method of content creation.

in the currently edited article. In addition to this, the user can ask for links in connection with a given phrase; in this case, our tool starts a search in the background, using the Wiki API, and formulates suggestions from the search results.

In spite of the fact that these functions are very promising, we have many problems to overcome. Probably the hardest one is that while our tool is able to generate proper wiki markup, this format itself does not support every type of semantic annotation that we need. We have to consider other questions as well: for example, whether it is important to store meta-information about the user's answers to our questions and suggestions, or what to do with “negative” answers, for example when a given InfoBox is not applicable to a given article.

We hope that we can start public tests in 2011 with this novel tool. Further details can be found at the project website.

Links:
Complaint letter project: http://dialogs.sztaki.hu
Sztakipedia project: http://pedia.sztaki.hu

Please contact:
Mihály Héder
SZTAKI, Hungary
Tel: +36 1 279 6027
E-mail: [email protected]



Content-Based Tile Retrieval System
by Pavel Vácha and Michal Haindl

Recent advances in illumination invariance and textural features were utilized in the construction of a content-based tile retrieval system, which eases laborious browsing of large tile catalogues. Regardless of whether a user is choosing tiles for their house or designing a prestigious office building, this computer-aided consulting system will help to select the right tiles by suggesting tiles which have similar colours and / or patterns as some other tiles that the user likes. The performance of the system was verified on a large commercial tile database in a psycho-physical experiment.

The tile retrieval system is based on an image representation by an underlying multi-spectral Markov random field. Spatial relationships within the image are modelled by a special type of Markov model, which allows efficient and analytical estimation of its parameters, avoiding the usual time-consuming Monte-Carlo approach. Tile patterns are represented by estimated model parameters, which are transformed into our illumination-invariant textural features. These features are unaffected by illumination colour and robust to illumination direction variations, therefore arbitrarily illuminated tiles do not negatively influence the retrieval result. To allow separate retrieval of similar patterns and / or colours, the system utilizes a colour representation by marginal cumulative histograms.
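The colour side of the representation, marginal cumulative histograms, can be sketched in a few lines (a minimal per-channel version; the bin count and the L1 comparison are our illustrative choices, not necessarily the system's):

```python
def cumulative_histogram(channel, bins=8, max_val=255):
    """Marginal cumulative histogram of one colour channel, normalized to [0, 1]."""
    counts = [0] * bins
    for v in channel:
        counts[min(v * bins // (max_val + 1), bins - 1)] += 1
    total = float(len(channel))
    cum, acc = [], 0
    for c in counts:
        acc += c
        cum.append(acc / total)
    return cum

def l1_distance(h1, h2):
    """L1 distance between two cumulative histograms: small means similar colours."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

# toy single-channel "images": two reddish tiles and one bluish tile
reddish_a = cumulative_histogram([200, 210, 220, 230])
reddish_b = cumulative_histogram([205, 215, 225, 235])
bluish = cumulative_histogram([10, 20, 30, 40])
```

Ranking catalogue tiles by this distance to the query tile's histograms yields the "similar colours" results, independently of the pattern features.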

The utilized features (colour invariants) are capable of representing colour patterns with the same luminance and still remain robust to illumination colour, in contrast with the usual texture representation by gray-value textural features. The performance of our colour invariants was successfully verified in cooperation with the MUSCLE ERCIM working group, utilizing the Amsterdam Library of Textures (ALOT), which contains images of over two hundred natural and artificial materials acquired in variable conditions.

The tile retrieval system was verified in a visual psycho-physical experiment on a large commercial tile database containing several thousand images. In the experiment, the retrieved tiles were blindly ranked by volunteers as: very similar, similar, less similar, and dissimilar. In summary, 76% of retrieved images were considered to be very similar or similar, while only 12% were marked as dissimilar.

The system can be used in the following ways. A user can take a photo of an old tile lining and find a suitable replacement for broken tiles from recent production. During browsing of digital tile catalogues, the system can offer tiles with similar colours or patterns as the selected tile, which could be integrated into an internet tile shop. Alternatively, the tiles can be clustered according to visual similarity and, consequently, digital catalogues can be browsed hierarchically through the representatives of visually similar groups. In all cases, the system benefits from its robustness to illumination changes and possible noise degradation.

The presented retrieval system is not limited to tile images and it can be used with other kinds of images where the visual structure and colour are important properties. Thus other possible applications in the home decor industry include the retrieval of similar textiles / cloths and wallpapers. An interactive demonstration is available online.

Link: http://cbir.utia.cas.cz/tiles/

Please contact:
Pavel Vácha
CRCIM (UTIA), Czech Republic
Tel: +420 2 6605 2365
E-mail: [email protected]

Figure 1: The system offers catalogue tiles which have similar colours or patterns (characteristics) as the query tile; left to right: query, similar colours, similar patterns. The images are courtesy of Sanita.cz.


Figure 1: The design process for authoring multi-device service front-ends.



Intelligent Design of Multi-Device Service Front-Ends with the Support of Task Models
by Fabio Paternò, Carmen Santoro and Lucio Davide Spano

We present a novel environment that supports the design of multi-device service-based interactive applications by using several user interface abstraction levels and utilizing the composition of annotated services.

Web services are increasingly used to support remote and distributed access to application functionalities, thus enabling a large gamut of interactive applications. However, when the user support (the service front-end) is developed after delivering the Web services, it is often based on ad-hoc solutions that lack generality and do not consider the user viewpoint, thus providing front-ends with low usability.

BPMN (Business Process Modelling Notation)- and BPEL (Business Process Execution Language)-based approaches are often used in Service-Oriented Architecture (SOA) efforts, but these mainly focus on direct service composition, in which the output of one service acts as input for another, creating more complex services. Thus, they provide little support for the development of applications that interact with users and perform service composition by accessing various services, processing their information and presenting the results to the end users.

One of the main advantages of logical User Interface (UI) descriptions is that they allow developers to avoid dealing with a plethora of low-level details associated with UI implementation languages. Our objective is to provide designers who want to exploit the potential of Web services in their interactive applications with a systematic approach that effectively addresses the specific issues raised by interactive applications when they are accessed through multiple devices. The method is based on the use of logical user interface descriptions and is accompanied by an automatic tool (MARIAE, Model-based lAnguage foR Interactive Applications Environment), which supports all the phases. The approach uses different UI abstraction levels (task, abstract interface, concrete interface), which can also use additional UI information associated with web services (so-called annotations).

Task models describe the various activities that are supported by an interactive application, together with their relationships, through a number of relevant concepts. We describe tasks by exploiting the ConcurTaskTrees (CTT) notation, which is widely used thanks to the availability of the associated public tool (CTT Environment). Using MARIAE, it is possible to establish an association between some elementary tasks in the task model (namely, the tasks that cannot be further decomposed) and the corresponding operations specified in the Web services. This association, which involves only system tasks (namely, the tasks that are carried out by the application), is facilitated in CTT because different allocations of tasks are represented differently and therefore system tasks are easy to identify.

Provided that a suitable granularity is used to specify the task model, this association has the advantage of making the task model temporal relationships automatically and consistently inherited by the associated Web service operations. This is important because task model relationships should be derived taking into account the user's perspective. Once the task model and Web services are connected, a first draft of an abstract UI description can be generated and then further edited by designers; when a satisfactory customization is reached, it can be refined into a concrete, platform-dependent one.

Designers can also include annotations associated with Web services, which provide (possibly partial) hints about the related UIs. At the abstract UI level, the annotations can specify grouping definitions, input validation rules, mandatory/optional elements, data relations (conversions, units, enumerations), and languages. At the concrete UI level, the annotations can provide labels for input fields, content for help, error or warning messages, and indications for appearance rules (formats, design templates, etc.).

When designing a specific multi-device application, the models created with device-independent languages should also take into account the features of the target interaction modality. Thus, for example, in the task model we can have tasks that depend on a certain modality (eg selecting a location in a graphical map or showing a video), and are neglected if they cannot be supported (eg in a platform having only the vocal modality).

For each target platform it is possible to focus just on the relevant tasks and then derive the corresponding abstract description, which is in turn the input for deriving a more refined description in the specific concrete language available for each target platform. Since the concrete languages share a common core abstract vocabulary, this work is easier than working on a number of implementation modality-dependent languages. Our tool MARIAE is currently able to support the design and implementation of interactive service-based applications for various platforms: graphical desktop, graphical mobile, vocal, multimodal (vocal + graphical), and can be freely downloaded.

The tool has been developed in the ServFace project, whose goal was to create a model-driven service engineering methodology to build interactive service-based applications using UI annotations and service composition. The project began in 2008 and finished in 2010. The institutions involved were ISTI-CNR, SAP AG (Consortium Leader), Technische Universität Dresden, University of Manchester, and W4.

Links:
ConcurTaskTrees Environment: http://giove.isti.cnr.it/tools/CTTE/home
MARIAE Tool: http://giove.isti.cnr.it/tools/MARIAE/home
HIIS Laboratory: http://giove.isti.cnr.it/
ServFace EU Project: http://www.servface.eu/

Please contact:
Fabio Paternò
ISTI-CNR, Italy
Tel: +39 050 315 3066
E-mail: [email protected]

Training Crisis Managers in Strategic Decision-Making
by Amedeo Cesta, Gabriella Cortellessa, Riccardo De Benedictis, and Keith Strickland

The goal of PANDORA is to apply state-of-the-art ICT technologies to build a learning environment for strategic crisis managers. We are currently refining a first version where training sessions are animated by reproducing realistic crisis events and fostering creative decision-making. Central to PANDORA is an original use of timeline-based planning strategies, used to diversify crisis scenarios by creating alternative training paths and to model trainees' behavioral patterns for personalized training.

Crisis management in emergency situations helps significantly to avoid major losses and may prevent emergencies from becoming disasters. The success of crisis management depends heavily on the effectiveness of high-level strategic choices and the reasoning abilities of decision makers.

Three different levels of decision makers exist in current approaches to crisis management:
• operational level or bronze commanders, directly operating on crisis scenarios, whose actions and results are monitored and communicated to higher levels
• tactical level or silver commanders, responsible for translating high-level strategic decisions into actions and related resource allocations
• strategic level or gold commanders, who identify key issues and decide strategies to resolve the crisis.

The PANDORA project aims at creating an advanced training environment for crisis decision makers who operate in highly stressful situations. They must react effectively and coordinate interventions with different authorities to limit the dangerous effects of crises and to enable quick recoveries. Contrary to almost all state-of-the-art training systems, which are aimed at the operational or tactical level, PANDORA targets decision-making at the strategic level, which presents interesting open challenges.

There are two main approaches to training: (a) tabletop exercises (group discussions guided by a simulated disaster); (b) real-world simulation exercises (field tests replicating emergency situations). Tabletop exercises are low cost and easy to organise, but they cannot recreate the real atmosphere in terms of stress and pressure. On the other hand, real-world simulations can be very effective but are extremely expensive and difficult to organize.

Research and Innovation

ERCIM NEWS 85, April 2011

PANDORA aims at replicating the benefits of both training approaches by developing a system capable of guaranteeing the realism of real-world simulation and the practicality of tabletop exercises. A user-centred approach is followed, and we have worked in close collaboration with the UK Emergency Planning College, which has identified the main requirements and is influencing the design and implementation decisions.

Figure 1 summarizes the main concept of PANDORA. A group of trainees from different agencies (eg, Civil Defence, Health, Fire Service, Police, Transportation) access the training system. If some authorities are not present, they are simulated through Non-Player Characters. Each trainee feeds personal data to the PANDORA kernel, which gathers this information to build a user model (Behavioral Module). On the basis of this model, the system synthesizes personalized training paths (Behavioral Planner). The output of this process is passed to a second module (Crisis Planner), which uses both the Behavioral Module indications and knowledge of the chosen scenario to plan sequences of stimuli appropriate for the group (information shared among all trainees) and the individual trainees (information tailored to induce the “right level of stress”).

The plan is then passed to the Environment and Emotion Synthesizer, responsible for an effective rendering of the various stimuli. A separate module (Trainer Support Framework) allows trainers to control the training session and dynamically adjust the stimuli based on their experience.

PANDORA uses timeline-based planning technology, which allows for rich domain modelling and uses both temporal and resource constraints. A timeline can be seen as a stepwise constant function of time: specifically, it is an ordered sequence of values holding on subsequent temporal intervals. This approach has been used in both the Behavioural and the Crisis Framework.
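As a concrete illustration, a timeline of this kind can be held in a very small data structure. The sketch below is ours, not PANDORA's code; the class and method names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Timeline:
    """A stepwise constant function of time: an ordered sequence of
    values, each holding over a temporal interval [start, end)."""
    intervals: list = field(default_factory=list)  # (start, end, value)

    def set_value(self, start, end, value):
        # Intervals must be appended in temporal order; a real planner
        # would also enforce temporal and resource constraints here.
        assert not self.intervals or start >= self.intervals[-1][1]
        self.intervals.append((start, end, value))

    def value_at(self, t):
        """Value holding at time t, or None outside all intervals."""
        for start, end, value in self.intervals:
            if start <= t < end:
                return value
        return None

# Example: a trainee's modelled stress level over a session
stress = Timeline()
stress.set_value(0, 10, "low")
stress.set_value(10, 25, "high")
print(stress.value_at(12))  # high
```

Planning over such timelines then amounts to posting new value intervals subject to the temporal and resource constraints mentioned above.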

In the first case, some psycho-physiological trainee features, shown to influence human behaviour under crisis, are modelled and updated during training as timelines. On the basis of this model, the Behavioural Planner synthesizes goals for the Crisis Planner. The Crisis Planner creates training storyboards, sets of “events” communicated to the trainees (eg, a

Security and Resilience in Cognitive Radio Networks
by Vangelis Angelakis, Ioannis Askoxylakis, Scott Fowler, David Gundlegård, Apostolos Traganitis and Di Yuan

After more than a decade of research, system security and resilience is now the major technological barrier for the Cognitive Radio (CR) to be adopted by the telecommunication industry. New ideas are required to make CR networks secure and robust against attacks taking advantage of the inherent characteristics of the CR functionality. This work explores key points that urgently need to be addressed.

Cognitive radio (CR) is a term with many possible meanings in the telecommunications literature of the past decade. Most commonly, a cognitive radio device is based on a software-defined radio (SDR) and has an adjustable front-end, which allows it to tune to different frequencies, power levels and modulation schemes. The SDR infrastructure has a programming interface that enables these configuration options. These, in conjunction with the context of operation (radio interference and noise, traffic demand, mobility levels, element status, location, etc) are made available to a decision-making entity, which selects the best configuration by solving an optimization problem with respect to some objective function. Further input is contained in a system knowledge base that codes the contexts encountered and maps them to specific radio configurations that can be used. This mapping can be done through a reasoning engine, which is essentially a set of logical inference rules (policies) that “reasons” (ie, searches) for a proposed set of actions that will manipulate the current state of the knowledge base in an

news video from the crisis setting, a phone call or e-mail from an operational or tactical manager). Additionally, the Planner “reacts” to trainees' strategic decisions, triggering subsequent events to continue the session.

The overall system empowers the trainer with a new means for training people. Indeed, the suggested crisis stimuli and the behavioural analysis are presented to the trainer, who can influence the training session at any moment, in perfect line with a mixed-initiative style.

Links:
http://www.pandoraproject.eu/
http://epcollege.com/
http://pst.istc.cnr.it/

Please contact:
Amedeo Cesta, ISTC-CNR, Italy
E-mail: [email protected]

Keith Strickland, EPC, UK
E-mail: [email protected]

Figure 1: The PANDORA system architecture.


such as jamming, sensory manipulation, belief manipulation, routing, etc, for each layer.

Further on, we aim to develop new mechanisms for detecting, isolating and expelling misbehaving insiders. Such malicious nodes may have all the credentials provided by an “off-the-shelf” security solution such as those proposed for application in IEEE 802.22. This therefore requires the detection of abnormal secondary-user operation through pattern analysis and node cooperation, since the feedback from the CR devices will enhance the efficiency required for such an Intrusion Detection System (IDS).
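As a toy illustration of the pattern-analysis idea, such an IDS can compare each node's reported sensing data against the group consensus. The sketch below is our own simplification (a robust outlier test on reported noise-floor values), not a description of the 802.22 mechanisms; all names and numbers are illustrative:

```python
from statistics import median

def flag_outliers(reports, threshold=3.5):
    """Flag nodes whose reported value deviates strongly from the group
    consensus, using the median absolute deviation (robust against the
    very outliers we want to catch). A stand-in for real pattern analysis."""
    values = list(reports.values())
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        # All honest reports identical: anything different is suspect.
        return [n for n, v in reports.items() if v != med]
    return [n for n, v in reports.items()
            if 0.6745 * abs(v - med) / mad > threshold]

# Node "mallory" reports an implausibly high noise floor (in dBm)
readings = {"a": -92.0, "b": -91.5, "c": -92.3, "d": -91.8, "mallory": -60.0}
print(flag_outliers(readings))  # ['mallory']
```

In a cooperative setting, the flagged node's reports would then be excluded from fusion and the node itself challenged or expelled.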

Improving sensory input can reduce the exploitability of cognitive radios in a cross-layer fashion. For example, if CRs could identify the difference between interference and noise, they could distinguish between natural and malicious RF events. Identification and quantification of the potential gains from interference identification is important in order to define more robust CR MAC policies. Furthermore, in a distributed environment, a network of cognitive radios can fuse sensor data to improve the quality of input for the cognitive engine. Such techniques should be designed with small information-exchange requirements in order to be energy efficient, keeping in mind that the client CR devices will be battery operated.
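One common way to keep the information exchange small is for each radio to report only a one-bit local detection decision, which a fusion point combines with a k-out-of-n rule. The sketch below illustrates that general scheme; the rule choice and names are ours, not a result of this project:

```python
def fuse_decisions(local_decisions, k):
    """k-out-of-n decision fusion: declare the channel occupied if at
    least k of the n cooperating radios report a detection. Exchanging a
    single bit per device keeps communication (and energy) costs low."""
    return sum(local_decisions) >= k

# One-bit sensing reports from five cooperating radios
reports = [True, False, True, True, False]
print(fuse_decisions(reports, k=3))  # True
```

Choosing k trades off missed detections (k too high) against false alarms triggered by a few faulty or malicious reporters (k too low).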

This project is a joint effort by the Mobile Telecommunications group of the Department of Science and Technology at Linköping University and the Institute of Computer Science of the Foundation for Research and Technology-Hellas (FORTH-ICS).

Please contact:
Vangelis Angelakis, Mobile Telecommunications, ITN-LiU, Sweden
Tel: +46 11 363005
E-mail: [email protected]

Ioannis G. Askoxylakis, FORTH-ICS, Greece
Tel: +30 2810 391723
E-mail: [email protected]

optimal way. In principle, the CR functionality may also include a learning engine, making it capable of starting with no policies and, by utilizing a variety of classic Artificial Intelligence (AI) learning algorithms, identifying which configuration will work optimally in current and future contexts.
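The decision step described above (a knowledge base mapping contexts to candidate configurations, and a decision maker picking the one maximizing an objective function) can be sketched as follows; the knowledge base, objective and all names are hypothetical toy examples, not the project's design:

```python
# Toy knowledge base: context -> feasible (frequency_MHz, power_dBm) pairs
KNOWLEDGE_BASE = {
    "high_interference": [(2400, 10), (5200, 15)],
    "low_interference":  [(2400, 20)],
}

def objective(cfg, context):
    """Toy objective: escape to the higher band under interference,
    otherwise prefer higher transmit power."""
    freq, power = cfg
    return freq if context == "high_interference" else power

def decide(context, knowledge_base, objective):
    """Pick the feasible configuration maximizing the objective function."""
    candidates = knowledge_base.get(context, [])
    if not candidates:
        return None
    return max(candidates, key=lambda cfg: objective(cfg, context))

print(decide("high_interference", KNOWLEDGE_BASE, objective))  # (5200, 15)
```

The attacks discussed in this article target exactly the pieces visible here: the sensed context (the lookup key), the candidate set, and the learned mapping itself.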

While CR devices are built from components that are well established in the telecommunications and computer science disciplines, the existing approaches to providing robustness and effective security for a network of CR devices are inadequate. Due to the particular characteristics of CR systems, new types of attack are possible and some of the well-known types increase in complexity. Therefore, new ideas are needed to make CR networks secure and robust against specific attacks, especially those that are inherent to the CR functionality.

Specifically, sample attacks can target the inputs considered for the formation of the CR networks and the respective optimization problem; for instance, attacks might compromise the accuracy of the sensed context information or the set of candidate nodes that may be involved in the network. One of the key features of cognitive radio is that sensory manipulation can lead to knowledge manipulation, meaning that malicious actions in the present can affect radio performance in the future. Other attacks can target the outputs describing the CR network that should be formed even under unharmed inputs, eg, the set of nodes that should be involved, the configurations that should be selected, etc. Finally, as in all wireless networks, attacks can be designed to lead client devices to configurations that are inefficient in terms of energy and make them run out of battery.

There is, therefore, a need for comprehensive and energy-efficient mechanisms to discourage, identify and mitigate attacks at all phases of the cognitive cycle, in order to obtain CR systems that are trustworthy, efficient and dependable. Furthermore, there is a need for a new systemic evaluation of the robustness of a CR system, in order to set the requirements and expectations for a resilient CR network from a security viewpoint.

The targets of this joint work are, initially, to identify the threats at the different layers and to classify the major topics,

Figure 1: Components of an SDR-based Cognitive Radio.


Smart Cameras for Cities of the Future: Simultaneous Counting of Pedestrians and Bikes
by Ahmed Nabil Belbachir, Norbert Brändle and Stephan Schraml

The Austrian Institute of Technology has developed a stereo vision system that monitors and counts pedestrians and cyclists in real-time. This vision system includes a processing unit with embedded intelligent software for scene analysis, object classification and counting.

Europe has one of the highest population densities in the world, mostly concentrated in urban areas. Human activities in the countries of the European Union produce about 14.7% of the world's CO2 emissions, ranking the European Union the third-highest emitter after the USA and China. Road traffic is one of the principal contributors of CO2 emissions, mainly due to an aging car fleet on Europe's roads, growing congestion, increasingly dense traffic, a lack of traffic management, slow infrastructure improvement and a rise in vehicle mileage. CO2 is not a toxic emission for the population like the pollutants NOx, PMx, SO2 and CO, and it is therefore not regulated by the EC at the moment. However, CO2 emissions contribute to climate change.

Current investigations into cities of the future aim, among other things, to include concepts for reducing the number of cars on urban roads and implementing widely available, energy-efficient public transportation. These investigations also deal with creating city designs that combine living, working, shopping and entertainment, and that are more pedestrian- and cyclist-friendly.

In other words, the mobility approaches of future cities will concentrate on walking, biking and using available public transportation. It is therefore very important to act early in establishing new technologies, tools and approaches for the monitoring and management of future urban traffic, to maximize the safety and security of citizens.

Within the framework of a national research activity, AIT has developed an event-based stereo vision system for real-time outdoor monitoring and counting of pedestrians and cyclists. This vision system comprises a pair of dynamic vision

Figure 1: Image of the event-based stereo vision system.

Figure 2: Raw data from the event-based detector, represented spatiotemporally.

Figure 3: Illustration of the process for detecting and counting pedestrians (panels: image from a standard camera; raw data from the event-based detector; detected pedestrian and cyclist; corresponding depth map, colour coded; tracks of the counted pedestrian and cyclist).


detector chips, auxiliary electronics and a standalone processing unit embedding intelligent software for scene analysis, object classification and counting.

The detector chip consists of an array of 128x128 pixels built in a standard 0.35μm CMOS technology. The detector aims to duplicate biological vision with an array of autonomous self-spiking pixels that react to relative light-intensity changes by asynchronously generating events. Its advantages include high temporal resolution, an extremely wide dynamic range and complete redundancy suppression due to the on-chip preprocessing. It exploits very efficient asynchronous, event-driven information encoding for capturing scene dynamics (eg moving objects).
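The behaviour of such a self-spiking pixel can be sketched in a few lines: an event is emitted whenever the relative (logarithmic) intensity change since the last event exceeds a threshold. The threshold value and function names below are illustrative, not the chip's actual parameters:

```python
import math

def pixel_events(intensities, times, threshold=0.15):
    """Emit (time, polarity) events whenever the log-intensity at a
    pixel changes by more than `threshold` since the last event. Working
    on relative (logarithmic) change is what gives such detectors their
    wide dynamic range; unchanged input produces no output at all."""
    events = []
    ref = math.log(intensities[0])
    for t, i in zip(times[1:], intensities[1:]):
        delta = math.log(i) - ref
        if abs(delta) >= threshold:
            events.append((t, +1 if delta > 0 else -1))
            ref = math.log(i)  # new reference level after each event
    return events

# A brightening then darkening edge passing over the pixel
print(pixel_events([100, 100, 140, 140, 90], times=[0, 1, 2, 3, 4]))
# [(2, 1), (4, -1)]
```

Note how the two static intervals generate no events at all, which is the redundancy suppression mentioned above.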

The processing unit embeds algorithms for calculating the depth information of the generated events by stereo matching. It also embeds a spatiotemporal processing method for the robust analysis of visual data encompassing dynamic processes such as motion, variable shape and appearance, enabling real-time object (pedestrian or cyclist) classification and counting. The whole system has been evaluated in real (outdoor) surveillance scenarios as a compact, standalone remote vision system for pedestrians and cyclists.

Figure 1 depicts the event-based stereo vision system, including the pair of detector chips, the optics, the auxiliary electronics and the processor core module mounted on the back of the board (not visible in the picture). The events generated by one detector chip are represented in Figure 2 for two persons crossing the sensor field of view over a time period of 1.73 sec. The asynchronous nature of these events allows an ideal spatiotemporal representation and processing of the data. Figure 3 shows the processing steps and results for counting pedestrians and cyclists. An image of the scene from a standard camera is shown on the left. The corresponding data from one event-based sensor is rendered in an image-like representation (second left). The middle images show the processing results of the detection and of the stereo matching. The right image depicts the tracking results, which allow the pedestrians (brown track) and cyclists (red track) to be counted.

Link:
http://www.ait.ac.at

Please contact:
Ahmed Nabil Belbachir, Norbert Brändle, Stephan Schraml, Erwin Schoitsch
AIT Austrian Institute of Technology / AARIT, Austria
E-mail: [email protected], [email protected], [email protected], [email protected]


Simulation of Urban Traffic including Bicycles
by Jelena Vasic and Heather J. Ruskin

The aim of this research is to contribute to the 'greening' of city traffic through investigation of heterogeneous traffic dynamics on urban road networks.

This study, which builds on previous work in the area of traffic-flow modelling done by our group, is one of several current related projects in the Centre for Scientific Computing & Complex Systems Modelling at Dublin City University. The work is funded by the Irish Research Council for Science, Engineering and Technology (IRCSET), through an 'Embark Initiative' postgraduate scholarship.

The motivation behind the project lies in a desire to contribute to the 'greening' of urban transport. Its focus, in concrete terms, is on the dynamics of bicycle traffic as part of the overall urban transport picture in cities like Dublin. With respect to bicycles, the city is characterised by a lack of dedicated cycle lanes and bicycle-specific control measures, that is, by motorised and non-motorised modal coexistence under conditions suited to exclusively motorised flows. Understanding these dynamics is crucial to providing the best control and management strategies, in terms of successfully promoting 'green' modalities at the lowest possible cost.

Discrete simulation models of traffic based on cellular automata (CA) have existed since the 1980s but have only become a widely researched topic since the availability of computers sufficiently powerful to tackle the complexities of vehicular traffic. These models are microscopic in nature, in that they describe the behaviour of individual particles and their interactions at a local level, but have the purpose of ultimately providing information about throughput and average delays. Complexity is added to the task of modelling traffic flows by the presence of many transport modes, in particular those for which the units differ widely in size, potential for movement and behaviour. Network structure is another fun-


literally "from the picture". In order to make this model fit with the shared-road model, transversal positions on routes through an intersection are mappable onto transversal positions of a shared road. (3) From the point of view of vehicle movement, a network can be reduced to a collection of "conflicts" and "decisions". A conflict point is a place where two routes overlap, while a decision point is a point in the network where intelligence needs to be applied for the next movement to be determined.
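A minimal sketch of the shared-road and conflict ideas, ours rather than the project's code, discretises the road cross-section into transversal cells; a conflict then shows up as overlapping transversal extents. The cell counts and vehicle widths are toy numbers:

```python
# Illustrative discretisation of a shared road cross-section: cell widths
# and vehicle widths below are our own toy numbers, not the model's.
ROAD_WIDTH_CELLS = 6                      # transversal cells across the road
VEHICLE_WIDTH = {"bicycle": 1, "car": 4}  # width in transversal cells

def occupied_cells(kind, leftmost_cell):
    """Transversal cells taken up by a vehicle of the given kind."""
    width = VEHICLE_WIDTH[kind]
    assert 0 <= leftmost_cell and leftmost_cell + width <= ROAD_WIDTH_CELLS
    return set(range(leftmost_cell, leftmost_cell + width))

def conflict(kind_a, pos_a, kind_b, pos_b):
    """Two vehicles at the same longitudinal cell conflict if their
    transversal extents overlap; this is how allowed positions can
    overlap in ways a one-vehicle-per-cell CA cannot express."""
    return bool(occupied_cells(kind_a, pos_a) & occupied_cells(kind_b, pos_b))

# A kerb-side bicycle (cell 0) and a car at cells 2-5 can share the
# cross-section; a car starting at cell 0 cannot pass the bicycle.
print(conflict("bicycle", 0, "car", 2))  # False
print(conflict("bicycle", 0, "car", 0))  # True
```

At a conflict point, the same overlap test would be applied between the transversal extents of the two overlapping routes rather than of one shared road.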

Further work on this project will initially involve the application of the model to more complex network configurations. As the scale of the simulations is increased, additional processing power will be needed, and we aim to devise parallelization strategies for the simulations. Once the robustness of the model is confirmed through simulations of highly complex network configurations, different management and control strategies will be modelled and compared. Real-life measures taken in the direction of bicycle-friendliness will be of particular interest, as simulations of "before and after" scenarios, incorporating available data on the effects of changes, will provide a true test for the model.

Links:
http://sci-sym.computing.dcu.ie
http://computing.dcu.ie

Please contact:
Jelena Vasic
Dublin City University, Ireland
Tel: +353 17006747
E-mail: [email protected]

Figure 1: Discretisation of the natural geometry of a left turn (on a left-hand driving road, as in Ireland or the UK). The transversal positions are defined by lines aiaos ... gigos for straight ahead and by lines aiaol ... gigol for turning left. Cells of both the left-most bicycle position and the right-most car position, in the straight-ahead direction and the left direction, are shown by shadings and hatchings, respectively. At the entry point, both cars and bicycles are faced with a decision (left or straight ahead?). A conflict exists between bicycles in the left-most position going straight and cars in the right-most position going left.

damentally complex element in a traffic system under simulation.

In the first phase of this project, our aim was to define a simulation model that would answer three questions:
• How can we incorporate two modalities, cars and bicycles, with their differing characteristics, into a shared geometric space (representing a street without a dedicated cycle lane)?
• How do we generalise the representation of complex infrastructural features of the network, such as intersections, for the multi-modal case?
• How do we define a general behaviour model for vehicles on a random network of roads?

We have, to date, answered these questions through a simulation model which provides macroscopic measurements for random network configurations. (1) In addition to the discrete movements in the longitudinal dimension of the road, we have introduced a discrete measurement of the transversal position of vehicles, necessary under lane-sharing conditions, since a vehicle's transversal position is not only defined by the lane it occupies but also by its take-up of the lane space. This means that allowed positions overlap. Our model is thus strictly not a CA one, although it retains CA elements in its basic movement rules. (2) Since the presence of more than one geometrically different model makes the abstraction of intersections and other complex features of the network extremely difficult, the approach has been to draw a geometrically natural representation of an intersection to be modelled, including conflicts and divergence of roads, and to extract a numerical model of the intersection


Simulation and Assessment of Vehicle Control Network Systems
by Alexander Hanzlik and Erwin Kristen

The conventional automotive industry is a market of mass production. On the one hand, due to the fact that the same platform may be used for different car generations, the lifecycle of a control network architecture (physical network and communication schedule) is expected to be at least one decade. On the other hand, future extensions to car functionality come along with increased bandwidth requirements. Additionally, there are safety requirements to fulfill that have to be reflected in the design of the control architecture. The technical challenge is the design of an architecture that fulfills existing requirements while being open for new functionalities without the necessity to re-design the whole control network.

This challenge is addressed by the DTF (Data Time Flow) simulator, a discrete-event simulation environment that will help to identify and validate an optimal car control architecture with regard to technical requirements, safety requirements and industrial norms. The idea is to derive the control architecture in an early design phase, ideally at the requirement specification stage, to build a simulation model of the architecture and to test the control network in the simulation environment. If the tests reveal requirement violations (eg, deadlines of safety-relevant signals are missed), the control network is re-designed in the DTF simulation environment or the communication schedule is modified. This cycle is repeated until all requirements are met.

Figure 2 shows the structure of the DTF simulation environment. Starting with a graphical description of the system, the converter generates a textual bus-model description that in turn is the input for the parser. The parser checks the bus-model description for correctness and consistency and passes the model description to the DTF simulator engine.

The simulator engine creates the necessary elements according to the model description. Then, the event list is generated according to the drive-cycle description (an example of a drive-cycle description is a sequence of events that defines different accelerator pedal positions for different points in time). Last, event-list processing is started with the first event. In the course of the simulation process, an event received at an element may lead to the creation of another event that is added to the event list. Simulation is complete when the event list is empty. All events are logged into a simulation results file that can be processed offline after the simulation ends, to gain the data necessary for visualizations and for further analysis.
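The event-list processing just described is a classic discrete-event loop. The following sketch (element structure, field names and the toy signal path are ours, not the DTF implementation) shows how handling one event can schedule the next, with simulation ending when the list is empty:

```python
import heapq
from itertools import count

class Element:
    """A primitive building block: an action routine, a propagation
    delay, and an optional downstream element receiving the output."""
    def __init__(self, name, delay, action, downstream=None):
        self.name, self.delay = name, delay
        self.action, self.downstream = action, downstream

def simulate(initial_events):
    """initial_events: list of (time, element, value) tuples.
    Returns a log of (time, element_name, output). Handling an event may
    create a new one; simulation is complete when the list is empty."""
    seq = count()  # tie-breaker so events with equal timestamps stay orderable
    heap = [(t, next(seq), e, v) for t, e, v in initial_events]
    heapq.heapify(heap)
    log = []
    while heap:
        t, _, elem, value = heapq.heappop(heap)
        out = elem.action(value)
        log.append((t, elem.name, out))       # every event is logged
        if elem.downstream is not None:
            # the output reaches the next element after the delay
            heapq.heappush(heap, (t + elem.delay, next(seq),
                                  elem.downstream, out))
    return log

# Toy signal path: accelerator-pedal sensor -> controller -> motor actuator
actuator = Element("actuator", delay=0, action=lambda v: v)
controller = Element("controller", delay=2, action=lambda v: v * 0.5,
                     downstream=actuator)
sensor = Element("sensor", delay=1, action=lambda v: v,
                 downstream=controller)

log = simulate([(0, sensor, 80.0)])  # pedal position issued at t = 0
print(log)
# [(0, 'sensor', 80.0), (1, 'controller', 40.0), (3, 'actuator', 40.0)]
```

The timestamps in the resulting log are exactly the signal-propagation-time data from which the offline analysis can check deadlines of safety-relevant signals.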

Principle of operation

The DTF principle of operation is to build up complex control systems from primitive building blocks (eg sensors, processors, actuators), the elements. Each element has an input buffer, a propagation delay and an action routine. Upon reception of a trigger, an element reads its input buffer, generates an output value according to its action routine, and transmits the output value with a defined propagation delay. Element triggers are event triggers (an event for this element is defined in the event list) or time triggers (the element triggers itself with a defined period). Each element has one or more inputs and one output. Elements can be grouped into super-elements, derived blocks with high inner complexity which group logical functions that belong together into a single block (eg a Controller Area Network (CAN) bus). Signals are issued to the sensor elements and signal propagation is observed, in both the value and time domains, from the sensors over the control network to the actuators. From the signal-propagation-time distribution over the control network, important information can be gained about the

Figure 1: DTF development cycle.


dynamics and responsiveness of the system, especially for safety-relevant signals like the accelerator pedal position, which are usually subject to real-time constraints.

Validation

For validation of the DTF implementation, a Concept Development Platform (CDP) is used to compare simulation results with bus measurements from the CDP. As shown in Figure 3, the CDP is a single-axle driving platform where each wheel is driven by one electric motor, stabilized by a free-running support wheel at the back. The CDP provides the mechanical system to develop automatic control software for the powertrain, as well as the necessary structures for network design. The Data Time Flow (DTF) simulator is currently under development at the Austrian Institute of Technology (AIT) in the context of the ARTEMIS Joint Undertaking (JU) POLLUX project, which is related to the control electronics architecture design of the next generation of electric cars.

First simulation results are available and have been presented at the annual POLLUX project meeting in Brussels. The DTF simulator results were approved by the project partners, and this tool is going to play an important role in the simulation and assessment of vehicle control network systems.

Please contact:
Erwin Kristen
AIT Austrian Institute of Technology GmbH / AARIT, Austria
E-mail: [email protected]

Figure 3: Concept Development Platform and DTF simulation results.

Figure 2: DTF simulation environment.


Filling the Gap between ICT and HydroMeteo Research Communities: the DRIHMS Project
by Alfonso Quarati, Nicola Rebora and Michael Schiffers

Hydrometeorological science has made strong progress over the last decade as new modelling tools, post-processing methodologies and observational data have become available. The goal of the DRIHMS project is to promote collaboration between the hydrometeorological and the Grid research communities in order to further advance the state-of-the-art.

The DRIHMS (Distributed Research Infrastructure for Hydro-Meteorology Study) project is an EU support action on the use of Grid, High Performance Computing (HPC) and Information and Communication Technologies (ICT) in the field of HydroMeteo Research (HMR). Its goals are:
• to promote the Grid paradigm within the European hydrometeorological research community

Figure 1: The HMR forecasting chain: satellite data (top left), meteorological ensemble probabilistic system (top right), stochastic downscaling (bottom left) and high-resolution meteorological modelling (bottom right).

• to boost European research excellence and competitiveness in hydrometeorological and Grid research by bridging the gaps between these two communities.

DRIHMS is conducted by a consortium of hydrometeorology (DLR and CIMA) and ICT (LMU and IMATI) research centres that integrate complementary multidisciplinary know-how, enabling the use of Grid-related technologies in the area of hydrometeorological science.

The prediction of floods and other hydrometeorological events relies on hydrological and meteorological forecast models that describe the hydrological cycle in the atmosphere. These predictions are based on observational measurements, for example of rainfall and river flow. In recent years, the quantity and complexity of the tools and data sets have increased dramatically for three reasons:
• the increasing availability of remote sensing observations from satellites and ground-based radars that provide complete three-dimensional coverage of the atmospheric and land surface state
• the development of forecasting methods that combine multiple numerical weather prediction and hydrological models through stochastic downscaling techniques to quantify the uncertainty in the forecast, multiplying the computational costs


know-how in the European HMR community. The networking activities will facilitate the development of a pool of common knowledge from the knowledge currently available and, hopefully, the derivation of new knowledge from last-generation hydrometeorological process observing/modelling systems.

DRIHMS is currently surveying the ICT and HMR communities in order to collect the requirements of the HydroMeteo research community and to assess which ICT tools can satisfy them. So far, almost 300 questionnaires have been completed by scientists from all over the world, but we would like to request the assistance of all those interested in this initiative to spare a few minutes to complete the questionnaires online at the link below.

Link: http://www.drihms.eu/survey

Please contact:
Alfonso Quarati, IMATI-CNR, Italy
E-mail: [email protected]


• increasing recognition of the need to understand the entire forecasting chain, from observations through to civil defence response, resulting in complex workflows able to combine different data sets, models and expertise in a flexible manner.

HydroMeteo Research is closely linked to operational forecasting. Researchers rely on data archives maintained by operational agencies and increasingly make use of operational modelling tools. However, these data sets and tools are mainly the property of national agencies and are often not easily obtainable at a European level. A second major source of material for research is the existence of ad hoc collections of data from field campaigns and experimental instruments. However, the heterogeneity of the data, and of the metadata describing it, makes it difficult for scientists to locate and exploit the data relevant to their task.

Similarly, the scattering of hydrometeorological data tools among national and ad hoc collections is a substantial barrier to progress in research. On the one hand, weather systems freely cross national boundaries, rendering national archives of limited use. This has long been recognised in meteorology, and indeed the World Meteorological Organisation (WMO) effectively coordinates the international exchange of much meteorological data. Unfortunately this does not include much of the high-resolution data required for hydrometeorological research. Figure 1 shows the key elements in an HMR forecasting chain; they are all highly demanding from the data storage and high-performance computing standpoints.


The need for large data sets and complex numerical models is not unique to hydrometeorological research. Under the name of e-science, a number of initiatives have been undertaken to provide the necessary ICT infrastructure, including EGEE (Enabling Grids for E-sciencE), SEE-GRID-SCI (South East Europe GRID e-Infrastructure for regional e-Science), and the German C3-Grid.

DRIHMS aims to identify the key areas of hydrometeorological research that require a network-based and distributed approach in terms of data and software sharing. Project activities focus on discussing, defining and communicating the requirements for the porting and deployment of state-of-the-art research applications and tools over heterogeneous Grid middleware.

The key element of DRIHMS is the organization of a set of networking activities (including web-based questionnaires, invitation-only workshops and open conferences), involving both hydrometeorologists and Grid experts, aimed at overcoming current limitations in the sharing of tools and

The SOCIETIES Project: Combining Pervasive Computing with Social Networks

by Kevin Doolin, Pierfranco Ferronato and Stefania Marrara

SOCIETIES will design and deliver a complete integrated package for the creation and management of Social Networks. The project will extend pervasive systems beyond the individual to dynamic communities of users.

Four out of ten of the Alexa Top 500 Global sites (http://www.alexa.com/topsites as of November 2010) are Social Network sites. Barack Obama based his successful presidential campaign on social media and used them to obtain funding. LinkedIn, a Social Network of professional profiles, saw its traffic double during the economic crisis. Although there is no doubt that Social Networks have altered the expectations of Internet users, they are still mostly based on certain assumptions that limit their capabilities and prevent them from exploiting their full potential.

The first assumption is that the information stream that feeds a Social Network is provided interactively, through a Web browser, by the human members of the Network. This assumption has recently been partially invalidated with the adoption of contextual data sources for Social Networks (eg, Facebook Places and Facebook Deals).

The second assumption is that the mesh of relationships of a Social Network user is a "flat" network. In other terms, there is no way of defining hierarchies of networks where each



node in a network at a given level is actually a network itself at a lower level in the hierarchy.
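Such a hierarchy of networks can be sketched in a few lines. This is an illustrative toy only (names like `Community` are invented here, not part of the SOCIETIES design), assuming just that a community node may itself contain sub-communities:

```python
class Community:
    """A community node: its direct members are users, and it may nest sub-communities."""

    def __init__(self, name, members=None):
        self.name = name
        self.members = set(members or [])
        self.sub_communities = []

    def add_sub(self, community):
        """Attach a lower-level community as a node of this one."""
        self.sub_communities.append(community)

    def all_members(self):
        """Collect members recursively from this community and all sub-communities."""
        found = set(self.members)
        for sub in self.sub_communities:
            found |= sub.all_members()
        return found


# A two-level hierarchy: a city-wide community containing two neighbourhood ones.
city = Community("smart-city", {"alice"})
north = Community("north-district", {"bob", "carol"})
south = Community("south-district", {"dave"})
city.add_sub(north)
city.add_sub(south)

print(sorted(city.all_members()))  # ['alice', 'bob', 'carol', 'dave']
```

A flat social network corresponds to the special case where no node has sub-communities; the recursive membership query is what the hierarchical model adds.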

To overcome these limitations SOCIETIES, an Integrated Project of the EC Seventh Framework Programme, will investigate how pervasive data sources can increase the usefulness and accuracy of social networks. SOCIETIES will improve the utility of Future Internet services, merging social computing and pervasive computing, by focusing on four main objectives, which will form the core of the definition of Cooperating Smart Spaces (CSSs).

CSSs are the building blocks for enabling pervasive computing in social communities, as they constitute the bridge between a user's context (devices, sensors etc.) and the user's social community. A CIS (Cooperating Interaction Space) is a collection of CSSs and/or supporting infrastructure services that intend to collaborate for mutually agreed purposes.

The four main objectives of SOCIETIES are:
• to facilitate community creation, organisation, management and communication via Community Smart Spaces, where pervasive computing is integrated with social computing communities
• to provide an enhanced user experience – both for individuals and entire user communities – based on proactive smart space behaviour and dynamic sharing of community resources across geographic boundaries
• to design and prototype a robust, open and scalable system for self-orchestrating Community Smart Spaces
• to evaluate, via three user trials, the usefulness and acceptance of the CSS software developed.

The main features that a CSS will offer its user are the following:
• support the manual or automatic creation of static or dynamic communities. The creator of a community will be able to advertise it, or to invite particular users to join, and to specify principles governing the community itself. The system will be able to discover potential new members, also via social network site queries, as well as detecting and removing community members whose ties have become obsolete
• enable users to share resources with other community members in a seamless, unobtrusive manner
• support multiple techniques for the discovery of relationships and behaviours within communities
• allow for the orchestration of the multiple communities to which its owner belongs, maintaining a registry of super- and sub-communities in community hierarchies, along with policies on information disclosure and service access for members of other related communities
• enable the proactive exchange of information on the situation, interests and resources of community members
• provide intelligent conflict resolution among the members of a community based on mediation and negotiation
• support ad-hoc communication at both intra- and inter-community levels, exploiting peer-to-peer communication techniques and wireless networking technologies.

The SOCIETIES consortium is coordinated by the Waterford Institute of Technology (Ireland), and includes Heriot-Watt University (United Kingdom), Soluta.Net (Italy), the Deutsches Zentrum fuer Luft- und Raumfahrt e.V. (Germany), the Zavod za Varnostne Tehnologije Informacijske Druzbe in Elektronsko Poslovanje (Slovenia), the Institute of Communication and Computer Systems (Greece), Lake Communications Ltd (United Kingdom), Portugal Telecom Inovacao Sa (Portugal), IBM Israel – Science and Technology Ltd (Israel), Institut Telecom (France), AMITEC Ltd (Greece), Telecom Italia s.p.a. (Italy), Trialog (France), Stiftelsen Sintef (Norway), and NEC Europe Ltd (United Kingdom). The project started in October 2010 and will end in March 2013.

Link:
http://www.ict-societies.eu

Please contact:
Stefania Marrara, Soluta.Net, Italy
E-mail: [email protected]

Figure 1: Mobile CSSs participate in one or more Community Interaction Spaces (CIS), which represent pervasive communities. Each CIS offers several characteristics to its CSSs, such as a set of shared resources or services.



Events

German-Austrian W3C Office at DFKI

The World Wide Web Consortium (W3C) has opened its German-Austrian office at a new location: DFKI, the German Research Centre for Artificial Intelligence. This collaboration was celebrated on February 10th with an event in Berlin. Speakers from W3C, industry and DFKI as well as German ministries gave presentations and made the event a highlight for European politics, industry and research.

The internationally renowned DFKI, with facilities in Kaiserslautern, Saarbrücken, Bremen, and a project office in Berlin, was founded in 1988 as a non-profit organization by several German IT companies and two research facilities. DFKI has long been an active member of W3C and has contributed to areas like the Semantic Web, EMMA, EmotionML, MathML and multimedia semantics. These achievements are one of several reasons for bringing the German-Austrian W3C office to DFKI. In his opening presentation, Wolfgang Wahlster (CEO of DFKI) emphasized the central role that the Web plays for any kind of IT-intensive industry. DFKI is ideally positioned in Germany to integrate W3C with research, politics and industry.

Participation in standardization efforts is a key means for DFKI to connect research with innovation and with the development of applications and products. In their presentations, Wolfgang Wahlster, Philipp Slusallek and Hans Uszkoreit gave an overview of the variety of W3C-related topics in which DFKI is engaged: Object Memory Modelling (OMM), the Unified Service Description Language (USDL), XML3D, and language technologies. Orestis Terzidis (SAP) expressed his hope that USDL will soon enter the regular W3C standardization process.

Andreas Goerdeler from the German Federal Ministry of Economics and Technology gave examples of BMWi-funded projects that have provided input to standardization within W3C. He also emphasized the importance of close cooperation between standardization bodies like W3C or ISO and their national counterparts.

Jeff Jaffe, CEO of W3C, gave a concrete and important example of how that cooperation is brought to life. Last November, W3C received the status of an "ISO/IEC JTC 1 PAS Submitter". This means that W3C Recommendations can easily enter the ISO process and become ISO specifications. Jeff Jaffe and Philipp Hoschka, W3C Ubiquitous Web domain lead, presented a great variety of current topics in Web-related standardization. These topics impressively demonstrated the demands of the IT-intensive industry. The presentation by Stefan Wess, CEO of the German company Attensity, illustrated the role of the Web in social media analysis.

The official part of the event ended with a contribution from Michel Cosnard, President of ERCIM and INRIA. Together with Wolfgang Wahlster, Michel Cosnard emphasized the central role that Web technologies and their standardization play for the European ICT infrastructure. To this end, the ICT Labs (a "Knowledge and Innovation Community" supported by the European Institute of Innovation & Technology) are an important means to connect IT-related education, research and innovation.

The wide variety of presentations and participants at the opening event of the German-Austrian W3C office at DFKI demonstrates that the office fulfils its role of bringing together heterogeneous communities such as politics, research and industry. The office will help to build steady bridges between these communities and to advance their common goal, ie to work on the future of the Web.

More information:
http://www.w3c.de/
http://www.w3.org/Consortium/Offices

From left: Michel Cosnard, Wolfgang Wahlster, Felix Sasaki, Jeff Jaffe and Andreas Goerdeler.

Euro-India Cooperation in Future Internet Research Experiments

by Sathya Rao

The FP7-funded MyFIRE project organised the first Future Internet Research Experiments (FIRE) workshop in Pune, India on 16 December 2010, together with its Indian partners. The workshop was co-located with the 'Beyond Internet' conference. The aim was to identify common interests in developing collaborative European-Indian experimental research platforms to study the Internet of the future.

MyFIRE plans to organize international FIRE-related workshops in the BRIC countries during its lifetime, extending future Internet research experiments to the international level by involving research testbeds and research institutes from emerging economies. As the first of this series, the project organized a workshop on 16 December 2010 in Pune (near Mumbai), the industrial city that hosts the research centre for India's national grid and cloud computing. A further reason for choosing Pune was to develop closer links with the international delegates participating in the ITU Kaleidoscope event addressing the 'Beyond the Internet' challenge, which was held at the same location on 13-16 December 2010.

The agenda and the presentations of the workshop can be seen at www.my-fire.eu/indian-workshop. The workshop was organized by ERNET India, the national education and research network, representing 1300 universities and research institutes. It had 136 participants in total, representing different stakeholders in future Internet research. Mr. N. Mohan Ram, Director General of ERNET India, welcomed the MyFIRE project partners, delegates and other guests, and presented ERNET India's activities including network services, international connectivity and research collaborations.

The main theme of the workshop was 'Euro-India Future Internet Research and Experiments Collaboration', assembling experts from both India and Europe to address research experiments and the different aspects of future Internet design. In this context, the chief keynote speakers were invited from the Indian communication ministry to present the landscape of Indian Internet research, networks and testbeds deployed and planned: Mr. B. K. Murthy (Head, NKN & Internet Governance / E-Infra Divisions, DIT, India) and Prof. Ashok Jhunjhunwala (Indian Institute of Technology Madras), an eminent scientist and recipient of multiple national awards, as well as Prof. Ramjee Prasad, Chairman of GISFI, and speakers from the Indian Institute of Science, the Indian Institute of Technology, the Centre for Development of Advanced Computing, the Department of Telecommunication of the Indian Government, and national contact point ECIR representatives. The workshop had two panels, 'Testing Environment for Development of Future Internet: Activities Landscape' and 'Future Internet Experiments: Opportunities for Cooperation Between Europe and India', so as to have interactive discussions. Testbeds and research experiments related to IPv6, QoS, sensor networks, mobility, interoperability and PlanetLab were part of the speakers' presentations.

From the Indian side, Prof. Ashok Jhunjhunwala delivered the keynote address and discussed the importance of inclusive and sustainable innovation for the Future Internet. He also shared his vision of ICT innovation for rural India. Dr. B. K. Murthy from the Department of Information Technology (DIT) spoke about Indian ICT landscapes and roadmaps, provided details on DIT initiatives, and presented the National Knowledge Network project initiated by the Government of India, which could be a model for many developed countries as well as for developing knowledge for society in the future.

The panel discussion in the morning session was chaired by Prof. Vinod Sharma, Chairman, Department of Electrical and Communications Engineering, Indian Institute of Science, Bangalore, to discuss a testing environment for development of the future Internet. The panel members were Dr. Malati Hegde (IISc, Bangalore), Mr. Badri Narayanan Parthasarathy (TATA) and Dr. A. Paventhan (ERNET India). Dr. Malati Hegde spoke on 6PANview, a network monitoring system for their 6LoWPAN network testbed. Mr. Badri Narayanan Parthasarathy gave a talk "Beyond the Internet – Innovation for Future Internet Services" and provided details about IPv6 features and business benefits. Dr. A. Paventhan presented a talk on research testbeds for Mobile IPv6 in heterogeneous access networks and media-independent handover services. Prof. Abhay Karandikar (IIT Bombay) presented "Next Generation Mobile Broadband: Challenges and Opportunities" and provided details on India's present telecom scenario, backhaul innovations and a low-cost, IP-based distributed architecture.

The afternoon panel discussion was chaired by Prof. Ramjee Prasad on the topic 'Future Internet Experiments: Opportunities for Cooperation between Europe and India'. The panel members were Mr. Jørgen Friis (ETSI), Viswanath Talasila (Honeywell), Sathya Rao (Telscom/ETSI) and Prof. Abhay Karandikar (IITB). The panel had a lively discussion answering questions on collaboration opportunities.

MyFIRE provided an overview of FIRE activities in Europe and the survey results of its findings. The over 135 participants were offered a comprehensive view of the FIRE initiative's priorities, projects and programmes in the European Union, while benchmarking Indian ICT research activities and initiatives, identifying the requirements of the research experiment landscape from providers' and users' perspectives, and offering a vision of the way ahead in this collaborative endeavour.

The discussions, both during the workshop and in private, showed considerable interest of Indian researchers in European FIRE activities and in closer collaboration, with possible participation in future projects. The list of interested persons is available on request to researchers in Europe for developing future collaboration with Indian partners.

A questionnaire was distributed to collect the participants' feedback and the results were overwhelmingly positive: the Indian scientific community looks forward to closer collaboration with European researchers, and would like more such events to develop closer networking and cooperation.

The MyFIRE project is organising the next workshop on 8 April 2011 in Beijing, China. Details of the workshop can be seen at http://www.my-fire.eu/chinese-workshop.

Opening session with keynote speech from Prof. Ashok Jhunjhunwala.

Link: http://www.my-fire.eu

Please contact:
Sathya Rao, Telscom, Switzerland
E-mail: [email protected]


DL.org Workshop on Digital Libraries, Open Access and Interoperability Strategies

by Stephanie Parker and Giuseppina Vullo

DL.org, an EC-funded Coordination Action on Digital Library Interoperability, Best Practices and Modelling Foundations, has adopted a comprehensive and innovative approach to top-level challenges in the field by harnessing the global expertise that exists through dedicated groups.

Digital Libraries should enable access to knowledge in multi-modal format to any citizen, anywhere, anytime, breaking down the barriers of distance, language and culture. Building and maintaining scientific e-Infrastructures, preserving cultural heritage, and supporting educational processes are just some examples of the value-add of Digital Libraries, which are by definition complex systems, intrinsically interdisciplinary and heterogeneous. Interoperability is a key step to ensuring that digital libraries continue to grow in a way that allows users to navigate through different sources within an integrated single environment. It is also a multi-layered, context-specific concept which can be analyzed at the organizational, semantic, and technical levels. The role of repositories and Open Access has been crucial in broadening the function of Digital Libraries within the research community. Open Access Repositories (OARs) have also enhanced the reputation of institutions by making their research more visible and bringing greater return on investment for funding agencies.

To weave together these topics central to advancing a collective mission that cuts across disciplines, professional roles and geographical boundaries, DL.org hosted a workshop on 4 February 2011 at the British Academy in London. The workshop shone the spotlight on existing frameworks and best practices key to achieving open, interoperable information systems by triggering a multi-disciplinary debate. Discussions proposed common strategies for interoperability and mechanisms for exchanging, sharing and integrating results between the Digital Library and Open Access Repository communities.

Leonardo Candela (ISTI-CNR, Italy), Vittore Casarosa (ISTI-CNR, Italy), and Giuseppina Vullo (HATII at the University of Glasgow, UK) showcased key DL.org outputs produced for the Library and Information Science community. The DL.org Digital Library Reference Model, stemming from the DELOS Network of Excellence, has been enhanced and expanded to reflect the investigations of international experts on content, functionality, user, policy, quality and architecture. A dedicated Checklist enables designers and assessors of a digital library to determine conformity with the Reference Model. The Cookbook, with a comprehensive and pragmatic approach, provides a portfolio of best practices and pattern solutions to common challenges in developing large-scale interoperable Digital Library systems.

Hans Pfeiffenberger (Alfred Wegener Institute, Germany) underscored the need to ensure long-term preservation of scientific knowledge, with policy embedded in data repositories to create digital data libraries with persistent accessibility. DataCite, a Digital Object Identifier (DOI) registration agency for research data, is now considering asking data repositories for some kind of certification, while also fostering global interoperability on a specific policy issue. Heather Joseph (SPARC, US), director of the Scholarly Publishing & Academic Resources Coalition, brought new policy insights from the US. Open Access serves as a compass point to ensure a more open and equitable system of scholarly communication, leveraging digital networked technology and ultimately reducing the financial pressure on libraries through a holistic approach. Clear trends are emerging at the higher policy level for Open Access to achieve the interoperability it promises by making it "the default".

The talk by Pablo De Castro (University Carlos III of Madrid, Spain) from the SONEX (Scholarly Output Notification & Exchange) Workgroup focused on deposit-related interoperability and the analysis of use cases, teaming up with initiatives already working on technical solutions. One of the impediments to collaboration between Current Research Information Systems (CRIS) and Open Access Repositories (OAR) in the UK stems from different approaches to metadata publishing, with data fragmented across different services. Peter Burnhill (EDINA, UK) advanced the idea of interoperability between repositories, and of repositories with the wider web. However, important questions include determining whether such an approach should be chiefly within and for the research and education sector or extend beyond it. Wolfram Horstmann (Bielefeld University) explored the many ways to interoperability, drawing a number of important conclusions. While interoperability is multi-levelled, a network rather than a layer model is needed. Semantics are a core challenge for Digital Library interoperability, and such a focus would allow the autonomy needed to meet heterogeneous needs. Simplicity is key to the uptake of standards.

The Round Table, chaired by Seamus Ross (University of Toronto), brought into sharp relief the need for a far-reaching approach to interoperability, as well as the need to address data management, including the data diversity that exists even within the same discipline. While data management is implemented in several large collaborative infrastructure projects, ownership of preservation and transfer of content remain important questions. Technology in itself is not enough. There needs to be a policy push to enforce the actions required while fostering effective dialogue across a spectrum of organisations.

Links:
http://www.dlorg.eu/
http://www.dlorg.eu/index.php/dl-org-events/digital-library-research-open-access-repositories

Please contact:
Donatella Castelli, ISTI-CNR, Italy
E-mail: [email protected]





Call for Participation

FMICS 2011 - 16th International ERCIM Workshop on Formal Methods for Industrial Critical Systems

Trento, Italy, 29-30 August 2011; co-located with RE 2011

The aim of the FMICS workshop series is to provide a forum for researchers who are interested in the development and application of formal methods in industry. In particular, FMICS brings together scientists and engineers who are active in the area of formal methods and interested in exchanging their experiences of the industrial usage of these methods. The FMICS workshop series also strives to promote research and development for the improvement of formal methods and tools for industrial applications.

Topics of interest include (but are not limited to):
• design, specification, code generation and testing based on formal methods
• methods, techniques and tools to support automated analysis, certification, debugging, learning, optimization and transformation of complex, distributed, real-time and embedded systems
• verification and validation methods that address shortcomings of existing methods with respect to their industrial applicability (eg, scalability and usability issues)
• tools for the development of formal design descriptions
• case studies and experience reports on industrial applications of formal methods, focusing on lessons learned or the identification of new research directions
• the impact of the adoption of formal methods on the development process and associated costs
• application of formal methods in standardization and industrial forums

The proceedings of the workshop will be published in the Springer Lecture Notes in Computer Science (LNCS) series.

Participants will present their papers in twenty minutes, followed by a ten-minute round of questions and discussion of the participants' work. Following the tradition of past editions, a special issue of an international scientific journal will be devoted to FMICS 2011.

More information:
http://events.fortiss.org/fmics2011/
http://www.inrialpes.fr/vasy/fmics/

Call for Papers and Participation

SPIRE 2011 - 18th International Symposium on String Processing and Information Retrieval

Pisa, Italy, 17-21 October 2011

The scope of the SPIRE series of symposia includes not only fundamental algorithms in string processing and information retrieval, but also SP and IR techniques as applied to areas such as computational biology, DNA sequencing, and Web mining. Given its interdisciplinary nature, SPIRE offers a unique opportunity for researchers from these different areas to meet and network.

Typical topics of interest include, but are not limited to:
• String Processing: dictionary algorithms, text searching, pattern matching, text and sequence compression, succinct and compressed indexing, automata-based string processing;
• Biological Sequence Processing: analysis of DNA and RNA sequencing data, molecular sequence processing, recognition of genes and regulatory elements, comparative genomics and population genetics;
• Information Retrieval: information retrieval models, indexing, ranking and filtering, interface design for IR, evaluation issues in IR, text analysis, text mining, text classification and clustering, information extraction, language models and topic models for search-related tasks, efficient implementation of IR systems, algorithms and data structures for IR;
• Search-related tasks: cross-lingual information retrieval, multimedia/multi-modal information retrieval, recommendation and collaborative filtering, semi-structured data retrieval, blog retrieval.

SPIRE 2011 will feature:
• a full day of tutorials
• a three-day technical program of invited and submitted papers
• a full day of workshops

The deadline for the submission of papers is 20 April 2011.

More information:
http://spire2011.isti.cnr.it/

Competition on Plagiarism Detection, Author Identification, and Wikipedia Vandalism Detection

held in conjunction with the CLEF'11 conference, Amsterdam, Netherlands, 19-22 September 2011

This year's edition is divided into three tasks, namely plagiarism detection, author identification, and Wikipedia vandalism detection.

Plagiarism detection in text documents is a challenging retrieval task: today's detection systems are faced with intricate situations, such as paraphrased plagiarism within and across languages. Moreover, the source of a plagiarism case may be hidden within a large collection of documents such as the Web, or it may not be available at all. Building on the successful evaluation framework developed in the last two years, we continue to add new challenges this year.

Author identification is the task of determining the true author of a text. Throughout history, and especially today, many texts have been written anonymously or under false names, so that readers may not be certain of a text's alleged author. Within author identification, one of the main challenges is to automatically attribute a text to one of a set of known candidate authors. For the




purpose of the evaluation, we have developed a new authorship evaluation corpus.
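As an illustration of the attribution task (a generic baseline sketch, not the PAN evaluation code; the sample texts and names below are invented), each candidate author can be profiled by character trigram frequencies, with the disputed text assigned to the most similar profile:

```python
from collections import Counter
from math import sqrt


def ngram_profile(text, n=3):
    """Character n-gram frequency profile of a text."""
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))


def cosine(p, q):
    """Cosine similarity between two frequency profiles."""
    dot = sum(p[g] * q[g] for g in set(p) & set(q))
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0


def attribute(disputed, candidates, n=3):
    """Return the candidate author whose known writing is most similar to the disputed text."""
    target = ngram_profile(disputed, n)
    return max(candidates, key=lambda a: cosine(target, ngram_profile(candidates[a], n)))


# Toy candidate set: author name -> known writing sample.
candidates = {
    "author_a": "the quick brown fox jumps over the lazy dog again and again",
    "author_b": "colourless green ideas sleep furiously in the garden of forking paths",
}
print(attribute("the lazy dog naps while the quick fox runs", candidates))
```

Real systems use much longer reference texts, length normalization and a proper evaluation corpus, but this nearest-profile idea is a common starting point for closed-set attribution.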

Vandalism has always been one of Wikipedia's biggest problems. However, the detection of vandalism is done mostly manually by volunteers, and research on automatic vandalism detection is still in its infancy. Hence, solutions are to be developed which aid Wikipedians in their efforts.

We invite researchers and practitioners from all fields to participate.

Important Dates

• Result submission deadline: 31 May 2011
• Notebook submission deadline: 30 June 2011

More information:
http://pan.webis.de

Call For Papers

IWPSE-EVOL'2011 - Joint ERCIM Workshop on Software Evolution and International Workshop on Principles of Software Evolution

Szeged, Hungary, 5-6 September 2011

Research in software evolution and evolvability has been thriving in the past years, with a constant stream of new formalisms, tools, techniques, and development methodologies. Research in software evolution has two goals: the first is to facilitate the way long-lived, successful software systems can be changed in order to cope with demands from users and the increasing complexity and volatility of the contexts in which such systems operate. The second goal is to understand, and if possible control, the processes by which demand for these changes comes about.

The IWPSE-EVOL workshop is the merger of the annual ERCIM Workshop on Software Evolution (EVOL) and the International Workshop on Principles of Software Evolution (IWPSE). The rationale for a common event is to capitalize on the synergies to be found when theorists and practitioners meet.

The 2011 edition of IWPSE-EVOL will be held in Szeged, Hungary, as a co-located event of ESEC/FSE'2011, the 8th joint meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering.

Topics

IWPSE-EVOL'2011 invites high-quality papers presenting experiments, surveys, approaches, techniques and tools related to the evolution of software systems. Topics of interest include, but are not limited to:
• application areas: distributed, embedded, real-time, ultra-large-scale, and self-adaptive systems, web services, mobile computing, information systems, systems of systems, etc.
• paradigms: support and barriers to evolution in aspect-oriented, agile, component-based, and model-driven software development, service-oriented architectures, etc.
• technical aspects: co-evolution and inconsistency management, impact analysis and change propagation, dynamic reconfiguration and updating; architectures, tools, languages and notations for supporting evolution, etc.
• managerial aspects: effort and cost estimation, risk analysis, software quality, productivity, process support, training, awareness, etc.
• empirical studies related to software evolution
• mining software repositories techniques supporting software evolution
• industrial experience of successes and failures related to software evolution
• interdisciplinary approaches: adaptation of evolutionary concepts and measures from other disciplines (biology, geology, etc.) to software evolution
• theories and models to explain and understand software evolution

Proceedings of the workshop will be published in the ACM Digital Library. A selection of the best papers will be considered for revision, extension, and publication in a special issue of an international journal.

IWPSE-EVOL will feature a special event dedicated to the memory of Prof. Manny Lehman, known as the “Father of Software Evolution”, who sadly passed away in December 2010.

Important dates:
• Abstract due: 12 May 2011
• Paper due: 15 May 2011
• Notification: 1 July 2011
• Workshop: 5-6 September 2011

IWPSE-EVOL’2011 is co-chaired by Romain Robbes (University of Chile, Chile) and Anthony Cleve (University of Namur, Belgium).

More information:
http://pleiad.cl/iwpse-evol/
http://2011.esec-fse.org/

ERCIM Working Group Software Evolution:
http://wiki.ercim.eu/wg/SoftwareEvolution/

Please contact:
Romain Robbes and Anthony Cleve
[email protected]

Call for Papers

1st EvAAL Competition on Indoor Localization and Tracking

EvAAL (Evaluating AAL Systems Through Competitive Benchmarking) is organizing a series of evaluation campaigns in order to stimulate the development of innovative benchmarking methodologies for Ambient Assisted Living systems.

In the initial phase, the focus is on specific, small-scale topics, with the aim of creating a large dataset, reusable for further experiments. In a second phase, it will be possible to evaluate and compare more complex systems. The topic of the 2011 edition is indoor localization and tracking, a key component for achieving context-awareness in AAL systems. Candidates must submit a paper describing their localization system. Submitters of accepted papers will be invited to compete at the CIAMI Living Lab in Valencia on 25-29 July. The winner will be announced during the AAL Forum (26-28 September 2011), a major event in the field of Ambient Assisted Living in Europe.

More information:
http://evaal.aaloa.org/current-competition/cfc


ERCIM NEWS 85 April 2011 63

In Brief

E-services for the Elderly at Home

The exponential growth in the ageing population in European countries is raising severe problems for the national health programmes, scrambling to create adaptable solutions for different medical situations and diseases. The “OLDES” project, funded by the 6th EU FP and coordinated by Massimo Busuoli and Marco Carulli, ENEA, Italy, aimed at tackling some of the challenges that e-Health for the elderly has to face today by implementing an affordable and customisable tele-cardiology system. The OLDES approach, based on a “participative design” concept, has developed an accessible and logical medical platform for the elderly, involving interested actors both in the implementation of the system and in the design/validation phases. The use of a low-cost PC, combined with essential tele-support and telemedicine services, has made OLDES a viable model for future service providers and regional social services interested in setting up a programme for remote diabetes and cardiology services. To test and validate the OLDES platform, two pilots have been implemented: one in Prague focused on 10 diabetes patients and one in Bologna targeted at 100 elderly people (10 with cardiologic diseases).
http://www.oldes.eu

CWI Spin-off Company VectorWise Sold to Ingres

High-tech spin-off company VectorWise from CWI has been sold to Ingres Corporation (USA), a leading open source database management company. VectorWise develops analytical database technology and was founded in 2008 by members of the pioneering CWI database research team: Peter Boncz, Marcin Zukowski, Sándor Héman and Niels Nes. Ingres Corporation has funded the spin-off company since its inception. VectorWise is based on scientific research results and derives its strength from a completely new approach to data processing. The approach makes use of vector processing on data sets in which every vector is tailored to the size of the cache memory of modern processors. The technology of VectorWise allows organizations to perform data analysis tasks that were previously not feasible. Application areas include logistics, science, medicine and healthcare. The development of VectorWise has spurred further scientific innovation, as other scientists around the world are using this state-of-the-art system as a foundation for further research. The generation of high-tech spin-off companies is an important method for CWI to convert fundamental knowledge into practical applications.

https://www.cwi.nl/news/2011/cwi-spin-company-vectorwise-sold-ingres-corporation
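The vector-at-a-time idea behind this style of engine can be illustrated with a minimal sketch. This is a hypothetical simplification, not VectorWise's actual implementation: the cache budget, item size, and the function name `scan_sum` are assumptions chosen for illustration. The point is that a column is processed in chunks ("vectors") sized to stay resident in CPU cache, with a tight kernel applied per chunk instead of interpreting one row at a time.

```python
# Illustrative sketch of vector-at-a-time column processing.
# All sizes are assumed values, not taken from any real engine.
CACHE_BYTES = 256 * 1024            # assumed per-core cache budget
ITEM_BYTES = 8                      # 64-bit values
VECTOR_SIZE = CACHE_BYTES // ITEM_BYTES  # items per vector

def scan_sum(column):
    """Sum a column by streaming cache-sized vectors through a tight loop."""
    total = 0
    for start in range(0, len(column), VECTOR_SIZE):
        vector = column[start:start + VECTOR_SIZE]  # chunk fits in cache
        total += sum(vector)                        # per-vector kernel
    return total

print(scan_sum(list(range(100_000))))  # prints 4999950000
```

In a real columnar engine the per-vector kernel would be compiled, branch-free code over contiguous arrays; the sketch only shows the chunking structure that keeps the working set cache-resident.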

German Climate and Environment Innovation Prize for Researchers at Fraunhofer SCAI

A research group from the Department of Optimization at the Fraunhofer Institute for Algorithms and Scientific Computing SCAI won the 25,000 Euro innovation prize for Climate and Environment (IKU) in the “Environmentally friendly products and services” category. The award is donated by the Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU) and the Federation of German Industries (BDI). The award was handed to the researchers in a ceremony on March 15 in Berlin. Worldwide, over 7,000 companies use the optimization solutions provided by Fraunhofer SCAI. The best-known applications are AutoNester (automatic placement of pieces onto textiles, leather, metal sheets and wood) and PackAssistant (optimizing packing configurations of identical parts into containers). The potential savings are enormous: “When cutting metal, wood or leather skins, our software saves up to 30 percent of material, depending on the industry,” says Dr. Ralf Heckmann, Director of the Department of Optimization at Fraunhofer SCAI.
http://www.iku-innovationspreis.de

Van Wijngaarden Award 2011 for Éva Tardos and John Butcher

On the occasion of the 65th anniversary of CWI in Amsterdam, the Hungarian computer scientist Éva Tardos and the New Zealand mathematician John Butcher received the Van Wijngaarden Award 2011 on 10 February 2011. The award is intended for scientists who have contributed significantly to their fields. Éva Tardos is Schurman Professor of Computer Science at Cornell University in Ithaca, USA. Her research interests comprise algorithms and algorithmic game theory, in which she especially takes selfishness into account. The mathematician John Butcher, Emeritus Professor at the University of Auckland in New Zealand, works on numerical methods for ordinary differential equations. Applications include the simulation of waves. The Van Wijngaarden Award is named after Adriaan van Wijngaarden (1916 – 1987), one of the founders of computer science in the Netherlands.
http://www.cwi.nl/soiree-65-year-centrum-wiskunde-informatica

Photo: Peter Lowie


ERCIM is the European Host of the World Wide Web Consortium.

Institut National de Recherche en Informatique et en Automatique
B.P. 105, F-78153 Le Chesnay, France
http://www.inria.fr/

Technical Research Centre of Finland
PO Box 1000
FIN-02044 VTT, Finland
http://www.vtt.fi/

Irish Universities Association
c/o School of Computing, Dublin City University
Glasnevin, Dublin 9, Ireland
http://ercim.computing.dcu.ie/

Austrian Association for Research in IT
c/o Österreichische Computer Gesellschaft
Wollzeile 1-3, A-1010 Wien, Austria
http://www.aarit.at/

Norwegian University of Science and Technology
Faculty of Information Technology, Mathematics and Electrical Engineering
N-7491 Trondheim, Norway
http://www.ntnu.no/

Polish Research Consortium for Informatics and Mathematics
Wydział Matematyki, Informatyki i Mechaniki, Uniwersytetu Warszawskiego
ul. Banacha 2, 02-097 Warszawa, Poland
http://www.plercim.pl/


Consiglio Nazionale delle Ricerche, ISTI-CNR
Area della Ricerca CNR di Pisa, Via G. Moruzzi 1, 56124 Pisa, Italy
http://www.isti.cnr.it/

Centrum Wiskunde & Informatica
Science Park 123, NL-1098 XG Amsterdam, The Netherlands
http://www.cwi.nl/

Foundation for Research and Technology – Hellas
Institute of Computer Science
P.O. Box 1385, GR-71110 Heraklion, Crete, Greece
http://www.ics.forth.gr/

FORTH

Fonds National de la Recherche
6, rue Antoine de Saint-Exupéry, B.P. 1777
L-1017 Luxembourg-Kirchberg
http://www.fnr.lu/

FWO
Egmontstraat 5
B-1000 Brussels, Belgium
http://www.fwo.be/

FNRS
rue d’Egmont 5
B-1000 Brussels, Belgium
http://www.fnrs.be/

Fraunhofer ICT Group
Friedrichstr. 60
10117 Berlin, Germany
http://www.iuk.fraunhofer.de/

Swedish Institute of Computer Science
Box 1263, SE-164 29 Kista, Sweden
http://www.sics.se/

Swiss Association for Research in Information Technology
c/o Professor Daniel Thalmann, EPFL-VRlab, CH-1015 Lausanne, Switzerland
http://www.sarit.ch/

Magyar Tudományos Akadémia
Számítástechnikai és Automatizálási Kutató Intézet
P.O. Box 63, H-1518 Budapest, Hungary
http://www.sztaki.hu/

Spanish Research Consortium for Informatics and Mathematics
D3301, Facultad de Informática, Universidad Politécnica de Madrid
Campus de Montegancedo s/n, 28660 Boadilla del Monte, Madrid, Spain
http://www.sparcim.es/

Science and Technology Facilities Council, Rutherford Appleton Laboratory
Harwell Science and Innovation Campus
Chilton, Didcot, Oxfordshire OX11 0QX, United Kingdom
http://www.scitech.ac.uk/

Czech Research Consortium for Informatics and Mathematics
FI MU, Botanicka 68a, CZ-602 00 Brno, Czech Republic
http://www.utia.cas.cz/CRCIM/home.html

Order Form

If you wish to subscribe to ERCIM News free of charge, or if you know of a colleague who would like to receive regular copies of ERCIM News, please fill in this form and we will add you/them to the mailing list.

Send, fax or email this form to:
ERCIM NEWS
2004 route des Lucioles
BP 93
F-06902 Sophia Antipolis Cedex
Fax: +33 4 9238 5011
E-mail: [email protected]

Data from this form will be held on a computer database. By giving your email address, you allow ERCIM to send you email.

I wish to subscribe to the
o printed edition   o online edition (email required)

Name:
Organisation/Company:
Address:
Postal Code:
City:
Country:
E-mail:

You can also subscribe to ERCIM News and order back copies by filling out the form at the ERCIM Web site at http://ercim-news.ercim.eu/

ERCIM – the European Research Consortium for Informatics and Mathematics – is an organisation dedicated to the advancement of European research and development in information technology and applied mathematics. Its national member institutions aim to foster collaborative work within the European research community and to increase co-operation with European industry.

Portuguese ERCIM Grouping
c/o INESC Porto, Campus da FEUP, Rua Dr. Roberto Frias, nº 378, 4200-465 Porto, Portugal