
An Interactive Garment for Orchestra Conducting: IoT-enabled Textile & Machine Learning to Direct Musical Performance

Berit Greinke* (Berlin University of the Arts; Einstein Center Digital Future, Berlin)
Giorgia Petri* (Berlin University of the Arts; Einstein Center Digital Future, Berlin)
Pauline Vierne* (Berlin University of the Arts)
Paul Biessmann* (Berlin University of the Arts)
Alexandra Börner* (Berlin University of the Arts)
Kaspar Schleiser* (Freie Universität Berlin; Einstein Center Digital Future, Berlin)
Emmanuel Baccelli* (Freie Universität Berlin; Einstein Center Digital Future, Berlin)
Claas Krause* (VKKO Orchestra, Berlin)
Christopher Verworner* (VKKO Orchestra, Berlin)
Felix Biessmann* (Beuth University Berlin; Einstein Center Digital Future, Berlin; [email protected])

ABSTRACT
We present an overview and initial results from a project bringing together orchestra conducting, e-textile material studies, costume tailoring, low-power computing and machine learning (ML). We describe a wearable interactive system comprising textile sensors embedded into a suit, low-power transmission, and gesture recognition using creative computing tools. We introduce first observations made during the semi-participatory approach, which placed the conductor's movements and personal performative expressiveness at the centre of technical and conceptual development. The project is a two-month collaboration, still ongoing, between the Verworner-Krause Kammerorchester (VKKO) and technical and design researchers. Preliminary analyses of the data recorded while the conductor is wearing the prototype demonstrate that the developed system can robustly decode a large number of conducting and performative movements. In particular, the user interface of the ML system is designed such that the training of the algorithms can be intuitively controlled by the conductor, in sync with the MIDI clock.

CCS CONCEPTS
• Human-centered computing → Interactive systems and tools.

* All authors contributed equally to this research.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).

TEI '21, February 14–17, 2021, Salzburg, Austria
© 2021 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-8213-7/21/02.
https://doi.org/10.1145/3430524.3442451

KEYWORDS
electronic textiles, music, low-power computing, machine learning, interaction design

ACM Reference Format:
Berit Greinke, Giorgia Petri, Pauline Vierne, Paul Biessmann, Alexandra Börner, Kaspar Schleiser, Emmanuel Baccelli, Claas Krause, Christopher Verworner, and Felix Biessmann. 2021. An Interactive Garment for Orchestra Conducting: IoT-enabled Textile & Machine Learning to Direct Musical Performance. In Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '21), February 14–17, 2021, Salzburg, Austria. ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3430524.3442451

1 INTRODUCTION
An orchestra conductor's motion constitutes a distinctive form of communication, combining standardised instructions with personal performative variations. While conductors mostly use hand gestures and eye contact to guide the musicians, conducting is a vigorous full-body act requiring coordination and observant interaction. In this paper we describe an ongoing project which studies the use of e-textile sensors and gesture recognition technologies to track an orchestra conductor's movements. The outcome is a tailored interactive garment which is to be used as a performative musical tool in a live performance of the VKKO. Wearable technology for music is often used to create engaging live performances in which the performer is not hidden behind a laptop screen. These performances rely on the wearable acting as a controller, replacing or enhancing an existing audio interface or digital sound controller. Borrowing musicologist Bielawski's description of musical instruments as transformers that turn the gestures of a musician into musical gestures [9], we explore this concept by using the garment to transform the conductor's motion into musical gestures. We are interested in shifting the emphasis from the wearable as a control organ and user interface to a collaborative relationship between the wearer, their motion and actuation.


2 RELATED WORK

2.1 E-textiles for Music Making
Textile interactive materials have been increasingly used to inspire new formats for musical live performances. While sound is not a major consideration in textile design (except to eliminate unwanted sound), artists and designers have embroidered, woven, knitted, printed and knotted musical interfaces and instruments (e.g. [5, 15, 16, 18]). [14] report on how designers go through the process of exploring textile sensing material and sound output when sensor functionality is not fully determined. [17] use technology probes to explore "creative and disruptive" uses for e-textile sensors in performances that include sound. Bridging textiles with music is also investigated in algorithmic practices, such as live coding [11] and "heritage algorithms" [8] bridging textile crafting and live music making. While e-textiles feature in numerous electronic musical performances, they are little explored in setups involving classical musicians or orchestra ensembles. Examples are mostly restricted to augmenting individual instrument players, e.g. [15]. As a result, these wearables mostly reach audiences which are already familiar with augmented musical performance, limiting the potential of the technology to widen the experience of other types of music.

2.2 Wearable IoT Hardware & Software Platforms

In the context of the Internet of Things (IoT), progress in energy-efficient embedded hardware over the last decade has democratised the use of System-on-Chip (SoC) solutions which combine microcontrollers (e.g. Arm Cortex-M, RISC-V, ESP32), energy-efficient radios (e.g. Bluetooth Low Energy, IEEE 802.15.4) and a variety of sensors, GPIOs etc. As such, a SoC amounts to a tiny single-board computer which can connect to the network. Compared to microprocessor-based hardware, a SoC trades smaller computing and memory capacities for lower energy consumption. The power consumption of a SoC in this category is typically in the milliwatt (mW) range, so a very small battery can suffice; in some cases, battery-less operation is even a prospect^1. All in all, the relatively low price (around 10€) and small form factor (thumb-sized) make it possible to embed such systems in a variety of wearables, e.g. smart watches^2 or electronic tokens sewn into clothing items.
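To give a rough feel for the power budget, the back-of-the-envelope calculation below shows how far a small battery stretches at milliwatt-level draw. The battery capacity and average consumption figures are illustrative assumptions, not measurements from this project.

```python
# Back-of-the-envelope runtime estimate with assumed figures:
# a 100 mAh battery at 3.7 V holds ~0.37 Wh; at 5 mW average draw
# a SoC-based wearable could run for roughly three days.
battery_wh = 0.100 * 3.7          # 100 mAh * 3.7 V = 0.37 Wh (assumed cell)
avg_draw_w = 0.005                # 5 mW average consumption (assumed)
print(battery_wh / avg_draw_w)    # ~74 hours of operation
```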

A variety of open source embedded software platforms and open-specification protocols have been developed to interact with and programme such hardware. For simple use cases, Arduino or MicroPython scripts can be used, for instance. For more advanced programming and performance fine-tuning, a variety of real-time embedded operating systems are available [6]. To interoperate with remote elements in a distributed system, network connectivity and security for a SoC can be provided by adapted standard Internet protocols [13, 19].

^1 ONiO.zero energy-harvesting MCU: https://www.onio.com/article/onio-zero-batteries-mcu.html
^2 PineTime low-power smart watch: https://www.youtube.com/watch?v=WdOHOHGS0d8
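As an illustration of the "simple use cases" mentioned above, a minimal MicroPython loop can sample a resistive textile sensor on an ADC pin and stream the raw readings. This is a sketch only: the pin number and sample rate are assumptions, and the project's own firmware is based on RIOT (see Section 4), not MicroPython.

```python
# Minimal MicroPython sketch (illustration only, not the project's firmware):
# sample a resistive textile sensor via an ADC pin and stream the readings.
from machine import ADC, Pin
import time

sensor = ADC(Pin(26))          # hypothetical ADC-capable pin
SAMPLE_PERIOD_MS = 20          # 50 Hz, assumed sufficient for arm gestures

while True:
    value = sensor.read_u16()  # raw reading, 0..65535
    print(value)               # a real system would send this over BLE
    time.sleep_ms(SAMPLE_PERIOD_MS)
```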

2.3 Machine Learning for Musical Expression
Machine Learning (ML) has been used for over 25 years for New Interfaces for Musical Expression (NIME; see [3] for a review). Various open source libraries are currently used to enrich musical live performances, such as TensorFlow [12]. New open source software makes it simple to build ML models in the browser [10] and to use ML models with digital audio workstations (DAWs) [4]. There is exciting research aiming at novel ML models, for instance for sound generation [12]. Using the tool Wekinator [2] in this work, we chose to employ ML methods primarily to improve the usability of textile sensors with unknown functional properties.

Figure 1: e-Textile prototype: IoT system architecture.
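To make the coupling with Wekinator concrete, the sketch below forwards one frame of sensor values as an OSC message using the python-osc package. Wekinator's default input port (6448) and input address (/wek/inputs) are assumptions based on its documentation, and the sensor values are placeholders.

```python
# Sketch: forwarding textile sensor readings to Wekinator as OSC messages.
# Requires the python-osc package. Wekinator's default input port (6448)
# and address (/wek/inputs) are assumptions based on its documentation.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 6448)

def send_inputs(sensor_values):
    # Wekinator expects one float per configured input channel
    client.send_message("/wek/inputs", [float(v) for v in sensor_values])

# e.g. one frame from six textile sensor channels (placeholder values)
send_inputs([0.42, 0.13, 0.88, 0.05, 0.31, 0.77])
```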

3 METHODS
Qualitative and quantitative methods were combined, linking technical evaluation with material experiments, conceptual tailoring and interaction design considerations. While the project is still in progress, we can provide a first overview of methods.

• Observations of the conductor's motion, recorded as notes and drawings by the research team members during weekly rehearsal sessions. Each team member was tasked with observations that would fall into their disciplinary domain (e.g. the textile researcher inspecting e-textile responsiveness; the ML developer inspecting ML performance) to gain a comprehensive understanding of the entire system.

• Recorded sensor data was analysed and linked to video recordings of the conductor. The videos were transcribed when oral feedback from the conductor provided additional insights.

• Iterative prototyping was used to assess material functionality, aesthetics and interaction. We started from a simple textile sensor layout that allowed use of an interactive jacket early in the project. Design options were discussed with the conductor and team, and next steps were decided.

4 EXPLORATORY STUDIES
In this section we describe the underlying performance context and the design process. Technical iterations were done in parallel with conceptual work.

As a preliminary technical step, we developed a generic wireless wearable IoT system architecture, depicted in Fig. 1. A low-power SoC uses RIOT [1] and Bluetooth Low Energy to interconnect, in real time, the analogue output of the textile sensors with a machine learning workstation (based on Wekinator^3) and a digital audio workstation (running Max and Ableton Live). The textile wearable was developed in three stages. We started with a modular sensor setup, exploring potential types and placements for textile sensors based on a conductor's jacket. In the second stage the sensors were adapted towards interaction. In the third stage a prototype was developed for performance on stage.

^3 http://www.wekinator.org

Figure 2: Initial prototypes: (a) Stretch sensors (b) Knitted tube (c) Jacket with modular attached stretch sensors

4.1 Concept
VKKO blends traditional acoustic instrument playing with electronic soundscapes and improvisational jazz. Describing themselves as a "music machine" that brings together "contemporary classical music and Dub Techno, Neo-Impressionistic string-clouds and dense Electronic Rhythms"^4, VKKO's live performances bridge classical music with a dance floor experience. The number of performing musicians varies between 12 and 18. The orchestra was founded by two conductors and composers, who also act as artistic directors. Both conductors combine operative conducting with performative elements, seamlessly linking standardised conducting movements (e.g. beat patterns) with expressive movements usually more associated with club or alternative music culture (e.g. headbanging). For the work described here, we focused on one of the two conductors. At the time of writing, the performance structure was not finalised; however, the concept and the succession of musical parts were discussed throughout the collaboration. The development of the textile technology and an ML setup appropriate for the conductor's movements also has an influence on the development of the composition.

4.2 Preliminary prototypes
The aim of the first textile exploration was to get an understanding of the conductor's response to an interactive garment with sound feedback. Two prototypes were made:

(1) Knitted tube for performative aspects: Exploring how the conductor deals with restricted movement;

(2) Suit jacket for standardised movement aspects: Exploring how the conductor deals with a familiar garment.

Sensor layout and materials: Three stretch sensors were made, using Eeonyx LTT-SLPA-20 as variable resistor strips, and Shieldex Technik-tex P130+B to stabilise the ends (Fig. 2). The sensors were attached to the base fabrics with safety pins, and held in place with additional fabric pins.

^4 http://www.kammerorchester.eu/info

Technical results: Investigating the general functionality of the sensors in this session, we made the following observations:

• All sensors were responsive. Sensor ranges for the tube were larger than for the jacket, due to the stretch capability of the tube fabric and the use of hands and arms inside the tube for additional stretch.

• The placement of sensors was adjusted during testing to identify the areas with most stretch. Other areas on the jacket that appeared to have larger movements were marked with tape for potential sensor placement in the next step.

Interaction observations: Upon being dressed with both prototypes, the conductor began exploring the sound response in relation to his movements. Despite no previous experience with body-worn sensors, he immediately felt comfortable in both prototypes. Observing his movements, we identified four 'movement modes':

• Classic conducting moves: Standardised patterns or expressions, e.g. beat patterns;

• Signature performative movements: Personal movements or expressions which the conductor uses regularly or has used in previous performances;

• Try and learn: Adapting existing movements while familiarising with sensor positions, reactions and sound response;

• Performative control: Bolder movements that result from being informed about sensor placement and function, paired with performative curiosity.

These were often combined or performed in short, flexibly assembled sequences. An important aspect during this phase of exploration was that the movements should be distinguishable by both the ML algorithm and the audience. This helped guide the exploration towards simpler and more expressive or performative movement types.

4.3 Improved Prototype for Interaction Exploration
Over the next five rehearsal sessions, the 'jacket' prototype was used as a basis for further development. The goals were to:

• Explore textile sensor technologies for better integration and refined aesthetics, adapted to observed movement patterns;

• Explore gesture following and interaction with the system.

Sensor layout and materials: It was important to provide sensors with robust behaviour, delivering repeatable electrical and mechanical performance. Further material experimentation was conducted in parallel, studying in-situ polymerisation [7]^5 for customising sensor placement and aesthetics. The microcontroller was securely fastened on the jacket and fitted with snap buttons. Cables were replaced by conductive copper threads encased in thin knitted tubes.

Technical results: We recorded data from six sensors with the prototype during a series of movements (see Figure 3) and analysed the data with respect to the dynamical properties of the sensors and their dependency on various movement types. In Figure 4 we show that different sensors and sensor placements can have very different dynamic properties. In this case, sensor 1 (channel 0) on the right elbow was very responsive and captured distinct high-density regions of sensor values with different movements, while other sensors were less responsive to movements and only showed peak activation for very few values. We conclude that, as expected, the types of sensors and their location have a direct and strong effect on the statistical properties of the recorded signals.

^5 See https://hackaday.io/project/168380-polysense for an updated recipe.

Figure 3: a) Movement 1: Resting position; b) Movement 2: Fermata; c) Movement 3: Expressive entrance; d) Movement 4: Hush; e) Movement 5: Attention 'rhythm section'. Right: sensor distribution.

Table 1: Overview of sensors in the improved prototype
Sensor 1 (ch 0): Right elbow (out), stretch sensor strap
Sensor 2 (ch 1): Back (horizontal), stretch sensor strap
Sensor 3 (ch 2): Right shoulder, resistive knitted tube
Sensor 4 (ch 3): Left armpit, pressure sensor foam
Sensor 5 (ch 4): Left shoulder, pressure sensor foam
Sensor 6 (ch 5): Right elbow (in), resistive knitted tube

Figure 4: Histograms of two different textile sensors illustrating different conductive properties and how they relate to sensor responses to different movements. Left: The histogram of sensor 1 shows pronounced peaks for different movement types; the sensor captures a wide dynamic range of values. Sensor 6 shows very different properties, with most values being close to zero and a few large values.
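The per-movement histogram inspection behind Figure 4 can be reproduced along these lines; the data array and labels below are random placeholders standing in for one recorded sensor channel.

```python
# Sketch of the histogram inspection behind Figure 4, with placeholder data:
# one histogram of a single sensor channel per movement type.
import numpy as np
import matplotlib.pyplot as plt

values = np.random.rand(1000)                 # placeholder channel readings
labels = np.random.randint(1, 6, size=1000)   # placeholder movement labels

for m in range(1, 6):
    plt.hist(values[labels == m], bins=30, alpha=0.5, label=f"Movement {m}")
plt.xlabel("sensor value")
plt.legend()
plt.show()
```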

Next we analysed the principal directions of variance in the data using principal component analysis (PCA). In Figure 5 we show the variance explained by individual principal components (PCs). Most variance is explained by the first two components, and the statistical properties of the first two PCs reflect the two types of sensors whose histograms are shown in Figure 4.
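For reference, this PCA step takes only a few lines with scikit-learn; the array X below is a random placeholder with the same shape as the recorded data (one column per sensor).

```python
# Minimal sketch of the PCA analysis described above, assuming the
# recordings form a NumPy array X of shape (n_samples, 6).
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(206, 6)              # placeholder for recorded sensor data

pca = PCA().fit(X)
print(pca.explained_variance_ratio_)    # variance explained by each PC
scores = pca.transform(X)[:, :2]        # projection onto the first two PCs
```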

After analysing the data of all movements in aggregation, we also investigated how well a ML model can predict the different movements from the sensor activations in offline experiments. The results of this offline analysis are shown in Table 2. We performed an 80%/20% train/test split of the recorded data and used a k-Nearest Neighbor (KNN) classifier with k = 3 (optimised with grid search) to predict each one of the five movement types performed. The results in Table 2 show that all movements in the test set could be predicted with perfect accuracy, precision and recall. This shows that despite the different sensor properties, illustrated in Figure 4, a simple ML model was able to predict the correct movement type reliably. We conclude that the prototype yields data that can be easily modelled with a ML model for transforming the sensor readings into control signals defined by the conductor, or more generally the user of the prototype.

Figure 5: Principal Component Analysis (PCA) of 6 sensors recorded during different movements. Left: Variance of all sensors explained by individual principal components (PCs). The first and second PC explain more than 80% of the variance. Right: The first and second PC capture the same sensor dynamics as in Figure 4, representing the two sensor types.
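The offline classification analysis follows a standard scikit-learn recipe; the sketch below shows the analogous steps with placeholder data in place of the recorded movements.

```python
# Sketch of the offline movement classification: an 80%/20% train/test
# split and a k-NN classifier with k chosen by grid search (the paper
# reports k=3 as optimal). X and y are placeholders for the real data.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

X = np.random.rand(1030, 6)                 # placeholder sensor frames
y = np.random.randint(1, 6, size=1030)      # placeholder movement labels 1..5

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

grid = GridSearchCV(KNeighborsClassifier(), {"n_neighbors": [1, 3, 5, 7]})
grid.fit(X_train, y_train)

print(classification_report(y_test, grid.predict(X_test)))
```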

Interaction observations: These sessions focused on improving the sensors' electrical resistance range and investigating suitable ML models. Over the course of five weeks, the conductor learned how movements related to the success of gesture recognition, and developed a repertoire of suitable static positions and gesture sequences. For the performance setup, two ML methods were selected (a rough offline analogue is sketched after the list):

• Classification to learn and recognise fixed body positions: used to trigger big audio events;

• Regression to learn and recognise movement sequences: used to control continuous sound values (e.g. volume, filter).
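In the live system these mappings are trained in Wekinator; purely for illustration, a scikit-learn analogue of the two mappings could look as follows, with all data, labels and thresholds being placeholders.

```python
# Illustrative scikit-learn analogue of the two mappings above (the live
# system uses Wekinator): a classifier for static poses that trigger
# discrete audio events, and a regressor for continuous control values.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, KNeighborsRegressor

X = np.random.rand(500, 6)              # placeholder sensor frames
poses = np.random.randint(0, 3, 500)    # placeholder pose labels
volume = np.random.rand(500)            # placeholder continuous target

pose_clf = KNeighborsClassifier(n_neighbors=3).fit(X, poses)
vol_reg = KNeighborsRegressor(n_neighbors=3).fit(X, volume)

frame = np.random.rand(1, 6)            # one incoming sensor frame
if pose_clf.predict(frame)[0] == 2:     # a recognised static pose...
    print("trigger audio event")        # ...would fire a discrete event
print(vol_reg.predict(frame)[0])        # continuous control value, e.g. volume
```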


            Movement 1  Movement 2  Movement 3  Movement 4  Movement 5  accuracy  macro avg  weighted avg
precision   1.0         1.0         1.0         1.0         1.0         1.0       1.0        1.0
recall      1.0         1.0         1.0         1.0         1.0         1.0       1.0        1.0
f1-score    1.0         1.0         1.0         1.0         1.0         1.0       1.0        1.0
support     39.0        45.0        37.0        39.0        46.0        1.0       206.0      206.0

Table 2: Results of the offline analyses of movement classification with a simple ML model demonstrate that robust and high-quality predictions are possible with the prototype.

Figure 6: Jacket part of the tailored interactive suit, and sensor distribution.

4.4 Final prototype: Tailored Costume
In the final design step, the jacket prototype was used as a blueprint for tailoring a suit-based garment for the conductor (see Figure 6), adapted to sensor use and performance settings. The goal was to create a garment with a fit similar to the prototype so far, allowing us to reuse the textile sensors and the electronic setup. The final prototype is currently used in rehearsals for an upcoming performance; therefore technical results and interaction observations cannot yet be described.

Sensor layout and materials: A different type of textile material was used (polyester gauze), which resulted in a considerably stiffer jacket. This required an alteration of all sensors, because it limited the stretch of the sensors previously tested. In addition, the outer material exerted additional force onto the sensors. All final sensors were mounted onto a lining (polyester mesh and cotton calico), with customised material, shape and fit. Two sensors used in-situ polymerisation on non-woven fabric (both shoulders), sandwiched between two layers of conductive stretch fabric, forming pressure sensors. A stretch sensor mounted across the back was prototyped from polymerised spacer mesh fabric. Two sensors mounted on the insides of the elbows, and one sensor on the right knee, were made from polyurethane foam and Eeonyx LTT-SLPA-20 stretch fabric.

5 DISCUSSION
We would like to reflect on a few first insights:

5.1 E-textiles
To support performative collaboration between conductor and suit, textile sensors have been developed specifically for conducting movements. We used an in-situ polymerisation technique that allowed us to define shape, dimensions and base materials. More time is needed to make the process controllable; however, we have shown that sensors prototyped this way, even in a DIY setup, are a reliable alternative to store-bought sensor materials for our purpose.

5.2 Gesture recognition and following
Applying creative computing tools allowed the team to progress consistently. The mapping between gesture and sound can be intuitively defined by the conductor, simply by recording some training data. It also enabled rapid development progress for the entire team: once a new prototype was made with different materials, shapes and sensors, we were able to train the ML algorithm without any modification on the software side. Importantly, the recording of training data was synced with the MIDI clock of the digital audio workstation (DAW) and the recordings were triggered by the conductor himself (a sketch of such a transport-synced trigger follows below). Of the three tested methods, classifying static poses has been the most reliable, achieving 100% accuracy during most rehearsals. It should be noted that new training is needed as soon as the prototype is used in a different way or at different times of the day. Given the sensitivity of e-textiles to environmental conditions and wear, sensor readings can vary vastly. More work is needed to investigate whether this could, for example, be solved by adding training data over a longer period.
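As an illustration of such transport-synced recording, the sketch below gates data capture on MIDI real-time messages using the mido library; the port name and the capture logic are assumptions, not the project's actual implementation.

```python
# Hedged sketch: gate training-data recording on the DAW's MIDI transport
# using the mido library. The port name is hypothetical; the real system's
# wiring between DAW, Wekinator and sensors is more involved.
import mido

recording = False
with mido.open_input("DAW Sync Out") as port:   # hypothetical MIDI port name
    for msg in port:
        if msg.type == "start":                 # conductor starts a take
            recording = True
        elif msg.type == "stop":                # conductor ends the take
            recording = False
        elif msg.type == "clock" and recording:
            pass  # capture the current sensor frame, quantised to the clock
```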

5.3 Interaction
The VKKO conductor garment is an interesting use case for several reasons. First, the combined operative and performative movements allowed us to explore reliability and repeatability on one side, and less predictable but more collaborative interaction between wearer and garment on the other. Second, the conductor was comfortable in his role and used to a scenario in which (albeit in different ways) his movements lead to a musical response. This enabled us to enter the iterative prototyping stages promptly and helped to focus on technical and interaction development. While a final assessment of the interaction cannot be given at this time, the rehearsals have shown that the system recognised both standardised and personal performative movements, which were used creatively by the conductor to manipulate sound output.

6 CONCLUSION AND OUTLOOK
While the project is still ongoing, we have developed a proof of concept and prototype for an interactive conductor's suit. We have demonstrated that gestures of the conductor can be recognised and used for performative action, employing e-textiles, low-power computing and machine learning. In summary, we believe that our system can complement existing work on ML software libraries by helping to further democratise the usage of ML tools in creative disciplines, including textile wearable technology. First results of e-textile development, sensor performance and interaction observations have been described. In the next step we will analyse video and sensor data in detail, to gain more insights about the technical developments. Following the performance of the orchestra in November 2020, our focus will be to learn about the conductor's approach to interacting with the system in orchestra rehearsals and live performances.

REFERENCES
[1] Emmanuel Baccelli. 2018. RIOT: An open source operating system for low-end embedded devices in the IoT. IEEE Internet of Things Journal 5, 6 (2018), 4428–4440.
[2] Rebecca Fiebrink. 2009. Wekinator.
[3] Rebecca Fiebrink and Laetitia Sonami. 2020. Reflections on Eight Years of Instrument Creation with Machine Learning. In International Conference on New Interfaces for Musical Expression.
[4] Adrian Freed. 2017. opensoundcontrol.org: An Enabling Encoding for Media Applications. http://opensoundcontrol.org
[5] Berit Greinke. 2011. Twiddletone. Technical Report. Queen Mary University of London. http://beritgreinke.net/research/research-placement-culture-lab/
[6] Oliver Hahm and others. 2015. Operating systems for low-end devices in the Internet of Things: a survey. IEEE Internet of Things Journal 3, 5 (2015), 720–734.
[7] Cedric Honnet, Hannah Perner-Wilson, Marc Teyssier, Bruno Fruchard, Jürgen Steimle, Ana C. Baptista, and Paul Strohmeier. 2020. PolySense: Augmenting Textiles with Electrical Functionality using In-Situ Polymerization. In CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, 1–13. https://doi.org/10.1145/3313831.3376841
[8] Shelly Knotts, Jack Armitage, and Alex McLean. 2020. Heritage algorithms. https://hybrid-livecode.pubpub.org/workshop2020
[9] Tellef Kvifte. 2016. Musical Instruments and User Interfaces in Two Centuries. In Material Culture and Electronic Sound. Smithsonian Institution Scholarly Press, Washington, Chapter 8, 203–229.
[10] Louis McCallum and Mick S Grierson. 2020. Supporting Interactive Machine Learning Approaches to Building Musical Instruments in the Browser. In Proceedings of the International Conference on New Interfaces for Musical Expression. Birmingham City University, Birmingham, UK, 322–324.
[11] Alex McLean, Giovanni Fanfani, and Ellen Harlizius-Klück. 2018. Cyclic Patterns of Movement across Weaving, Epiploke and Live Coding. Dancecult: Journal of Electronic Dance Music Culture 10, 1 (2018), 5–30.
[12] Parag Mital. 2017. Generate your own sounds with NSynth. https://magenta.tensorflow.org/nsynth-fastgen
[13] Roberto Morabito and Jaime Jiménez. 2020. IETF protocol suite for the Internet of Things: Overview and Recent Advancements.
[14] Charlotte Nordmoen, Jack Armitage, Fabio Morreale, Rebecca Stewart, and Andrew McPherson. 2019. Making Sense of Sensors: Discovery Through Craft Practice With an Open-Ended Sensor Material. In Proceedings of the 2019 Designing Interactive Systems Conference. Association for Computing Machinery, New York, 135–146. https://doi.org/10.1145/3322276.3322368
[15] Hannah Perner-Wilson and Mika Satomi. 2018. Trombone Breathing Vest. https://www.kobakant.at/KOBA/trombone-breathing/
[16] Afroditi Psarra. 2014. Lilytronica. http://afroditipsarra.com/index.php?/on-going/lily/
[17] Sophie Skach, Anna Xambó, Luca Turchet, Ariane Stolfi, Rebecca Stewart, and Mathieu Barthet. 2018. Embodied Interactions with E-Textiles and the Internet of Sounds for Performing Arts. In Proceedings of the Twelfth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '18). Association for Computing Machinery, New York, 80–87. https://doi.org/10.1145/3173225.3173272
[18] Paola Torres. 2017. The Augmented Soma Suit: Wearables for Enhanced Somatic Experiences. Technical Report. Södertörn Högskola.
[19] Hannes Tschofenig and Emmanuel Baccelli. 2019. Cyberphysical Security for the Masses: A Survey of the Internet Protocol Suite for Internet of Things Security. IEEE Security and Privacy 17, 5 (2019), 47–57. https://doi.org/10.1109/MSEC.2019.2923973