
Interoperability/Reusability of high level WebAudio components

Games on Web W3C Workshop, Seattle 2019

Michel Buffa, Université Côte d’Azur

France, I3S/CNRS/INRIA labs

buffa@i3s.unice.fr / @micbuffa

Who am I?

● Professor / researcher at Université Côte d’Azur, in the South of France

○ Member of the WIMMICS research group, common to INRIA and the I3S lab (CNRS)

● National coordinator of the WASABI ANR research project, with WebAudio at its heart

● W3C Advisory Committee Representative for UCA

● I participate in the WebAudio working group

Some ambitious WebAudio examples...

● The audio graphs of these apps use high-level WebAudio nodes and/or AudioWorklets + WebAssembly.

But… no plugin standard, no “hosts”, no programming model... We find:

● Some very good JavaScript libraries (e.g. Tone.js)

● Some open source GitHub repositories (e.g. https://webaudiodemos.appspot.com/)

● Some online tools for music synthesis (genish.js, etc.)

● Some DSLs for DSP programming (FAUST, etc.)

● Some effects and instruments

In early 2018, with some researchers and developers, we decided to start working on an open plugin standard for WebAudio.

We formed a team of researchers and developers who share the same concerns, with different approaches:

● 1 - Bringing native developers to the Web
a. Jari Kleimola (Aalto University, Espoo, Southern Finland, now at Webaudiomodules.org)
b. Oli Larkin (developer of VirtualCZ, Endless Series, WDL-OL/iPlug, iPlug2)

● 2 - Bringing low-level DSP developers to the Web
a. Stéphane Letz (senior researcher at GRAME, Lyon, co-author of the FAUST DSL/compiler)

● 3 - Attracting Web developers / JavaScript audio app developers
a. Tatsuya Shinyagaito, aka g200kg (audio and WebAudio developer, huge WebAudio contributor, Yokohama, Kanagawa, Japan)
b. Jérôme Lebrun and Michel Buffa (I3S/INRIA)

An open standard = API/specification? Or more…?

● Use URIs: support local or remote plugins, audio and MIDI routing

● Asynchronous events in the lifecycle

● Plugins can be headless or with a GUI

● API for the “audio processor part”, as close as possible to AudioNode
○ e.g. use plugin.connect(gain) or gain.disconnect(plugin), etc. (see the sketch below)

● Propose an API or at least guidelines on how to package and publish plugins on REST repositories

● Avoid naming conflicts (HTML ids, JS names, CSS rules), metadata...

Be Web-Aware!
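As a rough illustration of the AudioNode-like programming model described above, here is a minimal host-side sketch. The plugin object shape and its gui property are assumptions for illustration, not the exact WAP SDK API:

const ctx = new AudioContext();
const gain = ctx.createGain();

// Insert an already-instantiated plugin into the graph, exactly like an AudioNode.
function insertPlugin(sourceNode, plugin) {
  sourceNode.connect(plugin);
  plugin.connect(gain);
  gain.connect(ctx.destination);
  // Plugins can be headless or expose a GUI element the host attaches to the page
  // (the gui property is a hypothetical name).
  if (plugin.gui) document.body.appendChild(plugin.gui);
}

// Remove it again with the same disconnect() idiom.
function removePlugin(sourceNode, plugin) {
  sourceNode.disconnect(plugin);
  plugin.disconnect(gain);
  sourceNode.connect(gain); // bypass the plugin
}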

[Diagram: native plugins (C/C++) written as VST, JUCE, etc., FAUST, Max/MSP, Pure Data and others are brought into the Web browser as WebAudio Plugins (WAPs) / WebAudioModules (WAMs), via JavaScript, WebAssembly and AudioWorklet.]

Simple example of plugin loaded by host...
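The original slide showed code; below is a minimal sketch of what such a host-side load might look like. The descriptor file name, its fields and the PluginFactory interface are assumptions for illustration, not the exact WAP SDK:

// Hypothetical host: the URI identifies a plugin bundle on a local or remote repository.
async function loadPlugin(ctx, pluginURI) {
  // 1. Fetch the plugin metadata (name, entry point, parameters...).
  const descriptor = await (await fetch(pluginURI + '/descriptor.json')).json();
  // 2. Dynamically import the plugin's JavaScript entry point.
  const { default: PluginFactory } = await import(pluginURI + '/' + descriptor.entry);
  // 3. Instantiate the audio processor part; load() is assumed to resolve
  //    to an object exposing the AudioNode-like interface shown earlier.
  return new PluginFactory(ctx).load();
}

// Usage in a host (names are illustrative):
// const plugin = await loadPlugin(audioCtx, 'https://example.org/wap/pingpongdelay');
// guitarInput.connect(plugin); plugin.connect(audioCtx.destination);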

Low-level DSP code automatically compiled to WASM (e.g. FAUST .dsp files running in an AudioWorklet)
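A simplified sketch of how such compiled DSP typically runs in the browser, assuming the WASM kernel needs no imports and exports a hypothetical processSample() function; the real FAUST/WAP glue code is more elaborate:

// dsp-processor.js: loaded with audioWorklet.addModule(), runs on the audio thread.
class DspProcessor extends AudioWorkletProcessor {
  constructor(options) {
    super();
    // The compiled WebAssembly.Module is passed from the main thread via
    // processorOptions (WebAssembly.Module is structured-cloneable).
    const instance = new WebAssembly.Instance(options.processorOptions.wasmModule, {});
    this.processSample = instance.exports.processSample; // hypothetical export
  }
  process(inputs, outputs) {
    const input = inputs[0][0];
    const output = outputs[0][0];
    if (input && output) {
      for (let i = 0; i < output.length; i++) {
        output[i] = this.processSample(input[i]); // one mono sample at a time
      }
    }
    return true; // keep the processor alive
  }
}
registerProcessor('dsp-processor', DspProcessor);

// Main thread: compile the WASM once, then create the AudioWorkletNode.
async function createDspNode(ctx) {
  await ctx.audioWorklet.addModule('dsp-processor.js');
  const wasmModule = await WebAssembly.compileStreaming(fetch('dsp.wasm'));
  return new AudioWorkletNode(ctx, 'dsp-processor', {
    processorOptions: { wasmModule }
  });
}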

Conclusion: Where are we today?

● SDK for JS developers

● FAUST scripts and IDE that compile .dsp files to WAPs, embedded GUI builder, publishing to remote servers

● WebAudioModules C/C++ toolchain for native audio developers (and WAMs are WAPs)

● Multiple examples of hosts that load plugins dynamically using URIs

● Tools: plugin validator, repository validator, GUI editor

● Synths, drum machines, audio effects...

Check the GitHub!

Check the pedalboard demo!

Guitar Hero / Rocksmith for the Web? Yes, this is possible :-)

● Good low-CPU amp simulation + effects

● Pitch detection (e.g. for a tuner)

The latency problem still needs to be addressed on Windows/Android (guitar to speaker):

● Mac OS X: 18-23 ms with an audio buffer of 2x128 samples (Chrome, Edge, Opera); Firefox 32 ms; Safari 70 ms

● Windows: Chrome/Edge 81 ms, Firefox 109 ms

AudioContext.outputLatency is not implemented by any browser yet; baseLatency is implemented only by Chrome.
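A quick way to check what the current browser reports (both attributes are part of the Web Audio API spec; values are in seconds):

const ctx = new AudioContext();
// Feature-detect the latency attributes; support varies per browser.
console.log('baseLatency:', ctx.baseLatency !== undefined ? ctx.baseLatency : 'not supported');
console.log('outputLatency:', ctx.outputLatency !== undefined ? ctx.outputLatency : 'not supported');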
