Hardware question: Arduino support in Interface


I just helped someone connect up a very simple BCI (brain-computer interface) using a hacked MindFlex and an Arduino board (details here). The headset sends a very slow (1Hz) stream of EEG data to the Arduino, which displays it in the serial monitor. He had a spare MindFlex headset, which I bought from him.

I thought it might make a fun project for HiFi - I could control the size of objects, the weather, and maybe even the gait of a walk animation based on the headset’s data - if I could get hold of the data stream in JS. Given the open source, flexible nature of Arduino, integrating it with HiFi opens up a lot of other fun possibilities too…

So my question is: what would be the best way to get a feed of Arduino data into JS?
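For anyone wondering what that 1Hz stream actually looks like: with the usual MindFlex hack it arrives as one CSV line per second over serial. Here’s a rough sketch of parsing such a line in JS - the field order below is an assumption based on the common layout (signal quality, attention, meditation, then the EEG power bands), so check it against your own Arduino sketch’s output:

```javascript
// Parse one CSV line from a hacked MindFlex's serial stream.
// ASSUMED field order (verify against your Arduino sketch):
// signalQuality, attention, meditation, then 8 EEG power bands.
function parseMindFlexLine(line) {
  var fields = line.trim().split(',').map(Number);
  return {
    signalQuality: fields[0],   // NeuroSky: 0 = good contact, 200 = no contact
    attention:     fields[1],   // 0-100
    meditation:    fields[2],   // 0-100
    bands:         fields.slice(3) // delta, theta, alphas, betas, gammas
  };
}

// Example with a made-up reading:
var reading = parseMindFlexLine('0,53,41,12345,2345,345,456,567,678,789,890');
console.log(reading.attention); // 53
```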


Hmmm. I have an Arduino here waiting to translate between an elliptical trainer and a walk animation. Exercise bike and rowing machine could also be accommodated (if I had them - or even room for them in here! :smile: )

I did buy a Leonardo model so I could just send the data as simulated keyboard, mouse and/or joystick activity.


@LaeMing - sounds like we’re having similar ideas. I think the use of HMDs with exercise machines has been an overlooked application so far - I thought a while back of hooking up a treadmill and an HMD to enable jogging anywhere, with anyone, but I hadn’t got as far as thinking of Arduino as the glue. That’s a great idea!

So, I guess we should be asking a more general question about existing / future Arduino support in HiFi - I think something more solid than mapped keyboard / controller input is probably called for. If we can find a way to make Arduino serial output available to JS in HiFi, so much would be possible…

Any ideas / insights anyone?


Hey there.

I recall seeing packet data used with @philip and a smartphone sending gyro info to HiFi via UDP.

I also know @MichelleLeckrone was interested in the same thing. I’ve sent data packets to X-Plane 10 via UDP from an Arduino (using an Ethernet Shield), and if the recipient program receives the packet in a proper format, it’s much faster than wrapping it in JS (the plug-in method).

If you want to go down the Arduino >> JS >> HiFi path, what you’ll want is to use the Ethernet Shield (or WiFi Arduino shield) to serve a webpage that runs JS, and create a WebEntity in HiFi to read from it.

I’m no HiFi engineer, but that’s how I’d approach it.

I think you’re better off getting info about the datarefs in HiFi and sending packets directly.
Good Luck!


WebSockets might be one possibility … @thoys has a PR that adds them to Interface’s JavaScript: https://github.com/highfidelity/hifi/pull/5594/files


FWIW, a few weeks ago I looked through the C++ for a way to slipstream quaternions over UDP, but couldn’t see an easy path.

In my case, synchronous XHR GETs ended up being sufficient to pull JSON armature data at ~15-30Hz (localhost to localhost, with Blender).
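For anyone wanting to try the same approach, here’s a rough sketch of the polling loop. The transport is injected as a plain function so the same logic can wrap a synchronous XHR in Interface or anything else; `makePoller` and the JSON shape are my own placeholder names, not an existing API:

```javascript
// Poll a JSON source and hand each parsed frame to a callback.
// `get` is any function returning the current JSON string -- in Interface
// it could wrap a synchronous XMLHttpRequest; injecting it keeps the
// logic transport-agnostic (and testable outside Interface).
function makePoller(get, onFrame) {
  return function poll() {
    var text = get();
    if (text) {
      onFrame(JSON.parse(text));
    }
  };
}

// Usage sketch (Interface): Script.setInterval(makePoller(xhrGet, handler), 50);
// 50ms ≈ 20Hz, inside the ~15-30Hz range that worked for the armature data.
```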

I’d probably have gone with the WebSockets PR instead, except the extra layer of indirection can make debugging more difficult.

To mitigate that, maybe you could prototype the WebSocket client logic itself in .js / .html in Chrome first? In theory that’d provide two safety nets. First - an easy way to confirm everything “up to” HiFi was working: *Thought -> Brain -> EEG -> Arduino (socket server) -> Network -> Workstation (socket client) -> JavaScript*.

And second - if the PR pans out quickly enough, the .js can become a Script.include; or, as a stop-gap, the .html can be loaded directly into a WebWindow instead. Note that WebWindows already support WebSockets, so the .html case becomes a variant of MetaverseCafes’ suggestion (the web page could read the data and then emit it over the EventBridge).
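To make that concrete, here’s a hedged sketch of the .html-side script. The message handler is factored out so the same code can be exercised in Chrome first; `EventBridge.emitWebEvent` is the call I believe WebWindow pages use to talk to Interface JS, but treat that name as an assumption:

```javascript
// Forward each WebSocket message to whatever "emit" function is available:
// EventBridge.emitWebEvent inside a HiFi WebWindow, console.log in Chrome.
function handleMessage(data, emit) {
  emit(JSON.stringify({ type: 'eeg', line: data }));
}

// Wiring sketch (browser / WebWindow side):
// var emit = (typeof EventBridge !== 'undefined')
//   ? function (msg) { EventBridge.emitWebEvent(msg); }  // assumed API
//   : function (msg) { console.log(msg); };              // Chrome safety-net
// var ws = new WebSocket('ws://localhost:8080/');  // port = whatever you gave websocketd
// ws.onmessage = function (event) { handleMessage(event.data, emit); };
```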

On the Arduino side, since several people seem interested, it might be worth trying to get http://websocketd.com/ installed/compiled on there - which, once working, allows arbitrary WebSocket servers to be created as simple STDIO programs.

// BCI.readBrainFarts(function(readings) { ... }, ms)


Thank you for all the replies - after a bit of research, I now have a plan!

I don’t have an Arduino Ethernet shield board, so I’m going to have a look at:

Human brain -> MindFlex -> Arduino -> websocketd -> WebWindow -> regular JS

When @thoys’s PR goes through, I’ll drop the WebWindow step. Once I have it up and running, I’ve decided to skip the easier stuff and dive straight into mapping emotion onto avatar animation. I found an awesome paper on the subject a while back that describes a technique for projecting emotional content onto walk animations independently of character proportions - i.e. it works for avatars of different shapes and sizes. Reading through it, the technique doesn’t look too daunting to implement:

A reusable model for emotional biped walk cycle animations with implicit retargeting
Marco Romeo, Marcelo Dematei, Julian Bonequi, Alun Evans, and Josep Blat
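Before tackling the paper’s technique, the Interface side could start very simply - e.g. mapping attention onto a walk-speed or scale parameter. A sketch of my own (the clamping and normalising are mine; the 0-100 range is the usual NeuroSky attention scale):

```javascript
// Map a 0-100 attention value onto an animation parameter in [min, max].
// Kept as a pure function so it can be tuned and tested outside Interface.
function attentionToParam(attention, min, max) {
  var a = Math.max(0, Math.min(100, attention)) / 100; // clamp, then normalise
  return min + a * (max - min);
}

console.log(attentionToParam(50, 0.5, 1.5)); // 1 (midway through the range)
```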



This could get really interesting…

  • davedub