Did some work this weekend on making np_epoc a little more usable, since v0.1 had the VID/PID and encryption key hardcoded for the specific headset I own.


The VID/PID and key are now user-selectable. You can also query the device count to make sure you have the VID/PID pair set up correctly.
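For anyone unfamiliar with the convention: USB vendor/product IDs are 16-bit values usually written as four-digit hex. As a rough sketch of the kind of parsing involved when a user supplies a VID/PID pair (the function name, string format, and example value here are purely illustrative, not np_epoc's actual interface):

```python
def parse_vid_pid(spec):
    """Parse a 'VID:PID' hex string (e.g. '21A1:0001') into an (int, int) pair.

    Illustrative helper only; not part of np_epoc itself.
    """
    vid_str, pid_str = spec.split(":")
    vid, pid = int(vid_str, 16), int(pid_str, 16)
    # USB vendor and product IDs are both 16-bit values.
    if not (0 <= vid <= 0xFFFF and 0 <= pid <= 0xFFFF):
        raise ValueError("VID/PID out of 16-bit range: %s" % spec)
    return vid, pid

print(parse_vid_pid("21A1:0001"))  # -> (8609, 1)
```

A device count check would then just be a matter of enumerating attached USB devices and counting those whose descriptor matches the parsed pair.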

Hopefully emokit will handle key and ID detection soon, but that's going to take a little more work, since we're not yet sure how the key is derived on connection.

Binaries are available on the SourceForge NP Labs release site. Currently I've only got OS X 10.6 Max 5 binaries up. The Pd external has been tested on Linux and works fine, but the source package is... not the best at the moment, so I'll hopefully have OS X and Linux Pd binaries up soon. I'm still trying to figure out how I'm going to build Windows binaries at all, since VS2010 is giving me problems building against flext.

They say he who dies with the most maintainerships... dies very tired.

The emokit project, started by Daeken, aims to provide a free driver to access raw data coming from the Emotiv EPOC headset. However, he's been really busy being awesome elsewhere lately, so after picking up the decode key for the special pre-release unit, writing a C implementation of the library, and fielding some support emails, I (Kyle) have finally just gone ahead and taken the lead maintainer role on the project.

Many thanks go to Daeken for the initial work on getting the library and community together, and hopefully he'll come back to visit at some point.

The new main repo is at


Next big steps for the project are:

  • Isolating the power level readings
  • Finishing up and formalizing the C library
  • Getting a full v0.1 release out

I also develop the np_epoc external for Max/Pd. I expect I'll be updating the external along with anything we get done on the headset itself, so keep an eye on my personal externals page.

It's been a while since I've started up a new website, but I found myself doing enough work on health hardware that I decided to spin it out into its own site. Not to mention, there are already a ton of drivers and software projects out there, but there's been no central place to record them thus far. openyou.org will feature posts about open source health technology, as well as information on library and driver projects, to give developers and users new ways to learn about themselves via code.

Hopefully I manage to keep it up a little better than I have here and slashdong lately, too.

np_thirdspacevest Max/MSP External

As of last night, I got Windows, OS X, and Linux support working in the C version of libthirdspacevest for the TN Games Third Space Vest, so it seemed like a good idea to do my usual followup and make a Max/MSP and PureData external for it. There's now a np_thirdspacevest repo, and a matching binary distribution site on SourceForge. I've just released version 0.1; the source code and Max 5 external are on SourceForge. I'll build more externals as I need them or as I get demand, but, well, I've managed to go from getting the hardware to full cross-platform drivers and libraries for 3 languages in the span of a week, so I'm just gonna sit here and feel proud of myself for the moment.

Presenting libthirdspacevest, the open source, cross-platform driver for the TN Games Third Space Vest, by Kyle Machulis/Nonpolynomial Labs.

Because apparently it wasn't enough to just work on the Kinect.

Third Space Vest

This is a USB-controlled vest with 8 air cells in it, which can deliver quick haptic force via pneumatics. In other words, it's a vest that can simulate being shot, by way of making the user feel like a roll of bubble wrap. I've also heard it described as "being poked by gnomes. And not with their fingers." While I cannot vouch for the physical accuracy of that statement (yet), it does fit well with the mental image of the feeling.
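Since the vest's actual packet format is encrypted (more on that below) and I'm not reproducing it here, here's just a toy sketch of the obvious way to address 8 cells: one bit per cell in a single byte. The cell numbering and the idea of a bitmask byte are both hypothetical, not the vest's real protocol.

```python
def cell_mask(cells):
    """Pack a list of cell indices (0-7) into a one-byte bitmask.

    Purely illustrative; the Third Space Vest's real protocol is
    encrypted and is not what this shows.
    """
    mask = 0
    for c in cells:
        if not 0 <= c <= 7:
            raise ValueError("vest only has 8 cells")
        mask |= 1 << c  # set the bit for this cell
    return mask

# Fire three cells at once (hypothetical numbering):
print(hex(cell_mask([0, 3, 7])))  # -> 0x89
```

In a real driver, a byte like this would go out as part of a larger command packet over USB.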

As usual, I decided to reverse engineer it and write my own drivers, which will eventually turn into Max/Pd externals, and also another project that I'll be announcing later this week.

This normally wouldn't be a problem, but this time around, much like the issues emokit ran into, the manufacturer encrypted the protocol while distributing a free (pre-compiled, closed source) SDK. I ended up writing up the procedure for reversing out the protocol and building the library, since it's one of the more complicated processes I've had to go through to get new hardware working. I was thinking about posting the doc here, but it's rather long.

Anyways, it's now done, and there's a proof-of-concept Python implementation using PyUSB in the repo. I'll be extending this to a C API as soon as possible, just to get it over with, since there's not a ton of functionality to the vest.

To close, here's a video of my cat versus the vest (running my code!):

Oh, what a crazy month it has been.

Kinect Take Apart Image

The Microsoft Kinect camera came out on Nov. 4th, and has eaten a good portion of my month, in a good way. Work with the OpenKinect Community has been one of the best open source project experiences I've ever had. I'm the lead code integrator on the project, as well as doing some platform support. I haven't had a lot of time to work on my own Kinect projects, but it's been fun to watch what comes out of the libraries. The code repo is available on the OpenKinect Organization GitHub site.

In terms of press around the Kinect project, I did an interview with New Scientist magazine, as well as a talk at DorkbotSF. Video of the talk (the audio is horrible, but I might redo it later) is available in 3 parts:

Jerkcity Image

A few weeks ago, I got pretty dire food poisoning. I eventually felt better, and for some reason ended up with an emacs jerkcity mode too. I'm not sure how these things happen.

For a quick rundown on maintenance of other projects and interesting news:

  • libnifalcon got a v1.0.2 release, mainly to fix very nasty build system issues.
  • On that note, I'm taking a step back and re-evaluating what compily_buildd really needs to do, as it at some point got WAY overcomplicated and is making my code exceedingly difficult for others to build.
  • Looks like someone has started up a project to reverse engineer the Phantom Omni haptic device.
  • np_mindset seems to have a pretty major bug in the binary release. Going to try to get this fixed and out ASAP.

After 10 months of silence, it's time to get things moving on here again. First off, I've created a new website for all of my Max/MSP and PureData externals. After 5 years of creating externals and having people find them randomly, it seemed like a good time to start advertising them like an actual developer.


np_mindset version 1.1.5 released

I've also been getting a few requests for my NeuroSky external lately, which I never actually finished after the biometric presentation project last year. So, I've tightened it up, released one non-working version, and have now released one slightly less non-working (or more working) version! Version 1.1.5 should allow you to reliably pull data from the MindSet using Max/MSP, though CPU usage is still on the high side, and it still crashes horribly on PureData. There will be a version 1.2 release that addresses these issues soon, but this should work for the time being.

For the past month, I've been living in Vienna as part of an artist residency with monochrom. The main goal of this residency was to complete some projects for roboexotica, the robotics cocktail party held each year here in Vienna. This year, roboexotica is being held December 3-6, and now that I've actually seen my projects pour some drinks, I figured it's time to present them to the world.

First off, there's Adult Mario, the Mario game that drinks and vibrates!

Then there's Bartris, the Tetris that's also a bartender.

I'll have a post next week that goes into the implementation specifics of these projects, but for now, all of the code is available at http://www.github.com/qdot/bartris.

While I was out flying my stunt kites this weekend, the wind ended up being a little iffy. So, after flying a bit, I decided to see what kind of things I could put together using just my iPhone. Thus, the 'Day at the Ports' project was born.

This consists of two and a half videos (the half being a test I took just to see if looping was going to work correctly). All of these were taken using viewfinders around the park I was at, Middle Harbor Park in Oakland.

Both videos use a combination of tracks generated by rjdj, mixed with the wind and environment sounds picked up by the iPhone mic while recording the video.

The first one is "Construction from Far Away".

rjdj page for original audio clip

I was really happy with this one. The quality of the viewfinder, along with my shaky cinematography, ended up inducing a bit of a dreamlike state in the video, and the soundtrack matched up just perfectly.

The second one is called "A Headless Sutro".

rjdj page for original audio clip

This one ended up a little harsher than I think I originally meant it to be. The viewfinders I was using for it seemed to move very quickly, so I decided to use an audio scene that was a little choppier. However, it also came out a bit... schizophrenic in the end.

To show what I was using to take the video, and also just to remind myself where all the buttons were in the video software, I made a little sample video. It uses William Basinski's The River, because, well, water. I'm original like that.

All in all, I'm really happy with how everything turned out. I managed to make something relatively neat while just screwing around waiting for the wind to adjust its attitude (which it never did. Stupid nature.).