How to use frame averaging as a simple method to smooth keypoint data from PoseNet in p5.js.

Google’s TensorFlow.js PoseNet model is extremely useful. It returns real-time pose estimation data from just a webcam, which allows for all sorts of accessible embodied interactions (like novel navigation modes and online dance parties) in the browser. Using the ml5.js library, it’s also pretty easy to get up and running.

The keypoint data returned from PoseNet is pretty noisy, so it’s helpful to add some smoothing to the points for better user experiences. …
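A minimal sketch of the frame-averaging idea (the class and names here are mine for illustration, not code from the article): keep a short buffer of each keypoint’s recent positions and return their mean every frame.

```javascript
// Minimal frame-averaging smoother for a single 2D keypoint.
// `size` is how many recent frames to average: a larger buffer
// gives smoother but laggier motion.
class PointSmoother {
  constructor(size = 5) {
    this.size = size;
    this.frames = []; // buffer of recent { x, y } samples
  }

  // Add the latest raw point and return the running average.
  smooth(point) {
    this.frames.push(point);
    if (this.frames.length > this.size) this.frames.shift();
    const n = this.frames.length;
    const sum = this.frames.reduce(
      (acc, p) => ({ x: acc.x + p.x, y: acc.y + p.y }),
      { x: 0, y: 0 }
    );
    return { x: sum.x / n, y: sum.y / n };
  }
}
```

In a p5.js sketch you would keep one `PointSmoother` per keypoint (nose, wrists, and so on) and pass each raw PoseNet position through `smooth()` before drawing.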


Documentation and reflection on “Recreating The Past,” a ten-week course on computational art taught by Zach Lieberman at the School for Poetic Computation.

Bridget Riley’s work (left). My digital recreation in openFrameworks (right).

I had the opportunity to take Zach Lieberman’s Recreating The Past course this summer. A silver lining of the Covid pandemic is that the course was offered for the first time online.

Over the ten weeks we studied influential artists in the computational arts, recreated their works, and later responded to them, mostly using openFrameworks (OF).

Week 1: Vera Molnar


Machine learning is paving the way for the internet to be a source of creativity and collaboration

Photo: Gunther Kleinert/EyeEm/Getty Images

This is the second of two articles about designing for machine learning on the web. This article addresses machine intelligence as a creative collaborator; the first discusses the body as input. This series of articles was originally published by the Machine Intelligence Design team at Adobe Design.

Among the final presentations I recently witnessed of Live Web, a graduate course at New York University’s Interactive Telecommunications Program (ITP), machine learning on the web played an important role. …


With machine learning, our bodies may be the next step in the way we interact with the internet

Photo: PeopleImages/Getty Images

This is the first of two articles about designing for machine learning on the web. This article discusses the body as input; the second addresses machine intelligence as a creative collaborator. This series of articles was originally published by the Machine Intelligence Design team at Adobe Design.

I recently attended the final presentations of Live Web, a graduate course conceived and taught by Shawn Van Every at New York University’s Interactive Telecommunications Program (ITP). …


We’ve given computers our minds. We now need to lend them our bodies.

This article was originally published in Adjacent Issue 4: Bodies and Borders

Computers are ubiquitous. Aided by the internet, they have infiltrated our individual, social, and political lives. With the expansion of artificial intelligence and machine learning, they are poised to have even greater impact. But we are at a crossroads.

Put simply, there is a problem with placing increasing trust in our machines. Never mind that the average person, and even the well-educated one, doesn’t understand computers very well; computers don’t understand humans very well either.

Illustration by Matt Romein

Machines are increasingly tackling real-world issues without real-world inputs, and this greatly affects their…


Learnings and workflow from sending 3D color and depth images across networks in real time.

A few years ago I started working on an open source tool called Kinectron, which affectionately takes its name from its two major components: Kinect + Electron. The software streams data from the Kinect V2 over the internet, making it available client-side in the browser through an (easy to use ;) API.

One big challenge that I’ve run into is working with volumetric, or 3D, images across networks and in the browser. I’ve struggled to read, store, transfer, and unpack them. I’ve found workable, but far from perfect, solutions at each step. …
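One recurring sub-problem in that pipeline is getting 16-bit depth values through channels built for 8-bit image data. A hypothetical sketch of one common approach (these helpers are illustrative, not Kinectron’s actual implementation): split each depth value into a low byte and a high byte on the way out, then reassemble on arrival.

```javascript
// Pack 16-bit depth values into pairs of 8-bit values so they
// survive transport through 8-bit-per-channel image pipelines.
function packDepth(depthArray) {
  const packed = new Uint8Array(depthArray.length * 2);
  for (let i = 0; i < depthArray.length; i++) {
    packed[i * 2] = depthArray[i] & 0xff;            // low byte
    packed[i * 2 + 1] = (depthArray[i] >> 8) & 0xff; // high byte
  }
  return packed;
}

// Reassemble the original 16-bit depth values on the receiving end.
function unpackDepth(packed) {
  const depth = new Uint16Array(packed.length / 2);
  for (let i = 0; i < depth.length; i++) {
    depth[i] = packed[i * 2] | (packed[i * 2 + 1] << 8);
  }
  return depth;
}
```

The trade-off with any byte-packing scheme is that lossy image compression (like JPEG) will corrupt the high bytes, so packed depth generally needs a lossless transport.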


Avoiding common pitfalls when working with cameras and other sensors in public settings.

Flappy Shadow by Tong Wu and Kai-che Hung at the ITP Winter 2017 Show. See more Kinectron experiments.

Back in 2016, I built Kinectron, a tool that sends volumetric and skeletal data online using the Microsoft Kinect. Since then, Kinectron has become widely used by students at NYU ITP, especially in the end of semester shows. After getting some feedback from students, I thought it would be helpful to write a quick post with some tips for using Kinectron in public installations. …


One of the most frequent questions I get from people working with the Kinect and Kinectron is: what is the skeleton actually made up of, and how is that represented? Although many people have written about this, I haven’t been able to find one place that gathers all the info I usually end up sharing, so I’ve decided to compile it here.

The Kinect Skeleton System

Before I get started, it’s important to note that everything in this post is about the Kinect for Xbox One, what most people call the Kinect Version 2.

Kinect for Xbox One (aka Kinect V2). Image from MS Developer blog.

The Kinect can track up to…
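As a rough illustration of the kind of structure involved (the field names here are hypothetical, not Kinectron’s exact API): each tracked joint typically carries a 3D position plus a tracking state, and a sketch usually filters to confidently tracked joints before drawing.

```javascript
// Hypothetical shape for one tracked skeleton: an array of joints,
// each with a 3D position (meters, camera space) and a flag for
// whether the sensor is confident in the estimate this frame.
const exampleSkeleton = {
  joints: [
    { name: 'head',      x: 0.02, y: 0.61, z: 1.90, tracked: true },
    { name: 'spineMid',  x: 0.01, y: 0.20, z: 1.92, tracked: true },
    { name: 'handRight', x: 0.25, y: 0.05, z: 1.70, tracked: false },
  ],
};

// Keep only joints the sensor is confident about, so inferred or
// occluded joints don't make the drawing jitter.
function trackedJoints(skeleton) {
  return skeleton.joints.filter((j) => j.tracked);
}
```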


We visualized the body electric of three contemporary circus performers on a 120’ x 10’ screen in real time. These are the data considerations that most shaped the project.

Françoise Voranger of Hybrid Movement Company performs Threads at IAC Headquarters. Photo by Andrew T. Foster.

When Aaron Parsekian and I found out we would have the opportunity to program the IAC media wall, we knew immediately we would create an experience working with data and live performance.

We had seen a lot of performances using motion capture — the performer moves and the visualizations respond to the movements — and began to consider how we could expose less obvious processes happening on a performer’s body. Our previous collaboration, Beats Exposed, which visualizes and sonifies a performer’s pulse in real time, was a first attempt at answering this question.

Threads is an interactive performance that continues…

Lisa Jamhoury

Machine Intelligence Design @Adobe • Fondly remembering @ITP_NYU and @BlueChalkMedia • Aerial acrobat in #Brooklyn
