Brain Controlled CNC Documentation

From BenningtonWiki

Initial Proposal


For my final project I’m interested in building a drawing machine that uses light as its medium and determines its drawings from the electrical impulses of the brain, read through a scanner.


I’m approaching this as a design project: the final product will be a machine that artists can use to make abstract drawings with light, and the end result of the artistic process will be a printed image.

The machine will essentially be a small Computer Numerical Control (CNC) machine able to move in two-dimensional space with the help of two stepper motors. An RGB LED attached to the end will move along the x and y axes. A camera mounted on top will take long-exposure images of the light moving smoothly through space. These images will then be combined into a composite image that will then be printed.
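The composite step is essentially an accumulation of light: each output pixel keeps the brightest value seen across all frames, a per-pixel "lighten" blend. The sketch below is a minimal pure-Python illustration under an assumed frame layout (in practice an image library such as Pillow does the same thing with `ImageChops.lighter`); it is not the code used in the project.

```python
def composite_frames(frames):
    # frames: list of images, each a list of rows of (r, g, b) tuples.
    # A long-exposure composite keeps the brightest value seen at each
    # pixel across all frames -- a per-pixel "lighten" blend.
    height, width = len(frames[0]), len(frames[0][0])
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            row.append(tuple(max(f[y][x][c] for f in frames)
                             for c in range(3)))
        out.append(row)
    return out
```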

The movement of the light will be determined by the inputs from the brain scanner. The following inputs will be used:

1. Excitement levels of the wearer

2. Thinking Up

3. Thinking Down

4. Thinking Left

5. Thinking Right

The excitement level of the wearer will determine the color of the light: blue will represent low excitement and red will represent high. The color will change as the LED moves through space, affecting the final printed product. The Up, Down, Left, and Right thoughts will control movement of the LED along the x and y axes.
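The blue-to-red mapping can be sketched as a simple linear blend. This is an illustrative sketch, not the code used on the machine, and it assumes the excitement level has already been normalized to the 0–1 range:

```python
def excitement_to_rgb(level):
    # level: excitement normalized to 0.0 (calm) .. 1.0 (excited).
    # Fades linearly from pure blue to pure red on a 0-255 LED scale.
    level = max(0.0, min(1.0, level))  # clamp out-of-range readings
    red = int(round(255 * level))
    return (red, 0, 255 - red)
```

Here `excitement_to_rgb(0.0)` gives `(0, 0, 255)` (pure blue) and `excitement_to_rgb(1.0)` gives `(255, 0, 0)` (pure red).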

Proposal Diagram


  • Brain Input Device (check)
  • Arduino
  • CNC Kit
  • 2 Stepper Motors
  • Powerful RGB LED
  • Gear Strips

Further Research




Neuro Headset


Possibly the most crucial part of this project was the Emotiv EPOC neuro headset. The EPOC is a 14-channel wireless EEG designed for contextualized research and advanced brain-computer interface (BCI) applications. The device is placed on the user's head, and its 14 EEG sensors measure electrical signals in the brain and pass them on to the computer. The SDK that comes with the EPOC provides access to the following outputs:

1. Cognitive Commands: The ability to train the scanner to recognise patterns in the brain. For example, the user thinks a specific thought while the EPOC scans them; this is repeated enough times that the scanner recognises when that thought is being thought and can associate it with an action like "Move Left". Four of these commands can be stored easily at one time.

2. Affectiv Suite: Short-term and long-term excitement, meditation, frustration, and valence curves in the form of graphs.

3. Facial Expressions: The Epoc can read muscle movement in the face and distinguish a frown from a smile etc.

4. Gyroscope: A gyroscope keeps track of head movement.

  • Emotiv Epoc Brain Scanner
  • Asad after a night of no sleep

Initial Observations

After using the scanner for a while I realised that training a cognitive command took a lot of effort. Not for the machine, but for the user: voluntarily putting your mind in a similar state again and again, so the scanner can distinguish it, takes a lot of effort and training. However, I did notice that certain emotional states could be distinguished, for example while I was watching similar videos on YouTube. I decided to take away the user's ability to voluntarily move the light along the two axes, and instead use emotional states and excitement to produce results, with a video as the stimulus.

Connecting Epoc to Arduino

Connecting the EPOC to the Arduino ended up being the biggest last-minute challenge of the project. The Arduino needed to read live data from the scanner in order to use it as an input variable for different images. Although the EPOC doesn't have any direct support for the Arduino platform, I had seen people do Arduino projects with it before, and some documentation for connecting the two was available online. That was enough for me to assume this step was a solved problem that would not take much time.

Mind Your OSCs

Unfortunately I ended up being very wrong. The program used to make the EPOC talk to the Arduino is called "Mind Your OSCs". It converts the brain data into OSC messages, which can then be read by programs like Max/MSP or Processing. The program is offered officially through Emotiv and has a simple interface for streaming the different kinds of data, which was perfect for my purposes. However, when I downloaded and launched the application, it crashed as soon as the connection was formed. At the time my laptop was having issues, so I was convinced the software itself wasn't the problem. It was the night before the final presentation that I tested it on another computer and it crashed there too. At that point the only option left was to search the support forums. After finding a great number of people complaining about the software crashing, I realised that it had not been updated for years and there wasn't any way to make it work.

Python Script Passing Data Over Serial Monitor

After spending a night trying many different ways to get the two devices to interact, I ended up finding some sample code in Python that passed data from the Insight to the Arduino through the serial monitor. The Insight is a headset similar to the EPOC but with far fewer sensors. Running the code brought up a million problems, but I was determined to make it work, and with the right modifications, which took me the whole night, it finally started working.

The following script creates a tuple of the mental states from the EPOC and passes it to the Arduino over serial.

import serial
import time

# global variables for module
startMarker = 60  # ASCII code for '<'
endMarker = 62    # ASCII code for '>'

def valToArduino(emoStateTuple):
    # 12 comma-separated values, matching the header list in the main script
    sendStr = "%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s,%s" % emoStateTuple
    print "SENDSTR %s" % (sendStr)
    sendToArduino(sendStr)

def setupSerial(serPort):

    global ser

    # NOTE the user must ensure that the serial port and baudrate are correct
    # ~ serPort = "/dev/ttyS81"
    baudRate = 9600
    ser = serial.Serial(serPort, baudRate)
    print "Serial port " + serPort + " opened  Baudrate " + str(baudRate)


# ========================

def closeSerial():

    global ser
    if 'ser' in globals():
        ser.close()
        print "Serial Port Closed"
    else:
        print "Serial Port Not Opened"

# ========================

def sendToArduino(sendStr):

    global startMarker, endMarker, ser
    # wrap the payload in the start/end markers the Arduino sketch expects
    ser.write(chr(startMarker) + sendStr + chr(endMarker))

# ===========================

def recvFromArduino(timeOut):  # timeout in seconds eg 1.5

    global startMarker, endMarker, ser

    dataBuf = ""
    x = "z"  # any value that is not an end- or startMarker
    startTime = time.time()

    # wait for the start marker
    while ord(x) != startMarker:
        if time.time() - startTime >= timeOut:
            return "<<TIMEOUT>>"
        x = ser.read()

    # save data until the end marker is found
    while ord(x) != endMarker:
        if time.time() - startTime >= timeOut:
            return "<<TIMEOUT>>"
        if ord(x) != startMarker:
            dataBuf = dataBuf + x
        x = ser.read()

    return dataBuf


# ============================

def waitForArduino():

    # wait until the Arduino sends '<Arduino is ready>' - this allows time
    # for the Arduino to reset; it also ensures that any bytes left over
    # from a previous message are discarded

    print "Waiting for Arduino to reset"

    msg = ""
    while msg.find("Arduino is ready") == -1:
        msg = recvFromArduino(10)
        print msg

I converted the Insight code to the EPOC by changing all the relevant variable names to "Epoc" in the following script. The original script had a "try" block to load the libEDK library. It wasn't working, so I changed it to load the library directly, but this only works on Windows. A slight modification would be needed to run it on a Mac.

import sys
import os
import platform
import json
import time
import ctypes

class Epoc(object):

    def __init__(self, composerConnect=False, composerPort=1726, userID=0):
        self.composerConnect = composerConnect
        self.composerPort = composerPort
        self.userID = ctypes.c_uint(userID)
        self.user = ctypes.pointer(self.userID)

        self.FE_SURPRISE = 64
        self.FE_FROWN = 32
        self.FE_SMILE = 128
        self.FE_CLENCH = 256
        self.FacialExpressionStates = {}
        self.FacialExpressionStates[self.FE_FROWN] = 0
        self.FacialExpressionStates[self.FE_SURPRISE] = 0
        self.FacialExpressionStates[self.FE_SMILE] = 0
        self.FacialExpressionStates[self.FE_CLENCH] = 0
        # try:
        #     if sys.platform.startswith('win32'):
        #         self.libEDK = ctypes.cdll.LoadLibrary("edk.dll")
        #     if sys.platform.startswith('linux'):
        #         srcDir = os.getcwd()
        #         libPath = srcDir + "/"
        #         self.libEDK = ctypes.CDLL(libPath)
        # except:
        #     print 'Error : cannot load dll lib'

        # load the EDK library directly (Windows only, as noted above)
        self.libEDK = ctypes.cdll.LoadLibrary("edk.dll")
        IEE_EmoEngineEventCreate = self.libEDK.IEE_EmoEngineEventCreate
        IEE_EmoEngineEventCreate.restype = ctypes.c_void_p
        self.eEvent = IEE_EmoEngineEventCreate()

        IEE_EmoStateCreate = self.libEDK.IEE_EmoStateCreate
        IEE_EmoStateCreate.restype = ctypes.c_void_p
        self.eState = IEE_EmoStateCreate()

    def disconnect(self):
        self.libEDK.IEE_EngineDisconnect()

    def connect(self):
        if self.composerConnect:
            self.libEDK.IEE_EngineRemoteConnect("", self.composerPort)
        else:
            self.libEDK.IEE_EngineConnect("Emotiv Systems-5")

    def get_state(self, eEvent):
        return self.libEDK.IEE_EngineGetNextEvent(eEvent)

    def get_event_type(self, eEvent):
        return self.libEDK.IEE_EmoEngineEventGetType(eEvent)

    def get_engine_event_emo_state(self, eEvent, eState):
        IEE_EmoEngineEventGetEmoState = \
            self.libEDK.IEE_EmoEngineEventGetEmoState
        IEE_EmoEngineEventGetEmoState.argtypes = [
            ctypes.c_void_p, ctypes.c_void_p]
        IEE_EmoEngineEventGetEmoState.restype = ctypes.c_int
        return IEE_EmoEngineEventGetEmoState(eEvent, eState)

    def get_userID(self, eEvent, user):
        return self.libEDK.IEE_EmoEngineEventGetUserId(eEvent, user)

    def get_time_from_start(self, eState):
        IS_GetTimeFromStart = self.libEDK.IS_GetTimeFromStart
        IS_GetTimeFromStart.argtypes = [ctypes.c_void_p]
        IS_GetTimeFromStart.restype = ctypes.c_float
        return IS_GetTimeFromStart(eState)

    def get_wireless_signal_status(self, eState):
        IS_GetWirelessSignalStatus = self.libEDK.IS_GetWirelessSignalStatus
        IS_GetWirelessSignalStatus.restype = ctypes.c_int
        IS_GetWirelessSignalStatus.argtypes = [ctypes.c_void_p]
        return IS_GetWirelessSignalStatus(eState)

    def get_facial_expression_is_blink(self, eState):
        IS_FacialExpressionIsBlink = self.libEDK.IS_FacialExpressionIsBlink
        IS_FacialExpressionIsBlink.restype = ctypes.c_int
        IS_FacialExpressionIsBlink.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionIsBlink(eState)

    def get_left_wink(self, eState):
        IS_FacialExpressionIsLeftWink = \
            self.libEDK.IS_FacialExpressionIsLeftWink
        IS_FacialExpressionIsLeftWink.restype = ctypes.c_int
        IS_FacialExpressionIsLeftWink.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionIsLeftWink(eState)

    def get_right_wink(self, eState):
        IS_FacialExpressionIsRightWink = \
            self.libEDK.IS_FacialExpressionIsRightWink
        IS_FacialExpressionIsRightWink.restype = ctypes.c_int
        IS_FacialExpressionIsRightWink.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionIsRightWink(eState)

    def get_upper_face_action(self, eState):
        IS_FacialExpressionGetUpperFaceAction = \
            self.libEDK.IS_FacialExpressionGetUpperFaceAction
        IS_FacialExpressionGetUpperFaceAction.restype = ctypes.c_int
        IS_FacialExpressionGetUpperFaceAction.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionGetUpperFaceAction(eState)

    def get_upper_face_action_power(self, eState):
        IS_FacialExpressionGetUpperFaceActionPower = \
            self.libEDK.IS_FacialExpressionGetUpperFaceActionPower
        IS_FacialExpressionGetUpperFaceActionPower.restype = ctypes.c_float
        IS_FacialExpressionGetUpperFaceActionPower.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionGetUpperFaceActionPower(eState)

    def get_lower_face_action(self, eState):
        IS_FacialExpressionGetLowerFaceAction = \
            self.libEDK.IS_FacialExpressionGetLowerFaceAction
        IS_FacialExpressionGetLowerFaceAction.restype = ctypes.c_int
        IS_FacialExpressionGetLowerFaceAction.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionGetLowerFaceAction(eState)

    def get_lower_face_action_power(self, eState):
        IS_FacialExpressionGetLowerFaceActionPower = \
            self.libEDK.IS_FacialExpressionGetLowerFaceActionPower
        IS_FacialExpressionGetLowerFaceActionPower.restype = ctypes.c_float
        IS_FacialExpressionGetLowerFaceActionPower.argtypes = [ctypes.c_void_p]
        return IS_FacialExpressionGetLowerFaceActionPower(eState)

    def get_mental_command_current_action(self, eState):
        IS_MentalCommandGetCurrentAction = \
            self.libEDK.IS_MentalCommandGetCurrentAction
        IS_MentalCommandGetCurrentAction.restype = ctypes.c_int
        IS_MentalCommandGetCurrentAction.argtypes = [ctypes.c_void_p]
        return IS_MentalCommandGetCurrentAction(eState)

    def get_mental_command_current_action_power(self, eState):
        IS_MentalCommandGetCurrentActionPower = \
            self.libEDK.IS_MentalCommandGetCurrentActionPower
        IS_MentalCommandGetCurrentActionPower.restype = ctypes.c_float
        IS_MentalCommandGetCurrentActionPower.argtypes = [ctypes.c_void_p]
        return IS_MentalCommandGetCurrentActionPower(eState)

    def lower_facial_expression_states(self, eState):
        lower_face_action = self.get_lower_face_action(eState)
        lower_face_action_power = self.get_lower_face_action_power(eState)
        self.FacialExpressionStates[lower_face_action] = lower_face_action_power

    def upper_facial_expression_states(self, eState):
        self.upper_face_action = self.get_upper_face_action(eState)
        self.upper_face_action_power = self.get_upper_face_action_power(eState)
        self.FacialExpressionStates[self.upper_face_action] = self.upper_face_action_power

    def get_surprise(self, eState):
        return self.FacialExpressionStates[self.FE_SURPRISE]

    def get_frown(self, eState):
        return self.FacialExpressionStates[self.FE_FROWN]

    def get_smile(self, eState):
        return self.FacialExpressionStates[self.FE_SMILE]

    def get_clench(self, eState):
        return self.FacialExpressionStates[self.FE_CLENCH]

The main script below ties the Epoc class and the serial helpers together, streaming each new EmoState to the Arduino.
from Epoc import *
from arduinoCom import *
from datetime import datetime

# -------------------------------------------------------------------------
# Make dictionary for logEmoState

header = ['Time', 'UserID', 'wirelessSigStatus', 'Blink', 'leftWink',
          'rightWink', 'Surprise', 'Frown',
          'Smile', 'Clench',
          'MentalCommand Action', 'MentalCommand Power']
emoStateDict = {}
for emoState in header:
    emoStateDict.setdefault(emoState, None)

def send_emo_state_to_arduino():

    emoStateDict['Time'] = epoc.get_time_from_start(epoc.eState)
    emoStateDict['UserID'] = epoc.get_userID(epoc.eEvent, epoc.user)
    emoStateDict['wirelessSigStatus'] = epoc.get_wireless_signal_status(epoc.eState)
    emoStateDict['Blink'] = epoc.get_facial_expression_is_blink(epoc.eState)
    emoStateDict['leftWink'] = epoc.get_left_wink(epoc.eState)
    emoStateDict['rightWink'] = epoc.get_right_wink(epoc.eState)

    emoStateDict['Surprise'] = epoc.get_surprise(epoc.eState)
    emoStateDict['Frown'] = epoc.get_frown(epoc.eState)
    emoStateDict['Clench'] = epoc.get_clench(epoc.eState)
    emoStateDict['Smile'] = epoc.get_smile(epoc.eState)

    emoStateDict['MentalCommand Action'] = epoc.get_mental_command_current_action(epoc.eState)
    emoStateDict['MentalCommand Power'] = epoc.get_mental_command_current_action_power(epoc.eState)

    print emoStateDict
    emoStateTuple = (emoStateDict['Time'], emoStateDict['UserID'],
                     emoStateDict['wirelessSigStatus'], emoStateDict['Blink'],
                     emoStateDict['leftWink'], emoStateDict['rightWink'],
                     emoStateDict['Surprise'], emoStateDict['Frown'],
                     emoStateDict['Clench'], emoStateDict['Smile'],
                     emoStateDict['MentalCommand Action'],
                     emoStateDict['MentalCommand Power'])
    # print emoStateTuple
    valToArduino(emoStateTuple)

# # -------------------------------------------------------------------------
# # connect to Arduino

print "==================================================================="
print "Please enter port for Arduino"
print "==================================================================="
print "Example:"
print "Mac -- \n /dev/tty.usbmodem1451 "
print "Windows -- \n COM4"
print ">>"
arduino_port = str(raw_input())
setupSerial(arduino_port)
waitForArduino()
# -------------------------------------------------------------------------
# start EmoEngine or EmoComposer

print "==================================================================="
print "Example to show how to log EmoState from EmoEngine/EmoComposer."
print "==================================================================="
print "Press '1' to start and connect to the EmoEngine                    "
print "Press '2' to connect to the EmoComposer                            "
print ">> "

log_from_emo = int(raw_input())
# -------------------------------------------------------------------------

# instantiate Epoc class
if log_from_emo == 1:
    epoc = Epoc()
elif log_from_emo == 2:
    epoc = Epoc(composerConnect=True)
else:
    print "option = ?"

print "Start receiving Emostate! Press any key to stop logging...\n"

# connect epoc instance to Xavier composer or EmoEngine
epoc.connect()
last_command = None

# event loop to update the Epoc state
while (1):
    # set of operations to get state from the Epoc
    # returns 0 if successful
    state = epoc.get_state(epoc.eEvent)
    if state == 0:
        # event types IEE_Event_t returns 64 if EmoStateUpdated
        eventType = epoc.get_event_type(epoc.eEvent)
        user_ID = epoc.get_userID(epoc.eEvent, epoc.user)
        if eventType == 64:
            epoc.get_engine_event_emo_state(epoc.eEvent, epoc.eState)
            timestamp = epoc.get_time_from_start(epoc.eState)
            print "%10.3f New EmoState from user %d ...\r" % (timestamp,
                                                              user_ID)
            # Limit the command rate so that we won't overflow the buffer
            if not last_command:
                last_command = datetime.now()
                send_emo_state_to_arduino()
            else:
                diff = datetime.now() - last_command
                if (diff.microseconds / 1000.0 > 500.0):
                    last_command = datetime.now()
                    send_emo_state_to_arduino()

    elif state != 0x0600:
        print "Internal error in Emotiv Engine ! "

emoArduino.ino

Finally, this Arduino sketch is what actually runs on the machine.

// intended for use with

const byte numStatus = 15;
byte ledPin =  13;
// Values sent over serial to Arduino.
float Time;
float userID;
float wirelessSigStatus;
float Blink;
float leftWink;
float rightWink;
float surprise = 0.0;
float frown;
float clench;
float smile;
float mentalCommandAction;
float mentalCommandPower;

const byte buffSize = 132;
char inputBuffer[buffSize];
const char startMarker = '<';
const char endMarker = '>';
byte bytesRecvd = 0;
boolean readInProgress = false;
boolean newDataFromPC = false;

char messageFromPC[buffSize] = {0};

unsigned long curMillis;
unsigned long prevMillis;

unsigned long prevReplyToPCmillis = 0;
unsigned long replyToPCinterval = 1000;
int blink_rate = 0;
int x = 0;
int count = 15;


void blinkLEDs() {
    if ((curMillis - prevMillis) >= (int)Time) {
      prevMillis = millis();
      digitalWrite(ledPin, !digitalRead(ledPin));
//      blink_rate = (int)Time;
//      digitalWrite(ledPin, HIGH);
//      delay(blink_rate);
//      digitalWrite(ledPin, LOW);
//      delay(blink_rate);
    }
}

void setup() {

    Serial.begin(9600);  // must match the baudrate in the Python script

    // flash LEDs so we know we are alive
    pinMode(ledPin, OUTPUT);
    digitalWrite(ledPin, HIGH);

    // tell the PC we are ready
    Serial.println("<Arduino is ready>");
}


void loop() {
  curMillis = millis();
  getDataFromPC();
  if (newDataFromPC) {
    parseData();
  }
  replyToPC();
  blinkLEDs();
}

void getDataFromPC() {

    // receive data from PC and save it into inputBuffer

  if(Serial.available() > 0) {

    char x = Serial.read();

      // the order of these IF clauses is significant

    if (x == endMarker) {
      readInProgress = false;
      newDataFromPC = true;
      inputBuffer[bytesRecvd] = 0;
    }

    if(readInProgress) {
      inputBuffer[bytesRecvd] = x;
      bytesRecvd ++;
      if (bytesRecvd == buffSize) {
        bytesRecvd = buffSize - 1;
      }
    }

    if (x == startMarker) {
      bytesRecvd = 0;
      readInProgress = true;
    }
  }
}


void parseData() {

    // split the data into its parts
    // assumes the data will be received as (eg) 0,1,35

  char * strtokIndx; // this is used by strtok() as an index

  strtokIndx = strtok(inputBuffer,","); // get the first part
  Time = atof(strtokIndx); // convert to a float

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  userID = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  wirelessSigStatus = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  Blink = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  leftWink = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  rightWink = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  surprise = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  frown = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  clench = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  smile = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  mentalCommandAction = atof(strtokIndx);

  strtokIndx = strtok(NULL, ","); // this continues where the previous call left off
  mentalCommandPower = atof(strtokIndx);
}



void replyToPC() {

  if (newDataFromPC) {
    newDataFromPC = false;
    // echo the parsed values back to the PC, wrapped in the same
    // start/end markers the Python side looks for
    Serial.print("<");
    Serial.print(" Time "); Serial.print(Time);
    Serial.print(" UserID "); Serial.print(userID);
    Serial.print(" wirelessSigStatus "); Serial.print(wirelessSigStatus);
    Serial.print(" Blink "); Serial.print(Blink);
    Serial.print(" leftWink "); Serial.print(leftWink);
    Serial.print(" rightWink "); Serial.print(rightWink);
    Serial.print(" Surprise "); Serial.print(surprise);
    Serial.print(" Frown "); Serial.print(frown);
    Serial.print(" Clench "); Serial.print(clench);
    Serial.print(" Smile "); Serial.print(smile);
    Serial.print(" MentalCommand Action "); Serial.print(mentalCommandAction);
    Serial.print(" MentalCommand Power "); Serial.print(mentalCommandPower);
    Serial.println(">");
  }
}

CNC Hardware

One of the major challenges, and what made this project exciting, was building the CNC machine itself. I had a couple of models in mind but was unsure where to start. The plan was to simply find a few stepper motors and keep building until it started to work.

Old Makerbot Cupcake

When looking for parts, Robert suggested I have a look at an old Makerbot Cupcake 3D printer that was lying around, and he was kind enough to let me use it for my project. This not only provided the parts needed to build the machine but also gave me an infrastructure to work around.

  • The Setup
  • Stripped down Makerbot Body

Power Supply

Because I had this old 3D printer, I wanted to reuse as many of its parts as possible. For example, it had a sophisticated power supply unit with its own fans and a custom circuit board, and I thought it would make sense to have that same unit power my machine. However, after spending a whole day on it I failed to get the right amount of power out of it, since it was overly complicated for the project. The problem was eventually solved with a cheap RadioShack power adapter that I found in the Pod.

Overly Complicated Makerbot Power Supply

Stepper Motor Drivers

A similar problem occurred with the stepper motor drivers that came with the printer. They were custom-made and didn't connect to the Arduino. This led me to order two TB6560 stepper motor drivers for the project. I got them working perfectly with the NEMA stepper motors from the Makerbot, and I could get rid of all the old Makerbot electronics. I also removed everything that wasn't needed, to simplify the Makerbot; this included the extruder and the whole z axis.

  • Makerbot Stepper Motor Driver
  • Stepper Motor Drivers TB6560

Extending Roof and Camera Holder

Another important hardware modification was building a stand for the camera to sit in, right above the LEDs. I bought new pieces of cardboard which I cut and screwed onto two of the sides of the machine, and used some of the rods from the old z axis to make a stand that holds the camera in place. The whole case was painted black and a projector was placed on top of the machine as well. It was finally closed off and made dark with a black piece of cloth.

CNC Software

The software aspect of the CNC was hard, since a lot of things needed to be considered and a number of different tools had to interact with each other. Fortunately I didn't have to write anything custom, and the following combination of software worked for me.

Grbl

Grbl is a library for the Arduino that allows software to send commands to the stepper motors. Once the library is installed, it can be called from an Arduino program with the following:

#include <grblmain.h>

void setup(){
  startGrbl(); // hand the board over to Grbl's main loop
}
However, when I ran this code in combination with the LED program, there was not enough memory and the Arduino stopped working. The problem was solved by using two separate Arduinos, one for the LEDs and one for the CNC.

Universal GCode Sender

CNCs take in G-code, which tells the steppers to move in different directions based on whatever digital image the machine is expected to draw. I used an open-source program called Universal GCode Sender to send G-code to my machine. It lets me manually jog along the x or y axis, or load a G-code file for a particular image.
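Under the hood, sending G-code to Grbl is just line-by-line serial traffic: send one command, wait for Grbl's "ok" (or "error") response, then send the next. The sketch below illustrates that simple send-and-wait protocol using pyserial; the port name and file path are placeholders, the baud rate depends on the Grbl version, and Universal GCode Sender actually uses a smarter buffered protocol than this.

```python
import time

def clean_gcode(lines):
    # strip ';' comments and blank lines from raw G-code text
    stripped = (line.split(";")[0].strip() for line in lines)
    return [line for line in stripped if line]

def stream_gcode(port, path, baud=115200):
    import serial  # pyserial; only needed when actually streaming
    grbl = serial.Serial(port, baud)
    time.sleep(2)            # give Grbl time to reset after the port opens
    grbl.write(b"\r\n\r\n")  # wake Grbl
    grbl.flushInput()        # discard the startup banner
    with open(path) as f:
        for line in clean_gcode(f):
            grbl.write((line + "\n").encode())
            print(line, "->", grbl.readline().strip())  # wait for "ok"/"error"
    grbl.close()
```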


The only shape I have used for the project so far is a simple circle, drawn with the following G-code:

G01 X0.000 Y03.500 F400
G41 G01 X01.500 Y03.500 D01
G03 X0.000 Y05.000 I-001.500 J0.000
G03 X0.000 Y05.000 I0.000 J-05.000
G03 X-01.500 Y03.500 I0.000 J-01.500
G40 G01 X0.000 Y03.500
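For other radii and centers, an equivalent circle can be generated programmatically. The helper below is a hypothetical sketch, not the tool used above: it emits a full circle as two G03 (counter-clockwise) arcs, with I/J giving the arc center relative to the start of each arc, and it skips the G41/G40 cutter-compensation moves since an LED has no tool diameter to compensate for.

```python
def circle_gcode(cx, cy, radius, feed=400):
    # Move to the leftmost point of the circle, then draw two
    # counter-clockwise half-arcs (G03). I/J are the offsets from the
    # start of each arc to the circle's center.
    left_x, right_x = cx - radius, cx + radius
    return [
        "G01 X%.3f Y%.3f F%d" % (left_x, cy, feed),
        "G03 X%.3f Y%.3f I%.3f J0.000" % (right_x, cy, radius),
        "G03 X%.3f Y%.3f I%.3f J0.000" % (left_x, cy, -radius),
    ]
```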

Camera Control

To control the camera from the computer so it would take long-exposure pictures, I used an application called Sofortbild.

Single LED Trials

Initially the CNC was tested with a single LED on the end, and the following basic shapes were created.

  • CNC w single LED
  • X and Y Movement Tests
  • Synchronous movement test with Circle

NeoPixel Ring

LEDs are cool, but NeoPixel rings are even cooler. I ordered a small NeoPixel ring with 12 RGB LEDs. They change color and intensity, and of course I can turn them on and off too. This gave me enough variables to create interesting images with just one shape.

  • Final NeoPixels on the CNC
  • All LEDs on at low intensity
  • With movement


I finally staged the project as an interactive exhibition piece. The viewer sat in a small dark room with a glass window through which the audience could watch her. After the brain scanner was placed on the viewer, a video titled "Absolute_Location" was played. She watched the video as the machine did its work in the background, and afterwards the resulting pictures were shown to her.

  • Asad's a zombie reading your mind
  • Still from "Absolute_Location"


The simple circle shape led to many different and interesting results, some of which are attached below.

  • 1Generate.jpg
  • 2Generate.jpg
  • 3Generate.jpg
  • 4Generate.jpg
  • 5Generate.jpg
  • 6Generate.jpg

Special Thanks

I'm very grateful for the support of many people throughout the project.

Robert Ransick for great advice, encouragement and the makerbot

Jackson Moore for being patient with me while I was very impatient about getting my parts on time

Jonah Nigro for brainstorming about generative art

Sandy Curth for messing around with the power supply with me

& Caroline Stinger for so generously allowing her video to be the stimulus for the viewers

Due to some last-minute technical difficulties I couldn't complete the project with the attention to detail and complexity I had imagined. I look forward to continuing work with brain-computer interfacing in a digital art context. Next term I'll certainly try more complex and diverse shapes and images, since I get to keep this machine.