Questions can be sent to: [email protected]


In this VOCAL, we will cover how to use VPixx Technologies’ software tools for Python, with an emphasis on using it with the PsychoPy suite of tools for scientists.

If you are new to PsychoPy, we encourage you to check out their Resources for Learning and Teaching webpage to learn more about how to use their system. In general, PsychoPy offers two methods of working with its tools: 

  • Use PsychoPy’s Builder interface to graphically design an experiment, 
    supplementing their built-in tools with .csv conditions files and custom code blocks. 
  • Import the PsychoPy library into your Python Integrated Development Environment (IDE), such as PsychoPy Coder, Spyder or PyCharm.
Left: The PsychoPy Builder interface with the custom code block editor open. Right: Spyder 5, a popular Python IDE

VPixx Technologies has a library of Python functions for our data acquisition systems, called pypixxlib. This library allows the user to interact with our hardware directly in their Python experiment code. Pypixxlib is the Python equivalent of our Datapixx commands for MATLAB/Psychtoolbox, which you can read more about here.

For a full list of our Python functions and some IDE-based examples, please check out our pypixxlib documentation. There, you will also find instructions on how to install our Python tools in the Getting Started section. Common Python troubleshooting problems and their solutions are documented in our Frequently Asked Questions – Troubleshooting.

The purpose of this VOCAL guide is to orient researchers to using pypixxlib with PsychoPy. We will start with a general introduction to the two ‘flavours’ of pypixxlib, and then discuss how to implement our library within the Builder, and in a Python IDE. We will then cover some general tips and recommendations for working with PsychoPy.

If you are using a VIEWPixx /EEG and/or your primary goal is to generate frame-accurate digital TTL triggers, we have a fully automated solution for this called Pixel Mode. There is an in-depth explanation of Pixel Mode in our VOCAL guide: Sending Triggers with Pixel Mode. If you decide to use Pixel Mode, you do not need to invoke pypixxlib in your experiment. However, we still recommend you read the Tips section of this guide for helpful information about drawing pixels in PsychoPy so triggers are sent properly.

For most other applications, you most certainly will have to write some code. This is true even if you are using the Builder; there are no default drag-and-drop tools to support VPixx hardware, so you will need to use custom code blocks to interact with our systems. We provide some examples in this guide to help you get started. 

The two flavours of pypixxlib

There are two general ways to interact with pypixxlib, which reflect two different programming styles. Which approach you decide to take depends on your personal preference, previous programming experience and general comfort level with coding. 

Object oriented programming (OOP)

OOP groups code elements into classes, with their own unique attributes (properties) and functions (behaviours). Broadly defined parent classes may also have child classes that contain more specific attributes and functions. For example, in PsychoPy the parent class visual contains useful attributes and functions shared by all visual elements, but it also has a child class line with attributes and functions specifically related to creating lines.
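To make the class/subclass idea concrete, here is a minimal pure-Python sketch. The class names are hypothetical and chosen only for illustration; they are not PsychoPy's actual implementation:

```python
class VisualStim:
    """Parent class: attributes and behaviours shared by all visual elements."""
    def __init__(self, win, pos=(0, 0)):
        self.win = win
        self.pos = pos

    def setPos(self, pos):
        # a behaviour inherited by every child class
        self.pos = pos


class Line(VisualStim):
    """Child class: inherits everything from VisualStim, adds line-specific attributes."""
    def __init__(self, win, start=(0, 0), end=(1, 1)):
        super().__init__(win)
        self.start = start
        self.end = end


myLine = Line(win=None, start=(0, 0), end=(100, 0))
myLine.setPos((10, 10))   # calling a behaviour defined on the parent class
```

Here `Line` automatically gains `pos` and `setPos()` from its parent while defining its own `start` and `end`, which is exactly the relationship between PsychoPy's general visual tools and its line-specific ones.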

In OOP, specific instances of classes, called objects, are created and strategically called in the code. Pypixxlib has classes for all of our devices. We also have specific I/O subclasses for each of our different kinds of data acquisition and control (including audio inputs, audio outputs, digital inputs, etc.) that can be called by a device object.

Here’s a simple example where we create an instance of a DATAPixx3 object, and then call the setVolume() function of its audio subclass.

Python (object-oriented)

from pypixxlib.datapixx import DATAPixx3
myDevice = DATAPixx3()
myDevice.audio.setVolume(0.5)
myDevice.writeRegisterCache()

OOP code tends to be elegant and efficient, but it takes time to conceptualize. If you’re interested to learn more about this style of programming, there are lots of resources available on the web.

Wondering about the final line of code in the example, writeRegisterCache()? We will cover this in more detail in the next section of the guide.

Procedural programming

Procedural programming is a style of programming in which individual functions are called step-by-step to build a program. Procedural code is a linear set of instructions, like a recipe. The downside to this approach is that code cannot easily be reused or extended, so procedural programs are sometimes less efficient.

Our tools for MATLAB are procedural, following the example of Psychtoolbox. Originally, pypixxlib kept the procedural format and the OOP tools were added later. The procedural functions can be found in a subsection of pypixxlib called libdpx. Procedural libdpx commands have a prefix, usually DPx.

Here’s the same example as before of setting audio volume, using libdpx procedural code this time: 

Python (libdpx)

from pypixxlib import _libdpx as dp
dp.DPxSetAudVolume(0.5)
dp.DPxWriteRegCache()

Libdpx commands can be used across all VPixx devices supported by our software tools. Some device-specific functions use variations of the DPx prefix; for example, libdpx commands for our eye tracking systems are prefaced with TPx.

If you have more than one VPixx device connected by USB, such as a PROPixx projector and a PROPixx controller, libdpx commands will target the appropriate device based on the nature of the command. In the example above, DPxSetAudVolume would be sent to the PROPixx controller, as this is the device which manages audio. If you want to force libdpx to target a specific device, you can use the command DPxSelectDevice and pass the device type or name as an argument.

Procedural programming is much more intuitive for beginner programmers. The libdpx commands also closely follow the format of our MATLAB/Psychtoolbox functions, so if you are familiar with our MATLAB tools already, libdpx will feel very similar.

Ultimately, the decision of which strategy to take, OOP or procedural, is up to the user’s preferences. Both provide a solid foundation for working with our tools in Python.

A quick review of VPixx’s register write/update system

Before we get started, there is a very general principle to keep in mind about VPixx devices. Whether you are using a VIEWPixx, VIEWPixx /3D, PROPixx, DATAPixx series I/O hub, or a simulated version of one of these devices, all of these systems have an onboard data acquisition system.

You can think of this acquisition system as a controller that operates in parallel with your PC. It can store content for playback with precise timing, record new data from connected systems, and keep track of time and the events of the video signal. It even has its own clock.

Like all systems operating in parallel, we need a way for system A (your PC) to communicate with system B (the VPixx hardware). VPixx solves this with a method called register writing and register updating. 

When you execute a line of code to control our hardware, the command does not take effect immediately. Instead, it is queued until the device receives a special command indicating it is time to execute. This command is called a register write.

Python (libdpx)

#Commands are queued until the register write below executes them
dp.DPxSetAudVolume(0.5)
dp.DPxWriteRegCache()
Register writes have several advantages. First, they allow you to execute several queued device commands simultaneously. Second, this execution can be tied directly to the behaviour of your display; VPixx has a series of special register write functions that delay the time of write execution until a specific event in the video signal is detected. This allows you to synchronize your hardware commands with your video frames or visual stimulus onset.

Register updates follow the same principle as a register write, but they return a copy of the device status as well. This is useful for getting up to date information from our hardware, such as the system clock time and the current state of the I/O ports.

You can read more about our register system and these special video-based functions in our Introduction to Registers and Schedules. We highly recommend this guide for researchers interested in very precise timing control in their experiments.

For now the key points to remember are:

  • With a handful of exceptions, most pypixxlib commands must be followed with a register write or a register update in order for your hardware to execute the requested changes.
  • A register write executes all queued hardware commands. It should be used in situations where you need a fast turnaround time, and do not need any data returned from the device.
  • A register update (also known as a write-read) performs a write and then returns a copy of the device status to your PC. It should be used in situations where you need a fresh copy of your hardware’s status.

It is a good idea to keep in mind this register system when troubleshooting your experiment code. Is your hardware not behaving as expected? You might want to check to make sure you included a register write. Does information retrieved from the device seem outdated, or missing? You might want to check when your last register update was, and whether you have up-to-date device information.
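To see why a forgotten register write is such a common bug, here is a conceptual pure-Python mock of the queueing behaviour. This is only a model of the idea; real pypixxlib commands communicate with the hardware:

```python
class MockDevice:
    """Conceptual model of the register write system (not real pypixxlib)."""
    def __init__(self):
        self.volume = 0.25     # current state on the device
        self._pending = {}     # queued commands, not yet executed

    def setVolume(self, volume):
        self._pending['volume'] = volume   # queued only; nothing happens yet

    def writeRegCache(self):
        # a register write executes all queued commands at once
        for key, value in self._pending.items():
            setattr(self, key, value)
        self._pending.clear()


device = MockDevice()
device.setVolume(0.5)
assert device.volume == 0.25   # no register write yet: device unchanged
device.writeRegCache()
assert device.volume == 0.5    # the write executed the queued command
```

If you omit the `writeRegCache()` call, the device state never changes, which is precisely the "hardware not behaving as expected" symptom described above.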

Invoking pypixxlib in the PsychoPy Builder

Our tools are not natively supported in PsychoPy’s Builder interface. This means when you need to interact with our hardware you will have to use custom code blocks strategically placed within the experiment flow. These blocks can use either OOP or procedural style code.

While custom code might sound daunting to new experiment designers, remember that these code blocks don’t need to be overly complicated. We offer many examples below that you can copy or tweak as needed. These code blocks can be reused across multiple experiments accessing the same equipment.

Create custom code blocks in PsychoPy Builder from the Components > Custom > Code Icon

Custom code blocks can be inserted as elements into specific routines in your Experiment Flow. Code blocks also have an additional feature allowing you to specify where in the experiment you would like them to occur. 

A blank code block editor

Below are some common examples of code which might appear in each one of these positions:

  • Before Experiment: Connect to VPixx hardware, apply settings changes 
  • Begin Experiment: Apply settings changes, upload waveforms into the VPixx device RAM for playback during experiment (e.g., audio files, digital TTL waveforms, custom analog waveforms), turn on any recording schedules that will be on for the entire experiment (e.g., eye tracking).
  • Begin Routine: Set up and turn on input recording (e.g., listening for button activity), set up output for immediate playback, or playback with a video-based event (e.g., play audio at the same time as a visual stimulus, send a custom TTL waveform on stimulus onset)
  • Each Frame: Routinely check hardware for important status changes (like a button press being recorded, or the participant fixating a target). Note: frame intervals are typically <17 ms, so you only have a brief window of time to execute this code; try to keep your code to a bare minimum here.
  • End Routine: Import recorded data and evaluate it, save any important timestamps, disable recording or playback if needed
  • End Experiment: Shut down any ongoing recordings, restore hardware defaults, close connection to VPixx hardware

Some examples are given below.

Please note, most of our examples use libdpx procedural commands. These commands can be applied to most of our devices without any need to modify the code. If you prefer to use the object-oriented approach, you can look through our documentation to see what the corresponding commands are. 

As a first step before using these examples, please make sure your pypixxlib is up to date. The most recent version of our software tools can be found here on our Downloads and Updates page.

Whenever you insert a code block, remember to ensure the Code Type option selected is “Py” or Python.

1. Establishing a connection to VPixx hardware 
Must be called at the beginning of every experiment


#Code block best positioned "Before Experiment" so any problems are caught right away
#import our library and open the device connection
#throw an error if device is not connected properly
from pypixxlib import _libdpx as dp
isReady = dp.DPxIsReady()
if not isReady:
    raise ConnectionError('VPixx Hardware not detected! Check your connection and try again.')

2. Enabling Pixel Mode and creating a Pixel Trigger
Pixel Mode is a method of sending automated TTL triggers locked to visual stimulus onset. Read more about this mode in our VOCAL guide Sending Triggers with Pixel Mode.

This block enables Pixel Mode on our data acquisition systems (note: it is on by default for the VIEWPixx /EEG). It also creates a helper command called “DrawPixelModeTrigger” which you can then call just before the frame where you would like the trigger to fire. The last few lines can be pasted in their own block just before the target video frame; these will generate your custom trigger. 


#Code block best positioned "Before Experiment" or "Begin Experiment"
#Assumes libdpx has been imported as dp and device is connected
#Configure hardware to be in Pixel Mode
dp.DPxEnableDoutPixelMode()
dp.DPxWriteRegCache()
#Helper function to draw pixel trigger
def drawPixelModeTrigger(win, pixelValue):
    #takes a pixel colour and draws it as a single pixel in the top left corner of the window
    #window must cover top left of screen to work
    #interpolate must be set to False before color is set
    #call this just before flip to ensure pixel is drawn over other stimuli
    topLeftCorner = [-win.size[0]/2, win.size[1]/2]
    line = visual.Line(
            win=win,
            units='pix',
            start=topLeftCorner,
            end=[topLeftCorner[0]+1, topLeftCorner[1]],
            interpolate=False,
            colorSpace='rgb255',
            lineColor=pixelValue)
    line.draw()
#Uncomment and paste the following in a code block just before your target video frame to draw your pixel trigger.
#There should be no delay between this code and your target video frame, so if you are drawing other stimuli make 
#sure to draw them right away. You may want to start a routine where stimuli onset occurs at t0 and add this code 
#block in the "Begin Routine" phase
#myTriggerValue = 33
#myPixelValue = dp.DPxTrigger2RGB(myTriggerValue)
#drawPixelModeTrigger(win, myPixelValue)
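For reference, a Pixel Mode trigger is just a 24-bit value packed into the three 8-bit colour channels of the trigger pixel. A hypothetical version of the conversion is sketched below, assuming red carries the most significant byte; in real experiments use DPxTrigger2RGB, which implements the mapping your hardware expects:

```python
def trigger2rgb(triggerValue):
    # Hypothetical sketch: split a 24-bit trigger value into [R, G, B],
    # assuming red is the most significant byte (verify against DPxTrigger2RGB)
    if not 0 <= triggerValue <= 0xFFFFFF:
        raise ValueError("trigger value must fit in 24 bits")
    red = (triggerValue >> 16) & 0xFF
    green = (triggerValue >> 8) & 0xFF
    blue = triggerValue & 0xFF
    return [red, green, blue]

print(trigger2rgb(33))        # [0, 0, 33]
print(trigger2rgb(0xABCDEF))  # [171, 205, 239]
```

The important point is that any change to those channel values by the graphics pipeline (dithering, gamma, interpolation) corrupts the trigger, which is why pixel identity passthrough matters.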

3. Loading an audio waveform into device memory
Save your audio file to VPixx hardware memory for playback during the experiment. 


#Code block best positioned "Begin Experiment"
#Assumes libdpx has been imported and device is connected
#Requires scipy for importing audio files
from scipy.io import wavfile
#file path to sound file
soundFile = 'C:/.../myfile.wav'
#import sound file into Python
fs, audio_data = wavfile.read(soundFile)
maxScheduleFrames = len(audio_data)
#some settings
volume = 0.5 #50 percent
bufferAddress = int(16e6)
onsetDelay = 0.0
stereoMode = 'mono'
dp.DPxInitAudCodec()
dp.DPxSetAudVolume(volume)
dp.DPxWriteAudioBuffer(audio_data, bufferAddress)
dp.DPxSetAudioSchedule(onsetDelay, fs, maxScheduleFrames, stereoMode, bufferAddress)
dp.DPxWriteRegCache()

4. Setting up Button Schedules
This mode allows you to define unique TTL waveforms for each of your RESPONSEPixx buttons and save them on the hardware. Button presses will then immediately trigger playback of these waveforms on the acquisition system’s digital output. Use this method to pass button activity to other hardware (MEG, EEG).


#Code block best positioned "Begin Experiment" when parameters are being set
#The schedule will work automatically, including catching any accidental presses
#Assumes library has been imported and device is connected
#Enable debounce. When a DIN transitions, ignore further DIN transitions for the next 30 milliseconds
#(good for response buttons and other mechanical inputs)
dp.DPxEnableDinDebounce()
#Set our mode. The mode can be:
#  0 -- The schedule starts on a rising edge (press of RPx /MRI, release of RPx)
#  1 -- The schedule starts on a falling edge (release of RPx /MRI, press of RPx)
#  2 -- The schedule starts on both rising and falling edges (presses and releases, both RPx types)
# For modes 0 and 1, you put the schedule at baseAddr + 4096*DinValue
# For mode 2, you put the schedule for a falling edge at baseAddr + 4096*DinValue
# and for a rising edge at baseAddr + 4096*DinValue + 2048
#Not sure which DinValues correspond to which buttons? Have special wiring?
#Check directly by using our PyPixx > Digital I/O demo and pressing buttons
signalLength = 6
baseAddress = int(9e6)
#Red button (DinValue 0)
redSignal = [1, 0, 0, 0, 0, 0] #single pulse on dout 0
redAddress =  baseAddress + 4096*0
dp.DPxWriteDoutBuffer(redSignal, redAddress)
#Yellow button (DinValue 1)
yellowSignal = [1, 0, 1, 0, 0, 0] #two pulses on dout 0
yellowAddress =  baseAddress + 4096*1
dp.DPxWriteDoutBuffer(yellowSignal, yellowAddress)
#Green button (DinValue 2)
greenSignal = [2, 0, 0, 0, 0, 0] #single pulse on dout 1
greenAddress =  baseAddress + 4096*2
dp.DPxWriteDoutBuffer(greenSignal, greenAddress)
#Blue button (DinValue 3)
blueSignal = [2, 0, 2, 0, 0, 0] #two pulses on dout 1
blueAddress =  baseAddress + 4096*3
dp.DPxWriteDoutBuffer(blueSignal, blueAddress)
scheduleOnset = 0.0 #no delay
scheduleRate = 2 #waveform playback rate 2 samples/sec
dp.DPxSetDoutSchedule(scheduleOnset, scheduleRate, signalLength+1, baseAddress)
dp.DPxWriteRegCache()
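The buffer address arithmetic used above (baseAddress + 4096*DinValue for modes 0 and 1) can be wrapped in a small helper. This is just the arithmetic from the comments, not a pypixxlib function:

```python
def buttonScheduleAddress(baseAddress, dinValue):
    # Each button's waveform gets its own 4096-byte slot above the base address
    # (modes 0 and 1; mode 2 adds a further +2048 offset for one edge type)
    return baseAddress + 4096 * dinValue

baseAddress = int(9e6)
print(buttonScheduleAddress(baseAddress, 0))  # 9000000 (red)
print(buttonScheduleAddress(baseAddress, 3))  # 9012288 (blue)
```

Computing the addresses this way avoids copy-paste errors when you add or reorder buttons.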

5. Setting up and starting a RESPONSEPixx listener
RESPONSEPixx button box activity is recorded on the digital input of the data acquisition system. We use a special schedule called a digital input log to record only changes in the state of this port, signaling either a button press or a release.

A RESPONSEPixx Listener is a special class developed for our OOP library that encapsulates all of the commands required to set up and collect button activity from the digital input log. While it is not necessary to use a Listener to monitor button activity, this tool streamlines button box monitoring.  The RPxButtonListener begins recording data as soon as it is created. It can listen for button presses, button releases, or both. 


#Code block best positioned "Begin Routine" at either the start of a block or a trial,
#depending on how continuously you want to record data
#Assumes library has been imported and device is connected
from pypixxlib import RPixxButtonListener as rp
#Create button listener, and pass button box type as argument. 
# Accepted values (not case sensitive):
#     1) 'handheld'
#     2) 'handheld - mri'
#     3) 'dual - mri'
#     4) 'dual handheld'
#     5) 'mri 10 button'
#Using a custom button box and need a hand? Contact our tech support team ([email protected])
listener = rp.RPixxButtonListener("dual - mri")

6. Checking for a RESPONSEPixx button press
Once you have a ButtonListener running, you can check for new button activity strategically during your experiment. Below is an example of log checking using an instance of the RPxButtonListener class.

Timestamps reported are in VPixx hardware clock time. If you want to compare the button event timing to some other event, like visual stimulus onset, you can take frame-accurate timestamps using the code in example 8. Do not equate PsychoPy system time and VPixx hardware time: separate clock systems can drift and desynchronize over time, making timing comparisons inaccurate.


#Code block best positioned "End of Routine" where routine is a trial 
#Reports all new button activity since the listener was instantiated OR the last call to updateLogs()
#Requires VPixx hardware to be connected, and an RPxButtonListener must be instantiated 
import numpy as np
#This follows the example above, using a 'dual - mri' button box. 
#Logged button events take the format [timetag, buttonID, eventType]
    # timetag is the time in seconds since your VPixx hardware last booted up
    # buttonID is the button that was active. Below is a list of button IDs to help interpret code:  
        # "handheld" button IDs: {0:'red',1:'yellow',2:'green',3:'blue',4:'white'}
        # "handheld - mri" button IDs: {0:'red',1:'yellow',2:'green',3:'blue'}
        # "dual - mri" button IDs: {0:'red',1:'yellow',2:'green',3:'blue'}
        # "dual handheld" button IDs: {0:'red',1:'yellow',2:'green',3:'blue'}
        # "mri 10 button" button IDs: {0:'red left',1:'yellow left',2:'green left',3:'blue left',
        #                               4:'white left',5:'red right',6:'yellow right',7:'green right',
        #                               8:'blue right',9:'white right'}
    # eventType can be either "Push" or "Release"
#If more than one event is logged, they will all be reported in the output.
#Only listen for red or yellow pushes. You can alter these variables to listen for different events
buttonSubset = [0,1] 
recordPushes = True
recordReleases = False
# update the Din logs for any new activity
output = listener.getNewButtonActivity(buttonSubset, recordPushes, recordReleases)
#This bit of code saves the logged events to a .csv file in the working directory. 
#If you call it repeatedly, it will append any new data to the same file.
#You may want to change the file name between blocks or participants to keep data organized.     
myFileName = 'RPxData.csv'
with open(myFileName, 'a') as RPxData:
    #save output
    if np.size(output)>0:
        np.savetxt(RPxData, output, delimiter=",", fmt='%s', comments="")
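Once saved, each logged event is just a row of [timetag, buttonID, eventType]. As a hypothetical post-processing sketch (the values below are sample data, not real output), you can compute reaction times by subtracting a hardware-timestamped stimulus onset from the button timetags, keeping everything on the VPixx clock:

```python
# Hypothetical logged output: [timetag (s), buttonID, eventType]
loggedEvents = [
    [1052.430, 0, 'Push'],   # red press
    [1055.112, 1, 'Push'],   # yellow press
]
# Stimulus onset timestamped on the SAME VPixx hardware clock
stimulusOnset = 1052.180

# Reaction time = button timetag minus stimulus onset
reactionTimes = [event[0] - stimulusOnset for event in loggedEvents]
print(round(reactionTimes[0], 3))   # 0.25
```

Because both timestamps come from the same hardware clock, the subtraction is immune to the PC/device clock drift warned about above.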

7. Closing the connection to hardware
A simple cleanup code block to run at the end of your experiment.


#Code block best positioned "End Experiment"
#Assumes library has been imported and device is connected
dp.DPxStopAllScheds()
dp.DPxWriteRegCache()

Calling pypixxlib commands in a Python IDE

Pypixxlib functions like other site packages in a Python environment. Once you have downloaded our software tools and installed pypixxlib, you will be able to import pypixxlib at the top of your script just like you would other packages (numpy, psychopy, matplotlib, etc).

There are examples of how to import our object-oriented tools or libdpx in the code snippets in the previous section of this guide. You can also find several IDE-based demos in the demo section of our pypixxlib documentation.

General tips for working with our tools in PsychoPy

Below are some general tips for working in PsychoPy. These apply to both Builder and IDE platforms.

Pixel identity passthrough
Pixel identity passthrough refers to the 1-to-1 mapping between a visual stimulus created in software and the stimulus presented on the display. Several of our features, including our special video modes for high bit depth colour, Pixel Mode, and the Pixel Sync register write/update, rely on pixel identity passthrough to work properly. If something in your graphics pipeline alters your assigned pixel colour values, these features can fail, producing incorrect colours, bad triggers and missed synchronization.

Here are some general tips for ensuring good pixel identity passthrough in PsychoPy:

  • Check for dithering. Graphics cards sometimes have this enabled by default. You can read more about this and how to disable it here.
  • Use the RGB255 colour space. The standard PsychoPy colour space (-1 to 1) rounds 8-bit colour values, causing a mismatch between the assigned pixel values and the detected output. In Builder, you may need to create your triggers and pixel sequences using code blocks rather than relying on drag-and-drop objects. 
  • Set win.fullscr to False. This is most important for Pixel Mode. In our testing, setting fullscreen to True alters the pixel values in the top row of the window. You should still use a full-resolution window (1920 x 1080 for most of our devices).
  • Be mindful of graphics manipulations like gamma correction and interpolation. Image adjustment methods like blending, antialiasing, filtering/smoothing and gamma correction can alter pixel values after the user draws them. Some of these can be disabled for individual stimuli; for example, interpolation can be disabled for individual visual objects. Others you may have to work around by picking strategic pixel values whose 8-bit channels are either 0 or 255 (red, green, blue, cyan, magenta, yellow, and white).
  • When in doubt, draw your pixel trigger or pixel sequence manually. See code block example #2 above for a demonstration.

Special video modes and sequencers

Our special high bit-depth video modes and high refresh rate sequencers can be used in Python. However, we recommend using an IDE, not the Builder, for these modes. The reason for this is that most of our video modes require custom shaders or image formatting that cannot be easily incorporated into the Builder’s pre-defined window properties. 

For an example of this formatting and how it can be automated, we have an example of 480Hz or “Quad4x” mode available here: [download]. More examples using our custom shaders will be made available soon.

Get in touch!

Still have questions? Pypixxlib code not working? Looking for a specific demo? Contact our technical support staff at [email protected]. Our team of trained vision scientists and software developers can help you get our tools up and running.


Peirce, J. W., Gray, J. R., Simpson, S., MacAskill, M. R., Höchenberger, R., Sogo, H., Kastman, E., & Lindeløv, J. (2019). PsychoPy2: Experiments in behavior made easy. Behavior Research Methods, 51, 195–203. https://doi.org/10.3758/s13428-018-01193-y

Cite this guide

Fraser, L. (2022, March 18). Using VPixx Hardware with PsychoPy. Retrieved [Month, Day, Year], from