We are planning a visuomotor rotation experiment in the End-Point Lab in which, on some trials, subjects localize their unseen hand after a reaching movement. There is supposed to be no feedback during the movement. Upon movement termination, the subjects move a cursor to their perceived hand position using a trackball.
I thought about using the Hand_Feedback or Show_Target block to generate a VCode and adjusting it according to the trackball input to display the cursor, which brings me to the question of how to receive the input from the trackball.
Is there a possibility to either connect the trackball to the Dexterit-E computer and display it in the End-Point Lab, or can we connect the trackball to the PCI card or BNC board input (this would probably require some prior processing of the trackball signal)? Thank you a lot!
If possible, the simplest solution might be for the subject to use the other robot arm to localize their unseen hand. As long as this part of the task is unilateral, the subject could use the other robot arm as a mouse and you would use the displacement in its x,y to modify the x,y of the cursor.
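If you go that route, the mapping could be a simple relative-displacement ("mouse") rule: the cursor moves from its own starting point by the hand's displacement from where it started, scaled by a gain. This is only a sketch of the idea; the function name, argument layout, and gain are my own hypothetical choices, not part of Dexterit-E:

```python
def cursor_from_hand(hand_xy, hand_origin, cursor_origin, gain=1.0):
    """Mouse-style mapping (hypothetical): cursor position equals the cursor's
    origin plus the hand's x,y displacement from its own origin, times a gain."""
    dx = hand_xy[0] - hand_origin[0]
    dy = hand_xy[1] - hand_origin[1]
    return (cursor_origin[0] + gain * dx, cursor_origin[1] + gain * dy)

# Example: hand moved 1 cm right and 2 cm up from its origin,
# so the cursor moves by the same offset from its own origin.
print(cursor_from_hand((1.0, 2.0), (0.0, 0.0), (5.0, 5.0)))  # -> (6.0, 7.0)
```

With a gain other than 1.0 you can trade off the hand's range of motion against cursor precision, just as with mouse sensitivity.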
Otherwise, the biggest difficulty I foresee is converting the trackball data into something you can use (I haven’t tried interfacing with a trackball or mouse yet!). Here’s a bit on how to get that data into your task:
You can send information to the task using the PCI card, or you can use the Dexterit-E Computer (or another computer).
The PCI card is the NI-6229, and you can access it through the BNC board. There are the AI0–AI7 analog inputs and the P0 and PFI digital inputs that connect to the pins on the NI-6229. You can read the values on those pins inside your task using the appropriate Simulink NI-6229 blocks, which can be found in the Simulink Library Browser.
Inside your Simulink model, you can also use the UDP Receive block to receive data from an external source. You can connect an Ethernet cable and use a program to send UDP packets to the task. If you wanted to do this from the Dexterit-E Computer, you’d have to write a small program that reads the x,y values of the trackball and sends UDP packets to the Robot Computer.
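A minimal sketch of the sender side of that small program, in Python. The payload format (two little-endian doubles for x,y) and the port number are assumptions you'd have to match to whatever your UDP Receive block is configured to expect; the demo below loops the packet back on localhost instead of actually sending to the Robot Computer:

```python
import socket
import struct

def pack_xy(x, y):
    """Pack trackball x,y into two little-endian doubles.
    The wire format is an assumption -- match it to your UDP Receive block."""
    return struct.pack("<2d", x, y)

def send_xy(sock, addr, x, y):
    """Fire-and-forget one UDP packet containing the current x,y."""
    sock.sendto(pack_xy(x, y), addr)

if __name__ == "__main__":
    # Loopback demo. In the real setup the destination would be the Robot
    # Computer, e.g. ("192.168.0.2", 25000) -- the port 25000 is hypothetical.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))          # OS picks a free port
    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_xy(sender, receiver.getsockname(), 1.5, -2.0)
    data, _ = receiver.recvfrom(64)
    x, y = struct.unpack("<2d", data)
    print(x, y)  # -> 1.5 -2.0
```

In practice you'd call `send_xy` in a loop at some fixed rate with the latest trackball reading; since UDP is connectionless and lossy, sending the absolute x,y each time (rather than deltas) is more robust to dropped packets.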
The Dexterit-E Computer and Robot Computer are on the same LAN and have IPs 192.168.0.1 and 192.168.0.2 respectively. If you want to add another computer/device on the LAN that you can interface with during the task, you can; see the section titled Reserved IP Addresses in the Create Task Programs for Dexterit-E guide for IP addresses that are not reserved by us for internal use.
As you suggested, we localized the reaching hand using the other hand. For this, we constructed a “spring controller”: a spring attached between a fixed point in the workspace and the localizing hand. The spring's resting length is set to 0 cm, so the hand is pulled back to the fixed point as soon as it moves away. We used this approach to move a cursor: while the hand stays at the center, the cursor holds its position (cursor speed = 0); when the hand leaves the center, the cursor moves in the direction of the hand movement, with cursor speed proportional to the movement extent.
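That update rule is essentially velocity (rate) control: each tick, the cursor's velocity points along the hand's offset from the spring's fixed point, with speed proportional to the offset magnitude. A minimal sketch, where the function name, gain, and dead zone are hypothetical choices of mine rather than anything from the actual task:

```python
import math

def update_cursor(cursor_xy, hand_xy, center_xy, gain, dt, deadzone=0.0):
    """One tick of velocity control: move the cursor along the hand's offset
    from the fixed point, at a speed proportional to the offset magnitude."""
    ox = hand_xy[0] - center_xy[0]
    oy = hand_xy[1] - center_xy[1]
    if math.hypot(ox, oy) <= deadzone:
        return cursor_xy  # hand at (or near) the center: cursor speed is zero
    return (cursor_xy[0] + gain * ox * dt, cursor_xy[1] + gain * oy * dt)

# Hand held 1 cm to the right of the fixed point for one 0.5 s tick, gain 2.0:
print(update_cursor((0.0, 0.0), (1.0, 0.0), (0.0, 0.0), 2.0, 0.5))  # -> (1.0, 0.0)
```

A small nonzero `deadzone` can help if hand tremor around the fixed point would otherwise cause the cursor to drift while the subject is trying to hold it still.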
The spring controller movement is so different from the reach adaptation task that we are not worried about any interfering effects between them.
P.S. To confirm the cursor position after subjects navigated to the perceived hand location, we used a pedal connected to the digital input.