VirtualNeuroLabs:

Matlab/Octave Tools for the Visual Neuroscience Class
J. Malo, M.J. Luque, M.A. Diaz, M.C. García

(c) Universitat de València 2014

Int. Conf. Education Research and Innovation 2014


What is VirtualNeuroLabs?

VirtualNeuroLabs is a series of exercises for the Visual Neuroscience class in which we present a number of computational tools that help the students learn by playing, as opposed to the classical analytical approach of the physical sciences.
Specifically, in these exercises the students simulate physiological/psychophysical experiments by recording the responses of virtual cells, or illustrate the behavior of neural models (virtual brains) when facing complex stimuli.

In each virtual lab, we pose the students a challenge that leads them to play with the elements (routines or pieces of code) and explore the behavior of these virtual sensory systems.


Following our experience in previous editions (Luque et al., Proc. ICERI 2013), we submitted two of these pedagogical experiences and materials to the Int. Conf. Educ. Res. Inn. 2014. See the proposed exercises and results below.
 

VirtualNeuroLab I: Simulating Physiological Experiments in Motion-Sensitive Neurons


The challenge:

Here we ask the students to take virtual neurons and stimulate them with controlled stimuli in order to characterize the cells in the same way as is done in physiological experiments. To do so, we need (1) software to generate the stimuli, (2) software to define the neurons, and (3) software to compute the response(s) given the stimulus and the sensor(s). With these tools, the students can define a set of stimuli and record the corresponding responses to (i) measure the receptive fields of V1 and MT cells in the Fourier domain, and (ii) measure speed tuning curves for these classes of sensors.
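The stimulation-recording loop can be sketched with plain Matlab/Octave code. Note this is a self-contained illustration, not the toolbox itself: the grating generator and the space-time Gabor sensor below are stand-ins for the actual BasicVideoTools routines used in tuning_experiment.m.

```matlab
% Illustrative stimulation-recording loop (stand-alone sketch; the real
% virtual lab uses BasicVideoTools routines instead of these stand-ins).

% Space-time sampling grid (1-D space for simplicity)
x = linspace(0, 1, 64);          % spatial positions (deg)
t = linspace(0, 0.5, 32);        % time samples (s)
[X, T] = meshgrid(x, t);

% Hypothetical linear sensor: space-time Gabor tuned to fx = 4 c/deg and
% ft = 8 Hz, i.e. a preferred speed of ft/fx = 2 deg/s.
fx = 4; ft = 8;
sensor = exp(-((X-0.5).^2/0.05 + (T-0.25).^2/0.01)) .* cos(2*pi*(fx*X - ft*T));

% Stimulate with drifting gratings of fixed fx and varying speed, and
% record a phase-independent response (energy of the quadrature pair).
speeds = 0:0.5:4;                % candidate speeds (deg/s)
response = zeros(size(speeds));
for i = 1:numel(speeds)
    stim_cos = cos(2*pi*fx*(X - speeds(i)*T));
    stim_sin = sin(2*pi*fx*(X - speeds(i)*T));
    response(i) = (sensor(:)'*stim_cos(:))^2 + (sensor(:)'*stim_sin(:))^2;
end

plot(speeds, response/max(response), 'o-');
xlabel('Stimulus speed (deg/s)'); ylabel('Normalized response');
title('Speed tuning curve of the virtual sensor');
```

The resulting curve peaks near the sensor's preferred speed, which is exactly the kind of tuning measurement the students perform on the unknown virtual cells.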

Worked example:    tuning_experiment.m mat
(Note: this virtual lab requires the installation and use of BasicVideoTools.)


Results:


Conclusions

In this work we presented a virtual lab (a set of Matlab/Octave tools) to simulate physiological experiments with motion-sensitive neurons. In particular, the students can design and perform experiments to determine the receptive field of unknown sensors and their speed-tuning properties. In this virtual lab the students learn the general stimulation-recording procedure of physiological experiments. Moreover, they learn to discriminate between V1 and MT cells through their very different bandwidths and speed sensitivities. This virtual lab is appropriate for Visual Neuroscience students who may be more interested in exploring the effect of the parameters (learning by playing) than in analyzing the mathematics.




VirtualNeuroLab II: Understanding the Excitation Patterns in V1 and MT areas

The challenge:

Here we ask the students to take characteristic sequences with controlled texture and velocity and use them to stimulate sets of virtual neurons with specific tuning properties. This allows the simultaneous visualization of the input sequence and the dynamic pattern of responses. To do so, we need (i) software to generate the stimuli and the neurons, (ii) software to compute the response(s) given the stimulus and the sensor(s), and, more importantly, (iii) software to visualize the frequency content of the stimulus and the bandwidth of the sensor. Note that issues (i) and (ii) were addressed in the previous virtual lab. Here we focus on the visualization of the spatio-temporal Fourier representation and of the dynamic response pattern. With these tools, the students analyze the spectrum of natural sequences and, as a result, select specific V1 and MT neurons that give rise to interesting response patterns for a better understanding of their behavior.
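The key visualization idea can be sketched with built-in Matlab/Octave functions (again a stand-alone illustration, not the BasicVideoTools implementation used in responses_experiment.m): a pattern translating at speed v concentrates its spectral energy on the line ft = -v*fx of the spatio-temporal Fourier domain, which is what the students see when inspecting natural sequences.

```matlab
% Illustrative spatio-temporal spectrum of a moving pattern (1-D space,
% so the sequence is an x-t matrix and fft2 gives the (fx, ft) spectrum).

nx = 64; nt = 32; v = 2;                 % speed in pixels/frame
[x, t] = meshgrid(0:nx-1, 0:nt-1);
seq = cos(2*pi*0.1*(x - v*t));           % drifting pattern, fx = 0.1 c/pix

S = abs(fftshift(fft2(seq)));            % centered amplitude spectrum
imagesc(log(1 + S)); axis xy;
xlabel('Spatial frequency f_x'); ylabel('Temporal frequency f_t');
title('Energy of a translating pattern lies on the line f_t = -v f_x');
```

Seeing this tilted-line structure makes it natural for the students to pick V1 sensors (tuned to particular frequency bands) versus MT sensors (tuned to particular speeds, i.e. particular tilts) when designing their response-pattern experiments.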

Worked example:    responses_experiment.m mat
(Note: this virtual lab requires the installation and use of BasicVideoTools.)


Results:


Conclusions

In this work we presented a virtual lab (a set of Matlab/Octave tools) to simulate the response to natural movies at different regions of the visual brain. The proposed tools allow a straightforward application of spatio-temporal filters that model the linear response of V1 and MT cells. Familiarity with such response patterns is essential to understand how the optical flow can be computed in the visual brain. Consistent with the message of the previous virtual lab, here the students see that MT cells truly segment the objects according to their speed, while V1 cells focus on frequency content. This is definitely a better way to explore the meaning of the spatio-temporal Fourier domain than through the explicit demonstration of its symmetry properties.



Installation and Requirements

 - Download BasicVideoTools_code.zip and the required virtual lab file (either responses_experiment.m or tuning_experiment.m).
 - Decompress it on your machine into the folder BasicVideoTools (no location restrictions for this folder).
 - Update the Matlab/Octave path, including all subfolders.
 - Tested on Matlab 2006b and later Matlab versions.

 * Video and image data are only required if you want to gather statistics from natural videos or from natural images with controlled speed.
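The path-update step can be done in one line from the folder where you decompressed the zip (standard Matlab/Octave commands; the folder name assumes the default location from the steps above):

```matlab
% Add BasicVideoTools and all its subfolders to the search path
% (works in both Matlab and Octave):
addpath(genpath('BasicVideoTools'));
savepath;   % optional: keep the path change for future sessions
```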
       

How to get started?

The presented VirtualNeuroLabs require BasicVideoTools since they involve motion-sensitive neurons. Please download BasicVideoTools first and update the path.
For a general overview of BasicVideoTools, take a look at the contents.m file: just type  help BasicVideoTools

For additional details on how to use the
BasicVideoTools functions in practice, see the demos:

     demo_motion_programs           - Demo of how to use most functions (except random dots and Newtonian sequences)
     example_random_dots_sequence   - Demo of random-dot sequences with controlled flow
     example_newtonian_sequence     - Demo of physics-controlled sequences

Then read the VirtualNeuroLabs files (either responses_experiment.m or tuning_experiment.m).


References