MATLAB/OCTAVE TOOLS FOR THE VISUAL NEUROSCIENCE CLASS II: UNDERSTANDING THE EXCITATION PATTERNS IN V1 AND MT AREAS
1 Universitat de Valencia (SPAIN)
2 Fundación Oftalmológica del Mediterraneo (SPAIN)
About this paper:
Appears in:
ICERI2014 Proceedings
Publication year: 2014
Pages: 3797-3802
ISBN: 978-84-617-2484-0
ISSN: 2340-1095
Conference name: 7th International Conference of Education, Research and Innovation
Dates: 17-19 November, 2014
Location: Seville, Spain
Abstract:
Introduction: the need for virtual labs in visual neuroscience:
A problem in teaching Visual Neuroscience is that students with a biomedical background are often unaware of the advantages of quantitative thinking and tend to get stuck on the mathematics. Following [1,2], here we present didactic tools to help neuroscience students work with mathematical models of the visual brain.
The proposed virtual lab: excitation patterns in virtual V1 and MT:
V1 and MT cells are both sensitive to temporal changes in a scene, but they exhibit characteristic response patterns to moving stimuli that students must be familiar with in order to understand motion processing in the visual brain. To this end, the students are prompted to simulate an experiment as it would be conducted in a neurophysiology lab: they record the responses to appropriately chosen stimuli from cells with particular tuning characteristics. They use the virtual neurons whose properties were measured in another virtual lab [2]. In the exercise (virtual lab) proposed here, the analysis focuses on the visualization of the spatio-temporal Fourier representation and of the dynamic response pattern. Here we present these computation and visualization tools. With these tools, the students analyze the spectrum of natural sequences and, as a result, select specific V1 and MT neurons that will give rise to interesting response patterns, for a better understanding of their behavior. The general code for this virtual lab is available at http://isp.uv.es/soft_visioncolor.htm in the section VirtualNeuroLabs; the specific exercise (virtual lab), with a step-by-step explanation, is response_experiment.m.
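The core of the spatio-temporal Fourier visualization can be sketched in a few lines of MATLAB/Octave. This is a minimal, illustrative sketch only: the variable names and plotting choices are ours, not those of response_experiment.m, and the actual course code at the URL above is more complete.

```matlab
% Sketch: spatio-temporal Fourier representation of a video sequence.
% 'seq' is assumed to be an Ny-by-Nx-by-Nt array (rows, columns, frames);
% names are illustrative, not taken from response_experiment.m.
F = fftshift(fftn(seq));   % 3D DFT with zero frequency at the center
A = abs(F);                % amplitude spectrum
% A pattern translating at speed (vx,vy) concentrates its energy on the
% plane ft = -(vx*fx + vy*fy) in the (fx,fy,ft) frequency domain.
% Visualize an (fy,ft) slice through the fx = 0 plane:
imagesc(squeeze(log(1 + A(:, ceil(end/2), :))));
xlabel('temporal frequency f_t'); ylabel('vertical spatial frequency f_y');
```

For a stimulus with a few strong stripes moving at a known speed, the energy in such slices lies along a tilted line whose slope is the speed, which the students can compare with their by-hand estimates.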
Results and Conclusions:
Although the students could eventually work with any natural video sequence, teachers must devote some time to producing or choosing video sequences that yield illustrative results. We have generated a collection of brief video sequences with few, strongly patterned natural moving stimuli, which allows the students to easily estimate spatial frequencies (for instance, by simply counting stripes in the figure) and motion speeds, and to compare these estimates with the graphical information provided by the software. The results illustrate the most important properties of V1 and MT neurons. For instance, V1 neurons are tuned to specific spatio-temporal frequencies, so object information is lost: they respond to roughly any pattern of the right frequency. MT cells, in contrast, respond to any spatial frequency whenever it moves at the right speed, and therefore they signal objects moving with that speed [3]. These responses to speed are consistent with perceptually relevant motion signals [4].
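The contrast between the two tuning behaviors can be made concrete with a small frequency-domain sketch in the spirit of [3]. All parameter values below (preferred frequency, speed, bandwidth) are arbitrary illustrative choices, not the ones used in the course software.

```matlab
% Sketch (illustrative parameters): V1 vs. MT tuning in the (fx,ft) plane.
fx = -0.5:0.01:0.5;  ft = -0.5:0.01:0.5;      % cycles/pixel, cycles/frame
[FX, FT] = meshgrid(fx, ft);
fx0 = 0.2;  ft0 = -0.1;  sigma = 0.05;         % a V1 cell's preferred point
v = 0.5;                                       % an MT cell's preferred speed
% V1: tuned to one spatio-temporal frequency (a point in the domain).
V1 = exp(-((FX - fx0).^2 + (FT - ft0).^2) / (2*sigma^2));
% MT: tuned to a speed, i.e. to the whole line ft = -v*fx, whatever fx is.
MT = exp(-(FT + v*FX).^2 / (2*sigma^2));
% Any pattern moving at speed v satisfies ft = -v*fx, so it excites the MT
% cell regardless of its spatial frequency, but excites V1 only near fx0.
subplot(1,2,1); imagesc(fx, ft, V1); title('V1 (frequency tuned)');
subplot(1,2,2); imagesc(fx, ft, MT); title('MT (speed tuned)');
```

Plotting both sensitivities side by side shows the students why V1 responses depend on the stimulus pattern while MT responses depend only on its speed.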
References:
[1] M.J. Luque, D. de Fez, M.C. García and V. Moncho. Tools for generating customized visual stimuli in visual perception labs using computer controlled monitors. Proc. ICERI 2013 Conf., 2013, pp. 6200-6207.
[2] J. Malo, M.J. Luque, A. Díez, M.C. García. Matlab/Octave Tools for the Visual Neuroscience Class I: Simulating Physiological Experiments in Motion Sensitive Neurons. Submitted to Proc. ICERI 2014 Conf. 2014.
[3] E. Simoncelli and D. Heeger. A model of neuronal responses in visual area MT. Vision Research, Vol. 38, No. 5, pp. 743-761, 1998.
[4] J. Malo, J. Gutiérrez and I. Epifanio. What motion information is perceptually relevant? Journal of Vision, 1(3), 309a, http://journalofvision.org/1/3/309, DOI 10.1167/1.3.309, 2001.
Keywords:
Teaching visual neuroscience, computational models, virtual laboratory, motion perception, response to complex motion patterns.