The Second IASTED International Conference on
Computational Bioscience
CompBio 2011

July 11 – 13, 2011
Cambridge, United Kingdom

TUTORIAL SESSION

Neuromechanical Simulations using AnimatLab Software

Dr. Donald H. Edwards
Georgia State University, USA
biodhe@langate.gsu.edu

Abstract

AnimatLab is a Windows®-based software environment for modeling an animal’s nervous system and body in a virtual physical world. The tool allows researchers and students to create realistic 3-D models of an animal’s musculoskeletal structure, its internal and external sensory receptors, and the neural circuits that link those receptors to the body’s muscles. The models are situated in a virtual physical world subject to gravity and simple hydrodynamics, together with other objects and other animal models. The sensory receptors respond to environmental stimuli, and the excited afferents interact with central neural circuits to drive patterns of muscle activity that move the animal model within the environment. Simulations are run to determine how the animal model’s movements and behavior result from its interactions with the external world and the responses of its neural circuits. AnimatLab addresses researchers’ need to explore the physical and neural consequences of sensorimotor interactions within closed feedback loops and during unrestrained movements, where direct experimental measurement is difficult.
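As a rough illustration of such a closed sensorimotor loop (this is not AnimatLab code; the loop structure and every parameter value below are arbitrary assumptions made for the sketch), the following Python fragment couples a stretch-receptor-like sensor, a firing-rate neuron, a muscle torque, and a one-joint limb, so that the limb’s movement feeds back onto what the receptor senses:

import math

DT = 0.001            # integration time step (s)
THETA_REST = 0.0      # joint angle the reflex pulls toward (rad)

theta, omega = 0.6, 0.0    # joint angle (rad) and angular velocity (rad/s)
rate = 0.0                 # output of the firing-rate "neuron" (arbitrary units)

for step in range(5001):   # 5 s of simulated time
    if step % 1000 == 0:
        print(f"t={step * DT:.1f} s  angle={theta:.3f} rad  neuron rate={rate:.2f}")

    # Sensory receptor: afferent drive grows with stretch away from rest.
    afferent = max(0.0, theta - THETA_REST)

    # Firing-rate neuron: first-order low-pass response to the afferent input.
    tau_n = 0.02           # neural time constant (s)
    rate += DT / tau_n * (-rate + 5.0 * afferent)

    # Muscle: torque proportional to the neural drive, opposing the stretch.
    muscle_torque = -1.0 * rate

    # Body: a damped, gravity-loaded one-joint limb driven by the muscle.
    inertia, damping, gravity = 0.05, 0.2, 0.5
    accel = (muscle_torque - damping * omega - gravity * math.sin(theta)) / inertia
    omega += DT * accel
    theta += DT * omega    # the movement changes what the receptor senses next

In AnimatLab the equivalent loop is assembled graphically and simulated by its physics and neural solvers rather than by the toy dynamics used here.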
The tutorial will begin with an introduction that includes three brief demonstrations of AnimatLab’s capabilities: (i) a model of alpha-gamma co-activation and the myotatic reflex in vertebrate limb movements, (ii) a model of kicking and jumping in locusts, and (iii) a hybrid experimental/model system used to study real-time open- and closed-loop control of posture and walking in crayfish. I will then describe AnimatLab’s overall structure, including the body editor, the neural editor, the model environment, stimulus configurations, data display, and the physics and neural solvers.
The body of the tutorial will be a description of how to build a model using the body and neural editors. A simple hexapod walker will be constructed from a set of Lego-like body parts, muscles, and sensors. A neural control circuit will then be assembled by creating single- and multi-compartment neuron models from a variety of single-compartment types, including firing-rate, integrate-and-fire, and Hodgkin-Huxley models. These neuron models will be linked through conductance-based excitatory and inhibitory chemical synapses and/or rectifying and non-rectifying electrical synapses to form functional neural circuits. Sensory receptors on the body surface will be linked to primary sensory afferents, while motor neurons will excite Hill-like muscle models that span limb joints. Simulations will be run at each stage of construction to examine the behavior of the model in animation and the changes in model variables, including joint angles, muscle tensions, membrane potentials, and stimulus intensities.
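As a hint of the kind of single-compartment dynamics these neuron and synapse types involve, the sketch below (again illustrative Python rather than AnimatLab’s interface, with arbitrary but physiologically plausible values) implements a leaky integrate-and-fire neuron driven by a conductance-based excitatory synapse:

# Units: mV, nA, uS (microsiemens), nF, ms.
C_M, G_LEAK, E_LEAK = 1.0, 0.1, -60.0   # membrane capacitance, leak conductance, leak reversal
V_THRESH, V_RESET = -45.0, -65.0        # spike threshold and post-spike reset potential
E_SYN, TAU_SYN = 0.0, 5.0               # excitatory reversal potential, synaptic decay constant
DT = 0.1                                # time step (ms)

v = E_LEAK          # membrane potential (mV)
g_syn = 0.0         # synaptic conductance (uS)
spike_times = []

for step in range(5000):                # 500 ms of simulated time
    t = step * DT
    if step % 200 == 0:                 # a presynaptic spike arrives every 20 ms
        g_syn += 0.2                    # conductance increment per presynaptic spike
    g_syn -= DT * g_syn / TAU_SYN       # exponential decay of the synaptic conductance

    # Conductance-based currents: leak plus synaptic drive toward E_SYN.
    i_leak = G_LEAK * (E_LEAK - v)
    i_syn = g_syn * (E_SYN - v)
    v += DT * (i_leak + i_syn) / C_M

    if v >= V_THRESH:                   # integrate-and-fire: record a spike, then reset
        spike_times.append(t)
        v = V_RESET

print(f"{len(spike_times)} spikes; first few at (ms): {spike_times[:5]}")

In the tutorial these elements will be configured through the neural editor rather than written as code.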
The hexapod walker described above will provide a point of departure for participants to build their own models during the final part of the tutorial. To this end, participants are encouraged to bring laptops with AnimatLab installed so that they can create their own models. AnimatLab is free and is available, together with video tutorials and a software developer’s kit, at www.animatlab.com. AnimatLab development was supported by grants from the NSF and the NIH.

Objectives

1. Introduce AnimatLab, a novel open-source software environment for neuromechanical simulations (Cofer D, Cymbalyuk G, Reid J, Zhu Y, Heitler WJ, Edwards DH. AnimatLab: a 3D graphics environment for neuromechanical simulations. J Neurosci Methods. 2010 Mar 30;187(2):280-288).
2. Teach participants to build models of the skeleton, muscles, neurons (motoneurons, interneurons, sensory receptors), and neural circuits (central pattern generators, reflexes) in order to study different biological behaviors through computer simulation.

Timeline

1. Introduction and examples (15 min)
2. Model development
2.1. Tools (15 min)
2.2. Environment (15 min)
2.3. Elements of the organism
- segment (15 min)
- joint (15 min)
- muscle, muscle receptors, and skin receptors (15 min)
- stimuli (15 min)
- neuron (15 min)
- central pattern generator (15 min)
3. Simulations (15 min)
4. Graphical output (15 min)
5. Applications (15 min)

Tutorial Materials

All participants are advised to bring their own laptops. To benefit fully from the tutorial, participants should run the AnimatLab program and follow along with the models as the presenter describes them.
The tutorial presenter will distribute copies of the AnimatLab application that can be installed on participants’ laptops. The program can also be downloaded at http://animatlab.com/Download.htm

Background Knowledge Expected of the Participants

Basic knowledge of mechanics, anatomy and neurophysiology

Qualifications of the Instructor(s)

Donald H. Edwards is Regents’ Professor of Neuroscience at Georgia State University. He graduated from the Massachusetts Institute of Technology with a B.S. in Electrical Engineering in 1970 and earned a Ph.D. at Yale University in 1976 after studying visual neurophysiology with Timothy Goldsmith. He then moved to Stanford University as a postdoctoral associate to study sensorimotor integration in crayfish with Donald Kennedy. In 1979, he moved to the University of California at Davis to work on similar questions with Brian Mulloney, using both computational and experimental approaches. He joined the faculty of the Biology Department at Georgia State University in 1981, where he has studied mechanisms of sensorimotor integration, neuromodulation, and behavior in crayfish, again using both computational and experimental techniques. He spent 1992-93 at the National Science Foundation in Washington, D.C., as the program officer for computational neuroscience. In 2004, he led development of the Brains & Behavior Area of Focus at Georgia State University, which led to the formation of the Neuroscience Institute in 2008. AnimatLab was developed by Dr. David W. Cofer as part of his dissertation research supervised by Edwards.