AnimatLab explained

AnimatLab
Developers: David W. Cofer, Gennady Cymbalyuk, James Reid, Ying Zhu, William J. Heitler, Donald H. Edwards
Programming languages: C++, VB.NET
Latest release version: 2.1.5
Latest release date: [1]
Operating system: Windows
Genre: Neuromechanics

AnimatLab is an open-source[2] neuromechanical simulation tool that allows authors to build and test biomechanical models and the neural networks that control them to produce behaviors. Users can construct neural models at varying levels of detail, build 3D mechanical models from triangle meshes, and use muscles, motors, receptive fields, stretch sensors, and other transducers to interface the two systems. Experiments can be run in which various stimuli are applied and data are recorded, making it a useful tool for computational neuroscience. The software can also be used to model biomimetic robotic systems.

History

The application was initially developed at Georgia State University under NSF grant #0641326.[3] Version 1 of AnimatLab was released in 2010. Development has continued, and a second version was released in June 2013.

Functionality

AnimatLab lets users build models at many levels of detail by offering a range of model types. Neurons can be instantiated as simple firing-rate models, integrate-and-fire models, or Hodgkin–Huxley models, and plugins can add further neuron models. Joints are actuated by Hill-type muscles, motors, or servos, with adapters interfacing between neurons and actuators to generate forces. Adapters also close feedback loops from mechanical components, such as joints, body segments, and muscles, back to the control system. Various stimuli, including voltage clamps, current clamps, and velocity clamps for joints, can be applied to design tailored experiments. Data can be recorded from the different components of the system and either viewed graphically or exported as comma-separated values files for analysis, all through a graphical user interface.

Neural modeling

A variety of biological neuron models are available, including the Hodgkin–Huxley model, single- and multi-compartment integrate-and-fire models, and several abstracted firing-rate models.[4] This range matters because a model's purpose and complexity determine which features of neural behavior are important to simulate.[5]
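
To make the trade-off concrete, the leaky integrate-and-fire model captures spike timing with a single state variable. Below is a generic Euler-integration sketch of such a neuron; the function name `simulate_lif` and all parameter values are illustrative assumptions, not AnimatLab's API:

```python
def simulate_lif(i_ext, dt=0.1, tau=10.0, r=1.0, v_rest=-60.0,
                 v_thresh=-50.0, v_reset=-60.0):
    """Euler-integrate a leaky integrate-and-fire neuron.

    i_ext: injected current (nA) at each time step of length dt (ms).
    Returns the membrane-voltage trace and the spike step indices.
    """
    v = v_rest
    trace, spikes = [], []
    for step, i in enumerate(i_ext):
        # dV/dt = (-(V - V_rest) + R*I) / tau
        v += dt * (-(v - v_rest) + r * i) / tau
        if v >= v_thresh:  # threshold crossing: record spike, then reset
            spikes.append(step)
            v = v_reset
        trace.append(v)
    return trace, spikes

# 100 ms of constant 20 nA drive produces regular spiking.
trace, spikes = simulate_lif([20.0] * 1000)
```

A Hodgkin–Huxley model would replace the single linear leak term with voltage-dependent ion-channel conductances, at several times the computational cost per neuron.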

Network construction is graphical: neurons are dragged and dropped into a network, and synapses are drawn between them. When a synapse is drawn, the user specifies its type. Spiking and non-spiking chemical synapses are available, as well as electrical synapses. Both short-term (facilitation) and long-term (Hebbian) learning mechanisms are available, greatly increasing the capability of the nervous systems that can be constructed.
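
The dynamics of a small non-spiking network built this way can be sketched in a few lines. The weights, time constant, and half-center wiring below are illustrative assumptions, not AnimatLab defaults:

```python
def step_rate_network(rates, weights, inputs, dt=1.0, tau=20.0):
    """One Euler step of a non-spiking firing-rate network.

    weights[i][j] is the synaptic weight from neuron j onto neuron i;
    negative weights are inhibitory. Rates are rectified at zero.
    """
    new_rates = []
    for i, r in enumerate(rates):
        syn = sum(w * pre for w, pre in zip(weights[i], rates))
        new_rates.append(max(0.0, r + dt * (-r + syn + inputs[i]) / tau))
    return new_rates

# Two mutually inhibitory neurons: the more strongly driven one wins,
# a half-center-like motif common in locomotion networks.
rates = [0.0, 0.0]
weights = [[0.0, -0.5], [-0.5, 0.0]]
for _ in range(500):
    rates = step_rate_network(rates, weights, inputs=[1.0, 0.4])
```

Adding facilitation or Hebbian learning would amount to making the entries of `weights` state variables with their own update rules.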

Rigid body modeling

Body segments are modeled as rigid bodies drawn as triangle meshes with uniform mass density.[4] Meshes can be selected from a set of primitives (cube, ellipsoid, cone, etc.) or imported from third-party software such as Maya or Blender. Physics is simulated with the Vortex engine. Users can specify separate collision and graphical meshes for a rigid body, greatly reducing simulation time. In addition, material properties and the interactions between materials can be specified, allowing different restitution, coefficients of friction, and so on within the simulation.
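
To make one of these material properties concrete: the coefficient of restitution is the fraction of impact speed retained after a collision, so a dropped ball's rebound height scales with its square. A minimal sketch of that relationship (independent of the Vortex engine, which resolves this per contact):

```python
def bounce_heights(h0, restitution, n_bounces):
    """Peak height after each of n_bounces for a ball dropped from h0.

    Rebound speed is restitution * impact speed, and peak height goes
    as speed squared, so each bounce scales height by restitution**2.
    """
    heights = []
    h = h0
    for _ in range(n_bounces):
        h *= restitution ** 2
        heights.append(h)
    return heights
```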

Muscle modeling

A Hill-type muscle model modified according to Shadmehr and Wise can be used for actuation. Muscles are controlled by placing a voltage-tension adapter between a motor neuron and a muscle. Muscles also have stiffness and damping properties, as well as length-tension relationships that govern their behavior. Muscles are placed to act on muscle attachment bodies in the mechanical simulation, which then apply the muscle tension force to the other bodies in the simulation.
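
A minimal quasi-static sketch of such a muscle is shown below, with a Gaussian length-tension curve and a passive spring-damper. All parameter values are illustrative, and this simplifies the Shadmehr–Wise formulation, which integrates tension through a serial elastic element rather than computing it algebraically:

```python
import math

def muscle_tension(activation, length, velocity,
                   rest_length=1.0, f_max=100.0, k_pe=50.0,
                   damping=5.0, width=0.3):
    """Quasi-static tension (N) of a simplified Hill-type muscle.

    activation: 0..1 drive from the motor neuron (via an adapter).
    Active force follows a Gaussian length-tension curve peaking at
    rest_length; passive stiffness and damping resist stretch.
    """
    length_tension = math.exp(-((length - rest_length) / width) ** 2)
    active = activation * f_max * length_tension
    stretch = max(0.0, length - rest_length)
    passive = k_pe * stretch + damping * max(0.0, velocity)
    return active + passive
```

The length-tension term is why an over-stretched or over-shortened muscle produces less active force even at full activation.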

Sensory modeling

Adapters may be placed to convert rigid-body measurements to neural activity, much as voltage-tension adapters are used to activate muscles. The measurements may be joint angles or velocities, rigid-body forces or accelerations, or behavioral states (e.g. hunger).
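
The core of such an adapter is a calibrated rescaling of a mechanical quantity into a neural input. A sketch, with assumed ranges (the function name and units are illustrative, not AnimatLab's API):

```python
def angle_to_current(angle, angle_min=-1.57, angle_max=1.57,
                     i_min=0.0, i_max=20.0):
    """Linear adapter from a joint angle (rad) to injected current (nA).

    A mechanical measurement is rescaled into a neural input, with
    values outside the calibrated range clamped to its endpoints.
    """
    angle = min(max(angle, angle_min), angle_max)
    frac = (angle - angle_min) / (angle_max - angle_min)
    return i_min + frac * (i_max - i_min)
```

Evaluated each simulation step, this closes the loop from body back to nervous system, e.g. driving a stretch-receptor neuron from a joint's current angle.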

In addition to these scalar inputs, contact fields may be specified on rigid bodies, which then provide pressure feedback to the system. This functionality has been used for skin-like sensing[4] and to detect leg loading in walking structures.[6]
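
One common way to turn discrete contact points into a field of sensory neurons is to spread each contact's pressure over nearby field centers. The Gaussian-falloff scheme below is an illustrative assumption, not necessarily how AnimatLab weights its receptive fields:

```python
import math

def field_responses(centers, contacts, width=1.0):
    """Activation of receptive-field neurons along a 1-D body segment.

    centers: field-neuron positions; contacts: (position, pressure)
    pairs. Each contact excites every field neuron with a Gaussian
    falloff of its pressure over distance to the field center.
    """
    responses = []
    for c in centers:
        total = 0.0
        for position, pressure in contacts:
            total += pressure * math.exp(-((position - c) ** 2)
                                         / (2.0 * width ** 2))
        responses.append(total)
    return responses

# A single 2-unit press at one end of the segment: the nearest field
# neuron responds most strongly, with graded spread to its neighbors.
resp = field_responses([0.0, 1.0, 2.0], [(0.0, 2.0)])
```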

Stimulus types

Stimuli can be applied to mechanical and neural objects in simulation for experimentation. These include current and voltage clamps, as well as velocity clamps for joints between rigid bodies.
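
A typical experiment script boils down to evaluating a stimulus waveform at each simulation step. A sketch of a square current-clamp pulse (onset, offset, and amplitude values are illustrative):

```python
def current_step(t, onset=100.0, offset=300.0, amplitude=10.0):
    """Square current-clamp pulse: amplitude (nA) while onset <= t < offset (ms)."""
    return amplitude if onset <= t < offset else 0.0

# 500 ms of stimulus sampled every 0.1 ms, e.g. to feed a neuron model.
stim = [current_step(step * 0.1) for step in range(5000)]
```

A velocity clamp on a joint has the same shape in time but overrides the joint's velocity instead of injecting current.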

Graph types

Data can be output as line graphs and two-dimensional surface plots. Line graphs suit most data types, including neural and synaptic output as well as body and muscle dynamics, while surface plots are useful for displaying activation on contact fields. Both can be exported as comma-separated values files, allowing further quantitative analysis in other software such as MATLAB or Excel.
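
The exported format is plain tabular CSV: one time column plus one column per recorded quantity. A sketch of producing such a file from recorded traces (column names are illustrative):

```python
import csv
import io

def export_traces(times, traces):
    """Render recorded traces as comma-separated values text.

    times: sample time stamps; traces: dict mapping a column name to a
    list of samples with the same length as times.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time"] + list(traces))
    for i, t in enumerate(times):
        writer.writerow([t] + [traces[name][i] for name in traces])
    return buf.getvalue()

csv_text = export_traces([0.0, 0.1], {"v_neuron1": [-60.0, -59.5]})
```

Files in this shape load directly into MATLAB's `readmatrix`, Excel, or pandas for analysis.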

Research performed with AnimatLab

Many academic projects have used AnimatLab to build neuromechanical models and explore behavior, including simulations of the cat paw-shake response,[7] [8] neuromechanical analyses of the locust jump,[9] [10] [11] and studies of limb coordination during multi-legged walking.[12]

Notes and References

  1. "AnimatLab > Download". animatlab.com. Retrieved 2021-03-25.
  2. "AnimatLab.com - Neuromechanical & Biomechanical Simulation". www.animatlab.com. Retrieved 2024-04-01.
  3. "National Science Foundation Awards". 2010-01-28. Retrieved 2023-11-09.
  4. Cofer, D. W., Cymbalyuk, G., Reid, J., Zhu, Y., Heitler, W., Edwards, D. H. (2010). "AnimatLab: A 3-D graphics environment for neuromechanical simulations". Journal of Neuroscience Methods. 187 (2): 280–288. doi:10.1016/j.jneumeth.2010.01.005.
  5. Izhikevich, E. M. (2004). "Which model to use for cortical spiking neurons". IEEE Transactions on Neural Networks. 15 (5): 1063–1070. doi:10.1109/TNN.2004.832719.
  6. Szczecinski, N. S. (2013). Massively distributed neuromorphic control for legged robots modeled after insect stepping. Master's thesis, Case Western Reserve University.
  7. Klishko, A., Cofer, D. W., Edwards, D. H., Prilutsky, B. (2008). "Extremely high paw acceleration during paw shake in the cat: a mechanism revealed by computer simulations". Abstr. Am. Phys. Soc. Meeting A38.00007.
  8. Klishko, A., Prilutsky, B., Cofer, D. W., Cymbalyuk, G., Edwards, D. H. (2008). "Interaction of CPG, spinal reflexes and hindlimb properties in cat paw shake: a computer simulation study". Neuroscience Meeting Planner Online, Program No. 375.12. Society for Neuroscience.
  9. Cofer, D. W. (2009). Neuromechanical Analysis of the Locust Jump (Ph.D. dissertation). Available from digital archive database. (Article No. 1056)
  10. Cofer, D. W., Cymbalyuk, G., Heitler, W. J., Edwards, D. H. (2010). "Neuromechanical simulation of the locust jump". Journal of Experimental Biology. 213: 1060–1068. doi:10.1242/jeb.034678.
  11. Cofer, D. W., Cymbalyuk, G., Heitler, W. J., Edwards, D. H. (2010). "Control of tumbling during the locust jump". Journal of Experimental Biology. 213 (19): 3378–3387. doi:10.1242/jeb.046367.
  12. Rinehart, M. D., Belanger, J. H. (2009). "Biologically realistic limb coordination during multi-legged walking in the absence of central connections between legs". Society for Neuroscience Annual Meeting.