SENSOPAC

    Responsible partner

    DLR - PAVIA - UEDIN

    Topic

    In this module, the results from the other modules are integrated into a common demonstrator.

    To demonstrate the work being done by the SENSOPAC partners, a number of demos have been created, showing the added value of combining neuroscience and robotics research. One demonstrator concentrates on a task that is simple for humans but, to date, impossible for robots. The "simple" task is this: take an unseen object in your hand and decide what it is. For most objects this is rather trivial for a human, since you can "feel" their surface and estimate their shape and form from their dynamic properties, yet it remains impossible for even the most advanced robotic systems around. To address this essential scenario, the final SENSOPAC demonstrator is constructed as follows:

    1. with a robotic hand-arm system, grasp an object
    2. identify the object from the following properties (a minimal sketch of this identification step is given after the list):
      1. its weight;
      2. its dynamic properties, obtained by shaking it.
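
    As a purely illustrative sketch of step 2, the snippet below identifies an object by comparing a measured weight and a crude "shaking" feature against a small reference set. The REFERENCE_OBJECTS table, the slosh_score feature and the nearest-reference matching are assumptions made for the example, not the SENSOPAC implementation.

        import numpy as np

        # Hypothetical reference set: each object is described by its weight (kg) and
        # a simple "slosh" score extracted from force readings recorded while shaking.
        REFERENCE_OBJECTS = {
            "empty glass":     {"weight": 0.20, "slosh": 0.05},
            "half-full glass": {"weight": 0.32, "slosh": 0.60},
            "full glass":      {"weight": 0.45, "slosh": 0.25},
        }

        def slosh_score(force_trace):
            """Crude dynamic-property feature: RMS of force fluctuations while shaking."""
            fluctuation = force_trace - np.mean(force_trace)
            return float(np.sqrt(np.mean(fluctuation ** 2)))

        def identify(weight, force_trace):
            """Return the reference object whose features are closest to the measurement."""
            features = np.array([weight, slosh_score(force_trace)])
            best, best_dist = None, np.inf
            for name, ref in REFERENCE_OBJECTS.items():
                ref_vec = np.array([ref["weight"], ref["slosh"]])
                dist = np.linalg.norm((features - ref_vec) / ref_vec)  # relative distance
                if dist < best_dist:
                    best, best_dist = name, dist
            return best

        # Example: a measured weight of 0.31 kg and a noisy force trace recorded while shaking.
        trace = 0.31 * 9.81 + 0.6 * np.sin(np.linspace(0, 40, 2000)) + 0.05 * np.random.randn(2000)
        print(identify(0.31, trace))   # expected to match "half-full glass"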

    Despite the advances in robotics and neuroscience modelling, it cannot be expected that SENSOPAC solves this problem as efficiently as humans do. Nonetheless, for a limited set of objects to discriminate between, possibly parameterised as, e.g., "full glass", "half-full glass" and "empty glass", this capability is demonstrated in the Videos section of this website.

    Results

    Solving the challenge of integrating neuroscience and robotics research places high demands on the level of understanding of brain functionality. For this purpose, the neuroscience research within SENSOPAC is focussed on the cerebellum, whose circuitry features and functionality have been characterized to a relatively high degree of detail. Using a combination of previously obtained basic data and the neuroscience studies and models of SENSOPAC, we have developed a model system representing the essential functional features of the cerebellar region controlling arm-hand motions. The relevant data are focussed on cerebellar circuitry mechanisms but also extend to circuitry mechanisms in other brain structures associated with this system. As the essential neuronal connectivity patterns between these substructures and the cerebellar circuitry are also being characterized, we can apply reverse-engineering techniques from electronics and mathematics to work out their fundamental functional features. These features are represented in an executable model of cerebellar arm-hand control named LSAM (Large Scale Analog Model).

    SENSOPAC also features the development of an advanced movement planner for the robotic system. In its interaction with LSAM, this planner is viewed as essentially carrying out the function of the cerebral cortex, in particular the motor cortex (MCX). Although the circuitry design of the cerebellar arm-hand system is very clever, it is not magic. In fact, the brain's hand-arm control appears to use some of the same feedback-control techniques that are commonly used in the process industry, including, for example, an inverse and a forward model (IM & FM).
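
    The IM/FM idea can be illustrated with a toy feedback-control loop. The sketch below, for a one-dimensional point mass, is purely illustrative and is not LSAM: the plant, the gains and the mismatch between the internal models and the real mass are all assumed for the example. The inverse model turns a desired acceleration into a motor command, the forward model predicts the sensory outcome of that command, and the prediction error plays the role of a teaching signal.

        # Toy illustration of feedback control with an inverse model (IM) and a
        # forward model (FM) for a 1-D point mass. All dynamics, gains and the
        # model/plant mismatch are illustrative assumptions, not LSAM itself.

        dt, model_mass = 0.01, 1.0    # time step (s), mass assumed by the internal models (kg)
        true_mass = 1.2               # the real arm is heavier than the models believe

        def inverse_model(desired_acc):
            """IM: map a desired acceleration to a motor command (force)."""
            return model_mass * desired_acc

        def forward_model(state, force):
            """FM: predict the next (position, velocity) resulting from a command."""
            pos, vel = state
            acc = force / model_mass
            return pos + vel * dt, vel + acc * dt

        def plant(state, force):
            """The actual arm, whose mass the internal models do not know exactly."""
            pos, vel = state
            acc = force / true_mass
            return pos + vel * dt, vel + acc * dt

        target, kp, kd = 1.0, 50.0, 10.0   # reach target (m) with PD feedback gains
        state = (0.0, 0.0)
        for _ in range(300):
            pos, vel = state
            desired_acc = kp * (target - pos) - kd * vel   # feedback (PD) term
            u = inverse_model(desired_acc)                 # feedforward command from the IM
            predicted = forward_model(state, u)            # FM prediction of the outcome
            state = plant(state, u)                        # what the real arm actually does
            prediction_error = state[1] - predicted[1]     # FM error: a candidate teaching signal

        print(f"final position: {state[0]:.3f} m (target {target} m), "
              f"last FM velocity error: {prediction_error:.5f} m/s")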

    LSAM has been implemented with a robot system in order to provide a link between the neuroscience models, the movement planner and the robot system of SENSOPAC. As a final part of the project, the contribution of the modelled brain circuitry mechanisms to the haptic discrimination capability has been extracted and integrated separately with the sensory read-outs of the robotic system. With this executable model of haptic discrimination, SENSOPAC showed that the neuroscience models impart substantial improvements to the haptic discrimination process.
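
    The structure of such a comparison can be sketched as follows: discriminate between a few objects from the robot's sensor read-outs alone, and then again with additional features supplied by a model stage. Everything in the snippet (the synthetic read-outs, the stand-in model_features function and the nearest-centroid classifier) is an assumption made for illustration; it mirrors the shape of such an evaluation, not the SENSOPAC models or their results.

        import numpy as np
        from numpy.random import default_rng

        rng = default_rng(0)

        def make_trial(object_id):
            """Synthetic read-out for one grasp-and-shake trial: identical static force,
            object-dependent oscillation while shaking (illustrative data only)."""
            t = np.linspace(0.0, 8 * np.pi, 200)
            return 0.5 + 0.1 * (object_id + 1) * np.sin(t) + 0.05 * rng.standard_normal(200)

        def model_features(readout):
            """Stand-in for model-derived dynamic features (e.g. prediction-error statistics)."""
            return np.array([readout.std(), np.abs(np.diff(readout)).mean()])

        def featurise(readout, use_model):
            raw = np.array([readout.mean()])          # raw read-out feature: static grip force
            return np.concatenate([raw, model_features(readout)]) if use_model else raw

        def accuracy(use_model, n_train=30, n_test=30):
            """Nearest-centroid discrimination accuracy over three object classes."""
            centroids = {}
            for obj in range(3):
                trials = [featurise(make_trial(obj), use_model) for _ in range(n_train)]
                centroids[obj] = np.mean(trials, axis=0)
            correct = 0
            for obj in range(3):
                for _ in range(n_test):
                    f = featurise(make_trial(obj), use_model)
                    guess = min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))
                    correct += int(guess == obj)
            return correct / (3 * n_test)

        print("read-outs only:     ", accuracy(use_model=False))
        print("with model features:", accuracy(use_model=True))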