A computational approach to the co-development of artificial visual sensorimotor structures
Adaptive Behavior: Animals, Animats, Software Agents, Robots, Adaptive Systems
Published online July 29, 2013
Abstract
To pursue goal-directed behavior, an autonomous agent must acquire knowledge about the causal relationship between its motor actions and the resulting sensory feedback. Since the complexity of these sensorimotor relationships directly determines the cognitive resources required, this work argues that it is important to keep an agent's sensorimotor relationships simple. This implies that the agent should be designed so that the sensory consequences of its actions can be described and predicted in a simple manner. Living organisms implement this paradigm by adapting their sensory and motor systems to their specific behavior and environment. As a result, they can predict sensorimotor consequences with a very limited amount of (expensive) nervous tissue. In this context, the present work proposes that advantageous artificial sensory and motor layouts can be evolved by rewarding the ability to predict self-induced stimuli through simple sensorimotor relationships. Experiments consider a simulated agent recording realistic visual stimuli from natural images. The results demonstrate that the proposed method can (i) synthesize visual sensorimotor structures adapted to an agent's environment and behavior, and (ii) serve as a computational model for testing hypotheses about the development of biological visual sensorimotor systems.
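To make the core idea concrete, the sketch below is a minimal, illustrative interpretation of the evolutionary scheme described above, not the paper's actual implementation: candidate sensor layouts are scored by how well a simple linear sensorimotor model predicts the stimuli produced by the agent's own movements, and the most predictable layouts are retained and mutated. The 1-D "retina", the random image stand-in, the action set, and all parameter values are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 1-D "retina" of N_PIXELS sensors samples positions on an
# image strip; a genome encodes the sensor positions (the sensory layout).
N_PIXELS = 16
IMAGE = rng.random(512)          # stand-in for natural-image intensities
ACTIONS = [-3, -1, 1, 3]         # self-induced shifts of the sensor array

def sense(positions, offset):
    """Read image intensities at the (shifted) sensor positions."""
    idx = (positions + offset) % IMAGE.size
    return IMAGE[idx]

def fitness(positions):
    """Reward layouts whose self-induced stimuli are predictable by a simple
    (here: linear) sensorimotor model, i.e. negative prediction error."""
    error = 0.0
    for a in ACTIONS:
        before = np.stack([sense(positions, o) for o in range(50)])
        after = np.stack([sense(positions, o + a) for o in range(50)])
        # Least-squares linear predictor: after ≈ before @ W
        W, *_ = np.linalg.lstsq(before, after, rcond=None)
        error += np.mean((before @ W - after) ** 2)
    return -error

# Minimal (mu + lambda)-style evolutionary loop over sensor layouts.
population = [rng.integers(0, IMAGE.size, N_PIXELS) for _ in range(20)]
for generation in range(30):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:5]
    population = parents + [
        (p + rng.integers(-2, 3, N_PIXELS)) % IMAGE.size   # mutate positions
        for p in parents for _ in range(3)
    ]
print("best layout:", sorted(scored[0]))
```

Under these assumptions, the fitness function plays the role of the prediction-based reward mentioned in the abstract, while the mutation-and-selection loop stands in for the evolutionary synthesis of sensor layouts.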