The modular network SOM (mnSOM) proposed in this paper is a Self-Organizing Map (SOM) in function space, as opposed to the conventional SOM in vector space. Whereas each node of a conventional SOM represents a codebook vector, each unit of the mnSOM represents a function (i.e., an input-output relationship), which may be a dynamical one. In other words, every node of the competitive layer is replaced by a neural network module, which may be of a multilayer-perceptron or recurrent type. The performance of the mnSOM is examined through simulation examples: one dealing with geology-dependent meteorological changes in Japan, one involving musical scales, and one simulating a mass-spring-dashpot system. The results show that the functions acquired by the winner modules are mapped onto the 2D lattice with topological continuity, i.e., similar functions lie close to each other while dissimilar ones are allocated far apart. Moreover, "test functions", whose corresponding input-output data are not used during training, are mapped to "test winner modules" that appear at interpolated locations between the "training winner modules".
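The idea of replacing each SOM node with a trainable function module can be sketched as follows. This is a hypothetical minimal illustration, not the paper's implementation: a 1D lattice of linear modules (in place of MLPs), several datasets generated from functions y = a·x with different slopes a, winner selection by lowest prediction error, and neighborhood-weighted gradient updates. The grid size, learning rate, and neighborhood schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID = 8  # modules on a 1D lattice (assumed size; the paper uses a 2D lattice)
modules = rng.normal(size=(GRID, 2)) * 0.1  # each module = [slope, intercept]

# Training "systems": linear functions y = a*x with different slopes a.
slopes = [-2.0, -1.0, 0.0, 1.0, 2.0]
datasets = []
for a in slopes:
    x = rng.uniform(-1.0, 1.0, size=50)
    datasets.append((x, a * x))

def module_error(m, x, y):
    """Mean squared prediction error of one linear module on one dataset."""
    return float(np.mean((m[0] * x + m[1] - y) ** 2))

sigma0, tau, lr = 3.0, 40.0, 0.1  # assumed neighborhood/learning schedule
for epoch in range(200):
    sigma = max(0.5, sigma0 * np.exp(-epoch / tau))
    for x, y in datasets:
        # Competition: the module that best reproduces this dataset wins.
        errs = [module_error(m, x, y) for m in modules]
        winner = int(np.argmin(errs))
        # Cooperation: every module takes a gradient step toward fitting
        # this dataset, weighted by its lattice distance to the winner.
        for k in range(GRID):
            h = np.exp(-((k - winner) ** 2) / (2.0 * sigma ** 2))
            pred = modules[k, 0] * x + modules[k, 1]
            gw = 2.0 * np.mean((pred - y) * x)  # d(MSE)/d(slope)
            gb = 2.0 * np.mean(pred - y)        # d(MSE)/d(intercept)
            modules[k] -= lr * h * np.array([gw, gb])

# After training, winners for the ordered slopes should lie in lattice order,
# i.e. similar functions map to nearby modules (topological continuity).
winners = [int(np.argmin([module_error(m, x, y) for m in modules]))
           for x, y in datasets]
print(winners)
```

Because the datasets here are linear, winner selection effectively compares module slopes, so the run reduces to an ordinary 1D SOM over the slope values; the printed winner indices come out monotone along the lattice, mirroring the topological ordering reported in the paper's experiments.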