In this paper, we present a neural classifier algorithm that locally approximates the decision surface of labeled data by a patchwork of separating hyperplanes arranged under topological constraints similar to those of self-organizing maps (SOMs). We exploit the fact that such a decision surface can often be represented by linear pieces connected along a low-dimensional nonlinear manifold, and this structure guides the placement of the separators. The resulting classifier admits a voting scheme that averages the classification results of neighboring hyperplanes. Our algorithm is computationally efficient in both training and classification. Further, we present a model selection method to estimate the topology of the classification boundary. We demonstrate the algorithm's usefulness on several artificial and real-world data sets and compare it to state-of-the-art supervised learning algorithms.
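To make the neighbor-voting idea concrete, the following is a minimal, hypothetical sketch rather than the authors' exact method: it assumes K hyperplanes (w_k, b_k) arranged on a one-dimensional chain (a 1-D SOM-like topology), selects the hyperplane nearest to a query point by perpendicular distance, and averages the signed outputs of that hyperplane and its chain neighbors. The winner-selection rule and neighborhood radius are illustrative assumptions.

```python
import numpy as np

# Illustrative setup (not trained): K random hyperplanes in d dimensions,
# indexed along a 1-D chain as in a one-dimensional SOM topology.
rng = np.random.default_rng(0)
K, d = 8, 2
W = rng.normal(size=(K, d))   # hyperplane normal vectors
b = rng.normal(size=K)        # hyperplane offsets

def classify(x, radius=1):
    """Label x in {-1, +1} by averaging the signed outputs of the
    hyperplane closest to x (perpendicular distance) and its chain
    neighbors within the given topological radius."""
    dist = np.abs(W @ x + b) / np.linalg.norm(W, axis=1)
    k = int(np.argmin(dist))                      # winning node on the chain
    lo, hi = max(0, k - radius), min(K, k + radius + 1)
    votes = np.sign(W[lo:hi] @ x + b[lo:hi])      # neighboring hyperplane outputs
    return 1 if votes.mean() >= 0 else -1

label = classify(np.array([0.5, -0.2]))
```

In a trained model, the SOM-like topological constraint would keep neighboring hyperplanes on the chain aligned along the decision boundary's manifold, so averaging their votes smooths the piecewise-linear approximation.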