Recent advances in generative networks have shown that deep neural networks can produce data that closely resemble real-world samples. Implicit probabilistic models, which follow a stochastic procedure to generate data directly, have been introduced to overcome the intractability of the posterior distribution. However, modeling data requires a thorough understanding of their statistical dependence, which can be preserved and studied in appropriate latent spaces. In this article, we present a segmented generation process based on linear and nonlinear manipulations in a latent space of the same dimension as the space onto which the data are projected. Inspired by the classical stochastic method for generating correlated data, we develop a segmented approach for the generation of dependent data that exploits the concept of copula. The generation process is split into two frames: one embedding the covariance or copula information in the uniform probability space, and the other embedding the marginal distribution information in the sample domain. The proposed network structure, referred to as a segmented generative network (SGN), also provides an empirical method to sample directly from implicit copulas. To demonstrate its generality, we evaluate the presented approach in three application scenarios: a toy example, handwritten digit generation, and face image generation.
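The two-frame split described above mirrors the classical copula-based sampling recipe: first produce uniforms that carry only the dependence structure, then impose the marginals via inverse CDFs. The following is a minimal sketch of that recipe using a Gaussian copula; the correlation matrix and the exponential/beta marginals are illustrative choices, not taken from the paper, and the sketch uses a closed-form copula rather than the network-based (implicit) copulas the SGN learns.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Frame 1: embed the dependence in the uniform probability space.
# Draw correlated Gaussians and push them through the standard normal CDF,
# which yields uniforms coupled by a Gaussian copula (hypothetical corr = 0.7).
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
L = np.linalg.cholesky(corr)
z = rng.standard_normal((10_000, 2)) @ L.T   # correlated standard normals
u = stats.norm.cdf(z)                        # uniforms carrying the copula

# Frame 2: embed the marginal information in the sample domain
# by applying inverse marginal CDFs (illustrative marginals).
x0 = stats.expon.ppf(u[:, 0], scale=2.0)     # exponential marginal
x1 = stats.beta.ppf(u[:, 1], a=2.0, b=5.0)   # beta marginal
samples = np.column_stack([x0, x1])

# The rank dependence set in frame 1 survives the marginal transforms.
rho = stats.spearmanr(samples[:, 0], samples[:, 1])[0]
```

Because inverse CDFs are monotone, the Spearman rank correlation of the final samples matches that of the uniforms, which is the sense in which the dependence and marginal information can be handled in separate frames.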