A numerical method is developed for solving a nonstandard singular system of second-order differential equations arising from a problem in population genetics, concerning the coalescent process for a sample from a population undergoing selection. The nonstandard feature of the system is that some terms in the equations diverge as the boundary is approached. The numerical recipe is patterned after the LU decomposition for tridiagonal matrices. Although there is no analytic proof that the method converges to the correct solution, several examples are presented that suggest it does. The method allows one to calculate the expected number of segregating sites in a random sample of n genes from a population whose evolution is described by a model that is not selectively neutral.
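The abstract does not specify the recipe itself, only that it is patterned after the LU decomposition for tridiagonal matrices. For orientation, the following is a sketch of that classical building block, the Thomas algorithm (LU-style forward elimination followed by back substitution); the function name and array layout are illustrative, not taken from the paper.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal linear system.

    a : sub-diagonal   (length n-1)
    b : main diagonal  (length n)
    c : super-diagonal (length n-1)
    d : right-hand side (length n)

    Classical Thomas algorithm: an O(n) specialization of LU
    decomposition to tridiagonal matrices.
    """
    n = len(b)
    cp = np.empty(n - 1)  # modified super-diagonal after elimination
    dp = np.empty(n)      # modified right-hand side after elimination

    # Forward elimination (the "LU" sweep).
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / denom

    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Example: the symmetric system [[4,1,0],[1,4,1],[0,1,4]] x = [5,6,5]
# has the exact solution x = [1, 1, 1].
x = thomas_solve(np.array([1.0, 1.0]), np.array([4.0, 4.0, 4.0]),
                 np.array([1.0, 1.0]), np.array([5.0, 6.0, 5.0]))
```

The stability of this elimination depends on properties such as diagonal dominance; the singular, boundary-divergent coefficients described in the abstract are precisely what make the paper's adaptation nonstandard.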