We develop a rigorous notion of model for understanding state transition systems via hierarchical coordinate systems. Using this, we motivate an algebraic definition of the complexity of biological systems, comparing it to other candidates such as genome size and number of cell types. We show that our complexity measure is the unique maximal complexity measure satisfying a natural set of axioms. This reveals a strong relationship between hierarchical complexity in biological systems and the area of algebra known as global semigroup theory. We then study the rate at which hierarchical complexity can evolve in biological systems, assuming evolution is "as slow as possible" from the perspective of the computational power of organisms. Explicit bounds on the evolution of complexity are derived, showing that, although the evolutionary changes in hierarchical complexity are bounded, in some circumstances complexity may more than double in certain "genius jumps" of evolution. In fact, examples show that our bounds are sharp. We sketch the structure where such complexity jumps are known to occur and note some similarities to previously identified mechanisms in biological evolutionary transitions. We also address the question of how fast complexity can evolve over longer periods of time. Although complexity may more than double in a single generation, we prove that in a smooth sequence of t "inclusion" steps, complexity may grow at most from N to (N + 1)t + N, a linear function of the number t of generations, while for sequences of "mapping" steps it increases by at most t. Thus, despite the fact that there are major transitions in which complexity jumps are possible, over longer periods of time the growth of complexity may be broken into maximal intervals on which it is bounded above in the manner described.
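The two growth bounds stated in the abstract can be made concrete with a small sketch. The function names below are illustrative, not from the paper; the formulas are exactly those quoted above. Note that a single "inclusion" step (t = 1) already permits growth from N to 2N + 1, i.e. more than doubling, which is the "genius jump" phenomenon.

```python
# Illustrative sketch of the stated growth bounds (function names are ours).

def inclusion_bound(N: int, t: int) -> int:
    """Upper bound on complexity after a smooth sequence of t "inclusion"
    steps, starting from complexity N: (N + 1)t + N."""
    return (N + 1) * t + N

def mapping_bound(N: int, t: int) -> int:
    """Upper bound after t "mapping" steps: complexity grows by at most t."""
    return N + t

# One inclusion step allows complexity to reach 2N + 1 -- more than double.
assert inclusion_bound(5, 1) == 11  # 2*5 + 1
```

Both bounds are linear in t, which is why, despite single-generation jumps, complexity growth over long stretches of time is at most linear on each maximal interval.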