Life originated in an anoxic, Fe2+-rich environment. We hypothesize that on early Earth, Fe2+ was a ubiquitous cofactor for nucleic acids, with roles in RNA folding and catalysis as well as in processing of nucleic acids by protein enzymes. In this model, Mg2+ replaced Fe2+ as the primary cofactor for nucleic acids in parallel with known metal substitutions of metalloproteins, driven by the Great Oxidation Event (GOE). To test predictions of this model, we assay the ability of nucleic acid processing enzymes, including a DNA polymerase, an RNA polymerase and a DNA ligase, to use Fe2+ in place of Mg2+ as a cofactor during catalysis. Results show that Fe2+ can indeed substitute for Mg2+ in catalytic function of these enzymes. Additionally, we use calculations to unravel differences in energetics, structures and reactivities of relevant Mg2+ and Fe2+ complexes. Computation explains why Fe2+ can be a more potent cofactor than Mg2+ in a variety of folding and catalytic functions. We propose that the rise of O2 on Earth drove an Fe2+ to Mg2+ substitution in proteins and nucleic acids, a hypothesis consistent with a general model in which some modern biochemical systems retain latent abilities to revert to primordial Fe2+-based states when exposed to pre-GOE conditions.
© The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.