Our red blood is full of iron. We need iron for growth and for immunity. It is even added to foodstuffs, such as cereals, to ensure that there is enough of this nutrient in the diet to prevent iron deficiency.
However, on a very different scale, during the development of life on planet Earth over billions of years, iron deficiency may have stimulated evolution. According to our new research, published in the Proceedings of the National Academy of Sciences (PNAS), rising and falling levels of iron on our planet may have enabled complex organisms to evolve from simpler forebears.
The terrestrial planets in our solar system – Mercury, Venus, Earth and Mars – have different amounts of iron in their rocky mantles, the layer below the outermost planetary crust. Mercury’s mantle has the least iron and Mars’ has the most. This variation is due partly to differences in distance from the Sun, and partly to the varying conditions under which the planets initially formed their metallic, iron-rich cores.
The amount of iron in the mantle regulates several planetary processes, including the retention of surface water. And without water, life as we know it cannot exist. Astronomical observations of other solar systems may enable estimates of a planet’s mantle iron, helping to narrow the search for planets capable of harbouring life.
As well as contributing to planetary habitability, iron is fundamental to the biochemistry that allows life to happen. Iron has a unique combination of properties, including the ability to form chemical bonds in multiple orientations and the relative ease with which it gains or loses an electron. As a result, iron mediates many biochemical processes in cells, especially by enabling catalysis – a process that speeds up chemical reactions. Metabolic processes that are vital to life, such as DNA synthesis and cellular energy generation, rely on iron.
In our work, we calculated the amount of iron in Earth’s seas over billions of years. We then considered the effect on evolution of enormous amounts of iron falling out of the seas.
Iron through the ages
The initial formative events, in which geochemistry evolved into biochemistry and thus life, took place more than 4 billion years ago. And there is a consensus that iron was a pivotal element in this process. The conditions of early Earth were very different to those of today. In particular, there was almost no oxygen in the atmosphere, which meant that iron was easily soluble in water as “ferrous iron” (Fe2+). The abundance of nutritious iron in the Earth’s early seas helped life to evolve. However, this “ferrous paradise” was not to last.
The Great Oxygenation Event, which occurred around 2.43 billion years ago, brought oxygen into the Earth’s atmosphere. This changed the surface of Earth and caused a profound loss of soluble iron from the upper ocean and surface waters of the planet. A second, more recent “oxygenation event”, the Neoproterozoic, occurred between 800 and 500 million years ago. This raised oxygen concentrations yet higher. As a consequence of these two events, oxygen combined with iron, and gigatons of oxidised, insoluble “ferric iron” (Fe3+) dropped out of ocean waters, becoming unavailable to most lifeforms.
Life had developed – and maintains – an inescapable dependency on iron. The loss of access to soluble iron had major consequences for the evolution of life on Earth. Behaviour that optimised the acquisition and use of iron would have had a clear selective advantage. We can still see this in genetic analysis of infections today: bacterial variants able to efficiently scavenge iron from their hosts do better than less able competitors over a few short generations.
A key weapon in this battle for iron was the “siderophore” – a small molecule produced by many bacteria that captures oxidised iron (Fe3+). Siderophores became spectacularly more useful after oxygenation, enabling organisms to assimilate iron from minerals containing oxidised iron. However, siderophores also helped steal iron from other organisms, including bacteria. This switch in focus, from acquiring iron from the environment to stealing it from other lifeforms, set up a new dynamic of competitive interaction between pathogens and their hosts. Thanks to this process, both parties continually evolved to attack and defend their iron resources. Over millions of years, this powerful competitive drive led to increasingly complex behaviour, resulting in more advanced organisms.
However, other strategies, besides theft, can help deal with the dependency on a sparse nutrient. One such example is symbiotic, cooperative relationships that share resources. Mitochondria are iron-rich, energy-generating machines that were originally bacteria but now reside in our cells. Multiple cells clumping together as complex organisms enables more efficient use of rare nutrients than is possible for single-celled organisms, such as bacteria. For example, humans recycle 25 times as much iron per day as we take in from our diet. From an iron-biased view, infection, symbiosis and multicellularity provided different but elegant means for lifeforms to counteract the limitation of iron. The need for iron may have shaped evolution – including life as we know it today.
Earth demonstrates the importance of being ironic. The combination of an early Earth with biologically accessible iron and the subsequent removal of that iron during surface oxidation provided unique environmental pressures that facilitated the evolution of complex life from simpler precursors.
These specific sets of conditions and changes over such long timescales are possibly uncommon on other planets. The likelihood of other advanced lifeforms being found in our cosmic neighbourhood may therefore be low. Yet looking at the iron abundance on other worlds could also help us find such rare worlds.