Thanks to the BBC for a link to this paper describing how Google’s algorithm for ranking web pages could determine which species are most critical for sustaining ecosystems.
The authors write in PLoS Computational Biology that their version of PageRank could identify which extinctions are most likely to lead to ecosystem collapse.
Species are embedded in complex networks of relationships. Some more so than others. In those cases, a single extinction can cascade into the loss of many other species.
Figuring this out in advance is supremely difficult. The number of possible extinction sequences in even a simple ecosystem exceeds the number of atoms in the universe. We can’t sort out that kind of complexity without quantum computers.
But maybe Google can. Researchers Stefano Allesina and Mercedes Pascual reversed the logic of the PageRank algorithm, which ranks a web page as important if important pages point to it. In the food-web version, a species, however humble, is important if it points to (that is, feeds) important species.
The researchers also built a cyclical element into the food web by including the detritus pool… you know, that to which all returns and that from which all arises.
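To make the idea concrete, here is a minimal sketch in Python using the networkx library and an invented handful of species. It is not the authors’ code, just the reversed-PageRank-plus-detritus idea in miniature.

```python
# A minimal sketch of the reversed-PageRank idea on a made-up food web.
# Species names and weights are invented for illustration only.
import networkx as nx

# Edges point from prey to predator -- the direction energy flows.
food_web = nx.DiGraph()
food_web.add_edges_from([
    ("algae", "zooplankton"),
    ("algae", "snail"),
    ("zooplankton", "minnow"),
    ("snail", "minnow"),
    ("minnow", "bass"),
    ("zooplankton", "bass"),
])

# Close the loop through a detritus pool: everything eventually returns
# to detritus, and detritus feeds the basal producers.
for species in list(food_web.nodes):
    food_web.add_edge(species, "detritus")
food_web.add_edge("detritus", "algae")

# Reversing the graph flips the logic of PageRank: instead of rewarding
# nodes that important nodes point TO, it rewards nodes that point TO
# important nodes -- i.e. the species propping up the rest of the web.
importance = nx.pagerank(food_web.reverse(), alpha=0.85)

for species, score in sorted(importance.items(), key=lambda kv: -kv[1]):
    print(f"{species:12s} {score:.3f}")
```

On a toy web like this, the basal, everything-depends-on-them species float to the top of the ranking, which is exactly the point.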
Allesina and Pascual then tested their method on published food webs, ranking species by the damage their removal would cause to the ecosystem. They also ran algorithms already used in computational biology against the same problem.
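Here is a rough sketch of how any such ranking can be scored: remove species in ranked order and count the secondary extinctions that cascade afterwards. The simple “no prey left” rule below is my stand-in for the paper’s co-extinction model, not the authors’ actual criterion.

```python
# Score a removal order by counting cascading secondary extinctions.
# A "secondary extinction" here means a consumer left with no prey --
# a deliberate simplification of published co-extinction models.
import networkx as nx

def secondary_extinctions(web: nx.DiGraph, removal_order, basal=("algae", "detritus")):
    web = web.copy()
    lost = 0
    for target in removal_order:
        if target not in web:
            continue
        web.remove_node(target)
        # Any non-basal species with no prey left goes extinct too,
        # which can cascade further up the web.
        cascading = True
        while cascading:
            doomed = [s for s in web if s not in basal and web.in_degree(s) == 0]
            cascading = bool(doomed)
            web.remove_nodes_from(doomed)
            lost += len(doomed)
    return lost

# e.g. score the reversed-PageRank order from the sketch above:
# secondary_extinctions(food_web, sorted(importance, key=importance.get, reverse=True))
```

Feeding the PageRank-style ordering into a scorer like this is, in spirit, the comparison the authors ran against the heavier algorithms.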
The results: PageRank gave them exactly the same solution as the more complicated algorithms.
In the real world, this research will likely make it easier to quickly target conservation efforts for maximum benefit.
Hope evolves in that muddy puddle where technology meets environmentalism.