Entropy, information theory and biodiversity
New entropy-based tools for understanding ecological and genetic diversity
The study of biodiversity, whether in the context of conservation genetics or community ecology, involves two fundamental challenges:
- quantifying diversity at different organisational levels;
- predicting how diversity depends on environmental and genetic factors.
Our aim is to exploit entropy-based concepts from physics and information theory to address both of these challenges.
The hierarchical properties of Shannon entropy and the closely related mutual information make them robust, general measures of diversity within and between organisational levels, from genes to communities. We are using these measures to derive theoretical predictions of genetic diversity from models of population dynamics, predictions that can be tested experimentally and in the field.
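As a concrete illustration of how these quantities decompose across levels, here is a minimal Python sketch (NumPy only) that computes Shannon entropy within two populations and the mutual information between population and allele identity at a single SNP. The count table is hypothetical, chosen only to show the additive decomposition; it is not data from the studies listed under Further reading.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum(p * ln p) in nats, ignoring zero frequencies."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(counts):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint count table."""
    joint = np.asarray(counts, dtype=float)
    joint /= joint.sum()
    return (shannon_entropy(joint.sum(axis=1))    # H(populations)
            + shannon_entropy(joint.sum(axis=0))  # H(alleles, pooled)
            - shannon_entropy(joint.ravel()))     # H(populations, alleles)

# Hypothetical allele counts at one SNP: rows = populations, columns = alleles (A, a).
counts = np.array([[90, 10],
                   [40, 60]])

# Diversity within each population, and differentiation between populations.
within = [shannon_entropy(row / row.sum()) for row in counts]
between = mutual_information(counts)

# Hierarchical (additive) decomposition: pooled allele entropy
# = weighted mean within-population entropy + between-population mutual information.
weights = counts.sum(axis=1) / counts.sum()
pooled = shannon_entropy(counts.sum(axis=0) / counts.sum())
print(f"within-population entropies: {np.round(within, 4)}")
print(f"mutual information (between): {between:.4f} nats")
print(f"decomposition check: {np.dot(weights, within) + between:.4f} == {pooled:.4f}")
```

The decomposition check makes the hierarchical property explicit: total (pooled) diversity splits exactly into an average within-population component plus a between-population component, and the same bookkeeping extends to further levels such as regions or communities.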
We are also using the principle of Maximum Entropy (MaxEnt) – whose origins go back to Ludwig Boltzmann – to explain and predict the patterns of ecological species diversity from local to global scales. MaxEnt offers a statistical interpretation of these patterns as expressions of the community-level behaviour that can be realised in the greatest number of ways at the individual level under the prevailing environmental constraints. This approach includes neutral theory as a special case.
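To make the MaxEnt recipe concrete, the sketch below is a generic illustration rather than the specific models of the papers listed under Further reading: it finds the species-abundance distribution that maximises Shannon entropy subject to a single hypothetical constraint, a fixed mean abundance per species. The solution takes the Boltzmann-like exponential form p_n ∝ exp(-λn), and the code solves numerically for the Lagrange multiplier λ.

```python
import numpy as np
from scipy.optimize import brentq

# Maximising Shannon entropy over abundance classes n = 1..N subject to a fixed
# mean abundance gives the Boltzmann-like solution p_n ∝ exp(-lam * n).
n = np.arange(1, 101)     # hypothetical abundance classes (1..100 individuals per species)
mean_abundance = 12.0     # hypothetical community-level constraint (e.g. from survey data)

def maxent_distribution(lam):
    """Exponential (MaxEnt) abundance distribution for a given Lagrange multiplier."""
    w = np.exp(-lam * n)
    return w / w.sum()

def mean_gap(lam):
    """Difference between the distribution's mean abundance and the constraint."""
    return np.dot(n, maxent_distribution(lam)) - mean_abundance

# Find the Lagrange multiplier that satisfies the mean-abundance constraint.
lam = brentq(mean_gap, 1e-6, 5.0)
p = maxent_distribution(lam)

print(f"Lagrange multiplier lambda = {lam:.4f}")
print(f"mean abundance check: {np.dot(n, p):.2f}")
print("probabilities for abundances 1-5:", np.round(p[:5], 4))
```

Richer versions of this calculation simply add more constraints (energy use, area, trophic structure), each contributing its own Lagrange multiplier; the predicted abundance pattern is then the one realisable in the greatest number of ways consistent with those constraints.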
Further reading
- Dewar RC, Sherwin WB, Thomas E, Holleley CE, Nichols RA. 2011. Predictions of single-nucleotide polymorphism differentiation between two populations in terms of mutual information. Molecular Ecology 20, 3156-3166.
- Sherwin WB. 2010. Entropy and information approaches to genetic diversity and its expression: genomic geography. Entropy 12, 1765-1798.
- Dewar RC, Porté A. 2008. Statistical mechanics unifies different ecological patterns. Journal of Theoretical Biology 251, 389-403.