04 April, 2015

Corporations claim they are 'green', but the world is actually getting greener


Many corporations claim they are “green”, but research has shown that the world is actually getting greener.

The research, published in Nature Climate Change, is explored in an article on The Conversation entitled: “Despite decades of deforestation, the Earth is getting greener”.

The Nature Climate Change article shows that the world has actually become greener over the past decade. As The Conversation reports: “Despite ongoing deforestation in South America and Southeast Asia, we found that the decline in these regions has been offset by recovering forests outside the tropics, and new growth in the drier savannas and shrublands of Africa and Australia”.

The article points out that more plants may mean more absorption of carbon dioxide. “If so, this will slow but not stop climate change,” The Conversation notes.
