The Earth is getting warmer at an alarming and accelerating pace. Skeptics may argue it's just a coincidence that the last seven years have been the warmest in our planet's recorded history. Scientific evidence suggests otherwise.
Humanity has already increased our planet's temperature by more than one degree Celsius by burning fossil fuels, heating the atmosphere enough to wreak environmental havoc around the globe. Man-made changes to the Earth's climate are already testing the limits of habitability for millions of people who now face increasingly untenable existences due to heat, drought, or flooding. The collective action we take over the next eight years to reduce our carbon emissions will be decisive in determining how much worse global warming becomes during the rest of this century.
The latest report by the United Nations' Intergovernmental Panel on Climate Change (IPCC) details the dire jeopardy the planet faces unless we enact drastic measures to mitigate climate change and limit global warming to around 1.5 degrees Celsius. Among its key takeaways: in many parts of the globe, our options for adaptation are significantly limited, even as conditions grow more critical. Can computer science help save the planet (and us)?
Computing plays a critical role in mitigating and adapting to climate change, and in promoting resilience. Scientists utilize all areas of computing, including data science and software engineering, to formulate sustainable solutions to combat global warming in the key areas of agriculture, energy, environmental justice, environmental modeling and forecasting, infrastructure, trade, and transportation.
Through computer modeling, simulation, and machine learning, computer scientists in partnership with environmental scientists employ an array of information technology tools at their disposal to help fight climate change, including devices and architectures (e.g., sensor systems for wildfire monitoring), algorithms (to predict climate change impacts and evaluate mitigation), robotics (e.g., use of autonomous drones for environmental monitoring, including groundwater quality), data management systems, energy-aware operating systems, and artificial intelligence (e.g., to identify new materials for use in manufacturing or plan trajectories for agricultural monitoring robots).
For more examples of how computer science can help save the environment, read on.
Computer models represent perhaps the most utilized method of forecasting global climate change. Since the 1980s, computer scientists have processed massive data sets using algorithms to forecast rising global temperatures. The results of these models advance climate research by giving environmental scientists a better understanding of how human activity affects the Earth's climate.
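At their simplest, such models relate greenhouse gas concentrations to temperature change. The sketch below is a toy zero-dimensional energy-balance calculation, using the standard logarithmic approximation for CO2 radiative forcing; the sensitivity value is an illustrative assumption, not a result from any model discussed here. Real climate models are vastly more sophisticated.

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Radiative forcing (W/m^2) from CO2, using the common
    logarithmic approximation F = 5.35 * ln(C/C0), where C0 is
    the preindustrial concentration."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity=0.8):
    """Equilibrium temperature change (deg C) for a given CO2 level.
    `sensitivity` (K per W/m^2) is an illustrative value; real models
    treat it as an uncertain parameter to be constrained."""
    return sensitivity * co2_forcing(c_ppm)

# Doubling CO2 from the preindustrial 280 ppm yields roughly 3 deg C
# of equilibrium warming under these assumed parameters.
warming = equilibrium_warming(560.0)
```

Even this caricature shows why emissions pathways matter: because forcing grows with the logarithm of concentration, every avoided doubling buys a fixed amount of warming avoided.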
Computer modeling also enables scientists to create various feasible scenarios that reflect the changes (and tradeoffs) we must make to support sustainability and reduce greenhouse emissions as they relate to how we grow food, manufacture goods, conduct international trade, transport people, generate green energy and transition to renewables and improve energy efficiency.
Keywan Riahi of the International Institute for Applied Systems Analysis (IIASA) reports that he and his colleagues have already modeled several viable tracks to achieving the 2015 Paris Agreement zero-carbon goals. The question remains whether we have the collective will to accept the changes to our way of living and doing business necessary to achieve this goal.
Because quantum computers use qubits, which can exist in superpositions of states, an n-qubit register encodes a state space that grows exponentially with n. This lets certain quantum algorithms explore many candidate solutions at once rather than through the sequential yes-or-no decisions of classical computers. As a result, quantum computers can in principle handle extremely complex problems that can stall even supercomputers.
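The exponential growth of that state space can be illustrated with a classical simulation. The sketch below (an illustration, not real quantum hardware) builds an equal superposition over n qubits as a NumPy vector; the number of amplitudes doubles with every qubit added, which is exactly why classical machines struggle to simulate large quantum systems.

```python
import numpy as np

def n_qubit_uniform_state(n):
    """Classically simulate an n-qubit register in an equal superposition.
    The state vector needs 2**n amplitudes -- the exponential growth
    behind quantum computing's capacity (and behind the difficulty of
    simulating it classically)."""
    dim = 2 ** n
    return np.full(dim, 1.0 / np.sqrt(dim))

# 10 qubits already require 1024 amplitudes; 50 qubits would require
# about 10**15, far beyond ordinary memory.
state = n_qubit_uniform_state(10)
```

The squared amplitudes sum to 1, as probabilities must; measuring this register would yield any of the 1,024 basis states with equal likelihood.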
While still in its infancy, quantum computing has the potential to find a host of interventions and solutions that could help us mitigate and better cope with climate change. According to a ZDNet article, quantum computing may enable us "to create more efficient batteries, better materials for solar cells or wind turbines, or even more absorbent catalysts for carbon-capture technologies"—or come up with a way to reduce the energy used to create fertilizer (which currently consumes around 2 percent of global energy).
According to the United Nations, cities (where more than half of the world's population resides) consume around 78 percent of the global energy supply and generate 60 percent of greenhouse gas emissions. In the United States alone, buildings consume 40 percent of the energy generated in the country. To help increase their cities' energy efficiency and inform policy, many municipalities like New York City now require energy benchmarking, a process through which buildings report their energy consumption so that the city can compare a building's actual energy use against a performance baseline.
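A benchmarking comparison can be reduced to a couple of simple calculations. The sketch below is a hypothetical illustration (the field names and baseline value are invented for the example, not drawn from New York City's actual reporting schema): compute a building's energy use intensity, then express it as a percentage above or below a performance baseline.

```python
def energy_use_intensity(annual_kwh, floor_area_sqft):
    """Energy Use Intensity (EUI): annual energy use per square foot."""
    return annual_kwh / floor_area_sqft

def benchmark(building_eui, baseline_eui):
    """Percent difference from the performance baseline; a positive
    result means the building uses more energy than the baseline."""
    return 100.0 * (building_eui - baseline_eui) / baseline_eui

# A 100,000 sqft building consuming 1.2 GWh/year has an EUI of
# 12 kWh/sqft/yr; against an assumed baseline of 10, it runs 20% over.
eui = energy_use_intensity(1_200_000, 100_000)
score = benchmark(eui, baseline_eui=10.0)
```

Reported year over year, a score like this lets a city see whether efficiency retrofits are actually moving a building toward its baseline.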
But how can cities make sense of all the energy consumption data they collect, and confirm which energy efficiency initiatives are effective and which areas need work? Data scientists at Stanford's Urban Informatics Laboratory (UIL) "analyze the relationships between high-dimensional building characteristics and building energy use" and then compare the target building with a local peer building as opposed to a national reference. The UIL data scientists then utilize the Sherlock cluster in the National Science Foundation (NSF)-funded Stanford Research Computing Center (SRCC) to run tens of thousands of simulations that reveal patterns indicating potential energy savings.
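The peer-comparison idea can be sketched in a few lines. The example below is a simplified stand-in for the UIL approach, not their actual method: represent each building as a vector of characteristics and pick the peer closest in that feature space, rather than comparing against a national average. (In practice the features would be standardized and the peer set filtered by locality.)

```python
import numpy as np

def nearest_peer_index(target, peers):
    """Index of the peer building closest to `target` in characteristic
    space (Euclidean distance over the feature vectors)."""
    peers = np.asarray(peers, dtype=float)
    dists = np.linalg.norm(peers - np.asarray(target, dtype=float), axis=1)
    return int(np.argmin(dists))

# Illustrative features: [floor area (10k sqft), age (decades),
# occupancy (100s of people)] -- invented values for the example.
target = [5.0, 3.0, 2.0]
peers = [[5.2, 2.8, 2.1],   # similar mid-size office: a good peer
         [20.0, 1.0, 8.0],  # large new tower: a poor peer
         [1.0, 9.0, 0.5]]   # small old building: a poor peer
peer_energy_kwh = [610_000, 2_400_000, 90_000]
best = nearest_peer_index(target, peers)
```

Comparing the target's energy use against `peer_energy_kwh[best]` then answers a far more meaningful question than a national baseline can: is this building efficient relative to buildings like it?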
Climate change represents an existential dilemma for humanity, so it has unsurprisingly driven numerous scientific studies on its causes and impacts. Sorting through and analyzing this massive amount of data is vital in determining gaps in our climate change knowledge and revealing patterns that were previously undetected. In 2021, climate change researcher Max Callaghan led a team that employed machine learning algorithms to process 100,000 scientific climate change studies to find out how many people in the world are currently adversely affected by the effects of global warming.
Their findings, published in Nature Climate Change as "Machine-learning-based evidence and attribution mapping of 100,000 climate impact studies," revealed an inequality in climate science: existing climate change studies were twice as likely to focus on richer countries in North America and Europe as on poorer nations in Africa and the Pacific.
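The evidence-mapping step can be caricatured as a text classifier over study abstracts. The toy sketch below uses simple keyword matching purely to illustrate the idea; the actual study trained machine learning models far more capable than this, and the categories and keywords here are invented for the example.

```python
# Toy sketch of mapping study abstracts to climate-impact categories.
# The real work used trained ML classifiers; this keyword lookup only
# illustrates the concept of evidence mapping at scale.
CATEGORIES = {
    "heat": ["heatwave", "temperature extreme", "heat stress"],
    "drought": ["drought", "water scarcity", "crop failure"],
    "flooding": ["flood", "sea level", "storm surge"],
}

def map_abstract(text):
    """Return the sorted impact categories whose keywords appear in the text."""
    text = text.lower()
    return sorted(cat for cat, words in CATEGORIES.items()
                  if any(w in text for w in words))

labels = map_abstract("Prolonged drought and heat stress reduced maize yields.")
```

Run over 100,000 abstracts and joined with each study's geographic focus, even a crude mapping like this would expose where the literature clusters, and where it is silent.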
Of course, this doesn't mean that climate change isn't impacting people in these countries (because reporting shows that it certainly is). Callaghan cautioned that "absence of evidence isn't evidence of absence." But the lack of scientific studies in poorer countries impacts the climate change policies that global leaders propose and implement in these areas. Callaghan hopes that the results of the research he and his colleagues performed will provide a roadmap by which policymakers direct additional climate change research and funding.