It comes as no surprise that Bitcoin mining carries an environmental cost, but the scale of that cost has only recently come into focus. According to research by Elite Fixtures, when Bitcoin was at its peak of $69,000, mining consumed around 180 terawatt-hours (TWh) of electricity annually, roughly the annual electricity consumption of all the world's data centers combined, and the figure is only expected to grow in the coming years.
Bitcoin was launched in 2009 as open-source software by Satoshi Nakamoto, a pseudonym for its still-unidentified creator, and has no intrinsic value beyond its use as a means of payment. The cryptocurrency is never physically minted and is not controlled by a central bank. Instead, every transaction is verified by computers, known as miners, all over the world.
Bitcoin's value has risen drastically, but not without a price tag for the environment. With each bitcoin now worth nearly $24,000, almost 40 times its value just one year ago, there is growing scrutiny of how cryptocurrency mining affects power grids worldwide. New bitcoins are created through "mining": high-powered computers race to solve computationally expensive hash puzzles, and the first to find a valid solution earns the right to add the next block of transactions. This is what keeps the ledger secure, because manipulating the transaction history would require more computing power than the rest of the network combined.
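To make the "complicated math problem" concrete, here is a minimal proof-of-work sketch in Python. It is a toy illustration of the hash-puzzle idea only, not Bitcoin's actual protocol (which double-SHA-256-hashes an 80-byte block header and compares the result against a numeric target); the names `mine`, `block_header`, and `difficulty` are illustrative.

```python
import hashlib

def mine(block_header: str, difficulty: int) -> tuple[int, str]:
    """Search for a nonce such that SHA-256(header + nonce) starts
    with `difficulty` zero hex digits (a simplified stand-in for
    Bitcoin's real difficulty target)."""
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_header}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1  # try the next candidate; this brute force is the "work"

# Toy example with a low difficulty so it finishes in seconds.
nonce, digest = mine("block #1 | prev: 000000abc | txs: ...", difficulty=5)
print(nonce, digest)
```

Every extra zero of difficulty multiplies the expected number of hash attempts by 16, which is why real-world mining, at a difficulty calibrated for the entire global network, demands warehouses of specialized hardware and the electricity to run them.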
Miners consume a lot of energy to keep up with demand: the higher the price Bitcoin trades at, the more profitable mining becomes, and the more of it takes place. With prices hovering around $25,200, consumption runs at about 180 TWh of electricity annually. By comparison, that amount of energy could power 100 million American homes for a year, according to environmental analyst Alex de Vries. And that is only one cryptocurrency; there are many others, such as Ethereum, that use the same energy-inefficient proof-of-work method to validate transactions.
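The link between price and energy use can be sketched with a back-of-the-envelope revenue model in the spirit of estimates like de Vries's: miners will keep adding hardware until electricity bills eat up most of the block rewards. Every constant below (block reward, electricity rate, share of revenue spent on power) is an assumption chosen for illustration, so the output is an order-of-magnitude figure rather than the 180 TWh estimate cited above; the point is that the energy estimate scales linearly with the Bitcoin price.

```python
# Back-of-the-envelope: Bitcoin price -> annual electricity use.
# All constants are illustrative assumptions, not figures from the article.

BLOCKS_PER_YEAR = 6 * 24 * 365        # roughly one block every 10 minutes
BLOCK_REWARD_BTC = 6.25               # assumed block subsidy
BTC_PRICE_USD = 25_200                # price level mentioned above
ELECTRICITY_USD_PER_KWH = 0.05        # assumed average industrial rate
REVENUE_SHARE_ON_POWER = 0.60         # assumed share of revenue spent on electricity

annual_revenue_usd = BLOCKS_PER_YEAR * BLOCK_REWARD_BTC * BTC_PRICE_USD
annual_energy_twh = (annual_revenue_usd * REVENUE_SHARE_ON_POWER
                     / ELECTRICITY_USD_PER_KWH) / 1e9  # kWh -> TWh

print(f"~{annual_energy_twh:.0f} TWh per year")  # order-of-magnitude estimate
```

Double the price and the model's energy figure doubles too, which is the mechanism behind the observation that higher Bitcoin prices drag electricity consumption up with them.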