
Are AI and Crypto a sustainable future?

In current practice, Artificial Intelligence and Crypto Mining algorithms are very power hungry.

We just turned on two new reactors at Plant Vogtle, and one Plant Vogtle reactor (about 1.2 GW) can serve 1.2 million households OR about 6 data centers. As our power bills keep rising, and average humans look for ways to lower their energy consumption, will Crypto and AI gobble up the gains we are making? Fortunately, researchers are working on new, energy-efficient algorithms. On Oct 3, 2024, Vijay Madisetti, Fellow IEEE, AIIA, and Xiao Huang, Ph.D., talked about the current state of affairs and the art of the possible.

Key Takeaways from the event:

Web 3.0 is a convergence of the Internet of Things (IoT) (everything, including your toaster, connected to the web, and tokenized), Blockchain Technology (enabling decentralized data and value transfer), and Advanced Analytics (analyzing large datasets with AI).

Putting everything on the web and providing a secure method for data transfer allows AI to take advantage of millions of pieces of information. Additionally, tokens can represent anything (services and products). These are used in Blockchains to allow peer-to-peer transactions without the need for intermediaries (e.g., banks). This model promotes trust and allows for secure, tamper-proof transactions through hashing.

Current Issues

Power consumption: data centers and crypto mines are energy hogs.

Security vs. speed vs. consumption: Fully decentralized networks (e.g., Bitcoin) are slower but secure. More centralized blockchains like Solana trade some decentralization for higher transaction speeds.

Interoperability: Multiple blockchains developed by different companies (e.g., Walmart, JPMorgan) create challenges in integrating data. Solutions include layer 2 protocols and value token transfer protocols to enhance interoperability.

Public Policy Implications

Policymakers must balance innovation with environmental and economic sustainability, especially given the rising public awareness of energy-related issues. They should consider offering incentives for switching to sustainable algorithms and models, while penalizing energy-heavy models.

Future of AI, Blockchain, and Agents

Agents powered by AI will handle tasks like driving, stock analysis, and personal planning. This trend could lead to job displacement, as agents take over more functions.

Future AI models will focus on combining different approaches (e.g., general and specialized models) to achieve efficiency and accuracy.

FAQs

What is blockchain?

  • A secure method for validating a transaction using a “token” without the need to exchange sensitive information (such as a credit card number or SSN)
  • A series of “blocks” is created where each is dependent on the last – using math that makes it computationally infeasible to create a fraudulent block (see the sketch after this list)
  • Blockchain is best known for cryptocurrency, but it can be used to facilitate any sort of banking, electronic transaction, equipment rental, etc.
    • All cryptocurrencies are on blockchains, but not all blockchains are for cryptocurrencies.
  • Legalese
    • ‘Blockchain’ means data that are shared across a network to create a ledger of verified transactions or information among network participants linked using cryptography to maintain the integrity of the ledger and to execute other functions and distributed among network participants in an automated fashion to concurrently update network participants on the state of the ledger and any other functions.
    • ‘Blockchain protocol’ means any executable software deployed to a blockchain composed of source code that is publicly available and accessible, including a smart contract or any network of smart contracts.
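To make the chaining concrete, here is a minimal Python sketch; real blockchains store many transactions plus metadata in each block, but the one-string-per-block simplification below shows the core idea:

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    """Each block's hash mixes in the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# Build a three-block chain ("GENESIS" stands in for the first block's parent).
h1 = block_hash("GENESIS", "Alice pays Bob 5 tokens")
h2 = block_hash(h1, "Bob pays Carol 2 tokens")
h3 = block_hash(h2, "Carol pays Dan 1 token")

# Forging the first transaction changes h1, which changes h2 and h3 in turn,
# so anyone holding the original hashes spots the fraud immediately.
forged = block_hash("GENESIS", "Alice pays Bob 500 tokens")
print(forged == h1)  # False
```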

Like any technology, blockchain has changed over the years. Each new “jump” in technological capabilities is termed a “generation.”

– Blockchain Generations:

  1. First Generation – Focused on cryptocurrency like Bitcoin.
  2. Second Generation – Introduced programmable blockchains (e.g., Ethereum).
  3. Third Generation – Scalable for millions of users with better performance.

– Tokens can represent anything (services, products) and allow peer-to-peer transactions without the need for intermediaries (e.g., banks).

– This model promotes trust and allows for secure, tamper-proof transactions through hashing.

Validation methods

  • Old school – proof of work – new blocks on the chain are created and validated by solving complex cryptographic puzzles. Miners compete to create new blocks, and the first to solve the puzzle wins. Winning gets you paid – in the case of Bitcoin, you are paid in Bitcoin. (See the sketch after this list.)
  • New school – proof of stake – people become part of a pool of validators by “buying into” it (like a poker “ante”) – new blocks on the chain are created and validated by a randomly selected validator, and validators who approve fraudulent blocks forfeit their stake.
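Here is the sketch mentioned above: a minimal proof-of-work loop in Python. The difficulty of five leading hex zeros is an illustrative assumption; Bitcoin's actual difficulty is vastly higher, which is exactly why mining consumes so much electricity.

```python
import hashlib

def mine(block_data: str, difficulty: int = 5) -> int:
    """Brute-force a nonce whose SHA-256 hash starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # found it – this search is the "work"
        nonce += 1        # every failed guess is spent electricity

nonce = mine("Alice pays Bob 5 tokens")
print(nonce)  # verifying the winner takes one hash; finding it took about a million
```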

– Decentralization of a network means that anyone, anywhere, can contribute to the blockchain.

– Performance is a measure of how quickly transactions can be validated and added to the chain.

– Fully decentralized networks (e.g., Bitcoin) are slower and more energy consumptive, but are effectively ‘unhackable’ and thus can be universally accessible.

– More centralized blockchains like Solana have higher transaction speeds and lower energy use, but limit accessibility due to lessened security.

– Blockchains developed by different companies (e.g., Walmart, JPMorgan) are often centralized, but this creates challenges in integrating data.

– A compromise between energy, security, performance, and centralization would be decentralized, local networks operating quickly and energy-efficiently, which then perform interoperability checks via something akin to Bitcoin.

What is cryptocurrency?

  • A currency that relies on blockchain technology and exists outside of traditional governmental authority / regulation / backing
    • The dollar is backed by the US Government. Bitcoin is backed by math.
  • Georgia Code § 7-1-680 (2023) – Definitions (Justia)
    • (30) “Virtual currency” means a digital representation of monetary value that does not have legal tender status as recognized by the United States government. Such term shall not include the software or protocols governing the transfer of the digital representation of monetary value; units of value that are issued in an affinity or rewards program and that cannot be redeemed for money or virtual currencies; or an affinity or rewards program tied to a credit, debit, or prepaid card that awards points for purchases or other transactions, which points can be redeemed for dollar denominated credit to a checking, credit card, or prepaid account, or for a gift card, goods, travel, or services.
What is a crypto mine?

  • A large computing center where Bitcoins are mined by performing proof-of-work validations.
  • Legalese
    • ‘Virtual currency mining’ means the use of electricity to power a computer for the purpose of securing a blockchain protocol.
    • ‘Virtual currency mining business’ means a business that uses a group of computers working at a single site that consume more than 1 megawatt of energy for the purpose of generating virtual currency by securing a blockchain protocol.


Pros: libertarian currency free from government interference and oversight.

Cons: energy intensive, water intensive, obtrusive noise due to cooling fans, and a Ponzi-scheme style of wealth accumulation.

What is a data center?

  • A large computing center that enables cloud storage and cloud computing: offsite backups, online applications (Microsoft Teams, Google Docs/Sheets, QuickBooks, Fortnite, ChatGPT), and token transactions that are not based on crypto mining.
  • Georgia Code § 48-8-3 (2023) – [Effective Until 1/1/2025] Exemptions (Justia)
    • (ii)“High-technology data center” means a facility, campus of facilities, or array of interconnected facilities in this state that is developed to power, cool, secure, and connect its own equipment or the computer equipment of high-technology data center customers and that has an investment budget plan which meets the high-technology data center minimum investment threshold.

Pros: Distributed computing. Offsite storage protects from individual data loss. Innovative new companies have reduced start-up capital costs.

Cons: energy intensive, water intensive, deepfake videos.

What is the difference between storage and compute?

  • Storage is a series of hard disk drives that hold your digital information.
  • Compute is the CPU, where the computer does work such as modeling, Excel calculations, editing a photo, making a video.
  • Storage is relatively cheap and not energy taxing.
  • Computing is expensive and energy intensive.
What is a foundational model?

  • ChatGPT is part of an evolution in modeling – from training models to do specific tasks – to training models that are generic and then asking them to do anything. These new models are called foundational models. The ‘generic’ nature of these models means that you can just ask a question and get an answer.
  • For example, say you wanted to use a picture of flooding to measure the depth of the water (see the sketch after this list):
    • Before – you would have to train a model to pick out the people, and tell it how to use the people to estimate water depth. The model would have to be retrained if you wanted to use cars, or trees, or signposts, or anything else to estimate flood depth.
    • Now – you can just ask the foundational model – tell me the water height – and it can utilize the person, or tree, or signposts without needing three different training mechanisms.
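A hedged sketch of that difference in Python: the reference heights and the `depth_from_any` helper below are invented for illustration, and a real foundational model would answer a natural-language prompt rather than consult a lookup table.

```python
# Assumed average heights of objects visible in a flood photo, in meters.
REFERENCE_HEIGHTS_M = {"person": 1.7, "car": 1.5, "signpost": 2.5}

# Before: one purpose-built estimator per object class, each needing its own training.
def depth_from_person(visible_fraction: float) -> float:
    return REFERENCE_HEIGHTS_M["person"] * (1 - visible_fraction)

def depth_from_car(visible_fraction: float) -> float:
    return REFERENCE_HEIGHTS_M["car"] * (1 - visible_fraction)
# ...and yet another retrained pipeline for trees, signposts, and so on.

# Now: one generic entry point for any known object – a stand-in for asking a
# foundational model "how deep is the water in this picture?"
def depth_from_any(obj: str, visible_fraction: float) -> float:
    return REFERENCE_HEIGHTS_M[obj] * (1 - visible_fraction)

print(depth_from_any("signpost", 0.4))  # 1.5 m of water
```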

Can AI and crypto become more energy efficient?

YES!

Computer algorithms and hardware are always adapting and evolving.

20 years ago, most code did not account for cybersecurity (because being online was new and no one thought of security before we were all connected), and now it does.

A similar paradigm shift can occur in which developers program with compute efficiency in mind.

Computer hardware and chip design will also evolve to handle common computations in an efficient manner.

As with anything, the default is the quickest and cheapest way to do things. Right now, compute cycles are inexpensive, so there is no pressure to innovate more efficient methodologies.
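As a small illustration of that default, here is the same calculation written the quick way and the efficient way in Python; NumPy stands in for any optimized library, and the exact speedup varies by machine.

```python
import numpy as np

data = np.arange(10_000_000, dtype=np.float64)

# The quick-to-write version: a Python-level loop over ten million numbers.
def sum_of_squares_slow(xs) -> float:
    total = 0.0
    for x in xs:
        total += x * x
    return total

# The efficient version: the same answer (up to floating-point rounding),
# pushed down into optimized native code – typically orders of magnitude
# fewer CPU cycles, and therefore less energy.
def sum_of_squares_fast(xs) -> float:
    return float(np.dot(xs, xs))
```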

Watch the full event

…That is a lot of learning and motion. Learning is a higher-order, higher-level process. We have adaptive practice, and you can see some very sophisticated models, training methodologies, and causal analysis. It’s a very popular model used in various applications, particularly for efficiency. And this goes back to 1958, when people started to shift toward models of interconnected systems – the precursors of modern neural networks.

Oh, can we make some adjustments? It’s not as clear as it was earlier. I’m trying different settings…yes, I know. Oh, that’s…that’s much better. Yes, all right. So, starting from 2013, we have seen rapid growth in artificial intelligence (AI) models. For example, Google’s Transformer model architecture was introduced in 2017 and became very popular, and BERT followed in 2018.


The first version of GPT was actually based on this Transformer architecture. Then, starting from 2020, we saw the emergence of what is now referred to as LLMs, or large language models. GPT-3 was released in 2020 and marked a significant evolution in AI capability. Currently, we’re using GPT-4, released in 2023, which is already considered dated as newer versions are developed, focusing more on automated data processing and optimization.


Later versions, such as GPT-4.5 and an eventual GPT-5, are expected to be upgrades in efficiency and are under development at OpenAI.


Yes, I just wanted to ask, is the Transformer more like a neural network? Yes, it is, but it has evolved to handle increasingly complex data structures. For instance, the model now uses a hierarchical approach, which allows it to generalize to various classes.


In the initial GPT model, for example, we saw about 117 million parameters. Now, with GPT-3, we’re at 175 billion parameters, a roughly thousand-fold increase, and with the latest developments, it’s only going to grow. The challenge is managing the energy consumption needed to train such large models. Training GPT-3, for instance, consumed about 1,287 megawatt-hours of electricity, equating to approximately 550 metric tons of CO2 emissions, which has significant environmental implications.
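As a quick sanity check on those figures: the grid emission factor below is an assumed U.S. average of roughly 0.43 kg of CO2 per kWh, and actual training emissions depend heavily on the local energy mix.

```python
TRAINING_MWH = 1287         # reported energy to train GPT-3
KG_CO2_PER_KWH = 0.429      # assumed average U.S. grid emission factor

kwh = TRAINING_MWH * 1000                 # 1,287,000 kWh
tons_co2 = kwh * KG_CO2_PER_KWH / 1000    # kilograms -> metric tons
print(f"~{tons_co2:.0f} metric tons of CO2")  # ~552
```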


Microsoft and Amazon are also developing their own foundation models to support efficient data processing. In 2014, data centers in the U.S. alone consumed around 70 billion kilowatt-hours of electricity, accounting for roughly 1.8% of the nation’s electricity use. This number is expected to double, which raises concerns about sustainability.


Another type of model, the Geo-Foundation model, is geared toward tasks related to spatial data, such as geographic and satellite imagery analysis. Since the release of GPT models in 2023, we’ve seen Geo-Foundation models trained specifically for tasks like counting objects in images or estimating distances. These are much more specialized but equally capable models.


To give you an example, back in the 1960s, during the Cold War, the U.S. used aerial photography to monitor Soviet missile sites, which required manual image interpretation. Today, we have thousands of satellites providing real-time data, but we still face challenges in processing this vast amount of information effectively. Geo-Foundation models help automate and improve the accuracy of spatial analysis tasks.


In terms of energy efficiency, a lot of research is focusing on model compression and optimization. Methods like distillation and pruning are becoming popular to reduce model size and energy consumption. Looking forward, I expect that AI models will increasingly use energy-efficient hardware and renewable energy sources to mitigate their environmental impact.
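A minimal sketch of one such method, magnitude pruning, in Python; the 90% sparsity target and the random stand-in “layer” are illustrative choices.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.9) -> np.ndarray:
    """Zero out the smallest-magnitude weights, keeping the largest (1 - sparsity)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
layer = rng.normal(size=(512, 512))   # stand-in for one trained weight matrix
pruned = magnitude_prune(layer)
print(f"{np.mean(pruned == 0):.0%} of weights zeroed")  # ~90%
```

The zeroed weights can then be stored and multiplied sparsely, cutting memory and energy per inference.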


Thank you very much.


All right, any questions?


Yes, is it just satellite images, or can it work with aerial footage as well? Most Geo-Foundation models are designed to work with satellite imagery, but they can be adapted for other sources.


Yes, do you foresee environmental regulations impacting this development? Given that models are only getting larger, I anticipate some future regulations around the energy consumption of large-scale models.


I like your question about linear growth. The scale of these models doesn’t increase linearly—it’s exponential. While hardware improvements are relatively linear, the models themselves are growing at a much faster rate, which could lead to a significant imbalance.


Thank you, great question.


Okay, moving on. Our next speaker is Professor Vijay Madisetti, a cybersecurity and privacy expert from Georgia Tech, who will discuss blockchain and AI.


Thank you for the introduction. How many of you understand the basics of blockchain? I’ll give a quick overview for those who might not be as familiar. Blockchain is part of Web 3.0, which integrates IoT, blockchain, and advanced analytics. Web 3.0, unlike today’s Web 2.0, enables large-scale decentralized data processing.


Initially, blockchain was simply a distributed database. However, as it evolved, it became more than that, supporting complex applications beyond cryptocurrency. With blockchain, we can exchange value directly without intermediaries like banks or governments, creating a trusted, immutable database.


Tokenization is a key feature of blockchain, allowing for frictionless exchanges of value. Unlike traditional models, where transactions are managed by banks, blockchain offers a decentralized approach.


Blockchain transactions are organized into chains of blocks, with each block containing a “hash” of transactions from a given time period. This makes it almost impossible to alter a block without changing all subsequent blocks, providing security and trust in the system.
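To illustrate that “almost impossible to alter” property, this short Python sketch recomputes a simplified chain’s hashes and reports the first block that no longer matches; real blocks carry many transactions and metadata that are omitted here.

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def first_invalid_block(chain):
    """chain holds (data, stored_hash) pairs; returns index of first mismatch."""
    prev = "GENESIS"
    for i, (data, stored) in enumerate(chain):
        if block_hash(prev, data) != stored:
            return i  # altering block i breaks block i and every block after it
        prev = stored
    return None  # chain is intact

# Build a two-block chain, then tamper with the first block's data.
chain, prev = [], "GENESIS"
for d in ["Alice pays Bob 5", "Bob pays Carol 2"]:
    prev = block_hash(prev, d)
    chain.append((d, prev))

chain[0] = ("Alice pays Bob 500", chain[0][1])
print(first_invalid_block(chain))  # 0
```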


Most cryptocurrencies rely on proof-of-work (PoW) consensus, which is computationally intensive and consumes significant energy. Due to this, new consensus mechanisms like proof-of-stake (PoS) and proof-of-authority (PoA) are gaining traction. These alternatives are less energy-demanding and are already being implemented by second- and third-generation blockchain platforms.


Proof-of-stake, for instance, relies on stakeholders to validate transactions, reducing the need for energy-intensive mining operations. Bitcoin, however, still uses PoW, and many data centers in Georgia and North Carolina support Bitcoin mining, which raises energy consumption concerns.
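A minimal sketch of that idea in Python; the stake amounts are invented, and the standard random module stands in for the verifiable randomness real protocols use.

```python
import random

stakes = {"alice": 32.0, "bob": 8.0, "carol": 60.0}  # illustrative stakes

def pick_validator(stakes: dict) -> str:
    """Choose the next block's validator with probability proportional to stake."""
    names = list(stakes)
    return random.choices(names, weights=[stakes[n] for n in names], k=1)[0]

# One cheap random draw replaces the millions of hash attempts proof-of-work needs.
print(pick_validator(stakes))
```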


Georgia Power and other utilities have prioritized data center growth, but the environmental impact of such decisions is significant. While options like nuclear and geothermal energy are being considered, the infrastructure shift is slow and expensive. For example, nuclear plants take years to construct and are extremely costly.


In Georgia, crypto mines and data centers are often given energy discounts, but the general public ultimately bears the cost. This is why we need to consider both energy efficiency and regulatory frameworks moving forward.


So, what can we do about the energy-intensive nature of crypto mining? We could potentially incentivize more sustainable practices, such as charging higher energy rates for PoW-based operations compared to PoS. But there needs to be pressure on companies to adopt more efficient models.


There’s also been misinformation about electric vehicle (EV) grids using coal, but grids across the country are diversifying their energy sources and getting cleaner every day. It’s important to be mindful of energy sources and their environmental impact.


Finally, let’s discuss how AI and blockchain technologies, like large language models and agents, might continue evolving. Agents powered by LLMs can handle increasingly complex tasks, coordinating information and decisions on behalf of users, potentially transforming entire industries. These developments, however, raise ethical concerns as automation could displace jobs traditionally done by humans.


Thank you for listening, and I’d be happy to answer any additional questions.


Science for Georgia is a 501(c)(3). We work to build a bridge between scientists and the public and advocate for the responsible use of science in public policy.
