AI advancements have received global attention, as AI tools like OpenAI’s ChatGPT are being used by more and more people every day. ChatGPT is an artificial-intelligence-based chatbot built on OpenAI’s large language model GPT-3, one of the largest neural networks ever trained at the time of its release, with roughly 175 billion parameters. ChatGPT is used mainly as a conversational tool that generates responses to a user’s queries, but its capabilities have also been applied to generating solutions to complex computer programming problems. To train GPT-3, OpenAI developed a comprehensive training method that follows a multi-step process. Here is a brief description of these steps:
Pre-Training: GPT-3 is first pre-trained on a massive dataset drawn from a wide range of Internet sources. This stage uses a technique commonly described as unsupervised (more precisely, self-supervised) learning: the training data carries no hand-written labels. Instead, the model learns the structures and patterns of raw text by repeatedly predicting the next token (a minimal sketch of how such training pairs are built appears after this overview).
Fine-Tuning: After pre-training, the model goes through a fine-tuning process in which it is further trained on a much smaller, carefully curated dataset; for ChatGPT this included example prompts paired with human-written responses, along with human feedback on model outputs. This fine-tuning process helps ensure that the model’s responses are as helpful and precise as possible (see the second sketch below).
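To make the pre-training step concrete, here is a minimal Python sketch of how raw text can be turned into self-supervised training pairs. The toy corpus, whitespace tokenizer, and context size are illustrative assumptions; production systems use subword tokenizers and billions of documents, but the key idea is the same: the "labels" come from the text itself.

```python
# Minimal sketch of self-supervised pre-training data construction.
# The corpus, whitespace tokenizer, and context size are toy assumptions;
# the point is that the "labels" come from the raw text itself.

raw_corpus = [
    "the model learns patterns and structure from raw text",
    "raw text provides its own training signal with no human labels",
]

def build_examples(text, context_size=3):
    """Turn one document into (context, next-token) training pairs."""
    tokens = text.split()
    pairs = []
    for i in range(context_size, len(tokens)):
        context = tokens[i - context_size:i]  # what the model sees
        target = tokens[i]                    # what it learns to predict
        pairs.append((context, target))
    return pairs

dataset = [pair for doc in raw_corpus for pair in build_examples(doc)]

for context, target in dataset[:3]:
    print(context, "->", target)
```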
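For the fine-tuning step, here is a similarly hedged sketch of how a small, curated set of prompt/response pairs might be formatted into training sequences. The example pairs and the "User:/Assistant:" template are hypothetical, not OpenAI’s actual data or format; they only illustrate that fine-tuning reuses the same next-token objective on a narrower, higher-quality dataset.

```python
# Minimal sketch of supervised fine-tuning data preparation.
# The prompt/response pairs and the "User:/Assistant:" template are
# hypothetical; real fine-tuning data is curated by the model developer.

curated_pairs = [
    ("Explain what a GPU does.",
     "A GPU performs many arithmetic operations in parallel, which makes "
     "it well suited to training neural networks."),
    ("What is fine-tuning?",
     "Fine-tuning continues training a pre-trained model on a smaller, "
     "carefully curated dataset."),
]

def format_example(prompt, response):
    """Join a prompt and its desired response into one training sequence.

    The model is then trained with the same next-token objective as in
    pre-training, but only on these demonstrations of desired behavior.
    """
    return f"User: {prompt}\nAssistant: {response}"

fine_tuning_corpus = [format_example(p, r) for p, r in curated_pairs]
print(fine_tuning_corpus[0])
```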
The release of ChatGPT has also raised environmental concerns among the public, as many sources have reported that the chatbot consumes vast resources to keep running. ChatGPT is notoriously compute-intensive, requiring significant processing power from both CPUs (Central Processing Units) and, above all, GPUs (Graphics Processing Units). With ChatGPT growing in popularity and serving over 100 million users as of June 2023, it depends on advanced supercomputer infrastructure. Training the GPT-3 model likewise required many powerful servers and GPUs to sustain the enormous computational workload, and powering all of this infrastructure takes considerable energy. In addition, large quantities of water, an estimated 700,000 liters for the training of GPT-3 alone according to one academic study, are used for cooling, as the temperature of these supercomputers and servers must be kept under constant control. To address the public’s concerns, OpenAI has been working to optimize ChatGPT’s power consumption and resource usage to make it more environmentally sustainable.
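To give a sense of the computational scale described above, the following back-of-envelope sketch applies the common approximation that training compute is roughly 6 × parameters × training tokens. The parameter and token counts are the widely reported figures for GPT-3; the per-GPU throughput and power draw are illustrative assumptions, not details of OpenAI’s actual infrastructure, and real-world totals are higher once utilization, networking, and cooling overheads are included.

```python
# Rough estimate of GPT-3's training compute using FLOPs ~= 6 * N * D.
# Parameter and token counts are the commonly reported figures; the GPU
# throughput and power numbers below are assumptions for illustration only.

params = 175e9            # ~175 billion parameters (reported)
tokens = 300e9            # ~300 billion training tokens (reported)

total_flops = 6 * params * tokens              # ~3.15e23 FLOPs

gpu_flops_per_sec = 100e12    # assumed sustained throughput: 100 TFLOP/s
gpu_power_watts = 400         # assumed power draw per GPU: 400 W

gpu_seconds = total_flops / gpu_flops_per_sec
gpu_hours = gpu_seconds / 3600
energy_kwh = gpu_hours * gpu_power_watts / 1000

print(f"Estimated training compute: {total_flops:.2e} FLOPs")
print(f"GPU-hours at the assumed throughput: {gpu_hours:.2e}")
print(f"Energy at the assumed power draw: {energy_kwh:.2e} kWh")
```

Even under these optimistic assumptions, the result is hundreds of thousands of GPU-hours and hundreds of megawatt-hours of electricity, which is why the supporting infrastructure and its cooling demands have drawn so much attention.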
In this era of rapid innovation, the adoption of AI technology has grown exponentially. However, as we embrace these new technologies, we must continue to voice our concerns about their costs in the hope of making them more sustainable.