How DePINs Address AI's GPU Gap and Ethics Problems

With the explosion of generative artificial intelligence projects, computational power has become a hotly contested resource. As AI grows more ubiquitous and the race for graphics processing unit (GPU) supplies intensifies, wider and more democratized access to computing power has become an urgent priority for non-MAANG companies. Combine this red-hot demand with a scarcity that is quickly hardening into resource exclusivity, and the likely result is an ugly one: an AI ecosystem molded by a small handful of massive tech corporations.

Mark Rydon is the Co-Founder and Head of Strategy at Aethir, a decentralized enterprise-grade cloud computing network. This op-ed is part of CoinDesk's new DePIN Vertical, covering the emerging industry of decentralized physical infrastructure.

If we are to avoid this outcome, the future of AI, and its ethical trajectory, hinges on distributing these resources widely rather than allowing a handful of corporations to monopolize them.

Addressing the Supply Side of Compute Demands

As the demand for computing surges, the current infrastructure struggles to keep pace. As reported in the Washington Post, several states are running short of power. Northern Virginia, for example, needs the equivalent of several large nuclear power plants to serve all the new data centers planned and under construction.

Additionally, the escalating costs of model training raise critical questions about the future of AI development: Where will this necessary computing power come from? China recently announced that it aims to boost its computing capacity by 50% in the next decade and a half, but this avenue won't be available to all.

One way to address this is through a decentralized model.

Decentralized Physical Infrastructure Networks (DePINs) can be used to aggregate underutilized enterprise GPUs and put them to use, redistributing previously inaccessible supply back into the market. They can also help leverage the latent compute capacity in consumer devices, creating a vast, accessible network of GPUs that can be utilized for AI training and other compute-intensive tasks. These approaches democratize the supply and access to computational resources, challenging traditional GPU monopolies and fostering innovation.
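To make that mechanism a bit more concrete, here is a minimal, hypothetical Python sketch of the pooling idea: idle GPUs from many providers are registered into a shared pool and then matched to a training job's requirements. It is an illustration only, not Aethir's or any specific network's implementation; every name in it (GpuNode, ComputePool, register, allocate) is an assumption made for the example.

```python
# Illustrative sketch only: a toy, in-memory model of how a DePIN might pool
# idle GPUs from many providers and match them to an AI training job.
# All class and function names are hypothetical, not any network's actual API.
from dataclasses import dataclass


@dataclass
class GpuNode:
    provider_id: str   # who contributed the hardware (enterprise or consumer)
    gpu_model: str     # e.g. "RTX 4090" or "H100"
    vram_gb: int       # available GPU memory
    available: bool = True


class ComputePool:
    """Aggregates underutilized GPUs and allocates them to compute jobs."""

    def __init__(self) -> None:
        self.nodes: list[GpuNode] = []

    def register(self, node: GpuNode) -> None:
        # Providers list idle hardware, redistributing previously
        # inaccessible supply back into the market.
        self.nodes.append(node)

    def allocate(self, min_vram_gb: int, count: int) -> list[GpuNode]:
        # Match a job to whichever idle nodes meet its memory requirement.
        matched = [n for n in self.nodes
                   if n.available and n.vram_gb >= min_vram_gb][:count]
        for node in matched:
            node.available = False
        return matched


pool = ComputePool()
pool.register(GpuNode("enterprise-dc-eu", "H100", 80))
pool.register(GpuNode("home-rig-br", "RTX 4090", 24))
pool.register(GpuNode("studio-idle-kr", "RTX 4090", 24))

job_nodes = pool.allocate(min_vram_gb=24, count=2)
print([f"{n.provider_id}:{n.gpu_model}" for n in job_nodes])
```

A real network has to add the hard parts this sketch omits: verifying that providers actually have the hardware they claim, scheduling and checkpointing jobs across unreliable nodes, and compensating providers for the capacity they contribute.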

Moreover, distributed infrastructure optimizes resource use, ensuring that unused computational power can contribute to significant AI projects. This approach maximizes efficiency and aligns with ESG principles of reducing energy waste and environmental impacts associated with large-scale data centers.

Unlocking New Data Oceans

DePINs can do more than address the supply and resource challenges that limit access to compute. They can also help unlock new data oceans, providing the diverse datasets needed to train more specialized, robust and inclusive AI models. This approach enhances the quality of AI systems and promotes data sovereignty and privacy.

DePINs use blockchain technology and advanced encryption methods to ensure data remains secure and ownership is clearly defined. This decentralized approach broadens the spectrum of information, including that of underrepresented regions and communities, leading to more accurate and inclusive AI models.
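As a rough illustration of that ownership idea, the sketch below fingerprints a dataset locally and builds a provenance record that, in a real network, would be signed with the owner's key and anchored on a blockchain; the raw data never leaves its owner. The record fields and function names are assumptions for the example, not a specific DePIN protocol.

```python
# Minimal sketch of on-chain data ownership: only a content fingerprint and an
# ownership record are shared; the dataset itself stays with its owner.
# Field and function names are illustrative assumptions, not a real protocol.
import hashlib
import json
import time


def fingerprint(dataset_bytes: bytes) -> str:
    # A content hash stands in for the dataset on the ledger, so the data
    # itself can remain private (and encrypted) with its owner.
    return hashlib.sha256(dataset_bytes).hexdigest()


def ownership_record(owner_id: str, dataset_bytes: bytes) -> dict:
    # In a production network this record would be signed with the owner's
    # private key and written to a blockchain; here it is just a dictionary.
    return {
        "owner": owner_id,
        "digest": fingerprint(dataset_bytes),
        "timestamp": int(time.time()),
    }


record = ownership_record("clinic-042", b"<encrypted patient dataset>")
print(json.dumps(record, indent=2))
```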

Furthermore, DePINs give data owners more control over their information, enhancing privacy while encouraging widespread data sharing. Consider a healthcare scenario in which a patient’s data from various hospitals and clinics can be securely shared without compromising privacy. By leveraging DePINs, researchers can access a rich, diverse dataset that improves their ability to develop better diagnostic tools and treatment plans. Similarly, in environmental science, DePINs can facilitate sharing climate data from sensors often located on private homes and properties worldwide, leading to more accurate models and predictions.

It’s also worth noting how the concentration of AI development within a few Big Tech companies raises significant ethical concerns. When advanced AI model training and deployment are monopolized by a few entities, it restricts AI's potential to benefit all. This centralized control can reinforce existing inequalities and curtail the scope of AI's positive impact on society.

The concentration of power can lead to biased AI systems that reflect the perspectives and priorities of a narrow segment of the population, exacerbating social and economic disparities. Such a scenario contradicts AI's democratizing potential, where innovations should ideally serve diverse communities and address a wide range of societal challenges.

Democratizing access to GPU resources is not just an industry imperative; it is an ethical necessity. By ensuring that researchers, startups, and innovators worldwide can access the computational power required to develop AI technologies, we can promote a more inclusive and equitable AI landscape. NVIDIA CEO Jensen Huang, who coined the term "Sovereign AI," has also emphasized that nations must create their own AI to ensure cultural preservation. This broader access brings diverse perspectives into AI development, leading to fairer, more balanced and more effective AI solutions that can benefit society.

The potential impact of decentralized GPU infrastructure on innovation and research, particularly in emerging markets, cannot be overstated. For instance, our recent collaboration with TensorOpera AI to advance large language model (LLM) training on decentralized cloud infrastructure showcased the tangible benefits of this approach. By harnessing the power of decentralized GPUs, TensorOpera can now conduct significant LLM training runs without relying on traditional, centralized resources. This democratization of computing power paves the way for innovative projects and research endeavors that were previously unattainable due to resource constraints.

Bridging the Compute Divide

Decentralized GPU infrastructure represents a pivotal step towards bridging the compute divide and democratizing access to AI resources. By distributing computational power more equitably, we can ensure that the benefits of AI are realized by a broader spectrum of society, thereby increasing innovation across the board. This approach addresses the ethical challenges posed by AI monopolies and fosters global innovation and research, particularly in emerging markets.

As we move forward, embracing decentralized models and leveraging latent computational capacities will be crucial in meeting the growing demands of AI development. The future of AI depends on our ability to build a more inclusive, equitable and decentralized computational landscape.

Note: The views expressed in this column are those of the author and do not necessarily reflect those of CoinDesk, Inc. or its owners and affiliates.




Mark Rydon

Mark Rydon is the Co-Founder and Head of Strategy at Aethir, the leading decentralized enterprise-grade cloud computing network.

