This is a massive crypto bull case that many are still overlooking: clustering otherwise idle compute resources in a geo-distributed way across the globe, coordinating it onchain, and building a permissionless marketplace that supplies institutional-grade compute to AI builders across verticals. Amid a worsening global GPU shortage, this meets the urgent compute demand of an emerging trillion-dollar AI market at highly competitive prices. It is genuinely one of the most compelling product-market fits in crypto, and a massive monetization opportunity for the players that manage to capture meaningful market share. Study DeAI.
Decentralized AI is not a tweak to the cloud; it is a jailbreak. Connect the world's idle GPUs, cut costs 10x, and eliminate single points of failure. Intelligence should scale like Bitcoin: permissionless, borderless, unstoppable.
Also linking back to another post on DeAI I published here (which covers several other layers of the stack as well):
DeAI could be the biggest use case + market opportunity our industry has seen to date. We're talking trillions here. Why? Decentralized infrastructure matters across so many layers of the AI stack:

- Economic incentive systems for contributing to open-source AI development, e.g. optimizing models on open innovation networks like Bittensor or @tigfoundation, or providing (attributable) data to training/fine-tuning datasets on @OpenledgerHQ.
- Verifiable inference to create real accountability, fight hallucinations, and enable agentic systems with fully auditable onchain traceability, with solutions ranging from purpose-built networks like @Mira_Network to @OpacityNetwork's zkTLS or @wardenprotocol's tooling for building apps that leverage verifiable AI models, among others.
- Decentralized training infrastructure that enables large-scale open-source models to be trained collectively, which is pretty much what Bittensor and @tigfoundation are built for, just like @AlloraNetwork (narrowly focused on predictive intelligence), or specialized Bittensor subnets like @tplr_ai, which has already given rise to Templar I.
- MPC-based private compute networks that enable confidential model training & inference, like @nillionnetwork, @ArciumHQ or @LitProtocol, are crucial enablers for any AI-driven use case that involves sensitive/personal data, from sectors like healthcare, insurance, and finance to (various forms of) personalized AI companions.
- Decentralized GPU networks like @ionet that allow geo-distributed clustering of otherwise idle GPUs globally, democratize access to compute, and improve efficiency in resource allocation, not only providing a cost-effective alternative to centralized providers but actively addressing the global GPU shortage and the fact that GPU production cannot keep up with rapidly growing demand.
- User-owned open-source AI models matter. Ownership + governance across the stack, but especially at the model level, should be a core public interest, given the importance and increasing dominance AI displays in our lives. In the centralized AI paradigm, you are the product, while corporate giants that build closed-source models and hoard your data behind their corporate walls grow increasingly powerful. Meanwhile, all users and integrators are subject to whatever changes the controlling entity chooses to implement on the model. To avoid manipulation, ensure safety, and maximize performance, it will be increasingly important to have powerful, open, democratically governed models on the one hand, and small, specialized models derived from open-source base models for various use cases on the other. Both become possible with crypto. Certainly a very broad point, but still worth highlighting @NEARProtocol here for its efforts on the user-owned AI front (especially with @near_ai, but also given its push to build the largest open-source LLM in the market), or @OpenledgerHQ for enabling (no-code) fine-tuning of open-source AI models on attributable and verifiable datasets.
- Public blockchains provide the deterministic, programmable financial infrastructure for AI agents to engage in economic activity and thrive on. While in the "real world" agents can't sign up for a credit card and might have to wait days for a bank transfer (assuming they can get access to an account at all), DeFi offers a vast ecosystem of financial primitives that agents can leverage when given a wallet to work with. Given its positioning as an AI-native blockchain, it's once again worth highlighting @NEARProtocol: with its natively chain-abstracted accounts, NEAR intents, low cost/high performance, and a lot of surrounding infra/tooling for AI builders, it is in a great spot to capture a lot of agentic activity.
- VLAs (vision-language-action models) lack a LangChain-style framework that empowers autonomous operators to execute complex tasks, ranging from desktop-based workflow automation to AI-driven gaming companions, or even enabling robotics systems to operate seamlessly within complex environments (think inside a factory) in a fully traceable and verifiable manner. Yet that is exactly what @codecopenflow is building: a built-in, tokenized marketplace for operators, no-code VLA training, and smart compute aggregation from decentralized networks, giving builders seamless tooling and compute access. Meanwhile, each workflow step taken by operators (which can also run on private and/or proprietary infra) can optionally be recorded on Solana, ensuring full verifiability with an auditable onchain trace. Absolutely huge imo, and a crucial enabler across the various sectors leveraging VLAs (which will soon rise to the same level of prominence LLMs have already reached).

Surely there is more, but I think the above already showcases pretty impressively why & how DeAI matters in a market that will be worth trillions as growth in the broader AI industry continues to accelerate rapidly. DYOR anon.
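The decentralized GPU point above boils down to a marketplace that matches compute jobs to the cheapest suitable hardware on offer. Here is a minimal sketch of that matching logic; the node names, prices, and the `allocate` function are all hypothetical illustrations, not io.net's (or any network's) actual API:

```python
from dataclasses import dataclass

@dataclass
class GpuOffer:
    provider: str        # hypothetical node identifier
    vram_gb: int         # memory the node advertises
    price_per_hr: float  # asking price in some unit of account

def allocate(offers: list[GpuOffer], vram_needed: int) -> GpuOffer:
    """Pick the cheapest offer that satisfies the job's VRAM requirement."""
    eligible = [o for o in offers if o.vram_gb >= vram_needed]
    if not eligible:
        raise RuntimeError("no offer meets the requirement")
    return min(eligible, key=lambda o: o.price_per_hr)

offers = [
    GpuOffer("node-a", 24, 1.20),
    GpuOffer("node-b", 80, 2.50),
    GpuOffer("node-c", 48, 1.10),
]
best = allocate(offers, vram_needed=40)  # -> node-c: enough VRAM, lowest price
```

Real networks layer reputation, uptime proofs, and clustering of multiple nodes on top of this, but price/capability matching is the core of why idle GPUs become a cost-effective alternative to centralized providers.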
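To make the agents-on-DeFi point concrete, here is a toy sketch of an agent-held wallet that signs spend intents under a budget cap. Purely illustrative: the class, the HMAC-based signing, and the budget logic are my own assumptions for exposition, not NEAR's account model or any chain SDK:

```python
import hashlib
import hmac
import json

class AgentWallet:
    """Toy wallet an AI agent could use to sign spend intents within a budget.

    Illustrative only: real agent wallets would use proper key management
    and a chain SDK rather than a shared-secret HMAC.
    """
    def __init__(self, secret: bytes, budget: float):
        self.secret = secret
        self.budget = budget

    def sign_intent(self, action: str, amount: float) -> dict:
        if amount > self.budget:
            raise ValueError("intent exceeds remaining budget")
        self.budget -= amount
        intent = {"action": action, "amount": amount}
        sig = hmac.new(self.secret,
                       json.dumps(intent, sort_keys=True).encode(),
                       hashlib.sha256).hexdigest()
        return {**intent, "sig": sig}

w = AgentWallet(b"demo-key", budget=100.0)
swap = w.sign_intent("swap USDC->SOL", 25.0)  # signed intent, 75.0 budget left
```

The point of the sketch: because the "bank account" is just a keypair plus programmable rules, an agent can transact within hard-coded guardrails from day one, with no onboarding process standing in the way.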
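The auditable onchain trace described in the VLA bullet can be pictured as a hash chain: each workflow step commits to the hash of the previous step, so tampering with any step breaks every later link. A minimal sketch under that assumption (field names are hypothetical; an actual integration would anchor these hashes in Solana transactions rather than keep them in a Python list):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first step's predecessor

def record_step(action: str, result: str, prev_hash: str) -> dict:
    """Commit one workflow step; its hash covers the previous step's hash."""
    step = {"action": action, "result": result, "prev": prev_hash}
    step["hash"] = hashlib.sha256(
        json.dumps(step, sort_keys=True).encode()).hexdigest()
    return step

def verify_chain(steps: list[dict]) -> bool:
    """Recompute every link; any edited step invalidates all later hashes."""
    prev = GENESIS
    for s in steps:
        body = {"action": s["action"], "result": s["result"], "prev": s["prev"]}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if s["prev"] != prev or s["hash"] != recomputed:
            return False
        prev = s["hash"]
    return True

trace = []
trace.append(record_step("open_app", "ok", GENESIS))
trace.append(record_step("click_export", "ok", trace[-1]["hash"]))
assert verify_chain(trace)
```

Publishing only the latest hash onchain is enough to make the whole trace tamper-evident, which is what gives operator workflows (even ones running on private infra) an independently auditable history.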