Dialogue with EdgeX Founder Davy: How Decentralized Edge Computing Networks Promote the Popularization of AI Agents?

Author: Xiyou, ChainCatcher

During the Spring Festival, DeepSeek's stunning debut once again placed AI in the spotlight, impressing the industry with its strong performance and low development cost. How to reduce the operating costs of AI models, improve their efficiency, and bring them to more users has now become a central theme in the industry's development.

As early as last year, the decentralized edge AI computing network EdgeX began working to lower the barriers to running AI, aiming to build a foundational network that connects users and AI. Its strength lies in its distributed computing infrastructure: the operational resources required by AI Agents are supplied by users themselves, advancing the practice and development of decentralized edge computing.

EdgeX is fully committed to building a decentralized AI infrastructure platform that combines distributed computing resources with an AI scheduling and management system. The goal is an efficient, secure, and transparent decentralized computing network on which a wide range of AI models can run seamlessly in a distributed environment, driving the implementation and application of AI technology in edge scenarios.

In simple terms, the EdgeX network gathers computing power, storage, and bandwidth contributed by participants through a decentralized computing framework, forming a global edge computing network that significantly reduces computing costs while allowing any AI model to operate seamlessly and efficiently on edge devices.

Davy, the founder of EdgeX, emphasized several times in his interview with ChainCatcher that EdgeX is not just a technology platform but the practice of a philosophy: "We hope to promote the development of decentralized technology by integrating Web3 and AI technologies, allowing AI to truly connect closely with every user."

Currently, EdgeX has launched its hardware product, the XR7 AI Dual Gateway, which has been delivered to the South Korean market and well received there. Users can join the network and contribute computing power by purchasing the device and deploying it as a hardware node, earning early rewards in return. Meanwhile, the beta version of the EdgeX APP has begun its first phase of testing in South Korea, allowing users to participate in the test network.

It is worth mentioning that founder Davy also revealed to ChainCatcher that EdgeX has successfully obtained early support from two well-known domestic Web3 capital firms and is actively engaged in in-depth discussions with several traditional and Web3 capital firms in North America to explore deep cooperation on key matters such as leading investments. It is expected that specific progress and results of the financing will be gradually announced to the public between the second and third quarters of this year.

The Story Behind the Creation of EdgeX

1. ChainCatcher: As the founder of EdgeX, can you share your experiences in the Web3 and AI industries, as well as the opportunity to start the AI infrastructure project? What are your main responsibilities in the EdgeX project?

Davy: Since 2015, I have been deeply involved in the data center and cloud industry, collaborating with several Web3 companies to build nodes and provide comprehensive infrastructure support, as well as technical support for leading CEX matching engines. Additionally, I have participated in several Silicon Valley projects from product conception to successful listing on exchanges, deeply engaging in key aspects such as infrastructure blueprint planning, product development, daily operations, and market promotion.

In the AI field, I was exposed to machine learning technology early on, especially in storage and computing, and collaborated with several data scientists to develop and design AI applications. Subsequently, I participated in several large model projects in Silicon Valley as a consultant, focusing on optimizing multimodal models, fine-tuning vertical domain models, and efficient training.

In 2024, I observed that the AI field is undergoing a transformation from centralization to decentralization, which coincides with the evolution from Web2 to Web3. As the demand for distributed computing power in AI continues to grow, edge computing, as a key technology, can effectively meet this demand. Based on this, I decided to integrate the advantageous technologies and experiences of Web3 and AI to launch the EdgeX project, focusing on building a decentralized AI infrastructure.

Currently, in the EdgeX project I am mainly responsible for the technical architecture design: building the intelligent computing power scheduling system to ensure efficient collaboration of computing resources, constructing the technical infrastructure of the EdgeX computing network to provide stable and reliable underlying support for AI applications, and optimizing AI applications by studying the needs of various industries and customizing solutions for vertical domains.

2. ChainCatcher: What is the product positioning of EdgeX and the vision you pursue? What pain points in the current market do you want to address?

Davy: EdgeX is committed to creating a decentralized AI infrastructure platform that integrates distributed computing power and intelligent task management to promote the implementation and application of AI technology in edge scenarios. It builds an efficient, secure, and transparent computing platform that allows AI models to run seamlessly in a distributed environment, providing strong underlying technical support for decentralized applications and various scenarios.

Currently, the AI industry faces many challenges: centralized computing costs remain high, data privacy and security are concerning, and support for edge scenarios is lacking. Specifically, traditional AI models heavily rely on expensive centralized cloud computing resources, leading to high computing costs that limit the innovation pace of small teams and developers; centralized data storage models act like ticking time bombs, constantly threatening data security and privacy; and the operational efficiency of most AI models on edge devices is often unsatisfactory. EdgeX aims to address these issues.

In simple terms, EdgeX significantly reduces computing costs by utilizing distributed computing power, opening the door to innovation for small teams and developers; its distributed computing network lets any AI model run seamlessly and efficiently on edge devices, filling a long-standing gap; and its decentralized infrastructure provides a higher level of protection for data privacy and security.

EdgeX is not just a technology platform, but a practice of a philosophy. We hope to promote the development of decentralized technology by integrating Web3 and AI technologies, making AI truly benefit every user. At the same time, we are also committed to providing more innovative solutions for developers and enterprises, jointly building an open, shared, prosperous AI ecosystem.

3. ChainCatcher: Can you introduce the composition of the EdgeX team? What unique advantages do you have in the field of AI and Web3 integration?

Davy: The EdgeX team is a diverse and international team composed of outstanding elites in global technology research and development, market operations, and brand promotion. Members are spread across multiple international cities such as Silicon Valley, Singapore, and Taipei, allowing for rapid capture of global market demands and the search for the best partners and resources.

Core team members have held key positions in top global technology companies such as Amazon, Alibaba Cloud, and Tencent, possessing the ability to successfully drive projects from 0 to 1, as well as extensive industry resources to support effective implementation of projects globally.

In the technical field, the EdgeX team has deep expertise in both AI and Web3, particularly in key areas such as large models, multimodal technology, and decentralized computing power scheduling. We also have a deep understanding and command of core Web3 modules such as token economics and smart contract design, enabling us to confidently address the challenges of integrating AI and Web3 technologies. Furthermore, the EdgeX team excels at commercialization, with members bringing rich experience in market promotion, user growth, and supply chain management.

Moreover, we have the support of a top Web3 advisory team, which will provide valuable guidance in product technology, token economics design, market expansion, and strategic planning, providing a solid backing for the rapid development of EdgeX.

4. ChainCatcher: According to EdgeX's official roadmap, the project plans to complete seed round financing between the second and fourth quarters of 2024. What is the current progress of this round of financing?

Davy: As of now, the EdgeX project has successfully obtained support from two well-known domestic Web3 capital firms. At the same time, we are actively engaged in in-depth discussions with several traditional and Web3 capital firms in North America to explore cooperation on key matters such as leading investments.

According to the established plan, EdgeX is expected to officially announce the specific progress and results of this round of financing between Q2 and Q3 of 2025.

Features and Advantages of EdgeX Products

5. ChainCatcher: What are the specific operational mechanisms, core components, and main functions of the EdgeX network?

Davy: The EdgeX network forms a global edge computing network through a decentralized computing framework that gathers computing power, storage, and bandwidth contributed by participants. Users purchase and deploy EdgeX hardware nodes to participate, and these nodes generate proof of work (PoW) after completing tasks on the network, earning token rewards.

In this process, EdgeX has designed an intelligent task scheduling system. For example, if an AI model needs to run on an edge device, this task will be split and assigned to different nodes to ensure the entire network operates efficiently while maintaining low latency and high concurrency.
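The interview does not disclose how EdgeX's scheduler works internally, so purely as an illustration of the idea described above (splitting a task and assigning the pieces to nearby, lightly loaded nodes), here is a minimal Python sketch. The `EdgeNode` structure, the greedy latency-first policy, and all field names are assumptions for the example, not EdgeX's actual design.

```python
from dataclasses import dataclass

@dataclass
class EdgeNode:
    node_id: str
    latency_ms: float      # measured round-trip latency to the requesting user
    free_capacity: int     # how many task shards the node can still accept

def schedule(num_shards: int, nodes: list[EdgeNode]) -> dict[str, list[int]]:
    """Greedily assign task shards to the lowest-latency nodes with spare capacity."""
    assignment: dict[str, list[int]] = {}
    # Prefer nodes that are close to the user and lightly loaded.
    candidates = sorted(nodes, key=lambda n: (n.latency_ms, -n.free_capacity))
    shard = 0
    for node in candidates:
        while node.free_capacity > 0 and shard < num_shards:
            assignment.setdefault(node.node_id, []).append(shard)
            node.free_capacity -= 1
            shard += 1
        if shard == num_shards:
            break
    if shard < num_shards:
        raise RuntimeError("not enough edge capacity for this task")
    return assignment

# Example: split an inference job into 4 shards across 3 nearby nodes.
nodes = [EdgeNode("kr-seoul-01", 12.0, 2), EdgeNode("kr-busan-02", 35.0, 4), EdgeNode("sg-01", 80.0, 8)]
print(schedule(4, nodes))
```

A production scheduler would also weigh node capability and concurrency, but the greedy latency-first policy is enough to convey the idea of keeping latency low while spreading load.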

The core components of the EdgeX network include: hardware nodes, the EdgeX exclusive operating system, and the AI-Agent system:

Hardware nodes not only support AI model inference but also provide resources such as storage and bandwidth;

The EdgeX operating system runs on hardware nodes, providing optimized computing capabilities for edge scenarios.

The AI-Agent system provides distributed AI scheduling, performing localized data analysis and inference and calling on high-performance nodes when necessary to complete tasks.
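To illustrate the local-first behavior described for the AI-Agent system (analyze and infer locally, escalate to a high-performance node only when needed), here is a hedged sketch; `estimate_complexity`, the threshold, and the `local_model` / `remote_client` interfaces are hypothetical stand-ins, not EdgeX APIs.

```python
def estimate_complexity(task: dict) -> float:
    """Toy heuristic: larger or multimodal inputs count as more complex (illustrative only)."""
    size_score = min(len(task.get("payload", b"")) / 1_000_000, 1.0)  # MB of input, capped at 1
    modality_score = 0.5 if task.get("modality") in ("image", "video", "audio") else 0.0
    return min(size_score + modality_score, 1.0)

def run_agent_task(task: dict, local_model, remote_client, threshold: float = 0.7):
    """Run an agent task on-device when it is light enough; otherwise delegate to a stronger node."""
    if estimate_complexity(task) <= threshold:
        # Data stays on the device: lower latency and better privacy.
        return local_model.infer(task)
    # Heavy multimodal or long-context work goes to a high-performance node in the network.
    return remote_client.submit(task)
```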

Additionally, the EdgeX network combines decentralized protocols and distributed storage systems to ensure data security and network stability.

The various components of EdgeX work together to create a decentralized, efficient, and secure computing ecosystem, providing better infrastructure support for AI inference and other AI applications.

6. ChainCatcher: How does EdgeX differ from other decentralized computing DePIN projects on the market, such as Aethir, io.net, Gradient Network, and Theta?

Davy: First, most decentralized computing networks on the market tend to focus on general computing, while EdgeX focuses on the deep integration of edge computing and AI. It particularly emphasizes the optimization of AI inference tasks and resource scheduling, aiming to precisely serve various specific AI application scenarios, which gives it a unique advantage in meeting specific computing power needs.

Secondly, unlike large-scale distributed networks that rely on centralized data centers, EdgeX places greater emphasis on the autonomous computing capabilities of edge nodes, which is key to its suitability for AI inference tasks. Through an intelligent task scheduling system, EdgeX can accurately assign AI tasks to the most suitable edge nodes, significantly reducing latency and improving real-time performance.

In terms of product design, EdgeX takes an integrated software-and-hardware approach and has launched its own hardware nodes, whereas most comparable computing projects focus primarily on software platforms, which is an important point of differentiation. EdgeX hardware nodes run a dedicated operating system and have been deeply optimized for edge computing and AI scenarios, a design that not only significantly improves computing efficiency but also gives users a more stable and efficient solution.

In terms of token economics, EdgeX combines proof of work and proof of resource mechanisms to incentivize contributors to provide efficient computing and storage resources. This mechanism ensures the rational allocation of network resources and effectively avoids resource waste.
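EdgeX has not published the exact formula that blends proof of work with proof of resource, so the following is only a toy illustration of how such a dual mechanism could weight rewards; the linear blend and the `alpha` factor are assumptions, not EdgeX's actual model.

```python
def reward_share(work_done: float, resources_pledged: float,
                 total_work: float, total_resources: float,
                 alpha: float = 0.6) -> float:
    """Blend a node's share of completed work (PoW) with its share of pledged
    storage/bandwidth (PoR) into a single reward weight in [0, 1].
    alpha controls how much completed work counts relative to pledged resources."""
    work_share = work_done / total_work if total_work else 0.0
    resource_share = resources_pledged / total_resources if total_resources else 0.0
    return alpha * work_share + (1 - alpha) * resource_share

# Example: a node that did 10% of the epoch's work and pledged 5% of total resources.
epoch_rewards = 1_000_000  # tokens distributed this epoch (illustrative figure)
print(reward_share(10, 5, 100, 100) * epoch_rewards)  # -> 80000.0
```

Weighting both terms rewards nodes that actually complete tasks while still compensating standby capacity, which is the stated goal of avoiding resource waste.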

In terms of application scenarios, EdgeX's applications are more extensive, supporting not only general decentralized computing needs but also focusing on multimodal AI tasks, lightweight inference on edge devices, and IoT scenario applications. This diversified technical coverage and practical application allow EdgeX to not be limited to a specific type of task or service, showcasing strong versatility and flexibility.

From this perspective, EdgeX is not only a general decentralized computing network but also an innovative platform focused on edge computing and AI tasks. It brings more possibilities for the integration of AI and Web3.

7. ChainCatcher: What specific application scenarios and products has EdgeX implemented?

Davy: Currently, EdgeX has successfully achieved deep integration and real-time connection between users' physical devices and AI Agents. Users can easily interact with AI Agents through their physical devices, making the Agent a personal intelligent assistant. This transforms the Agent from merely a virtual entity into a smart device that can accurately understand, continuously learn, and execute user commands. EdgeX's Agent can provide localized decision support and flexibly access the required computing and storage resources through EdgeX's distributed network to meet various complex computing needs.

Application scenarios include:

Smart Home: EdgeX's Agent acts as a home assistant, interconnecting with IoT devices in the home, such as analyzing user habits in real-time through edge computing support to intelligently adjust air conditioning, lighting, and other devices while protecting data privacy.

Industrial Automation: In factories or production lines, EdgeX supports edge AI Agents to complete equipment monitoring, fault prediction, and process optimization, reducing latency and improving production efficiency.

Multimodal AI Services: EdgeX's network can support multimodal data processing, including images, videos, and voice. For example, in the medical field, the Agent processes patient data at the edge to provide real-time diagnostic suggestions to doctors.

Education and Training: Through the EdgeX network, AI Agents become learning assistants for students, providing personalized tutoring while protecting data privacy.

Virtual Assistants and Gaming: In gaming or virtual reality applications, Agents utilize EdgeX's distributed computing power to provide real-time environment generation and character interaction support.

As of now, EdgeX has successfully launched a series of physical products, covering hardware nodes and AI Agent devices closely tied to users. These products leverage the advantages of the EdgeX network to ensure efficient configuration and utilization of computing and storage resources, thereby achieving a smooth and seamless interaction experience between users and intelligent devices.

8. ChainCatcher: As a new decentralized AI computing network, what measures does EdgeX take to attract and retain developers?

Davy: EdgeX is committed to building a vibrant and thriving developer ecosystem, not only hoping that developers use our platform but also expecting them to find a sense of belonging and long-term development opportunities here. Currently, EdgeX has implemented several initiatives to help developers quickly get started on the EdgeX platform and gain long-term benefits and development opportunities.

On the technical side, EdgeX provides comprehensive development tools and detailed documentation support, equipped with developer-friendly SDKs, API interfaces, and support for multiple programming languages, along with detailed technical documentation and step-by-step tutorials to assist developers in getting started quickly.

In terms of incentive mechanisms, developers on EdgeX can earn $EX tokens by developing high-quality applications, optimizing network performance, or providing computing resources. Additionally, EdgeX has introduced a revenue-sharing model, allowing developers to directly earn a share of the revenue from user payments for applications deployed on the EdgeX network.

In community building, EdgeX has created an open developer community that encourages experience exchange and idea sharing. The core technical team directly participates in the community, providing technical guidance and support to ensure that developers' questions are addressed and resolved promptly.

Furthermore, EdgeX plans rich growth opportunities for developers, such as regularly hosting hackathons and developer competitions, providing a platform for showcasing. At the same time, EdgeX will help developers expand their user base through its global partnership network, allowing their ideas to reach a broader market.

Application Scenarios of the Governance Token EX and Early User Rewards

9. ChainCatcher: EdgeX has released the economic model of the governance token EX on its official website. What role does EX play in the EdgeX network? What incentive policies are in place for early users?

Davy: The EX token is the core driving force behind the operation of the EdgeX network.

As a governance token, EX grants holders the right to participate in network decision-making, including voting on key matters such as the direction of network development, protocol upgrades, and resource allocation. This decentralized governance model promotes transparency in the network and encourages the community to participate more actively in the ecological construction of EdgeX.

At the same time, EX is also the main medium for economic activities within the network. In the EdgeX ecosystem, users need to use EX to pay for resource invocation fees, such as computing power, storage, and bandwidth services. Node operators, after providing resources, receive EX rewards through proof of work (PoW) or proof of resource (PoR). This mechanism incentivizes more nodes to participate in network operations, ensuring efficient resource utilization.
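As a rough illustration of that flow (users pay EX for compute, storage, and bandwidth, and the serving node is rewarded), the sketch below meters an invocation and splits the fee; every price and the protocol cut are invented for the example and are not EdgeX's actual schedule.

```python
# Hypothetical per-unit prices in EX; the real fee schedule is not disclosed in the interview.
PRICE = {"compute_sec": 0.002, "storage_gb_hour": 0.0005, "bandwidth_gb": 0.001}

def invocation_fee(compute_sec: float, storage_gb_hours: float, bandwidth_gb: float) -> float:
    """Total EX a user pays for one resource invocation."""
    return (compute_sec * PRICE["compute_sec"]
            + storage_gb_hours * PRICE["storage_gb_hour"]
            + bandwidth_gb * PRICE["bandwidth_gb"])

def settle(fee: float, protocol_cut: float = 0.1) -> tuple[float, float]:
    """Split the fee between the serving node and the protocol treasury."""
    return fee * (1 - protocol_cut), fee * protocol_cut

fee = invocation_fee(compute_sec=120, storage_gb_hours=2, bandwidth_gb=0.5)
node_reward, treasury = settle(fee)
print(round(fee, 6), round(node_reward, 6), round(treasury, 6))
```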

For early users, EdgeX has launched various incentive measures: Users who deploy hardware nodes early can enjoy higher EX token mining rewards; additionally, early developers who publish high-quality applications or optimize network performance on the EdgeX network can receive allocations from an additional reward pool; there are also plans to launch exclusive token airdrop activities for early users to help them quickly integrate into the EdgeX ecosystem.

Overall, the EX token is not only an incentive tool but also an ecological connector that closely links users, developers, and node operators, jointly promoting the growth and prosperity of the EdgeX network. For early users, they can not only gain economic returns but also participate in network governance, becoming an important part of the ecosystem and sharing in the dividends of EdgeX's development.

10. ChainCatcher: What is the progress of EdgeX product development? How can users participate?

Davy: EdgeX's hardware product, the XR7 AI Dual Gateway, has been successfully delivered in the South Korean market and has received widespread acclaim, marking an important step in global promotion and serving as a significant validation of the actual performance and application value of the EdgeX network.

At the same time, the beta version of the EdgeX APP has launched the first phase of testing in South Korea, focusing on testing network stability and user experience, laying the foundation for subsequent global market expansion.

In the AI Agent field, EdgeX's development team is committed to the continuous optimization of model parameters to achieve significant performance improvements, making the user experience smoother, including faster response times and more precise task processing capabilities.

As for how users can participate in the EdgeX network, currently, South Korean users can join the network by deploying XR7 AI Dual Gateway hardware nodes, contributing resources and completing tasks to earn token rewards. Additionally, users can participate in the beta version of the APP to experience the service and provide feedback.

11. ChainCatcher: EdgeX has revealed that it is in talks with the leading AI Agent product Eliza regarding cooperation details. What are the specific cooperation details? What key roles does EdgeX play in the application of AI Agents? How does it optimize the performance and efficiency of AI Agents through edge computing?

Davy: As a representative product in the AI Agent space, Eliza's smooth interaction capabilities and user experience align very well with EdgeX's decentralized computing power network. EdgeX is committed to integrating a white-label version of Eliza into its network, aiming to enhance Eliza's service efficiency and user experience through this collaboration, achieving rapid response and real-time interaction. The specific cooperation plan between both parties is still being refined.

In the application scenarios of AI Agents, EdgeX provides the underlying computing support and optimization. Through EdgeX, the computing tasks of AI Agents like Eliza can be smoothly transferred to distributed edge nodes for processing. This model allows Eliza to be closer to the user's network location, thereby reducing latency. At the same time, EdgeX's intelligent scheduling mechanism can dynamically assign tasks to the optimal nodes, improving the overall resource utilization and operational efficiency of the system.

EdgeX's edge computing framework optimizes AI Agent performance in the following three respects, taking speed, intelligence, and user experience to a new level:

Low Latency: Tasks can be completed at edge nodes near the user without needing to be transmitted to the cloud, significantly reducing data transmission time and enhancing interaction smoothness.

Intelligent Scheduling: EdgeX can analyze the status of each node in real-time and dynamically adjust task assignments based on actual conditions, ensuring rational resource utilization and effectively preventing node overload.

Distributed Computing Collaboration: When a single node cannot handle complex tasks, EdgeX's distributed architecture can quickly call upon multiple nodes to collaborate on processing, ensuring task completion while enhancing overall efficiency.

How to Measure the Reliability of an AI Infrastructure?

12. ChainCatcher: What qualities must an AI Agent infrastructure possess to gain market recognition? Additionally, as an entrepreneur in the DePIN and AI space, what advice do you have for users on how to measure the reliability of a decentralized edge computing AI network?

Davy: The successful construction of an AI Agent infrastructure must revolve around several core qualities, which are also applicable for measuring the reliability of a decentralized edge computing AI network:

1. Assessing Network Performance: First is high performance and low latency, which are the cornerstones of user experience and system practicality. Users expect quick responses when using AI Agents; if task processing speeds are too slow, not only will user experience suffer, but the overall practicality of the system will also be called into question. Next is scalability and flexibility; an excellent infrastructure should be able to flexibly expand as user demands grow and support diverse application scenarios. For a decentralized computing network, users can evaluate the intelligence of task distribution, the efficiency of computing power scheduling, response speed, and processing capabilities, as well as whether it can dynamically allocate computing power based on task complexity and support diverse application scenarios. For example, EdgeX can accurately assign tasks to nodes near users, improving response speed while reducing latency, meeting real-time demands, and dynamically allocating computing power to easily handle multimodal tasks such as images, videos, and voice, adapting to various scenarios from smart homes to industrial applications and even medical assistance.

2. Security and Privacy Protection: As data privacy and security issues become increasingly prominent, users have stricter security requirements for infrastructure. Users should examine whether the corresponding AI network employs reliable encryption protocols and data storage mechanisms to protect data privacy.

3. Developer Ecosystem and User Community: A strong developer ecosystem and user community are key driving forces for the continuous development of infrastructure. For decentralized AI networks, users should pay attention to whether there is strong developer support, whether new features are continuously launched or existing services optimized, and how active the user community and ecosystem building are.

To measure the reliability of a decentralized edge computing AI network, users should also consider the following two dimensions:

Node Stability and Participation: The reliability of the network largely depends on the stability and distribution of its nodes. If nodes are too concentrated or unstable, the network can hardly be considered reliable.

Actual User Experience: This is the most intuitive measure. Users can experience network reliability by actually deploying nodes or running applications, such as whether they encounter technical issues and whether the response meets standards.

In summary, an AI Agent infrastructure or decentralized edge computing AI network that gains market recognition should possess characteristics such as high performance, scalability, security, and a strong developer ecosystem and user community, and further measure its reliability through node stability and participation as well as actual user experience.

13. ChainCatcher: What are your views on the future development of AI Agents? Which specific scenarios do you particularly favor in the integration of cryptographic technology and AI Agents?

Davy: I believe the future of AI Agents will develop towards intelligence, personalization, and collaboration. AI Agents will no longer be just simple task assistants; they will become multimodal intelligent entities that actively learn and adapt to user needs, deeply integrating into people's lives and work, handling complex tasks, and providing emotional experiences in interactions.

From a technical perspective, decentralization and edge computing will be important development trends. Traditional centralized AI architectures face bottlenecks when handling large-scale personalized demands, while distributed networks can provide more flexible computing and storage support, allowing AI Agents to be closer to users. Additionally, multi-Agent collaboration will become the norm; by introducing collaborative mechanisms, different AI Agents can share information and divide tasks to achieve more complex goals. For example, in a smart city, AI Agents in various fields such as transportation, energy, and security can work together to provide overall optimization solutions for city management.

Regarding the integration of cryptography and AI application scenarios, I personally favor:

1. Personalized Services and Privacy Protection: When AI Agents provide personalized services, they can use cryptographic technology to protect users' sensitive data. For example, in the healthcare field, AI Agents can offer personalized health advice while ensuring that medical data privacy is not compromised.

2. Distributed Collaboration and Incentive Mechanisms: In decentralized networks, multiple AI Agents can achieve trustworthy collaboration and division of labor through blockchain technology. Cryptographic technology can support transparent settlement and incentive distribution after task completion via smart contracts.

3. Decentralized Market and AI Service Transactions: Building a decentralized AI service market allows users to interact directly with AI Agents and pay fees, applicable in fields such as education, consulting, and design.

4. Multi-Party Computation and Federated Learning: During the AI model training process, cryptographic technology can enable secure data sharing among different parties. For instance, multiple organizations can jointly train AI models without exposing their raw data, thereby enhancing model performance while protecting data privacy.
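To make point 4 concrete, here is a minimal federated-averaging sketch in Python: each organization trains on its own data, and only the model weights leave the silo, where a coordinator averages them. It deliberately omits the encryption and secure-aggregation layer that real multi-party computation would add, and all names and figures are illustrative.

```python
import numpy as np

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One local gradient step of linear regression on a participant's private data."""
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(weights: np.ndarray, datasets: list[tuple[np.ndarray, np.ndarray]]) -> np.ndarray:
    """Each participant trains locally; only the updated weights are shared and averaged."""
    updates = [local_update(weights.copy(), X, y) for X, y in datasets]
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def make_private_dataset(n: int = 50) -> tuple[np.ndarray, np.ndarray]:
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

datasets = [make_private_dataset() for _ in range(2)]  # two organizations, data never pooled
w = np.zeros(2)
for _ in range(200):
    w = federated_round(w, datasets)
print(w)  # converges toward [2, -1] without raw data leaving either organization
```

In practice the weight updates themselves would be encrypted or masked before aggregation, which is where the cryptographic techniques Davy mentions come into play.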
