What development ideas does McKinsey's Lilli case provide for the enterprise AI market?


Compared with the past winner-take-all leaps in computing power and algorithms, a market shift toward edge computing + small models promises far greater market vitality.

Written by: Haotian

McKinsey's Lilli case offers a key insight for the enterprise AI market: the potential of edge computing + small models. This AI assistant, built on 100,000 internal documents, has not only reached a 70% adoption rate among employees but is used an average of 17 times per week, a level of product stickiness that is rare in enterprise tools. Below are my thoughts:

1) Enterprise data security is a real pain point: the core knowledge assets McKinsey has accumulated over a century, like the proprietary data held by many small and medium-sized enterprises, are highly sensitive and ill-suited to processing on public clouds. Finding a balance where "data never leaves the local environment, yet AI capability is not compromised" is a genuine market demand, and edge computing is a natural direction to explore;
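The "data stays local" requirement can be made concrete with a routing layer that keeps sensitive prompts on-premise and only lets innocuous ones reach a public cloud API. The sketch below is purely illustrative: the keyword screen, `local_model`, and `cloud_model` are hypothetical stand-ins, not any real product's API.

```python
# Minimal sketch of sensitivity-based routing, assuming a local small model
# and a cloud LLM endpoint exist behind these placeholder functions.

SENSITIVE_MARKERS = ("client", "contract", "salary", "internal")

def is_sensitive(text: str) -> bool:
    """Naive keyword screen; a production system would use a classifier."""
    lowered = text.lower()
    return any(marker in lowered for marker in SENSITIVE_MARKERS)

def local_model(prompt: str) -> str:
    # Placeholder for an on-premise small model (e.g. served locally).
    return f"[local] {prompt}"

def cloud_model(prompt: str) -> str:
    # Placeholder for a public-cloud LLM API call.
    return f"[cloud] {prompt}"

def route(prompt: str) -> str:
    """Sensitive prompts never leave the local environment."""
    return local_model(prompt) if is_sensitive(prompt) else cloud_model(prompt)
```

The design point is that the routing decision, not the model choice, is what enforces the compliance boundary.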

2) Specialized small models will displace general large models in enterprise settings: what enterprise users need is not a "hundred-billion-parameter, all-purpose" general model, but a specialized assistant that answers questions in their domain accurately. A model's generality and its professional depth are naturally in tension, and in enterprise scenarios small models are often valued more;
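What makes a "specialized assistant" accurate is less raw parameter count than grounding in internal documents, as Lilli's 100,000-document corpus suggests. The toy sketch below uses word-overlap scoring as a stand-in for a real embedding index; the document titles are invented examples.

```python
# Toy retrieval over internal documents: word overlap stands in for
# a real vector-similarity search. All documents here are hypothetical.

def score(query: str, doc: str) -> int:
    """Count words shared by query and document (toy relevance measure)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k internal documents most relevant to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

docs = [
    "Pricing playbook for retail banking engagements",
    "Onboarding checklist for new consultants",
    "Supply chain diagnostic framework for manufacturing clients",
]
top = retrieve("retail banking pricing strategy", docs)
```

A small model reading the retrieved document can then answer with domain precision that a general model, answering from parametric memory alone, often cannot match.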

3) The cost balance between self-built AI infrastructure and API calls: although edge computing plus small models requires a larger initial investment, long-term operating costs drop significantly. If the large model that 45,000 employees use frequently were accessed through API calls, that dependency, compounding with usage scale and feedback, would make self-built AI infrastructure the rational choice for medium and large enterprises;
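The break-even logic above can be sketched with arithmetic. The headcount and per-week usage come from the article; every cost figure below is an invented assumption for illustration, not McKinsey's actual economics.

```python
# Break-even sketch: cumulative API spend vs. self-hosted infrastructure.
# Only EMPLOYEES and QUERIES_PER_WEEK come from the article; all dollar
# figures are hypothetical assumptions.

EMPLOYEES = 45_000
QUERIES_PER_WEEK = 17               # Lilli's reported average usage
COST_PER_API_QUERY = 0.05           # assumed blended $/query for a hosted API
SELF_HOSTED_CAPEX = 2_000_000       # assumed one-off hardware + integration
SELF_HOSTED_OPEX_PER_WEEK = 20_000  # assumed power, ops, model upkeep

def weekly_api_cost() -> float:
    return EMPLOYEES * QUERIES_PER_WEEK * COST_PER_API_QUERY

def breakeven_weeks() -> float:
    """Weeks until API spend overtakes total self-hosted cost."""
    savings_per_week = weekly_api_cost() - SELF_HOSTED_OPEX_PER_WEEK
    return SELF_HOSTED_CAPEX / savings_per_week
```

Under these assumed numbers the API bill runs $38,250 per week, so self-hosting pays for itself in roughly two years; at higher usage or higher per-query pricing, the break-even point arrives sooner.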

4) New opportunities in the edge hardware market: large-model training relies on high-end GPUs, but edge inference has very different hardware requirements. Chip makers such as Qualcomm and MediaTek are seizing the opportunity with processors optimized for edge AI. As every enterprise sets out to build its own "Lilli," edge AI chips designed for low power consumption and high efficiency will become essential infrastructure;

5) The decentralized Web3 AI market also stands to strengthen: once enterprise demand for small-model compute, fine-tuning, and algorithms is unlocked, resource allocation becomes the bottleneck. Traditional centralized allocation will strain under it, creating direct demand for decentralized Web3 AI fine-tuning networks, decentralized compute marketplaces, and the like;

While the market is still debating the boundaries of AGI's general capabilities, it is encouraging to see many enterprise users already extracting practical value from AI. Clearly, compared with the past winner-take-all leaps in computing power and algorithms, a market focused on edge computing + small models will bring far greater vitality.
