Silicon Valley is fluid: talent, information, and funds all flow freely. That fluidity brings vitality and innovation, so every day brings change and the place feels forever young.
By Melissa
This time I stayed in Silicon Valley for about six weeks, arriving in midsummer and leaving just after the start of autumn. The California sunshine was as bright as ever, and here at the frontier of technology, AI was surging. Hoping to gain a deeper understanding of where AI is heading, I met many people (friends at major companies, entrepreneurs, and investors), attended online and offline events, and truly felt the momentum at the start of this wave. Here I've picked a few of those waves to share.
Post-pandemic: Labor shortage and remote work
The pandemic is now behind us, but perhaps because I am a newcomer, the traces of the past three years are very visible to me. What struck me most was the impact of the labor shortage and remote work.
Labor shortage
The labor shortage in Silicon Valley is obvious, coupled with recent inflation, resulting in very high labor costs. Once, I ordered Subway delivery through Uber Eats, and the sandwich itself cost $8, but with all the delivery fees, the total came to $17, more than double! I had lived in Seattle for many years and always knew that labor costs in the United States were not lower than in China, but seeing this still surprised me. I learned that a major reason is that during the pandemic, many people resigned or retired early due to concerns about infection. In addition, the government distributed money in the past two years, resulting in many fewer people working. I chatted with an AI education entrepreneur, and he said that there is a severe shortage of teachers. This is a problem facing the entire United States, and I don't know how it will be resolved.
Remote work
The impact of remote work that began during the pandemic is even greater, especially for recent college graduates. Two friends who started their own businesses both raised this issue with me. The pandemic required isolation, so college students had no opportunity to intern at companies while in school. After graduation they worked remotely, with no experience of working alongside colleagues. As a result, they didn't know how to collaborate as a team, and mentoring them remotely was also very difficult. My friends hired graduates from very good schools (including Stanford) with great potential, but had to let them go because they couldn't collaborate. It's a pity.
Now, major companies are starting to require employees to come to the office, but it has not yet returned to pre-pandemic levels. Years ago, I recruited graduates for the Expedia team, and now one of them is the founder of an AI company. He feels that remote work greatly affects efficiency. During the pandemic, he didn't dare to require employees to come in, fearing that doing so would lead to resignations. Currently, he is watching the rhythm of major companies, and when they clearly require it, he will follow suit. Based on what I have seen at major companies and startups, the number of people coming to the office is still limited. In discussions with friends, everyone's attitudes toward this are not completely consistent. Overall, the larger the team managed by a person, the less satisfied they are with remote work. Everyone feels that things will slowly return to the way they were before, but it's also unlikely to happen overnight.
Here's an interesting observation. Google, Meta, and other major companies are located around Palo Alto and Mountain View, which has pushed nearby housing prices very high, while houses farther away are much cheaper. But because remote work freed employees from commuting to the office, housing prices in these more distant areas have also risen significantly over the past two years.
AI Trend: Preliminary Landscape, Very Early Stage
My focus is on AI. Here is a summary of my observations and judgments about AI in Silicon Valley over the past month or so.
Large models and GPUs
The industry landscape for large models has been preliminarily settled. Unlike the "war of a hundred models" raging in China, Silicon Valley has a handful of winners in the large-model space: closed-source models come mainly from OpenAI and Google (Anthropic also counts), while open-source models include Meta's Llama 2, among others. Because building a general-purpose large model requires enormous manpower, computing power, and funding, the landscape seems basically set, with no new entrants.
GPUs are still in short supply, for major companies and startups alike. Everyone is looking for GPUs. A younger colleague at NVIDIA walked me through the GPU production process, starting from the raw ore. Hardware is not my focus, so my understanding is limited, but it sounds like the shortage stems from the long production cycle: supply is tight in the short term but should be fine in the long run.
AI is still in a very early stage
Speaking of the current situation in the AI field, a friend who is an investor vividly described it. He said that it's like it's still dark, and everyone is shining a flashlight around, looking for direction. It has not yet reached the situation when mobile internet truly took off. I have talked to many friends, including large model developers, small companies using large models, and startups providing product services around large models. The overall judgment is that the application of large models is still in a very early stage.
One example is quite representative. A friend of mine was previously VP of Engineering at a very well-known public company; in recent years she has founded a startup related to e-commerce platforms, with over a hundred employees and backing from several well-known US funds. Her business can make use of large models, and she has recently been exploring how, with two attempts. One was to fine-tune a MosaicML model on her private data. The other was to use GPT-4 with retrieval: store the private data in a vector database, search for the relevant pieces, and put them into the prompt. To her surprise, GPT-4 with retrieval beat the fine-tuning. She is quite confused and doesn't know how to fine-tune effectively: what kind of data is needed, how much, or how to go about it. On top of that, large models are a black box, and she suspects even the people building them may not fully understand this. She added that the experience of using MosaicML was not good, but there were no other tools to choose from. And although GPT-4 performed well, her private data cannot be made public, so it can be used for testing but not for a production product. She feels her existing team's technical capabilities in this area are limited, and she plans to hire an AI engineer to solve the problem.
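Her second approach, putting retrieved private data into the prompt (now commonly called retrieval-augmented generation), can be sketched roughly as follows. This is a minimal toy: the bag-of-words similarity stands in for a real embedding model and vector database, the sample documents and prompt template are invented for illustration, and in practice the final prompt would be sent to a model such as GPT-4.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding"; a real system would use a learned
    # embedding model here. This stand-in is for illustration only.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query, docs, k=2):
    # Rank documents by similarity to the query and keep the top k,
    # standing in for a vector-database lookup.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, docs, k=2):
    # Inject the retrieved snippets into the prompt sent to the model.
    context = "\n".join("- " + d for d in retrieve(query, docs, k))
    return ("Answer using only the context below.\n"
            "Context:\n" + context + "\n\nQuestion: " + query)

docs = [
    "Return policy: items may be returned within 30 days of purchase.",
    "Shipping is free for orders over 50 dollars.",
    "Gift cards cannot be refunded.",
]
print(build_prompt("What is the return policy?", docs))
```

Part of the appeal is that only the snippets relevant to a given query ever enter the prompt, and nothing needs to be retrained when the documents change, which is presumably why this route was quicker for her team to try than fine-tuning.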
I was a little surprised to hear this, because she is very experienced and the whole founding team has a strong background and solid technical ability. If she is unclear on how to fine-tune effectively, it's easy to imagine other companies are in the same situation. Her comparison (fine-tuning losing to GPT-4 with retrieval) is not an isolated case; I have heard several similar examples. Another friend's startup provides AI tools and services to large enterprise clients. He said that large models are a new technology for big enterprises, and his clients are only starting to consider them. They are particularly concerned about model accuracy, speed, the data quality required, and privacy. Clients are also still exploring which specific business problems AI can solve. He estimates it will take at least 6-12 months before large enterprises might start using it internally.
It is clear that this round of AI is still in a very early stage, with no killer app seen on the consumer side (except for ChatGPT), and it will take time for it to be implemented in the enterprise space. There should still be a lot of room for development in AI infrastructure and tools. For example, Databricks' $1.3 billion acquisition of MosaicML is aimed at quickly establishing AI capabilities to empower customers.
Here I see two positive pieces of information:
Because it is still early and the tools are imperfect, large enterprises have no ready-made technology to pick up, which leaves room for startups. If large companies could adopt it right away (they already have the data and the scenarios), there would be far fewer opportunities for startups. This is a point my Silicon Valley friend Howie raised when we discussed it, and it resonated with me.
Large enterprises are eager to use AI, and at least they are very concerned about it. I learned that many companies have set up special budgets for this round of GenAI. Since the money is ready, even if the development is slow in the early stages, the future of AI is still very promising and not likely to cool down easily.
Why does it feel like AI development has slowed down in the past two months?
I don't know how everyone feels, but compared to the beginning of the year, I feel that the pace of AI development has clearly slowed down in the past two or three months. Why is this? Based on my observations, it is roughly as follows:
It is related to OpenAI's strategy. The pace of this wave has mainly been set by OpenAI, which had been holding back: in the space of a few months it released results (such as GPT-3 and its successors) accumulated over the previous two to three years, which felt overwhelming. After this period of catching up, Google has become a strong competitor, and OpenAI now doesn't dare to release products that aren't ready, as that would be more trouble than it's worth. So there have been no particularly big changes recently, and it may feel slower than before. Actually, I think this is the pace technology should have; it was never really that fast to begin with.
Entrepreneurs are busy building. I gave a lecture in the AI community in Silicon Valley and talked about this issue. The feedback from the community was that at the beginning of the year, entrepreneurs were busy attending various conferences, lectures, and meetups to learn and discuss how GenAI works. Recently, everyone has a basic understanding of large model technology and has started to spend time building their own products. From the outside, it seems that it's not as lively as before.
In the research field, papers are still being published one after another, and there has been no slowdown.
The primary market has indeed slowed down
The overall investment pace in the primary market feels somewhat slower. This is mainly related to the overall environment. People feel that the future economic trend is uncertain, and the Russia-Ukraine war has increased uncertainty, affecting people's confidence in investment. In addition, during the pandemic, the government's large-scale stimulus led to many startup projects being valued very high, and they are still in the process of valuation correction. In this larger context, the primary market in the AI field is actually relatively good. However, because it is still in the very early stages, I have observed that apart from projects that are truly competitive in building large models (including character.ai, which is also building large models), it is not easy for other AI startup projects to raise funds, and many investors are in a wait-and-see mode.
Exploring Major Companies: OpenAI, Google, NVIDIA
In this wave of the AI trend, OpenAI & Microsoft, Google, and NVIDIA have become the trendsetters of the era. Three of these companies are based in the Bay Area, so I made a point of learning about them; here is what I can share.
OpenAI
OpenAI is very concerned about information protection, and its employees are also very sensitive to this. I don't have much information, but a few points left a deep impression on me.
Everyone who has worked with OpenAI mentioned that its employees are very capable and highly efficient. Their system performance and monitoring are particularly well done, and their engineering is very strong. Perhaps their infrastructure engineering (using hardware more efficiently and squeezing out performance) is a core moat for them.
OpenAI is committed to AGI, and I truly realized this after discussing it in detail. They prioritize their work based on whether it can help the development of AGI. If it can better train models and help models learn, they will do it; otherwise, they won't spend effort on it. For example, they previously worked on robots, but they felt that they were greatly constrained by the actual physical world and had limited help for AGI, so they stopped. Based on this, it is highly probable that they will not go into vertical fields.
Before ChatGPT appeared, users had no sense of how capable LLMs were; making that capability tangible to users is very important. In addition to AGI, ChatGPT and the API are also priorities for OpenAI.
Google
Previously, Google moved relatively slowly on AI, and beyond conflicts with its advertising business, this was related to two incidents. One researcher claimed that large models were conscious and was fired. Before that, a Black female researcher clashed with Google over a rejected paper and left amid a public dispute. These incidents made Google very cautious about AI and slowed its progress.
Google has always considered itself a leader, until ChatGPT appeared, putting a lot of pressure on Google. In December, the company started a code red internally (highest priority), which is quite rare. Now, the company places a lot of importance on GPT, with a dedicated team working on GPT (DeepMind and Google Brain merged), and encourages other teams to use AI as soon as possible. I have many friends at Google, and in our conversations, they have confidence in Google and feel that at least in this aspect, Google will not fall behind.
NVIDIA
In this wave of LLM, NVIDIA has become the biggest winner. I haven't paid much attention to this company because my experience and interests are in software. This time, I took the opportunity to learn more about it and found it very interesting, so I'll share some insights here.
A One-Person Startup
NVIDIA's style, in summary, is a one-person startup run by its founder, Jensen Huang. Friends who work there admire Jensen enormously, and from what I've heard, he is like Superman. Jensen has believed in computing since 2012 and has stayed the course regardless of the stock price, never wavering. He has a very deep understanding of the technology, knows the actual state of projects, and is approachable. If something can't be decided, everyone goes to ask Jensen, and he makes decisions quickly and well.
Jensen is very compassionate. For example, when the pandemic first started, the company usually did employee evaluations in September, but he decided to do them early. As a result, the entire company completed evaluations and received raises and bonuses in March, allowing everyone to receive their money early. At the same time, Jensen has insight and a sense of crisis, making him very popular among employees. Even when the stock price was not good before, employees had very high opinions of him.
Emphasis on Technology, Flat Organization
Its company culture is significantly different from other companies I know. Even at nearly 30,000 people, NVIDIA still has no dedicated people managers. The company emphasizes technical capability, and managers at every level have very strong technical skills.
The organization is flat. It seems that only Jensen has an assistant in the entire company, and no one else does. I asked how they handle team building and such. My friend said the company doesn't have team building activities or Christmas banquets, only company-wide meetings. At these meetings, Jensen speaks off the cuff for two hours, and he's a great storyteller. After he finishes, many employees go up to take photos with him.
NVIDIA Ecosystem
I've heard that NVIDIA has a good ecosystem, so I specifically asked what this refers to, and my friend explained it quite clearly:
Providing comprehensive tools. The chip stack is a deep stack from bottom to top, requiring various supporting tools, including compilers, debuggers, profilers, and more. The needs of developers vary, for example, some want to do deep optimization, so simply packaging the functionality into an API is not enough.
Fast, easy-to-use systems.
The company's internal and external communication is very good. For example, the company has a team responsible for communicating with customers, and they also have a very good understanding of internal technology. When customers have any needs, they discuss them directly with the internal development team early on. The same goes for internal communication. The software team closely collaborates with the hardware department, not waiting for the hardware to be ready before developing the software, but interacting and cooperating in a timely manner during the process.
Internationalization of Chinese Enterprises
Changes in US-China relations are closely related to Silicon Valley. This time, I noticed two obvious changes. Entrepreneurs are more focused on market selection, either focusing on the US market or the Chinese market, with very few people balancing both. Some good entrepreneurs and funds from China are also looking for new opportunities here.
How Chinese companies can internationalize is a question of general concern. I attended a closed-door salon over the weekend, and the main topic of discussion was this. I found the guests to be quite representative: there was a Chinese CEO of a publicly traded company doing business in the global market, a fund partner focusing on investing in Chinese companies going global, and an entrepreneur managing teams in both China and the United States, and I was also one of them. Everyone shared a lot of insights. China has advantages in R&D costs, a complete supply chain, internet product operations, and diligence, but going global presents completely different challenges, involving market sales, products, team culture, and management, among other things. One point of consensus among the guests was that for international business, the founder's mindset must first be internationalized.
My reflections are more about what was discussed. I am not unfamiliar with the topic of internationalization; many years ago, we discussed how American companies could expand into China. Now it's the other way around, discussing how Chinese companies can enter the international market. The center of the world is changing, and after years of effort, China has indeed become much stronger, which makes me proud.
The Fluidity of Silicon Valley
I have always been very envious of the talent resources and the atmosphere of free exchange in Silicon Valley. The density of talent here is high; I often find out that someone I'm chatting with is an alumnus of Tsinghua University. In my undergraduate class of 30 people, six are here. I attended a barbecue party organized by a good friend over the weekend, and in casual conversations, I found that several of them are quite impressive. When I asked a little more, I found out that they are successful individuals who keep a low profile.
Because it's Silicon Valley, the entrepreneurial spirit has always been prevalent. Along with it come various lectures, forums, and more. When I first arrived, a friend gave me a Google doc with a long list of AI offline activities in San Francisco, almost every day. It's not convenient for me to go into the city, so I only selectively attended a few times. Later, I searched and found various online webinars and community discussions on topics that interest me. After becoming familiar, I found that there are also many activities in the Bay Area. Whether online or offline, the quality of these activities is generally very good, with core members from major companies or top startups, young entrepreneurs, and a high density of updated and sincere information on cutting-edge technology. I have always enjoyed learning new things, and I am very excited to be here.
Silicon Valley is fluid, with flowing talent, information, and funds. This fluidity brings vitality and innovation, making every day change and feeling forever young.