Today I spent some time reading the Mary Meeker report on AI. I think the report puts a lot of data together and comes up with some fascinating results. There are several key points that I feel are important.
Still early days in the AI revolution; the end game is nowhere in sight, and some players will likely fail. In fact, I feel uneasy about our position in Google. I fear the traditional way of getting information and getting things done will be challenged in a fundamental way. Google is in a very awkward position. The only bright spot is that Gemini and TPU are helping Google save some face. But still, I feel the technology barrier is far from firm.
User growth and usage growth are unprecedented. It's amazing to see that the services are gaining traction all over the internet. The old playbook from the Internet days may continue to work, that is, to play the scale game and burn some cash before the economics make sense. In that regard, I feel OpenAI is in a terrific place.
I still feel that the revolution in computer technology will bring even more glory to NVIDIA. The investment logic remains intact. After this report, people should realize that investments in the AI space won't stop at the current scale. More capex and R&D dollars will be spent as several technology companies compete for the future. This is THE most important game at the moment, not to mention the technology competition between nations.
AI agents are the next topic we need to be laser-focused on. So far, it seems that only coding has proven its usefulness; clearly companies can see the productivity boost from adopting AI. But the next step could be a lot more automation across various applications. I still need to find a way to study the players in this arena more.
MCP is just a protocol. I spent quite some time on it. I believe the protocol will become popular in the space, mostly because it's easy to adopt. But the impact will be limited, as it is only middleware - useful, but not much more.
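To remind myself what the plumbing actually looks like, here is a minimal sketch of an MCP-style tool call, assuming the JSON-RPC 2.0 framing the protocol is built on; the tool name and arguments below are made up for illustration.

```python
import json

# A minimal sketch of an MCP-style tool call, assuming JSON-RPC 2.0 framing;
# the tool name and arguments are hypothetical, not from any real server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_filings",                       # hypothetical tool a server might expose
        "arguments": {"ticker": "NVDA", "form": "10-K"},
    },
}

print(json.dumps(request, indent=2))
# The server would answer with a JSON-RPC result carrying the tool output,
# which the host model then folds back into its context.
```

Seeing it written out reinforces the point: it's a thin message format, easy to adopt, but the value still lives in the model and the tools on either side of it.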
What's the implication for technology investments?
Continue to focus on leading players that enjoy dominant positions. For now, let's keep focusing on AI hardware.
Pay a lot more attention to analyzing new business models related to AI agents and other AI applications. Pay special attention to the rate of adoption; financial performance at this stage is a secondary consideration.
The supply chain and AI hardware components are less important to me. Initially they may present some explosive opportunities, but over time it's hard to keep finding revolutionary ideas in upgrading the AI system.
Stay open-minded. As the report shows, this is like the early days of the internet bubble era. The real winners have not surfaced yet; it could be any one of them. We need to bet on some of them when the time comes.
Overall, I feel we should remain cool-headed and wait for more golden opportunities to come.
There are several topics that I want to spend more time on:
Several AI-related companies, including AI cloud companies like CoreWeave and software companies like ServiceNow.
Cryptocurrency and stablecoins.
AI agents, including related technology and major players - or is it just hype?
Super Micro. The company seems a bit shady, being a computer manufacturer with relatively thin margins. But it has made it work, with a great growth trajectory and a competitive stance in the market. With hundreds of billions of dollars of additional business to come in the AI server market, Super Micro could become a juggernaut just like Dell and HP.
Don't forget, the key purpose is to make money. So the focus is still to find better investment targets. At the same time, try to create something.
Technology + Manufacturing: I feel I have been looking at a lot of Asian companies that are high-tech manufacturers. These companies face a multitude of risk factors, from the embedded cyclicality of the tech industry to the heavy capital requirements typical of the manufacturing space. They are often stuck in between. There are so many value traps; looking only at P/E multiples to determine valuation would be vastly misleading.
An interesting page from the report:
Technology disruption has a long-repeating rhythm: early euphoria, break-neck capital formation,
bruising competition, and – eventually – clear-cut winners and losers.
Alasdair Nairn’s ‘Engines That Move Markets’ (link here) distills two centuries of such cycles,
and his observations are prescient for today’s AI boom.
Highlights of his observations follow…
There were several years of strong share-price growth when the railways were supplanting canals.
The bubble of the 1840s deflated under the weight of overheated expectations and changing economic conditions…
…Any technological advance which requires huge capital expenditure always runs a real risk of disappointing returns
in the early years, even if it is ultimately successful...
…Any technology that necessitates heavy capital expenditure and requires returns to be earned
over an extended period is always going to be a high-risk undertaking –
unless, that is, there is some form of protection against competition...
…The winners of these competitive struggles are not always those who have the best technology,
but those who can most clearly see the way that an industry or market is likely to develop…
…One of the clearest lessons of corporate and investment history is that without some barrier to entry,
first-mover advantage can be swiftly lost…
…A theme that recurs throughout this research is that while identifying the winners from any new technology
is often perilous and difficult, it is almost invariably simpler to identify who the ‘losers’ are going to be.
Another interesting question is the other part of the equation, which is the data part. Is the training more important, or are the datasets? Some interesting names include Scale AI and VAST Data. NVIDIA is building the brains for AI, while these companies are feeding them the data.
Today I'm working on the NVLink vs UALink topic. It turns out that UALink is an open standard promoted by NVIDIA's competitors. I gave the topic to an AI engine to analyze. While the AI's report takes a rather neutral stance, I'm inclined to believe that NVLink will continue to dominate. Where do I get the confidence? It's because NVLink is from NVIDIA. It's not an average company; it's a company forged in fierce competition, rising from a low-margin, cut-throat market. It's a company led by a Chinese American CEO and backed by the Asian supply chain. It's a formidable opponent. Given these factors, I have to assume that NVIDIA can continue to lead in this competition. After all, UALink is still only on paper. Its backers still need to put everything together and make the whole solution attractive, with better performance per dollar than NVIDIA's counterparts. That seems near mission impossible to me.
For the other part of the day, I was working on the agentic AI topic. Several leading companies are working in the field from different angles. To me, Salesforce, SAP and ServiceNow are clearly leading the pack with their out-of-the-box solutions. I think these applications have a lot of potential, and I agree with what the AI report is saying: end users will start with out-of-the-box solutions, often from their existing software vendors, and then gradually customize with more configuration. Eventually, enterprises may want to build their own versions of agents. After the research, I feel that the overall market demand has not arrived yet. These applications still feel primitive to me. It's like the early days of the Internet: there is a lot of potential, but the real tide has not come yet. That gives me the feeling that computing power demand from these agentic AI applications is still not strong. We need to talk with people in the industry to understand a bit more about what is really going on.
Unfortunately, I haven't spent much time looking at any interesting business today. I feel some interesting companies could be 1) ServiceNow, 2) Salesforce, 3) SAP. All three are leading in the general agentic AI competition. It's still the early days, but these incumbents already show their willingness to reinvent themselves.
More importantly, I need to look for potential new business models and rising companies. We should pay special attention to small companies with new business models. It's true these companies won't have much margin of safety and it's hard to determine their intrinsic value. But the bet can be attractive enough, and they may generate outsized returns for brave investors. This time could be different for certain companies.
Yesterday I studied NVLink Fusion. Now I can see my earlier logic may have a problem.
Here is the breakdown of the BOM cost for an NVL72 server rack. NVLink switch ASICs are only about $3k-4k each, but they are the backbone of the system. If customers could use NVLink switches to reconfigure their own systems, that would open doors for many wonderful things. But not everything is that easy. End users may need to reinvent a lot of things in the process: figuring out how to put the system together, how to verify the different components, and how to design and build the rack with technology partners. None of these tasks is easy or straightforward.
| Major line item | Qty in NVL72 | Unit-cost assumption | Extended cost |
| --- | --- | --- | --- |
| Blackwell B200 GPUs | 72 | ≈ US$35,000 | US$2,520,000 |
| Grace CPUs | 36 (1 per GB200) | ≈ US$1,000 | US$36,000 |
| NVLink 5 switch ASICs | 9 trays × 2 chips | ≈ US$3,750 | US$67,500 |
| Liquid cooling (cold plates, manifolds, CDU, pump) | — | — | US$70,000 |
| Power supplies & bus bars | — | — | US$30,000 |
| Misc. chassis, cables, backplanes, fans, control logic | — | — | US$17,000 |
| Total rack BOM | | | ≈ US$2,740,500 |
Source: HSBC Global Research “NVL36/NVL72 server rack pricing power” note, 10 May 2024
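As a quick sanity check on the table, here is a small sketch that re-adds the line items; the unit costs are the HSBC assumptions quoted above, not my own estimates.

```python
# Re-add the NVL72 BOM line items from the table above; unit costs are the
# HSBC assumptions quoted in the note, not independent estimates.
bom = {
    "Blackwell B200 GPUs": 72 * 35_000,
    "Grace CPUs": 36 * 1_000,
    "NVLink 5 switch ASICs": 9 * 2 * 3_750,   # 9 trays x 2 chips per tray
    "Liquid cooling": 70_000,
    "Power supplies & bus bars": 30_000,
    "Misc. chassis, cables, backplanes, fans": 17_000,
}

total = sum(bom.values())
print(f"Total rack BOM: ${total:,}")                      # -> $2,740,500
switch_share = bom["NVLink 5 switch ASICs"] / total
print(f"NVLink switch share of BOM: {switch_share:.1%}")  # ~2.5% of cost, yet the backbone
```

The total ties out to the report's ≈US$2,740,500, and it makes the earlier point concrete: the switch silicon is a tiny slice of the cost but sits at the center of the system.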
Net-net, the conclusion may continue to hold: NVIDIA is still the dominant player in town at the moment, because it enjoys the first-mover advantage and controls the technology roadmap. Too many things depend on it to perform, and hardware is only a small part of the overall puzzle. It's questionable whether the CSPs and enterprises have a true incentive to invest in building the hardware themselves. After all, the economics of software and of their real-world businesses are more attractive for most of them.
I don't know why I missed the entry yesterday, but there is no excuse.
I'm not feeling well today, and my concern is myocarditis. I asked Ariel to help schedule a doctor appointment this afternoon. The symptom may be very straightforward - a persistent mild, dull discomfort, which became more intense today.
I had a meeting with my boss today and talked about the AI-related research I have done. I don't believe all of it is very relevant. To me, a lot of this study seems quite shallow - you don't need to understand things very deeply, but you certainly need to spend time reading the news and getting the basic understanding correct.
After all, making investments is all about getting the basic understanding correct. What are the topics to look at?
1) Major announcements from leading tech companies, Google, Meta, OpenAI, Anthropic, etc.
2) Major cloud computing technology companies.
3) Major enterprise software companies.
Of course, the largest markets are those related to consumer-facing technologies. If another Google or Apple is rising from the current AI revolution, it's going to start showing somewhere now. The key is to figure out which company will be the winner.
Some remaining topics for this week:
Energy and AI report
NVLink and the plumbing
Google TPU vs GPU systems
OpenAI lawsuits
China rare earth
Other topics: Manus, Google IO conference, Google AI mode, Apple developer conference, etc.
Arrange some meetings before the China tour, if any.
We need to do deep research on Super Micro. Why it is interesting:
Small company with a $25 bn market cap.
Big industry: the computer industry is huge, and the AI server and industrial computing markets are large.
Chinese management, US company, Chinese + American culture, Taiwan/Asia supply chain.
I finished reading a report on energy and AI. Most of the report talked about how to meet the energy demand of building AI data centers. Naturally it contains a lot of interesting data points. Several key takeaways:
Roughly speaking, training a leading-edge AI model costs about half a gigawatt-hour of energy. It's amazing to see the scale of the energy usage. Using the model for inference requires an almost equal amount of energy.
It takes years to plan a data center, and congestion can delay the process even further.
The US accounts for about half of the world's data center capacity, and China is second.
Even with rapid growth from AI data centers, the majority of electricity demand growth through 2030 will not come from data centers; industrial demand still dominates demand growth in the world market.
A detailed breakdown shows that, in addition to the GPU racks themselves, other parts - switches, power generation, cooling systems and other infrastructure - add up to a significant share of the power usage.
Other parts of the report are uninteresting to me: how to use AI in the energy industry, etc.
I also need to do some company research this week. I plan to do more on CoreWeave and need to finish ASAP. From now on, it's important that I form an opinion early on. The current wave of AI build-up has not ended yet. It's a pivot from value investing to technology investing, and the key is to ride the waves. Another trendy company that I'm really interested in is Super Micro. I think it has a lot of potential.
For large sums of money, it's impractical to ride the waves and exploit market inefficiencies effectively. But with smaller sums, individuals have the luxury of using a more aggressive strategy that the big funds can't.
This week's plan:
Study Energy and AI report (almost done)
OpenAI lawsuits follow-up
China rare earth industry study
Apple developer conference
Company research: Super Micro
Company research: CoreWeave
Portfolio company model update
Meta invests in Scale AI; what is Scale AI doing?
In the long term, I plan to study some interesting books on investing. Also, network more in the space. There is a remote possibility that I may team up with someone who is good at quantitative work and come up with an algorithmic way to trade the stocks that I like - basically, combine what I learn from value investing with more quantitatively based ideas to seek optimized results.
To-do next week:
Comprehensive research on OpenAI, Mistral, Anthropic, xAI, including key members.
Major Chinese AI names.
Today I'm doing research on Scale AI. There are some interesting findings related to the work.
While reading the ChatGPT report on the company, I came across a website called Contrary Research. It's a website for sharing research and deep dives on private companies. The content is very high quality. (Link) I think some of the content is a great starting point for my own research. In the case of Scale AI, I appreciate that it lists the product line-up very clearly, with screenshots for each product. One missing piece is a thorough analysis of the competitive landscape. I want to find out the company's market share, which is not mentioned in the report. I would assume that Scale AI has a rather high market share, but this remains to be verified.
Another source I can think of to verify this number is Tegus. I feel there should be some interviews related to this company.
Given that my boss is asking me to cover a wider range of companies in the AI space, I actually have a lot of freedom to study different kinds of companies. I should use this time wisely to explore a larger investment space. No longer confined to value hunting in the technology or manufacturing sectors, I can now look at whatever companies are making waves in the AI and technology space. That's actually a boon to my experience.
Appen, a competitor to Scale AI, is a listed company in Australia. It seems the company has faced two years of declining business and lost contracts from Google. The company's experience may imply that the data annotation business is labor-intensive, low-margin, and cyclical in nature. I need to study its recent financial results a bit more.
I feel this is a dead end. This type of company may not have enough of an economic moat. I need to dig more into the Tegus interviews to understand this better.
I studied a report from SemiAnalysis.com about robotics. (link) I found the report touched on some interesting aspects of the industry. On the one hand, China has installed about half of the world's new industrial robots every year. On the other hand, Chinese-owned KUKA had a market share of only 11.5% as of 2023. What's the meaning behind that? Will China continue to rise to the top?
The next front of the competition could be general-purpose robots. Cobots (robots that work alongside humans) and humanoid robots are examples of what is coming in the future. In these newer fields, Chinese players are showing more promise and gaining share.
A more interesting question is what the implication is for our investments. I don't think we can easily exploit the market opportunities here. I checked some of the Japanese companies mentioned in the report. All of them operate in niche markets, and some are unprofitable at the moment. Compared to the leading companies, they seem to lack interesting angles.
I was asked to go ALL IN on AI. It's a high-stakes game. At this moment, NVIDIA is already dominating the market. I plan to prepare a model of the company.
Remaining work for the week:
NVIDIA model
Portfolio company model update
China rare earth industry study
Company research: Super Micro
Company research: CoreWeave
Following Jensen's Paris trip, there are so many things to learn. He also talked about an "inflection point" for quantum computing. I am not a fan of quantum computing, and I cannot picture how it could be added to the existing computing architecture to help solve future problems. It might serve as some kind of accelerator.
I feel NVIDIA is more than a hardware company. It tries to get into every area that is growing. If customers want direct support for AI-related middleware, NVIDIA will provide that support through its microservices offerings. I need to learn more about these microservices. How big is the market opportunity here? What is the customer feedback on these technologies?
NVIDIA model
Portfolio company model update
China rare earth industry study
Future work:
Company research: Super Micro
Company research: CoreWeave
Quantum computing research
Nvidia microservice solutions
I'm not sure why I was asked to build the model. Maybe we have a position in the stock.
This weekend is Father's Day. It's about the third day since Grandma left for China, and we already feel we have spent tons of time on housework. Neither of us is that natural with it. We have always had the grandmas come to help us, so it's a bit hard to get back to doing the work ourselves. Plus, Ariel is super busy and on call during the week. It's tough, to say the least.
Both kids joined the XF invitational tournament, and both teams failed miserably. The B15 A team lost all three games, with scores of 3:5, 0:3, and 1:5. The B17 B team lost two games and tied one, with scores of 1:11, 1:1, and 1:3. Kyle scored the goal in the last game. Jace was having a bad cold, so he was not at his best. He also took some minor knocks, though there were no cuts. Kyle was improving over the three games. Yesterday I blamed both kids for the poor performance, but I shouldn't have. After all, it's a team sport; the whole team has problems, not just them.
Both Jace and Kyle made me a holiday card. I'm so happy. Jace gave me an unlimited hug-me coupon. Kyle made me one at school, but at the moment he couldn't locate it! I will ask him to find it tomorrow.
Today I'm building the financial model for a company, and I realize I haven't built a model in a long time. I was hesitating between building it quickly in a scrappy way and building a model with Capital IQ formulas to facilitate future modeling work. I decided to build a quick one first, and then focus on constructing a CIQ-enabled template for my future work.
I briefly looked at Vertiv and realized how much the company has grown over the past two years. It reveals a critical point: if we don't build models that force us to think about future growth, we may simply miss the opportunity to ask the proper questions.
In light of that, financial models should be quick and straightforward. They should enable us to ask questions, and they need not be very comprehensive the first time around. Many of the key financial items should be pulled directly from the CIQ database instead of looked up in the original documents. The model should enable a more efficient workflow rather than slow us down with tedious and repetitive work. I will keep that in mind.
This week is the first week that Ariel and I are managing the home by ourselves. It's a test for us. Last weekend was not a success; Ariel was stuck with everything and couldn't spare time to watch the kids' soccer games. What a pity!
Remember the meaning of life is beyond making money. It must lie somewhere else. It's sports, music, scientific research, literature and many other things. Summer is the perfect time to explore these new areas together with family. Don't waste the beautiful time!
I was asked to present in Jace's class and wrap up their year-long stock investment program. I'm so excited! But at the same time, I can feel the pressure. I would like to prepare some questions to start the short session. It should be informative and fun. I will also bring some extra gifts for the kids for their efforts.
I just did the model for NVIDIA. It turns out the revenue and net income numbers are incredibly high for 2029. It seems unreal to me. While it's possible for the global data center market to exceed 1 trillion dollars in 2029, it's hard to justify that NVIDIA can take it all. I will need to think more about these numbers.
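To pressure-test those 2029 outputs, here is a rough scenario sketch; the market size, share levels and margin below are my own placeholder assumptions, not figures from the actual model.

```python
# Rough sanity check on the 2029 NVIDIA output; all inputs are placeholder
# assumptions for illustration, not figures from the real model.
tam_2029 = 1_000e9        # assumed global data center market, ~$1 trillion
net_margin = 0.50         # assumed net margin, roughly in line with recent levels

for share in (0.3, 0.5, 0.7):
    revenue = tam_2029 * share
    net_income = revenue * net_margin
    print(f"share {share:.0%}: revenue ${revenue/1e9:,.0f}B, "
          f"net income ${net_income/1e9:,.0f}B")
```

Even a 50% share of a $1T market implies around $500B of revenue, which shows why the model's 2029 line looks so extreme and why the share assumption deserves the most scrutiny.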
I was sick for several days and skipped the diary altogether. I'm feeling better today, but I'm still recovering. A lot of things happened during the past several days. My 40th birthday was a bit simple, but still warm. We celebrated at home, and Ariel prepared the meal. The kids had to go to soccer training, so we didn't sit down until near 8pm. Then Kyle was upset and cried because he couldn't get his birthday card ready for me; he was afraid that we would be disappointed in him. In the end we didn't manage to get a picture taken, and after the birthday cake it was already close to 10pm. What a day.
On Thursday I called in sick and actually spent half the day sleeping. The kids went to the soccer clinic in the morning, and we went to the Sounders vs Atletico Madrid game at Lumen Field in the afternoon. It was an awesome experience; most kids from the soccer team came. The whole home-game experience was totally new to us. This reminds me of one thing: there may be no need for another game in London. But, at the same time, the whole soccer experience could be very interesting. Maybe we should do it!
On Friday I tried to work at home, but productivity was really low and I didn't achieve much. This week I managed to finish the NVIDIA model, which is the only thing I have accomplished.
Today we had a lot of fun at home. The weather was poor, and we all took some time just to rest. We played a lot of chess and also played card games. I signed Kyle up for the Grand Knight Chess Academy. I'm sure he will enjoy the experience there.
Today is a very happy day, full of exercise (kids only), some gaming time, reading time, poker games, and good meals (thanks to Ariel!). I read some pages from the chess kid magazine and read some news about work. Overall, it feels very fruitful and relaxed. What's missing is some conversation with friends. I did call Shi Guangjian, whose birthday it is today, China time. It's been several years since we last met.
Both kids are growing in a way that I like. After spending time in Jace's classroom, I feel especially encouraged to learn that he is happy with the environment. I believe the public school can be a good alternative for him for the moment. On the other hand, it's worth noting that public schools do have lower standards on a lot of things, and parents need to pay extra attention. That may not be a bad thing. We were used to the private school setting, until we realized it's still our job to maintain a high standard for our kids.
I would encourage them to explore more areas in life. What kinds of citizens, employees and creators does the future need? People who are able to mix all their skills and emotions together, to solve problems and to influence others. I think these are the people that society demands.
There are always all kinds of problems that society needs people to solve, and there are teams of entrepreneurs who will work on them. Be prepared for these challenges, but start humbly with little things. Equip yourself with enough technical skill and a smart mind, and at the same time with an open mind. That's the way to go.
This week I plan to do more research at work and more family planning. Don't stress out at the same time. Make everything happen just like life flows by...
Some of the things to research on: 1) embodied AI, 2) optics in AI data centers, 3) financial modeling of Broadcom, 4) mid-year estimates and review.
Some of the house planning work: 1) kids summer camp, 2) flights to and back from San Diego, 3) UK trip hotels and all kinds of tickets.
Since the chance of getting the Chelsea game is low, we might think about going to another game instead. I will ask Ariel for her opinion. I think we should use backup plans to prepare for different scenarios.
So I did a lot of family planning work yesterday and got really tired. I want to do better to have more quality time with the kids.
One thing we can think of is a pet. This may be a good time to introduce a pet to the house, given that the kids are older now.
Some of the things I need to work on include:
1. Review the videos from AI Startup School last week. They are extremely valuable; I particularly like the one by Andrej Karpathy. Make a compilation of the videos and be ready to talk about them.
2. FIND some STOCKS! I need to find names to study and want to get my hands on some interesting companies. Still, I want to start with Broadcom. Make a quick and dirty financial model of the company.
It's good news that the apartment in Nanjing got sold. The price is about 2 million RMB lower than several years ago, roughly 300k USD lower. The opportunity cost is also huge; we could have made a lot of money if we had invested in the stock market in recent years.
What's considered a good quick-and-dirty financial model? What pages should it include? I think we should rethink the template we are using and strip it down to the extreme. The current version I'm using follows my old team's template, slightly reformatted for the cover page, but with everything else left almost intact. What we need is a newer version that includes just the cover page. Maybe we only need one page. That's enough.
The purpose of the model is to help with further investigation. It should also serve as a starting point for forecasting and help with segment analysis of the business operations. Everything should be part of the model. There is simply no need for very complicated pages. We should try to reinvent everything to serve my current workflow.
Eventually, I should try to use AI to come up with something. It should be doable.
Or I could go with the old way: create the template and generate a one-page report like before. That way I can quickly update the tables and generate reports to scan and study quickly. That may be the way to go.
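As a starting point for that one-page idea, here is a bare-bones sketch of what the minimal template might hold; the segment names, growth rates and margin are placeholders, and a CIQ pull would eventually replace the hard-coded history.

```python
# Bare-bones sketch of a one-page model: last-year revenue by segment (to be
# pulled from CIQ rather than hard-coded), a growth assumption per segment,
# and a simple operating forecast. All numbers are placeholders.
history = {"Segment A": 10_000, "Segment B": 4_000}   # last-year revenue, $M (assumed)
growth = {"Segment A": 0.30, "Segment B": 0.10}       # assumed segment growth rates
op_margin = 0.35                                      # assumed operating margin
years = 3

for year in range(1, years + 1):
    revenue = sum(history[s] * (1 + growth[s]) ** year for s in history)
    print(f"Year {year}: revenue ${revenue:,.0f}M, "
          f"operating income ${revenue * op_margin:,.0f}M")
```

The point of keeping it this small is exactly what the one-page idea is about: enough structure to force the growth and margin questions, nothing more.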
Y told me that European stocks are rising this year, outperforming most markets. Should I spend more time studying stocks in that region? I think not at this moment. We should focus on what we know best and continue to squeeze more from the cards we already have in hand. Try to dig deeper in the technology space. After all, this market has time and again shown great potential to generate outsized returns for knowledgeable and brave investors.
I stayed at home today and spent quite some time looking for a new office chair and submitting Global Entry applications for the kids.
I plan to look at some AI-related development news and try to understand more about the new hype around all kinds of technology advancements. From the UBS report that I read today, it seems that the enterprise market has not arrived yet; I need to check more on that. Overall, Copilot, ServiceNow, Adobe, etc. haven't reported strong results yet. The initial adoption is concentrated in the consumer space, where adoption feels natural. There are still hurdles related to data and reliability in the enterprise market; for the consumer market, these are less of an issue.
On the AI development side, there are several moving parts to study: one is the technology, one is what other companies are doing, and one is the major startups and new business models. I feel the last one is probably the most important to our work. I don't want to spend a large amount of time learning the technology; it's time-consuming and not all of it is relevant. But I found Andrej Karpathy's talk highly insightful. This is the beginning of a new era for software engineering. It's just different.
Several new ideas:
AI is still at an early stage. Especially for enterprise applications, it's nothing compared to today's software market.
Silicon Valley saw the growth of dozens of SaaS unicorns. AI will be the next wave.
Workflows need to be redesigned around new AI tools. So it takes time.
Training still accounts for much of the GPU workload.
Inference will need more tokens as models spend more time thinking.
GPUs will continue to dominate over the next several years, but custom ASICs should grow faster.
Today I continued to work from home. I was looking at the AI data center build-out and studied a bit about the OpenAI Stargate project. I listened to the company sharing session by Frank. In the afternoon, I went to the airport and picked up Mom and Dad. They are in very good spirits. The apartment got sold eventually, and Mom feels so relieved that she won't lose sleep over it anymore. I plan to take the kids away from home for most of the weekend, so both of them can get plenty of rest while adjusting to the jet lag.
In terms of AI technology, I think agentic AI is a huge topic, and there are different ways to implement such systems. I should spend some time learning the basics of agentic AI technology and allow myself a bit more time on this topic.
The kids went to Aiden's house, so I took Ariel out for a half-day of shopping. I talked about our investment portfolio and said I aim to achieve a 10-times return within 10 years. That's my current ambition. It seems too aggressive, but from what I have observed over the past few years, it's highly likely we can achieve it.
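For reference, the compounding math behind that target, written out as a quick check: 10x over 10 years works out to roughly a 26% annual return.

```python
# Annual return implied by a 10x target over 10 years.
target_multiple = 10
years = 10
cagr = target_multiple ** (1 / years) - 1
print(f"Required CAGR: {cagr:.1%}")   # -> about 25.9% per year
```

Sustaining ~26% a year for a decade is well above what most markets deliver, which is exactly why the plan leans on contributions from salaries and other means, not returns alone.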
I was meant to become an engineer, yet somehow I was pushed to become an analyst at a hedge fund. Ten years ago, when I was having lunch with Sunyoung and her husband Steve, I said my dream was to become a hedge fund analyst. Ten years later, I became one. Now it's time to think about a new dream: to become really rich and a successful hedge fund manager.
How to get there? There are several elements that I need to work on:
Networking. I need to get to know people who are also in the field and learn from them. Also try to find business partners and build up personal credibility.
A successful track record. Try to achieve 10-times growth in 10 years' time, including our own salaries and other means. That's likely achievable.
Knowledge about everything related: about a lot more companies, about the investment world, and about the way businesses are run.
Now that I start to think about it, it might be a good idea to move our home to the Bay Area. I know the tax burden is real, but the access to so many people and so much technology is very appealing. Then again, can we do the same job from Seattle? Maybe we can.
Today our company had a BBQ party in a park on Mercer Island. While paddleboarding, I fell into the water with Ariel. It was super fun! I was not wearing a life jacket, but I didn't have much difficulty getting back on the board. It was such an interesting experience. It makes me feel that living here has its charm. Maybe we should stay.
I'm writing the half-year review. For the first half, I generally feel it was a success. I learned a lot of new technologies, studied a number of companies, and presented these ideas to the firm. Personally, we made a lot of money. One mistake in our personal investments was that I was carrying some leverage before the April market crash, which led to a significant decline in net asset value. From this mistake I think I learned something: reduce leverage when the market is performing strongly. I'd rather keep some ammunition in case the market crashes again.
Here are my reflections on the first half-year's work:
**Guiding Principles (set at the end of last year) and Ongoing Priorities**
**Focus intensely and build knowledge for the long run;** aim to raise the firm’s expertise to the industry’s highest level.
**Sharpen market awareness and response speed** and report promptly.
**Cultivate relationships around you;** give more than you take and reduce mental friction.
**Prioritize U.S. travel** to boost efficiency—attend industry conferences rather than investor events; visit the Bay Area monthly to nurture long-term ties with IR teams and industry contacts, acting with perseverance.
**Save work hours by zeroing in on core issues and companies,** and leverage AI to raise efficiency.
**Sustain company culture through proactive action.**
**Challenge yourself to exit the comfort zone** and fixed thinking; be imaginative in business and investment—form bold hypotheses and verify them cautiously. Broaden your view across related industries.
**Adopt open-ended learning and build a powerful personal network.** Make yourself a hub of knowledge and connections through active sharing and value creation.
As part of my learning about AI software technology, I came across this video by Andrej Karpathy giving an overview of LLM technology. I believe LLM technology is the starting point for understanding the AI software revolution. The video is very comprehensive and accessible to the general public.
Deep Dive into LLMs like ChatGPT
Key insights to me:
LLM technology is still rapidly evolving. A lot of the know-how lies in the post-training and reinforcement learning stages, so leading players have to keep investing in the technology.
When solving complex problems, it's ideal to use more computing resources to form chains of thought, so computing needs are higher for these problems. There are still a lot of challenges, and the industry is still exploring. Enterprise applications may need time as LLM technology continues to mature.
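A toy illustration of why the chain-of-thought point matters for compute demand; the token counts and per-token price below are made-up assumptions, just to show how reasoning traces multiply inference cost.

```python
# Toy illustration of how chain-of-thought inflates inference cost; the token
# counts and per-token price are made-up assumptions for illustration.
price_per_1k_tokens = 0.01      # assumed output price, $ per 1,000 tokens
direct_answer_tokens = 300      # assumed length of a direct answer
reasoning_tokens = 5_000        # assumed hidden "thinking" tokens for a hard problem

direct_cost = direct_answer_tokens / 1_000 * price_per_1k_tokens
reasoning_cost = (direct_answer_tokens + reasoning_tokens) / 1_000 * price_per_1k_tokens
print(f"direct: ${direct_cost:.4f}, with reasoning: ${reasoning_cost:.4f} "
      f"({reasoning_cost / direct_cost:.0f}x the compute)")
```

Under these made-up numbers, a single hard question consumes roughly 18x the tokens of a direct answer, which is the mechanism behind the earlier note that inference will need more tokens with more thinking time.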
I watched some chess games on YouTube and suddenly felt the urge to improve my chess skills again. Let's spend some serious time learning chess this time. After all, today is Kyle's birthday.