SITALWeek

Stuff I Thought About Last Week Newsletter

SITALWeek #384

Welcome to Stuff I Thought About Last Week, a personal collection of topics on tech, innovation, science, the digital economic transition, the finance industry, and whatever else made me think last week.

Click HERE to SIGN UP for SITALWeek’s Sunday Email.

In today’s post: a deep dive into the market's apparent misunderstanding of AI like ChatGPT and its potential impact on businesses like web search; using CRISPR to protect plants and improve the food supply chain; the daunting task of upgrading the grid; the uncomfortable lesson in voice actors being replaced by AI; and, Indiana Jones and the Mexican Pizza.

Stuff about Innovation and Technology
Gene-Edited Critters
The genetic modification tool CRISPR may prove useful for combating agricultural pests, according to MIT Technology Review. In one case, the glassy-winged sharpshooter (a type of leafhopper insect) is being modified to reduce its spread of disease-causing bacteria to grapevines. Thanks to recent genetic sequencing efforts, scientists identified a carbohydrate in the insects’ mouths that makes the bacteria more likely to stick. They want to use CRISPR to disrupt the carbohydrate to reduce bacterial adherence. Other gene-editing programs involve sterilizing fruit flies to dampen the populations of these crop-damaging pests. Most of the efforts are in development or just entering testing and trials, with the permitting process for commercial use still unclear. Fish are also the subject of gene-editing studies. Forty percent of farmed fish die before harvesting, but it’s possible to make them more resilient to disease by inserting infection-fighting alligator genes. Other efforts underway include making fish bigger and stronger. Obviously, once deployed, these genetically edited organisms will be existing within a complex adaptive system, and small changes may cascade chaotically forward in time. Caution is merited, but the long-term rewards could be large.

Grid Strain
Looking at Palo Alto, CA as a microcosm for EV and heat pump adoption reveals the humbling hurdles to upgrading the electrical grid, according to IEEE. With 3,500 home charging ports, 1,000 public charging ports, and incentives for heat pumps and solar-battery installations (to meet a goal of 80% reduction in emissions by 2030), the grid in some spots can’t even handle adding a single EV charger. The challenge is to balance peak loads with bi-directional grid activity to avoid blowing transformers. Having multiple Level 2 home EV chargers on one transformer can reduce its life expectancy from 30-40 years to just three years. A lot of work needs to be done on grids everywhere to have a shot at progressing toward green goals. Palo Alto, which owns its utility, estimates $150M and 5-7 years (including planning) will be needed to modernize its grid. It’s mind-boggling to imagine the entire electrical grid needing such dramatic upgrades.

Giving Voice to AI
The AFLAC duck was never as good after the company fired the late, great Gilbert Gottfried. Sometimes a voice is so unique that no human substitute will do. AI, however, does not suffer similar limitations. Increasingly, voice actors are being asked to yield the rights to their voices for recreation by AI. Often, the company that owns the intellectual property for a character also owns the rights to its voice, so it’s not uncommon for voice actors to be replaced proactively or due to illness/death. I think what makes people uneasy about this idea is that many of us have jobs where part (or all!) of what we do could be learned by AI, and we are potentially signing our rights away with every mouse click, spreadsheet input, email, Zoom call transcription, etc., that could be used to train our AI replacements.

AI-Search
Google lost well over $100B in market value last week on the heels of Microsoft announcing a trial integration of the large language model (LLM) ChatGPT with Bing search. Regardless of whether the stock market's fears prove right or wrong over time, it appears to me the market does not correctly grasp what LLMs are and how they might be used. Here, I’ll spend a little time explaining how I would monitor what might happen with search and LLMs in the coming years. Web/mobile search is a utility. Utilities are about speed, accuracy, and results – getting the most information you need in the least amount of time. In a previous whitepaper (2019), I classified digital consumer businesses across three spectrums: utilities, communications, and entertainment. Regarding utilities, I wrote: 
Internet utilities create the highest value - Google Search and Amazon Prime are the best examples - these are products designed for you to spend the least amount of time possible and get the best outcome. For example, a web search should immediately give you an answer to a question - and this phenomenon accelerates with conversational voice assistants. And think about Amazon’s Prime ecommerce business - you want to quickly find the product and have it delivered as fast as possible. Data-driven utilities are highly monetizable with advertising and fees, and nearly impossible to break down once their network effects are established...

As longtime readers know, I am beyond enthusiastic about the potential for LLMs, but the effort to integrate them into a utility-like search may be years off due to their current slow speeds, often dated information, and propensity to hallucinate incorrect answers. These characteristics make them the opposite of utilities under my definition above. Microsoft’s integration of ChatGPT into Bing is limited in scope and only available to a small number of users. Further, one of the reasons Google is slowly rolling out a more limited AI search assistant is that it “requires significantly less computing power”. Speaking to the FT, the founder of the AI company Perplexity (and former OpenAI research scientist), Aravind Srinivas, noted that search assisted by LLMs like ChatGPT could cost seven to eight times as much as a normal search query. Google has long said that around 15% of the queries it receives are novel, so eventually this cost multiplier may be limited (with a large percentage of LLM queries having indexed answers that require little power to serve). Alternatively, the use cases could broaden so much that novel queries become the norm, and models will need to move from annual to real-time training. I first noted the arms race for LLMs in January of last year, and, for now, it appears the early winners might be the arms suppliers – i.e., manufacturers of chips, including processors, GPUs, and memory. Google may have an advantage here in that they invented transformer models in 2017 and they design their own custom chips (TPUs), whereas the rest of the market largely relies on GPUs that tend to consume more energy than custom silicon. Google’s AI efforts are vertically integrated with their own engineers and chips, while Microsoft is operating as an infrastructure provider in a complicated relationship with the startup OpenAI, using merchant silicon like Nvidia's GPUs.
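To make the cost point concrete, here is a back-of-the-envelope sketch (my own toy arithmetic, not figures from Google or Microsoft) of the blended cost per query if only novel queries require a full LLM pass, using the ~15% novel-query share and the 7-8x per-query cost multiple mentioned above:

```python
# Toy model: blended cost per search query, relative to a normal query,
# assuming novel queries take a full (expensive) LLM pass and the rest
# are served from cached/indexed answers at roughly normal-search cost.

def blended_cost_multiplier(novel_share, llm_multiplier, cached_multiplier=1.0):
    """Weighted-average cost per query relative to a normal search query."""
    return novel_share * llm_multiplier + (1 - novel_share) * cached_multiplier

# ~15% novel queries; midpoint of the 7-8x LLM cost estimate
print(blended_cost_multiplier(0.15, 7.5))
```

Under these assumptions, the blended multiplier works out to roughly 2x a normal query rather than 7-8x, which is why caching the answers to repeated queries could blunt much of the cost disadvantage.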
Examples of this vertical integration in practice at Google are DeepMind’s RETRO and Chinchilla, which could make LLMs highly economical in short order, giving Google an edge. Google has also been applying transformer models to search with great results going back to their BERT model in 2019 (first discussed in SITALWeek #220). 

Tools like ChatGPT may not play a notable role in web search, at least in terms of how we use search today; rather, they feel like something completely alien, as I’ve noted in the past. There is a chance that chatbots (and AI broadly) will evolve into a new fabric for everything – a replacement for the Internet and apps we have today. It’s a complete shift in user interface – like the mouse or multi-touch smartphone screens. Further, LLMs offer access to a new information layer, much like the Internet. Thus, I think it’s best to view chatbots as a completely new platform that will have entirely new use cases and applications. Rather than chatbots integrating into or augmenting search, think of search as just one part of a chatbot platform with a far greater set of functions. Given the capacity constraints, the early-release LLMs from Google and Microsoft are primarily an attempt to lure the developers who will determine the winning platforms of the AI Age. Due to the classic innovator’s dilemma that prevents many established companies from embracing new technologies, it’s entirely plausible a startup emerges with the winning chatbot platform, with the existing cloud giants providing the underlying infrastructure for the apps. Releasing products early to drum up interest is the key to winning developers in any platform shift. Recall that the first iPhone was limited in functionality and the App Store came later, once developers had a chance to see the potential of multi-touch.

In a few years, when search is eventually subsumed by a new, personalized conversational AI that is trained daily for each individual person, it will have major ramifications for the entire Internet. For example, all of the publishers and ecommerce sites that rely on dominating organic results and arbitraging search ads could see their value lost to embedded answers and offers. But, there are miles to go to establish the enormous capacity needed to serve conversational search results and the monetization engines to pay for them (not to mention improvements needed in the underlying AI tools). Even if the solutions were available today, it might take years to upgrade the infrastructure.

Returning to my opening point, the most important metrics for a digital utility like web search are speed and accuracy. Google took four years (2004-2008) to roll out Suggest (later known as Autocomplete, the algorithm that offers search choices as you type your query) because they had to engineer the necessary infrastructure to support the real-time task. In comparison, AI-assisted search is orders of magnitude more complex. The range of outcomes for tools like search is widening, with bigger tails on both the upside and the downside, and the path we travel with LLMs won’t be known for years. The most important indicators of future success/longevity we always look for are non-zero-sum outcomes and adaptability: the most adaptable company offering the best outcome for all sides will be amongst the winning AI platforms (this is key to winning both developers and customers while also finding a business model that can pay for the large costs of AI). One other interesting thing to keep an eye on is geopolitical leverage. Given the ramping Western sanctions on China’s access to leading-edge chips, if LLMs do become a new Internet-like platform for the next generation of apps and innovation, continued chip sanctions would effectively cut China off from the next-gen, AI-based Internet.

Miscellaneous Stuff
Ford Focus
I enjoyed this interview with Harrison Ford in the Hollywood Reporter. While the profile is more interesting for the philosophical side of the 80-year-old’s long career and life (which he has only recently started opening up about), readers will also be interested in the rather unusual way he was de-aged in the opening sequence of the new Indy film. Rather than using advanced AI to map a younger, virtual Ford onto older Ford’s face, the filmmakers went through 40 years of his films, found shots of a younger Ford at the same angle and lighting, and mapped those real images over, frame by frame.

Mexican ‘Za Boosts Sales
Surely you all ridiculed me when I celebrated the triumphant return of the Taco Bell Mexican Pizza in #346, but Taco Bell parent Yum Brands had the last laugh, reporting the sale of 45 million Mexican Pizzas over the four-month limited run. The company claimed the Pizza’s resurrection helped drive 2022 sales, along with delivery partnerships and value pricing.

✌️-Brad

Disclaimers:

The content of this newsletter is my personal opinion as of the date published and is subject to change without notice and may not reflect the opinion of NZS Capital, LLC.  This newsletter is an informal gathering of topics I’ve recently read and thought about. I will sometimes state things in the newsletter that contradict my own views in order to provoke debate. Often I try to make jokes, and they aren’t very funny – sorry. 

I may include links to third-party websites as a convenience, and the inclusion of such links does not imply any endorsement, approval, investigation, verification or monitoring by NZS Capital, LLC. If you choose to visit the linked sites, you do so at your own risk, and you will be subject to such sites' terms of use and privacy policies, over which NZS Capital, LLC has no control. In no event will NZS Capital, LLC be responsible for any information or content within the linked sites or your use of the linked sites.

Nothing in this newsletter should be construed as investment advice. The information contained herein is only as current as of the date indicated and may be superseded by subsequent market events or for other reasons. There is no guarantee that the information supplied is accurate, complete, or timely. Past performance is not a guarantee of future results. 

Investing involves risk, including the possible loss of principal and fluctuation of value. Nothing contained in this newsletter is an offer to sell or solicit any investment services or securities. Initial Public Offerings (IPOs) are highly speculative investments and may be subject to lower liquidity and greater volatility. Special risks associated with IPOs include limited operating history, unseasoned trading, high turnover and non-repeatable performance.
