Radical Reads

Fei-Fei Li’s Candid Take on Leading and Prioritizing

By Leah Morris, Senior Director, Velocity Program


The 2024 Radical AI Founders Masterclass kicked off last week, featuring Fei-Fei Li, Co-Founder and CEO of World Labs and Radical Ventures Scientific Partner, in conversation with Jordan Jacobs, Managing Partner and Co-Founder of Radical Ventures. 

This week we share an excerpt from their discussion in which Fei-Fei shares her candid insights about taking on the role of CEO and practical tips on how to prioritize an overwhelming set of obligations. The following has been edited for conciseness and clarity.

Jordan Jacobs

How much time are you spending on setting the vision of the company and mentoring the team versus doing scientific work, managing investors, fundraising, and all the other administrative things that go on in the life of a CEO?

Fei-Fei Li

For entrepreneurs who want to be founders: think about if you want to be that CEO or if you can find a friend or partner to be the CEO. It is generally the most schizophrenic job in any company. I was literally unloading the dishwasher 15 minutes ago before talking to you. And in 30 minutes I might be fundraising. In the fundraising season I probably spend 90% of my time talking to investors. In the hunkering down and planning season, I spend much more time talking to our team on technology and product. So it’s a very dynamic balance. 

Prioritization is one of the hardest things for young entrepreneurs. 

Years ago somebody shared with me a 2-by-2 matrix for prioritizing tasks, and I still use it today. One axis runs from important to not important, and the other from urgent to not urgent. I put tasks into these four buckets. 

The most important bucket is obviously the important and urgent one. You’d be surprised how many people spend too much time in the not-important and not-urgent bucket. These tend to be fragmented little tasks that give you an immediate reward. When life is tough, when the bigger tasks are hard to crack, doing little things makes you feel like you achieved a lot. But you really have to be careful about that bucket. 

Then the remaining two buckets are the hardest to choose between. One is important but not urgent. The other is urgent but not important. Again, people tend to over-index on the urgent but not-important tasks. I’m not saying I’m perfect, but I have developed a mental muscle to force myself to go with the important but not-urgent tasks. Important but not-urgent tasks will soon become important and urgent ones. But when you are under too much pressure, sometimes you don’t deliver A+ work. So the more you can front-load the important but not-urgent tasks, the better. 

These four buckets guide my own way of dynamically balancing my day. People like me and Jordan, we don’t just have work. We also have a complex sandwich generation life. That makes it even more important for me to make sure my priority buckets are taken care of.

Even among our co-founders, we hold weekly planning sessions, and I make sure we force-rank tasks. 


Jordan 

Being able to frame tasks in a matrix that people can apply to their daily tasks is incredibly valuable. I strongly recommend this approach. If you are a founder, I promise you 10 years from now you’ll look back and say that was a priceless piece of advice.


Watch the full conversation between Jordan Jacobs and Fei-Fei Li here.

AI News This Week

  • Google inks nuclear deal for next-generation reactors  (VentureBeat)

    Google has signed a first-of-its-kind deal to purchase nuclear energy for its data centers. It is the first corporate agreement to purchase electricity from advanced small modular reactors (SMRs) that are still under development. Google agreed to purchase electricity from “multiple” reactors that would be built through 2035. In March, Amazon Web Services announced its purchase of a data center campus powered by a nuclear power plant in Pennsylvania. Microsoft signed an agreement in September to help revive and purchase power from the shuttered Three Mile Island plant. SMRs are one-tenth to one-quarter the size of traditional nuclear plants.

  • AI wins big at the Nobels   (The Economist)

    AI research took center stage at the 2024 Nobel Prize announcements. To recap, the physics prize was awarded to John Hopfield and Geoffrey Hinton for their early work in neural networks, essential to the development of AI models used today. A day later, the chemistry prize honored DeepMind’s Demis Hassabis and John Jumper for their AI system AlphaFold, which predicts the three-dimensional structure of proteins. The prize was shared with David Baker for his computer-aided protein design work at the University of Washington. AlphaFold has revolutionized biochemistry by predicting protein structures with high accuracy, helping researchers tackle problems like drug discovery and virus research. This year’s Nobels spotlight the power of AI to unlock new scientific frontiers, paving the way for future breakthroughs in fields driven by computational advances. See last week’s Radical Reads for an exclusive on Geoffrey Hinton’s reflections upon winning the Nobel. 

  • How startups use technology to support Canada’s health care system  (The Globe and Mail)

    Canadian health-tech startups are tackling the nation’s healthcare challenges with innovative applications of artificial intelligence and digitization. PocketHealth, a Radical Ventures portfolio company, provides patients with online access to medical records and imaging, minimizing redundant scans and improving follow-ups, especially in remote areas. Another Radical portfolio company, Signal 1, integrates AI insights into hospital workflows to boost patient outcomes, efficiency, and ease front-line worker stress. These AI-powered platforms showcase how startups are streamlining processes and relieving pressure on Canada’s overburdened healthcare infrastructure.

  • Satellites are photobombing astronomy data — could AI offer a solution?  (Nature)

    Satellites provide global broadband and critical earth monitoring, but they increasingly disrupt astronomical observations, appearing as bright streaks in telescope images and degrading data quality. Thousands have been launched into low Earth orbit in recent years, with many more planned, including China’s 12,000-satellite G60 Starlink mega constellation. In response, astronomers have developed a machine-learning algorithm that detects satellite streaks with high accuracy, making it easier to interpret data and potentially remove these streaks, which often resemble genuine astronomical events or emit radiation that interferes with sensitive measurements.

  • Research: overcoming challenges for long inputs in RAG  (Google AI/University of Illinois at Urbana-Champaign)

    Researchers are tackling a key challenge in retrieval-augmented generation (RAG) for long-context large language models (LLMs): the decline in output quality as the number of retrieved passages increases. While more retrieved information initially enhances performance, irrelevant “hard negatives” eventually degrade the results. To address this, the team introduced training-free methods, such as retrieval reordering, and training-based approaches like RAG-specific fine-tuning. These optimizations help improve the robustness of long-context LLMs, offering significant performance gains and a more efficient way to process large amounts of retrieved information.
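    The retrieval-reordering idea can be sketched in a few lines. The snippet below is an illustrative simplification, not the paper's exact method (the function name and interleaving scheme are assumptions): it places the highest-scoring retrieved passages at the beginning and end of the context, pushing likely "hard negatives" toward the middle, where long-context LLMs tend to attend least.

    ```python
    # Hypothetical sketch of retrieval reordering for long-context RAG.
    # Rather than concatenating passages in descending relevance order,
    # put the strongest passages at BOTH ends of the prompt so weaker,
    # potentially distracting passages land in the middle.

    def reorder_passages(passages, scores):
        """Interleave passages so the most relevant sit at both ends.

        passages: list of retrieved text chunks
        scores:   relevance scores (higher = more relevant)
        """
        ranked = [p for _, p in sorted(zip(scores, passages), reverse=True)]
        front, back = [], []
        for i, passage in enumerate(ranked):
            # Alternate: best -> front, 2nd best -> back, 3rd -> front, ...
            (front if i % 2 == 0 else back).append(passage)
        # Reverse the back half so relevance rises again toward the end.
        return front + back[::-1]
    ```

    With this ordering, the top-ranked passage opens the context and the second-ranked passage closes it, so adding more retrieved passages mostly dilutes the middle rather than the positions the model weights most.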

Radical Reads is edited by Ebin Tomy (Analyst, Velocity Program, Radical Ventures).