By Jordan Jacobs, Managing Partner & Co-Founder
The collapse of Silicon Valley Bank (SVB) reminds us how hard it is to be a founder.
At the beginning of last week, SVB was the 16th largest bank in the United States. Providing banking services to nearly half of the ecosystem's venture capital-backed technology and life-science companies, SVB managed tens of billions of dollars in deposits for thousands of startups. On Wednesday, the bank disclosed a $2 billion loss after higher interest rates forced it to sell off long-term investments at a loss to cover withdrawals. The news sent a shockwave through the startup community, and founders rushed to withdraw their money. A bank run was on.

At Radical, we quickly sent an extraordinary message to our portfolio companies, outlining a series of recommendations for managing their cash in the midst of this crisis. Most of our portfolio companies bank elsewhere, and some who do bank at SVB were able to move their money on Thursday. However, for thousands of companies, including some of our own, their instructions to move money went unanswered. By Friday, the bank had collapsed, the FDIC stepped in, and deposits became inaccessible.
When founders start a company, they know the risks they are taking on. They understand that they are turning down the opportunity to work at secure jobs with predictable income (typically they get paid much less in compensation in return for the equity upside if they can make their startups successful). They know they will be working very long hours and will not see their families as much as they would like. They take on this risk in pursuit of a dream: that with enough effort and ingenuity, they may one day turn the spark of an idea into a viable business.
But what is under-appreciated in the founder's journey is the immense responsibility that comes with running a company. Founders ultimately must answer to their investors and shareholders, their suppliers, their customers, and their team. When a crisis hits, founders are not only grappling with their own fate. They also hold in their hands the future of every one of their team members and their respective families.
As a startup founder myself, I have faced these existential moments. It is a tremendous privilege to hold the trust of so many. But, when things are not going well, it can feel like the weight of the world is on your shoulders. There is nothing more terrifying in business than looking at your books and wondering if you will be able to make the next payroll. Today, thousands of founders suddenly find themselves facing this question, in a situation that was largely unpredictable.
At the time of writing, there remains uncertainty around what actions regulators may take to ensure the deposits in SVB are accessible and recoverable. We do not yet know when people and businesses will regain access to their funds, or how much of their deposits will be recovered. What we do know is that this is a very difficult test for many founders, their teams and families, and the startup community more broadly.
We have been working constantly with all of our affected companies since Thursday to share guidance and best practices and to help them overcome challenges, starting with how to make this week's payroll if sufficient cash is inaccessible.
Innovations that make the world a better place are achieved by special people creating new inventions, products and companies from brilliant ideas and extraordinary effort. They deserve all of our support, understanding, empathy and respect, especially in moments like these.
Radical Reads for the week of March 12, 2023
1) Salesforce Ventures launches a $250 million generative AI fund

Salesforce Ventures, the company's global investment arm, launched a new $250 million generative AI fund to bolster the startup ecosystem and spark the development of responsible generative AI. The fund will initially invest in four companies, including Radical Ventures portfolio companies Cohere and You.com.
2) In AI, is bigger always better? (Nature)
Large language models (LLMs) are well known for their conversational capabilities but can struggle with mathematical queries. Could making these models larger make them better at math? Some AI researchers say that the 'bigger is better' strategy might provide a path to solving tasks that require reasoning through improved pattern recognition alone. But the debate is heated. Others contend that big LLMs will never be able to mimic or acquire the skills needed to answer reasoning problems consistently. Instead of bigger models, they argue, more energy-efficient AI, inspired in part by the way the brain makes connections, is the way to make progress.
3) Four Decades of AI Compute (Laconic)
Author and researcher Oguzhan Gencoglu provides a descriptive analysis of AI progress. Focusing on AI computation, Gencoglu argues that AI growth is exponential, with compute doubling every nine months over the past four decades. The typical metric for the total compute used to train an AI system is FLOPs, the total number of floating-point additions and multiplications performed. For companies, one effective way to ride this rapid growth smoothly is to combine domain expertise with AI to create a competitive advantage.
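To make these numbers concrete, here is a minimal sketch (not from Gencoglu's article) of what a nine-month doubling time implies, alongside the common scaling-law heuristic that training compute is roughly 6 FLOPs per model parameter per training token. Both the heuristic and the example figures below are illustrative assumptions, not claims from the piece.

```python
# Illustrative sketch: what a 9-month compute doubling time implies.
# The 6*N*D training-FLOPs rule of thumb comes from the LLM
# scaling-law literature, not from the article itself.

def train_flops(params: float, tokens: float) -> float:
    """Rough training compute: ~6 FLOPs per parameter per token."""
    return 6 * params * tokens

def growth_factor(years: float, doubling_months: float = 9.0) -> float:
    """Total compute multiplier implied by a fixed doubling time."""
    return 2 ** (years * 12 / doubling_months)

# Hypothetical example: a 1B-parameter model trained on 20B tokens
# needs on the order of 1.2e20 training FLOPs under this heuristic.
flops = train_flops(1e9, 20e9)

# A 9-month doubling time compounds to roughly 2.5x per year and
# about 10,000x per decade.
per_year = growth_factor(1.0)
per_decade = growth_factor(10.0)
```

The striking part is the compounding: a doubling time of nine months, sustained, turns into four orders of magnitude of growth over ten years.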
4) Research: Improving Transparency in AI Language Models: A Holistic Evaluation (CRFM)
Language models are here to stay. Decision-makers need to understand their function and impact. A 50-person team at Stanford has developed HELM (Holistic Evaluation of Language Models) to provide transparency and assist in evaluating models. Traditional methods for evaluating language models focus on model accuracy in specific scenarios. The use of language models is already widespread (e.g., summarizing documents, answering questions, and retrieving information). To reflect this reality, HELM covers a broader range of use cases, evaluating many relevant metrics (e.g., fairness, efficiency, robustness, toxicity).
5) Research: RealFusion 360 – Reconstruction of any object from a single image (arXiv)
Researchers from Oxford University developed a method for reconstructing a full 360-degree photographic model of an object using a single image. “The approach involves fitting a neural radiance field to the image, but the problem is found to be severely ill-posed. To address this, an off-the-shelf conditional image generator based on diffusion is used, and a prompt is engineered to encourage it to ‘dream up’ novel views of the object.” The results are remarkable and will be extended to animated 3D scenes in future research.