(Re)Share #50 - Icarusky business
Reasoning LLMs | Green steel | Solar probes | Energy grid | AAV therapies
I would have liked to kick off the 50th issue of (Re)Share with a celebratory tone, but I need to pause and address the devastating loss caused by the wildfires sweeping across LA. The scale and scope of destruction from the Palisades and Eaton fires defy words. As someone now splitting time between the West Coast and the UK, I’ve been based in LA throughout this disaster, making it one of the strangest weeks I can recall. Packing a go bag three times by January 10th was not something I anticipated this year, but such is life. Thankfully, my wife and I are safe, and I’ve been moved by the messages of concern and support.
Still, many families have not been so fortunate. If you’re able, please consider donating to the charities supporting the response effort. Some of my favorites are:
With all of that said, it is a real joy to be writing my 50th issue of the newsletter, and I have big plans for 2025. So without further ado, let’s get to it.
Shameless plug
On January 17th, Fly Ventures is hosting another edition of Binary Stars, our AI-focused online conference. For our third episode, we’re focusing on the path from 0 to $1 million of ARR for AI startups, and we’ll be hearing from some of Europe’s best. Our speaker lineup includes leaders from PolyAI, Mistral, and ElevenLabs, so it’s really something you don’t want to miss.
Stuff worth sharing
No rhyme or reason - In the last normal issue (#48), I mentioned OpenAI’s Shipmas event, but I missed the final unveil, which was arguably the most important. The AI giant unveiled o3, the successor to its o1 reasoning model and a breakthrough in cognitive capability. Well, “unveil” is maybe a stretch, as nothing was formally shared or open-sourced, but the model is now in the public zeitgeist, and safety testing is underway. Like its predecessor, o3 improves accuracy and reduces errors by breaking complex problems down into manageable reasoning steps. As with all of these model announcements, the reveal was flooded with benchmark performance stats. The most impressive were an 87.5% score on ARC-AGI, a 96.7% on the 2024 American Invitational Mathematics Exam, and a 25.2% completion score on Epoch AI’s FrontierMath benchmark, on which no other model has exceeded 2%.
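The decomposition idea is easy to caricature in code. Here’s a toy sketch (entirely hand-rolled, not o3’s actual mechanism) of the difference between answering in one shot and working through explicit intermediate steps that can each be checked:

```python
# Toy illustration of step-by-step problem decomposition: instead of
# producing a final answer in one shot, the solver records every
# intermediate result, so each step can be audited for errors.
# This is a hypothetical sketch, not how o3 works internally.

def solve_with_steps(expression: str) -> tuple[list[str], int]:
    """Evaluate a left-to-right +/- expression, logging each step."""
    tokens = expression.split()
    steps = []
    total = int(tokens[0])
    # Pair each operator with the number that follows it.
    for op, num in zip(tokens[1::2], tokens[2::2]):
        prev = total
        total = total + int(num) if op == "+" else total - int(num)
        steps.append(f"{prev} {op} {num} = {total}")
    return steps, total

steps, answer = solve_with_steps("12 + 7 - 3 + 40")
for step in steps:
    print(step)
print("answer:", answer)  # → answer: 56
```

The point of the exercise: an error in any one line is visible and local, rather than buried inside a single opaque output, which is the intuition behind why step-by-step reasoning reduces mistakes.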
Metal of honor - Swedish climatetech startup Stegra is constructing the world’s first industrial-scale green steel plant. The facility will utilize green hydrogen—produced via electrolysis powered by renewable energy—to convert iron ore into steel, significantly reducing carbon emissions compared to traditional methods. The company has already secured nearly $7 billion in funding and plans an initial annual output of 2.5 million metric tons of green steel, which will double by 2030. This is a big deal because—not so fun fact—steel accounts for ~8% of global carbon emissions, and it’s in everything. Like bioproduction, this really comes down to cost parity with incumbent materials, although there’s an argument that this commodity could command a consumer premium via status signaling. For example, an auto OEM could potentially market green steel to the luxury segment. When you get into the weeds of most material science innovation, it really comes down to replacing just a couple of ingredients and figuring out clean energy supply.
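To make the “replacing just a couple of ingredients” point concrete, here are the simplified overall reactions for hematite ore. In a conventional blast furnace, carbon monoxide derived from coke does the reducing and releases CO2; in hydrogen direct reduction, the byproduct is water vapor instead:

```latex
% Conventional blast-furnace route: CO (from coke) is the reductant
\mathrm{Fe_2O_3 + 3\,CO \longrightarrow 2\,Fe + 3\,CO_2}

% Hydrogen direct reduction: same iron out, water vapor instead of CO2
\mathrm{Fe_2O_3 + 3\,H_2 \longrightarrow 2\,Fe + 3\,H_2O}
```

Swap the reductant, power the electrolysis with renewables, and the bulk of the process emissions disappear — which is exactly the “couple of ingredients” Stegra is betting on.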
Icarusky business - Last month, NASA successfully executed the insanely ambitious Parker Solar Probe mission. The space agency sent the probe on a flyby of sorts, coming within 3.8 million miles of the sun. That may still sound far, but it’s roughly an order of magnitude closer than any previous spacecraft has gotten to the sun, which, as a reminder, is an 864,000-mile-diameter ball of fusing hydrogen. The mission seeks to study the outer corona in order to gather data on solar wind and space weather. Another fun fact: the probe’s max speed was clocked at 430,000 mph, making it the fastest object humanity has ever created.
It’s not the size that counts - Two weeks ago, Twitter was aflame, with two camps quickly emerging. One was hailing the dawn of a new AI generation, no longer anchored by the incredible capital requirements of breakthroughs past. The other was freaking out over China. The catalyst for this debate was DeepSeek v3, an incredibly impressive LLM that is as capable as GPT-4 and Claude 3.5 Sonnet, but at a fraction of the training cost. The Chinese lab, also called DeepSeek, built the 671-billion-parameter model in just two months and for a mere $5.6 million. Regardless of where you stand politically, this is an objectively impressive feat, which comes as welcome news to anyone bullish on the AI market but lacking Scrooge McDuck resources. The model also shows just how competent the Chinese AI machine is, despite the heavy export restrictions it’s under.
Frost yourself - In my 2025 Hard Problems post I highlighted physics-proof compute as an area of investment interest and my timing was pretty good. Earlier this week, I heard about a fascinating company called Akash Systems, which is building a novel thermal management system for energy-efficient compute. Akash’s technology is a diamond-based cooling method that is vastly more effective at pulling away heat compared to traditional copper and fan systems. It’s also far more badass to say that a data center is climate-controlled with precious gemstones rather than some run-of-the-mill water system. I have no idea how this pans out financially, but it’s a good way to spend an hour.
Power struggle - A fun read from The Economist on the massive scale-out of energy infrastructure worldwide. The surge in electricity demand is a frequent topic on (Re)Share, but I usually focus on EV adoption and compute demand. While those are indeed material factors, aging infrastructure and plummeting renewable costs are just as relevant. The International Energy Agency (IEA) forecasts annual global growth of 3.4% through 2026, and just in 2024, we saw $400 billion in grid infrastructure investment. Despite this massive cash outlay, we really can’t build fast enough, and energy concerns are increasingly a discussion topic in my circles.
Leaving nothing to waste - A research team out of USC released METAGENE-1, a 7-billion-parameter autoregressive LLM that makes sure our shit doesn’t stink. More specifically, the model can analyze wastewater samples to enable early monitoring of emerging pathogens, pandemics, or other health threats. METAGENE-1 was trained on over 1.5 trillion base pairs of DNA and RNA to capture the massive complexity of microbial and viral diversity in a metro area. The model achieves state-of-the-art performance on pathogen detection and metagenomic embedding benchmarks, although it would be funnier if the research team came in at number two. Full paper here.
Viral adoption - Adeno-associated viruses (AAVs) are a fascinating component of gene therapy, which I’ve been investigating for some time. This blog post by Eric Vallabh Minikel provides an excellent overview and history of AAVs and their role in next-generation medicines. Gene therapies are modalities that treat or prevent disease by correcting an underlying genetic problem in the patient, either by introducing new genetic material or by altering existing genes at the DNA or RNA level. These viruses are safe because they don’t cause disease and can’t reproduce on their own. AAVs can target specific cell types and maintain long-term effects without permanently altering a person’s DNA, although targeting specificity has historically proven to be a limiting factor. Modern AI tools promise to overcome this limitation through advanced protein modeling and capsid engineering. Fly may or may not—but absolutely does—have a portfolio company working to do just that.
Portfolio Flex
Wayve CEO Alex Kendall was awarded an OBE from the UK government for contributions to artificial intelligence. Considering Wayve is literally the world leader in self-driving technology, I’d say this is well deserved.