(Re)Share | #14 - In case of climate emergency break glass
Geo-engineering | AI safety | Underwater mining | Deepfakes | BCI
I’m really living up to the “occasionally weekly” subtitle. It’s been three weeks since the last (Re)Share, despite every intention to post earlier. Travel throughout Europe stole all of my time unfortunately. Of course, as my wife so kindly reminded me, “all you’re doing is repurposing other people’s work so it really doesn’t matter,” which provides a nice glimpse into the emotional support that underpins my marriage.
Shameless Plug
A couple of weeks ago I posted my first attempt at a work-in-progress investment thesis. The focus, Climate Reversal, outlines the three areas that intrigue me most within the broader sustainability space:
Carbon sequestration
Carbon capital
Geo-engineering
The last of which generated a LOT of DMs, the majority pushing back on geo-engineering as a distraction from emissions reduction. This only reinforces my interest. Founders in this space: I am ready to give you money.
Stuff worth sharing
Throwing shade - If you also find solar geoengineering intriguing, we’re in good company. The UN released a report endorsing further research in the area. While acknowledging the nascent state of the field and the scale of unknown risks, the report argues that the potential long-term benefits warrant further investigation. As I wrote in my own piece, hopefully we’ll never need to enact such techniques, but our emissions reduction outlook doesn’t look great, so we might need a failsafe.
Global oopsie - The US Department of Energy announced that it now believes a China-based lab leak to be the most likely origin of COVID-19. Remarkable how this went from pseudo-right-wing conspiracy to mainstream position. I have spent a lot of time investigating AI safety systems, but I’ve seen enough zombie apocalypse movies to know that poor biosecurity is really, really bad.
Asleep at the wheel - Former researcher and recent founder Jonathan Goodwin wrote a fantastic thought piece on why DeepMind didn’t build GPT-3. The brief but insightful post highlights the embedded beliefs and entrenched systems that even cutting-edge research organizations can fall victim to. Having spent years working at DeepMind, he speaks from experience. Highly recommend a read. (Full disclosure: Fly VC recently led an investment round in Jonny’s company.)
We’ll be good, promise! - 50 organizations joined together to build a framework for self-regulation in AI safety, specifically focused on synthetic media / generative AI. While commendable, history is rife with examples of self-regulation failing catastrophically, so I have little faith in the approach. Independent, third-party systems (e.g., our portfolio company Lakera) are a must.
All hail our robotic overlords - GPT-4 dropped last week and it’s pretty bananas. The 24-minute developer presentation shows off the new model’s highly impressive reasoning and inference skills. It’s also David Brent-level cringe at times, or so I thought until I watched Microsoft’s latest drop.
Darling it’s better down where it’s wetter - In the past few months I’ve developed a real interest in batteries and specifically the economic and ethical tradeoffs they require. Battery metals are in extreme demand and equally hard to procure; some estimates put remaining global cobalt reserves at roughly 20 years of supply. I’ve looked into asteroid mining a bit, but never considered the deep sea. This was a great article on the subject, told through the story of a seemingly incompetent and arguably corrupt entrepreneur.
Related: Tesla is transitioning back to rare-earth-free motors.
Maestro if you please - Google released a model that creates music from text. I have listened to the arcade game sample dozens of times.
Sticks and stones and UAV drones - My favorite podcast continues to be Invest Like the Best, and this episode with Anduril’s Trae Stephens exemplifies why. For those who don’t know, Anduril is one of the most ambitious and impressive startups out there. They’re modernizing the American defense industry by taking on the Primes head-on. If you invest in or are building anything that’s highly regulated, I strongly recommend a listen.
RIPSVB - Literally everyone heard about this story so I have nothing new to add. If, however, you were under a rock or don’t fully understand the debacle or just need a laugh, Matt Levine once again nails it.
This might sting a bit - I unfortunately have a lot of familiarity with neurodegenerative disease, and I can tell you from experience that the current standard of diagnosis and treatment is abysmal. This study of low-voltage brain stimulation had some pretty promising results.
Unstoppable force, meet immovable object - Musk’s tough year continues. Elon’s brain-computer interface company, Neuralink, suffered a major setback from the FDA, which raised safety concerns over its proposed human trials. Come for the cutting-edge science, stay for the boardroom drama.
How to lose friends and influence people - Apparently the US is now soliciting bids from synthetic media companies to help with adversarial misinformation campaigns. Deepfakes are a BIG deal, and my working theory is that they’ll be the straw that breaks the content moderation camel’s back. Influence and soft power have always been part of foreign policy, but this, to me, feels like several steps too far.
Buyer’s remorse - While I literally just said deepfakes are wrong, I now encourage you all to click that link of a synthetically voiced Joe Biden talking about a recent purchase.
Spotlight on
One of my angel investments, Kheiron Medical, was featured in the New York Times for its insanely impressive AI radiology offering. I’ve had the pleasure of following their superhuman technology for a few years, but it’s very cool to see it now shared with the world. Last year Kheiron analyzed more than 275,000 breast cancer cases, cutting radiologists’ workloads by 30% while increasing cancer detection by 13%. The future is now.