Ep 115: Josh Wolfe (Co-Founder, Lux Capital) On Uncovering Hidden Opportunities

Josh Wolfe, co-founder and managing partner at Lux Capital, is a master at quickly understanding complex, emerging technologies. In our conversation, Josh shared how he identifies the next big thing, his contrarian views on nuclear energy, biotech, and robotics, as well as his thoughts on the state of the venture capital market today. 

Daily News Routine

Logan: Well, thanks for doing this. So, uh, I'm curious, very open ended, what are you, uh, what are you thinking about these days?

Josh: I mean, I'm always trying to spot failures and risks and chaos and try to understand what everybody else is thinking about and then find the underappreciated thing. So, you know, there's geopolitical things that are front page of news and then things that are like C24, which is always like the page that I like because people aren't paying attention to it.


Logan: This is Wall Street Journal?

Josh: I read literally probably 22 papers. So I use an app called PressReader, and I do USA Today, New York Times, FT, um, four different papers from the Mideast. Uh, one from India.

Logan: You do this every

Josh: Every morning. But when I say read, I'm skimming through. And then, you know, every now and then there's either a photo or a headline that catches and you can tell some of the implicit bias.

Like, uh, there's three papers in the UAE, one of which, The National, has a more populist sentiment, and then one which is more promoting of all the amazing diplomatic outreach and economic development that they're doing. And so you sort of can handicap some of the headlines, but it's always like, what is the zeitgeist?

Where is it converging on consensus? And then, like, what's the random thing that nobody's thinking about? And sometimes that's the spark of the idea. And I can go three weeks where there's nothing interesting. And then all of a sudden, one day there's like four interesting articles that I don't think anybody else in the world is thinking [00:02:00] about.

And so that's always a way to sort of generate new ideas.

Logan: Uh, I downloaded an app, I've been using it for about two weeks now, which shows me the bias in the different articles I'm reading, and how what's being reported on and talked about compares with undersourced stuff as well.

Um, is that, does that take up like an hour every morning?

Josh: Uh, yeah, I wake up, I have coffee, um, and spend time with the kids, and then rip through these digital newspapers. And the digital newspapers actually began because I used to get physical newspapers, and my wife, Lauren, was upset that I was getting newsprint fingerprints everywhere around the house. So she's like, you gotta get rid of this.

And then I found this amazing app, PressReader, and now I get all these international papers. But yeah, probably an hour. Um, and then Twitter. And that's probably another 20 minutes of catching up on responses and chaos of the world.


Logan: I was going through in prep for this, uh, it seems like you've kind of stopped tweeting to the same extent you used to?

Geopolitical Insights

Josh: I just think that the things that I've tweeted about, probably since October 7th, have become much more, you know, geopolitical, pro-Israel. You know, uh, less Elon. Uh, but

Logan: I, well, maybe I was wrong and maybe there was some glitch in the algorithms I was scrolling through, but it seemed like, uh, I got back to 2022 pretty quick in my scroll.

So maybe,

Josh: no, I probably was resurfacing old tweets.

Logan: Oh, interesting.


Josh: I think there was something that went viral the other day about Trump and Elon talking about nuclear power, and that it needed a rebrand. Which was like a giant dog whistle for me. So I was retweeting all these things from probably 2022, or 2019 actually, going back to this idea of elemental energy. Literally, if nuclear power was invented today, people would be losing their minds and be like, oh my God, this is magic.

But it was so conflated during my mom's generation: nuclear war, which is bad, with nuclear power, which is good. And if you were an environmentalist, you know, you want zero-carbon, large baseload power, and we just need a better term for it. So, elemental energy. Because everybody loves wind, sun, solar, uh, [00:04:00] thermal. Uh, nat gas is better than coal. But, you know, rocks like coal are bad, while rocks like uranium are actually quite good.

So the elements, yeah, elemental power, elemental energy.

Nuclear Power and Environmentalism

Logan: I think, uh, I actually don't know the history of this. Was it as much the conflating of those two terms, and, I guess, a handful of scares that happened in the early 80s that actually didn't have much come from them?

Josh: Real life and media. So in 1979 you had Three Mile Island, which was, you know, a nuclear emergency: zero deaths, no nuclear leak. It was actually proof positive of good engineering fail-safe systems.

And then you had The China Syndrome, which was a movie that actually depicted a nuclear meltdown. And then you actually had Chernobyl in the 80s, which was horrible, legitimately terrible. And I think people were just, like, in that course of seven, eight years... Um, and there was also this element, which exists today, that you could sort of say is, like, far left or maybe Marxist or anti-capitalist.

But during the 80s and the rise of the boom-boom Wall Street era and Reaganomics and all of this, it was just like this natural antipathy towards growth. And I think that that has persisted in a really toxic way. And so if you really want societies to thrive and, you know, humanity to improve, you want a flood of energy.

Because energy helps you with healthcare and desalination and food production. And so to want to thwart energy, in this sort of degrowther mindset, which I think had its early kernels in that environmental movement: Rachel Carson, Silent Spring, you know, looking at some of the chemicals. And that persisted and evolved into today's, what I think is, corrupted environmental movement.


Logan: Going back, what's the last interesting thing that you've seen or gone deep on when perusing those papers?

Josh: Oh, uh, let's see, newspaper headlines. Probably to me, the most interesting thing, which I think is still not front page in most traditional Western news, is the Sahel and Maghreb in Africa.

So the conflict regions, which very slowly, and I think it's one of those things that's too weak to be felt until it's too strong to not be noticed. Mali, Chad, Niger: just violent extremists, really religious [00:06:00] extremists, in some cases the inheritors of Afghanistan, Syria, ISIS coming in. Kidnapping women and children, really looking to have sort of a jihadi, almost caliphate going up into the region, and it's going to create massive instability.

It was compounded by Russian mercenaries, the Wagner Group, that came in and were sort of just exploiting resources and arms dealing, and then you had Chinese infrastructure around the coast of that Africa region. You are one terror event away from it being projected into Europe. Uh, where suddenly that area becomes our next Afghanistan and we are, like, entrenched in it.

And, uh, so yeah, that whole AFRICOM military apparatus, I think, is something where we've done some training of troops. But it's now against a pressure where very populist, pro-Russian political forces and others have been able to push the former colonial powers like France, and in some cases the U.S., literally off these bases.

And so I think that that is probably like the hotspot, even though today everything is, you know, China, Russia, Iran, North Korea, Gaza.

Investment Strategies and Competitiveness

Logan: Do you, do you follow this stuff, uh, just out of your own curiosity, or does it, does it benefit

Josh: Curiosity, yes, in that, um, the common through line in this is trying to understand something that other people aren't looking at.

It's, it's born out of competitiveness. It's, it's not born in some virtuous pursuit of truth or understanding.

Logan: morality. And

Josh: it's vainglorious. It is entirely, uh, I want to know something that other people don't know because I like to be competitive and I like to know it before they do.

Logan: It makes you interesting at dinner parties,

Josh: that's the honest answer, right?

And it's like a self-introspective one. You know, we've talked about this in the past, but one of my great pet peeves is when an entrepreneur is like, I just want to change the world. Bullshit. No, you want to make a ton of money or you want to prove somebody wrong. And when you're honest about that, like, inner drive of what that is.

Then I believe it. But when somebody is giving some... So why do I do this? Because I'm competitive. I want to be smarter than the next person. I want to know something that somebody else doesn't know. I want to have an insight where people react to it, like, oh, that's really interesting. And so that, to me, is an addictive feeling.

It's the same way, I love this Linus Pauling quote. He was a two-time Nobel laureate, for chemistry and for peace. And it's [00:08:00] this addictive nature of science, which is like this endless frontier, right? It's like what we do at Lux, finding the frontier of this cutting-edge science. And he said, I know something that the rest of the world doesn't know, and they won't know.

That is just an addictive feeling.

And so that, to me... and you experience the same thing. If you have a portfolio company, you know they're about to come out of stealth, or they're about to release a product. And you know it, but the rest of the world doesn't. And you know that's gonna drop Tuesday at 9 a.m. or whatever it is. That's an exciting feeling.

Finding a scientist with a breakthrough. We just found one on the West Coast with a pretty cool breakthrough that uses the nervous system to control bone growth. It's a big deal for people with, like, osteoporosis and bone fractures; it's never been done before. Um, it was a cool discovery, and finding that is like finding a band or an artist or a neighborhood before everybody else.

So it's the same principle that you can apply in pretty much any industry. Geopolitics: finding the low-probability, high-magnitude event. Art: finding the artist, you know, Basquiat, before he's been discovered. Uh, the band: finding the cutting-edge band that's playing for, you know, 30 people that is suddenly going to be selling out, you know, stadiums.

Andrew Schulz, the comedian who I love. Lauren, my wife, and I saw him at Union Square; there must have been 40 people in the crowd. You know, now he's selling out Madison Square Garden. Like, that feeling when you find something before everybody else has, it's amazing. Why? Because you get the social status of being like, I saw that, right?

You're a

Logan: I was there.

Josh: Yeah. And so it's the same feeling in, in the pursuit of knowledge. I do it in part because I want to understand the world and it's the sense making and I'm trying to understand the zeitgeist and what other people believe and there are objective truths that are just happening.

And then there's the intersubjective truths, where somebody believes something, and therefore the world is going to go this direction because everybody believes it. But I love that feeling of just discovering something before others do. And again, it's not born in virtue; it's born in vice, of competitiveness and self-interest.

Logan: Have you been able to get Lauren... does Lauren like Schulz?

Josh: She finds him hilarious and she is

Logan: I can't get my wife into him. We watched his most recent special together, and she got about halfway through, and she was like, this isn't

for me

Josh: offended or

Logan: [00:10:00] Yeah, I think, I think he's sort of, um, there's a little bit of a tongue-in-cheek, misogynistic, uh, tone to him. I happen to find it very funny, but it was off-putting to her, and she, uh, she ended up walking out and being like, this just isn't,

Josh: We've had two funny run-ins, by the way. I've met him, but my wife, um, went to go get sushi with my daughter in Tribeca, and I was like, no, I'm gonna stay home, whatever. And he was at the restaurant. I was like, ah, I missed it. And then just last week he was out in the Hamptons, and my son went to an ice cream place, uh, or a barbecue place, and he was there.

And, uh, my son took a selfie with him and sent it. It was pretty

Logan: That's pretty funny. Uh, actually, one of his producers, uh, for

Josh: Flagrant. Mm-hmm.

Logan: uh, consults with us and has actually helped us quite a bit about like YouTube strategy and thumbnails and all this stuff.

And, uh, so it's actually been really helpful to have, I mean, YouTube is this whole separate algorithm and figuring out how to do thumbnails and titles and editing and all that stuff. So they've, uh, they've been really

Josh: good. I look forward to the thumbnail for this where I'm like

Logan: Yeah, exactly. He does. He does a good job of all that stuff.

Scientific Breakthroughs and Business Models

Logan: Um, on the, on the investment side, when you're thinking about like the nervous, uh, scientists with the, uh, the nervous

Josh: bone growth, or nervous

Logan: system bone growth, uh,

Josh: really nervous

Logan: yeah, the anxiety for bone growth.

Yeah. Uh,

Josh: Yeah.

Logan: When you're thinking about something like that, like, how does the business model and the, uh, acquirer set inform any part of the investment thesis? Because CTRL-labs is a very cool business, ultimately acquired. I'm not sure Facebook/Meta would have been on the list of acquirers in the early days of it.

Maybe it was because you, you sort of knew the Oculus story and how all that played out. But I'm curious, when you find something like that, that's so, um, impactful and interesting,

do you just assume people will care at

Josh: some point? Yes. If it

Logan: does what it's going to do? Yes,

Josh: There's no quantitative analysis of, oh, you know, what have historical precedent transactions been in biotech for something related to osteoporosis. It really is: [00:12:00] holy shit, this is a scientific breakthrough that people have never seen before. And if it works and can translate, and sometimes some of these things don't, and that's the risk that we're willing to take to finance, uh, it's going to be a really big deal.

And so in this case: almost all research in, uh, most biotech has been done on mice, and most of the mice are male. And this particular researcher was doing work on female mice. And she discovered that, uh, when mice are lactating, so pregnant mice or mice that have just given birth... the body's amazing, right?

I mean, just think about all the things we do and why we do them. Take the hypothesis of why we kiss. I mean, this is like a crazy thing, right? Mothers kiss their babies, it is believed, because they're actually sampling the pathogens that are on the skin of the baby's head and, you know, their fingers. And then they produce the antibodies and deliver them through breast milk to the baby.

It's just wild, right? This mechanism that evolution, through billions of years and hundreds of thousands of humans, has engineered is just fascinating. In this particular case, when the mother is pregnant, you have this alien growing inside of you. There are, uh, cells called osteoclasts that literally are breaking down bone, and they are delivering it to the baby.

So you would expect that mothers lose bone density. Now, part of that is offset by cravings for ice cream and, you know, dairy, because that's calcium that is refilling the bone. But it turns out that there are circuits in the brain that actually tell the body: produce bone. And so if you could trigger that for post-menopausal women, you could trigger that for people that have had, uh, traumatic injuries.

And be able to, uh, build bone. That's a big deal. And you look around and you say, well, is this something that's being done today? Like, you know, GLP-1s: a lot of people are working on all the different variations of that. That's a crowded space. I would not be very bullish on the incremental advance there, because it has to be so much better that it's improbable to really get the attention, and the investor demand, and the media attention, and the desire for people to want to work for that company.

But if you have a breakthrough where people innately react with, oh my God, that's pretty amazing: that's the thing that I get turned on by. So it's trying to find that science that is authentic [00:14:00] and real but provokes a reaction where I can say, oh yeah, the slope just got really flat to raise money, recruit people, get media attention, because it's so exciting.

Logan: Does the business model come into consideration as you're thinking that through or if it works, it'll work?

Josh: Uh, most of that is me finding the executive. And in this case we have an amazing woman who is a longtime pharma exec. She was in the room at some of the most important drug creations, and she's able to look and say: this is a three-billion-dollar market; there are two drugs; the modality is somebody has to take this every day for six months; if you could take a one-time shot or a pill, or, you know, something that changes the frequency and the inconvenience of it, that would be a big deal. And the pricing for this would be this, and the population is that. So she's able to do the analysis to justify it. And that gives me the further comfort as an investor, because when a more scrutinizing investor, who is not enraptured and seduced by the science as I am, looks and says, well, what's the market risk and all this kind of stuff, she can answer that incredibly well.

Logan: Can you speak to the point of, uh, stamping out risk, or killing risk, as how you view, uh, you know, the job of investing?

Yeah.

Contrarian Investing and Risk Management

Josh: And this is a little bit the dichotomy between Peter, who you know, my co-founder at Lux, who I joke is, you know, the optimist, and I'm the pessimist, or the cynic, or the skeptic. But

Logan: You wear black, he wears white.

Josh: although this is, you know, light black, right? And, uh, but yeah, he's generally thinking about what can go right.

And I've caricatured this and exaggerated it to an extent, but it's mostly true, in that he's thinking about, well, what if it works? And I, for a variety of reasons we can get into, am a little bit more drawn towards: it's gonna fail, so let's be emotionally prepared for it. Because if you're emotionally prepared for failure, then you don't have negative surprise and you're able to sort of manage your reaction.

Um, I like to think of risk and value almost like the first law of thermodynamics. The first law of thermodynamics: energy is not created or destroyed. So imagine we put on the table, metaphorically, the embodiment of all the risks of a company: financing risk, market risk, technology risk, people risk, competition, you name it, every risk that we can identify and kill with time or talent or money thrown at it.

A later investor that comes in, six months or six years later, whatever it is, [00:16:00] should pay a much higher price and demand a lower quantum of return. And so the higher the price, the lower your expected return. And so if we start a company, let's say, uh, one of our biotech companies or an aerospace defense company, it doesn't matter the technology.

The principle is the same. Here I'm de-risking science. Somebody that comes in later should pay a much higher price. I should get rewarded in the form of value creation. Valuation goes up. Their expected return goes down because they're taking less risk. Same thing with an aerospace company. Does it work? Will somebody buy it? Will, you know, the dogs eat the dog food? Uh, can we get contracts? Can we reproduce it competitively? Somebody else should pay a higher price.

And when you look at our most successful companies, you can look at it poetically, almost, as: what were the risks they had, how have they been killed, and how has value been created? And somebody else pays a higher price.

Logan: And so, not being contrarian for contrarian's sake, but instead, uh, building consensus, or having people see the world the way you do.

Once that risk has been

Josh: I always say that we want people to agree with us just later. And so the contrarian aspect is just being earlier. It's the same thing as identifying that artist or the band or Uh, you know, the thing that other people haven't found yet, that's the competitive pursuit. I just want to be there first.

Same thing as a real estate developer. You know, everybody's going to Midtown, and they're like, well, what about Brooklyn? Oh, no, no, that's weird, right? You got artists and freaks, and they're like, oh, yeah, that's the cheap place. But in five or ten years, it's going to be the hot spot where everybody wants to be.

Logan: So if, if other people are fishing in certain ponds, do you have an innate desire to avoid those ponds?

Josh: I will generally look and say, what's the thing that they're missing? And I have to assume that everybody else is way smarter than me and they're aggressive and they're competitive. So, informational advantage: is there something that they're missing?

So you go back. You know, was it 2000-something? I don't remember when Al Gore's Inconvenient Truth speech was, almost 20 years ago, right? Um, you had an entire movement of very credible, very brilliant, very successful venture capitalists, Vinod Khosla, John Doerr, that were all in on cleantech and greentech. And they created a gospel, which created a sort of, you know, if you wanted to apply the Thiel, uh, Girardian mimicry, everybody was just following their op-eds and their talks.

And it was solar and wind [00:18:00] and biofuels. Those were the big things. Elon and some others were sort of doing electric cars. And nobody was talking about nuclear. The word nuclear wasn't even mentioned in An Inconvenient Truth. So it's like the curious incident of the dog in the night-time. Like, you know, listening for the sound that isn't made.

In that case, you know, the dog doesn't bark, and then Sherlock is able to identify, well, it was somebody that the dog knew. In this case, nobody's talking about nuclear. So my ears perked up, and I was like, why is nobody talking about nuclear? And that happened to coincide with the time I read this book called The Bottomless Well.

And it was this offshoot book written by a relatively conservative, um, duo that was part of the Manhattan Institute: Mark Mills, who's still alive, and Peter Huber, who was a brilliant polymath. And they made the simple, logical case on the directional arrow of progress, which is what I sort of call these, you know, progressions in technology, around nuclear.

And it was like, look, we went from carbohydrates to hydrocarbons, oil and natural gas, to uranium. The undeniable arrow of progress across that transition was more and more energy density per unit of raw material: from large fields of fuel, to oil and gas, you know, from dead dinosaurs, to small pellets of uranium.

And I was like, well, the logic of nuclear is undeniable. It doesn't matter what you believe; objectively, this is better. Okay, well, what's the problem with nuclear? Why is this not happening? And one of the biggest problems: what do you do with the waste? And so I was led down that path in this sort of contrarian sense, because I rejected what was the consensus.

Everybody was going after solar, wind, biofuels. And then I wrote a piece basically saying solar was going to be like the flight of Icarus. You know, it was gonna have hundreds of companies, and most of them were gonna fail, and it was gonna be sort of, by analogy, like the dot-com boom with fiber optics, where everybody was laying down fiber optic cables, and the beneficiaries of that were the rest of the world that got connected, but the losers were the investors that funded that stuff.

It's the same thing with biofuels, which, you know, I call biofools. And the logic there was like, okay, you're gonna use cutting-edge recombinant DNA to generate fuel from plants, and people are really excited about this. Again, a good, sexy scientific idea, but then you go into the economic case for it and say, this doesn't make any sense.

Why? If you have a commodity, which by definition is undifferentiated by anything but price, people don't [00:20:00] care. Then you're competing with a well in, you know, uh, the tar sands in Alberta, or the Mideast. Now, on the other hand, if you have a drug, and you have a patent around the composition of matter, and you can make a molecule that can save people's lives, and you have a monopoly on that molecule, that's an amazing business. But people mistook a commodity chemical for a monopoly molecule, like a drug. And so I thought that that was foolhardy.

Logan: When you're fishing... so the ideas can often come from the papers or, you know, anything that you're reading, from Twitter or whatever it is.

Um, but in the actual pursuit of getting up the curve on something, you said something interesting earlier: that if someone's fishing in that same pond, a lot of the people are going to have, you know, insights; there's a lot of smart people looking at that. But implicit to seeing something that other people aren't is a need to get up the curve, uh, in areas that, um, you don't have existing expertise in.

How do you actually go about that process of figuring it out?

Josh: It starts with the same competitive drive. It's like, I need to know more than you about this particular topic. And, um, yeah, you know, that'll start just with reading basic scientific papers. And back then, particularly when we were doing this, there was no ChatGPT to summarize this kind of stuff. Um, today it's actually a lot easier and quicker for me to get up to speed on a lot of different things and find the corollaries between different papers.

But my goal is, when I'm talking to that principal investigator, that scientist with that breakthrough, that ideally I've heard about from another scientist, meaning I am earlier than Science or Nature, because they haven't even submitted a draft to one of the leading publications, which is then going to be picked up in the Tuesday Times a month later.

So I want to be as early as possible to the root. And that means a credible scientist that likes us, or that we've done stuff with, has said, oh, you have to talk to this person, they have a major breakthrough. That's often how we find our things. It's just word of mouth from scientists that we like and trust, who themselves are competitively inspired to be around other scientists who are really groundbreaking.

And then we just start reading their papers, and my goal is to be incrementally smarter in every [00:22:00] conversation, so that by the sixth or seventh person I'm talking to, their response is, holy shit, how does he know so much about XYZ? And so that to me is sort of a competitive, obsessive thing. From nuclear waste, to geopolitical controversy in the Sahel and Maghreb, to aerospace and defense, to satellites and manufacturing:

I need to know more than the next person about whatever that new thing

Logan: How many areas of interest are there that you're going deep in at any given time like that?

Josh: I get obsessed about something typically on like a six-to-eight-week basis. Um, and sometimes they end up on the cutting room floor. You know, I get obsessed with something and it just sits there. Um, I was obsessed with the digitization of olfaction for 10 years, looking, you know, for the frog to kiss that would turn into a prince.

Uh, and, and finally that happened two years ago, two and a half years ago. Um, but that was an area that I was really obsessed about. I don't know, there's probably 10, 12 at any given point in time. And sometimes it's things where I'm like, this is inevitable. But it might just take forever and I'm waiting for the entrepreneur, or maybe there's a technological breakthrough.

And if I knew what that technological breakthrough was, it would already be here. So I don't know what that thing is, but then you see it and you're like, oh my God, that's the unlock. Um, so, you know, like brain-machine interfaces; you mentioned CTRL-labs. Everybody was thinking about just, like, implants, you know, even Neuralink, right?

So the natural thing is everybody follows the consensus of, like, implants. And the beauty of what CTRL-labs was doing was, like, no, no, no, we're going to do non-invasive. These are going to be wrist-borne devices that can detect the 15,000 neurons that innervate the 14 muscles in your hand. And that will be the future of your control systems.

And I was like, that is a brilliant insight. And it followed that directional arrow of progress, by the way: we used to go up to these large ENIAC computers and literally pull things, and then you had a desktop computer and you're tickling the keys, and then you have a laptop computer. And I look at that transition and I'm like, okay, we come up with these cheesy terms for it.

The half-life of technology intimacy. Because every 50 years, 25 years, 12 and a half years, 6 and a quarter years, 3 years, technology is conforming and coming closer to you. So the natural next thing after AirPods and watches and all this was that we would just have device-free gestures to be able to control and raise [00:24:00] volume and lower lights and switch between apps and those kinds of things.

And that's sort of what led us to the conviction for

Logan: If you have eight, 10, 12 things going at any point in time, are you waiting on, um, the answer of why now, and someone to be able to come in and say, hey, this is actually inevitable today? Because every idea seemingly has been tried in some way, shape, or

Josh: Yeah, and it's not exclusively that. Sometimes I have a thesis, and I'm totally wrong, and an entrepreneur comes in, or another partner brings in an entrepreneur, and I'm like, oh my god, that's it, it's their vision. And so, you know, something that I've been obsessed with for 5, 6, 7 years has been, uh, the interspecies internet.

An idea that did not originate with me; it was Vint Cerf and Peter Gabriel and, um, uh, Isabella Rossellini, and a handful of, like, weird people that had gotten together. And they were like, what if animals could communicate with each other through technology? So, sort of Dr. Dolittle meets... and I was captured by that idea: that we know that there's an inner life of dogs or of dolphins, and we don't understand their language, just like I can't speak Mandarin.

But technology and a Babel fish can let people communicate with each other. And so are there things, whether visual cues that can be detected by camera and computer vision and AI to know the body language of an animal, or the actual sounds of an animal, from a whale to a dolphin to a dog? And could you have a dynamic where there's this interspecies internet?

And so that's something that we've looked at. And the closest thing we've seen were, you know, dogs that were trained to sort of tap these buttons. I'm sure you've seen it on the internet. And, uh, you know, what are they saying? You know, I want to go outside, I want to go to the park, I want to sleep. You know, it's not half bad.

I don't know, you know, and, um, yeah, there's always some weird thing that we're hunting down and just trying to find the entrepreneur, the technological breakthrough that again, provokes that feeling of, Oh my God, this is amazing. And ideally singularly amazing that there aren't 10 people that are doing this.

Like this is the one

Logan: In that case, um, do you feel, is it a feeling that you get when you actually make an investment decision, uh, you know, to actually give someone money and [00:26:00] back them? Or is it more a long gestation period and a lot of, uh, iterations and cycles?

Josh: the latter.

The Control Labs Story

Josh: Um, I'll give you another example, related to, like, X-Men, where we actually created a company. But in some cases I actually imagine the postmortem, not of the company failing, but of us not winning the opportunity to partner with them. And so I can tell you, and my wife would tell you.

That the circumstances of how we did Control Labs, which itself is a fascinating story that I credit to Chamath, in a negative way. Um, I had sleepless nights because I lost that deal. I felt so strongly that I wanted to partner with this entrepreneur, Reardon, and believed so firmly in the conviction of what he was building.

I literally couldn't sleep. I was like, we just screwed up. Like, it was the one that got away. And I felt that emotionally strongly about it. It wasn't anything in a spreadsheet. It wasn't other people's opinions. It was this gut instinct of, like, I can't let this get away. Actually, I got introduced to Reardon first through a former intern of mine who was, um, a computer science PhD.

And he's like, oh, you should talk to this guy. I didn't take it seriously. I didn't weight it highly enough, because I was like, oh, what does he know, you know, blah, blah, blah. Then we had a partner, Renata, who's amazing, who's gone on to build a firm, Renegade.

And her husband was a lawyer that was working with this company. I was like, oh, maybe we should take another look. At the time, I think it was called Cognizant or Cognizante. It was on 27th and Broadway in a crappy building, and I meet this guy. He's quirky and he's weird and he's interesting and he's brilliant.

Uh, Reardon started at, I think, 17 or 18, auditing classes at MIT. Bill Gates hears about him, brings him on, says, come work for me. He didn't go to college. Single-handedly creates Internet Explorer, and then is Bill's right-hand guy during the DOJ, uh, fight over, uh, Netscape and Explorer.

Retires, made a bunch of money, and then at 33 goes and gets a, um, a college degree, uh, and then gets a PhD in neuroscience, [00:28:00] spending the next eight years on it, and is just obsessive. Uh, speaks, like, perfect Latin, enunciates his T's and everything. He's just a quirky, interesting, singular kind of guy. And I become obsessed with this vision of human-computer interaction.

And it was a domain that not a lot of people were looking at, and the ones that were, were thinking about these implantable devices, and he was thinking about non-invasive. So it was really interesting. So, uh, the round had been done by Spark, Matrix, and I can't remember who else, but let's say those two, and I think they did 11 at 40 or 50.

And I was upset that we missed it. Like, it was there for us to do. And it was, it was an error of omission. We missed it. I get close to Reardon, and, uh, I'm like, okay, we're going to do this. And I think we offered him 20 at 80 or 85 post. So an uptick from the prior round, and

Logan: Immediately after?

Josh: Within six months, maybe nine months, but, but not longer than a year.

Um, Antonio at Matrix said, you know, go to market, see if you can get something better. You know your worst case, and that's the right thing to do as a board member, you know, try to lower your cost of capital. Um, we end up losing the deal. Index does it. And I think they did it at a hundred. And I was like, you know what, we're going to be price-disciplined about this.

I started losing sleep. We should've, we should've. Okay, so here's what happens. John Doerr was thinking about having Chamath and the Social Capital team sort of come in and take over.

But that was quiet conversations in private. John Doerr is incredible. KP is an incredible brand. There's a Fortune article where I think Chamath sort of bragged about the fact that this was happening, and he, and he put sunlight on it.

And I don't think John Doerr was happy about it. And, uh, but he was sort of bragging that, you know, John Doerr wanted him to take over Kleiner Perkins. Um, in the next two or three months, John Doerr poaches Mamoon, and Mamoon comes and runs KP, and Social Capital starts having some dynamics. [00:30:00] Um, and then Mamoon hires Ilya from Index, and Ilya was the lead sponsor on this deal with Control Labs. And, uh, I call Reardon.

And I'm like, what does this mean? He's like, it's not good. He's like, we lost our sponsor. And I said, I'll, I'll buy the stake. And I didn't know Mike Volpi at Index, but a good friend of mine, John Elkann, who is of the Fiat Chrysler family, had worked at Index. I call him. I think I was on vacation in Mexico.

I said, can you introduce me to Mike? He's in possession of a stake that I really want. And I love this entrepreneur. And if the entrepreneur supports it, maybe there's a deal to be done. And I would pay a premium for this. And he puts me in touch with him. I call Mike, I'm sitting on a beach, I explain the dynamics.

And he's, look, he's a class act. He's like, I just want to do right by the entrepreneur. And I said, do you want to, you know, retain a stub of this, you know, just as sort of an FU in case, you know, this thing works out? And he's like, no, honestly, like, we'll dispose of it.

So we bought it at cost, and, um, I had lost the deal, and then a series of dominoes fell, set off, I think, by the hubris of Chamath bragging. And I could never have predicted any of that, but I stayed close to the entrepreneur, and we had a dynamic of mutual admiration and respect, and he advocated for us.

And, and yeah, the rest was history.

Logan: Volpi, a good dude. Yeah. I, uh, I'm curious, uh, in that, so what does Reardon do now? Is he still at Meta?

Josh: He, he left, um, about, uh, six months ago, um, partly, you know, earn-out vesting, and he was responsible for really all neuro. And, uh, I think you'll see this wrist-borne device in Meta devices by the end of the year, being released in both gaming-related things for Oculus, and then, you know, Zuck's aspirations are not to have, you know, destroyed democracy or reinvented ads, but to really reinvent human-computer interaction.

So I think this is going to be a big deal. And I think, uh, Reardon's got, uh, you know, one or two more acts in him. Uh, he is really at the intersection of technology and biology. So I think you'll see something pretty big there.

Logan: Does it keep you up at night? What could have been for control labs as a standalone independent company? Yes.[00:32:00]

Josh: Again, an error of omission, in that I was probably, and Reardon would tell you, the loudest table-pounding director to not sell. I tried financial suasion. Um, you know, give them some secondary or some liquidity. I'll tell you a quick funny story about that. I tried moral suasion. I had my kids, who were, you know, younger and cuter at the time.

And they sent video messages to him, you know, Reardon, don't sell, I'll never use it. Um, one of the co-founders, who's an amazing guy. This was before COVID, I think it was the fall or winter of 2019, and Zoom was active, but not the way that it is today. And we had a board Zoom call to approve the transaction.

And I wanted to support the founders, but I was really begrudging about selling, because I thought this could be, like, a 10-billion-plus outcome. And every time that Facebook wanted to make an acquisition, they proved really prescient. And you look at every deal that Facebook did, and you're like, man, they were geniuses, right?

They bought it for a fraction of what it was worth.

Logan: And, and the sale price ultimately ended up being, for Control Labs?

Josh: A little under a billion.

Logan: A little under a billion.

Josh: And we made a great multiple in a short period of time, but it really should have been a multiple-multiple-billion, uh, exit. And, uh, I'm looking at the Zoom call, and I see Reardon and Antonio and me and Kevin from Spark and, uh, Eric from GV.

And, um, and I'm looking at his co-founder, and I see bunk beds, and I text Reardon. I'm like, is he in his fucking Columbia dorms? And, and he's like, yeah. And I realized that this founder was probably gonna make $95 to 100 million, and I'm trying to get him to hold out for, you know, 500 million, five x or whatever.

And, uh, I was like, I can't believe I didn't offer this guy, you know, some early liquidity, to just get a taste of money. He's probably going from making, you know, $40,000 as a postdoc to a co-founder salary. And then, you know, he's looking at 95 or 100 million, and I'm trying to convince him to take half a billion dollars. Like, you know, but yeah,

Like, you know, but yeah,

The Founder’s Dilemma: Financial vs. Strategic Motivations

Logan: And the, the slope of the line. I'm sure you've experienced this with other sales as well, but the difference between $95 million and $500 million is not one that you'll ever [00:34:00] probably experience.

And so, so the, the need or the motivation for the founder has to be something, at that point, well beyond financial.

Josh: Totally. And, and that was difficult, because I think Zuck was also promising, and has followed through on that promise, that they were going to invest a lot of money into this platform. And so the idea of not having to go out and raise capital in the future, and just having the scale of this massive organization and thousands of people and Facebook Reality Labs and, you know, billions of dollars in R&D to put behind it, was very enticing.

And so even if you were a scientist, researcher, entrepreneur, that itself is very, is very hard to compete with.

Geopolitical Concerns: Iran and Global Politics

Logan: I, uh, I want to come back to some of the investment themes and stuff. But before we get off the point of, uh, geopolitics and some of the very high-level stuff: is there anything else, outside of the Wagner Group in Africa, over the course of the next 18 months that you're paying attention to geopolitically that maybe is underappreciated by people?

Josh: I think the single most important thing, and it's interesting because it plays into domestic politics, but if there was one thing that, you know, you could grant me a wish to change, it would be Iranian regime change. You know, the Iranian people are amazing. Um, the regime is deeply unpopular amongst the population.

Um, and it is responsible arguably for a lot of the chaos that afflicts the West. And, um, and not arguably, it is responsible. And so when you look at the Mideast and our interests there from Hezbollah, Hamas, Houthis, Syria, Iraq, um, and other tentacles that we probably haven't even seen yet. Um, they have been insidious and pernicious and evil and oppressive.

And I don't know why our population is not outraged in the streets at the apartheid against women, uh, the oppressiveness of the regime, the terror tactics. Like, none of this should be tolerable. So that to me is, like, the number one dividend that we would see generationally. I mean, remember, this is a regime that really only took hold in 1979.

And, uh, it's persisted. But, you know, were you to see regime change there, I think, uh, in a positive way, it would be the greatest peace dividend that the world has seen. I look at it, I call it CRINK. You have China, which is a [00:36:00] cold war. You have Russia, which is a kleptocracy, a klepto-state. Um, Iran, which I think is the most evil.

And then North Korea, which is, you know, the crazy brother of China that they can sort of, you know, keep in control, but, you know, let off the leash a little bit when they need to. And, uh, but Iran to me is the, is the number one thing and that's front page headline news. And, um, whether we see, um, some short term detente in a counterattack against Israel, or it ends up with some regional war or series of continued assassinations, I don't know.

Controversial Investments: Defense in Israel

Josh: We've made our first two investments in defense in Israel, which, um, you know, they'll be publicly announced. Um, one probably, I think, leaks in the next few months; one probably not for a while. But, um, that was pretty controversial.

Logan: Mm. Defense in Israel. How so? Like, like some angel-type investment?

Josh: One that's a mix of software and hardware, and one that is, um, really exclusively focused on external threats. Um, not anything to do with Gaza. And, uh, and, and overt weapon systems.

Logan: Hmm. And, and controversial, internally controversial with limited partners? Controversial?

Josh: I think just controversial on the global stage and one that, you know, people will probably protest against and those kinds of things. But interestingly, internally, there was pretty strong support and consensus.

People wanted to know that there were rules of engagement and a moral underpinning to it. And we've got a very diverse partnership, uh, and very diverse views about Israel, Gaza. Um, yeah, we felt pretty strong conviction about this one.

Logan: I, uh, I want to talk about the LP side, or the implications of making a decision about something like that. But, um, I'm curious about your perspective.

Societal Malaise and Happiness Trends

Logan: So, so we talked about, you mentioned Iran and the general, um, indifference, maybe, that a lot of the country feels towards global politics, political stuff in general. And I think you're seeing the rhetoric about Ukraine, that's the case, and obviously it's pretty polarized with what's going on in Gaza, uh, currently, and the different sides of, uh, of all of that.

Um, but I'm curious, [00:38:00] uh, do you have a perspective on the general, maybe, malaise of, uh, of society domestically, and people's perspectives on, uh, global political events or happiness or anything along those lines?

Josh: You know, I think part of this is almost like waxing nostalgic that, Oh, the good old days of whatever. But I don't think there's a better time to be alive than now.

Logan: You don't think we can make America, uh, great again? To what period do you think we're going back?

Josh: I think America is incredible. Like, like literally, that is the question. Where and at what period would you rather live?

And maybe somebody would transactionally say, I'd love to be in, like, the go-go 80s, you know. Um, but I, I want to live in the future. Like, I believe in the moral arc of society, I believe in the technological progress, I believe that our ability to fight and detect disease, the ability to communicate, the democratization of technologies, all of these things point towards an ever-improving world. Not to say that it is a linear path; there's ups and there's downs. But the general push, to me, is toward a better and better future.

So I don't feel the malaise. I think that there are punctuated points where, you know, unemployment is higher than people believe it is. Interest rates and the natural cost of capital are higher than people in the Fed and the presidential election want you to believe. The valuations of companies are higher than they ought to be.

The misallocation of capital is greater than it ought to be. Um, the cultural values of a younger generation, on the one hand, I think are amazing. People seem to be more positive and, you know, focused on environment and focused on social progress. And at the same time, there's a coddling that's been well documented by Jonathan Haidt and others, um, that creates culture wars and, you know, far-left woke and an intolerance of, of basic tolerance. And, uh, so all of that.

But, but to me, it's, like, punctuated things that get disproportionate amplification. And you see that even in the political parties today. Trump's selection of JD Vance, really going out to the extreme of, let's appeal to the MAGA base. And, uh, Kamala with Walz is sort of the same thing, let's go out to the left. And, you know, there is an opportunity for a great [00:40:00] centrist leader.

And, uh, I think in this current game, and it is a game, it's a political game, we don't have that. But I'm, I'm optimistic. I'm optimistic about this country. There's no place I'd rather live. This city, New York City, is incredible. I mean, every day we've got crazy homeless people, you've got knifings, you've got stabbings, you've got people being pushed.

You read the New York Post and you think this is, like, a hellhole, right? I mean, it's like, you can't go on the subway, you're gonna be pushed to death, you know, and the migrants and, uh, but it's the most incredible, rich, culturally diverse opportunity city in the world. And I grew up in Coney Island, Brooklyn, single mom, four of us living in a two-bedroom, one-bath apartment, and the life that I've lived, not because of, you know, um, uh, inherited money, or, you know, I, I joke that, you know, no Clifford Ashton III, you know, gave me anything. It's just, you know, and I want to give more people that opportunity.

Logan: Yeah, I guess one of the things that I, I've been reckoning with or thinking about on these lines is, uh, if you look at a bunch of happiness-related surveys, uh, the GSS happiness survey, one that was launched in 1972: while quality of life by every demonstrable, uh, metric has, has seemingly gone up and to the right, happiness has actually decreased over the course of the last 50 years or so, 60 years.

And it's, it's something that I'm not sure, uh, why that's the case. Maybe it's a function of the human condition. And there's a comparative element of how people view, uh, the world, and maybe social media is to blame for it. I'm sure the pandemic had a big part in it, all of that. But it is, it's a very interesting construct that, that while life expectancy is going up and diseases are being eradicated and all this, we're actually not happier for it.

The Role of Media and Social Influence

Josh: Some of that is social contagion and zeitgeist. So, you know, you could point again, go to Jonathan Haidt, you know, The Anxious Generation, and, and the empirical documentation of teenage girls, increases in suicide, uh, comparing themselves socially, body image issues. Uh, same thing with boys, uh, boys being in some sense emasculated and being told, you [00:42:00] know, they couldn't be, like, tough boys, and, and, um, video games and all these. I watched tons of TV, played copious amounts of video games, had good friendships.

Um, and, you know, a very present mother, high expectations, and I think, in hindsight, like, I had a very happy childhood. I was full of, you know, anger and frustration and took it out in heavy metal and hardcore hip hop and that kind of stuff. But, like, I don't know, I really think it is the stories and the narratives we tell ourselves.

So if you're watching the news, by definition, the value of the news is not to tell you, you know, um, uh, man saves, you know, uh, a ton of money and, uh, invests wisely, or, you know, two people saved from, you know, burning cars. Everything is, you know, if it bleeds, it leads. And so you become almost inoculated, from that, to understand that most of the news, most of the information, and I do read a lot of papers every morning, is intended to be negative and to provoke emotional reaction.

And then you go to the sort of Murray Gell-Mann, you know, the Gell-Mann amnesia effect, where you assume that the stuff you're reading has high credibility, and when you read an article about something you actually know, let's say it was an article about venture capital or something, or, you know, neural interfaces or nuclear, I read it and I'm like, man, these people have no idea what they're talking about. And then you flip the page and you're reading about something else, and you're like, oh, that's terrible, you know?

And

Logan: There's a funny, there's a funny tweet recently about how, like, John Oliver's segments are the best until he talks about something you know about.

Josh: Right. And you're like, well, what a moron, right? Uh, no, it's a great point. So, I think one explanation, and all of these are going to be imperfect and in confluence, but one explanation is news and media definitely influence us. The second thing is the people that you surround yourself with influence you.

Danny Kahneman, who was a friend, just passed away this past year, you know, would always say that it doesn't matter what you think you believe or why you believe it. The most likely reason, the most likely reason for what you believe is that the people around you believe that thing, and the social dynamic of wanting to believe what they believe.

And so you see that amongst, you know, cohorts of young people. They're on social media together. They're in school [00:44:00] together. They have a teacher, where there's the asymmetry of one teacher talking to 40 kids and telling them sort of the same thing. So there's a combination of indoctrination, social influence, the media's bias. I don't say that like I think media's bad; I think media's great. You read left, you read right, you can sort of cancel out the waves and, and come up with a cogent, sense-making understanding and, and handicap the biases.

Um, but, you know, media do have bias, not in their reporting, but to provoke emotional response, in the same way that the algorithms of social media do. Um, and then I think you have some of the things on body image. You know, you're constantly seeing just extraordinarily attractive people. Um, there's people that, you know, feel dissatisfied because they're on Instagram and see all these people living their best lives, which are, you know, fake.

And, and so I think all those dynamics sort of add up to, um, punctuated malaise that then gets exacerbated by even the narrative. But if I started and you started being like, I have never seen, I mean literally, if we started an experiment, I have never seen more happy people. I don't know if it's the summer, I don't know if it's the changing winds of political change, but my god, like everywhere I go, somebody is appreciating the small things in life, the little morsel of food that they just ate, the fashion, the fabrics, not taking for granted life.

You can start a movement that could be just as contagious, that would inspire optimism, and people are like, where is this unfound optimism coming from?

Logan: It's like that Louis C.K. bit about Wi-Fi on the airplane. I find that everyone, no matter what level of success you get to or amount of money you have, like, everyone has, I think the human condition is to have frustrations, and, and to get, uh, mired in the little petty slights, uh, of the day, or the comparative stuff, that you're competing with someone else on.

And, uh, I think that's just, no matter what, there's probably some steady state that everyone reverts to. Some people are going to be positively inclined. You, growing up as a child, uh, in more hardship of circumstances, but you were a happy kid based on the circumstances. And I went on, uh, my honeymoon was in, uh, Mozambique.

Uh, and you know, we, we were on this island and, uh, they, most of the houses didn't have internet. They didn't have [00:46:00] running water, all that, but everyone was so

Josh: Smiling, right?

Logan: It was so, they were, you know, in the sun every day, and they were just very excited that there was tourism coming through, and they weren't thinking about, you know, I can't afford a, a Birkin bag or a, uh, an iPhone, you know.

Josh: You know, so a few things on this. One, I tell my kids all the time, in part, it's almost like a stoic philosophy, but the family's fighting about something. And I'm like, can we just stop for a second? Nobody has cancer. And we know people that have died of cancer. We know friends that have had cancer. We know kids that have had cancer.

It is a beautiful day out. We're all healthy. We have tremendous opportunity. Like, what are we upset about, right? And, and just to sort of level set and reset that kind of mindset, the Founding Fathers had it right. And that Will Smith movie, you know, The Pursuit of Happyness, had it right. Which was, happiness is this hedonic treadmill. Like, it isn't a permanently achievable state. Unless you are constantly in this mindset.

Like, it isn't a permanently achievable state. Unless you are constantly in this mindset. I mean, I wake up every morning and have what some would consider a very macabre, dark view. But I expect that my son is going to come to my bedside and be like, dad, I have a lump over here. Do you know what this is?

And every day that it doesn't happen, I'm like, what an amazing day. How lucky am I that my son does not have cancer? Um, I look over at my wife and I'm like, how lucky am I that my wife is not cheating on me, or vice versa, and that we have a happy marriage. And so not taking all these little things for granted. Because, you know, like the Janet Jackson, like, you don't know what you got till it's gone kind of thing. Which, maybe, I forget who originally sang that.

Um,

Logan: Joni Mitchell.

Josh: Yes, yes. Joni Mitchell, or Laura Nyro. Yeah. And, uh, thank you. And, um, uh, I, I think that one is that happiness set point, which is this hedonic treadmill: you're never gonna be happy if you're constantly trying to strive for it, but just appreciating the things that you have, and thinking of the counterfactual and imagining the negative, and being like, oh my god, I actually have it quite good. I wouldn't trade this right now. There are a ton of people, by the way, that have, you know, a lot more money, and I look at them and their lives and they're miserable, and I'm like, I wouldn't trade for a minute. That hardship, I, you know, I wouldn't want that.

So, I think that's one part of it. Um, but I, I really think it's just the narratives we tell ourselves. I mean, we literally could start, [00:48:00] you know, and for somebody that prides themselves on being the skeptic and the cynic, the contrarian take is, like, life is good.

Artificial Intelligence: Current Trends and Future Prospects

Logan: Uh, artificial intelligence, um, you are something of an expert in that area. You've been spending time in it for a while. You're at least, uh,

Josh: We, we were early

Logan: You go on

Josh: and have been lucky. I'm loath to be called, uh, an AI expert.

Logan: I guess, uh, uh, are there, is there anything that you feel like people are missing currently with regard to artificial intelligence?

Josh: Look, I think the consensus, you know, 2016, I was at an event for Invest For Kids. I'd gotten the insight from Zoox, which was a moderately okay exit for us, selling to Amazon, for self-driving cars. It also should have been much bigger. But they were training on these Nvidia chips back then. And I pitched at this public conference, um, what I thought was the pair trade of the century.

Be long Nvidia, at the time a $15 billion market cap, and short Intel, a $150 billion market cap. And, you know, our LPs and people that attended that, they were very happy. You know, we don't do public-market stuff, but, um, people bought that in their PAs, and, and it did very well. Um, so the chips were act one.
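A dollar-neutral pair trade like the one Josh pitched nets the spread between the two legs' returns. A minimal sketch; the return figures below are hypothetical placeholders, not the actual Nvidia/Intel numbers:

```python
# Market-neutral pair trade P&L: equal dollars long one name, short another.
def pair_trade_return(long_return: float, short_return: float) -> float:
    """Net return per dollar on each leg. The short leg profits when
    its underlying falls, so its return is subtracted."""
    return long_return - short_return

# Hypothetical: long leg gains 80% while the short leg falls 20%.
print(pair_trade_return(0.80, -0.20))  # 1.0, i.e. a 100% net gain per leg
```

The appeal is that the trade pays off on the *relative* move, so it works even if the whole sector rises or falls together.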

There's, you know, chips 2.0, which is AMD and other people that are looking at different architectures. And can you get over CUDA and Nvidia and go into Python and PyTorch and other things? Um, I think that is pretty much saturated. And I think probably what people are missing, and maybe people appreciate now, is that,

um, what the wise do in the beginning, the fool does in the end: there's going to be a glut of chips, where, you know, people believe there's a shortage. The next consensus thing is, well, it's energy and energy scarcity. I think that you will probably see Mideast players build massive data centers here, but I think we will end up with a glut of it.

And in the same way that you had people that were building for, say, crypto mining and Bitcoin that then got repurposed, those things will get repurposed. Nvidia and Jensen would like you to believe that you need cutting-edge Blackwell chips and, um, whatever comes after, for inference and training.

You do need it for training. You don't need it for inference. So that's number one and two. Three is applications, which I think have this half-life. So you've seen user engagement on OpenAI and ChatGPT decline. And, um, there's an endless number of [00:50:00] open-source models that are now competing with most of the two-dimensional text, video, images.

We've got Runway in video, uh, you know, you've got Midjourney and Stability, which maybe was a fraud, yeah. I mean, all these things, I think, are massively saturated. And when I look at my own use, it's part of my workflow. It increases a little bit of productivity. I'm not as obsessed about most of these things.

I use Midjourney to be able to express myself emotionally, because I can put things into an image and describe things in a way that I might not be able to articulate in words. I use ChatGPT for scientific summaries. Um, and, you know, Perplexity for a little bit deeper stuff. But most of those things, I think, people will use, you'll have monthly subscriptions.

I don't think that they're, they're going to be as game-changing as people think, nor do I think that the valuations are justified. You look at the aggregate amount of capital: just looking at Nvidia's earnings, maybe $50 billion has been invested in the past two years in AI-related things. To justify a meaningful return on that, add data centers and energy, probably another $100, $150 billion. You need $250, $300, $400 billion of economic value to be created.

There's maybe OpenAI plus a long tail of other people at four or $5 billion of revenue. Four or $5 billion of revenue is far short of the $200 billion of economic value that has to be created. So I think that there's gonna be a lot of, you know,

Logan: Yeah, it seems like, I mean, for a lot of people now the narrative is, uh, well, Nvidia compared to Cisco: it's nowhere near Cisco in the 90s, because look at the price-to-earnings and, uh, the revenue, and it's totally in line.

Uh, and the demand curve, uh, these, these are real companies buying it, right? It's not internet and eyeballs. Microsoft and Amazon and, you know, Google are buying a lot of these chips and all that. But if the real revenue and use cases don't prove out in a meaningful way, we have Copilot with Microsoft, Copilot for the, uh, uh, O365 doesn't seem to be going particularly well, for GitHub it seems to be going decently well, but you kind of break all this stuff out, and if the demand curve doesn't meet it, uh, then ultimately it has to be a bubble, even if the finances make sense today. If user demand doesn't actually prove out the productivity, then it's

Josh: Circular. [00:52:00] There are two signals supporting exactly what you say. Number one is the market's expectations. And I think it's been clear for two quarters now that this quarter and next, people are looking to see: do we have price-premium translation?

So if you were Adobe and you were charging $29.99 for Creative Cloud, or you're on Office 365 and whatever the monthly cost there is, $49, $59 a month: are you getting another 10, 20 percent premium on top of that that justifies a pricing increase? Or are you also getting cost savings? And, you know, uh, I forget, was it, um, uh, not Kava, uh, Klarna, with the, um, uh, cost savings on customer service? That was, like, an extrapolation. It wasn't an actual economic cost savings, right? And so people went crazy for this.

I think that there's a massive overreaction of expectations on the pricing power that people are going to get; it's just not going to translate. And the cost savings that people are getting, it's not going to translate.

And that means then, and markets depend upon two things, fundamentals and expectations: expectations are very high, fundamentals don't follow, and that gap means value gets destroyed. The second thing: Microsoft just began, um, I think in the most recent quarter, telling people that the payback period for a lot of the CapEx that they're doing is going to be 15 years, which means it's probably gonna be 20, or never.

And so yes, the NVIDIA demand cycle is real, because all of the hyperscalers are buying and building, but they're doing it in this sort of fallacy of composition. Everybody's doing it. They're all getting up on their tippy toes, so they're all going to have, you know, no competitive advantage.

And eventually then they're gonna look and say, okay, we're slowing. So I think NVIDIA's demand curve was pulled forward three, four, five years, and they've just pushed on the FOMO of this to get people to buy. And then some of it is also geopolitically driven, because of restrictions where other countries have bought NVIDIA chips and then, through the back door, you know, delivered them to China and all.

Um, and I think that that was also pulled forward.

Logan: When you have insights like this, are you ever tempted to act on them? NVIDIA on the pro side back in 2016, when it was, what, 15 billion, or maybe on the con side [00:54:00] now. Are you tempted at all to dabble in the public markets?

Josh: I do it personally, and, you know, we give insights to hedge fund friends, some of whom are LPs. But, um, by and large, you know, we are buying long, out-of-the-money call options on private companies.

Um, in sort of the convexity of what venture capital is. Um, the one thing that I think is interesting, and I don't think this is consensus: all AI over the past few years has been basically two-dimensional. So text, video, image, right? And now it's jumping into the three-dimensional world.

And there are two areas where we've been investing over the past two years. I always say that there's a five-year psychological bias where everybody wants to be invested today where they should have been five years ago, sort of the yesterday's yes. And so the things that I think in two or three years people are going to be very bullish on, aerospace and defense being another area, which we can talk about, because I think there's going to be a bubble in the next few years in that space...

It's early to declare that. Um: biology and robotics. AI into biology is really interesting, because most computer scientists underappreciate how hard biology is. Biology is really complex. It is not linear. It is not, you know, programmable the way that many people think. And biologists, by and large, being at wet-chemistry benches and tinkering with pipettes and whatnot, underappreciate how sophisticated computer science and AI is.

So there are people who are versed at the interstices between those two things that I think are poised to do really well. The 1.0 of AI and biology was a company called Recursion, which we took public. It was looking at computer vision of cellular morphology: taking pictures of cells and looking for disease states and the implications of that.

And they were able to parlay that to in-license assets and whatnot; it's a multi-billion-dollar publicly traded company. But that was sort of AI 1.0. AI 2.0, which is where we are today, is actually using large language models, which very few people have other than EvolutionaryScale. This was a group, and this is another interesting theme, which we took out of Meta.

There's a guy that I started a biotech company with years ago. He went and got a PhD in computer science, went to Meta, followed Yann LeCun and Rob Fergus; Rob went to Google, Yann went to Meta, and he started the biology group there. They published a paper a year and a half ago that beat DeepMind's AlphaFold. [00:56:00] And I said, you should spin this out.

And he's like, no, we're good. We're getting 50, 100 million dollars a year from Meta. Um, you know, I'm having the best time of my life. And it's like a 12-person team. He calls me early January. And he's like, can I stop by your office with Eric Lander? I'm like, sure, of course. Eric, you know, founded the Broad Institute, incredible scientist.

What's up, what's going on? And he's like, they're going to shut down my group next week. And it was like just a bell. I'm like, okay. And we offered him 25 million on the spot at, I think, a hundred post. And he did the right thing, and then said, okay, I'm going to go to market and try to get... He ended up choosing us as his lead.

Logan: At a higher price than your offer.

Yeah.

Josh: But that's the kind of thing I want from an entrepreneur. Um, and he's gone on and raised now 150 million plus, from Amazon and NVIDIA and others, and he has the only frontier language model in biology. He really can go from prompt to protein and produce things. So it's jumping out of the two-dimensional space of code

into the three-dimensional space of actually producing a protein. In that case, the first protein they produced was a not-previously-evolved version of green fluorescent protein, a protein that actually lights up. There are many ways that evolution has found to this protein, but they found a path that evolution never found, which is pretty cool.

And so the implication for being able to do antibodies, small molecules, drugs, proteins is pretty profound. That's a company called EvolutionaryScale. And I think that as a strategy it's something interesting: being able to take groups that have received 50, 100, hundreds of millions of dollars inside of Google, Meta, Apple, et cetera. We've done two out of Google: one was Osmo, for digital olfaction, being able to digitize smell, and one was Physical Intelligence, also known as Pi, which is for robots, which I'll talk about next. EvolutionaryScale came out of Meta. And then two out of Apple: one which was announced years ago, called Aeva, which was 4D LiDAR,

which is also related to the automotive sector, and one, since they shut down that group, that has not yet been announced. But I think that's a ripe area: being able to take teams and put them into new ventures, and then you give the parent company either the opportunity to invest or some residual equity. It's a really interesting strategy.

Logan: You've been thinking about smell for a long time, by the way. I don't want to let you just skip over that, 'cause in going back and doing some prep for this, I [00:58:00] heard you yearning about smell even, I don't know, five years ago, eight years ago. You finally found an investment. Can you explain why you think smell matters?

Josh: Yes, and then we'll go to robotics. Um, well, okay. Smell: you know, we've got five senses, and smell is the one sense which is not intermediated by anything. So for touch, you have receptors on your fingers; it's intermediated through the vagus nerve to your brain. Eyes: you know, all kinds of complexity from your retina down to your optic nerve.

Um, taste goes through your taste buds. But an odor molecule comes in, a good one or a bad one, directly to your olfactory bulb, right to the brain. And it is the reason for our ability to chemically sense, from small organisms to humans: to detect positive things that you might want to go towards, a food source, or negative ones that literally create in your face an evolved reflex of repulsion. This is literally scrunching your nose so that you don't breathe in those bad chemicals. It is extraordinarily salient, probably the most salient of our senses. You go back to Proust, In Search of Lost Time, and he bites into the madeleine and he harkens back to a memory where he's, you know, thinking about his childhood.

And I have those moments to this day. You smell something and it reminds you of the shampoo of your first kiss, the nostalgic smell of your grandparents' home, um, you know, a baseball glove that reminds you of a relative, whatever it is. It's just so powerful. And I got to thinking, you know, these devices have eyes and ears.

It's got cameras in the form of eyes. It's got ears in the form of a microphone. And I can hold this up in a cafe when some new song is playing, hit the button, and it's like Star Trek. I mean, people 25 years ago would have thought this was insane, and we all take Shazam for granted today. But literally, it's just invisible sound waves with a frequency and a magnitude, an amplitude.

And it just correlates that with a database of known songs and tells you what it is. That's magic. It's literally sorcery. Why can't we do that with smell? Well, it turns out smell is not two dimensions of frequency [01:00:00] and amplitude. It's, um, maybe 40 dimensions. It's much more complex, and it's a harder thing to crack.

And we have seen dozens of companies that had, like, Smell-O-Vision and all these kinds of things, and it was all BS. And I'm on another biotech board that has to do with delivering genetic medicines. And there's a guy there, David Schenkein, who's at GV, and I was sharing this thesis with him. And a lot of our theses I just share widely, because I'm hoping that somebody will be like, oh, you've got to talk to this person.

And sure enough, he says, you've got to talk to this guy, Alex Wiltschko. I immediately call Alex. He was at Google, spending some of his time on GV, mostly developing a research group. He'd previously started companies, sold one to Twitter, another to Google, and was developing this awesome digital olfaction. And he meets me, and he's never met an investor before who was as obsessed as he was. Now, mind you, his psychological profile: this is a guy who grew up in, like, College Station, Texas. You know, Friday night football. He's six foot three or four, is not into sports, is into perfume. Imagine being that teenager in College Station, Texas. You know, he's just not the popular kid. Chip on his shoulder. I love this idea.

Chips on shoulders put chips in pockets, and he's driven. He just wants to understand smell and fragrance and flavors, and he goes and gets a PhD at Harvard and pursues this. But he is, like, Reardon obsessive. I actually had him and Reardon meet. They hit it off. Reardon's like, he's the real deal.

Alex is like, love Reardon. So he becomes part of the company. We end up starting this company called Osmo. We co-founded it with him, spun it out of Google, brought in Amazon and a bunch of other people, and with Google did a $60 million financing. And this year they will teleport a smell. So there will be a smell in one location.

It will be recorded, sent effectively over the internet, and then replayed in another location. And the ability to give computers a sense of smell is profound, not only for these consumer applications that I romanticize. We know that dogs can detect COVID, cancer, Parkinson's, Alzheimer's. There's a chemical signature to disease.

There are implications for insecticides, to repel mosquitoes: what is that chemical that we're using? Can you find safer chemicals that can repel certain mosquitoes or attract other ones? [01:02:00] Uh, and then there are interesting intel and defense applications, if you wanted them. So you could tag a person very subtly, or detect a person if you wanted, say in a bazaar in a particular region, where you wanted to have smoke particles, you know, dissipate, and be able to tag a whole bunch of people and track them later.

All these things are very interesting capabilities once you can give machines the capability to detect those chemicals.

The Future of Robotics and Physical Intelligence

Logan: Now, you touched on Physical Intelligence.

Josh: Yes. Um, this similarly partially spun out of Google, and partially it's the most brilliant team out of Stanford. And we've seen a lot of things in robotics.

I have a personal view, because I'm always trying to understand the varying perception: humanoid robots are not going to be the way. Uh, you know, Elon's got Optimus. Those are tele-operated. You know, if you watch those tweets, people are like, oh my God, this is amazing. But then the second tweet is like, you know, that was tele-operated, sort of as a small footnote, as though the need for a close relationship to the truth only emerges later. Uh, when you look at Hugging Face, which is one of our great AI companies, you have something like 60,000 open-source chatbot varieties. Um, at the time that we made the investment in Pi, Physical Intelligence, there were fewer than 30 robot models. So just looking at that asymmetry, we said, here's whitespace.

This is really interesting. And the interesting thing for robots is, when you look at how large language models are trained, they're effectively trying to do predictive analysis on what the next most probable word will be, with variations across the different models, and they're trained on the repository of the internet.

All the Reddit, all the Twitter, all the internet archives, you know, every page of the internet, books; and there'll be copyright issues related to those things, and those will get settled. But there's a repository of known information that can give you the statistical likelihood of the next word. Robots don't have that, outside a constrained environment.

Because if you think about 1.0 and 2.0: we started with constrained robots, where we've done parametrically constrained warehouse pick-and-place assembly, and we have a bunch of companies that do that. Um, the next are robots that can move, but in sort of an X-Y grid, like the Amazon Kiva robots doing pick-and-place inside of a factory warehouse. Unconstrained robots, which is what's called 3.0, [01:04:00] are where a robot comes into this room, and it's never been here before, and it is able to do what a four-year-old can do easily: navigate the room, pick up a glass, know how much force to use when it picks up the glass, have an intuitive sense of physics, know that once it touches the glass, it doesn't need to use more or less force to lift it.

Um, all those kinds of things are very hard to do. And the room today at 10 a.m. will be different than at 1 a.m., with a different set of obstacles and wires and things to navigate. So the training data, the repository for universal robot intelligence, is what they are focused on.

And I think that's where the value is going to be. I do not believe that the people doing the end effectors and these human-like, Westworld, anthropomorphic robots are going to survive or win. I think they will attract money, because it's a very sexy thing; particularly if you saw Westworld, it looks cool.

Great for prosthetics, maybe great for some things in military or in gripping, but by and large, I think the value is going to accrue to the operating system for robots, which people have been working on for a very long time, and which depends upon this training and repository of data, which hasn't existed.

Why do I think that the humanoid robots don't make sense? If I had a cup here, or a bottle with a cap, my hand is limited in how it can turn it. But if you took a first-principles robot approach to this, you would literally have something like a drill bit with a suction cup: you put it on, it spins the cap off, and then it changes, Swiss-army-knife style, to another end effector.

It wouldn't have what nature evolved: opposable thumbs and four or five fingers. And so, yes, the built world around us has stairs, and so you might want something that can go up bipedally like we do. But if you were inventing a car today, you would not go back and do it Fred Flintstone style, with two large cylindrical stones and a guy running inside; you invent torque and the engine and the motor and internal combustion.

And so I think that engineering beats evolution. And I think that this is really inferior to what robots can do.

Logan: Do you think, therefore, you're going to need vertical integration? 'Cause one of the constraints, I guess, that exists is actually understanding the form factor by which you operate in the wild, in an unconstrained way. And so how does that dynamic play out?

Josh: I think that the winning models will be highly specialized robots that do one or two specific things, and do them extraordinarily well, and then this universal operating system. The humanoids will basically go into what I think will be a mass competition, partially with Unitree in China, which basically copied Boston Dynamics, and a series of other humanoid robots, and that will consume and burn and probably destroy billions of capital here in the US and in Europe.

Um, but I think that the, the value will accrue to the intelligence.

Ethical Implications of AI and Defense Technology

Logan: Um, staying on the topic of AI: how do you think about some of the ethical implications of artificial intelligence, and, I don't know, things around genetic engineering, or some of the social impact that could come with some of the technology we're currently building?

Josh: The overall ethical view I have are three things.

Um, the first one is sort of do no harm. And, um, if you think about a technology and you know that it can do ill, and you imagine... you know, failure comes from a failure to imagine failure... then you want to put some stopgaps on that. And I don't think you need regulation to do that. You just need common sense and good business and self-interest.

But the second thing, I think, in the ethical domain, is imagining a world where the particular technology didn't exist. And I think that there is a moral imperative to create more and more technology so that more people can have access to it and express their genius. And the third thing is the moral valence.

I used to have the view that technology is neither good nor bad, but thinking makes it so; it's just what we apply positive or negative valence to. Reardon from Control Labs actually persuaded me that that's wrong, and that the correct way to think about this is that the business model matters.

[01:08:00] FaceTime is a benevolent means of connecting distant people and making them close. Social media based on advertising has a perverse bias toward engagement and enragement, and, you can argue, a negative valence. And so the business model where Apple is not trying to get you enraged and engaged, and is just providing you a tool to literally do video telephony, is different than somebody that is getting you addicted and hooked and trying to hack your attention and your mind and your emotions.

And that is entirely premised on the business model behind the technology. So those are sort of the three lenses that I look through. I think that we underestimate how much good comes from technologies, because it's hard to imagine. And you have to expect that the one-in-a-billion person out there imagining the positive thing is going to far outweigh the negative when the diffusion of that technology and that invention happens through society: the economic value, the lives saved. I mean, think about, like you mentioned, genetic engineering: Norman Borlaug

and the Green Revolution that has fed billions of people since the discovery of genetic engineering, recombinant DNA for food, protection against insects and whatnot. It's just enormous, versus the fear that a small group of people had of the one-in-a-billion evildoer who was going to recombinantly engineer plants to kill people or something.

Um, so I think we underestimate the positive, we overestimate the negative, and most technology has proven to be far more beneficial. It's hard for me to point to a technology that has not been pro-social and positive, other than, and this is something where, like, Palmer Luckey and I would disagree, a technology like a gun, which has a very overt purpose, which is to kill or to thwart.

Logan: Hmm. I guess, to that point, how does that fit in with your defense tech thesis and the types of things you will or won't do from an investment perspective?

Josh: You know, when we were first looking at defense, it [01:10:00] was very overt that we were not going to be cowardly about this. There were companies at the time that were like, look, we just make the technologies; we don't decide how they get used if the Department of Defense, the Pentagon, the CIA uses the technology.

And we were more of the view of this great arsenal of democracy. There are bad people in the world. There are not a lot of them, but there are bad people, and they want to do bad things. And for them it really is a moral thing, based on what they believe.

Oftentimes very religious. They have a set of values and they believe things, and we wish that they would believe other things. But they can't be reasoned with. They can't be persuaded or negotiated with. They literally want to do harm or hurt or kill. Those folks need to be thwarted, and they need to be discouraged, and usually that is through might.

The idea is that if you do something, the punishment and the consequences to you, in your own rational self-interest, will be so severe that you don't want to do that thing. And that's sometimes the only way to negotiate with these very bad actors. And so I think that the persistence of American greatness, and our economic striving and thriving, and the market systems that we have, and the substrate that attracts the world's best and brightest to work here, is all premised not just on American values,

but on the fact that we have the strongest military in the world. And the idea that people are threatened, intimidated, deterred by that, I think, is what makes America great.

Logan: As your strategy has evolved over the course of the last... when was Lux founded?

Josh: 20 plus years ago.

Logan: Twenty-plus years ago. And the most recent fund was how big?

Josh: A billion two.

Logan: A billion two. Uh, I assume your LP base has matured along the normal slope of going from, I don't know, individuals to family offices to fund of funds and institutions over that curve. I'm curious: a lot of these things you've talked about, defense tech and all of that, [01:12:00] in our LPA we have to have a lot of conversations about, you know, what we can and can't do. So I'm curious how those discussions go. Are you guys just given a broad mandate? Do you have rules that you end up codifying for limited partners? How does that play out?

Josh: Short answer is, um, there are very few things that we really can't or won't do.

Logan: And so you've had LPs, I assume, self-select out, in talking to them?

Josh: Yeah, we had people that were, you know, averse to nuclear. We have had people that are averse to defense. Uh, we have people that have, you know, asked us to fill out surveys about ESG, which I often refuse to do. And I typically retort about that.

The Complexity of ESG and Moral Impositions

Josh: Not because I don't think that elements of ESG are social and positive, but, um, it's the same thing. Like, I've been asked to sign, um, you know, surveys, or... not protest letters, what are they called when you're signing on? Petitions. You know, that we promise to have a diverse team. And I'm like, a lot of these things are just good, logical things to do, because you're communicating to an audience and you want to appeal to them and you want to win entrepreneurs.

But it's like asking me to sign a statement saying I won't beat my wife. And so I've objected to that, this moral imposition that I need to sign on to a petition to say we're part of this movement, versus just doing the right thing without the signaling. And for two decades, we've done the right thing.

Defense and Environmental Catastrophes

Josh: Uh, so I would counter, when I get these queries about ESG: do you consider defense ESG? Because I can tell you that if Ukraine had kept nuclear weapons, that would have been the best thing we could have had, given the environmental and humanitarian catastrophe, not only the one that happened there, but the one that has afflicted much of the African continent, if you truly care about these people.

And if you look at Greta and the rise of the Green Party in Germany and the admonitions [01:14:00] against nuclear: Germany shutting down its nuclear power plants literally delivered the country into Putin's grip. So understand and appreciate that the world is complex, and that there are moral valences to these things,

and that there is virtue in the good people who are developing these technologies. I mean, I struggle.

Investment Ethics and Controversial Decisions

Josh: Um, there are things that I felt uncomfortable with, that we would not fund.

We wouldn't have been investors in Juul. Where you stood on that issue, I realize, depended on where you sat on the cap table, and there are people that I really respect who were in Juul and made a ton of money, and their argument was: better than smoking.

Logan: Yeah. Other people started in Pax and got rolled in too.

Josh: And I could see that. I could squint at it. But we felt it was probably more likely to democratize this and get more people addicted to nicotine; sort of, yes, better than lung cancer from cigarettes, but bad. And so that was one where we said we're not going to support or participate.

Logan: And there was no LPA consideration or LP consideration at that level? You guys are making these calls yourselves?

Josh: No. I think that a fund that restricts itself from something on a moral imposition that can be arbitrary and changing cedes an advantage to another fund. And I will tell you, in some cases, us and Founders Fund are co-investors in a bunch of things where other people have said,

you know, I wish that we could have made that investment, but there's no way that our LPs would ever go for it. And these are large, sophisticated, smart funds. I mean, with the Israeli defense company in particular, there are people who are like, I wish I could have made that investment, but there's no way our partnership or LPs would, because they're so concerned about what people will think or say about it, versus looking at it and saying: do you believe that this should exist?

And is this net positive good for the world? And our view in theirs was yes, and other people might be no, but that's what makes markets.

Logan: Yeah,

I can imagine there's stuff. I can't come up with a specific example, but I imagine there's stuff that we would have had to [01:16:00] litigate in a more meaningful way with limited partners. And Andrew was a...

Josh: You know, I think there are some religious institutions that do not support things we would do. We are very much, as a partnership, and certainly Peter and I as leaders of the firm, for women's health and freedom of choice, and we are pro-choice individually, not as a firm. We don't have a political view as a firm.

But, I would 100 percent make an investment in pills or treatments or technologies for abortion. And there are, for sure, deeply religious institutions that say we don't want to support that. Um, and they would probably choose not to invest with us, and we would choose not to take their capital. But we wouldn't compromise on what we think is morally right for the world.

Logan: Hmm.

The State of the Venture Market

Logan: Uh, touching on venture capital today, what's your perspective on the state of the venture market?

Josh: It's going to contract significantly. If you look at the amount of capital raised over the past two years or so, 125 billion a year on average: to get a three-X or four-X return on that capital, you need 375, 400 billion dollars of returns over a seven-, eight-, ten-year period.

If you assume that many of those companies go public, and that in aggregate venture or outside investors own 60 percent of them, you know, you need something like 750 billion in exit value.

If you assume that 5 or 10 companies could be 20, 25 billion dollar companies, you've got somewhere between 100 and 250 billion dollars.

So it's the same sort of analysis that we were doing on AI, just looking at the aggregate capital: there's probably going to be five or six hundred billion that is destroyed. And, um, that's just not being realized yet. And so I think the excess of the past few years that led people to chase returns, which we and others have been beneficiaries of, in both exiting some of our companies and raising capital, will be severely constrained.
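The venture math above can be sketched the same way as the AI capex math. All figures are Josh's rough estimates from the conversation: the 3x multiple and the 60 percent outside-ownership share are his stated assumptions, and he rounds the required exit value up to roughly $750 billion:

```python
# Sketch of the venture return math (Josh's rough figures, not audited data).
capital_raised_bn = 125          # annual venture capital raised, his average
target_multiple = 3.0            # the 3-4x gross return the asset class needs

needed_returns_bn = capital_raised_bn * target_multiple   # ~$375B of returns

outside_ownership = 0.6          # share of exit value owned by outside investors
needed_exit_value_bn = needed_returns_bn / outside_ownership  # he rounds to ~$750B

# Plausible outcomes: say 5-10 companies worth $20-25B each at exit
optimistic_exits_bn = 10 * 25    # ~$250B at the optimistic end

shortfall_bn = needed_exit_value_bn - optimistic_exits_bn
print(f"Exit value needed: ~${needed_exit_value_bn:.0f}B")
print(f"Optimistic exits: ~${optimistic_exits_bn}B, shortfall ~${shortfall_bn:.0f}B")
```

With a 4x target instead of 3x, the shortfall is larger still, which is consistent with his "five or six hundred billion destroyed" figure.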

Challenges and Changes in Venture Capital

Josh: You will have limited partners that were [01:18:00] smart, and again, what the wise do in the beginning, the fool does in the end, that over-allocated to the asset class, funding emerging managers, funding small funds. You know, I've sort of bifurcated this into the minnows and the megas. There are probably several hundred firms that are sub hundred million dollars.

And there were a dozen 50-to-100-billion-dollar-plus firms, the Tigers, the SoftBanks, which were once knighting unicorns. They were coveted: to have their money, and the size and the valuation that they would give you. It's now taboo. If you're taking money from SoftBank, or in some cases Tiger or others, people ask, well, why, you know?

And so I think that there's going to be this compression, in both LP interest and even founder demand, toward Redpoint, Lux, Sequoia, Founders Fund, Khosla, et cetera. And those are sort of the one-to-two-and-a-half-billion-dollar-size funds. Um, we can do everything from a hundred-thousand-dollar check to a hundred-million-dollar last check pre-IPO.

We can do corporate spin-outs. You have flexibility to look at the marketplace and not be siloed because you're an AI fund or a robotics fund or a crypto fund or a SaaS fund. And so everybody at Lux is a generalist by design. Um, I think that you will have probably 30 to 50 percent firm death. And people say, well, there's so much money sitting on the sidelines.

There's so much dry powder. And I say, no, that's mostly wet powder, and people may not realize it. In up markets, it's easy to fund your companies, because somebody is going to come and price it up and reduce your risk. And, you know, in down markets it's really hard, because you have broken syndicates and people are under-reserved.

And what might be an okay company for me is that fund's star company. And we might say, we're just not going to participate in it. And they say, oh my God, we have a broken syndicate now, and there's the signal value of that, and it ripples. You have a fund that suddenly is facing internal issues: we just had a major write-up and we just got knocked down.

We have no DPI. We haven't returned money to our investors. There's infighting in the firm. You just saw, I forget if it was NextView or OpenView, the Boston firm: you know, they just raised, [01:20:00] what, five, six hundred million dollars? And then two weeks later announced that they're shutting down? I mean, you knew, or can infer, that there were probably internal issues.

People didn't like each other. And that's a really hard thing, because we're not a factory. We're a partnership. And you have a handful of people, and you gotta make decisions, and you gotta decide: do I want to be married to this person? I've been very lucky with my professional marriage with Peter, who's very much like my wife; they're both optimists and, uh, and positive. But, you know, we've known each other since we were sophomores in college. He was at, at, uh, Syracuse.

I was at Cornell, and, uh, I, I trust him with my family, with my wife, with my money. Like, you know, we have a very good thing, but I don't think everybody has that. And I think that you're going to have a lot of firms that fracture and fail, a lot of LPs that are overallocated, a lot of people that will go chase retail money.

And then I also think that you're going to have a handful of firms that actually go public.

The Future of Venture Firms and Public Listings

Logan: Well, so, so one of the things that I, uh, I agree with, I think, intellectually, or at least I want it to be true.

Uh, one of the, the things I've been struggling to reconcile is, uh, it's not the three, four, 500 million fund going out of business, or even the 75 million fund, that's soaking up a lot of capital. It's the three, four, five, 7 billion fund. Uh, they seemingly still have DPI, right? Because they have these big returns that came in 2012, but maybe they've, they've scaled their assets under management.

Uh, or fund size, 6X'd or 3X'd or 10X'd in some cases. And I would have thought at some point that the flywheel, uh, breaks. And post 1999, 2000, 2001, what seemingly happened was there was only a handful of people that were focused on privates, and the endowments, or the more sophisticated people, pulled out of the market, and therefore it corrected.

And two things seem to be happening right now that I'm curious to get your perspective on. One, a lot of those people are saying, well, this is going to be a great time to invest, vintage-wise, and so we're going to be committed long-term to the asset class. And so they're not actually pulling out. Maybe they're triaging in some way.

Uh, but [01:22:00] two, if they are triaging and saying these managers have gotten way too big, um, the managers are actually saying, that's fine. Your 30 million check, uh, I would actually prefer to go get 200 from the sovereign, uh, and I'll have one less LP, uh, or we can consolidate five LPs into one. And so one of the things that I, I've...

Uh, seen, or seemingly felt, is that there hasn't been this reconciliation happening. Mostly, I think, as a result of, I, I don't wanna say only oil money, but maybe sovereign money, that's sort of papering over what would've been the normal cycle. I'm curious your perspective on that, 'cause that's the thing that I've been sort of, um, uh, reckoning with: how is this actually gonna correct?

Josh: I think, I think that the institutional LPs that have relied on the consultants... in many cases the target allocation for private equity was 30 percent, and venture was a meaningful portion of that.

Um, and, you know, still probably 15 percent on a, on a, um, allocated basis, and 85 percent traditional private equity. Um, private equity was going into venture, venture was going into some later stage and growth, but I think that that's going to be the one pocket where it is going to be, yeah, these 30 million LPs, that are 30 million LPs in 10 funds, that are selling, saying our 300 million venture allocation this year is going to be 75, or we're actually hitting pause until we get money back.

And some of that might be, like, you know, an investment committee is looking and they don't want to be guilty. And so they look at the CIO and they're like, man, she or he made bad mistakes, and so we're going to make a change. Well, when that change happens, that's another thing that is a catalyst, because a new CIO comes in with a different program.

They have a different set of relationships, or they decide that they want to outsource this to a consultant, or they decide, you know what, we've been losing money doing this. We're paying too many fees. We want to have a direct program and we're going to do it ourselves. And in some cases you see that going to the sovereign money.

I actually think they're really sophisticated. And you're seeing this with groups like MGX in the UAE. I do not think that [01:24:00] they want to be seen as, and I do not think that they are, dumb money. They are not going to sit and just be passive investors and, you know, let people just take fees. And I think that they're going to go direct into companies, because they have the means and the ability and the balance sheet to do that.

And they have sophisticated people that work at these places. So I actually think that in some cases they may have been LPs, and maybe they still will be in some cases, but I think by and large they're going to be competing with GPs, and it's going to change the landscape, and people will be like, oh, I thought we were going to the Mideast for money.

No, no, those guys are doing direct investments, and they're doing bigger deals than you could ever do. And why would they pay you? Because they could take the GP that was at this prior place, who, uh, you know, now is no longer in business, and they're running the direct program, you know, at the sovereign. So that would be to me an unsurprising aspect, where you now have a much larger... you're not dealing with a 5 billion or 10 billion or even 50 or a hundred, you're dealing with, like, a multi-hundred-billion-dollar entity that can really weaponize capital and do huge deals.

So.

Logan: I just, I guess I wonder if, if, if a 300 million stake in one of these managers allows someone to go out and keep their fund at 4 or 5 billion, uh, and for the, for the UAE, or whoever it is, that could be a rounding error in their overall program, and their access to directs that maybe allows them to try to put in another 5, 10 billion.

Josh: That will happen, but you're talking about probably less than 20 firms that that could be with, and then you have the cultural and values questions, you know, do people want to give up a piece of their GP and do that? And, and some people will. I do think that there will be probably five or six firms that we historically knew as venture firms that will be multi-asset-class, publicly traded firms.

I think Andreessen will be public. I think General Atlantic will be public. I think, um, General Catalyst will be public. Uh, I think Thrive, if Josh wants to do it, could be public. And there's a handful of firms that will be sort of 75 to a hundred billion AUM. The proxy and precedent for this would be looking at the buyouts in the last crisis. Post '08, '09, you had TPG, which was last, but KKR, Carlyle, [01:26:00] Apollo, uh, Blackstone, that all went public. It produced permanent capital, it changed the nature of the partnerships, it changed the nature of the investment profiles and the limited partners that were willing to invest, but it, it created, um, uh, perpetuity and, and permanence for those firms. And I think that you will see at least three do that, but probably half a dozen.

Logan: No aspiration? This is new?

Josh: No aspiration to go public. Um, I like the partnership we have. I like, and this is new for me, investing in our younger people. And, um, you know, you're, you're supposed to not say young and discriminate based on age, but they are younger.

You're supposed to say junior, and that feels to me pejorative, because they're in many cases not junior. They're senior in their intellect. They're super smart. They're well connected. They're kinetic. They're, you know... You look at Brandon Reeves and Grace Isford and Lan Jiang and Shaq Vayda and Tess van Stekelenburg. Like, they are the future of Lux. These are people who every day are sourcing deals and intel, competitively winning, that if I was an LP sitting back, and I was like, what firm would I want to be invested in? I would want to be where the young people are the ones that are attracting the entrepreneurs.

And if I talk to founders, who do they want on their cap table? And they're saying these individuals in this firm. So that to me is the defining thing. And Pete and I, I think over the past five years have made some transitions of older partners and brought in younger partners, and that is the future. And there's an appreciation from the younger people because we're not so hierarchical, uh, in the share of the pie in the economics.

And I look at people that have built firms that I admire, and that's what I think we want: just a continuation of a pipeline of talent who are driven competitively but are sort of mensches on missions. They're, they're nice people. They're intellectually interesting and interested. And they're kind, and when an entrepreneur references them, they're typically seen as the number one or most valuable person on the cap table.

You know, to hear Clem at Hugging Face talk about Lux, or talk about Brandon, or to have Vipul at Together talk about Grace, or... [01:28:00] that is just a powerful thing, and it helps us win our incremental deals, and it makes them feel, and we feel, like we're doing something right.

Logan: I wanna, I wanna come back to, to Lux, but, uh, I guess the last point in the venture ecosystem, um, I don't know if you've tweeted about this or not, or talked about it, but, um, there's a lack of liquidity in the ecosystem, and, uh, there's been a lack of M&A, particularly with regard to Lina Khan, and, uh...

I don't know if you guys have been involved in any of these billion dollar plus acqui-hires, uh, that are seemingly going on. Do you have any perspective on, uh...

Josh: Yeah, people have joked, like, instead of M&A, it's, you know, L&A, like, license and acquire, or acqui-hire.

No, I think that that is regulatory arbitrage. People recognize that there's no way that they're going to do a deal. I mean, we were worried even back with CTRL-labs whether that Meta deal was going to be approved.

Logan: What year was that?

Josh: late 2019.

Logan: Probably doesn't get approved today.

Josh: Probably doesn't get approved today. Now, I think that that is a national sovereign mistake.

And so whoever is elected president, whoever... I think that if I'm a foreign adversary, particularly China, that can weaponize capital and talent and do deals globally and whatnot, that the number one thing I would want is a very tight, scrutinous DOJ/FTC regulator that, you know, suppresses M&A to appeal to the populist sentiment that big tech is bad.

I would want that to persist, you know, be the equivalent... Like, I joked, when I did stuff with special operations and came back to the Pentagon and they asked, you know, what would you want to change? I said, if I was a foreign adversary, I would want, like, the red stapler guy from Office Space, you know, here, just making sure that your bureaucratic incompetence persists.

So I would want a highly regulated, uh, suppressed, non-M&A environment. Now people say, well, big tech's just going to keep getting bigger. And I always lament, you go back to every major tech monopoly: it was a perceived monopoly, and lying in wait quietly was a competitor that nobody anticipated.

Google to Microsoft, Facebook to Google, OpenAI to, you know... It's always there waiting, and the greatest regulator is the natural forces of the market and the desire of somebody, in [01:30:00] their personal ambition and their greed and their competitive spirit, to take on the incumbent and beat them in a way.

And so we should unleash those forces. That is what makes this country truly great. And uh, but going to your starting question, liquidity, it's, it's real. It's an issue. Uh, people, I think, hope that you will see something in the public markets crack and open. And I think that the public markets will be far more scrutinizing.

Public markets have benefited from a broad brushstroke indexation of buy everything. Now there's certain limits on what they can buy and on new issues. Uh, you had the SPAC phenomenon, which was brilliant and then a total disaster. And you have limited M&A because of the regulatory scrutiny. I think that there will be foreign acquirers, who also face issues, um, but I think that there will be pools of capital.

There will probably be creative pools formed by the likes of Apollo and other large private equity firms that figure out liquidity solutions, that are giving probably 11, 12 percent returns, and not 30 percent returns, to venture as an off-ramp. And some people may say, we need to take that.

Logan: One thing I've, uh, I haven't heard you talk about, in a postmortem example, uh, or conversation, um, is, I remember when I, when I guess we got to know each other, you were originally, uh, uh, skeptical... You've always been long SpaceX, right? But you were skeptical of Tesla and Elon and the accounting. And I think I've heard your postmortem be: it's hard to bet against religion, or people that are religious about it.

Uh, I'm curious, as someone that, uh, declares himself a cynic but also is an optimist, uh, of, of being vocally, uh, adversarial to Tesla, or any... like, where, where did that come from?

Josh: You know, I got a lot of internal criticism. Uh, Peter always hated it. Um, you know, our head of marketing hated it. Uh, people hated it, because they're like, you're shitting on somebody's hero, you know? This is, like, so many entrepreneurs that we want to partner with love Elon, and you're like, no, this guy's relationship with the truth is really questionable, or, you know, he's lying, [01:32:00] or...

Logan: I think it's proven even more true today than, uh... I, I, that's probably more mainstream on both sides.

Uh, the people that, uh, view his relationship with the truth as a little bit more perverted, I think that's true. And I think the religion of Elon is...

Josh: Yeah. And, and, and I think you cannot take away: he is arguably the greatest capital raiser in the history of technology. I mean, I can't point to somebody else that's better. And I think that Bezos was a greater capital allocator. I think that Amazon was built on a small amount of equity invested, and you can argue about some stock option and debt dynamics, but Amazon is an incredible business.

I don't believe that Tesla has been an incredible business. It has been an incredible stock, and it has benefited from religious fervor that is unrelenting in its belief in him. And that is credit to him. I will not take that away. Um, but if you actually did, like, a fundamental analysis and looked at, like, the ability to generate capital and incremental returns on incremental capital, it's terrible.

Uh, and I do think that there were warranty reserves and accounting that he frankly got away with, and it doesn't matter anymore. So I could be right about there being borderline or actual accounting fraud, and it doesn't matter, because they got away with it.

Logan: Would you have, uh, done it again in retrospect?

Josh: Done the outspoken criticism. Yes, because I think one of the great freedoms is I can be critical of something, and people can counterattack me. Um, you know, for a few years, the people that I was most critical about were Trump, just because I found him to be self centered, uncivil, immoral, and also a liar.

Uh, Elon, and Ray Dalio, and then IBM, with their claims around quantum computing and Watson and AI. And, you know, I think people realized Watson was bullshit, so I felt vindicated on that. Um, there's still a huge church of believers in Ray Dalio. I still do not believe that there's substance there. Um, Elon is what it is, and Trump is what it is.

And so, there is religious fervor. You know, anybody that would sort of get those people tattooed on their bodies is proof positive that, you know, you can't win a [01:34:00] rational argument against that. And, um, I can be right about the fundamentals and totally wrong about the persistence of the belief.

Logan: What, uh, the Dalio point, I haven't actually heard you espouse that. What is your view

Josh: I think Bridgewater is a levered long bond scheme that benefited from a generational decline in interest rates. I think that the people that work there are absolutely brilliant. Um, I think that they put out mostly research. Very few of them are involved in any investment decisions. And I think that there's this Wizard of Oz phenomenon that's been created.

And it has attracted capital, large state and pension and other capital. I think they benefit from the daily observations. They feel like they're part of a club. Uh, I don't know if there's China money involved, and there's other speculative things, but I just don't believe, having listened to him as an investor...

Not as a firm builder, not as a reputation builder, not as... but to me, those three people are sort of the same characters, you know: Elon, um, Ray, and,

uh... you put him in a room versus Stan Druckenmiller or, um, uh, Seth Klarman or Buffett, or even Bill Ackman, and the ability... because this was one of the things that really actually attracted me to this idea.

I was like, hmm, something's untoward here. Um, actually there were two incidents. One was a charity event where Bill Ackman and Ray Dalio were in conversation, and Bill asked Ray, you know, so how do you do what you do? And his retort was, well, how do you do what you do? And Ackman, at the time, who was, you know, less, um, aggressive on Twitter and whatnot, just gave a very, uh... well, I look at fundamentals of companies, and I read 10-Qs and 10-Ks, and I speak to company management, and I try to find discrepancy between price and value.

And, you know, it's like a value investor. And so then he asked again, how do you do what you do? He's like, well, we use computers. And I was like, oh, something's sort of untoward here. And then I happened to be in a small meeting with him. It was maybe 10, 12 people, at the Council on Foreign Relations. And most of the people were sycophants and were like, Ray, you know, tell me about your path to greatness.

And, and, uh, somebody actually asked a legitimate question, and it was a multivariate question of, you know, [01:36:00] if China retrenches and Saudi increases oil, and bonds were to go from 6 percent to 4 percent, you know, what would that mean for equities and for volatility? It's a sophisticated question, to think about

global macro and what levers matter more. And he's like, you know, these are complicated questions. They're complicated questions. And when I have complicated questions like this, what I like to do is transcendental meditation. And me and the guy next to me, who happens to be a sophisticated opportunity fund value investor, a young guy, were, like, bruised by the end of the dinner, like, elbowing each other.

Because it was just like, this is just flapdoodle. There was no substance. So I think my skepticism of, like... I admire excellence, and true excellence to me, you can see it in athletic prowess. You can see it in an intellect. You can see it in a writer. You can see it in a musician. And you can see it in investors, when they talk about their book and the companies, and the way that they appreciate and understand the dynamics of markets, or companies' levers

of economic value creation, and they can understand market sentiment. Even a guy that uses computers, like Cliff Asness, is a brilliant investor. Um, and I just, I just have heightened skepticism, and it's sort of, like, the emperor has no clothes for...

Logan: There's an element of that as well, of storytelling. Uh, and I know you're a... I don't know if it's that you uniquely appreciate storytelling and narratives, and, uh, uh, in some ways, I'm curious how, uh, your, your cynical side, or your lack of trust in, in people saying things that don't totally pass muster, uh, must be somewhat at odds with your appreciation for the narrative and the ability to conjure up these, these stories and build these movements.

I'm curious how you, how you think about those two things. And maybe for people that don't know, uh, Maybe you can describe, um, your view on storytelling and its importance.

The Power of Storytelling in Venture Capital

Josh: Well, we, we are storytelling primates, and, um, it is the one technological device, truly, the [01:38:00] ability to conjure images in our minds and to believe them, that has persisted since far before physical technology existed. You know, at the campfire, people galvanized around the idea of a nation-state or a currency or religious belief, to believe a story and to have that shared belief.

Through language and, um, and customs, it is extraordinarily powerful, and it's one that has been evolutionarily selected for. Uh, you know, it's such a cliched thing, but if you read Yuval Harari, and there's another one, I think it's called The Storytelling Animal, um, Jonathan, I forget his last name, but these are just great books that present you with this undeniable view that we believe things together, and you have an advantage over people that are diverse and don't believe the same things.

And so it's why religion has persisted. It's why the idea of the United States, or the belief in the U.S. dollar, or the belief in a narrative around NVIDIA or Elon, or why we wear the things we do. I mean, these are all just stories we tell. And we want to believe things that other people believe. We want other people to believe things about us.

We try to tell stories and present an image to people. Everything that we do, in some cases, is, you know, showcasing something and trying to influence what people believe. Um, and so, you know, even my own self-awareness about wanting to be intellectually competitive and know things is telling people a story, so that they leave having heard what I say and say, he's intelligent, or he thinks differently, or I want to invest with him, or I'd like to be partnered with him, or he would be fun to have at dinner.

Those are all things that I'm trying to provoke or invoke, because that's the way that I want people to feel about me. And so storytelling is a way to literally... I tell this to my son, who doesn't enjoy writing: you have an idea, and you literally can take that idea and translate it through your muscles into ink on a paper, and somebody will read that with their eyes, and then it infects their brain, and they are seeing what you are seeing in your mind.

I mean, that is, like, ultimate virtual reality. So storytelling is powerful. And then when you make it very practical in venture capital: somebody could tell me about their great scientific breakthrough, and they could do it in the most boring, uncompelling way, [01:40:00] and I'm emotionally unmotivated to do anything. And somebody can sit there and describe something... and there are scientists, like, we have a scientist that we've done probably five companies with, itself a testament to not only his scientific sophistication and prowess but his storytelling ability, his silver tongue: Charles Zuker.

I mean, anything he comes and tells me, shows me... First of all, he has an amazing Peruvian accent. Oh, sorry, Chilean, Chilean accent. And, uh, he's a scientist at Columbia University, and I love partnering with him, and I trust him, and his science is real. But the way that he tells a story is so romantic. It's like, anything else that I was thinking about that day... it's, oh my God, this is the thing I'm obsessed with now.

He has a unique ability to communicate science. Feynman had that in science. Um, you know, Steve Jobs had that. Elon has that. Larry Ellison has that. The ability to speak to a group of people and get them to believe what you believe. And every great entrepreneur, by the way, is basically like, I want this thing to exist.

And I'm going to bend the world in an unreasonable way to see what I want and to make it so. That is all from storytelling and narrative, and it's communication and influence. And so the psychology of that fascinates me. It fascinates me, from the things I'm most skeptical of, the carny huckster that's cajoling you to come so that they can take your money...

Or the Sunday preacher, the Joel Osteen, that's selling you faith and, you know, uh, addressing your insecurities and, and, and taking your money... to the people that actually have a breakthrough and need to communicate it so that they can get people and talent and money, and are more deserving. I like scrutinizing that.

And there is this ever-blurring line between what could be, which is literally fiction, because it doesn't exist, it's a promise or a potential or a vision, and then nonfiction, which is the roots of science. So, uh, I read both pretty voraciously, fiction and nonfiction. I used to only read nonfiction and biographies.

Fiction, uh, thanks to my wife, actually... I realized that there are very dense psychological profiles inside of prose, where you can read a passage and, oh my God, it's like reading a 200-page psychology book. Rachel Cusk in particular is this snarky British author I [01:42:00] love, who in a single paragraph of prose can detail all the facial leakage and subtleties of, like, this interaction right now.

Um, so, so I love that. And, uh, yes, I guess there is something to reconcile between my appreciation of the person that can tell a good story and the desire for that story to be true.

Logan: One of the things in, in talking to Peter, uh, in preparation for this, I know we've talked about him as the optimist and you as the, as the cynic,

Josh: Skeptic, Skeptic,

Logan: skeptic. Uh, I don't know if he used cynic or if that was my interpretation of it.

Josh: Variations of the theme.

Logan: Yeah, variations of the theme. Uh, I, I'm curious... your cynicism isn't, with regard to, or skepticism isn't necessarily with regard to, technology. Hearing you talk about the promise and potential of technology, uh, is, um, intoxicating in some ways, and it makes me think about my little B2B world of investing in...

Josh: Stuff like what?

Logan: Could be, or what?

You know, and am I keeping my box way too narrow? But, but there's this skepticism of people, that seemingly you're constantly paranoid that you're gonna get screwed. And I think I perpetually live in that fear as well, that someone's out to trick you in some way. I'm curious, um, to bring it all the way back...

Personal Reflections and Skepticism

Logan: Uh, do you think that's a manifestation of, of circumstances, of growing up in Coney Island? And you talked about being, uh, in a four-person household with your grandmother and grandfather and, and, uh, mother as well. And I know you've referenced being jumped three times as a, as a kid. Like, do you think it's, do you think it's something rooted in childhood, if we were to lay you flat on a...

Josh: ...couch. Yes. Yes. Um, it's a protective mechanism. I mean, I think about, to your point... like, I'm always optimistic and bullish about science and technology and the human condition that builds upon itself, because technology doesn't care, science doesn't care. It doesn't care what people think.

There will be people that try to lie or manipulate or do fraudulent science, or people that lie and try to [01:44:00] do manipulative and fraudulent technology, but it compounds, and the truths that are embedded in a computer chip or a line of code or a molecule that can cure cystic fibrosis, those things are real, and it doesn't matter whether people believe in them or not.

They just exist, and they compound, and they build upon prior knowledge. And that is a beautiful thing. And I'm very optimistic, I would almost say faithful, but I don't really have faith in many things... optimistic that that will continue to persist, at various times with accelerating progress. And sometimes it slows down and it faces hiccups of availability of capital or talent or wars or whatever, but the long arc will be continued progress.

And Lux itself, Latin for light, is this idea that you shine light on something, and on the circumference is what's not known. And that just keeps expanding; the circumference of what's not known keeps expanding. The more we know, the more we realize, oh my God, there's more stuff on the edge that we don't know. And that's never-ending.

So that pursuit to me is virtuous and positive. And the flip side of that is that the one thing that's mostly unchanging is human nature. And you go back to the Pleistocene, or to Shakespearean times, or to the French Revolution, or to the 1950s, or to three weeks ago, and three years from now, and 30 years from now, and probably 300 years, unless we become transhumanist or whatever.

And we still have those same evolutionary impulses: the same quest for vanity and status, competition for mates, uh, trust and distrust, the ability to influence and prevent yourself from being influenced, uh, reciprocity, greed, fear. All these emotions are timeless. They manifest in different circumstances with different groups of people.

Uh, and that is, I think, what generally makes me skeptical or cynical: which is that, while technologies and businesses and markets and countries, all these things, can change, human nature is mostly unchanging and a constant. And so understanding it, through fiction, through psychology, through literature, uh, I think gives you an advantage.

[01:46:00] In my own childhood: I was born in Fullerton, California. Parents were both from Brooklyn. Mom and Dad split when I was two and a half. Dad was not faithful. Um, I was in part responsible for discovering it, because my mother was like, where were you today? I was out with Daddy and his friend. And she was like, what friend?

And, you know, that sort of... Mom and I moved back with my grandmother in Coney Island. Um, two bedroom, one bath. My grandfather was a Daily News newspaper delivery man. He was my step-grandpa. My first grandpa, my biological grandpa, who has since passed, was a POW. It's crazy; he did a lot of drugs, beat my mother, my uncle, um, beat my grandmother. She left him, and she married this amazing guy, Stanley.

Stanley delivered the Daily News while we slept. So he was a night, uh, midnight, you know, driver, delivering literally the physical papers. My mom was a public school teacher, and my grandmother was a retired meter maid, so a super blue-collar family. They surrounded me with love. Um, they used to tell people my father had passed, because people were more attracted to a widow than a divorcee.

Um, and I didn't have a relationship with my father until I was about eight. So partially, I think, my mother, my grandmother kept him at a distance, and partially, I think, his own, um, psychology, frustration, animosity, whatever it was, he just wasn't as present. Uh, when I was eight, my mother went after him for child support, because he wasn't paying.

And he was living in California; I was living in Brooklyn. And she didn't tell me this, because she wanted to protect me. And it was something that, to this day... I promised my kids, and that word promise is actually a powerful one: I don't lie to them. Even about Santa Claus and these kinds of things. I create conflict with my wife about these kinds of things.

I may not answer their question, but I won't lie to them. And my mom lied to me, and it was a benevolent lie. And the benevolent lie was... um, a week, or three days or so, before two detectives showed up at my door. It was, uh, August 28th, 1986. I was eight years old. And, um, they were taking me to go live with my father.

And so that was like the [01:48:00] floor was pulled out from under me. And I remember very vividly walking down the hallway of... we had eight apartments on a floor, 23 in our building. I remember the green linoleum. I just, I remember being taken by the strangers. Went, met my father, went on a train, uh, for a variety of reasons, instead of flying, for three days, with a stranger.

And we ended up in a place called Leona Valley. Uh, it was a ranch town in California. He was chasing a dream to, like, live on a ranch. He had been married a few times. This was, I think, his third wife. She had a stepdaughter. It was just a weird time. So you take a kid from Coney Island, Brooklyn, and you put him in cowboy boots in this ranch place in California, 45 minutes northeast of L.A., which is sort of like meth land.

Um, and it was pretty traumatic. I was 8, 9 years old, peeing in the bed again. I was like, you know, um, lost a ton of weight. You know, was malnourished, didn't have somebody there like my mom to coddle me. And, uh, and I was actually not allowed to talk to my mom, uh, more than once a week, court-appointed.

Uh, my mom insisted that I go to Hebrew school. And I'm atheist, but I'm Jewish. And, uh, the atheism came later. But I went to this Hebrew school, and the Hebrew school teacher heard my story and said, when you come to Hebrew school, you're just gonna talk to your mom. So instead of talking to her once a week, I got to talk to her two or three times a week.

And she and my mom became best friends. And um, she would later lose her son who was like one of my two friends in that region. But that whole dynamic for two and a half years, from eight to ten, living in this place, the town was a thousand people. Two grades per classroom. I was by far the most intelligent person having come from a gifted and talented program in Brooklyn.

With people that were, like, just sort of white trash. And there were a few friends that stood out and protected me, but it was a very bizarre time. And I came back, and that, I think, was a very impactful worldview change, which was like, life can just suddenly, you know, change. And my mom didn't do it, you know, to hurt me; she wanted to protect me. But the rapidity with which that change happened was like a shock to the system. So that was one thing. When I [01:50:00] came back, Coney Island was a mix of Eastern European Jews, Black and Latino, lots of immigrants. But there was a lot of crime, a lot of violence, and, um, uh, you know, I grew up on hip hop and heavy metal, and you didn't make eye contact with people, you know, you sort of, like, skewed your eyes. Um, got mugged three times, um, and you just sort of developed a healthy, squinty-eyed skepticism of people, always assuming that somebody's got an agenda, and they're gonna rob you, steal from you, you know, murder you. We had a Honda Accord.

It got stolen, I don't know, four times, five times. Every time I would go to the library at Grand Army Plaza, we'd come out, there'd be glass on the thing. My mother'd be like, ah, shit, it happened again. Um, so I just generally distrusted people, but it wasn't personal. Like, people weren't doing this because they had it out for me.

It's just like life circumstances made people in some cases act in desperation. And so making sense of that world and understanding that you were vulnerable, that the world wasn't a safe place, it was like the antithesis of people that today are coddled and, you know, need safe spaces and like, don't offend me.

It's like, you know, the world I grew up in, people just talked shit to each other. You were always on, you know, defense. When I went to Cornell as a freshman and somebody was like, dude, nice haircut, I did not understand sincerity, because growing up, me and my boys, like, we all just talked shit to each other.

You're sarcastic to each other. You're mean to each other. But, like, we were all sort of tough with that. And then somebody's like, nice haircut. I'm like, what are you trying to say? I didn't understand that they were actually, like, you know, complimenting my hair or something. And so that's taken a while. And my wife grew up in a cul de sac in Merrick, Long Island, much more happy, pro-social.

You know, it's just a, and Pete, sort of the same thing, like, Stamford, Connecticut, positive, happy family, um, so it's a, it's a nice yin and yang.

Logan: Yeah. Uh, at some point I want to go into the partnership dynamics, and, uh, and I know you, Pete, and I have been talking for a while about doing something on Lux and you guys' relationship and all that.

So maybe the next time, uh, we can, we can go into all that.

Partnership Dynamics and Trust

Josh: He, he is the one person, I will say, just to his credit, truly, um, if somebody's like, take that tweet down, or, you know, I just, I trust his judgment implicitly. Because [01:52:00] he, he's not doing it out of self interest, he really has just such a great moral honor code,

Logan: kind individual,

Josh: always wanting to do the right thing, um, and I really value his moral compass about what is the right thing to do.

I have more of an impetuous side to me where I'm like, you know, like, actually this just happened. Um, we're in a tense negotiation with a particular company. There's a lot of money at stake. My instinct is to go hostile on some of the other syndicate members, you know, um, and, and he's much more long term of like, what are we solving for?

You know, you don't want to do that, you know. And, um, and that's just a very helpful dynamic, to have that voice, and the fact that I trust him, because we've had so much repetition that he's usually right. But he's one of the few people where, somebody gives me advice, most of the time I ignore it, and it's like, him and Lauren are the two people where I'm like, okay.

Logan: Sounds like they have a lot of similarities too.

So, yeah.

Josh: and they get along great.

Logan: For, for a guy in Stamford, Connecticut, he doesn't make it to the East Coast, uh, often enough for us to, uh, try to do this in person. So I was talking to him yesterday, I was like, we might just need to do virtual at some

Josh: We'll do it. Yeah,

Logan: I'll be curious how you guys have been able to make this work long distance.

Josh: We speak ten times a day.

Logan: Is that right?

Josh: Yeah, I mean, literally just, like, you know, text, call, non-stop, and, like, you know, something happens, and we're constantly communicating.

We pick up on all the subtle nuances inside of a partnership. You know, he might be like, I don't know if you noticed, but, like, so-and-so was really grimacing at that thing, and, like, we'll try to surface those things, encourage people to talk to each other, abate conflicts. Um, constantly sharing intel and information, just got back from this dinner, blah, blah, blah, you know. And, and so, yeah, it's just like a brotherhood trust, and we've known each other 28 years or something like that and been partners 24 years. It's crazy.

Logan: That's the next one we'll do. Yeah. All right. Thanks, Josh. This is great.

Josh: Awesome to be with you, man.

Conclusion and Final Thoughts

Logan: Thank you for joining this episode of the Logan Bartlett Show with co-founder and managing partner of Lux Capital, Josh Wolfe. If you enjoyed this discussion, we'd appreciate it if you subscribed on whatever podcast platform you're [01:54:00] listening to us on, as well as shared it with anyone else that you think might find it interesting.

We look forward to seeing you back here next week on another episode of the Logan Bartlett Show. Have a great weekend, everyone.