Vibe Rounds, Tech Ecosystems, Foreign Money in US Venture Funds
Welcome to Technology Brothers, the most profitable podcast in the world. You read The Wall Street Journal today?
Speaker 2:Of course.
Speaker 1:Yeah. OpenAI.
Speaker 2:I need my copy. Are they covering the departures?
Speaker 1:Yeah. It's interesting, because OpenAI... I mean, the chief technology officer quit yesterday. There are, like, rumors about a fundraising round. A bunch of executives left. So it's, like, kind of a chaotic moment.
Speaker 2:I saw somebody's take that I think was totally on point, which is the way OpenAI's employment contracts are set up: if you talk poorly about OpenAI after you leave, they're just like, okay...
Speaker 1:They claw your equity back.
Speaker 2:So if you wanted to make a statement about how fucked up things were while still holding on to your equity, you'd leave right in the middle of the fundraise. Right? What sends a bigger message? Like, let's say we're
Speaker 1:we're running it differently. But that's not the same as, like, actually violating that clause.
Speaker 2:No. That's what I'm saying. If you wanted to make a very large, public, dramatic statement about something being wrong at OpenAI without saying it...
Speaker 1:Oh, but you couldn't say that because
Speaker 2:You would leave in the middle of a fundraise.
Speaker 1:Right. But if you want the equity, why would you wanna hurt the company?
Speaker 2:I mean, you know, you could argue that somebody could say, I want no part of this, but you still wanna kind of keep the money.
Speaker 1:Yeah. I hate that. That pisses me off so much. That's like that fucking guy from Meta, from Facebook, who made this huge bag and then was on Netflix. Tristan, you know this guy?
Speaker 1:No. He was in this documentary called The Social Dilemma. He's like, oh yeah, it's so bad what Instagram has done.
Speaker 2:Right.
Speaker 1:This generation is so bad. It's like
Speaker 2:And he's just sitting there for montage. Yeah. Exactly.
Speaker 1:He just needs so much money. Yeah. And it's just like... I actually love that clause. I think that clause is interesting, because if you are seriously
Speaker 2:Are we an excel character right now?
Speaker 1:No. No. No. This is what I actually believe. I love that clause, because if you were actually terrified of AGI and you were like, holy shit.
Speaker 1:This stuff is gonna kill everyone. What the fuck is your equity worth? Zero. Your equity's worth nothing. So you're gonna just be like, yeah.
Speaker 1:Go for it.
Speaker 2:No. I think it's much more of a statement on Sam's character to leave during the fundraise, which is
Speaker 1:I don't know. I mean, I hear what you're saying, but it's just weird, because it's this half measure. Like, either you'd wanna just be like, I want the value of this stock to be as high as possible. Yeah. Therefore, let's find the chillest place for me to get out, because I am tired.
Speaker 1:Right?
Speaker 2:Yeah.
Speaker 1:So I do wanna leave, but I don't wanna fuck over the company at all. So let's figure out a way to get out. And I thought it was interesting that all this stuff broke the same day that Zuck had the big Meta announcement. You think it was timed? Yeah.
Speaker 1:I mean, it's like, you know, bury your news. Try and bury your news on a day when something else bigger is happening. Right? And OpenAI historically has been really, really good about releasing information, like front-running, putting their AI announcement the day before Google's.
Speaker 1:Right? Yeah. So they clearly know when the other things are happening.
Speaker 2:Has Lucy ever been approached by OpenAI? I mean, maybe she can't say that. Does she work with them?
Speaker 1:Which Lucy?
Speaker 2:Sorry.
Speaker 1:My company? No. No. No.
Speaker 2:No. They've definitely worked
Speaker 1:with Lucy. Oh, of course. Of course. They do massive shipments.
Speaker 2:Lulu. Lulu. Not Lucy.
Speaker 1:Lulu. I mean, I think Lulu is working with SSI. Ilya's company.
Speaker 2:There you go.
Speaker 1:Because she's on, like, the Nat Friedman, Daniel Gross crew. Right. Although she, you know, has a bunch of different companies and stuff. But when I initially saw that all this was breaking right then, I was like, oh, okay.
Speaker 1:Maybe they're trying to kind of bury this news. I mean, obviously, it didn't really work, because it's on the front page of the business section.
Speaker 2:Which every self respecting person reads daily.
Speaker 1:Well, hey, I mean, you know, Meta Platforms announced deals with celebrity actors to use their voices in a new artificial intelligence assistant. That's B4. And the OpenAI stuff is B1. So it did break through. That's the way I thought they were framing it initially.
Speaker 1:It was like, let's try and get this news out the same day that something else big is happening in tech. We know that this Facebook thing's happening. But then again, there's one narrative where the whole company is falling apart and everyone's leaving. But there's another narrative, which is, like, great man theory of history. Maybe you just need one person.
Speaker 1:Executive teams turn over all the time. They often turn over in the midst of financing rounds, because it's like, okay, we're actually changing what the business is gonna be doing. We're going a different direction. So, yeah, we are wiping out a bunch of the staff.
Speaker 1:I don't know. I'm not ready to say, like, it's so over.
Speaker 2:No. It's...
Speaker 1:I'm not sure we're seeing that.
Speaker 2:The thing that makes it the most interesting is how opaque it all is. Like, nobody, even seemingly insiders, actually knows what the fuck's going on.
Speaker 1:Yeah. Like And
Speaker 2:to have that many billions of dollars in a company and nobody can figure out what's going on
Speaker 1:Yeah.
Speaker 2:And there's this broader public backlash. Like, a year ago, it felt like Sam was on top of the world.
Speaker 1:Oh, yeah.
Speaker 2:Right? Like, everybody was in his comments worshiping him on every single post. Like, you know, this sort of tech demigod. And now it feels like that has
Speaker 1:Yeah. Certainly, like, the aesthetics...
Speaker 2:Have snapped back. Like, at least the AI community on X is deeply against him at this point.
Speaker 1:Yeah. And and a lot of it has to do with, like, the open sourcing Right. And commitment to the developer community and a lot of different stuff, but
Speaker 2:Yeah. And it's funny, one of the first critiques of Sam was how he would give those interviews where it genuinely felt like he was just lying, but with a very straight face. And he's like, I don't own equity in OpenAI.
Speaker 1:Like Yeah. Yeah. Yeah.
Speaker 2:And it's like, who knows what's actually true? Yeah. I actually was curious, and I tried to do some digging, and people were like, yeah, from what I can tell, it's legit.
Speaker 1:Yeah, no, it was legit, but there was always the soft power element.
Speaker 2:The soft power element. And then now, in the for-profit shift, there's this discussion of getting him what will effectively be, like, $10 billion of equity in the for-profit. And so to me, if you actually think about his sort of game theory, it's like: let me come in, run the company, get so much control and power and influence, and not have equity while it's still this nonprofit structure, and then flip it into a for-profit structure and get equity then. Right? It's almost like, well, why is there even a discussion of you getting 7% of the company at a $150 billion valuation if you never wanted it and it was never a motivation in the first place?
Speaker 1:Yeah.
Speaker 2:But if Sam had come in and taken a huge stake in the company in exchange for taking on the CEO role, while cucking Elon out, then there would have been, like, a lot of backlash. There would have been probably even more. Now he can make the argument where it's like, well, the board just really wants me to be properly incentivized now that we're a for-profit structure.
Speaker 2:It's only right. And now the next time he's up in front of the Senate getting grilled, they're gonna be like, so what changed? I thought you were just doing this for humanity, and now all of a sudden you're, you know...
Speaker 1:Yeah. It's interesting. I always wonder, like, how seriously people believe superintelligence is right around the corner, and the revealed preference.
Speaker 2:That it's a few thousand days away.
Speaker 1:It's a few thousand days away now. It used to be a few quarters away. It's kind of shifted so much. And, like, that informs your decisions and your actions so much. And there's a lot of revealed preferences versus stated preferences.
Speaker 1:A lot of the doomers are, like, you know, oh, 99% chance that AGI is gonna, like, kill us all, but, like, yeah, I'm still, like, investing in, like, bonds
Speaker 2:because I want my 401k. Exactly.
Speaker 1:And it's like, okay. Well, you kinda don't. It doesn't seem like you believe that. It doesn't seem like you're really taking actions that are consistent with your stated beliefs. Like, maybe you're just trying to get attention.
Speaker 1:But I do think, even internally at these research organizations, they might have shifted their views. Like, it's totally possible. I was talking to somebody about Jensen Huang and how NVIDIA has done a lot to kinda get around the chip bans, and whether or not that was, like, un-American. Yeah.
Speaker 1:Because, like, American companies will buy 100% of NVIDIA's supply right
Speaker 2:now. Yep.
Speaker 1:So the fact that they are taking any TSMC capacity for it...
Speaker 2:What they've been saying for a long time, right, is we're still deeply supply constrained and will be for the next...
Speaker 1:Everyone is. Like, yeah, NVIDIA could very clearly just make a statement saying, we're only gonna sell to American companies, and I'm gonna call up Oracle and Elon and Zuck and Google and all these people and just say, I have 100% of my supply, you guys better take it all. And they would.
Speaker 2:Yeah.
Speaker 1:And so it's not an economic consideration. It's more this long-term diversification. You wanna be more of an international company. And so if you really, really believe that ASI, AGI is around the corner, and it's gonna be this, you know, bioweapon-level, super critical geopolitical thing, then putting it in the hands of the CCP is very dangerous. Yeah.
Speaker 1:But if you deep down actually just believe that it's just, like, spell check. Like, it's just a useful tool. Then, as just a libertarian normal person, it's like, yeah, why would you respect the spirit of the law?
Speaker 1:You're only gonna follow the letter of the law. And the chip ban says it can't have this much memory bandwidth, so you'll go one point under that. Yeah. And that sounds treasonous if you believe that AGI God is coming, but it doesn't sound treasonous if you're just selling spell check. Right?
Speaker 2:Yeah. Yeah.
Speaker 1:And so it was a very interesting discussion, and I think that's happening at a lot of these labs. I think a lot of the labs are starting to change their view from, like, okay, we thought we were just gonna nail the algorithm and get God, to something more like...
Speaker 2:Country level white collar work. Exactly.
Speaker 1:And not just that. It's, like, also, God might be decades away. Right? Yeah. Even a few thousand days, that's a full decade, and you gotta do a lot in between, and also just the scaling laws and all of the different... Yeah.
Speaker 2:I think we talked about this, but a few thousand days away sounds way better than ten years.
Speaker 1:Than ten years, which is exactly what it is. If you just assume a few is three, and a thousand, you just do the math, and it's like, okay, that's a decade. And we've been hearing about self-driving cars being a decade away, and then, you know, it took a little bit more than that. It took, like, two decades.
Speaker 2:It's still not even here.
Speaker 1:And then the actual build-out and rollout of these things takes forever.
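For reference, the arithmetic behind "a few thousand days," taking "a few" as three as the speaker does, is a quick sanity check:

```python
# "A few thousand days": assume "a few" means 3, as stated above.
days = 3 * 1000
years = days / 365.25  # average calendar year length
print(f"{years:.1f} years")  # prints "8.2 years", i.e. roughly a decade
```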
Speaker 2:Yeah. I was disappointed to see, when you shared this earlier, that TSMC is slandering podcast bros.
Speaker 1:Oh, yeah. They
Speaker 2:They called Altman a podcast bro, as if it was some type of attack. Yeah. I had to reread it a few times to try to get their meaning.
Speaker 1:I was confused, because that's kind of the highest rung in the tech food chain.
Speaker 2:Podcast bros first, then, you know, venture capitalists after that.
Speaker 1:Yeah. Then founders.
Speaker 2:Yeah. Right. But yeah. It's sort of interesting. You would think that TSMC knows what they're doing. Right?
Speaker 1:Yeah.
Speaker 2:Do you think that they're seeing through all the narratives and just being, like, bullshit? Because that's kinda what it sounds like.
Speaker 1:Yeah. In that one interview clip where they kinda talked about... I have fours too, if you would like.
Speaker 2:Thank you, sir.
Speaker 1:Yeah. In that one interview clip, they're talking about, like, Sam coming in and saying, we need 35 more plants. And if you go back to the history of Morris Chang and TSMC, just getting that one plant up was immense risk, and required massive government intervention to make happen.
Speaker 2:I hadn't realized, but I guess the Arizona plant is actually producing product.
Speaker 1:I didn't know that, but, I mean, it makes sense. The question is just, what process is it on? How small are the chips? Because, I mean, America does have plenty of chip-making capacity for the much larger stuff, the stuff that goes in, like, a toaster and washing machine. Like, GlobalFoundries and a few other companies have, like...
Speaker 2:Economist toasters are gonna be
Speaker 1:Yeah. But it's when you get down to the really important stuff, like iPhone chips and what's in the GPUs, that it gets really, really difficult. Yeah.
Speaker 1:But yeah. I don't know. Speaking of AI, we can do this later, but there's been a lot of talk about the AI that Elon is applying to the X algorithm, and whether or not it's good for the world or good for the user experience.
Speaker 2:And the question is, is it actually AI, or is it 300 people in a room just looking at videos and hitting a button that says degenerate or, like, good for society, and they're just going like
Speaker 1:this. Yeah.
Speaker 2:And then they're just, like, feeding those videos. Right? Because you remember early TikTok days. TikTok, the algorithm was getting good, and I remember going on it.
Speaker 2:I don't think I've ever posted anything there, but I remember going on it and being like, wow, I'm kinda shocked that they would feed me that video with how little I've used it. Like, it was so on
Speaker 1:the nose. You pause a little bit on, like, what, there's a car in the background? And the next video is a car video, then the next video is the deepest dive on a car you've ever seen.
Speaker 2:Yeah. Yeah.
Speaker 1:Yeah. It knows I'm a car guy.
Speaker 2:Yeah. Yeah.
Speaker 1:Just like that.
Speaker 2:And I remember at the time there was footage coming out of these, effectively... algorithm farms is how I would describe them. Right? Where there's just people in rooms manually sorting videos. I mean, it's
Speaker 1:a little bit of that. I mean, mostly it's just, like, how much time is spent on this particular thing, how much time you're hovering over it.
Speaker 2:But yeah, of course. But it's funny to imagine Elon. His dopamine rush is engagement. Right? Like, he's all about the metrics. He wants to put points on the board. Yeah.
Speaker 2:And so you can see him just dialing up the degeneracy to the point where he walks into the room and they're like, sir, we can't go anymore. We were already showing the people fighting in the streets. And he's like, more. And it's showing, like, racial crimes. Yeah.
Speaker 2:It's like, he's just like, more, and they're like, eleven, put it to eleven.
Speaker 1:Yeah. Put it to eleven. I mean, yeah, it's unbridled, but I think that's, like, the only thing that will really turn the business around.
Speaker 1:Like, the... I think this...
Speaker 2:You're talking about that.
Speaker 1:...garden that we had in, like, tech Twitter. Like, truly, the tech community was subsidizing it.
Speaker 2:Yeah. Yeah. Yeah.
Speaker 1:Because, like, Twitter was never particularly profitable. The business never really worked that well, but it was definitely... Yeah.
Speaker 2:And what's funny is how much of the market cap was oriented around how much people loved the product. Like, it was something, when I discovered Twitter.
Speaker 2:I credit Twitter with massively accelerating my professional development, from the lens of: I was able, as a 21-year-old in school, you know, I was at UC Santa Barbara, it's not a tech school, I didn't know we had a technology management program until I was in it, and everything we were studying in school was, like, ten years behind. Right? So I could go on Twitter, and the biggest thing for me was picking up the vocabulary of the industry, so that when I would go meet somebody, I knew what they were talking about, even if I'd never done the thing.
Speaker 2:Yep. And so to me, I credit Twitter with accelerating my career, like, five years, something like that, because I felt like I was a part of it, even though I was just kind of soaking it all in. I wasn't even a part of the conversations. And so I think a lot of their market cap was just the fact that the technology industry, and many other industries, just loved the product. It's like, you don't wanna short Twitter.
Speaker 1:It's kinda like it was option value on the value of the network, because it had world leaders there, the Pope, but then also every journalist posting, every news story breaking there. Everyone felt like there was gonna be some sort of value to come out of
Speaker 2:that. Yeah. And the big critique was like you guys don't ship anything.
Speaker 1:Yeah. Yeah.
Speaker 2:Everything's broken. Nothing works. Like, the DMs suck. Yep. And Elon's like, okay, you guys want change?
Speaker 2:Like, I'll give you some change.
Speaker 1:Yeah.
Speaker 2:And then we're like, oh, actually... Like, I still type in twitter.com on my browser, and every single time I get this pang of, like, sadness. Oh, my goodness. Yeah.
Speaker 1:But, yeah, I mean, I think there's something interesting where he clearly bought it because it was, like, the festering hive of scum and villainy, in the sense that every cancellation campaign started on Twitter.
Speaker 2:Yeah.
Speaker 1:It would usually... like, the anatomy of a cancellation campaign would usually be: someone figures out that they don't like this tech CEO or whatever.
Speaker 2:Yeah.
Speaker 1:Then they go back. They dig up some tweet where they're like, capitalism is good. And then some tiny, like, communist account quote-tweets it like, actually, capitalism is bad. And then gets, like, a thousand retweets.
Speaker 2:Yeah.
Speaker 1:And then all of a sudden there's more people piling on, more people piling on. A real back-and-forth fight. And then the low-tier journalists can start reporting on the backlash on Twitter. And they can start saying, oh, well, how are they gonna respond? There's been this uproar on Twitter. This person needs to talk about this.
Speaker 1:And then once the low-tier outlets are reporting, then the New York Times and the Wall Street Journal and the Washington Post can report on the crisis that's happening. Yeah. And Elon just completely killed that by kicking a lot of people off, deamplifying things, community notes. Like, there are so many different changes that he made. Just chaos.
Speaker 2:Yeah.
Speaker 1:He got rid of the blue checks, so it wasn't this moment where it's like, oh, well, a blue check person amplified this hit piece on my company or whatever.
Speaker 2:That was particularly devastating for me, because I got my blue check and had it for, like, three months.
Speaker 1:Me too. I had the same thing. I had the same thing.
Speaker 2:I was like really?
Speaker 1:And then you just gave it away?
Speaker 2:Yeah. I gave it away. Damn it. Yeah. I don't know how virtuous the decision was, as much as...
Speaker 2:I believe there's a factor there. Right? I'm not gonna psychoanalyze Musk. Like, the logical decision to buy X is, one, there was clearly always a lot of potential. Right?
Speaker 2:And if you're worth hundreds of billions of dollars, if you wanna make another $100 billion, you have to take these colossal bets. Right? Yeah. Buying Twitter and trying to make it half as big as Meta is a pretty good way to try to do that. Right?
Speaker 2:Who knows if he'll be successful? So there's the purely financial side: if you overpay for something but then make it ten times more valuable, you didn't overpay. You paid a fair price. Right?
Speaker 2:It was just risky. The other factor is it being such a critical comms channel for all of what Elon's doing, specifically Tesla. Right? There were years where people were like, Tesla doesn't spend money on marketing. Tesla doesn't do this.
Speaker 2:And it's like, okay, their marketing channel was Elon spreading the gospel, being Elon, and there was a lot of value in making sure that that always stayed the same.
Speaker 1:Yep. Right? Yeah. I think that makes sense. I'm a little more pilled on the psychoanalysis side, of, like: if there was gonna be a communist revolution, it would have started on pre-Musk Twitter.
Speaker 2:That's right.
Speaker 1:And he would be the first one to go. Yeah. And so there's a little bit of self-preservation there.
Speaker 2:Yeah. Yeah. No. And that's that's the thing is that the real answer is probably not that he did it for one specific thing.
Speaker 1:Yeah. Probably not.
Speaker 2:He's, like, one of the smartest business minds, probably the smartest business mind, of the hundred-ish years that we're on this earth. Yeah. And he made a very calculated decision across a bunch of different
Speaker 1:Different parameters. Parameters.
Speaker 2:And he's fine if people think it's because of one of those reasons, but really, like, you usually don't make a decision just for a single reason. Yeah. You know?
Speaker 1:Yeah. And then I do think he's been successful in kind of destroying the cancel culture boiling pot. Like, it really served as kindling. Totally. But everyone was hoping that...
Speaker 2:And it gave other VCs permission to...
Speaker 1:So, yeah. He kind of canceled dystopia, but I think everyone now is like, we're waiting for utopia. Yeah. And it's like, maybe we're not gonna get that. Maybe the answer was just, like, yeah.
Speaker 2:Maybe the town square is always gonna be chaos. Exactly.
Speaker 1:Exactly. It's gonna be a lot more chaotic, and maybe that means everyone uses it a little bit less, or at least the high-signal people do. Like, the people that are having a really bad time right now are not the normies on Twitter. The normies on Twitter are like, oh, I'm getting even more entertained.
Speaker 2:The meme with the guy hitting the bong while these two people are just...
Speaker 1:Exactly.
Speaker 2:Yeah. Yeah. Yeah. Getting high off the algo and just
Speaker 1:There's a lot of people that are in that boat, and the people that are upset are this very small niche, like, tech Twitter community that was very interested in sharing links and meeting new people and amplifying small accounts and all that stuff.
Speaker 2:The Party Round go-to-market strategy probably still would have worked, but been much less effective. There was a time where, if you understood the algo and you had the right people around, you could dominate the timeline for an entire day. Genuinely dominate it. Yeah.
Speaker 2:Where, like, you could make sure that when people opened the app, they would see your stuff three, four times in a single session. And I miss those days. That's why we're creating our own channel here. Yeah.
Speaker 1:Yeah. I think there will continue to be hacks. Totally.
Speaker 1:Especially in
Speaker 2:Like an excel...
Speaker 1:work. Chaos is a ladder.
Speaker 2:Chaos is a ladder. Right?
Speaker 1:And I see a lot of chaos, and I see a lot of opportunity. But, yeah, there will be a lot of sludge. Like, the thread boys existed beforehand, and they've clearly been amplified in the new era, especially with the rev-share stuff. But also, I think what's really thriving is the culture war politics. It's also because we're going into an election, so it's even hotter than ever.
Speaker 1:But that type of stuff, I think there's a lot of people that don't want to see that stuff at all. And so, going back to the AI thing, what I would love is... I don't want to go in there, and every time there's a new trend, like, there's this hippo, Moo Deng. Have you seen this thing?
Speaker 2:Yeah. Yeah.
Speaker 1:I have no desire to see any of that, and I could put in, like, a blocked keyword. But what if somebody just shares a meme with a picture, being like, this is cute? Well, then I'm seeing it anyway. Yeah.
Speaker 2:And then you're kind of not in the meme sort of network, but then Daniel shares, like...
Speaker 1:And then you...
Speaker 2:Moo Deng has left OpenAI.
Speaker 1:Yeah. Exactly. It's like, oh, okay.
Speaker 2:And then now it's back in the timeline.
Speaker 1:A bit. But I basically just wanna talk to the algorithm, and I think that would be the cool thing.
Speaker 2:The cool thing would be, and who knows if this will happen, but if Elon could somehow... I invested in that company Farcaster, which is, like, a decentralized social network, and their whole vision was like, hey, you build one network, and then other people can build specific networks effectively on top of the network. Right? So if we could take the Twitter social graph and all the mechanics and build effectively our own service for that, you could bring back OG tech Twitter to some degree. Right?
Speaker 2:Yeah. It probably wouldn't be that hard... I mean, it would be hard, but the social graph is still there, all the users are there. They're just not being coordinated in the same way.
Speaker 1:Yeah. There's this theory of, like, the algorithm store, where you should be able to pick your algorithm or whatever. But I actually would prefer to just basically talk to the LLM, like Grok, and tell it, hey, cool it with the
Speaker 2:Yeah. That's why I don't
Speaker 1:the political, social, culture war stuff. Like, I wanna see less of that. I know that it's grabbing my attention, and so you're getting this signal that it does retain, but I want you to push back on that in a way that basically...
Speaker 2:I found the not interested.
Speaker 1:Does that work?
Speaker 2:It doesn't work for me.
Speaker 1:Doesn't work.
Speaker 2:Like I've experimented with it where every single time I've seen something about Diddy, I just click and say not interested
Speaker 1:Yeah. Yeah.
Speaker 2:Post and the next time I open the app You
Speaker 1:still see it.
Speaker 2:There's something else. Yeah.
Speaker 1:Yeah.
Speaker 2:Yeah. So you the only thing you can do is is is is effectively mute that word, but then what
Speaker 1:if somebody posted a picture of baby oil? Exactly. You get it. Yeah.
Speaker 2:It needs to actually... and that's the issue with Grok, is
Speaker 1:Yeah.
Speaker 2:It's a completely...
Speaker 1:It's completely separate. Like, what I wanna do is, every morning, have something scrape every tweet that would be in my For You page and my Following page, scrape all those tweets out, and then run those through an LLM that says: would John like this, based on what we know about him? And I've told it a lot about what I like and expressed those opinions. You can sometimes do this when you initially onboard; they ask you, like, are you interested in sports or technology?
Speaker 1:And then they tell you, you should follow Cristiano Ronaldo or whatever. Yeah. But that's not enough. I wanna have a much richer conversation, and then have the LLM kind of score each one based on my individual expression of that.
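The scrape-then-score loop described here could be sketched roughly like this. Everything below is hypothetical: the function names, the preference weights, and the keyword matcher standing in for the LLM call are illustrative stand-ins, not any real X or Grok API.

```python
# Hypothetical sketch of "talk to the algorithm": score each scraped tweet
# against a user's stated preferences, then keep only tweets above a threshold.
# A real version would prompt an LLM with the user's preference text; here a
# trivial keyword-weight lookup stands in so the sketch is self-contained.

def score_tweet(tweet: str, preferences: dict) -> float:
    """Sum the weights of preference topics mentioned in the tweet."""
    text = tweet.lower()
    return sum(w for topic, w in preferences.items() if topic in text)

def rank_timeline(tweets: list, preferences: dict, threshold: float = 0.0) -> list:
    """Return tweets scoring above the threshold, best first."""
    ranked = sorted(tweets, key=lambda t: score_tweet(t, preferences), reverse=True)
    return [t for t in ranked if score_tweet(t, preferences) > threshold]

# Stated preferences: boost chips and AI topics, suppress culture-war content.
prefs = {"tsmc": 2.0, "gpu": 1.5, "culture war": -5.0}
timeline = [
    "Another culture war thread you won't believe",
    "TSMC's Arizona fab is now producing product",
    "New GPU benchmarks just dropped",
]
filtered = rank_timeline(timeline, prefs)
print(filtered)  # TSMC tweet first; the culture-war tweet is filtered out
```

The design choice here is that negative weights act like the "cool it with the culture war stuff" instruction: rather than a hard keyword mute, disliked topics just drag a tweet's score below the threshold.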
Speaker 2:I'm surprised that social networks haven't adopted that, because every once in a while...
Speaker 1:It's really expensive.
Speaker 2:Interesting. Because...
Speaker 1:I think actually, right now, all it's doing is, like, you know, this tweet has high retention. When someone sees this tweet, they click on it, they read the comments. It's a good tweet. It's a smash. The average time they spend on the tweet is a minute, or ten seconds.
Speaker 1:Right. If it's one second, then don't show it to anyone. Right? And so they just blast those out, and there's a little bit of fit, where it's like, okay, we know that you like a little bit more of this, more like this. But it's not customizable in any way.
Speaker 2:Yeah.
Speaker 1:And really what I wanna do is just, like, print out all of them and have an AI do that. But until that becomes affordable, the next best thing is I just ask my secretary to print out
Speaker 2:tweets
Speaker 1:for me and I just read it on paper.
Speaker 2:Yeah. Yeah.
Speaker 1:And so I have a bunch of these here that my secretary printed out for us to
Speaker 2:There you go.
Speaker 1:To look through. Did you get a chance to look at any of these? Actually, we were talking about your favorite. You would have missed this tweet if you had your Right.
Speaker 2:Right button
Speaker 1:on because it's about Diddy.
Speaker 2:Yeah. Thank God I didn't mute it. Yeah. I think this tweet
Speaker 1:We'll read it out first.
Speaker 2:Put Diddy, SBF, and Eric Adams in the same cell and make a16z invest in whatever company they come up with. This hurts because it feels like there's a reality where, like, one of these people would raise from Andreessen. Absolutely. And that well, no. It's more like looking at, like, the Adam Neumann thing.
Speaker 2:Like, Adam Neumann, you know, is probably a generational entrepreneur and, like, deserves to come back and raise a bunch of money, because even though there were some issues with how WeWork was run Yeah. Like, that guy just willed Yeah. Something into existence. And I never understood the Adam Neumann critique in general, because I'm like, alright, he took a bunch of ugly office buildings and used, you know, Masayoshi Son's money to make them beautiful and cheap Yeah. And, like, available to everybody. Like, how could you hate this guy?
Speaker 1:Yeah.
Speaker 2:Right? These 3, I think, are
Speaker 1:Yeah. Yeah.
Speaker 2:Yeah. Harder to harder to defend.
Speaker 1:I mean, like, the Diddy story has become, like, he's a murderer now.
Speaker 2:He's a murderer,
Speaker 1:rapist. It's like insane.
Speaker 2:No. He's
Speaker 1:I I actually haven't looked into the Eric Adams stuff.
Speaker 2:I haven't looked at it enough either.
Speaker 1:What's funny is that, like, it's not like a16z invested in FTX. Like, they
Speaker 2:They specifically didn't, and it was because there were so many obvious conflicts. Right? Like anybody that would look at, hey, the most prolific crypto hedge fund
Speaker 1:Yeah.
Speaker 2:Does the most degenerate, crazy shit, that is, you know, basically, like, manipulating all these markets, also wants to own the biggest exchange.
Speaker 1:Yeah.
Speaker 2:Yeah. We should fund the exchange.
Speaker 1:Yeah.
Speaker 2:Yeah. Like what could go wrong?
Speaker 1:You know?
Speaker 2:But, you know, I think they don't get enough credit for that. Right?
Speaker 1:Dan Toomey, do you know who this is?
Speaker 2:No.
Speaker 1:So he works for Morning Brew. He has a YouTube channel.
Speaker 2:Oh, he does some of those videos.
Speaker 1:And he's like, he's much more like a video creator than like an insider tech person. I don't think he's done a startup. I think he kind of comes from like the comedy stand up, like, video world. Now he's done a lot of research and he actually understands things, like, really well. But it's just funny that, like, he frames it as, like, oh, this is something Andreessen would do when, like, I think it's just because, like, they're a big name or something.
Speaker 1:It was kind of interesting, like, like, that's what he picked because, like
Speaker 2:No. I read it as the, oh, you're kind of making a play on Adam Newman, but, like Oh,
Speaker 1:yeah. Okay.
Speaker 2:Well, no. Because that's the most high profile company that Andreessen
Speaker 1:So they went back to
Speaker 2:the founder who had had a fall from grace.
Speaker 1:Yeah.
Speaker 2:Bizarre. No. But Andreessen doesn't get enough credit for not doing FTX. Right?
Speaker 1:Yeah. Because you
Speaker 2:know they were pitched it.
Speaker 1:Of course.
Speaker 2:Over and over and over and over and over and had to have said no. It's not like Yeah. It's not like it was an issue of price. Right? Yeah.
Speaker 2:Because that's not
Speaker 1:the And their funds are so scaled. Right?
Speaker 2:Yeah. I think that was an intentional decision, which is why, from an investing standpoint, you always have to ask yourself, why am I the one that is doing this deal? Right? Like, the market's not perfectly efficient, but it's pretty efficient.
Speaker 2:Right? Yeah. And so, anybody who was backing FTX I mean, meanwhile, Andreessen has their multibillion dollar crypto fund and has a mandate to be in every important company.
Speaker 1:Yeah. Yeah.
Speaker 2:Why did they not do it at any point?
Speaker 1:That's yeah. That's interesting.
Speaker 2:I don't think Founders Fund did it either. Right?
Speaker 1:Yeah. I mean, the team always talks about, like, yeah, there were a bunch of red flags. But there was another thing. I mean, some people passed obviously, like, the best people passed just because they clocked SBF as, like, a fraud. But other people passed because either they were already in Coinbase or, what's that?
Speaker 1:Binance. Binance. So if you're in those, you might be like, I don't wanna do FTX as well.
Speaker 2:People always figure out how to justify it, though. Oh, he's building a finance super app.
Speaker 1:Yeah. It's not an issue. Yeah. It's true. And then, yeah.
Speaker 1:I mean, there's a bunch of stuff there, but,
Speaker 2:I think this reminds me of another story that I don't think is printed out, but is worth covering. It came out yesterday that the FBI is probing this group called Hone Capital Sure. For ties to the CCP. Yeah. And I think it was between 2016 and 2018, AngelList took $80,000,000 Yeah.
Speaker 2:In sort of direct investment from Hone Yeah. Under the condition that Hone could get full visibility into every syndicate happening on the platform Yeah. And presumably all the data Yeah. Associated with those, which, as you know, is decks, financials, you know, basically so that, to me, is, like, the biggest story in tech right now. And there's Yeah. And there's enough people that have a vested interest in AngelList.
Speaker 2:I think it's generally great for the innovation ecosystem and is like a positive platform and force, but that to me is potentially one of the first stories of a series of stories around foreign money in venture that will just continue to Yeah. Drop.
Speaker 1:I always I always wonder about the actual impact of that stuff, though.
Speaker 2:Totally.
Speaker 1:Because it's like I I was I was joking about like imagine you're some spy and you work your way up and, you know, you wheezy your way into a big venture firm and you're like, oh, yeah. I'm with like the top dog. Like I
Speaker 2:I'm seeing everything.
Speaker 1:I'm seeing everything. And then the GP is just like the future is NFTs. And you're like, goddamn it.
Speaker 2:I will never bring honor
Speaker 1:to the weapons. Yeah. Like, yeah. Okay. So they steal the code to the monkey pictures.
Speaker 1:It's like there are a lot of companies where, actually, the IP Yeah. Is not that valuable when stolen. Like
Speaker 2:Yeah. But I'll give you a story that is relevant. So Deterrence, a defense tech startup Yeah. Was speaking with a high-profile rolling fund GP on AngelList.
Speaker 1:Yeah.
Speaker 2:And because Deterrence will be doing stuff that's critical to national security, we need to know who your end LPs are.
Speaker 1:Sure.
Speaker 2:And so we pressed said GP on who their LPs were, and he basically had to say, I don't know. I cannot verify Yeah. That they're not Yeah. That it's not Russian money, CCP money, etcetera.
Speaker 1:Yeah.
Speaker 2:And probably won't run this, but, between you and me, I independently verified with somebody who worked with the fund who was like, yeah, there's Russian money, like Yeah. 100%.
Speaker 1:Yeah. I mean, at the same time, like, I get that it's a bad look, but, like,
Speaker 2:I mean, I barely give
Speaker 1:the cap table to my own investors.
Speaker 2:But but if the problem is if some if that money gets invested into the company, it damages the company prospects. Right?
Speaker 1:Purely brand and optical? Or do you think that there's actually a path to No.
Speaker 2:It's USG USG being like, hey, we don't wanna work with a company whose end investors.
Speaker 1:Sure. Sure. Sure.
Speaker 2:Yeah. That that's just
Speaker 1:Yeah.
Speaker 2:That can damage the company.
Speaker 1:I just have trouble because, like, most investors, it's like you take a small check from them and you never hear from them again. And then a few of them, it's like, okay, they got some information rights. Like, maybe they get, like, a P&L. Yeah.
Speaker 1:Or, like, maybe they get a cap
Speaker 2:table once
Speaker 1:in a while. But it's like they're never getting access to, like, the git repo. They don't have any they're not badged. Like, they don't get to go in the company. And so I get that it's, like, a big thing in the abstract, but I feel like even if you found, like, 6 degrees of separation, there's some, like, you know, Russian or Chinese money in something.
Speaker 1:Like, I don't know that that's as dangerous as we think it is as opposed to, like, the the the more obvious thing of just, like, there's an employee who's been compromised and just has access to the whole network.
Speaker 2:Right.
Speaker 1:Like, that might be that might be
Speaker 2:what
Speaker 1:we actually need to be worrying about, like, more.
Speaker 2:I would say the other
Speaker 1:And then there's also, like, the Chinese and Russian money. There are expats. There are people who are just rich kids over there who, like, want to play in this
Speaker 2:They wanna get a YOLO.
Speaker 1:They wanna leave, or they like America, and they're just, like, not actually aligned. Of course, they could be turned. Of course, there could be spies. Like, that definitely happens, and that is an issue.
Speaker 2:I would say that
Speaker 1:Yeah.
Speaker 2:I would say, one, again, all this information is possible to get in numerous different ways. Yeah. Like, if you wanna find out about a company, almost any company, any tech startup that is, like, pre-seed through Series B right now Yeah. You or I could get their most recent fundraising deck by the end of the day Yes. If we actually wanted to.
Speaker 1:Yeah.
Speaker 2:Right?
Speaker 1:And We could probably find that out, and the same goes for, like, the next order of magnitude of companies Yeah. And so, okay, maybe there is something there, but, like, just having the deck isn't enough to really, like, create some sort of, like, geopolitical advantage. There's just a lot of steps to, like, actually, you know, shifting the geopolitical balance.
Speaker 1:Again, it's gonna blow up, and it's gonna be a really big, like, brand issue, and it's gonna be something that people really have to fight through and, like, work through these questions.
Speaker 2:It's tough for AngelList because founders already are not generally super excited about their deal going on that platform because of the information issues
Speaker 1:Yep. Just
Speaker 2:getting a thousand.
Speaker 1:And Carta had a similar thing, where they were, like, kind of selling data or something. They were trying to do secondary transactions. Right?
Speaker 2:Private market cap tables should inherently be private Yeah. And owned by the company Yeah. And Carta was leveraging that information to catalyze their Secondary. Their secondary market, which was the entire reason that they were gonna be a $10,000,000,000 company.
Speaker 1:Because if they become like the NASDAQ
Speaker 2:Because of your
Speaker 1:secondaries that's Yeah.
Speaker 2:Like massively that they can make 5% on every trade. Yeah. They're huge blocks
Speaker 1:Yeah.
Speaker 2:Trades. You could be facilitating billions of dollars of trades Yeah. A year. If it worked, it would have been
Speaker 1:Much more valuable than just a SaaS product.
Speaker 2:Yeah. The people that are like, who's investing in Carta at an $8,000,000,000 valuation? It's like, well, if you can be the platform for secondary transactions and get a 5% rate on everything, like, you're such an unbelievably large business. It's insane.
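The back-of-envelope math behind that thesis is simple: a platform fee times trade volume. The 5% take rate comes from the conversation; the volume figure below is hypothetical.

```python
# Sketch of the secondary-market take-rate argument: revenue is just
# facilitated trade volume times the platform's take rate.
# The $10B/year volume is an assumed illustration, not a reported number.

def platform_revenue(annual_volume_usd: float, take_rate: float = 0.05) -> float:
    """Annual fee revenue from facilitating secondary trades."""
    return annual_volume_usd * take_rate

# $10B of secondary trades at a 5% take rate -> $500M/year in fees
revenue = platform_revenue(10_000_000_000)
```

At software-style revenue multiples, fee income on that scale is how a cap-table tool argues its way to a multibillion dollar valuation.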
Speaker 1:But, yeah, I don't know. Do we know who was running Hone yet? Like, there's some dude who was, like, running around investing. Right?
Speaker 2:He probably was very well loved, because he's like, you get 2,000,000 for your fund, you get 2,000,000 for your fund, we're backing your next syndicate deal. Right? Like, money is, like, a way to make fast friends in the Valley. Right?
Speaker 2:Yep.
Speaker 1:Yep. That's true.
Speaker 2:Everybody loves a loose LP.
Speaker 1:What else should we cover? Should we do this one?
Speaker 2:What? Oh, yeah. That's great. I mean, so it's funny. I've I've DM I have no idea.
Speaker 2:I actually don't know who's behind the account at all. I don't. But they seemingly came from, like, finance Twitter and, like, just inserted themselves, because, like
Speaker 1:Yeah. It's one of these things where it's like it might not even be a girl.
Speaker 2:Totally. Totally.
Speaker 1:Who knows, but good posts. So Sophie says, I think we can do away with seed, Series A, B, etcetera. Either a round is a vibe round or an Excel round. And then Matt Turck says, something like this, with the midwit meme. And the midwit meme is just: at the early stage it's vibes, and the midwit is Excel.
Speaker 2:It's all it's all vibes.
Speaker 1:And, I mean, we're certainly seeing that, with, like, vibe rounds getting done in the seed stage and then also vibe rounds getting done in, you know, the tens of billions and beyond.
Speaker 2:The tens of billions those are the real vibe rounds.
Speaker 1:Yeah. Do you think it just all comes down to the founder's ability to create a vibe round at any point in time? Is it always better to have a lower cost of capital through a vibe round? A vibe round feels like inflated valuation, like a higher multiple.
Speaker 1:When I hear 100x revenue multiple, like, that sounds vibey.
Speaker 2:I think a better way to look at it is, does a company have product market fit or narrative market fit Mhmm. And you actually get crazier multiples for narrative market fit Mhmm. Because it's all just And so, an example being, if you have product market fit and you have 2,000,000 of ARR and you're adding a million of ARR every 3 months, you could probably go out and raise at, like, 60. Right? Yeah.
Speaker 2:Somebody's like, okay, like this is, you know, great momentum. This thing's working. If you have narrative market fit and like venture capitalists are sort of applying this lens to your business, which is like this is a crazy new market and a crazy team and they're just gonna figure it out, you can also get the 60. Right? Yeah.
Speaker 2:And so I often think, whether it's truly, like, a PMF round or a vibe round, they get done in a similar range if the caliber of founder is, you know, relatively equal. The real crazy multiples come when it's both. Right? When you have the product market fit and the narrative market fit
Speaker 1:That's true.
Speaker 2:And I'd give the example of that company Harvey. They're the legal AI company. And, you know, they have probably 10,000,000 of ARR, or tens of millions. I don't actually know. I think, like, people talk about their metrics, but it's sort of unclear. But the narrative market fit, which is that LLMs are gonna dominate the legal industry, is so strong that why would you not do this round at 800,000,000? Right?
Speaker 2:Like just crazy multiple because Yeah. Yeah, they're the category winner and like the narrative tracks.
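The multiples implied by the two rounds discussed are easy to make concrete. The $2M-ARR-at-$60M case comes straight from the conversation; Harvey's ~$10M ARR is explicitly a rumor in the transcript, so treat that figure as an assumption.

```python
# Implied revenue multiples for the two round archetypes discussed:
# a PMF round ($2M ARR raising at a $60M valuation) versus a
# narrative-plus-PMF round (~$10M rumored ARR at an $800M valuation).

def revenue_multiple(valuation_usd: float, arr_usd: float) -> float:
    """Valuation expressed as a multiple of annual recurring revenue."""
    return valuation_usd / arr_usd

pmf_round = revenue_multiple(60_000_000, 2_000_000)         # 30x
narrative_round = revenue_multiple(800_000_000, 10_000_000)  # 80x
```

Which is the point being made: stacking narrative market fit on top of product market fit roughly triples the multiple in this example.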
Speaker 1:And it's not like the early days of LLM development, where I mean, OpenAI probably made no money for, like, 8 years. Right? It was incorporated in, what, 2016 or something like that? Like, as a nonprofit, for that reason. And, like, all of those labs didn't really have products that anyone wanted to buy for a very long time.
Speaker 1:Right. Even GPT-1, GPT-2. Like, they had APIs. Even the first GPT-3, it was, like, kinda cool, but no one could figure out a way to, like, really incorporate it in a product. Yeah.
Speaker 2:There was a company this company Copy.ai. Yeah.
Speaker 1:Rocketed to,
Speaker 2:like, 20,000,000 of ARR. Because they actually figured out how to leverage the API Yeah. And it was great. And I think they had to completely pivot because people just started using ChatGPT. Right?
Speaker 2:Because it did exactly that for way less money.
Speaker 1:Yeah. There was another one, too. Some sort of writing assistant. I think it had, like, one of those personal names, not Harvey, but it was, like, a human name and
Speaker 2:Yeah. So I think vibe rounds or Excel rounds is right, but the way to look at it is to say, is this product market fit, you know, sort of, we're doubling down on this momentum, this actual business traction, or is it narrative, or is it both? And that's the crazy stuff.
Speaker 1:Yeah. There does seem to be, like, a certain type of founder that can just, like, will a vibe round into existence. Like Adam Neumann, to your point.
Speaker 2:Oh, it's category dependent. Right? But right now, if you're
Speaker 1:Because, like yeah. I mean, WeWork was not in some, like, AI megatrend. Right? Like, there was no megatrend of, like, oh, yeah, of course real estate in 10 years is gonna be completely different.
Speaker 1:Like, whereas, like, the legal profession is pretty consensus that, like, like, the tools are gonna get better.
Speaker 2:Yeah.
Speaker 1:Who's gonna be the winner? What will the market structure be? There's a lot of questions there. But you can clearly build a thesis around that. Whereas saying that, you know, the real estate market is gonna be dramatically different in 10, 20 years.
Speaker 1:Like, that's not
Speaker 2:gonna make an awful lot of sense to people. WeWork was an incredible product partially because it was venture-subsidized. Right? Yeah. The fact that you could pay $300 a month to walk into a beautiful space and have a desk and have coffee and snacks, like, they were just not charging enough.
Speaker 2:Right? And then you combine that with incredible positioning and storytelling from Adam around what this actually becomes. WeWork is, you know, a dominant brand within everyone's daily life. Right? Like, the whole vision of apartments and offices and
Speaker 1:Yeah.
Speaker 2:You know, it was easy to buy into that when he was adding a new WeWork every 2 days, and I'm sure the retention when it was really cheap was great, and, you know, people loved the product. So, again, that's him: he was forcing the narrative Yeah. But the product market fit was real.
Speaker 1:Yeah. He was able to like sprinkle the vibes on top. Yeah.
Speaker 2:It's funny, with really convincing founders, the most convincing founders convince themselves
Speaker 1:Mhmm.
Speaker 2:First, and then by nature of being a 100% convicted Yeah. They like, Breslow is a good example. Right? Breslow, when he goes out and pitches, is a 110% convicted in what he's saying. Mhmm.
Speaker 2:So much so that other people are like, this sounds crazy, guy's convinced and he's gotten this far. So, like, why would you not bet on him? Right?
Speaker 1:I
Speaker 2:think that wears off eventually. You think it
Speaker 1:wears off just because you
Speaker 2:become jaded because of, like,
Speaker 1:you're, like, as you get further in your career, like, you realize, like, wow, I was a 100% convicted about that particular thing and it didn't pan out. Therefore, maybe I should be less convicted about the next thing.
Speaker 2:I think it wears off on a founder to founder level. Right? Like nobody stopped believing in TK. Right? Imagine what Uber would be today.
Speaker 2:Yeah. It would probably be Waymo. You know, I think they have a partnership now or something. But
Speaker 1:I wonder yeah. I wonder if there's anything you can do to, like, upregulate conviction. Almost like the inverse Ayahuasca thing, you know.
Speaker 2:Yeah.
Speaker 1:I mean, I guess for some of the psychedelics, it does seem, you know, there was kind of a counterexample: well, Steve Jobs took psychedelics, and he came back extremely, you know Focused. Focused and probably more aggressive about, you know, I know the truth. I know that, like, this product is the right one. And he was, like, unwilling to accept a plastic screen on the phone. Like, it had to be glass, because that's what he, like, knew to be true.
Speaker 2:Meanwhile, the Apple designers today are, like, we're gonna put the camera offset from the back of the phone
Speaker 1:So it's like
Speaker 2:so that people have to buy a case Yeah. Which is gonna increase our AOV by 6%. It'll drive meaningful revenue. Where it's like, oh, you made a phone that, like, doesn't sit flat Yeah. And the camera lens sticks out, like, yeah.
Speaker 2:So, yeah, I think the topic of psychedelics is a funny one, because I do think that the technology industry has for a long time benefited from it Yeah. But it actually went too far. Right?
Speaker 1:Yeah.
Speaker 2:And, yeah, the whole Ayahuasca thing by itself I think will be studied. Like, to me, that's the overdosing-on-LSD of our generation Yep. Where, like, a lot of people use LSD and, you know, experience some
Speaker 1:Yeah.
Speaker 2:Range of benefits from it. Right?
Speaker 1:And the psychedelics might have just straight up been weaker back in the Jobs era Totally. Yeah. He might have been taking, like, one tenth of the dose that some tech middle manager takes on some Ayahuasca retreat Yeah.
Speaker 2:Like turbo
Speaker 1:fries their brain or whatever.
Speaker 2:Yeah. The funny thing is, I've always avoided it. Had no interest in it, and, you know, I've had plenty of opportunities where I could have chosen to do something like that. But once people understand the actual nature of an Ayahuasca retreat most of them involve going with 20 to a 100 strangers and sitting in a room. And the stories I've heard of people saying, like, yeah, I was in a room with a 100 people, everybody's laying in a cot, throwing up, having an exorcism. The funny thing is people end up ultimately losing themselves so much that I've heard of men starting to masturbate in these sort of, like, group settings.
Speaker 2:I know these 2 girls that run a popular health podcast in LA, and they were approached by this company in Costa Rica that said, hey, come down and experience it, like, you know, it's on us, it's comped. And they were telling me the story. They were like, yeah, half the room was throwing up, 10% of the room was just masturbating, and I was like, oh my god, I'm so sorry. And they were like, no, no, it was incredible. I was like
Speaker 1:Yeah. I
Speaker 2:was like, and what did you get out of it? These people come back and they're even more lost. It seems like it sends you down this the turbo fry. I think Mark called it turbo frying.
Speaker 1:Yeah. It was land shark he posted, like, the screenshot of land shark saying, like, you know, you gigafry your brain. Turbonormies get their brain gigafried because they're, like, not settled spiritually, and then they go into the space station.
Speaker 2:The thing with psychedelics, if you are incredibly mentally stable and fortified Yeah. And you do these things Yeah. It can have a positive effect in terms of helping you see things in a new way. Yeah.
Speaker 1:And you have to imagine I mean, I don't know exactly Jobs's timeline for this stuff, but it's possible that, like, he had already created $1,000,000,000 of value and created, like, some of the greatest advances in computing history ever. And so, going into that, you're probably, like, pretty confident. Yeah. You're not, like, where's my place in the world? Like Yeah.
Speaker 1:He's going into that experience thinking my place in the world is to create great technology, and I'm really fucking good at it. And so I'm just doubling down on that when I come out of this.
Speaker 2:And psychedelics help you see things from new angles.
Speaker 1:Yeah.
Speaker 2:That's the best way, I think, for somebody that hasn't ever tried them to look at it: you're looking at an object, like a computer, and your entire life you've looked at it one way. Yeah. Psychedelics help you view it from, like, this other way.
Speaker 1:Yeah.
Speaker 2:And you're just like, okay, what if we, you know, made the whole back structure clear so you could see into the machine.
Speaker 1:Yeah. Yeah. Right?
Speaker 2:And it's like that becomes an iconic feature of
Speaker 1:the product.
Speaker 2:Yeah. Yeah. Yeah. But it wasn't like Steve went into a cave and then invented computing.
Speaker 1:Yeah. Right? Like it was like
Speaker 2:he didn't come out with, like, the circuit board. Yeah. It was more so I think he deserves credit for things like design, and deeply prioritizing the user, and thinking about products in new ways and delivering elegant versions of that. Yeah.
Speaker 1:I do have a friend who is already, like, one of the most aggressive people I know, and always has been, and he did acid one time, and I was like, oh, what was that like? He said, yeah, like, stuff, you know, wiggled around, but my number one takeaway was that I need to be more aggressive. Yeah. And I was like
Speaker 2:It just makes you more of you.
Speaker 1:Yeah. Yeah. And I was like, that's the opposite of what you normally hear, where it's like, I'm so at peace with the world now. But instead it's, you know, I need to be even more aggressive.
Speaker 2:Yeah.
Speaker 1:Already one of the most aggressive guys. Okay. Should we talk about Turner's tweet? I kinda like this one. Did you see this?
Speaker 2:Tech tech ecosystems right now. SF, our most important startup is imploding. NYC, our mayor just got indicted for corruption. Europe, we are banning AI. China, VCs are suing all the founders.
Speaker 2:Des Moines, Iowa, we'd like to buy 50% of your company for 50 k. I think it's great. I mean, it's pretty much
Speaker 1:Well, did you see the chart of, like, Chinese startup formation?
Speaker 2:Oh, yeah.
Speaker 1:It just, like, fell off a cliff.
Speaker 2:Yeah.
Speaker 1:And like no one's starting companies there?
Speaker 2:Yeah. Because they can claw back so Yeah. For those who didn't see it, basically, if your company fails, they can, like, come after all of your Personal. Personal assets, which feels fairly communist
Speaker 1:Yep.
Speaker 2:Which makes sense.
Speaker 1:Makes sense.
Speaker 2:But, yeah. I mean, I think that's what makes technology beautiful. Right? People, like, try to bluntly put tech in a single bucket, and even to people that aren't in the industry, if they ask me what I do, I say I'm in tech. Right?
Speaker 2:Like, and then they have an idea of what that means, but tech is surprisingly, like, different in different locales. Yeah. Right? Like, it's still operating. Yeah.
Speaker 2:The real alpha is to go to the Midwest and just start offering slightly better terms.
Speaker 1:Yeah. Yeah. Yeah. Yeah. I'll give
Speaker 2:you 50
Speaker 1:60 k. I'll give
Speaker 2:you 50 on 500. Yeah. 55 k.
Speaker 1:Yeah. 50 percent of your business. Yeah. I mean, the funny thing is that, like, these are framed as, like, they're all in chaos. Like, it's all bad news everywhere.
Speaker 1:Like, everything's falling apart. But I read this as, like, you have to be in a major American city. Yeah. Yeah. Yeah.
Speaker 1:Yeah. I would choose SF and New York, where one startup is imploding and the mayor's getting indicted, versus banning AI, VCs suing founders, or, clearly, like, you know, capital markets that are just, like, you know, defunct, basically. It's like yeah. Like, SF, I don't even know if I buy the imploding narrative, but, like, if it is, that's a lot of opportunity.
Speaker 1:That's maybe creative destruction or creative disruption. Maybe there'll be, you know, all year. Time for
Speaker 2:us to go join OpenAI and climb the ladder. That's the long term. What's your long-term vision for Technology Brothers? COO and CMO of OpenAI. Yeah.
Speaker 1:A more hostile takeover. You know, they always talk about, like, you know, Sam Altman. If you drop him on the island, he could become, like, the cannibal. It's like, no. Like, I could even do it better.
Speaker 1:I could
Speaker 2:do it better. The knife. Yeah. Yeah. And who knows?
Speaker 2:But I do think that there's this tendency to latch on to headlines Yeah. And, just behind the headline, in every single one of those cities there's amazing companies that are hitting their stride, finding, you know.
Speaker 1:Except in Europe.
Speaker 2:Except in Europe.
Speaker 1:Europe seems in a lot of shambles. Shambles.
Speaker 2:It seems. It's time to colonize Europe.
Speaker 1:Yeah.
Speaker 2:Yeah. They colonized America.
Speaker 1:Well, I was It's
Speaker 2:time to colonize it better.
Speaker 1:I was joking about, like yeah. Like, either in Germany or France like, yeah, France specifically, you know, it just seems like authoritarianism is on the rise. Like, they're banning free speech. They're, like, arresting Pavel Durov, the Telegram guy.
Speaker 1:And it's like, yeah. Every 80 years, like, America has to go in and, like, clean it out. Clean house. Clean house. It's like, we gotta go in Alright.
Speaker 2:This was a fun social experiment. There's only one form of government that actually works.
Speaker 1:Yep. And we're here to sort of bring it back. It's time to reinvade.
Speaker 2:We've brought democracy. We've exported it it's the greatest export. Right? It is. Free markets and democracy.
Speaker 2:Yeah. But sometimes people forget that. Yeah.
Speaker 1:Yeah. Yeah. Yeah. What else is on the docket today? What else do we have?
Speaker 2:I had a funny moment this week that you'll appreciate. Yeah. Somebody asked me about Excel and Yeah. Blah, and they were like, yeah, I've been working on a pouch concept.
Speaker 2:I think that, it could be really good business to just get to, like, 40 to 50,000,000 in cash flow.
Speaker 1:Yeah.
Speaker 2:And I was like, yeah. I mean, that's, like, cool story, bro. I was like, you're gonna go into one of the most cutthroat, competitive industries 7 to 8 years behind Lucy Yeah. And then just think that you're gonna, like, ride to cash flowing some asset Yeah.
Speaker 2:From 0 as there's all this money coming in and, like, it's a turmoil. And it made me think of something that I think is a topic Yeah. That that more people need to understand is, like, there's this attitude within, like, the, like, startup founder of, like, I'm gonna try to find these, like, easier ideas Mhmm. And usually usually they land on ideas that are perceived to be easy for some reason, which is like, okay, I get a plastic thing Yeah. Put nicotine pouches in it Yeah.
Speaker 2:I just start selling it
Speaker 1:Yep.
Speaker 2:And I'll get to like 40, 50 mil and just cash flow it.
Speaker 1:Yeah.
Speaker 2:It's not a venture thing, but like Yeah. And there's this idea too that for some reason cash flow businesses are easy to build, when it's like, okay, Lucy wouldn't have been able to survive without venture capital to
Speaker 1:get to the world. Impossible.
Speaker 2:And so I think it's probably a topic that will come up again. I just see it come up all the time where people are trying to find easy ideas.
Speaker 1:The exception in the tobacco category and the nicotine space is just straight up being illegal. Like Yeah. There are companies that cash flow a lot because they just break all the laws and don't do anything about it. And so Yeah. But they're very short-lived, and there's no equity value. And I think when people think of, like, a cash-flowing asset, it's like, oh, yeah.
Speaker 1:It'll cash flow for a long time. Not like, I mean, like, all our cash flow. Businesses. Yeah. Yeah.
Speaker 1:Yeah. Yeah. A lot of these a lot of these vape businesses, like, popped up overnight. Yeah. Just, like, completely broke whatever rule was in place.
Speaker 1:And it's like, well, you know, Big Tobacco isn't gonna break that rule. Juul isn't gonna break that rule any anymore, like, has been you know, like, if there's a flavor ban, they can just, like, move around it or whatever. Yeah. But, yeah, it's it's wild.
Speaker 2:Yeah. So that that was something that that came up multiple times this week where founders were just searching for what they perceived to be an easy idea because it was very tangible. Yeah. Yeah. Yeah.
Speaker 2:And then you're just like, no. Well, you know, you know that this is like gonna be extremely difficult on all these dimensions. Yep. You've had a and and in that situation I was like best in class team with Lucy that's been executing on the same vision for almost closer to a decade Yeah. And Yeah.
Speaker 2:1st year. Yeah. And so that was funny. And then the other thing that the other thing that came up that I think is funny, I was the the text that I sent to a buddy was real men raise pre product because like I we have a friend I have a friend who's starting a new company and he's like hire he's done well previously. He's like hiring engineers for the new company, but he hasn't he hasn't raised yet.
Speaker 2:And I was texting him wanting to invest and he's like, yeah, we're gonna wait till we get to like some revenue traction. I was like, God, like, it's so soft. Just like, you know, like, and that's again like the vibe round versus like the traction round is like, I'll give you the valuation that you you will get when you have the traction. I'll give it to you now.
Speaker 1:Yeah. Yeah.
Speaker 2:Yeah. But, like, just fucking sit, like
Speaker 1:Send it. Send it.
Speaker 2:Yeah. Right. So, like, be a man. Like a race. Like, if you have conviction, have some conviction in your idea, raise free products.
Speaker 1:Like Yeah.
Speaker 2:It's soft to do anything otherwise. Like Yeah. The venture economy depends on people Yeah. Going full send. Yeah.
Speaker 2:Yeah. Yeah. Yeah. Recruiting the economy.
Speaker 1:I always I always laugh thinking about this this conversation that, Mark Andreessen was having with Ken Griffin. He's talking about, like, his earlier funds and how he was, like, kind of trading his own money. And then he, like, scaled up the fund, raised got some LPs, like, got some meat on his bones so he could, like, size up the firm and, like, start going bigger and bigger. And now, you know, he still owns, like, a huge amount of it. But he's just, like, you know, Mark and, you know, obviously, as you know, like, you know, if you don't if you don't get some capital behind you, like, you don't go very far.
Speaker 1:And he just said it, like, it was just, like, such a truism, and, like, it was just such a funny, like, disregard, like, anyone who's, like, oh, bootstrapping. Like, it's easier and stuff. He was like, no. No. No.
Speaker 1:Like, not if you wanna play in his league. Yeah. Yeah. It was just like it was just like, yeah. Like, obviously, you need capital if you have a good idea and are moving quickly and want to, like, actually win and play some, like, deeper, bigger game.
Speaker 2:Yeah. But
Speaker 1:the but the the myth of, like, oh, yeah. A lot of a lot of entries burn out. Therefore, there's some sort of lifestyle business might be easier or something. I've never liked it.
Speaker 2:Yeah. The the the issue when people analyze bootstrap versus venture businesses is they'll be like, look at this business. It was bootstrapped. Yeah.
Speaker 1:Yeah.
Speaker 2:And it sold for $1,000,000,000. Yeah. And they're like, look at this company. This company had to raise $300,000,000 to get to $1,000,000,000. Discounting the fact that the bootstrap business took 30 years.
Speaker 2:You know? Like, they're like, look, they were able to bootstrap it and it's like, yeah, you'll be 70
Speaker 1:years old. Yeah. Like,
Speaker 2:so anyways real men. I'll just end this podcast by saying real real men raise pre product.
Speaker 1:Indeed.