WEBVTT
00:00:00.320 --> 00:00:08.880
The United States of America will never allow a radical left woke company to dictate how our great military fights and wins wars.
00:00:09.199 --> 00:00:15.439
So wrote the president of the United States on Truth Social the week before the USA went to war in Iran.
00:00:15.759 --> 00:00:16.640
Welcome back.
00:00:16.800 --> 00:00:20.640
This is the AI Transition, and I'm Steven, and with me as always, Lauren.
00:00:21.120 --> 00:00:24.079
Steven, woo-hoo! What a lead-in! Oh my god.
00:00:24.559 --> 00:00:27.039
Just another sort of small topic for us to cover today.
00:00:27.519 --> 00:00:30.960
Look, there's not much out there on this one at the moment, so it's gonna be tricky.
00:00:32.320 --> 00:00:42.000
So obviously, this is in the news all over the place. We're literally seeing a war getting played out, after what happened a couple of weeks ago with Anthropic.
00:00:42.159 --> 00:00:44.719
Our plan is to uh recap the background.
00:00:45.600 --> 00:00:48.159
Then how the government said no.
00:00:49.759 --> 00:00:50.640
Computer said no.
00:00:50.960 --> 00:00:53.840
Computer said no to government says no.
00:00:54.079 --> 00:00:54.880
Government says no.
00:00:55.119 --> 00:01:01.200
The fight back from the computer dudes, and then potentially something you can do about it all as well.
00:01:02.799 --> 00:01:03.840
Computer says no, gummy.
00:01:04.079 --> 00:01:04.480
Well, that's true.
00:01:04.560 --> 00:01:05.040
What's just this?
00:01:05.120 --> 00:01:05.519
Who says no?
00:01:07.040 --> 00:01:07.920
Exactly.
00:01:08.560 --> 00:01:11.280
Well, I think we should maybe say no while we can, to be honest.
00:01:11.359 --> 00:01:13.200
So that sounds good to me.
00:01:13.920 --> 00:01:19.040
So maybe if we just quickly recap what happened in the previous episode and the lead up to this.
00:01:19.280 --> 00:01:19.519
Sure.
00:01:19.599 --> 00:01:22.480
So with our last episode, Computer Says No.
00:01:22.640 --> 00:01:37.200
We talked, well, not a little bit, a lot, about Anthropic's 23,000-word constitution for its AI model, Claude, which was designed to embed core ethics and values directly in the model rather than a hard set of arbitrary rules, right?
00:01:37.280 --> 00:01:42.959
And we dove into this awesome badass philosopher they had working with the team, who helped develop that framework.
00:01:43.120 --> 00:01:48.719
But it was specifically there to prevent blind obedience, even to government operators.
00:01:48.959 --> 00:01:52.319
And then we saw things get really interesting with the Pentagon, right?
00:01:52.560 --> 00:01:57.840
So the US Department of War. Still blows my mind that that's what it's called. I know, I know.
00:01:58.239 --> 00:01:59.840
It was Defense, and now it's War.
00:02:00.239 --> 00:02:16.719
Um, they approached Anthropic, right, saying, hey, we've got a contract for you, but we need blanket permission to use Claude for mass surveillance of American citizens and also for autonomous weapon systems that could fire without having a human in the loop.
00:02:16.800 --> 00:02:17.360
And guess what?
00:02:17.439 --> 00:02:20.240
That didn't really align with the AI constitution.
00:02:20.479 --> 00:02:20.639
Yeah.
00:02:21.039 --> 00:02:23.120
Fundamental to how Claude is meant to work.
00:02:23.439 --> 00:02:27.520
Look, in fairness, and I can't believe I'm saying in fairness to the Department of War.
00:02:27.680 --> 00:02:34.639
What the pushback was, was that they said they wouldn't have anyone telling them what to do at all.
00:02:34.879 --> 00:02:35.199
Right.
00:02:35.360 --> 00:02:42.080
So yes, these are two red lines, but the point was the Department of War saying, no, no, no, you don't get to decide that.
00:02:42.240 --> 00:02:43.120
We we decide that.
00:02:43.199 --> 00:02:48.159
We might not do that anyway, but it's us that makes that call, not you.
00:02:48.960 --> 00:02:53.120
So it's a real power battle that was going on.
00:02:53.439 --> 00:02:56.000
So that's where we left it a couple of weeks ago.
00:02:56.240 --> 00:03:08.000
And what happened next was the February 27th deadline passed, which the Department of War had set, and Secretary Pete Hegseth, lovely man, designated Anthropic a supplier risk.
00:03:10.479 --> 00:03:10.639
Indeed.
00:03:11.599 --> 00:03:15.280
Indeed, you know, you're kind of top-of-the-tree talent there.
00:03:15.360 --> 00:03:18.479
Um that's where you go when you're thinking about your military strategy.
00:03:18.800 --> 00:03:22.639
It is, it is, and also, where do you go after you've been a news presenter?
00:03:22.719 --> 00:03:29.840
I mean, if you're in charge of the Department of War for, you know, the superpower of the world, what are you going to do?
00:03:30.000 --> 00:03:31.199
I mean, where is the next step?
00:03:32.000 --> 00:03:32.159
Yeah.
00:03:32.319 --> 00:03:34.080
Where can we go next ourselves, Steven?
00:03:34.159 --> 00:03:35.039
It begs the question.
00:03:35.360 --> 00:03:38.400
How did I combine us into that whole conversation? That was bad.
00:03:43.439 --> 00:03:45.360
So they set a deadline.
00:03:45.520 --> 00:03:46.960
That deadline passed.
00:03:47.199 --> 00:03:54.000
Um and they were designated, although it's a little bit ambiguous whether they were, whether they weren't.
00:03:54.240 --> 00:03:56.879
Do you believe the Truth Socials?
00:03:57.280 --> 00:04:07.039
But this designation gives the Secretary of War the authority to exclude a company entirely from competing for military contracts or subcontracts.
00:04:07.120 --> 00:04:09.199
So basically kicking them out of government, right?
00:04:09.280 --> 00:04:19.920
Which, for a company that's basically a B2B company, is pretty serious if one of your major clients, who is the United States government, is saying, you're now a risk.
00:04:20.079 --> 00:04:28.560
And uh that label has only ever been used in the past for foreign adversaries and non-US companies, which is wild.
00:04:28.879 --> 00:04:30.000
Oh, again, mind-blowing.
00:04:30.079 --> 00:04:33.360
It just sounds a little bit like this whole little tariff game that gets played as well.
00:04:33.519 --> 00:04:40.480
Here's the version for you, and we're gonna take you out of all the government procurement opportunities and shut you down that way.
00:04:40.720 --> 00:04:46.639
And did Trump seriously say that? There was a big meltdown on Truth Social.
00:04:46.959 --> 00:04:47.759
There's the words right there.
00:04:47.839 --> 00:04:49.439
Oh, we're going deep in this, aren't we?
00:04:49.600 --> 00:04:54.319
You know, he threatened the full power of the presidency and said he's gonna fire them like dogs.
00:04:54.480 --> 00:04:55.600
How do you fire a dog, by the way?
00:04:56.079 --> 00:04:57.920
Actually, uh why would you fire a dog?
00:04:58.000 --> 00:04:58.800
I don't know.
00:05:00.399 --> 00:05:02.560
Well, it depends how obedient they are, huh?
00:05:03.680 --> 00:05:07.120
Sorry, and I don't want to go there, but is that shoot them like dogs, not fire them?
00:05:07.839 --> 00:05:08.959
But sorry, Lauren.
00:05:09.759 --> 00:05:11.839
I know you're a cat guy, but oh my god, I've been a kid.
00:05:13.040 --> 00:05:26.240
Um, yeah, like, you know, he was talking about Anthropic's attempt to actually enforce these terms of service, which, by the way, are meant to be good, solid ethics, not murdering a bunch of people, as a disastrous mistake.
00:05:26.399 --> 00:05:29.519
Like, it's really ironic with what we're seeing going on in Iran.
00:05:29.839 --> 00:05:30.000
Yeah.
00:05:30.240 --> 00:05:35.120
And let's recap what he actually said, which we said at the top of the show as well.
00:05:35.360 --> 00:05:45.199
And he said, the United States of America will never allow a radical left woke company to dictate how our great military fights and wins wars.
00:05:45.839 --> 00:05:52.480
We're going to let the Fox News guy dictate how we wage war instead.
00:05:53.120 --> 00:05:58.879
There's times when you think you're living in a sci-fi movie, and to be honest, it doesn't sound like a very good sci-fi movie.
00:06:00.639 --> 00:06:02.720
How are they going to turn this around in the third act?
00:06:03.040 --> 00:06:06.160
That's like, yeah, I don't I don't quite believe this one, to be honest.
00:06:06.399 --> 00:06:22.240
Um, but then what happened was there was a memo that was leaked, interestingly, saying that all military commanders had to rip Anthropic's technology out of all nuclear missile systems and cyber systems within 180 days.
00:06:22.800 --> 00:06:25.120
Well, we know how easy it is to decommission systems, Lauren.
00:06:25.279 --> 00:06:26.160
I mean, we've done this all our life.
00:06:26.879 --> 00:06:27.360
Just a breeze.
00:06:27.680 --> 00:06:31.120
I mean, you basically just go in and you unplug it, and that's it, it's done.
00:06:31.360 --> 00:06:32.720
And 180 days, right?
00:06:32.879 --> 00:06:34.000
So about six months.
00:06:34.240 --> 00:06:40.079
And we're talking about nuclear systems, drones, firefight, you name it.
00:06:40.240 --> 00:06:44.399
Oh, it's um war fighters, uh mission critical activity.
00:06:46.240 --> 00:06:46.879
Yes.
00:06:47.120 --> 00:06:47.680
Yes.
00:06:47.920 --> 00:06:53.519
So the Department of War's Chief Information Officer... I know I'm kind of laughing at this, but it's my defense mechanism.
00:06:54.959 --> 00:06:56.399
Because this isn't funny stuff, right?
00:06:56.560 --> 00:07:04.079
Um, Kristen Davis said on March the 6th that these vulnerabilities in AI pose catastrophic risks to war fighters.
00:07:04.319 --> 00:07:09.680
So the catastrophic risks that she's alluding to are these ethical boundaries.
00:07:09.839 --> 00:07:11.600
And that's a catastrophic risk.
00:07:11.920 --> 00:07:15.199
And so they've got 180 days in order to rip these systems out.
00:07:15.680 --> 00:07:17.439
Again, it it's just mind-blowing.
00:07:17.519 --> 00:07:27.439
And she is, you know, the only official authorized by the government, if we can call it that, oh no, getting more and more controversial, to grant exemptions, right, for these mission-critical activities.
00:07:27.680 --> 00:07:29.360
That is, where there's no viable alternative.
00:07:29.519 --> 00:07:34.160
So she's made this call, the government's standing behind it, 180 days to pull this out.
00:07:34.959 --> 00:07:44.639
Uh, meanwhile, by all accounts, and some of this is obviously held behind secrecy and the rest of it, you know, the Pentagon has been using Claude.
00:07:45.040 --> 00:07:49.279
They used Claude all over the Venezuela raids and for Maduro, supposedly.
00:07:49.439 --> 00:07:55.040
Um by all accounts, Claude is fully embedded for identifying targets in Iran just now.
00:07:55.199 --> 00:07:57.279
So it's not like this isn't getting used.
00:07:57.360 --> 00:08:01.680
And, depending on where you stand on this, is that use ethical or not?
00:08:01.759 --> 00:08:07.439
You know, I mean, Anthropic set the boundaries at mass surveillance of the American public.
00:08:07.759 --> 00:08:12.720
Although I'm not sure if it's all public or just the American public, I'll have to go and double check that one.
00:08:12.959 --> 00:08:13.199
Right.
00:08:13.279 --> 00:08:23.120
Um, and whether the AI systems have that final kill switch, where you don't need a human in the loop. But all the way up to that, they absolutely are getting used just now.
00:08:23.439 --> 00:08:27.759
Um so you know, big big stuff.
00:08:28.079 --> 00:08:28.639
Big stuff.
00:08:28.800 --> 00:08:38.720
And again, this is just the little that we can garner from what's out there in the news as to what's really going on, how deeply it's being used, and what it's actually being used for as well.
00:08:39.120 --> 00:08:42.960
So this then sparked somewhat of a fight back that was going on.
00:08:43.120 --> 00:08:46.399
And there is a massive lawsuit that Anthropic has now filed.
00:08:46.559 --> 00:08:49.440
Um I had a brief look over the actual lawsuit itself.
00:08:49.519 --> 00:08:53.840
Okay, I took the lawsuit and I threw it into Claude and I got him to summarize it for me.
00:08:54.000 --> 00:08:54.320
Right.
00:08:54.399 --> 00:08:57.120
Uh and it's it's really, really uh interesting.
00:08:57.200 --> 00:09:00.240
I mean, this is First Amendment stuff, and this is unprecedented.
00:09:00.320 --> 00:09:06.799
As we said, you know, no other organization in the history of America has ever been um hauled over the coals like this.
00:09:06.879 --> 00:09:14.639
You know, they've done it to companies like, you know, Huawei and other Chinese companies, etc., but they've never done this to an American company itself.
00:09:15.039 --> 00:09:20.240
Uh and so there's a massive lawsuit that's now going through the courts, but you know how long these things take.
00:09:20.399 --> 00:09:25.200
I mean, it you know, it won't even get through the first rung of the courts in the next three to six months, probably.
00:09:25.360 --> 00:09:27.279
But we're fighting back.
00:09:27.919 --> 00:09:39.600
Yeah, and again, it's all arguing that the government's actions are unconstitutional because they're explicitly designed to punish Anthropic for communicating its protected viewpoints on safety.
00:09:39.759 --> 00:09:45.360
So when that constitution was discovered and people started to realize, hang on a second, isn't that part of the military?
00:09:45.600 --> 00:09:55.039
Then this, you know, forced this to come to a head in terms of how much the government can control this software that's, if we can call it that, actually got ethics built into it.
00:09:55.519 --> 00:10:13.440
So it was interesting that Anthropic called it a constitution, because I've heard this now in a number of different podcasts and reading material, saying that the American military has found this constitution within this model and it's not the American constitution.
00:10:13.679 --> 00:10:14.559
So how dare they?
00:10:14.799 --> 00:10:15.600
So how dare they?
00:10:15.759 --> 00:10:20.879
Because it should be the American constitution that's at the base of this and not this anthropic constitution.
00:10:21.600 --> 00:10:28.159
Interesting choice of words, and playing to that kind of populist bent of, well, you know, people are going to say, of course it should be the American constitution that's in there.
00:10:28.559 --> 00:10:37.840
And of course, it just forgets the global nature of what we're dealing with here, and the arrogance of our friends.
00:10:38.080 --> 00:10:38.159
Yes.
00:10:40.960 --> 00:10:42.879
Well, this is part of the fight.
00:10:43.039 --> 00:10:43.440
We're fighting back.
00:10:43.600 --> 00:10:44.159
Okay, we're fighting back.
00:10:44.240 --> 00:10:46.000
This is part of the fighting back, yeah.
00:10:46.399 --> 00:10:47.360
It's part of us fighting back.
00:10:47.679 --> 00:11:00.480
But literally on the same day that the deadline passed, and this was Friday, the 27th of February, Sam Altman, you know, a couple of hours afterwards came on and said, hey, don't worry, we've taken the government contract.
00:11:00.720 --> 00:11:01.360
We'll do it.
00:11:01.600 --> 00:11:02.639
We'll do it.
00:11:02.960 --> 00:11:06.159
And there was quite a lot of jaw-dropping at that one that went on.
00:11:06.879 --> 00:11:09.600
But with the same boundaries, like, hey guys, we're gonna do it.
00:11:09.840 --> 00:11:12.720
No, we're not gonna do anything unethical, but but we'll do it.
00:11:12.960 --> 00:11:20.639
So I don't quite understand how this works, because they say, we haven't actually dropped any of our safety principles to secure this contract.
00:11:20.879 --> 00:11:23.519
So were the ethics not there before?
00:11:23.600 --> 00:11:29.120
Well, actually, the argument is that this is exactly what Anthropic were asking for.
00:11:29.200 --> 00:11:31.440
That's what OpenAI has now signed.
00:11:31.600 --> 00:11:35.120
Now, a lot of people are incredibly skeptical on that.
00:11:35.279 --> 00:11:43.039
But maybe it's because, and I'm looking back to the the the notes here, maybe it's because that OpenAI isn't a radical left-woke company.
00:11:43.200 --> 00:11:43.360
Right?
00:11:43.600 --> 00:11:43.759
Right.
00:11:43.840 --> 00:11:43.919
Yeah.
00:11:44.879 --> 00:11:50.320
They didn't come out with this ethical nonsense, with a philosopher trying to teach its system some values.
00:11:50.639 --> 00:11:50.960
Correct.
00:11:51.120 --> 00:11:54.080
Scottish philosopher as well, just to kind of Oh well, you're the problem.
00:11:54.960 --> 00:11:55.840
There it is, right there.
00:11:57.279 --> 00:12:00.159
Um, and, you know, more to the point.
00:12:00.240 --> 00:12:03.440
So then there was a backlash against OpenAI itself.
00:12:03.679 --> 00:12:04.320
There's a shock.
00:12:04.879 --> 00:12:10.480
I think it's also, like, the human side. You start to realize, hang on a second, I have this on my phone, I'm paying a subscription for it.
00:12:10.639 --> 00:12:12.240
I'm kind of contributing here.
00:12:12.639 --> 00:12:13.120
Yes.
00:12:13.360 --> 00:12:14.720
Yes, we'll get to it in a minute.
00:12:15.200 --> 00:12:22.240
So it turns out that the OpenAI president Greg Brockman had made a $25 million donation to Trump.
00:12:22.559 --> 00:12:25.120
So it's like maybe this is something to do with it.
00:12:25.360 --> 00:12:30.159
Maybe this is a small reason why OpenAI might have got the contract and Anthropic didn't.
00:12:30.240 --> 00:12:38.240
Because, you know, surely that $25 million shows that they're not a radical left woke organization.
00:12:38.559 --> 00:12:38.960
Yeah.
00:12:41.039 --> 00:12:41.600
Oh my god.
00:12:43.200 --> 00:12:47.919
Um, and these numbers keep changing every time we look them up.
00:12:48.000 --> 00:12:49.039
But they doubled.
00:12:49.120 --> 00:12:53.360
I think we started to write this, you know, late last week and then jumped in and checked the numbers.
00:12:53.440 --> 00:13:00.080
And you've got 1.5 million people cancelling ChatGPT, which was 700,000 last week.
00:13:00.159 --> 00:13:01.759
So it's doubled since we last checked.
00:13:01.840 --> 00:13:09.360
Behind this there was a big campaign basically saying cancel your subscription, because ChatGPT is a subscription-based model.
00:13:09.840 --> 00:13:10.320
Yes.
00:13:10.559 --> 00:13:12.399
So this is where your money's going to fund it.
00:13:12.559 --> 00:13:20.240
So lots of public outrage, and big, you know, celebrities getting behind this and trying to, you know, stir the public's interest.
00:13:20.399 --> 00:13:22.240
You've even got some civil rights groups.
00:13:22.320 --> 00:13:27.919
I think there's a couple we've quoted there, Common Cause, Young Americans for Liberty, there's a great name.
00:13:28.159 --> 00:13:33.039
And they're sending letters to Congress saying, hey, we need to actually halt the use of this altogether.
00:13:33.200 --> 00:13:35.919
You know, AI for mass surveillance and autonomous weapons.
00:13:36.000 --> 00:13:44.399
You know, we we spoke about this, I think when we first started the pod, that real red flag when you start to see this mass surveillance, let alone in this situation.
00:13:44.639 --> 00:13:46.799
And now we've got the big quick GPT.
00:13:47.519 --> 00:13:54.879
And maybe, are these things just too big to fail now, and we're nibbling at the elephant around the sides?
00:13:55.039 --> 00:14:01.279
And this is just going to play out, with, you know, our lords and masters working this one out themselves.
00:14:01.440 --> 00:14:02.799
But you know, we'll see, right?
00:14:02.879 --> 00:14:06.159
This has obviously become very public.
00:14:06.399 --> 00:14:10.879
Hopefully, this is breaking through now into the mainstream, you know, of how important all this stuff is.
00:14:10.960 --> 00:14:12.639
And that's why we're kind of focusing on this.
00:14:12.960 --> 00:14:13.200
Absolutely.
00:14:13.279 --> 00:14:17.440
And obviously, you could go to a really dark place digging into all of it and trying to fix all of it.
00:14:17.519 --> 00:14:19.679
But it is quite empowering to know that you can have some say.
00:14:19.759 --> 00:14:23.519
You've got some kind of way that you could at least, you know, put a little in.
00:14:23.679 --> 00:14:27.440
Like you said, we're under no illusions as to how much of this we can solve.
00:14:27.600 --> 00:14:33.600
But it's important to know, when you're buying into these paradigms, where your money's actually going.
00:14:33.919 --> 00:14:34.080
Yeah.
00:14:34.320 --> 00:14:49.519
And, you know, if this stands, the American president can basically just take a dislike to a company like that and destroy them. That's pretty serious from a free market point of view, for what happens to the software industry, et cetera.
00:14:49.600 --> 00:14:53.759
Or maybe we'd be naive, and, you know, this is the way that it runs now, right?
00:14:53.840 --> 00:14:55.919
That this is just the way it is.
00:14:56.000 --> 00:14:59.039
Um it's yeah, again, I think we were talking a little while ago.
00:14:59.200 --> 00:15:01.600
We probably weren't naive to it.
00:15:01.759 --> 00:15:10.960
We knew things would move fast, but in terms of what we've seen, capability-wise and functionality-wise, can't use my words, even over the last month it has been huge.
00:15:11.120 --> 00:15:22.639
And particularly now that you see front runners, and we'll talk about this in future pods, like Claude coming out with different capabilities, how quickly all the competitors are rising to the occasion because they see that market opportunity.
00:15:22.960 --> 00:15:28.399
Well, why don't we quickly take a bit of a detour there and talk about those competitors and the different sorts of models that are out there?
00:15:28.480 --> 00:15:30.639
Because you're doing some really interesting research on that.
00:15:31.039 --> 00:15:32.559
Right, let's just touch on it, Steve.
00:15:32.639 --> 00:15:33.120
You're very kind.
00:15:33.279 --> 00:15:42.480
So when we're thinking about how the government was so easily able to impact Anthropic, you've got all these different main players in the market at the moment, if we can call it such a thing.
00:15:42.559 --> 00:15:45.759
We've got the big ones being OpenAI, Anthropic, and Google.
00:15:45.840 --> 00:15:49.840
They're all running quite different businesses in terms of how they actually make their money.
00:15:50.000 --> 00:15:55.200
So you've got OpenAI, which is primarily a consumer subscription company, right?
00:15:55.360 --> 00:15:59.600
So they get 85% of their revenue from individual users.
00:15:59.759 --> 00:16:05.600
Um and it's actually a small percentage of those users that are paying a subscription that are actually keeping them afloat.
00:16:05.759 --> 00:16:07.360
But they're now pivoting, right?
00:16:07.440 --> 00:16:11.279
Because they've been trying to hit this huge revenue target out there in the market.
00:16:11.360 --> 00:16:15.279
So it's interesting times for OpenAI in terms of where it goes to next.
00:16:15.360 --> 00:16:20.960
And this is where you're starting to see uh the rise of them potentially putting ads in, which is a whole.
00:16:21.440 --> 00:16:27.600
Well, and that was raised quite a lot, you know, a month or two ago, but um has quietly been pushed to the sides for a bit.
00:16:27.679 --> 00:16:28.960
So we'll see what happens there.
00:16:29.120 --> 00:16:35.759
But, you know, that 125 billion revenue target, you know, if it's $20 a month, that's a lot of users to get to.
00:16:35.919 --> 00:16:40.480
It's a lot of users, and you just lost a lot, you know, one and a half million pretty quickly.
00:16:40.879 --> 00:16:45.200
So then you've got Anthropic, and they're quite different because they're more of a B2B infrastructure company.
00:16:45.279 --> 00:16:49.679
So about 70% of their revenue is taken from API token consumption, right?
00:16:49.840 --> 00:17:00.159
So this is about 300,000-plus business customers, eight of the Fortune 10, and Claude Code hitting about 2.5 billion in nine months.
00:17:00.480 --> 00:17:05.359
I think it's really interesting the way that Anthropic is is a different sort of company with different sorts of products.
00:17:05.440 --> 00:17:14.319
And since we've been getting much more into the Anthropic ecosystem recently, how it probably works just a lot better from a business point of view as well.
00:17:14.559 --> 00:17:23.519
Um, so it'll be interesting to see whether it's the consumer one or the business-led one that kind of wins, or is it the dominant player, which is the last one, Lauren?
00:17:24.000 --> 00:17:24.400
Absolutely.
00:17:24.480 --> 00:17:37.359
And then the third, you've got Google's Gemini, which is, well, I think they call it a defensive play, where they're protecting their existing, what, $200 billion ad empire, which kind of blows my mind.
00:17:37.440 --> 00:17:41.680
But when you think about the Google ecosystem and how it works.