March 16, 2026

Government Says No!

The Pentagon wanted Claude for mass surveillance and autonomous weapons. Anthropic said no. Trump said fire them like dogs. Then it got weird.

This week we're unpacking the biggest AI story of the year — the full-blown collision between the US government and the company that dared to build ethics into its AI model. We recap how Anthropic's constitutional AI framework put it on a collision course with the Department of Defense, what happened when the February 27th deadline passed, and why a label previously reserved for Chinese adversaries like Huawei is now being pointed at an American company.

We get into the leaked memo ordering military commanders to rip Anthropic's technology out of nuclear and cyber systems in 180 days — despite Claude reportedly being actively embedded in military operations right now. We look at the First Amendment lawsuit Anthropic has fired back with, OpenAI's eyebrow-raising decision to step in and take the contract, and the $25 million donation that might explain a thing or two.

And then we bring it back down to earth. Because underneath all the geopolitics, something genuinely exciting is happening with these tools — and if you wrote AI off six months ago, it's time to look again.

The government said no. The computer said no. The question is — what do you say?

00:00 Intro

00:31 The Backstory

08:35 Anthropic Fights Back

15:31 The AI Landscape

18:41 The Agentic Revolution

28:14 Outro

WEBVTT

00:00:00.320 --> 00:00:08.880
The United States of America will never allow a radical left woke company to dictate how our great military fights and wins wars.

00:00:09.199 --> 00:00:15.439
So wrote the president of the United States on Truth Social the week before the USA went to war in Iran.

00:00:15.759 --> 00:00:16.640
Welcome back.

00:00:16.800 --> 00:00:20.640
This is the AI Transition, and I'm Stephen, and with me as always, Lauren.

00:00:21.120 --> 00:00:24.079
Stephen, woo-hoo! What a lead-in! Oh my god.

00:00:24.559 --> 00:00:27.039
Just another sort of small topic for us to cover today.

00:00:27.519 --> 00:00:30.960
Look, there's not much out there on this one at the moment, so it's gonna be tricky.

00:00:32.320 --> 00:00:42.000
So obviously this is in the news all over the place. We're literally seeing a war getting played out, after what happened a couple of weeks ago with Anthropic.

00:00:42.159 --> 00:00:44.719
Our plan is to uh recap the background.

00:00:45.600 --> 00:00:48.159
Then how the government said no.

00:00:49.759 --> 00:00:50.640
Computer said no.

00:00:50.960 --> 00:00:53.840
Computer said no to government says no.

00:00:54.079 --> 00:00:54.880
Government says no.

00:00:55.119 --> 00:01:01.200
The fight back from the computer dudes, and then potentially something you can do about it all as well.

00:01:02.799 --> 00:01:03.840
Computer says no, gummy.

00:01:04.079 --> 00:01:04.480
Well, that's true.

00:01:04.560 --> 00:01:05.040
What's just this?

00:01:05.120 --> 00:01:05.519
Who says no?

00:01:07.040 --> 00:01:07.920
Exactly.

00:01:08.560 --> 00:01:11.280
Well, I think we should maybe say no while we can, to be honest.

00:01:11.359 --> 00:01:13.200
So that sounds good to me.

00:01:13.920 --> 00:01:19.040
So maybe if we just quickly recap what happened in the previous episode and the lead up to this.

00:01:19.280 --> 00:01:19.519
Sure.

00:01:19.599 --> 00:01:22.480
So in our last episode, Computer Says No.

00:01:22.640 --> 00:01:37.200
We talked a little bit about, well, not a little bit, a lot about Anthropic's 23,000-word constitution for its AI model, Claude, which was designed to embed core ethics and values directly in the model rather than a hard set of arbitrary rules, right?

00:01:37.280 --> 00:01:42.959
And we dove into this awesome badass philosopher they had working with the team, who helped develop that framework.

00:01:43.120 --> 00:01:48.719
And it was specifically there to prevent blind obedience, even to government operators.

00:01:48.959 --> 00:01:52.319
And then we saw things get really interesting with the Pentagon, right?

00:01:52.560 --> 00:01:57.840
So the US Department of War. It still blows my mind that that's what it's called now. I know, I know.

00:01:58.239 --> 00:01:59.840
It used to be Defense, and now it's War.

00:02:00.239 --> 00:02:16.719
Um, they approached Anthropic, right, saying: hey, we've got a contract for you, but we need blanket permission to use Claude for mass surveillance of American citizens, and also for autonomous weapon systems that could fire without having a human in the loop.

00:02:16.800 --> 00:02:17.360
And guess what?

00:02:17.439 --> 00:02:20.240
That didn't really align with the AI constitution.

00:02:20.479 --> 00:02:20.639
Yeah.

00:02:21.039 --> 00:02:23.120
Fundamental to how Claude is meant to work.

00:02:23.439 --> 00:02:27.520
Look, in fairness, and I can't believe I'm saying in fairness to the Department of War.

00:02:27.680 --> 00:02:34.639
The pushback was that they said they wouldn't have anyone telling them what to do at all.

00:02:34.879 --> 00:02:35.199
Right.

00:02:35.360 --> 00:02:42.080
So yes, these are two red lines, but the point for the Department of War was: no, no, no, you don't get to decide that.

00:02:42.240 --> 00:02:43.120
We decide that.

00:02:43.199 --> 00:02:48.159
We might not even do that anyway, but it's us that makes that call, not you.

00:02:48.960 --> 00:02:53.120
So it's a real power battle that was going on.

00:02:53.439 --> 00:02:56.000
So that's where we left it a couple of weeks ago.

00:02:56.240 --> 00:03:08.000
And what happened next was that the February 27th deadline the Department of War had set passed, and Secretary Pete Hegseth, lovely man, designated Anthropic a prohibited supplier.

00:03:10.479 --> 00:03:10.639
Indeed.

00:03:11.599 --> 00:03:15.280
Indeed, you know, that's kind of top-of-the-tree talent there.

00:03:15.360 --> 00:03:18.479
Um that's where you go when you're thinking about your military strategy.

00:03:18.800 --> 00:03:22.639
It is, it is. And also, where do you go after you've been a news presenter?

00:03:22.719 --> 00:03:29.840
I mean, if you're in charge of the Department of War for, you know, the superpower of the world, what are you going to do next?

00:03:30.000 --> 00:03:31.199
I mean, where is the next step?

00:03:32.000 --> 00:03:32.159
Yeah.

00:03:32.319 --> 00:03:34.080
Where can we go next ourselves, Stephen?

00:03:34.159 --> 00:03:35.039
That begs the question.

00:03:35.360 --> 00:03:38.400
How did I rope us into that comparison? That was bad.

00:03:43.439 --> 00:03:45.360
So they set a deadline.

00:03:45.520 --> 00:03:46.960
That deadline passed.

00:03:47.199 --> 00:03:54.000
And they were designated, although it's a little bit ambiguous whether they were or weren't.

00:03:54.240 --> 00:03:56.879
Do you believe the Truth Socials?

00:03:57.280 --> 00:04:07.039
But this designation gives the Secretary of War the authority to exclude a company entirely from competing for military contracts or subcontracts.

00:04:07.120 --> 00:04:09.199
So basically kicking them out of government, right?

00:04:09.280 --> 00:04:19.920
For a company that's basically a B2B company, that's pretty serious: one of your major clients, the United States government, is saying you're now a risk.

00:04:20.079 --> 00:04:28.560
And uh that label has only ever been used in the past for foreign adversaries and non-US companies, which is wild.

00:04:28.879 --> 00:04:30.000
Oh, again, mind-blowing.

00:04:30.079 --> 00:04:33.360
It just sounds a little bit like this whole little tariff game that gets played as well.

00:04:33.519 --> 00:04:40.480
Here's the version for you, and we're gonna take you out of all the government procurement opportunities and shut you down that way.

00:04:40.720 --> 00:04:46.639
And did Trump seriously, you know, have a big meltdown on Truth Social?

00:04:46.959 --> 00:04:47.759
There's the words right there.

00:04:47.839 --> 00:04:49.439
Oh, we're going deep in this, aren't we?

00:04:49.600 --> 00:04:54.319
You know, he threatened the full power of the presidency and said he's gonna fire them like dogs.

00:04:54.480 --> 00:04:55.600
How do you fire a dog, by the way?

00:04:56.079 --> 00:04:57.920
Actually, uh why would you fire a dog?

00:04:58.000 --> 00:04:58.800
I don't know.

00:05:00.399 --> 00:05:02.560
Well, it depends how obedient they are, huh?

00:05:03.680 --> 00:05:07.120
Sorry, and I don't want to go there, but is that shoot them like dogs, not fire them?

00:05:07.839 --> 00:05:08.959
But sorry, Lauren.

00:05:09.759 --> 00:05:11.839
I know you're a cat guy, but oh my god.

00:05:13.040 --> 00:05:26.240
Um, yeah, like, you know, he talked about Anthropic's attempt to actually enforce these terms of service, which by the way are meant to be good, solid ethics, not murdering a bunch of people, as a disastrous mistake.

00:05:26.399 --> 00:05:29.519
Like, it's really ironic with what we're seeing going on in Iran.

00:05:29.839 --> 00:05:30.000
Yeah.

00:05:30.240 --> 00:05:35.120
And let's recap what he actually said, which we played at the top of the show as well.

00:05:35.360 --> 00:05:45.199
He said: the United States of America will never allow a radical left woke company to dictate how our great military fights and wins wars.

00:05:45.839 --> 00:05:52.480
We're going to let the Fox News guy dictate how we wage war instead.

00:05:53.120 --> 00:05:58.879
There's times when it feels like we're living in a sci-fi movie, and to be honest, it doesn't sound like a very good sci-fi movie.

00:06:00.639 --> 00:06:02.720
How are they going to turn this around in the third act?

00:06:03.040 --> 00:06:06.160
Yeah, I don't quite believe this one, to be honest.

00:06:06.399 --> 00:06:22.240
But then things moved on: there was a memo, which was leaked, interestingly, saying all military commanders had to rip Anthropic's technology out of all nuclear missile systems and cyber systems in 180 days.

00:06:22.800 --> 00:06:25.120
Well, we know how easy it is to decommission systems, Lauren.

00:06:25.279 --> 00:06:26.160
I mean, we've done this all our life.

00:06:26.879 --> 00:06:27.360
Just a breeze.

00:06:27.680 --> 00:06:31.120
I mean, you basically just go in and unplug it, and then that's it, it's done.

00:06:31.360 --> 00:06:32.720
And 180 days, right?

00:06:32.879 --> 00:06:34.000
So about six months.

00:06:34.240 --> 00:06:40.079
And we're talking about nuclear systems, drones, firefight, you name it.

00:06:40.240 --> 00:06:44.399
Oh, it's warfighters, mission-critical activity.

00:06:46.240 --> 00:06:46.879
Yes.

00:06:47.120 --> 00:06:47.680
Yes.

00:06:47.920 --> 00:06:53.519
So the Department of War Chief Information Officer. I know I'm kind of laughing at this, but it's my defense mechanism.

00:06:54.959 --> 00:06:56.399
Because this isn't funny stuff, right?

00:06:56.560 --> 00:07:04.079
Kristen Davis said on March the 6th that these vulnerabilities in AI pose catastrophic risks to warfighters.

00:07:04.319 --> 00:07:09.680
So the catastrophic risks that she's alluding to are these ethical boundaries.

00:07:09.839 --> 00:07:11.600
And that's a catastrophic risk.

00:07:11.920 --> 00:07:15.199
And so they've got 180 days in order to rip these systems out.

00:07:15.680 --> 00:07:17.439
Again, it's just mind-blowing.

00:07:17.519 --> 00:07:27.439
And she is, you know, the only official authorized by the government, if we can call it that (oh no, getting more and more controversial), to grant exemptions, right, for these mission-critical activities.

00:07:27.680 --> 00:07:29.360
That is, where there's no viable alternative.

00:07:29.519 --> 00:07:34.160
So she's made this call, the government's standing behind it, 180 days to pull this out.

00:07:34.959 --> 00:07:44.639
Meanwhile, by all accounts, and some of this is obviously held behind secrecy and the rest of it, the Pentagon has been using Claude.

00:07:45.040 --> 00:07:49.279
They used Claude all over the Venezuela raids going after Maduro, supposedly.

00:07:49.439 --> 00:07:55.040
Um by all accounts, Claude is fully embedded for identifying targets in Iran just now.

00:07:55.199 --> 00:07:57.279
So it's not like this isn't getting used.

00:07:57.360 --> 00:08:01.680
And depending on where you stand on this, you know, is that use ethical or not?

00:08:01.759 --> 00:08:07.439
I mean, Anthropic set the boundaries at mass surveillance of the American public.

00:08:07.759 --> 00:08:12.720
Although I'm not sure if it's the public in general or just the American public. I'll have to go and double-check that one.

00:08:12.959 --> 00:08:13.199
Right.

00:08:13.279 --> 00:08:23.120
And whether the AI systems have that final kill switch, where you don't need a human in the loop. But all the way up to that, they absolutely are getting used just now.

00:08:23.439 --> 00:08:27.759
Um so you know, big big stuff.

00:08:28.079 --> 00:08:28.639
Big stuff.

00:08:28.800 --> 00:08:38.720
And again, this is just the little we can garner from what's out there in the news as to what's really going on, how deeply it's being used, and what it's actually being used for as well.

00:08:39.120 --> 00:08:42.960
So this then sparked somewhat of a fightback.

00:08:43.120 --> 00:08:46.399
And there is a massive lawsuit that Anthropic has now filed.

00:08:46.559 --> 00:08:49.440
Um I had a brief look over the actual lawsuit itself.

00:08:49.519 --> 00:08:53.840
Okay, I took the lawsuit and I threw it into Claude and I got him to summarize it for me.

00:08:54.000 --> 00:08:54.320
Right.

00:08:54.399 --> 00:08:57.120
And it's really, really interesting.

00:08:57.200 --> 00:09:00.240
I mean, this is First Amendment stuff, and this is unprecedented.

00:09:00.320 --> 00:09:06.799
As we said, you know, no other organization in the history of America has ever been um hauled over the coals like this.

00:09:06.879 --> 00:09:14.639
They've done it to companies like Huawei and other Chinese companies, et cetera, but they've never done this to an American company itself.

00:09:15.039 --> 00:09:20.240
Uh and so there's a massive lawsuit that's now going through the courts, but you know how long these things take.

00:09:20.399 --> 00:09:25.200
I mean, it won't even get through the first rung of the courts in the next three to six months, probably.

00:09:25.360 --> 00:09:27.279
So but we're fighting back.

00:09:27.919 --> 00:09:39.600
Yeah, and again, it's all arguing that the government's actions are unconstitutional because they're explicitly designed to punish Anthropic for communicating its protected viewpoints on safety.

00:09:39.759 --> 00:09:45.360
So when that constitution was discovered, people started to realize: hang on a second, isn't that running inside the military?

00:09:45.600 --> 00:09:55.039
Then this, you know, forced things to come to a head in terms of how much the government can control this software that's actually got ethics, if we can call it that, built into it.

00:09:55.519 --> 00:10:13.440
So I actually found it interesting that Anthropic called it a constitution, because I've heard this now in a number of different podcasts and in reading material: the American military has found this constitution within this model, and it's not the American constitution.

00:10:13.679 --> 00:10:14.559
So how dare they?

00:10:14.799 --> 00:10:15.600
So how dare they?

00:10:15.759 --> 00:10:20.879
Because it should be the American constitution that's at the base of this, and not this Anthropic constitution.

00:10:21.600 --> 00:10:28.159
Interesting play on words, and it plays to that kind of populist bent: people are going to say, well, of course it should be the American constitution that's in there.

00:10:28.559 --> 00:10:37.840
And that, of course, just forgets the global nature of what we're dealing with here, and the arrogance of our friends.

00:10:38.080 --> 00:10:38.159
Yes.

00:10:40.960 --> 00:10:42.879
Well, this is part of the fighting back.

00:10:43.039 --> 00:10:43.440
We're fighting back.

00:10:43.600 --> 00:10:44.159
Okay, we're fighting back.

00:10:44.240 --> 00:10:46.000
This is part of the fighting back, yeah.

00:10:46.399 --> 00:10:47.360
It's part of us fighting back.

00:10:47.679 --> 00:11:00.480
But literally on the same day that the deadline passed, and this was Friday, the 27th of February, Sam Altman, you know, a couple of hours afterwards came on and said: hey, don't worry, we've taken the government contract.

00:11:00.720 --> 00:11:01.360
We'll do it.

00:11:01.600 --> 00:11:02.639
We'll do it.

00:11:02.960 --> 00:11:06.159
And there was quite a lot of jaw-dropping at that one that went on.

00:11:06.879 --> 00:11:09.600
But with the same boundaries, like: hey guys, we're gonna do it.

00:11:09.840 --> 00:11:12.720
No, we're not gonna do anything unethical, but we'll do it.

00:11:12.960 --> 00:11:20.639
So I don't quite understand how this works, because they say: we haven't actually dropped any of our safety principles to secure this contract.

00:11:20.879 --> 00:11:23.519
So the ethics were not there before.

00:11:23.600 --> 00:11:29.120
Well, actually, the argument is that this is exactly what Anthropic were asking for.

00:11:29.200 --> 00:11:31.440
That's what OpenAI has now signed.

00:11:31.600 --> 00:11:35.120
Now, a lot of people are incredibly skeptical on that.

00:11:35.279 --> 00:11:43.039
But maybe it's because, and I'm looking back to the the the notes here, maybe it's because that OpenAI isn't a radical left-woke company.

00:11:43.200 --> 00:11:43.360
Right?

00:11:43.600 --> 00:11:43.759
Right.

00:11:43.840 --> 00:11:43.919
Yeah.

00:11:44.879 --> 00:11:50.320
They didn't come out with this ethical nonsense, with a philosopher trying to teach its system some values.

00:11:50.639 --> 00:11:50.960
Correct.

00:11:51.120 --> 00:11:54.080
A Scottish philosopher as well, just to top it off. Oh well, you're the problem, then.

00:11:54.960 --> 00:11:55.840
There it is, right there.

00:11:57.279 --> 00:12:00.159
Um, and you know, more part of that.

00:12:00.240 --> 00:12:03.440
So then there was a backlash against OpenAI itself.

00:12:03.679 --> 00:12:04.320
There's a shock.

00:12:04.879 --> 00:12:10.480
I think it's also the human side of it: you start to realize, hang on a second, I have this on my phone, I'm paying a subscription for it.

00:12:10.639 --> 00:12:12.240
I'm kind of contributing here.

00:12:12.639 --> 00:12:13.120
Yes.

00:12:13.360 --> 00:12:14.720
Yes, get to it in a minute.

00:12:15.200 --> 00:12:22.240
So it turns out that the OpenAI president, Greg Brockman, had made a $25 million donation to Trump.

00:12:22.559 --> 00:12:25.120
So it's like maybe this is something to do with it.

00:12:25.360 --> 00:12:30.159
Maybe this is a small reason why OpenAI might have got the contract and Anthropic didn't.

00:12:30.240 --> 00:12:38.240
Because, you know, surely that $25 million shows that they're not a radical woke-left organization.

00:12:38.559 --> 00:12:38.960
Yeah.

00:12:41.039 --> 00:12:41.600
Oh my god.

00:12:43.200 --> 00:12:47.919
And these numbers keep changing every time we look them up.

00:12:48.000 --> 00:12:49.039
But they doubled.

00:12:49.120 --> 00:12:53.360
I think we started to write this, you know, late last week and then jumped in and checked the numbers.

00:12:53.440 --> 00:13:00.080
And you've got 1.5 million people cancelling ChatGPT; it was 700,000 last week.

00:13:00.159 --> 00:13:01.759
So it's doubled since we last checked.

00:13:01.840 --> 00:13:09.360
Behind this there was a big campaign basically saying cancel your subscription, because ChatGPT is a subscription-based model.

00:13:09.840 --> 00:13:10.320
Yes.

00:13:10.559 --> 00:13:12.399
So this is where your money's going to fund it.

00:13:12.559 --> 00:13:20.240
So lots of public outrage, and big celebrities getting behind this and trying to, you know, stir the public's interest.

00:13:20.399 --> 00:13:22.240
You've even got some civil rights groups.

00:13:22.320 --> 00:13:27.919
I think there's a couple we've quoted there, Common Cause, Young Americans for Liberty, there's a great name.

00:13:28.159 --> 00:13:33.039
And they're sending letters to Congress saying: hey, we need to actually halt the use of this altogether.

00:13:33.200 --> 00:13:35.919
You know, AI for mass surveillance and autonomous weapons.

00:13:36.000 --> 00:13:44.399
You know, we we spoke about this, I think when we first started the pod, that real red flag when you start to see this mass surveillance, let alone in this situation.

00:13:44.639 --> 00:13:46.799
And now we've got the big quit-GPT push.

00:13:47.519 --> 00:13:54.879
And maybe... are these things just too big to fail now, and we're just nibbling at the elephant around the sides?

00:13:55.039 --> 00:14:01.279
And this is just going to play out with, you know, our lords and masters working this one out themselves.

00:14:01.440 --> 00:14:02.799
But you know, we'll see, right?

00:14:02.879 --> 00:14:06.159
This has obviously become very public.

00:14:06.399 --> 00:14:10.879
Hopefully, this is breaking through now into the mainstream, you know, of how important all this stuff is.

00:14:10.960 --> 00:14:12.639
And that's why we're kind of focusing on this.

00:14:12.960 --> 00:14:13.200
Absolutely.

00:14:13.279 --> 00:14:17.440
And obviously, you could go to a really dark place digging into all of it and trying to fix all of it.

00:14:17.519 --> 00:14:19.679
But it is quite empowering to know that you can have some say.

00:14:19.759 --> 00:14:23.519
You've got some kind of way that you can at least, you know, put your little bit in.

00:14:23.679 --> 00:14:27.440
Like you said, we're under no illusions as to how much of this we can solve.

00:14:27.600 --> 00:14:33.600
But it's important to know, when you're buying into these paradigms, where your money's actually going.

00:14:33.919 --> 00:14:34.080
Yeah.

00:14:34.320 --> 00:14:49.519
And you know, if this stands, the American president can basically just take a dislike to a company like that and destroy them. That's pretty serious from a free market point of view, for what happens to the software industry, et cetera.

00:14:49.600 --> 00:14:53.759
Or are we being naive, and, you know, this is just the way that it runs now, right?

00:14:53.840 --> 00:14:55.919
That this is the way it is now.

00:14:56.000 --> 00:14:59.039
Yeah, again, I think we were talking about this a little while ago.

00:14:59.200 --> 00:15:01.600
We probably weren't naive about it.

00:15:01.759 --> 00:15:10.960
We knew things would move fast, but in terms of what we've seen, the capability and functionality (can't use my words) over even the last month has been huge.

00:15:11.120 --> 00:15:22.639
And particularly now that you see front runners, and we'll talk about this in future pods, like Claude coming out with new capabilities, and how quickly all the competitors are rising to the occasion because they see that market opportunity.

00:15:22.960 --> 00:15:28.399
Well, why don't we quickly take a bit of a detour there and talk about those competitors and the different sorts of models that are out there?

00:15:28.480 --> 00:15:30.639
Because you're doing some really interesting research on that.

00:15:31.039 --> 00:15:32.559
Right, let's just touch on it, Steve.

00:15:32.639 --> 00:15:33.120
You're very kind.

00:15:33.279 --> 00:15:42.480
So, when we're thinking about how the government was so easily able to impact Anthropic: you've got all these different main players in the market at the moment, if we can call it such a thing.

00:15:42.559 --> 00:15:45.759
We've got the big ones being OpenAI, Anthropic, and Google.

00:15:45.840 --> 00:15:49.840
They're all running quite different businesses in terms of how they actually make their money.

00:15:50.000 --> 00:15:55.200
So you've got OpenAI, which is primarily a consumer subscription company, right?

00:15:55.360 --> 00:15:59.600
So they get 85% of their revenue from individual users.

00:15:59.759 --> 00:16:05.600
Um and it's actually a small percentage of those users that are paying a subscription that are actually keeping them afloat.

00:16:05.759 --> 00:16:07.360
But they're now pivoting, right?

00:16:07.440 --> 00:16:11.279
Because they've been trying to hit this huge revenue target out there in the market.

00:16:11.360 --> 00:16:15.279
So it's interesting times for OpenAI in terms of where it goes to next.

00:16:15.360 --> 00:16:20.960
And this is where you're starting to see the rise of them potentially putting ads in, which is a whole other thing.

00:16:21.440 --> 00:16:27.600
Well, and that was raised quite a lot, you know, a month or two ago, but um has quietly been pushed to the sides for a bit.

00:16:27.679 --> 00:16:28.960
So we'll see what happens there.

00:16:29.120 --> 00:16:35.759
But you know, that $125 billion revenue target: if it's $20 a month, that's a lot of users to get there.

00:16:35.919 --> 00:16:40.480
It's a lot of users, and you just lost a lot, you know, one and a half million pretty quickly.

00:16:40.879 --> 00:16:45.200
So then you've got Anthropic, and they're quite different because they're more of a B2B infrastructure company.

00:16:45.279 --> 00:16:49.679
So about 70% of their revenue is taken from API token consumption, right?

00:16:49.840 --> 00:17:00.159
So this is about 300,000-plus business customers, eight of the Fortune 10, and Claude Code hitting about $2.5 billion in nine months.

00:17:00.480 --> 00:17:05.359
I think it's really interesting the way that Anthropic is a different sort of company with different sorts of products.

00:17:05.440 --> 00:17:14.319
And since we've been getting much more into the Anthropic ecosystem recently, it probably works just a lot better from a business point of view as well.

00:17:14.559 --> 00:17:23.519
So it'll be interesting to see whether it's the consumer one or the business-led one that wins out. Or is it the dominant player, which is the last one, Lauren?

00:17:24.000 --> 00:17:24.400
Absolutely.

00:17:24.480 --> 00:17:37.359
And then third, you've got Google's Gemini, which is a bit... well, I think they call it a defensive play, where they're protecting their existing, what, $200 billion ad empire, which kind of blows my mind.

00:17:37.440 --> 00:17:41.680
But it makes sense when you think about the Google ecosystem and how it works.

00:17:42.000 --> 00:17:51.039
So in terms of government risk, where you've got Anthropic with that deep enterprise integration, it really takes a while to unwind those relationships.

00:17:51.359 --> 00:17:54.400
We've got these three main players, and there's other players that we've not talked about as well.

00:17:54.559 --> 00:17:59.839
You know, like Elon Musk's xAI, you've still got Meta on the sidelines, you've got the Chinese models.

00:18:00.000 --> 00:18:05.200
But you've got these three very different players that are playing in these different ways.

00:18:05.440 --> 00:18:08.880
And who knows which one is going to win out of this.

00:18:09.039 --> 00:18:16.640
Or maybe there's multiple. Maybe this ends up multipolar, rather than there being a single winner.

00:18:17.039 --> 00:18:19.359
We've seen kind of a big leap, dare I say, in the last few weeks.

00:18:19.519 --> 00:18:28.400
We've still had that kind of hype bubble around how we're actually gonna reduce the cost of doing business, and we're still not seeing those productivity gains out there.

00:18:28.640 --> 00:18:36.559
So it's still interesting to see, you know, such volatility in the press around what's happening in the Department of War.

00:18:36.720 --> 00:18:40.640
And then it's giving you that inkling of this capability and really where we're going.

00:18:41.200 --> 00:18:46.640
Why don't we take this down from the macro back down to kind of lower-level people like ourselves?

00:18:46.799 --> 00:18:49.519
Not that we are low-level people, but you know what I mean.

00:18:49.599 --> 00:18:55.200
As in, if we take our heads way out of the clouds and get down onto that kind of individual basis.

00:18:55.279 --> 00:19:11.119
Because I must admit, what I have seen over the last couple of months, and we're going to go into this in a lot more depth in the next couple of podcasts, is the real step change that's happened with these models, and in particular with Anthropic and Claude, which is the one that's really hit home for me.

00:19:11.519 --> 00:19:28.640
This move to an agentic workflow: this ability to stop using it like a search engine, just asking it a question and getting a response, or giving it one thing and getting it back, and instead giving it tasks, decent tasks as well, and it just going away and doing it. And it works.

00:19:30.720 --> 00:19:33.519
It's a big step change that's been going on.

00:19:33.920 --> 00:19:52.880
And I think, you know, for us, and I guess this is what we'll talk to now, we're starting to look at migrating away from ChatGPT. Like you said, it's not just using it like a super smart Google; it's a different game, thinking about what you actually want these agentics (I've got to love that word) to return for you, and what they can do.

00:19:53.039 --> 00:20:01.279
So we're starting to see that next leap now where you've almost got to consciously go, hang on a second, do I really need to go in, open up all these different screens, copy and paste stuff from here and there?

00:20:01.440 --> 00:20:05.599
Where could I actually send an agent off to do my bidding and bring it back?

00:20:05.759 --> 00:20:08.480
That's the big leap that I've been really excited about.

00:20:09.039 --> 00:20:19.440
I mean, it doesn't sound like much, basically, to say: well, I'm going to have an agent in the middle that will go and copy from here, scrape it over there, open that there, write this file there, and then do that.

00:20:19.599 --> 00:20:27.359
But it actually is because you know there's a lot of skill usually involved in all those different steps and what happens and tying it together.

00:20:27.599 --> 00:20:36.000
Whereas the skill now is working out what it is you actually want, and being able to tell this agent: can you go and do this for us?

00:20:36.079 --> 00:20:44.720
And it will go and do the four or five or ten or fifty or, you know, a hundred steps, and then come back with the output.

00:20:44.960 --> 00:20:48.000
That's a significantly different way of interacting.
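
For the technically curious, this is roughly the shape of that loop. Below is a minimal sketch using the Anthropic Python SDK; the write_file tool, the model name, and the task are illustrative assumptions for this page, not the internals of Cowork or any other product.

```python
# Minimal agent loop sketch: the model decides the steps, we run the tools.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# One pretend tool; a real agent would expose several (search, scrape, files...).
tools = [{
    "name": "write_file",
    "description": "Write text to a local file.",
    "input_schema": {
        "type": "object",
        "properties": {"path": {"type": "string"}, "content": {"type": "string"}},
        "required": ["path", "content"],
    },
}]

def run_tool(name, args):
    # The 'four or five or a hundred steps' happen here, one call at a time.
    if name == "write_file":
        with open(args["path"], "w") as f:
            f.write(args["content"])
        return f"wrote {args['path']}"
    return f"unknown tool: {name}"

messages = [{"role": "user", "content": "Summarise these notes and save them to notes.txt"}]
while True:
    response = client.messages.create(
        model="claude-sonnet-4-5",  # assumption: use whichever current model you have
        max_tokens=1024,
        tools=tools,
        messages=messages,
    )
    if response.stop_reason != "tool_use":
        break  # the agent is done and has come back with its output
    # Record the assistant turn, run each requested tool, and feed results back.
    messages.append({"role": "assistant", "content": response.content})
    results = [{"type": "tool_result", "tool_use_id": b.id, "content": run_tool(b.name, b.input)}
               for b in response.content if b.type == "tool_use"]
    messages.append({"role": "user", "content": results})

print(response.content[0].text)  # the final answer, after however many steps
```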

00:20:48.400 --> 00:20:51.440
Exactly, that true productivity saving that we've all been looking for.

00:20:51.599 --> 00:20:52.000
Yes.

00:20:52.240 --> 00:20:52.559
You know?

00:20:52.799 --> 00:20:57.279
Um not just stealing it from the creatives where oh my god, look at my beautiful new graphic I've created.

00:20:57.440 --> 00:20:58.480
Oh, this song that I wrote.

00:20:58.559 --> 00:21:01.200
So now I'm getting really into my dark place.

00:21:01.519 --> 00:21:10.160
Um but we've both looked at making that move from ChatGPT off the back of this, you know, huge blowback in terms of what we've seen with the Department of War.

00:21:10.400 --> 00:21:14.480
And just seeing what Claude has come out with in Cowork has been mind-blowing.

00:21:14.640 --> 00:21:26.559
So we thought we'd just take a few minutes to talk about how easy it is to actually migrate from ChatGPT. Even just in the names, going from ChatGPT to something called Claude and Cowork is quite the game changer.

00:21:27.119 --> 00:21:28.640
And it's really straightforward.

00:21:28.720 --> 00:21:39.200
So to ease the concerns of anyone listening to this: if you do want to bring a lot of the stuff that you were doing over in ChatGPT into Claude, there's literally a button now.

00:21:39.279 --> 00:21:47.200
So once you sign up to Claude and you go into the settings, there's an import button saying import from a different LLM.

00:21:47.359 --> 00:22:08.240
You press it, and it gives you a prompt to go and put into the other one. That spews out all this stuff, which you then copy back into Claude, and it goes away and processes it. Eventually, you know, 10 minutes, half an hour, an hour, all day depending how much stuff you've got there, it will chew through all of that.

00:22:08.400 --> 00:22:11.119
And then basically most of that memory is there.

00:22:11.200 --> 00:22:19.920
I mean, there are deeper exports that you can do with full data dumps, but just that simple one I've found enough, to be honest, for what I needed.
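
For anyone who does want the deeper route, here's a minimal sketch of poking through one of those full data dumps. It assumes the official ChatGPT account export, which arrives as a zip containing a conversations.json; the filename and field names below reflect how those exports have looked rather than any documented contract, so treat them defensively.

```python
# Sketch: list every conversation in a ChatGPT data export, with message counts.
import json
import zipfile

with zipfile.ZipFile("chatgpt-export.zip") as zf:  # hypothetical download name
    conversations = json.loads(zf.read("conversations.json"))

for convo in conversations:
    title = convo.get("title") or "(untitled)"
    # Each conversation is a graph of nodes under "mapping"; not every node
    # carries an actual message, so count only the ones that do.
    nodes = convo.get("mapping", {}).values()
    n_messages = sum(1 for node in nodes if node.get("message"))
    print(f"{title}: {n_messages} messages")
```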

00:22:20.240 --> 00:22:22.400
And it was really interesting what I was actually pulling out.

00:22:22.559 --> 00:22:28.799
Like, you kind of think, when I'm migrating from one tool to another, I'm just going to extract all of the data: all of your projects, all of your queries.

00:22:28.960 --> 00:22:35.759
No, it's actually more focused on pulling out how you think and what you've been using the tool for, which was really fascinating in and of itself.

00:22:36.000 --> 00:22:42.319
I found out a bit about myself, because it's been about two years or something that I've been using it now.

00:22:42.480 --> 00:22:44.880
And what it knew about me and what was in there.

00:22:45.039 --> 00:22:48.480
And and as I was skimming over the file and it was going, Oh god, I'd forgotten about that.

00:22:48.559 --> 00:22:49.200
And it knows it.

00:22:49.440 --> 00:22:53.200
It even knows I've got a veggie patch. Who would have thought? Exactly, right?

00:22:53.279 --> 00:22:59.119
So all of these things, and how much we are sharing in here, that's just part of this.

00:22:59.279 --> 00:23:03.519
So I mean, there's a whole separate conversation to have about you know data privacy and what's going on.

00:23:03.680 --> 00:23:15.119
But this ability to migrate from one platform to another is mostly a good thing. In this respect, for an individual at least; for an organization, very different.

00:23:15.200 --> 00:23:18.799
But for an individual, it's actually trivial, right?

00:23:18.880 --> 00:23:20.000
It's so easy.

00:23:20.400 --> 00:23:23.279
Two screens next to each other, a bit of copying and pasting.

00:23:23.599 --> 00:23:30.160
Yeah, you can do it while you're watching the TV, just you know, click, click, let it, you know, whir away, and then you're off and running.

00:23:30.400 --> 00:23:53.359
So that's the migration over. And then when you go into Claude, and we'll go into this a lot more in future episodes, you've got a thing called Cowork, which is that more agentic, business-focused side, and this thing called Claude Code, which basically allows you to become a coder through vibe coding. And getting into that, which I've been doing a lot of recently, it's really, really powerful. Like, incredibly powerful.

00:23:53.440 --> 00:23:59.119
And it's those moments now we're having of going, oh I think there probably is something in this.

00:23:59.279 --> 00:23:59.759
This is big.

00:24:00.319 --> 00:24:07.519
And even the free version, like, I think we've both invested in it a little bit, you know, the next level up, but even the free version has so much functionality in it.

00:24:07.680 --> 00:24:11.839
I think it's huge in terms of just even, you know, chatting to your buddies around the traps.

00:24:11.920 --> 00:24:18.640
Like some of them are like, oh, I whipped up a website, or, I've used it for my research papers as I'm finishing my degree, all on the free subscriptions.

00:24:18.720 --> 00:24:19.519
It's quite powerful.

00:24:19.599 --> 00:24:26.400
And they keep... I think they've just come out in the last 24 hours doubling some access and availability for people too.

00:24:26.640 --> 00:24:27.039
Yeah, they have.

00:24:27.359 --> 00:24:34.160
They're obviously trying to ride this wave and bring people in, and they keep dropping new functionality all the time.

00:24:34.319 --> 00:24:48.720
There was one earlier this week where, if you want it to teach you something, you say: I want to be trained on blah blah. And one thing I tried was: well, teach me about compound interest. You know, really exciting, wonderful stuff, right?

00:24:48.880 --> 00:24:50.480
Uh a whole podcast in that one.

00:24:51.279 --> 00:24:52.960
Come back next week, everyone, compound interest.

00:24:53.279 --> 00:24:53.680
Indeed.

00:24:53.920 --> 00:25:04.720
But what it came up with, on the fly, based on how I wanted it to look, was a little training course: an interactive website, an explanation of what was going on, some graphs, blah blah.
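
For the impatient who can't wait for that episode, the formula behind those graphs is the standard compound-interest one, with P the starting amount, r the annual rate, n the compounding periods per year, and t the number of years:

```latex
A = P\left(1 + \frac{r}{n}\right)^{nt}
% e.g. $1,000 at 5% compounded monthly for 10 years:
% A = 1000\,(1 + 0.05/12)^{120} \approx \$1{,}647
```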

00:25:05.039 --> 00:25:14.400
I've done this for decades, creating training courses, and all of that's gone, because now you literally say: can you just teach me how to do this?

00:25:14.559 --> 00:25:17.279
And it's off and running and and it's creating stuff.

00:25:17.440 --> 00:25:21.599
And it's not just text, it's images, it's interactive sites.

00:25:21.759 --> 00:25:24.400
It's incredibly powerful now.

00:25:24.799 --> 00:25:25.119
Incredible.

00:25:25.200 --> 00:25:40.559
And that's off the back of tools like NotebookLM. It is shocking, like, Stephen, you think back to the many months and sometimes years it would take to get these transformations and training programs built, and now there's so much power at the tips of your fingers.

00:25:40.720 --> 00:25:50.400
Obviously, we know there's going to be some percentage of errors in there, but in terms of tapping into your creative mind and what you can understand in the world, it's huge.

00:25:50.640 --> 00:25:59.279
And what I'd encourage people to hear is this: if you used this a year ago, or two years ago, or even three to six months ago, and went, eh, it's not very good...

00:25:59.519 --> 00:26:09.680
In fact, I was out the other night, and two of my good friends were laughing about how AI had failed at the one task they gave it that day, and how that just proves AI isn't there.

00:26:09.839 --> 00:26:10.559
Right once everyone.

00:26:11.039 --> 00:26:14.799
And it's like, okay, but there's all of these tools over here.

00:26:15.200 --> 00:26:23.519
There has been a step change in recent months, and I would encourage people to go and explore the new tools that are out there, and in a different way as well.

00:26:23.680 --> 00:26:28.000
So don't just use this as a search agent, you know, like a Google search.

00:26:28.240 --> 00:26:32.240
This is a different sort of thing. It's daunting too, with how simple and open it is.

00:26:32.400 --> 00:26:34.960
I know my partner was doing some research the other day.

00:26:35.119 --> 00:26:38.640
I'm like, oh, you should try, you know, this particular tool for that.

00:26:38.720 --> 00:26:49.599
And it was just a little bit mind-blowing: hey, hang on a second, it just allowed me to whip up a whole podcast of my own in about 10 minutes, based off a handful of sources and links that you throw in.

00:26:49.759 --> 00:26:54.799
Like, you're starting to see, I guess, a convergence in how some of these tools work together.

00:26:54.960 --> 00:27:00.400
And once you start to get to know tools like Claude, and there is a bit of a different style there with the way it interacts.

00:27:00.559 --> 00:27:04.559
I don't know, maybe I'm a little bit biased with what's happening, but there's a bit more of a warmth there.

00:27:04.720 --> 00:27:07.839
Am I being corny, Stephen, in interacting with Claude?

00:27:08.480 --> 00:27:16.720
Or maybe, you know, I've just got more affinity for these kind of woke radical left organizations.

00:27:16.960 --> 00:27:17.279
I don't know.

00:27:17.680 --> 00:27:19.200
That's the problem right there, isn't it?

00:27:19.440 --> 00:27:19.759
Yeah.

00:27:19.920 --> 00:27:24.799
So it's wide open in terms of the creative side of it, but it is a shift in terms of how you work.

00:27:24.880 --> 00:27:25.519
Anyway, we've gone.

00:27:25.599 --> 00:27:26.480
We've gone segue.

00:27:26.559 --> 00:27:27.680
Either way, dig into it.

00:27:27.839 --> 00:27:30.480
It's very easy to move from one of these tools to another.

00:27:30.640 --> 00:27:33.920
Um get in there, get your hands dirty, play around with some free ones.

00:27:34.079 --> 00:27:35.119
Um and yeah.

00:27:35.599 --> 00:27:40.000
So look, we started off with world war, so why don't we end with a bad dad joke, Lauren?

00:27:40.240 --> 00:27:41.359
Surely you must have one for us.

00:27:41.759 --> 00:27:47.519
Stephen, I do think there's some hope for us, because right now the AI dad jokes are not even in the game.

00:27:47.599 --> 00:27:47.920
They're terrible.

00:27:48.160 --> 00:27:48.720
They're hopeless.

00:27:48.799 --> 00:27:48.880
Yeah.

00:27:49.039 --> 00:27:50.799
So I've leaned on some of my old classics.

00:27:51.200 --> 00:27:53.119
Um, here's what I've come up with for you.

00:27:53.200 --> 00:27:54.799
Uh you be the judge.

00:27:55.039 --> 00:28:00.160
Did you hear about the agent who uh combined all the books ever written into one big novel?

00:28:00.400 --> 00:28:01.200
Uh no.

00:28:01.759 --> 00:28:03.519
But yeah, it's a long story.

00:28:03.839 --> 00:28:04.559
Oh dear.

00:28:04.799 --> 00:28:05.359
Oh dear.

00:28:05.599 --> 00:28:06.559
Excellent, excellent.

00:28:06.720 --> 00:28:08.799
So we're still gonna survive.

00:28:08.880 --> 00:28:10.559
There's still a niche for us now, Lauren.

00:28:10.880 --> 00:28:14.319
Oh, look, as long as you need bad dad jokes, everyone, I'm available.

00:28:15.519 --> 00:28:16.720
Look, that was a lot.

00:28:16.799 --> 00:28:21.680
Uh look, next time we're going to explore things more at the individual and company level.

00:28:21.759 --> 00:28:24.559
There's a lot of interesting parts that we're going to delve into.

00:28:24.720 --> 00:28:30.799
Uh, and particularly what's going on with teams these days as well, you know, which is at the heart of you know what most people work in.

00:28:30.880 --> 00:28:33.680
We all work in teams and how AI is going to affect them.

00:28:33.920 --> 00:28:37.680
But look, if this has got you thinking, share it with somebody who needs to hear it.

00:28:37.839 --> 00:28:39.599
Subscribe wherever you're watching or listening.

00:28:39.680 --> 00:28:42.960
It genuinely helps us keep the channel going.

00:28:43.119 --> 00:28:44.559
And we'll see you next time.

00:28:44.720 --> 00:28:48.480
But remember, you still matter, at least for just now.

00:28:48.720 --> 00:28:49.599
Thanks, David.

00:28:50.079 --> 00:28:51.200
Bye, Lauren.