
The biggest stories in privacy, AI, and tech this year (so far) with John Maloney

What have the major tech, privacy and governance themes been in 2025 so far? What can we expect from the second half of the year? And will we hear the terms “DOGE” and “agentic AI” less, or more, from here on?

In this mid-season episode of FILED, Kris Brown and Anthony Woodward sit down with John Maloney, a barrister, Melbourne Law School lecturer, and co-host of the podcast Don't Praise the Machine.  

John brings a legal and philosophical lens to some of 2025’s most pressing questions about DOGE, AI, data ownership, and the challenge of regulating emerging technologies.

They also discuss:

  • What makes agentic AI legally challenging
  • The collapse of US AI regulation proposals
  • Data ownership, privacy, and the philosophy of title
  • Ethical gaps in government access to citizen data
  • The role of AI in the future of law
  • Cultural shifts around chatbots and emotional attachment


Transcript

Anthony: [00:00:00] Welcome to FILED, a monthly conversation with those at the convergence of data privacy, data security, data regulation, and AI governance. I'm Anthony Woodward, CEO of RecordPoint, and with me today is my co-host Kris Brown, our Executive Vice President of Partners, Evangelism, Solution Engineering, executive nurse.

And I believe sound engineering today.

Kris: I have absolutely been playing with the script. Well read. Very good. We'll see how many more executives I can throw in there. But hey mate, how are you? We're all over the place today. Yeah,

Anthony: no, I'm in Seattle today, it's summer here in North America.

I happened to escape what I think was a large weather event in Australia and the cold down there. So, kind of enjoying the weather here at the moment.

Kris: Yeah, she's been a bit blowy, that's for sure. There's been a little bit of wind. I won't suggest that it's cold in Brisbane 'cause it doesn't really get cold per se, but yes, there are definitely lots of people walking around in coats today.

Anthony: No, I don't think we had enough continents. So, our guest today, John Maloney, is actually [00:01:00] joining us from Europe. How are you, John? I'm well, thanks. And whereabouts in Europe are you today?

John: So, I'm in the town of Lugano in Ticino in southern Switzerland. It's, well, just after 11 o'clock at night here, but it's still very warm.

But I'm glad to see that the blue polo shirt seems to be a garment for all occasions.

Anthony: We need to get you a...

Kris: Yeah, we definitely gave you the memo. Well done.

Anthony: This is not branded, so I do think we need to get you a RecordPoint one just to make sure we're all good for next time. Love one. But appreciate you making it on, and appreciate you getting the shirt choice right.

John, you are a podcaster, a barrister, a Melbourne Law School lecturer, a writer, and even a fellow podcaster like ourselves. I'm not even sure Kris and I can call ourselves podcasters. So, you are a podcaster; we are just pretend ones. But thank you for coming on today's show.

John: My pleasure. It's great to be here.

Kris: John, just for the audience, maybe give us a little bit of background on sort of who you are and how you came to be here. I know Anthony gave us the read-through there, but how did you come to be here? And given that we're an information governance audience, we love a little bit of tech, a little bit of AI.

We're really [00:02:00] interested in data. So maybe give us a bit of a spiel for yourself here, for the audience, on how you came to be. For sure, I've had a good listen through some of the pods and I will agree that we are definitely pretenders. And certainly some of the topics that you cover are a little bit of fun too.

I did have a good chuckle.

John: I'm glad to hear that. Yeah. So, I'm maybe an odd combination, but a combination I hope well suited to being here, in that I'm a barrister, as you said. I've been doing that for about eight years now, mostly kind of public and administrative law. So, law that involves government decision making on one side, and that covers a pretty broad spectrum.

Some of it touches on questions around the regulation of data or the regulation of online conduct, and some of it touches on information. So, maybe to that extent, there's some overlap with what's of interest to you. The podcast has been something I've been doing since March 2021 with a friend of mine who lives in Berlin.

It's called Don't Praise the Machine. And as the name might suggest, there's a bit of concentration on technology and the way that [00:03:00] technology is changing culture. And so, as the years have rolled on, we've started to touch more and more on AI and what's going on in the US and things that I think you've talked about on your show as well.

Kris: Perfect. I think that'll be great for today's podcast, since we're at the midway point of the season. We very much wanted to have a look at that. We cast quite a wide net to sort of go, well, you know, what's going on? And let's talk today a little bit about those important stories.

You know, in our own world, the FILED world, if you will. So, from DOGE to DeepSeek, we are really looking to cover a ton of those important headlines. And you know, again, for the listeners, we always like to have a little prediction at the end as well, so.

Be warned, we'll definitely get you to pull the magic eight ball or the crystal ball, depending on your flavor of choice there. What's been the wrap-up for you of 2025? What are the highlights for yourself here? Well, I think we've got a couple of topics that we will dive into, but are there any favorites of your own, even just in your own world this year?

John: Oh, well, AI is something that, and I guess kind of agentic AI, but, you know, things like, well, good old Chat GPT. It's something that we touched on [00:04:00] recently on the podcast and something that I think we've come back to regularly over the last couple of years. And, you know, we were talking about it recently as a bit of our own wrap-up, trying to

backtrack through the way that we've dealt with it over the years since, you know, since we kind of first started talking about it, I think in October 2021 or thereabouts, and in very kind of amateurish ways. I think we were calling it, you know, Chat GPT or Chat GGFT, or, you know, not quite sure what we were talking about.

And then suddenly, you know, three, four years later, well, I teach a class, as you touched on it; I also do a bit of lecturing. So, I teach a class at Melbourne Law School, and we closed off the semester with a class on AI and the law and the potential for AI to

influence and maybe even dramatically change the way that law firms operate, the way that courts and tribunals operate, and so on. So, that's endlessly interesting to me because of how rapidly it's changed. And I think particularly because of the rapidity and subtlety with [00:05:00] which it's become a part of people's lives.

So, you know, the fact that something like Chat GPT might've been something that, you know, people were broadly aware of 12 months ago. But now I think in the last 12 months people have started to dive in in greater numbers and, you know, share information with Chat GPT, or perhaps with OpenAI, you know, more willingly, and develop in spite of themselves some kind of, you know, sense of a relationship there, or it's a fixture in their lives, or it's, you know, something that they turn to and ask questions of on a regular basis.

I find that fascinating because I just think, you know, people haven't necessarily thought through the implications of that from a privacy perspective among other things, but also just from a kind of social and cultural and psychological perspective. So, that's probably, you know, a topic that's always fascinated me.

And then of course you've got things that are happening in Australia, the tussles between our eSafety Commissioner and the impending social media bans for under-sixteens, which are a topic that [00:06:00] interests me, I guess, if I've got my podcast hat on, but also as a lawyer.

Anthony: Yeah.

Kris: Yeah. So, I was listening to the podcast you had there on Chris Smith, I think, which is probably the extreme case of these, where it's, you know, he's fallen in love with his bot? That's right. Yeah. A married man. I'm not sure my wife would let me get away with creating my own machine that I then fall in love with.

I think that would be dangerous, to be honest. I've not got those relationships with Chat GPT, nor will I. We'll kick into some more details quickly.

And certainly, for me, there's an acronym that we keep hearing about. There's a bit of a bromance, and it's had a fallout, if we're gonna go down those relationship lines. Yes. But bringing it into government, you know, setting up that Department of Government Efficiency, has been in the news for one reason or another ever since January.

So, it's about that half-year mark. It's been a rise and fall. Or a rise and rise and a fall and fall. Yes, I'm not sure how much I believe the news that they're gonna try and deport Elon, but given the role that it's played in the US government, there are a few ways this discussion can go too, but I [00:07:00] thought let's talk about AI and the access to government data and citizen data.

And now some of the court cases that have come up. You know, from your side, has there ever been a quasi-agency like that, that has had that big an impact on a government, on any citizens, but specifically US citizens, from where you sit?

John: Look, I wouldn't count myself as an expert on all of the ins and outs of the various attempts to cut back on the deep state, as Elon and Trump might call it, but to me it does seem like a pretty unprecedented development, and one that's happened obviously very quickly with a limited degree of congressional or cabinet oversight.

Their methods and the outcomes of those methods are pretty opaque. So, that's been kind of interesting to watch and to compare to the Australian context, I suppose, to contemplate whether a situation like that could arise in Australia for one thing, but, you know, to consider what it might look like if it did happen in Australia and what might follow from it.

But I think, yeah, I mean, [00:08:00] as with everyone, I've followed that along both with concern, but also more recently with a slightly more puerile interest in the bust-up between Trump and Musk, and wondering where that might lead. It seems now that Trump's threatening to turn DOGE on Musk, which would be an interesting development.

Kris: Yeah, and big beautiful bills. It's interesting, right? At first it's like, yes, this is all great, now I'm gonna go and create a bunch of efficiency. And then the division that he's created is being turned upon him. It's a novel, for sure. Mm-hmm. It is. It's certainly a novel.

So, let me ask the dumb question then: when did you first get sick of hearing the term DOGE?

John: I think I'm still not sick of hearing of it. I'm keen to see where the bust-up goes next. I think it'd be a shame to tune out at this point, but yeah.

No, it's a good... you're tuning in for season three or wherever we're up to. Exactly. I thought Dogecoin was ridiculous enough, but little did we know that that wouldn't be the [00:09:00] last time or the last incarnation of the concept. But yeah, look, to me, it's interesting to see how, you know, this kind of, I mean, as a lawyer, I'm somewhat acquainted with the structure of government.

With the operations of government. And to see someone who has a kind of crash-through-or-crash mentality, this kind of very fast and loose sort of way of operating, which maybe has served him well in other contexts, to see that applied to a massive bureaucracy is an interesting spectacle, and I guess we'll see the ripple effects of that for some time.

Kris: Yeah. Interesting. And I know we've spoken about it on this show before, John. I don't think anybody's anti-government-efficiency in the sense of, you know, doing more with less, being more efficient. As you said, it's a bureaucracy, and for good reason; there are lots of reasons why it needs to be the way it is. But I've, you know, personally got experience with friends who have worked in some of the Musk organizations, and his ability to walk into a room, throw a massive target down and then disappear knows no [00:10:00] bounds per se, but it's led to innovation and it's led to change and it's led to people finding solutions.

So, you can't deny it. History says that there's something about what he's been able to do that's interesting and works, per se. But yeah, watching the novella that is DOGE and Musk and now Trump's bromance, as I said, certainly I am fixated. It is a little bit of fun. It's a little bit worrying too.

It's worrying. Yeah. I try and see the funny side, because the alternative is a bit concerning, perhaps.

Anthony: Yeah. I think it's intriguing. I noticed also the news today of the jobs numbers in the US; now, the data's a little misleading. Government workers have actually increased in the US in this survey.

It's unclear whether that's federal or local or state; there's a bit more breakdown that isn't in that survey. They were expecting a print somewhere around 44,000 jobs, and there were actually 60,000 government jobs created. So, as an overall set of stats,

it doesn't look like, as a general view, it's had the impact that was expected.

John: I mean, look, as a lawyer, it [00:11:00] interests me. Well, I guess as somebody who's a bit of a public law nerd, it's interesting to me to think about somebody, or an institution, or

an entity like DOGE, which is not a government department, which doesn't have the kind of reporting obligations, therefore, of a government department, or the same lines of accountability to Congress, I guess, and which therefore is at liberty to make all kinds of claims about what it has done and what it will do, without

what seems to be a great deal of rigor or consistency or transparency applied to, you know, how well that's working, or actually kind of what the long-term game plan is. So, especially now that Musk is outside the tent, I'm not sure, you know, whether it will just fall over, or whether it will change shape, or

whether, as Trump's suggesting, it might be used as a weapon against Musk.

Anthony: Yeah. It'd be interesting to see how it plays out. I wanna switch gears a little bit, though; it's been on the tip of my tongue listening to the two of you talk up until now. I really wanted to dive in and ask, John, your view [00:12:00] on the AI regulations falling over in the last little while.

In the Big Beautiful Bill, that one, in terms of public administration of law and states' rights here in the US. But you're seeing some interesting things also playing out in places like Australia and elsewhere, in the conversations that are going on, where this regulation of AI, and the conversations we started with at the opening of the show, is really unclear and looks like it's gonna be quite interesting, both post what happened here in the Senate and the US.

John: Yeah, I think in the lead-up to speaking to you both today, I was preparing to speak to you about the kind of moratorium on regulations in the United States and what that might mean. You are probably both better qualified to speculate than I am, but it seemed like, at the very least, I mean, 10 years is an extraordinary length of time in AI terms, given what we've seen over the last four or five years.

And so, I suppose the withdrawal or collapse of those measures might bring some short-term relief to some, but it seems like anyone's guess as to what might happen [00:13:00] next, and whether they'll try and reach the same point through other means. The underlying issue there is the extent to which big tech in the United States has Trump's ear and is not going to stop having those conversations about deregulation just because these measures fell over.

Anthony: Do you see the same echoes occurring in Australia or in Europe, and what's the path, do you think, for the regulation of AI in a more general sense?

John: That's a good question. I think in Australia, I mean, look, I'm not an expert on the kind of suite of measures that are under consideration. I know in some places, including Australia, there are, you know, kind of non-binding statements of principle that institutions and industries are signaling that they might be willing to sign up to.

But I think in terms of the kind of legislative landscape in Australia, it's very much, you know, responsive to what's going on in the US, what's going on in China. We've got, as I alluded to before, the eSafety Commissioner, who's [00:14:00] got some

power to regulate online hate speech or the misuse of technologies, which might include AI, for example, to harass someone or to threaten someone. But in terms of engaging in a broader sense with the regulation of AI, I'm not sure. I think it's still pretty open pasture.

Anthony: You brought up an interesting topic that I wanted to get to. On a previous podcast I talked about an agentic capability I'd written for my children to order pizza and do some negotiation around pizza. A very simple set of processes, but my son was actually asking what happens if the AI starts abusing him.

You know, we talk about the eSafety Commissioner; who has accountability for what that AI is producing? In fact, he's been trying to goad what I built into being inappropriate, as an 18-year-old boy will do.

John: Sure.

Anthony: Yeah, exactly. Especially when it's been given to you by your father to deal with your sister.

Right? Absolutely. There are multiple layers of this going on.

John: Yeah.

Anthony: I'm really intrigued about where... I know there's no case law yet. There's a lot to work out, but what's your view of where we're headed here?

John: Really interesting question, in [00:15:00] terms of the kind of accountability for what's done by agentic AI. And those same questions are being asked in terms of AI's role in the law. Obviously, at the moment, already you've got a whole range of processes, everything from legal research, and I'm sure people will be aware of the kind of darkly funny cases of people coming to court with a list of authorities that turn out not to exist.

But, you know, behind that there's a whole range of other litigation processes or interlocutory processes that might precede, say, a commercial trial, where you might be talking about discovery of documents, the production of first drafts of statements and so on. And to the extent that those processes might become more and more autonomous, I guess there are questions there about who bears liability for errors in those processes, or oversights or biases in those processes.

Can liability be sheeted home to the lawyers or legal actors who are ultimately responsible for those things? Or [00:16:00] is there a kind of potential for the application of tort law, for example, to apply to the people who are writing or developing those systems? Even then, I guess there's an issue because, if a system's kind of developed semi-autonomously over time, well then perhaps something it's done,

some error or oversight, is not, you know, entirely foreseeable, and, you know, perhaps there's an injustice in sheeting home responsibility to the person who developed that program, et cetera. So, I think the short answer is that the kind of, you know, familiar principles of law that we've come to think about in these areas are not well adapted to these kinds of questions.

And there's going to need to be, you know, some pretty thoroughgoing rethinking of those concepts.

Anthony: Yeah. 'Cause, you know, as a technologist, I think we've mentioned it on a previous podcast: you have Anthropic literally coming out and saying they have no ability to back-trace why their algorithm, and even calling it an algorithm is a [00:17:00] misstatement,

John: Mm-hmm.

Anthony: is producing a result. So, where does that accountability lie? That's going to be some really interesting conversations. And we've covered this a lot on this podcast: we're moving to this notion of autonomous agentic agents creating business transactions between agents, right?

John: Yeah.

Anthony: That are, you know, for all intents and purposes of tort law, at the end of the day, true transactions. They have all of the key representations that make them valid. Maybe not the, you'll correct me on this, there was the case where the guy had the bandages and the bleeding signature, and it wasn't actually a signature 'cause of the bandages, but the famous one, maybe not the blood, the packages.

John: Yeah.

Anthony: But, you know, all the other executional elements. What would your advice be if I turned up to you today and said, hey, I've had an agent transaction, and I'm now gonna reject these transactions?

John: Mm.

Anthony: Maybe one of them was with the government. How do I go forward?

John: Mm. Well, look, it's a great question.

I think it's something I'd be reluctant to provide off-the-cuff advice on.

Anthony: I'm looking for [00:18:00] free advice.

Kris: And he's trying to make sure that he doesn't get in trouble with his daughter, because the fact of the matter is it's him that's in trouble; he's responsible for the pizza agent, and it's as simple as that.

So, if we roll all the way back, I can see exactly what's happened here. The son has now come in and decided, right, I've got a way here to get Dad. It's gonna be Dad's fault and I'm still gonna get a jibe in at my sister. It's great. It's perfection. You've created a monster, Anthony.

John: I think if we put this to a jury, there'd be a degree of sympathy for your son's anarchic tendencies, trying to find a way to make the system skew towards the offensive. Look, it's a really interesting question, and, you know, people are already... I mean, at Melbourne Law School, where I teach, there's a Centre for Artificial Intelligence and Digital Ethics, and they're starting to ask questions already about, you know, how far is this going?

Is there potential for us to start thinking about legal advice given by AI? I mean, to an extent that's already started happening in some jurisdictions; in parts of China, for example, there are so-called smart courts, which are kind of equipped to deal [00:19:00] with basic processes through artificial intelligence.

You know, some of the kind of basic filing mechanisms or other procedural mechanisms are being taken over, or at least supplemented, with AI. And I think there is a question of who bears responsibility when things go wrong, not just from the point of view of legal liability, but from the point of view of, well, we have a system, designedly in the law's case, which has human actors.

I mean, if a judge makes a decision, well, at least we know: this is who that judge is, and these are the written reasons why they've made that decision. You can find fault with that, and you can sheet home responsibility to the judge. But, you know, trying to, as you say, kind of unscramble the egg of what's gone on, you know, what's led to a particular oversight or error or bias...

I mean, even the people who might have created that technology, as you say, it's a misnomer to even call it an algorithm, you know, they can't tell you how something might have ended up in the system that's led to an oversight. So, I think it's a really difficult question whether we, [00:20:00] in pursuit of more efficiency or more accuracy or whatever it might be, are willing to make some trade-offs that might include, in certain circumstances, not being able to hold those systems accountable.

Anthony: I was just trying to deal with what was for dinner.

Kris: Yeah. Well, you created a monster and got off on it. Now we've told the world about it. It's all your fault. That's what I heard.

That was your free piece of advice. Yeah. John, I think it was interesting at the beginning of the chat, we were talking about when Chat GPT had debuted, according to the Don't Praise the Machine podcast, but we've all been using LLMs. You alluded to, you know, that we're all getting engaged in other ways; Anthony's building things, outsourcing his parental duties, which, as I said, is still genius.

I'm absolutely all over it. I just wish my daughter liked pizza.

John: Yeah,

Kris: but that would make it a lot easier. Businesses have started to create more and more. We've had that rise and rise of agentic AI, which, you know, and it's interesting because the [00:21:00] LLMs have plateaued a little bit; we're now looking at very incremental improvement.

Early on it was very fast, very high-pace improvement. But now that we've almost got to a maturing, we're still deep in the hype cycle's trough of disillusionment. I just love that term. It's so dramatic. Like, you could see the eyes rolling to the back of heads when someone said that in the first meeting at Gartner, when they first brought it up.

Sure. Yeah. So, agentic AI though. Good name. Good name for a rock band. It's where the value is. Yeah, exactly. So, rather than make a better chatbot, you know, they're starting to give us new skills, do things for us. You've mentioned just now, you know, legal research and other things, and obviously we've had a quick chat about all of the roles and responsibilities and problems, but, you know,

for yourself, podcaster, writer, lawyer, lecturer, there's, you know, a wide range of pieces there. What are you doing with it? How do you utilize it? You've got such a broad range of things that you're having to do every day. Mm-hmm. And we've just heard vibe-code Woodward over here.

He's building [00:22:00] agentic AI to outsource parental skills.

Kris: But what are you doing with it, and what are the key things that you get value from?

John: Look, I think I'm probably not somebody who's explored its potential for, say, professional purposes. I still find, maybe I've asked it the odd, very general legal question for teaching purposes, but I think I'm not someone who, you know, says, give me a list of cases on this, because the amount of time that it takes me at this stage to double-check those results, I may as well find that information myself. But, you know, one of the things that my co-host Al and I have talked about on the podcast is this idea of it becoming something that you kind of dip in and out of in your daily life, in a way that I didn't 12 months ago.

So, I'm traveling at the moment with my partner and our daughter, and we had three days in a particular place. And so, I said, well, you know, here's the things we're interested in, give us an itinerary for three days here. Or I'll be gardening and I'll think, okay, [00:23:00] well, you know, how do I do this?

Or how do I avoid doing this? These kinds of things individually are pretty innocuous, but what they do that interests me, as someone who thinks about these things and as a lawyer, is they kind of build that sense of a relationship, or of a habit of kind of dipping into these technologies.

And, you know, of course we all have those habits with other technologies; people are, you know, long past the point of starting to kind of Google things. But the difference with Google is, you know, you can do that for as long as you like, and it doesn't kind of talk back to you in the same way. It doesn't have that same sense that, I'd say, Chat GPT has of remembering who you are and tailoring its results to who you are. Of course there's a level of that going on with Google, but, you know, it's not as though... I mean, you could sit on Google all day every day and you wouldn't get anywhere near... I think it's Chris Smith who, you know, developed a relationship with AI.

And I'm not saying, you know, the average punter is at risk of developing a relationship that ends [00:24:00] in a proposal. But I think people do, through those kinds of habits and those cycles of use, become unguarded, and become more willing to share their personal information, and become less vigilant about the fact that that information's going,

you know, to OpenAI, and it's a bit unclear how long it might survive and in what form and where it might end up. And to me, that kind of puts me in mind, as a lawyer, of the classic sort of case of somebody coming into a law firm and saying, well,

I told someone something I shouldn't have, but I just thought it was a kind of private conversation between friends, and now I'm in trouble. Or somebody who says, yes, I promised to do this, but I thought, you know, it was a conversation between me and somebody I knew, and I didn't think about it in terms of a verbal contract, et cetera.

People are put off guard by the way that these systems present themselves, and I think therefore don't consider the kind of long-term [00:25:00] consequences. So, that's something that I personally find really interesting. Where's that heading, and where's the scope to kind of remind people: you're sharing a lot of your personal information with a system which seems warm and fuzzy but may not be.

Kris: It's interesting. I liken this a little bit to, you know, generational technology growth, as you work your way through the generations. You know, as a Gen X, you've straddled tapes and LPs through to CDs and into Napster and everything else. It was interesting.

I was having a conversation with my wife the other day, and there are things that our children are very, very good at as it relates to technology, but they weren't there for the generational build-up of it.

John: Yep.

Kris: And I think the same is probably true here for AI. I was speaking with someone putting a name and a place and a body on this AI, and they thought it was a private conversation, and it's led to an issue. This week alone, I dunno where I've put my phone number, I've clearly put it somewhere I shouldn't have, but this week alone I've got a dozen phone [00:26:00] calls with AI voices at the other end, attempting to have conversations with me, to talk me into buying something or out of, you know, spending something, or getting other details from me.

John: Mm.

Kris: I want to say it's related to my insurance claims, but as I said, I won't say who that is out loud, for the reasons that, you know, we won't. I'm not accusing anybody of anything, but yeah. So, the question for me is, you know, there are going to be people who are gonna get left behind here.

Are we creating classes of people who are going to be vulnerable because of this? And I think we've seen that a lack of regulation might do that. What are your thoughts here? I'm really interested in that conversation you're having as a lawyer.

John: Yeah. No, I think absolutely we are. And you know, the same things that maybe make people more vulnerable to all kinds of other scams are going to make them more vulnerable in respect of these technologies as well. I think there was some research in about March or April this year from, might have been, MIT, that looked at a number of people using Chat GPT on a regular basis and found there was a [00:27:00] certain subset of those people who were generally more emotionally open people, more willing to share of themselves socially. And that subset, as I understand the research, essentially reported a sense of emotional dependency, a psychological payoff they were getting from sharing information with it, which was lost when that regular contact with Chat GPT was taken away from them.

And that gives you a window into how easily people can be manipulated, I think, by some of these technologies. And obviously they're still in their infancy. I mean, we're talking about somebody who might be using Chat GPT's voice function or might just be typing to it.

That's just one aspect of it. But I think you hinted at something which maybe introduces other kinds of stratifications between different groups of people, which is this sense of a knowledge gap, I think. I mean, I'm somebody who came up in a law firm at a time when, as a young lawyer, as an [00:28:00] inexperienced lawyer, you did a lot of the kind of pretty low-level, pretty repetitive basic research tasks or drafting tasks or checking tasks or whatever it might be.

You know, collating information, copying information, etc. That was dead boring a lot of the time, and I would've been happy to delegate that work to a computer. But there was some value in that process, in that as you move up from that into the more kind of advanced forensic skills that are involved in lawyering,

you have some idea of how the sausage is made from the ground up. And there's going to be, potentially, as these agentic capabilities take over, people who don't know, as it were, how the sausage is made. They might not have done coding. They might have just asked Chat GPT to produce good code for a particular purpose.

So, it interests me what that looks like, you know, a generation from now, or even five or 10 years from now, and whether there's going to be a cohort of people who have a set of skills that maybe people who are coming up through [00:29:00] those different industries and using different systems now might not have, or might have to try and find or develop through some other means.

Anthony: It's an interesting conversation, although I did have this exact conversation with someone at Anthropic, which seems to be a topic for me today, and they did make a really interesting observation to me. They said: so you eat a bunch of food, you get it from restaurants, with the different flavorings they put in, some synthetic, some not.

How is it created? 'Cause you put it in your body; you should at least know what you're putting in your body. Do we have that tracking? And it's an interesting debate, right? You think about the amount of things we do in our world today that you don't actually have an understanding of, how they're made, but you do understand how to use them.

I do wanna loop back.

Kris: You would've countered with: if I got sick, I could come back and sue you, because there are laws that protect me. I know you would've.

Anthony: You know, not everything has to go to a lawsuit, but I do wanna flip back. There's something I think really interesting about the AI conversation, and it really relates to a lot of what we talk about on this podcast around data. I think at the root of all of this is a fundamental [00:30:00] misunderstanding, and potentially a fundamental misalignment, in society about who owns data and what the principles of data are. When we actually talk about AI and the examples that you talked about, a lot of it is about the centricity of data and the ownership of data. And in reality, and you can look at the privacy legislation as one piece of interpretation, and there are a number of others, maybe we just got it wrong, and that's what's occurring in terms of allowing these systems to run amok with data, because we haven't actually managed data.

John: Mm. Yeah. That's a really interesting point. In your view, is that because, conceptually, people don't have a good understanding of what data is and where it fits into that picture, or because they don't have a good idea about the principles of data management? Or what's the underlying issue?

Anthony: I think it's twofold, right? There's one which is a little bit more philosophical, and that is: much like you have title to [00:31:00] your own land, you don't truly have title to your own data. Mm-hmm. And that's a real mishmash in the law today, right? So, you take something like, and I know in Australia they've enacted some things to get control again of your own genome, right? Mm-hmm. That's probably the first piece of data you've seen do that, but it's really at the edge, right? In that most folk haven't mapped their genome. Most folk don't,

John: no,

Anthony: haven't gone to the expense of those things, and the technology there is still fairly new.

Mm-hmm. But the data of, you know, you talked about Google search earlier, the data of me going and searching, and the thing that creates of me and how I interact with the world: surely I own that title. That's me. That describes me. It's no different to my land and someone having a carriageway.

Right? It's still my land. They're allowed to pass over it from end to end, but I own that land. That's mine, right? Yet with data, we don't prescribe those same principles.

John: No. That's right. Yeah, that's a really interesting point. And I think, well, I mean, already in the way that property laws operate, there's a [00:32:00] degree of, there's a kind of folk understanding of things, which doesn't always gel with the legal realities, where people might have a sense of, you know, well, this is my house, and why I wasn't getting into

Anthony: easementsand everything else.

That's a whole other thing, but

John: But I think the point is that people have an intuitive sense of ownership that I guess is kind of, well, I start with "well, I own me," and I kind of move outwards from that. And so there's a strong intuitive sense that what I'm putting out into the world, in terms of my search history or whatever else it might be, I've got some kind of moral claim to. But as you say, it doesn't necessarily sound in any legal or technical reality.

Anthony: I think that's the philosophical point, but I think there's a practical reality too: what is the onus of those interactions? And I don't think this has yet been effectively litigated.

It's all very well to sign someone up to a policy inside Google search which says, well, I'm gonna separate you, I'm gonna do some things to make sure that your [00:33:00] data is redacted or de-fingerprinted, but it still gives me the ability to sell you items, which is clearly their commercial intent.

But what is the reality of the leakage of that, and the retention of that, beyond that? And I know those are the things that GDPR and some of the other pieces of legislation have attempted to address. But there are some real significant gaps there, because as a stakeholder, actually enacting the processes within, say, GDPR is just so unbelievably complex.

John: Yeah. That's right. I think it's one of many areas where the aspirations of legislating and regulating butt up against the practical realities of how these technologies are used and how they interact. And, you know, obviously we've seen some of those issues, and no doubt it'll keep happening, where people might have signed on to certain terms and conditions not knowing, or knowing very little, about the mechanics of what those terms and conditions mean in the [00:34:00] real world, and particularly over time.

Anthony: Yeah, no, it's an interesting area that I think we're all gonna have to keep exploring. The one thing, and this is probably because it's a passion of mine, and I know of Kris's: data retention is the big underlying thing that we're just not dealing with.

It just isn't fair, you know, having children in that adult range who've been constantly on social media in a world where we've still been learning what that means. Talking to a much younger niece of mine, who has a much better awareness than even my children, who were relatively aware, and she's sort of this generation below that: the retention of that data.

Surely there must be, and I know the regulations on it are still quite loose, there must be an onus, I would think, or an interpretation coming soon, where it just isn't a right that continues. Because even the contracts that shrink-wrap these agreements

John: mm-hmm.

Anthony: Like, you know, things don't last forever.

John: Mm-hmm. Yeah. That's right.

Anthony: I mean, are we gonna see, do you think, [00:35:00] case law or anything in that world where there is an ability to contest that retention?

John: Look, I think that's very possible. I mean, as you say, when you're dealing with those kinds of fundamental features of contract law, you are dealing with a system that was designed around a certain range of, say, objects and processes, and a certain timescale, and that's pretty different now. I mean, we're dealing with the retention of data that might go on indefinitely. And I think that raises a whole host of novel issues.

But, you know, how the law responds to that remains to be seen. I guess, in the Australian context, you might see the kind of piecemeal adaptation of those principles by way of common law, the formation of some settled principles, though we could be a long way away from that. And perhaps ultimately the reception of those settled principles into statutory regimes. Or it could be that, as with the regulation landscape, we take our cue from a set of principles that have been developed [00:36:00] elsewhere. But look, I hope that's not going to be the question I have to predict, because the crystal ball is murky.

Kris: We won't do that to you, John, promise, he says, wryly scribbling in a question he now has to ask. I wanna take an interesting turn, and Anthony, this might actually be posed a little bit to yourself as well, but I'll be interested in John's opinion. You raised something a moment ago about title, and we assume title of the land because it's built into property law, et cetera.

And we've sort of made this assumption around our own data. But you raised before the conversation around, well, we don't really know what the chef's doing to the food. Is it the same problem we have here? And maybe this is more our industry, but people assume that government departments,

organizations, corporations who are collecting the data are doing the right things. There was a cost associated with storing data a long time ago: the physical paper in folders in boxes needed to be stored somewhere. And so you had this cost associated with it. And so the corporate culture wanted to reduce that cost, so retention policies would be applied and they would make [00:37:00] documents disappear.

We're seeing the opposite. It's actually quite easy to store data for a very, very long period of time. Very, very cheap. And that corporate culture hasn't quite cottoned on. You know, is it that we've got, generationally, an assumption that organizations are doing the right thing with data? We always talk about, you know, the common man here,

because we're back to: they're the ones talking to AI. They're the ones giving their data away. You know, you always assumed, I only have to keep my tax receipts for seven years. That was sort of drummed into you once you got that first job: you keep your proof and evidence for that long, but you don't want to keep it beyond that, you get rid of those things.

You know, my Gmail account, I've had it since the beginning of time, but the emails that are in it now are probably only from the last three or four years, because I've run outta space and therefore I just do a wholesale cull once a year and get rid of things.

I'm almost forced to do it. It's not particularly thoughtful, what I've done in those situations. But, you know, are we at a place where it's data retention, [00:38:00] it's enforcement of data governance, that would help a lot here? Like, if we use the example of me: if there's a classification of DNA or the genome in a particular way, and there just has to be a legislative "you can only keep it for this long and use it for these purposes."

It's that usage rights piece. Like I said, sure, they're going to provide me a service for free, and this is the thing that we forget: Facebook and Instagram and TikTok and even Chat GPT are free services. They cost billions to run. Mm-hmm. And we are very, very aware, at least I believe most people are aware, that they're utilizing our data for benefit there. But for how long? And is it as simple as: the organizations need to do the right thing, and if they can't, we have to legislate for it? But it's back to data retention. So, I'm all over it from that perspective. I think, sure, have my data for two years, that I gave you, and

Anthony: check in. And Kris, I know you fall into a trap when you go "have my data for two years." I think it [00:39:00] all has to be in the claim to use.

And I think, and that's why I do like bringing this back onto property law, right, it has to be a claim to use that links to the purpose of that relationship we had. But it is where the archaic notion of a contract sort of falls apart, because that's not generally built in.

John: No. That'sright.

Kris: Certainly GDPR has done something in that way, and so I wanna give it some credit, because we were talking about it before the pod started. I lived in the UK for a while, joined a number of organizations, either through paid membership or just through volunteering and other things.

And every year I get an email that says: do you wanna keep your account open? Otherwise, if we don't hear from you in six weeks, we're gonna delete all of the data. And yes, back to the claim, Anthony, it's, you know, we're only using the data to ensure that we tell you about football tickets, if it's a football club, or we're only telling you about concerts, if it was a venue. So again, you had that list of: these are the things we can use your data for, and that's what we're doing. And of course they can breach that, but at least it's very clear [00:40:00] and outlined.

You know, that's what they're doing, and at any point in time I can get rid of that. It's a huge impost on a large organization to do those things; we do understand that. And that's why I try to simplify it back: we make the assumption that the chef didn't leave the meat out for six weeks, let it sit in the sun, then chuck it on the fryer and send it out to me to eat. Because if there is something there and I was to get food poisoning, I do have some sort of recourse. It's a little more instant and a little bit more obvious than the data. But it's,

Anthony: I wanted to link this, if you don't mind Kris, to one of the topics, John, I enjoyed you discussing, which was the OnlyFans piece, right?

So, you think about the discussion you had about OnlyFans and the retention of data, of things people posted, didn't regret when they posted it, potentially regretted later, whatever that is, right?

John: Yeah,

Anthony: but where does that live? Because the negotiating of those contracts is very one-sided.

Yeah. And that isn't normal.

John: No, that's right. I think, I mean, the short answer is it does call for a rethinking of those basic [00:41:00] concepts to an extent, because I'm not sure that there's a comparable circumstance that's previously been the subject of, you know, a conventional species of contract, where you might be dealing with something that goes on, I mean, there are certainly contracts that go on for a long time.

You might have a very long or an indefinite right over a piece of land, for example, but you know what you're dealing with, and it's tangible, and you understand who's involved with it, what parties have a stake in it and in what capacity. And so you can engage with the state of play, as it were, and seek recourse should you wish to, on those terms. But I think if you're talking about a piece of information that you've given to a third party, that's been subsequently onsold to many further parties, perhaps used for a range of purposes, onsold again in certain circumstances,

I mean, you're dealing with so many permutations and so much changing of hands, at least notionally. I [00:42:00] just think it's like trying to keep track of a 20 cent piece for a year as it makes its way through various pockets, but you're dealing with millions of these things, and the transactions are occurring incredibly rapidly and over incredibly long periods.

So, I think it does start to strain at the boundaries of those concepts. And you do start to think, well, the world in which those concepts came to maturity is a very different one from the one that we're now contemplating.

Anthony: But there is some precedent, right, to changing those constructs.

You go back, and you'll correct me on the number of years, but you know, when you originally had title to the land, you owned the mineral rights. At some point we legislated that out and made it a different title.

John: Mm-hmm.

Anthony: You know, it's not like these things haven't evolved, even in the land and title concept.

John: Yeah,

Anthony: this is just another version of that, is it not? Or do you see this as really quite different?

John: Look, I suppose there's the kind of shock of the new, and it's easy to mistake the latest thing for something that's fundamentally different. [00:43:00] But I do think we're in pretty novel territory, not just because these concepts were devised to deal with more tangible objects, processes, et cetera, as abstract as they might be, but also because the methods that we have for refining and devising these ways of dealing with those realities, that is to say, the common law, parliaments that pass laws when necessary, these are pretty cumbersome and sometimes pretty slow-moving, pretty incremental processes. And the rate of change is such that by the time the common law starts to grapple with these things in a more thoroughgoing way, well, by the time the ink is dry on a judgment, things might have changed shape. And I think that's another complication.

So, yeah, I have some faith, as any lawyer should, I guess, in the adaptability of our legal concepts and our legal processes to deal with novel [00:44:00] territory. But it does seem to me that this is more strikingly novel than other, perhaps recent, examples of new species of contract or new technological developments that have needed to be thought through.

Anthony: Really appreciate the time, John. This has been a fascinating conversation. I know it's getting late where you are, so I really wanna let you go, and appreciate the time you've invested.

There's a heap of things I think we'd really like to discuss more deeply in a future episode. But thank you for coming and spending some time with us and picking through those things. I really appreciate it.

John: No, it's my pleasure. Very great to chat to you all, and I probably won't be able to get to sleep now. I'll be worrying about the future of contract law.

Anthony: Look, happy at any stage, maybe not this evening, but we can pick that through more deeply. Thanks everybody for listening.

I'm Anthony Woodward. If you've enjoyed today's episode, please leave us a review on your favorite podcast platform of choice. We spend a lot of time on LinkedIn under the RecordPoint heading; you can find all the content there. And we do have some additional outtakes, I believe, of Kris [00:45:00] posting onto LinkedIn, so you can find those outtakes to get the full FILED experience online.

Thanks very much.

Kris: And I'm Kris Brown. We'll see you next time on FILED.
