6
Events season is nearly over – here's everything we learned, with Josh Mason
The RecordPoint team have just spent April and May on the road for events season, so Kris and Anthony sat down with RecordPoint CTO Josh Mason to go through everything they’ve learned, their highlights, and why Anthony built an app to automate pizza delivery for his children.
They also discuss:
- Event highlights and lowlights
- AI and data governance discussions
- The rise of adversarial AI and cybersecurity insights
- Why identity security is the new perimeter
- Quantum is no longer a theoretical challenge; the time to prepare is now
- Conclusion and reflections on the conference season as a whole
Transcript
Anthony Woodward (00:13)
Welcome to FILED, a monthly conversation with those at the convergence of data privacy, data security, data regulations, records and governance. I'm Anthony Woodward, the CEO of RecordPoint. And with me today is my co-host Kris Brown, our executive vice president of partners and solution engineering. And that title gets longer and longer every time I say it.
Kris Brown (00:32)
That's half the words, that's the best bit. How are you?
Anthony Woodward (00:36)
We're coming back from a set of long road trips, and I think the audience can probably hear that in our voices. In fact, a little bit of leave for yourself as well. We've been out on the road, primarily around the US, attending a bunch of events, and today we really want to deep dive on what's been happening and what we've seen in our travels. We've also got a special guest star today, the RecordPoint CTO, Josh Mason. How are you, Josh?
Josh Mason (01:02)
Good, looking forward to the discussion today. We're in trouble since Kris and I are on the same podcast. I hope we've allotted about four to five hours today.
Kris Brown (01:09)
Yeah, absolutely. I think they've deliberately kept us apart because, yeah, the pair of us hate the sound of our own voices. For those who might be watching us online or seeing the podcast in the clips and other things, I'm rocking the events polo. We've got this wonderful shade of pink that was very, very obvious on a lot of the show floors. I regularly bumped into attendees who'd say, you're the pink guys, you're over there in the corner with the pink guys. It was great work from the marketing team to put us out there. As I said, I personally think I'm great in pink, but I'm sure others will give us their comments on that. But yeah, we have been attending a lot of events, and the best part for the three of us is that we weren't all at the same ones. April was just a big period for events.
I've got a list here: Sea-Air-Space, the Deloitte Roundtable, IAPP GPS, ARMA InfoNext, the ARMA Canada Information Conference, Google Next, RSA week. And those were just the ones we went to; there were others going on as well. There was just a huge amount happening at events time, and it was a great time throughout for all of us. We did cross over at some of these, which was cool. I wanted to start by highlighting Sea-Air-Space. Not something we've done before.
It was actually a really interesting place to be, back in an old hometown of mine, Washington, DC. It's the US Navy conference, so lots of defense personnel and contractors, a lot of hardware, big boys' toys. If you like drones, there were a lot of drones. Drones, drones; did I say drones enough times? There were a lot of drones. But as I said, probably not a space where you'd traditionally say information management, records management and data governance were big. But if I come back to my friends the drones: they produce a ton of data. People were talking to us about, you know, what do we do there and how do we do those things? I spoke to a number of contractors from different organizations and a number of defense personnel about exactly that. It was actually really, really interesting to talk about.
Big thanks to Team Defence Australia, who put us up at that event too. A great big booth in the centre of National Harbor there, a phenomenal facility. So it was a really good event from that perspective. This was an interesting one, though, on the personal side: this was an event where they didn't feed anyone. There was no coffee, there was no breakfast, there was no lunch. We were lucky at the Team Defence Australia booth, they put on a coffee station, but I think the entire conference, some 15,000 people, tried to make their way through our coffee booth there. So we definitely got foot traffic from that perspective, but it was a first for me in that I ordered some Uber Eats to the conference floor. I was really worried for all of the military personnel who were walking around. But yeah, let's start out there and have that conversation with that in mind. What were your best and worst conference moments? I'm not necessarily saying that was the worst, but it was certainly a first.
Anthony Woodward (04:14)
Look, definitely the worst was Kris and his pink shirts, now that I'm mentioning it. Come on. Far out. No, look, I think what was really interesting, and one of the worst things, was the tone at the IAPP conference: a lot of disappointment around coming back a year later and probably not having got the progress people were expecting from conference to conference. That was certainly my worst.
Josh Mason (04:44)
Yeah, look, I want to say first of all, I was jealous that I did not get to go to the military conference where there were drones. I did try to do my part at IAPP; I brought a drone in my backpack and got one flight out of it before security came by and said, no, drones are not allowed at IAPP. But yeah, I got that in. Look, not to sound self-serving, but I do like meeting with the vendors at all these different conferences, at the RSA conference especially, where there are a lot of new vendors. I generally learn something from that, because you walk around looking at a lot of solutions and realize there were potentially problems to solve that you didn't know you had, or it gives some clarity to problems that you do have. So that's been great. Some not-so-great moments: I heard a lot of people talking about AI, and I think everyone's still learning about what this means in the industry. What does it mean to organizations? What does it mean to individuals? And I heard a lot of...
maybe not-so-great advice from people. I was in one session where somebody asked, do I really need to be worried about AI? Do I really need to worry that OpenAI is going to get hacked? It was kind of like, well, there's a thought. I expected an answer about personal account use leading to oversharing, copying private data into prompts, disclosing PII, ethical bias risk and things like that. But instead, the answer that came back was: it's a good question, but really the risks are not known at this time. So I think you have to be careful right now with picking out single people to listen to. In this case, listening to a lot of different data sources and going to lots of different sessions is what's going to give you the biggest benefit.
Kris Brown (06:22)
Yeah, I was lucky enough to go to Savannah in Georgia. I'd not been before; as I said, it's a privilege to be able to travel around and see some of these wonderful places. ARMA InfoNext was a really boutique conference, quite intimate, a smaller group of very focused individuals. AI was the topic that really got hammered home again and again and again. To that point, Josh, the ability to hear lots of different opinions was valuable, and there were lots of different opinions: some good, some indifferent, some just repeating the rhetoric they're hearing. There were a handful of people who were obviously taking the time to really point out what you should be doing and how to do it. I had the opportunity to speak at that event too, and it was great to sit down and share my thoughts with everybody there. I think, for me, that was
a very close event. I spent a lot of time with the same group of people and did have some really in-depth chats, and we were a sponsor at that event too, so we got to have some more executive meetings as well. That was certainly a highlight for me. If I was to give you a comical rather than a worst moment, it was at IAPP, where the roof started to leak on us. Yeah, another first. We were in this massive hall and all of a sudden there was effectively rain coming directly down onto some electrical equipment and other things in our booth. Everybody moved very, very quickly. The next morning we came back and it was all fixed, but if you looked up, the fix was what they described as a very large diaper. There was a little bit of fun wondering whether or not the diaper would fill and whether it would come down on top of us. We got through the event and in the end that didn't occur. But it was, again, another first in what was a very, very long five or six weeks of travel. Lots of great conversations, and I think, Anthony, the conversations were interesting. It's been a while since I've been in country, in the US, and there's been a real change.
Anthony Woodward (08:24)
Yeah, I think there has, and I'd love to get to that topic. But can I talk about my best experience? You know, I thought you guys would bring this up, but you didn't. We did win best pitch, the best value pitch, at RSA. I'm not going to point out who it was that did the pitch, or who beat whom. But we did manage to take that, so that was my best moment. I'm just a little sore, Kris, that you didn't bring it up. But let's go talk about what you want to talk about.
Josh Mason (08:42)
Congratulations.
Anthony Woodward (08:52)
The vibe, and I'm in Seattle today, as is Josh, I'm still on this trip from when we began in April and it is almost June. So the vibe in the US, in terms of data governance, and let's stay away from the political podcast elements, was really interesting. And it changed, actually; I've been here so long there was a sort of palpable change over the course of it. Because when we walked on shore and started talking about DOGE and the cuts, and, for the listeners out there that have never attended IAPP, it's held in Washington, DC, where a lot of the federal government is, and that was the start of my events out there, there was a real somber tone. A lot of people were very concerned about their own jobs, very concerned about what was happening. We haven't seen a lot of progress, as I said earlier, and people weren't sure what the future held. It felt a bit jarring. I mean, is that how you saw it?
Kris Brown (09:47)
I think, again, I probably got in a week or two before you, and landing in DC and walking around, the federal job cuts and the DOGE-related cuts were very jarring at the beginning there. That did create a bit of context around the way people came across. I think even for Sea-Air-Space there was a pausing of government credit cards and things, and so there was a real worry for a moment: of the 15,000 people who were attending, a lot were going to be military staff. Were they going to be able to pay for their hotels and other things at what was their big event for the year? I think all of that was resolved; at the end of the day they dealt with those things. Not to be too political about it, but it was a topic that was really driving things. The reverse of that, though, is that as you dove into the conversations, on the privacy side there was the AI executive order, signalling that there's going to be new AI legislation. There's
increased cross-regulator coordination across the groups as well that's been brought in. The US state regulators are starting to sharpen the pencil and bring in their own regulations due to it not being a federal focus. Obviously the FTC came in with a new set of priorities, and even at the simplest level there's cookie and digital tracker compliance coming in; people were talking about that at IAPP. It was a very fast-moving landscape, and a lot of things were being spoken about. I think the interesting topic at IAPP was a lot of talk about the Fourth Amendment as well, in terms of people crossing borders. There was obviously a reticence for people to be traveling in, just with the uncertainty of what was happening. I know even at the back of this trip we had a bit of personal time, and I invited some friends to come and join us, and some of them were really worried about jumping on a plane and crossing borders.
It was really, really interesting. Josh, I don't know about yourself, but you're in the homeland of it, right?
Anthony Woodward (11:45)
Well, as the American in the room, he's the only one the Fourth Amendment actually applies to.
Josh Mason (11:50)
Well, look, I'll start with, I guess, a kind of funny impact from the DOGE and tariffs situation. I noticed one booth at the RSA conference that was extremely light on their merch; the table was pretty empty. I talked to them about it briefly and they were actually impacted by the tariffs. Their merch was coming in, got stuck at customs, they didn't quite notice it in time and didn't pay to get it released, so they were a little short on merch from that one. I also talked to some of the other organizations. We've seen a little bit of this, but I think in other countries the relationship with working with the US is changing slightly. Look at some of the Canadian organizations: they're moving from data sovereignty to operational sovereignty. So not just trying to keep the data in Canada; they're concerned about potentially having some of that data in US data centers, but also about having US workers managing systems that are in Canada. More organizations are requiring that operational sovereignty as well. There's also a bit of an increase in cybersecurity reviews, a doubling down, a little more intense look at US-based companies globally.
Kris Brown (12:58)
Yeah, look, it was a really interesting trip. I ended up on a tour as part of one of the events; we went out for the day to have some fun. And I recall just listening to some others who were on the bus, it was a public bus, and they were talking about how they'd been affected by the DOGE funding cuts. They were scientists, there was a bunch of research effort they were obviously involved in, and the topic got all the way down to them having to kill off a number of rats, and the way in which they were dealing with the science and bringing forward a bunch of results and other things. It's certainly created a conversation, not only in industry, but across the country. It was really very eye-opening, not something you'd traditionally hear spoken about a lot in public places. All of the keynotes at the major events were talking about the need and desire for efficiency at the end of the day.
I don't think there's anybody saying we shouldn't be more efficient, have better processes and take a look at how we're doing our spending; I think there's very much an acceptance of that. But these were conversations I'd probably not heard coming in and out of the country before, having lived there for a while as well. But let's now have a look at how AI and AI governance were discussed. As I said, at ARMA InfoNext in Savannah,
literally every single stream was AI, AI, AI, AI. And then again at IAPP, I know Josh and I both saw a lot of it. AI governance was very much the net new product that everybody was talking about, and everybody's at different levels. Certainly some organizations are very much talking about the process of running your AI governance team. Others are talking about how you technically manage what's coming in and going out. Others are talking about how we're dealing with access to LLMs, the Copilots and OpenAIs and others. Certainly at some of the events I attended, the conversations were very much eighteen-plus months behind. You know, it's like, hey, this is what AI is. And there was a not-so-successful keynote at one event, I won't point it out too hard, where they attempted to have a conversation with an AI on stage in front of everybody, and it was a bit of a flop. But
you know, everybody's been there and done that. Even myself, twelve-plus months ago at a different conference, I did something very, very similar just to show people that the myth, the legend that is AI isn't there to bite you; it's there to help. So some of these were very simplistic discussions around just those basics of using LLMs. But at things like Google Next, there was obviously a lot of talk about that next generation of agentic AI and its impact on organizations. Josh, I know you had the opportunity to attend, but
What was going on there? What were they saying?
Josh Mason (15:47)
At Google Next? Lots of talk around agentic AI: basically having AIs be more than just a single thing to help you write an email or fix formatting, but really the interconnection of multiple agents together to tackle bigger problems, agents that can actually do thinking. One of the use cases I'm really interested in, and I actually met with a few of those vendors at RSA, was the kind of autonomous security agent. Because one of the things that
organizations really struggle with is the volume of data coming in from all the different systems you connect to. You have your SaaS platform and your PaaS platform and your IaaS platform and your WAF and all these alerting and security systems, all producing alerts. So there's just a massive volume of data, and you're trying to figure out what to prioritize, which things are really important, which things are the ones I should act on. Having these AI agents be able to go through, analyze those results and bring them to the surface is a really good use case. So there were quite a few organizations working to solve that problem using agentic AI.
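For readers who want to picture the triage pattern Josh describes, here is a minimal sketch. It is illustrative only: the alert sources, the simple heuristic standing in for the agent's model call, and all names are assumptions for the example, not any vendor's API.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    source: str      # e.g. "SaaS audit log", "WAF", "IaaS monitor"
    message: str
    severity: int    # 1 (low) to 5 (critical), as reported by the source

def score_alert(alert: Alert) -> float:
    """Stand-in for the agent's reasoning step.

    In a real agentic setup, an LLM (or a chain of agents) would weigh
    context such as asset criticality and past incidents here. A simple
    heuristic keeps the sketch runnable.
    """
    keyword_boost = 2.0 if "credential" in alert.message.lower() else 0.0
    return alert.severity + keyword_boost

def triage(alerts: list[Alert], top_n: int = 3) -> list[Alert]:
    """Rank the flood of incoming alerts and surface the few worth a human's time."""
    return sorted(alerts, key=score_alert, reverse=True)[:top_n]

if __name__ == "__main__":
    incoming = [
        Alert("WAF", "Blocked SQL injection attempt", 2),
        Alert("SaaS audit log", "Credential used from two countries in 5 minutes", 3),
        Alert("IaaS monitor", "CPU spike on build server", 1),
    ]
    for alert in triage(incoming):
        print(f"[{alert.source}] {alert.message}")
```

The design point is the loop itself: collect everything, let the agent score it, and only surface the top few items, rather than asking humans to read every alert.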
Kris Brown (16:52)
Yeah, certainly I'm loving the ability now to not just ask it a straightforward question, but to say: go away and prepare me a brief around this, that and the other. Giving it that more detailed research capability has been giving me a lot of value, and looking at some of the agentic AI use cases I've seen kicking around in the last few weeks, it's just absolutely mind-boggling. And there are lots of risks and lots of areas around compliance. Anthony, I know you've used this phrase before and I'm a big fan of it: building the plane while flying it, changing the tire while driving the car, these sorts of things, as it relates to managing AI risk and compliance. What were you seeing during this period?
Anthony Woodward (17:36)
Yeah, I think I had a slightly different experience than both of you, particularly at somewhere like RSA, where I had a lot more executive conversations, a lot more folk that were C-level or board level, as well as investors and those kinds of people that you see in San Francisco, where the conference is held. So a lot more conversations around AGI: when AGI is going to fall on us, whatever definition you want to give to AGI, how you can get ready for it and what that means. That was really interesting; it's not a conversation I think we've had either here on the podcast or a lot elsewhere. When they were talking about AI governance and data governance, they were really drawing these parallels to AGI. So that was super interesting: are you going to help us control that? What does that look like? How do we build those things? And that's the building the plane while flying it, right? If you believe the doomsday kind of view of AGI, well,
we're literally trying to repair the wings, and it might crash, while we actually work out what this thing does, how effective it is and how to govern it in the enterprise. So I thought that was really super interesting to observe, those conversations going on at that level. I do have an interesting anecdote that my children are going to hate. For the first time, in terms of the agentic AI that Josh was talking about, I built an agentic AI to order my children food when they couldn't agree on what to eat for dinner. I built them a little app that sent them an SMS, or a WhatsApp message, and said: what do you guys want for dinner? Pick from the list. And if they picked together, they could go off and make it themselves from the ingredients at home, since my wife was with me traveling. Just for anyone on the podcast, my children are over 18; in theory they're more than capable of looking after themselves, though I'm not sure about feeding themselves. But I built an agent, and when they didn't agree, it would just send a pizza delivery with their normal pizza choices.
Josh Mason (19:29)
What happened?
Anthony Woodward (19:30)
It kept freaking my children out, because they hadn't worked out the pattern: the AI was basically looking for the non-agreements and then just delivering food I knew they'd eat. That was, I think, the first time I've actually implemented one. And when I say implemented it, I gave it my credit card, I gave it the instructions on what to do, I gave it the instructions on polling them. When it polled them, it also checked if the dog was fed in that messaging. And then, if they couldn't agree, it would go off and process the order without asking them. The pizza man would just turn up. It did take them two or three deliveries before they worked out what was going on. They went, we keep getting pizzas delivered and we don't know how this happens.
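As a bit of fun, here is roughly how the decision logic Anthony describes might look. Everything here is a guess at the shape of his app: the messaging and ordering calls are stubs, and the names, menu and pizza choices are hypothetical, not his actual implementation.

```python
import datetime

KIDS = ["kid_1", "kid_2"]
USUAL_PIZZAS = {"kid_1": "margherita", "kid_2": "pepperoni"}
MENU = ["tacos", "stir fry", "pasta", "salad"]

def send_message(recipient: str, text: str) -> None:
    # Stub for the SMS/WhatsApp step Anthony mentions.
    print(f"-> {recipient}: {text}")

def collect_votes() -> dict[str, str]:
    # Stub: the real app would poll replies for a while before deciding.
    return {"kid_1": "tacos", "kid_2": "pasta"}

def order_pizza(order: dict[str, str]) -> None:
    # Stub for the delivery order placed with the stored payment details.
    print(f"Ordering pizzas: {order}")

def dinner_agent() -> None:
    for kid in KIDS:
        send_message(kid, f"What do you want for dinner? Pick from: {', '.join(MENU)}. "
                          "Also, has the dog been fed?")
    votes = collect_votes()
    if len(set(votes.values())) == 1:
        send_message("all", "Agreed! Ingredients are in the fridge.")
    else:
        # No agreement: fall back to everyone's usual pizza without asking again.
        order_pizza(USUAL_PIZZAS)

if __name__ == "__main__":
    print(f"Dinner check, {datetime.date.today()}")
    dinner_agent()
```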
Kris Brown (20:07)
OK, so just so we're very, very clear, Anthony, what you've just said to me is that you've solved the problem of never having to make decisions about dinner again. Is that what you're saying? Why have we not published this, and why are we not making money from this today? Anthony, I'm really not sure that the...
Anthony Woodward (20:28)
Well, all it can do is get pizzas, and I'm not sure that's a healthy diet, but...
Kris Brown (20:32)
We can resolve that problem down the track. We're building this plane while we're flying it; the fact that I may need help with obesity down the track, we'll fill that in. It's fine. I love it. It's a great anecdote, and as I said, I will certainly be looking for help with that starting very, very soon. So, all jokes aside,
I want to get now to a bit of a take on that AI conversation as it came to cybersecurity. You were both at RSA. I know we've spoken about it a little bit, but what were the conversations really focused on? Josh, I know you gave us a little touch on it there, but adversarial AI was a topic that came up, and obviously AI governance and efficiency for scale. Interesting comments that you made, Anthony, around that artificial general intelligence, or AGI, that is the next level: it's going to come and do my job for me. What was the RSA take, or what were your takes overall?
Josh Mason (21:31)
Look, I think, again, lots and lots of solutions, lots around how to deal with the volume of data that's out there and how we can use AI to calm that down and prioritize work. A lot of conversation around the adversarial AI side. Verizon released their, what do they call it, the Data Breach Investigations Report; it actually came out while we were at RSA.
Anthony Woodward (21:52)
For our audience, Josh, what is adversarial AI?
Josh Mason (21:55)
Yeah, this is basically utilizing AI platforms to act as an attacker; you can effectively use these things to automate attacks. What we're seeing, and again this came out in that report, is that 60% of breaches are occurring because of humans within organizations. They're basically getting phished. These adversarial AIs are able to effectively automate attacks, and you can deploy a force of them to act as attackers and phish information out of particular users. That's why organizations should be utilizing companies like KnowBe4 or Cofense to test themselves and make sure they're resilient to these, because the amount of phishing coming from them has increased. A kind of funny part of that too, because we were talking about DOGE earlier: the government was actually using Thomson Reuters Special Services. TRSS was one of the things in the news that popped up a few months ago, cut by DOGE as though the government was funding a Reuters news organization. But it turns out it was actually a part of Thomson Reuters that was helping the government be more secure.
Kris Brown (23:03)
What about yourself, Anthony? What did you see in those spaces?
Anthony Woodward (23:07)
Yeah, look, as I said, at RSA I had a very different conference to Josh; I had some slightly different objectives. But again, to paint the picture for those that have not been to RSA: RSA is huge, right? It is probably one of the larger conferences I've attended in terms of attendees, but probably the bigger measure is the amount of spend by the vendors on the conference and everything there. It's quite mind-blowing. What I saw a lot of, though, was still a lot of conversation about AI at the boundary. We certainly talked a lot more about AI, there's a lot more conversation about AI, everybody's got AI everywhere. However, it really still came down to: how am I building an AI firewall? How am I building an AI ring fence? How do I make sure my staff aren't sending stuff to the AI? I would ask questions like, well, which AI? Because you might just go to OpenAI or ChatGPT, but there are 50 other flavors, right? And some of those flavors exist on your phone; they don't even exist on a website that you can screen out. So it was really interesting to see. It's good to see the evolution, but I don't think we've yet seen the industry really grapple with something we've talked a lot about on this podcast: being able to govern AI and manage AI is all about the data. And they still seem to want to talk about the perimeter and not the data.
Kris Brown (24:37)
Barrier jumping is back. But again, it's the same problem, right? We're working from home, the majority of the workforce is still remote, even if there are mandates to change that. You're on your own devices, you can do these things. Nothing stops me, even if I have two devices, from looking at one screen and typing into another. We've got the traditional security elements, but it's the same thing. They lost the battle of 'you shouldn't have a private email address' a long time ago.
Josh Mason (24:38)
Yeah.
Kris Brown (25:03)
It's very, very interesting that we're still talking about those things, especially at a cyber event like that, Anthony. I am really surprised to hear that.
Anthony Woodward (25:11)
Yeah, and I'm really stereotyping what was a really big conference; there were lots of different conversations and lots of different pieces to it. But I do think there is an evolution for us to go through. Talking about evolution, something I did see come up, which is a little bit less AI but was certainly part of my conversations, was quantum readiness. Now, I think you and I, Kris, have been talking about this, and I know you have as well, Josh, for a good few years now, but they've been very much inside-the-wall conversations rather than outside-the-wall conversations. These are the first times I've actually had people asking me, when we talk about data and encryption and protecting that data: how quantum-ready are you? Did you observe that as well while you were out there, Josh?
Josh Mason (25:55)
Yeah, definitely. That's a bigger topic now. As we keep seeing more reports in the news about quantum becoming less theoretical and more real, there have been real advancements in the space, and organizations are having to figure out: what does this mean, and what do I really need to do? Really, the biggest risk around the quantum technology, and this depends on who you talk to, whether it's really in the market in a year or five years or eight years or ten years, is harvest now, decrypt later. It's about protecting your data as it sits, because somebody can basically come around and steal your safe. They can take that data today, bring it back, and once the technology becomes more readily available, in whatever time period that is, they'll be able to decrypt it.
Anthony Woodward (26:40)
I'd describe that more as, and yes, I'm a Trekkie, getting a replicator and replicating the safe, but not having the key, and then being able to build a key later, right? So you don't even know that your safe was stolen. That's right, it went into the Star Trek replicator, one of Kris's favorite machines. So they've got that, and whatever was in it, and later on they'll be able to open it.
Kris Brown (26:59)
Jedi is a belief system; this replicator rubbish, I'm just not so sure.
Josh Mason (27:05)
Yeah, and that's where this comes back to. What can you really do today as an organization around quantum? It's understanding where your data is and where you have risk, mainly so you can get some focus: do you really need this content? Are there additional cybersecurity controls you can put in place? Are you reviewing the access controls where those things are? Do you have good infrastructure security? It's really about protecting that data so that people don't get it now.
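To make the "know where your risk is" point concrete, here is a small illustrative sketch of the inventory exercise Josh describes: classifying data stores by the encryption they rely on, so quantum-vulnerable data with long retention gets attention first. The asset list, algorithm names and thresholds are invented for the example.

```python
# Public-key schemes based on factoring or elliptic curves are the ones exposed
# to "harvest now, decrypt later"; strong symmetric ciphers and post-quantum
# schemes are generally considered much safer for data at rest.
QUANTUM_VULNERABLE = {"RSA-2048", "ECDH-P256", "ECDSA-P256"}

assets = [
    {"name": "archived contracts", "encryption": "RSA-2048", "retention_years": 25},
    {"name": "telemetry lake", "encryption": "AES-256", "retention_years": 2},
    {"name": "HR records", "encryption": "ECDH-P256", "retention_years": 40},
]

def quantum_risk(asset: dict) -> str:
    """Flag assets whose encryption is quantum-vulnerable and whose data must
    stay confidential long enough for the future threat to matter."""
    if asset["encryption"] in QUANTUM_VULNERABLE and asset["retention_years"] >= 10:
        return "HIGH: prioritise for post-quantum migration or tighter access controls"
    if asset["encryption"] in QUANTUM_VULNERABLE:
        return "MEDIUM: vulnerable, but short retention limits exposure"
    return "LOW"

for asset in assets:
    print(f"{asset['name']}: {quantum_risk(asset)}")
```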
Kris Brown (27:30)
Josh, look, you're starting to make us look bad. I just want to be really clear: you've come onto our podcast, you've stepped in, and you're just delivering message after on-point message while Anthony and I are discussing Star Trek and Star Wars. Time to cool your jets there a little. But no, I agree. And look, it's interesting, with the rise and rise of RSA, the rise and rise of cybersecurity, the conversations that people have to have. I think, if I was to play the on-message bat, even at the other conferences it was that message of: you just need to know where your data is and what it is, because you can't spend infinitely on cybersecurity. But harvest now, decrypt later, the replicator story, it's an interesting one, and it's certainly why there's a real focus on this. And I say that with my tongue very firmly pressed in my cheek, Anthony, I can see you smiling. But yeah, we've also been speaking with Joe Pierce, our head of product. We didn't grab him for this conversation, but he was also at Google Next.
One thing he highlighted that came from that event was oversharing. I think we touched on it a little bit; it sounds like Google is starting to learn the lesson that Microsoft had sort of learned in the SharePoint space, and with the advent of AI in general this is becoming more and more urgent. Anthony, this very much ties into something we've been talking about, which is that identity security is the new perimeter. We've had the conversation, and everybody was talking about the perimeter itself, but it's that identity security element. Can you give the audience a little bit more there about what you were saying?
Anthony Woodward (29:03)
I think I've been talking to Joe a little bit about the notion of oversharing and how it works, so let's unpack that a little bit. When we share something, we go into these systems, whether it's SharePoint or Google Workspace or similar tools, we go into the little share box, we put in a person's name and that's going to go off to them, right? And this is happening cross-company now, it's happening inside a company, and we've got all sorts of security permissions that come out of these very easy processes. And that's good; we certainly recommend and encourage making it really easy for users to go and share this information and interact with it. But what's happened is that the identity providers, when you log in with your Google identity, or your 365 or Microsoft identity, or Facebook, or whatever your sign-on tool of choice is,
that's actually become the new perimeter, because that's the key to all of that data. And when we think about the data itself and how it describes the security that should be mapped to it, that's getting more and more difficult with this sharing. On top of that, think about how that data is being fed into AI engines; AI engines are starting to have ways to restrict their responses based on the security of that data. But nobody has spent any time managing this security, and in fact it's just gotten a whole bunch worse with the scenarios I just laid out. So we have two really big problems as we roll out AI across these data sets. How are we dealing with the identity perimeter issue? Because we all, if we're logging into a system, now have an identity perimeter. It doesn't matter what firewall is on your computer, it doesn't matter whether you're logging in at the airport or on the Qantas lounge's computer: you've actually shared your identity perimeter, and it may not be clear where that's going. That's one thing to think about, but that identity perimeter is also being fed into the AI and is now having implications for that AI. There's a lot here; I think we could do a whole podcast on this and where it's going. And without getting too much into what we're doing in the roadmap here at RecordPoint, there's a lot of work to go fix this problem.
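To ground the oversharing point, here is a toy sketch of the kind of audit that surfaces risky shares before the data is handed to an AI. The document records, domain and rules are hypothetical; a real system would pull share permissions from the SharePoint or Google Workspace APIs rather than a hard-coded list.

```python
COMPANY_DOMAIN = "example.com"

documents = [
    {"name": "Q3 board pack", "shared_with": ["cfo@example.com", "anyone-with-link"]},
    {"name": "Team lunch menu", "shared_with": ["team@example.com"]},
    {"name": "Customer PII export", "shared_with": ["analyst@example.com", "bob@partner.io"]},
]

def sharing_findings(doc: dict) -> list[str]:
    """Flag link-based sharing and recipients outside the company domain."""
    findings = []
    for principal in doc["shared_with"]:
        if principal == "anyone-with-link":
            findings.append("open link share")
        elif not principal.endswith("@" + COMPANY_DOMAIN):
            findings.append(f"external recipient: {principal}")
    return findings

for doc in documents:
    issues = sharing_findings(doc)
    if issues:
        # These are the items to remediate (or exclude) before indexing for AI.
        print(f"{doc['name']}: {', '.join(issues)}")
```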
Kris Brown (31:15)
You know, as I said, it certainly does make you think. We've all experienced it in the information governance space, trying to control the security of records over time as the business grows; we've seen the mess that can be created just inside a SharePoint environment. As you say, we're now adding this to literally every single piece of data and then training something in the AI to understand it as well. And so all of a sudden,
to link back, if you will, to Josh's adversarial AI, we now have AI learning about how we actually interact with the data we've got, which we don't even know the location of. And now we can train AI not only to know how to implement that, but also how to attack it and use us, still the weakest link in these pieces, right? I'm seeing more and more, even just in your own personal life, the need for two-factor or multi-factor authentication, and even just the headache that creates in your own world around banking and other simple things. Add in our corporate culture, security across data, and where you want to work, and it is huge.
Anthony Woodward (32:18)
Yeah, look, and again, I think those themes were things we certainly saw across the conferences, right? It was, yes, RSA is a security conference, but whether it was IAPP or some of the other conferences we all attended, there are these common sets of themes. And I think what underpins all of that is that most organizations still have a ways to go to think about their data estate. And that was something I still heard consistently over and over again across the conference season.
Kris Brown (32:45)
I don't know what I have. We're still at that "I don't know what I have or where it is" stage. So that is still the scary piece. And I think, from an industry perspective, for the practitioners and the listeners, it's still hugely important to get control, get an understanding, get an inventory even.
Josh Mason (33:01)
I mean, the nice part is that the work you're doing in that data governance, information governance space, that journey you've already been on, now has a larger value, a larger payoff in the end, right? Because you're not just putting lifecycle management controls in place and reducing your risk; you're also getting ready for that post-quantum world, and you now have AI as a new value point in the future that you can leverage once you get those types of systems in place. So I think the nice part is you're just adding more value to the journey that organizations are already on.
Kris Brown (33:32)
I was explaining this, Josh, to someone the other day, because they were asking, you know, what do we do and how do we do it? They were outside of the industry looking in. And I said, for a long time, you know, 30-plus years in this space, for a long time it's been very much about "thou shalt be compliant". You had to avoid risk. It was very much, for want of a better term, a fear sell.
There is so much value now added by having a good understanding of what's going on here. Probably for the first time, in the last couple of years we're very much looking to talk about the positives, what we can add to the things the businesses want to do. And it's tangible, it's real. You're helping security, you're helping privacy, you're helping risk. And they need the help, because data governance is the baseline of all of these. It was a recent Snowflake podcast where they said AI governance is 100% based in a good data governance platform. To paraphrase, good AI governance comes from good data governance. It's a simple statement and it's a baseline statement, and I think that was an overall message I heard a lot across those conferences.
Anthony Woodward (34:45)
Yeah. And I think we'll probably wind it up there, because there was so much to cover, but I do want to thank everybody we met at the events. If you happen to have got all the way through the podcast to this point and we spoke to you, they were really great conversations, and it was really great to catch up with a lot of people we've met over the years while getting out there. The one thing I did notice this year was a lot more people attending the conferences, which was good; it's really probably the first time I've seen conferences back in force since COVID, in many ways. So it was really good to connect with those people. I actually met a couple of FILED listeners, which was kind of cool. So, you know, thanks, Mum; it was great to see you out at the conference. Cool. Thanks for listening. I'm Anthony Woodward.
Kris Brown (35:30)
And I'm Kris Brown and we'll see you next time on FILED.