Episode 13

Privacy maturity requires information governance maturity | Dr Darra Hofman, San Jose State University

In a world of rapidly advancing technologies and constant threats to privacy, are records management and information governance professionals being left out, right when they could be most useful?

In this episode, San Jose State University School of Information assistant professor and Masters of Archives and Records Administration program coordinator Dr Darra Hofman joins Anthony and Kris to discuss why records professionals need to claim a seat at the table.

They also discuss:

  • Darra's research into the relationship between privacy and transparency from the perspective of recordkeeping.
  • Striking the balance between anonymity, transparency, and privacy.
  • The importance of provenance and contextual markers of data.
  • What recordkeeping can offer large organizations developing large language models (LLMs).
  • Why records management has an image problem.
  • What's coming for information governance policy in the coming years.

Transcript

ANTHONY WOODWARD

Welcome to FILED - a monthly conversation with those at the convergence of data privacy, data security, data regulations, records, and governance. I'm Anthony Woodward, CEO of RecordPoint, and with me today is my cohost, Kris Brown, VP of product management. Hey Kris, how are you?

KRIS BROWN

Very good, mate. How are you?

ANTHONY WOODWARD

Good. It's a sunny start to 2024, pretty much around the world from what I can see.

KRIS BROWN

Not so much here in Brisbane today, mate. We've got a little bit of a cloudy start. It's been a very wet break. Unfortunately, we didn't see a lot of sunshine, and when it was here, it was very hot. But you know what, new year, we start again, and I'm really looking forward to this season of FILED.

ANTHONY WOODWARD

Absolutely. And I'm particularly looking forward to this episode with Dr. Darra Hofman. Hi Darra, welcome to FILED.  

DARRA HOFMAN: Hi, thanks for having me.  

ANTHONY WOODWARD

It's really great to have you. For those in the audience that don't know Darra, she has an amazing background and has done an amazing amount of research. But to formally introduce you, Darra, you're an assistant professor right now at San Jose State University in the School of Information, is that correct?

DARRA HOFMAN: That's correct.  

ANTHONY WOODWARD

And I believe you're also the program coordinator for the Masters of Archives and Records Administration there in the school.  

DARRA HOFMAN: Yes. We call it MARA. That's my baby because I'm passionate about bringing records education to the world. We desperately need it, I believe.  

ANTHONY WOODWARD

Fantastic and look, we couldn't agree more.

ANTHONY WOODWARD

It's so great to have you on. I've actually been really looking forward to this episode. So, thanks for making the time.  

KRIS BROWN

Thanks, Darra. And look, I've had the opportunity to have a bit of a read of your dissertation and certainly some of the research that you've done. I'd love to maybe start with a little bit of background, if you want to help us understand some of your favorite subjects.

KRIS BROWN

As I said, I bet I can guess one already, as you can see from the beginning here. But I think, you know, obviously records and privacy, and maybe just a little bit about your background in general.

DARRA HOFMAN
Sure. So, I always joke that I was actually an attorney back when dinosaurs walked the earth, for about three minutes, until I realized that I didn't actually like being yelled at at three in the morning by people who were in the middle of divorces. So, I went back to school. I got my Master of Library Science and then my PhD in archives. I did my PhD at the University of British Columbia under the incomparable Luciana Duranti, and I also had the privilege of studying under Victoria Lemieux.

DARRA HOFMAN

And so, of course, a lot of my doctoral work was all about, you know, archives and the way that we can bring that sort of archival understanding to these really cutting-edge technologies. A lot of my doctoral work was, of course, under InterPARES Trust with cloud computing, and then with Dr. Lemieux looking at blockchain through that archival lens.

KRIS BROWN

I have a very small background with BC. Back in the day, I was working for a competitor, and we implemented a product at the BC government there, and had the pleasure of working with ORCS and ARCS. That was one of the more interesting pieces of work that I've had to do over the years. If you can, maybe just give us a little bit more background on the dissertation itself.

KRIS BROWN

So certainly, the one relating to how you're actually trying to tie that transparency element with privacy and records, which is a key theme here for us at FILED.  

DARRA HOFMAN

For sure. So, where that originally came from is that the head of Google Canada gave a talk to the House of Commons here in Canada, saying, "Oh, you don't need to worry about your privacy because, you know, we at Google build privacy on these four pillars."

DARRA HOFMAN

Dr. Duranti said, "Oh, you know, you should look at this guy's talk," and when I looked at that transparency pillar, it's the one that really stood out to me. And so I wanted to really understand: how do privacy and transparency relate, what's the relationship there? Because, as archivists and records professionals, we're all about transparency.

DARRA HOFMAN

That's the place where we live, right? Through my dissertation work, I was able to realize that it's not a one-to-one, it's a many-to-many kind of relationship that encompasses a lot of context and, yeah, messy humanness.

ANTHONY WOODWARD

It's such an interesting topic, and it's one that we really like exploring at RecordPoint and here on the FILED podcast, and it comes up a little bit in some of your work around the connection between anonymity and transparency.

ANTHONY WOODWARD

And so, one of the factors we see that we certainly are thinking about a lot, and we think is something interesting for the landscape is, in a world where you can save data on pretty much any aspect of anybody's life - and I think this goes back to your point around what Google were talking about - how do you deal with what needs to be anonymous versus what needs to be transparent versus what is better for the greater societal good? And that's what I really saw shining through in your papers.  

DARRA HOFMAN

I think if I had the answer for that, I would be a billionaire, but I think that that's part of where we have to go back to record keeping.

DARRA HOFMAN

Datafied technology concerns me because our notions of privacy and our understanding of transparency have really arisen along with record keeping as a sort of socio-technical infrastructure, right? And so, we have this inherent understanding of recorded information as having these privacy norms embedded in it, right?

DARRA HOFMAN

And so of course, when we're talking about things that have crossed the archival threshold, we have all of our ideas of trustworthiness and it all hinges on provenance. And then when you look at things like, you know, Helen Nissenbaum and the contextual integrity folks, who are working on that.

DARRA HOFMAN

Again, we have these ideas of sender and transmission, which we as records folks know as fundamental archival and diplomatic concepts. But in this datafied world, we've really stripped away those contextual markers that let us have a meaningful understanding of what should be transparent and what shouldn't be, especially when we start thinking about how to use these systems for things like accountability, which of course is that traditional transparency function from the records perspective.

DARRA HOFMAN

You run into this real challenge, which Malcolm Todd, the Canadian archivist, wrote about around 2006: the empty archives problem. How can we use things for transparency if we're going to strip them of all of the personally identifiable information? Because then you lose that relationship that lets you create accountability, right?

DARRA HOFMAN

And so, I wish I had an answer. You know, my answer is always to double down on record keeping. Really, we need to move beyond, I think, the data paradigm to understand things in this more contextualized and human way. And if we go down that route, it forces a lot of accountability back onto the folks who are building and using these datafied systems, which, of course, no one wants.

ANTHONY WOODWARD

Absolutely, because everybody wants the shared outcomes that are valuable, right? We see the conversations now about large language models and ChatGPT, and, you know, I'm not going to ask you about the pending litigation with the New York Times and other things that are going on, but those tools are really powerful, right?

ANTHONY WOODWARD

When you bring together these large datasets and are able to make them actionable. But we also don't want to expose ourselves as individuals to the privacy fatigue, if I can use that language, that comes with that level of exposure. What's really interesting, and what we like exploring on this podcast, is: what do you think are the fundamental building blocks of how we should think about the treatment of information versus records versus the privacy paradigm? Where do you see that?

DARRA HOFMAN

Right, right. I feel like the only word out of my mouth is context, but I think that's really the place where so much of this does end up gelling. Because, of course, you look at the privacy paradigm in the US, and it's very sectoral, and there's a good logic behind that, in the fact that we do have this intuitive understanding that our health information is different from our banking information and should have different norms around it. And so, you know, to a degree, these kinds of laws attempt to capture that contextual understanding. But especially when you start dealing with these large language models, you know, Emily Bender and Timnit Gebru wrote about the "stochastic parrots" problem, right?

DARRA HOFMAN

We need so much data to feed these generative AIs and I think we get a little bit less than careful about where we're scraping all of this data from because, again, there's just the need for so much of it. And so, I think, again, it comes back to my mind to context, to all these things that we have kind of inherently at this point in record keeping that we have yet to do in these datafied systems.

DARRA HOFMAN

And that doesn't mean we can't get there, it doesn't mean we won't get there, but we're not there yet because, of course, we also don't have, literally thousands of years of evolution of these systems like we do with record keeping.  

ANTHONY WOODWARD

And it's a great point. It's almost the paradigm here and the juxtaposition is that these large language models are actually probably going to give us more context.

ANTHONY WOODWARD

So, by the very notion of having these large data sets, we're actually able to create context to then look at how policy makers can start to prescribe different actions over those contexts. And I think that's an interesting topic. Where do you see, if we think about our listeners who are really grappling with, how do I think about my records program?

ANTHONY WOODWARD

How do I think about my information governance processes and the policymaking evolution that's going to occur? You know, January 2024 is the time when people start to make predictions for the year. Where do you see the policymakers going over the next four or five years in this space?

DARRA HOFMAN

Oh, boy. So, like, I think we're going to continue to see this really strong divergence between the US and the EU.

DARRA HOFMAN

So, for example, in the EU, we have the AI Act. In the US, we're just starting to see, in California, Virginia, Colorado, these GDPR types of regulations coming down the pike. But I think one of the things we're going to start to realize, especially as AI becomes more center stage, is that this data protection model doesn't necessarily get at the fundamental problems we're trying to solve. As I talked about in my dissertation, and earlier, privacy is an umbrella concept: it has really become this term that we use, at least informally, to mean all of the different ways in which these large technological systems undermine my autonomy as either an individual or a community. And that's a whole lot of problems beyond just a data protection problem. And so, we see some of that in the EU AI Act, and we are also, of course, seeing all of the pushback about how we actually implement this in a meaningful way.

DARRA HOFMAN

And so I think one of the things that we're going to continue to struggle with is this ideal of technologically-neutral legislation and policy solutions versus the very messy reality. Everyone who's ever worked with records knows that you can have ARCS and ORCS all day long, but each records center is going to have very individualized needs based on the people that they serve, right? And so, while I do believe we need more regulation, trying to solve these problems at that large policy level without also addressing even who's in the room, right? Who's building these systems? How are we dealing with them? Because, you know what, I can tell you in higher ed our students are using ChatGPT right now, and you see a real divergence in approaches.

DARRA HOFMAN

You know, you see a lot of folks who say, okay, ChatGPT is per se plagiarism, and if I catch you using it, you fail. I tend to be towards the far other end: use ChatGPT, generate your first draft, then identify the delta between that and what an actual professional would do, and show me your delta and give me your product. Because there's one thing I don't think these systems will ever do. I don't think we'll ever see general artificial intelligence; I don't think they'll ever have judgment, that implicit knowledge piece of context that is so hard to capture. So how we actually address those from a policy and regulatory standpoint, I think part of it is recognizing both the need for regulation and the limits of regulation as a solution.

ANTHONY WOODWARD

One of the things that we've been discussing here at RecordPoint is the notion of burden. There's the burden on the folks who need to actually build these records management systems, the burden on the consumer in terms of understanding these ridiculously long privacy statements put out by various entities, and then the burden, as you say, of understanding the utility of a large language model like ChatGPT, what it is and isn't, because people don't understand that. How do we go about educating everybody? I think the people listening to this podcast are all going to be information management professionals; we're all coming from that perspective.

ANTHONY WOODWARD

How do we bridge that burden gap? Because it's big.  

DARRA HOFMAN

It is. It's huge. I think that's actually one of the biggest calls for information professionals in general, and records professionals in particular, over the next 10-15 years: we need to be much more vocal. We need to be at the table. I'm really privileged and honored that Mozilla chose my team to implement a Responsible Computing initiative for the whole of San Jose State University over the next year.

DARRA HOFMAN

Because we're coming from that information professional perspective where we're able to say, yes, of course, we need our computer scientists. We have wonderful computer scientists on the team, but everyone is impacted by this tech ecosystem, this information ecosystem, and we all need to have a seat at the table.

DARRA HOFMAN

And as the specialists in trustworthy information and in things like information literacy, we need to be sitting at the table in all of these discussions. Bringing those tools and that power to our students is something that I feel incredibly privileged and driven to do on the education side. But I think it's going to take a lot more. Of course, in higher ed, we do what we do, but it's going to require a wholesale shift: we need to stop looking at information professionals as shushes and cardigans on the library side, or boring people sitting with dusty manuscripts in the archive.

DARRA HOFMAN

Instead, see us as folks who have that really rare and valuable set of soft skills, information skills, and tech skills. You can't graduate from an archives program anymore if you can't do, you know, metadata, right? So, bringing that unique perspective to these problems, having us at the table in Silicon Valley, I think that's how it has to be moving forward.

ANTHONY WOODWARD

Great answer. And congratulations on, I didn't realize that you'd won the RCC, the Responsible Computing Challenge at Mozilla. So, congratulations on that.  

DARRA HOFMAN

Oh, thank you.  

ANTHONY WOODWARD

Because I do know some other people who went for it and didn't quite make it through, and that's not actually an easy process to get through.

DARRA HOFMAN

I'm very blessed to have a wonderful team. We have a cultural competence specialist, an AI specialist, and a computer vision specialist, folks who really support that vision of computing technology as a broader socio-technical infrastructure. What it really comes down to is that we understand records as an infrastructure, and we don't yet understand computing as an infrastructure in a lot of ways.

KRIS BROWN

That's a really interesting comment, and I really like the records-as-an-infrastructure piece of that. Certainly, the big thing for me, and the focus here at RecordPoint and obviously the podcast as well, is all the organizations that are holding sensitive data, and making sure that they're doing the right thing. But your research has been going a lot wider than that.

KRIS BROWN

And we've touched on it a little bit here, but talking about how communities and groups are struggling with, you know, privacy and transparency. If you don't mind, Darra, could you share a little bit more about what you uncovered there, as it relates to that wider context, not necessarily just me as an individual or me as a consumer?

DARRA HOFMAN

For sure. So this is an area of passion for me because, like I said, when we talk about privacy, we're talking about all the different meanings of privacy. One of the things that really gets neglected, because in the U.S. and Canada especially we have this individual rights and liberty orientation towards what privacy is and means, is that there are a lot of communities for whom community privacy, and even community secrecy, are essential for survival.

DARRA HOFMAN

Queer folks are one of the communities we've worked with. Native American and Indigenous communities have a lot of cultural knowledge, sacred knowledge, that's just been straight-up stolen over the years. There's a lot of stuff in our archives that shouldn't be there. But there's also this push. I'm really privileged to work with the Northern Cheyenne Tribe on trying to preserve some of their cultural heritage, and on finding the balance with these tools, which are really aimed toward extracting as much data as possible, while building in the safety and the protocols to make sure that things that shouldn't be shared beyond the elders are only accessible to elders, that things that should be divided along gender lines are, and especially that things that should not be shared outside of the community aren't.

DARRA HOFMAN

We don't really have good regulatory infrastructures for that. We can, of course, do that in record keeping; the classification code is everything, right? But building the systems in ways that reflect ways of knowing that are not necessarily our traditional ways of knowing, that's something we've had to work at. Of course, there are wonderful tools like Mukurtu and all that, but a lot of it's been done very much by hand, because that's not the dominant way that we think about privacy.

DARRA HOFMAN

And so, you know, privacy is kind of like the Gordian knot. Every time you pull, you find 10 more strings that are tied up into it.  

KRIS BROWN

Yeah, beautiful.

ANTHONY WOODWARD

We could riff on those topics, I think, for hours, but I do want to switch gears slightly. I know you've done a little bit of work around the intersection of blockchain technologies, privacy, and how we can provide control of data through those technologies, and I always find it an interesting debate because I probably sit in a funny corner of it in my own work. What are your thoughts on what we'll see in 2024 and beyond with the evolution of those technologies?

DARRA HOFMAN

I think we're finally over the massive hype for blockchain, but it's the same thing with any technology: I think we keep looking for that magic bullet, right?

DARRA HOFMAN

There's never going to be a magic bullet. No technology is going to solve our privacy problems because they're not technological problems, they're social problems, they're human problems, right?  

DARRA HOFMAN

And so, there's lots of things blockchain is good for. It's very, very good for ensuring the integrity of records generated on the chain, for example. But there are also lots of different blockchain technologies, plural, right? And so, when you're thinking about privacy and using it, just like any other technology, just like AI, you have to really think about the actual design, the affordances and constraints, of the technology itself. And for us, that means, of course, thinking about whether we can instantiate the archival bond in these technologies, right?

DARRA HOFMAN

How do we prove provenance? And like, you know, NFTs were super popular there for a minute, but do NFTs really have the same understanding of provenance as we do? I don't think so, right? And so, I think that these technologies are good in that they stimulate the conversation. They give us new tools that we can use, but they're just that, they're tools. They are not solutions in and of themselves.  

ANTHONY WOODWARD

We would concur; that's certainly our thinking as well, that it's effectively just like a database technology that has some strengths and weaknesses that can be applied. But I think what's really interesting that's come out of the debate around blockchain and other technologies like it is the notion of Privacy by Design being built into those systems, and being able to shift left in the capabilities that you bring to those processes.

ANTHONY WOODWARD

One of the things that we've been advocating and starting to write about is the notion of bringing Records by Design and records management processes by design. I'm not seeing that really penetrate into the world, though.

ANTHONY WOODWARD

I don't know if you agree with this, but what are the obstacles to getting that mindset shift to happen?

DARRA HOFMAN

I totally agree. So, one of the big projects I'm working on is with my dear friend K Royal; she's a privacy attorney, and we'll be presenting our little framework at InfoNext in Palm Springs in April.

DARRA HOFMAN

But our idea of privacy maturity is that you can't have privacy maturity without records and information governance maturity. Part of the problem is that we're not in the rooms, archives and records folks, generally speaking. Of course, you've got your very advanced folks, but a lot of the time folks go, "Oh, there's people who do that, right?"

DARRA HOFMAN

We really have that image problem, and it's very old. I mean, just look at the name. We're archivists. We're librarians. We're tied to these places, these institutions, in the popular imagination. And so the first half of every talk I give when I go to a computer science conference is just explaining what archival science is and what records management is, because we have a whole paradigm and framework that folks need but don't even know exists, right?

DARRA HOFMAN

And so, I think that's really the very first barrier: just getting out there and making people understand that a lot of what they're looking for already exists. I had this real moment reading one of the contextual integrity papers, where Mireille Hildebrandt, who's brilliant, I love her work, was saying that what we need for contextual integrity is a way to track senders and actors and all that. I'm like, we have that. It's called diplomatics. We've had it for over a thousand years. So, give me a call, right?

ANTHONY WOODWARD

Do you think there is some conversation to be had around branding? It's one of the things that we, even in this podcast that we struggle with generally is the language that sits around records management.

ANTHONY WOODWARD

It's such a broad topic, but it's so narrow in people's implementation of it in their own heads. Is that something that you think, as an industry, we should be attacking, changing some of these labels? I often find myself now just talking about data governance, because records is just data governance.

ANTHONY WOODWARD

It's just another side of the coin, and I avoid using the word "records." Because when I talk to the CEO of a large bank, or the data team at a large bank, they put records into such a particular bucket that when I replace that word with data governance, it's all good, and everybody's happy, and we can all move forward in a structured way.

DARRA HOFMAN

Yeah, and that's one of the fundamental challenges to my mind, because records are different from data governance. The functions, the things we're hitting, are similar, but that records understanding is missing, and I think until we have it, we're not going to be able to do the things we actually need and want to do in terms of protecting privacy and transparency in the way we've always expected them to be protected.

DARRA HOFMAN

And so, how do we make that jump, that shift? Because, you know, we see all this information governance, and these are all really important areas within the field, but they're all united by that fundamental records understanding that we all have in the profession. So how we do that branding is one of the most critical pieces of what we do, because one of the things you see now, of course, is pieces of what we do being broken off into all these new sexy competencies that don't have that understanding, those fundamentals that we have, and you see it getting done wrong, because it's not under that broader vision, if that makes sense. My husband's a plumber, and where we've arrived is that we're kind of like sewage. Everyone just expects it to work. No one wants to talk about it until the pipe breaks and there's crap spilling everywhere.

DARRA HOFMAN: And that's where we are right now but no one wants to call the plumbers because it's not sexy. They're like, "Oh, why don't we bring in a drone to fix it?" Right.  

ANTHONY WOODWARD

Absolutely. But the reality of sewage, I think, much like records management, is that modernity doesn't exist without it.

ANTHONY WOODWARD

Like it's such an important process in the world that people ignore it. I love that metaphor. It's just fantastic.  

KRIS BROWN

Darra, I know we've focused a lot on some of the formal research that you've done more recently. What are you concentrating on next? I know you just mentioned there's something you're presenting at InfoNext, and I'll be at InfoNext as well.

KRIS BROWN

So, we'll definitely hopefully get a chance to catch up and meet in person, but what else is coming for you and your team this year and moving forward?  

DARRA HOFMAN

With the RCC, we're really excited. We have a speaker series. It's hybrid, it'll all be available on our website. So, you know, it's aimed, of course, at our SJSU students, but it's available to everyone.

DARRA HOFMAN

We're really excited. We have the language archivist for the Cherokee Nation, who is going to come talk to us about digital curation and heritage. We have Alex Hanna from DAIR, the Distributed AI Research Institute; that's Timnit Gebru's organization, so she'll be talking to us about AI and trans rights. So, we're really excited to bring these various perspectives to the world.

DARRA HOFMAN

We're also working more on understanding transparency and privacy specifically in the context of AI, and how we can train our models to be more contextually aware. And really, I'm just excited to keep trying to grow our MARA program and to develop opportunities for folks to really build that records awareness.

DARRA HOFMAN

Like I said, to my mind, this is one of the most critical areas. Especially now that we're in this information age, record keeping is the critical infrastructure, and it's being really undermined and neglected, just like our bridges and everything else in the U.S., right? And so, if we want to keep driving down that information superhighway, we have got to make sure that the infrastructure is there.

ANTHONY WOODWARD

Totally agree. We really loved having you on the podcast, and I could certainly keep asking questions and trying to understand more for hours. But in terms of wrapping up the conversation today: for our audience, most of whom are operating in the records management sphere, what would you have as the top two or three things they could be thinking about, and the places they could go to get more information, to widen their own programs and how they're trying to attack the problem?

DARRA HOFMAN

So, I think kind of my message to all records professionals all the time is: always be thinking about how you can have that seat at the table within your organization, within your professional organizations. I think we have a tendency as a profession to really undersell our expertise and the importance of what we do and the need for the organization.

DARRA HOFMAN

And of course, you know, part of it, like you said, is that whole shifting of the language to data governance, that sort of thing. But how do we get that message through to folks about what we do and why it's so critically important, whether it's talking the compliance angle or the efficiency angle?

DARRA HOFMAN

They need us. They just don't know that they need us. So, how do we make that happen? In terms of sources of information: of course, there's our MARA program. We are always happy to have webinars, and that's all available on our website. I'm always preaching the value of your professional organizations. You know I'm always like, "I love NAGARA. I love ARMA. I love SAA." But it's true, because there's so much more that we learn from each other and from being in kinship. I always joke that I'm a useless academic, right? Everything that I learn of value is from my relationships with practitioners, the folks really out there doing the work. And so, if you're a practitioner, please call me.

DARRA HOFMAN

Let me know what you want to talk about, right? We need to have each other's backs as a profession and really be reaching out, talking beyond us, but also talking to each other as well. And then, if you're really interested in AI, there's of course InterPARES Trust AI, the newest iteration of the InterPARES research project out of Canada. If you go there, I think there's roughly 80-ish studies going on right now, and all the studies get published on the website as we go. Of course, Vicki Lemieux does blockchain at UBC; if that's an interest of yours, their research is available through the Blockchain@UBC website. And then our RCC project is the Circle project at SJSU.

ANTHONY WOODWARD

And what project are we in, Kris? Because we’re in the InterPARES project. Are we allowed to talk about it?  

KRIS BROWN

Yeah, we're in the InterPARES AI. Yeah, so you'll see we're sort of most of the way through that research with a group out of Europe.

KRIS BROWN

So that'll be published, hopefully, I think next year is when they're finishing off that. But yeah, we're actively working with them now.  

ANTHONY WOODWARD

It's a super interesting research area that we're super proud to have been invited to be part of as well. So really, I think if anybody does get a chance, it's worth checking out.

ANTHONY WOODWARD

Look, Darra, I really appreciate the time, and I'm also going to be over at a few of the conferences, so I hope we get a chance to talk again in the next little while. But thank you very much for appearing on FILED. We'd love to have you back to deep dive into some of the topics that we skated over today, because there's just so much there. I really appreciate your time today.

DARRA HOFMAN

Thank you so much for having me. I love talking archives and records; it's my passion, so I'm always happy to talk about it. Thank you so much for the chance to talk about a few of the things that I love and am interested in.

ANTHONY WOODWARD

No, thank you, and thank you, listeners, for listening. I'm Anthony Woodward.

KRIS BROWN

And I'm Kris Brown. We'll see you next time on FILED.
