Privacy must be everyone's responsibility | Debra Farber, Shifting Privacy Left podcast

Debra Farber has made the cause of shifting privacy left her life’s mission. In addition to her work as a privacy consultant, she has spent three seasons and 60 episodes of her podcast, the aptly named Shifting Privacy Left, talking to everyone from privacy advocates to engineers about embedding privacy throughout organizations.

She joined Anthony and Kris to dive deep into the subject, its importance, its applicability to organizations large and small, and to share the most surprising things she's learned in her journey.

They also discuss:

  • Her move from privacy law to more technical aspects of privacy
  • How shifting privacy left relates to shifting security left
  • Why privacy shouldn’t belong to the lawyers
  • Why privacy is about preventing harm to people



Anthony Woodward

Welcome to FILED, a monthly conversation with those at the convergence of data privacy, data security, regulations, records, and governance. I'm Anthony Woodward, CEO of RecordPoint. And with me today is my co-host, Kris Brown, RecordPoint's VP of product management. Hey, how are you, Kris?  

Kris Brown

Mate, I am very good. How are you?  

Anthony Woodward

Good, good. And today we have Debra Farber, a consultant in privacy and also the host of the extremely excellent Shifting Privacy Left podcast, which I was listening to over the weekend. How are you, Debra?

Debra Farber

I'm doing well, doing very well. Thank you for having me.

Anthony Woodward

Yeah, it's great to have you on.

Kris Brown

Hey Debra, look, massive, massive fan.

I've been listening to the podcast, and I want to congratulate you on the recent nuptials. So, yeah, big congratulations there.

Debra Farber

Thank you.

Kris Brown

Great to have you on FILED. As I said, it's going to be really cool to have a bit of a conversation today, but I'd love to start by discussing your background and how you came to be in privacy, if you want to share that with us.

Debra Farber

Sure. I've been in privacy for 19 years, working directly on privacy. So, there's a lot there; I'm going to condense it as best I can. I first went to law school thinking I wanted to do intellectual property law, because I liked the concept of intangible property rights: the idea that you can own something you can't actually touch and feel, unlike real property.

And I just happened to have a law professor who taught that, Paul Schwartz, whose class I really enjoyed. He's one of the professors who, along with Daniel Solove, wrote the casebooks actually used to teach privacy law in law schools. So, this was back in 2004, when I took privacy law with Professor Schwartz.

I really liked his approach in copyright law; he was just an engaging professor. So I'm like, what are you teaching next semester? And I'm like, I don't know what privacy law is, but it sounds fascinating; I'll take it. What I really loved about it was that it was all statutory based, at least in the United States.

I think it's the same down under, where it was being addressed sectorally, right? For healthcare, we have HIPAA. We have the Fair Credit Reporting Act for data brokers and credit. For all these different areas there were different laws, but none of them were really coming out of the constitutional...

You didn't have to go through the same 'is this written in the Constitution?' analysis, like with the privacy torts. Most of the laws coming out were about how you can ensure that people are protected, and that information about them is protected, based on laws that were written down and pretty well understood.

So then, the work ended up being about how we can embed that into a company and its practices, and I found that particularly interesting. At the time, there were maybe one or two law firms in the entire country that even had a privacy practice. Everything was being addressed sectorally. There was no 'I want to do privacy law, go do it.'

But a lot of legislation had come down the pike, and there was a need in business for people to embed this into organizations. And so, of course, that's how I started out. I got a job at American Express managing what they called, at the time, online privacy. That's the things we think of today as marketing and email: the CAN-SPAM Act, web beacons, and what you're saying in your privacy policies.

Really that beginner level when you start talking privacy today, the very beginning aspects of what privacy is, the touch points with consumers and such. My career has been a shift left, which is the name of my podcast, the Shifting Privacy Left podcast. To draw the parallel: over time I've gotten more and more technical and moved further and further away from really doing the legal work, and I've actually never practiced law.

I actually went straight into operations, then moved into understanding the product development life cycle, doing a little more counseling, but from a compliance perspective, not legal. As I said, I wasn't practicing; I'm not licensed and never got licensed.

My interests today are more about how we embed privacy into the technology and the engineering processes, right from the beginning. So you're not leaving it to the attorneys at the end, after you've built a product or service, when you go to them and say, this is what we're shipping tomorrow.

And then the lawyers are like, whoa, whoa, wait a minute here. This isn't compliant with this or that; here are the potential legal risks. And now the lawyers are looked at as blockers to innovation or to the business, when in reality the question is: how do we build products

and services with privacy by design and by default? How do we build that into our entire business? That is the piece I've been fascinated by, and it's the piece that's hardest to do. It requires a lot of behavioral change, and the hardest thing to do in a company is to change people's behavior, their way of working, what they know.

And so, what I'm doing today is not only advising companies on how to shift left and embed privacy into their business and engineering processes; I'm also really excited by the new class of privacy technologies. That's everything from privacy enhancing technologies, which help unlock the value of data you maybe couldn't use before for privacy reasons, but that certain PETs now let you safely use in some way, maybe to get analytics out of it, to technology that enables businesses to automate their privacy processes. We can talk a little later about what some of those might be, but it's everything from data classification and mapping, to training, to understanding the different parts of the business where you can use technology to automate, and thus make your time to market faster and need fewer human beings doing some of the risk assessments.

Of course, you're always going to need human beings, but the technology can really assist you in understanding where the risk is in your organization, so you can give it that attention, rather than not knowing, trying to look at everything at the same time, and not having enough resources to do so. I know I covered a lot of ground, so feel free to pick some of that apart.

Anthony Woodward

No, absolutely not, and thank you. There's so much depth there to dive into, but I wouldn't mind focusing at the outset on the Shifting Privacy Left podcast. What inspired you to start it? I know you've been going for a while. Certainly, since I discovered it, I've really loved the questions and the approach you take, and you have no concerns about getting into some of the deep and dark issues.

Kris always pushes me away from doing that here; I'd love to explore all the concepts. So I love that you do it. It's an absolute inspiration.

Debra Farber

Well, thank you. I do really appreciate that. And I think it comes with 19 years of being in the industry and seeing a lot of crap. When I say that, I mean everything from watching the tech organizations advocate for self-regulation, and then just seeing that turn into surveillance capitalism, right?

So, a lot of it is cyclical: oh, well, here's how you can address this. And feeling, many times, that industries are failing consumers, right? A lot of that comes with the confidence to say, no, I've seen this before, and let me tell people why this isn't an easy problem, getting into the depth of what the potential challenges are.

And then also, I don't work for a company, at least not right now; I work for myself. So, I have to walk that fine line: I don't want to become unemployable by pointing at companies and talking crap about them, right? But at the same time, I want to highlight problems that are systemic in software development today, or in engineering, or in how companies bring products to market.

So, instead of just calling out what's bad, I try to frame things as: this is how you bring ethical tech to market, and if you're not doing it this way, perhaps you're not doing it ethically. It's a different way of saying the same thing, but maybe without the blowback, right?

Kris Brown

Yeah, I'm going to have to take some advice on that. I think if I slag off Optus one more time on this podcast, there's a very good chance they will never let my phone connect to their network ever again.  

Debra Farber

And that's what I want to prevent. I think the focus should be that privacy is all about protecting people.

Security, a lot of it, is about protecting systems and networks, right? From confidentiality, integrity, and availability vulnerabilities or threats. So a lot of the time you're thinking in terms of: this is my thing, and the threat actor is trying to get at that thing, and it's an arms race we want to win.

With privacy, for me, it's easy. You just bring it back to: this is not just 'can we use the data in this way,' it's about what harms to people we are preventing. And I think it makes it easier when you bring the focus back onto people, right? Especially for companies that claim to be customer obsessed.

You can't be more obsessed with your customers than by ensuring that data about them is used in alignment with how they believe it's going to be used, or how they gave consent for it to be used. We should be putting that on a pedestal and honoring it like nothing else, right?

Because that, to me, is the gold star of honoring privacy: making sure that the end user, the person, the people (actually, when you call them users, you start to dehumanize them, right?) are protected. So a lot of it is putting the emphasis back on getting everybody on board that we're going to make people safe, right?

And treat them with respect. And a lot of that flows naturally: well, yeah, respecting them means respecting their choices and how data is used.

Kris Brown

It's a really interesting space, and I think it's probably worth giving the listener, because you've touched on it a little bit: what is that notion of shifting left? Why is it important to privacy, and why is it important even beyond privacy? I know you've touched on it in your podcast, but I think it would give better context.

Debra Farber

Basically, it comes down to being in this field so long and watching it change. We can no longer protect personal information with paper, right?

We've got contracts, we've got laws, we've got constitutions, we've got policies, we've got procedures. These are all important, but you cannot actually protect anything without putting controls around it, and policy controls alone are not nearly enough. You need operational and technical controls.

So there's been an overemphasis, in my opinion, on the idea that privacy is the realm of the attorneys, and that privacy lawyers should own privacy in a business. And I've watched that grow up, because originally it was, oh, there are so many changing privacy laws. But in reality, a lot of them have the same fundamentals.

They're not really all that different; it's just small things here and there for compliance purposes. They're all focused on the same thing: personal data and preventing harm to people. But, because you might end up having a breach of personal information, outside counsel from long ago would start saying: hey, your first privacy person should be an attorney, but not us.

Because we're outside counsel, and you still want us; you should get a privacy officer who's responsible for embedding this into your business. But it'd be really good if they were a practicing attorney, because if there is a breach, then it's all going to be covered under privilege. Give me one other area within a business, any area, where this is a thing, where the person who owns it has to be a practicing attorney. Even security: your chief security officer deals with breaches and incidents, right? They might not result in a privacy breach, but they deal with a similar incident response and whatnot.

If there's a potential incident, they go to their counsel and say, here's a potential incident, and now everything's covered by privilege. You do not have to have that person be the owner. And so I've had this thorn in my side for a while, especially since I'm trained as a lawyer, but I'm not practicing.

And I truly believe that privacy counsel is important, but not that your privacy officer must be one. And this is polarizing, right? There are plenty of people who might be listening to this who are lawyers that own the privacy program. I recognize there are many opinions on this. I have friends I have these conversations with all the time, and there's no ill will, but this is where I'm coming from.

And I think the reason privacy has not moved into engineering over the many years I've been doing this, almost 20, is that lawyers know their area very well. They understand risk. They might even understand governance, risk, and compliance. But the realm of tech, of how products and services are built and delivered, is not part of their base understanding, unless they became lawyers after working in that field.

We shouldn't expect that to be their world. They're looking at legal risks, contracts, and doing business with different companies, and that's a different set of risks. And I think it has resulted in us getting stuck in the GRC phase, unable to work out how to turn these business requirements into engineering requirements.

And so a lot of the inspiration for this show has been to advocate: hey, if you're in product, or an architect, a developer, in DevOps, a designer, not the traditional people we think of as owning privacy, then not only is privacy your responsibility and imperative, but this is how you can build an end product that satisfies it.

This is how you can be part of the solution. Let's architect this together. The other part is that privacy engineering is so new as a field. What is a privacy engineer, right? Ask different people and they'll give you different answers. And so what I was finding is that it's almost like that old Indian adage of the blind men and the elephant, different people holding a different part of the elephant.

Like, blindfolded: what is in front of me? Oh, it's a tail. Oh no, it's this ear, that's really big. Or, I'm touching this weird trunk thing. But you don't see the whole thing. For instance, someone working on a privacy-enhancing technology like homomorphic encryption is coming from the cryptography world; they're looking at completely different libraries and completely different approaches than somebody working on differential privacy techniques, which is a data science technique around statistics. They're even in different academic worlds, each looking at their own part of the elephant. So, I see this as a platform for me to surface interesting work being done to people who are interested in privacy engineering. Maybe they call themselves a privacy engineer.

Maybe they're just interested in the space. Maybe they just want to know what they should know outside their own area of expertise when it comes to privacy engineering. It's having conversations with people doing interesting work, cross-pollinating those areas, and peppering in moments where they go, oh, I hadn't thought about that before.

I'm going to go delve in and see how I can add this to my way of working. And we cover a lot of ground, because there's so much in bringing products and services to market. Here's what shifting left in privacy means. Instead of focusing only on the data life cycle, which is what privacy folks have traditionally done, right?

We're talking about the origination or collection, the inception, basically, of personal data, and managing that ethically and responsibly and governing it all the way through its life cycle, until you delete it or make it unrecoverable or whatever you do when it's retired. Well, that's all well and good, but what about the systems we put it in?

How do we know, for instance, that we can delete from that system appropriately? Well, you don't, until you build with privacy by design. And that's what I'm really focused on: how do we shift into the product and service development life cycle, and the software development life cycle?

That's the 'left' part, if you will. I know I've been talking for a bit, but I want to talk about what shifting security left was, because that's where shifting privacy left came from. It took a security concept and said: oh, this is working really well, this makes a lot of sense, why don't we do this for privacy?

So, shifting security left is a concept in software development that really came out of DevOps. It emphasizes integrating security measures and practices early in the development life cycle, instead of addressing them at the later stages, like during testing or after deployment. This approach is part of the broader DevSecOps movement, which aims to make security a shared responsibility throughout the development process. That has been working; it's been around for a few years as a mantra, as DevSecOps has become a thing, to help with

early integration, automation, and empowering developers: not slowing them down, but giving them the right brakes, brakes that let them feel comfortable going faster in their development, because now they have tools that give them confidence they're adding security, or in our case, privacy. You can also address costs more efficiently by remediating and fixing security issues early, and we want to do the same with privacy: fix those issues early.
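As a concrete, purely hypothetical sketch of the kind of automated 'brakes' described here, a pre-merge check might scan a code diff for newly added fields that look like personal data and trigger a privacy review. The field list and the idea of wiring this into CI are illustrative assumptions, not anything from the conversation:

```python
import re

# Hypothetical deny-list of field names that suggest personal data; a real
# policy would come from the organization's data classification standard.
PII_HINTS = re.compile(
    r"\b(ssn|email|phone|dob|birth|address|full_name)\b", re.IGNORECASE
)

def flag_pii_additions(diff_lines):
    """Return added lines of a unified diff that mention likely-PII fields,
    so a privacy review is triggered before the change is merged."""
    return [
        line for line in diff_lines
        if line.startswith("+")
        and not line.startswith("+++")  # skip the diff's file-header line
        and PII_HINTS.search(line)
    ]
```

A check like this would run in the CI pipeline alongside the security linters, so developers get the privacy signal at the same point where they already get security feedback.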

When we're talking about shifting privacy left, you first have to understand how privacy is different from security, right? I know, I've gone on for a while there again.

Kris Brown

And I go even simpler; I think you said it at the beginning of this. Being the product guy on the call, you know, with Anthony, it's making sure we don't get to the end, we're about to ship, and then a lawyer walks in the door and goes, oh, by the way, you're not going to ship that, for all of these reasons, right? There's efficiency in eliminating that cycle. There's confidence. As you said, I like that piece about the people who are part of the development understanding that they're doing these things in a particular way.

I really like what you said earlier, Debra, too, about giving individuals the understanding that this is people we're talking about, not users. It's people; we're protecting their data, removing them from harm. There's this greater-good element to all of this as well.

So, when we're talking about shifting left in general, and even in this case shifting privacy left, there's that key piece of: don't wait till the end. Understand, as we're working our way through, what we're doing with the data. Again, the simple one: can this system delete data, right?

Think of the number of legacy systems that, even in our world, we come across, where it's like, we don't have any way to eliminate data from this system. So you've got this whole, I won't say new industry, but this application retirement element now: I've got to retire that whole application because it's just not fit for purpose, but I need to keep the data.

So I'm moving it somewhere else so I can turn it off, and I'm hoping that the new system I move it to, not necessarily migrate it to, but move it to, allows me to control and manage and deal with the compliance aspects as well. If only we had, at the IT level and the operational technology level, a much better understanding of managing that personal data from the get-go. I really enjoyed what you said there. We're very lucky, certainly, to be speaking with you today; I love to hear those different opinions, even a different understanding of the same thing. Just the way you articulated it, I really enjoyed.

On your podcast, you've had a whole bunch of really interesting experts and guests. Can you share a couple of moments that surprised you? Times when you were talking to someone and thought, I didn't really expect to hear that, things that came out of the blue?

Debra Farber

Yeah, I can. A lot of the time I do a lot of research on who I'm going to bring on, because I'm always thinking about what I want the audience to take away. I don't look at my podcast as a play for as many listeners as possible.

I want people to really take something away, to become sticky listeners, because they hear things they can deploy in their practice, or think about and use themselves. So it's hard to say what's been surprising to me, but one thing is around synthetic data. You hear now that LLMs are really data hungry. They need to train on large data sets; that's the whole point of large language models. And they're kind of running out of places to train. And we're hearing about some, in my opinion maybe unethical, things about how companies are handling training sets.

So, all of a sudden you started hearing that synthetic data was going to be the answer to the fact that there isn't enough training data out there for LLMs. And that sounded interesting to me. So I had two PhDs on; well, one is in his PhD program now, but two PhDs who work with the company Monitaur.

Andrew Clark and Sid Mangalik from Monitaur are really, really great stewards and advocates, and very articulate about the benefits of synthetic data, but also about how many of its use cases are really bad ones, and how the way it's talked about as a panacea for training LLMs is not only false, but going to lead to some problems.

And so, what I learned there is that there are really only three good use cases for synthetic data. The first is supplemental data. You've got a lot of data, but you want to expand the data set a little by including some data outside the norm, so you can prevent some overfitting and have more data at the edges; maybe you throw in some edge cases so they're also included in the data set.

And the second is stress testing your systems, which NASA has been doing since the sixties. This isn't new, but most companies aren't stress testing their AI. You use it to make sure you have safe and performant systems, so that, for instance, if you have a data set about how something works in California, for California citizens, you can't necessarily take that model, stick it in North Dakota, and expect it to work for the people there.
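One simple, standard way to quantify exactly this kind of population mismatch, offered here as an illustrative sketch rather than anything discussed in the episode, is the population stability index, which compares the binned distribution a model was trained on against the one it would see in the new region:

```python
import math

def population_stability_index(expected_pct, actual_pct, eps=1e-6):
    """Population Stability Index between two binned distributions.

    Each argument is a list of bin proportions summing to ~1 (for example,
    the age distribution of the training population vs the new region).
    A common rule of thumb: PSI > 0.25 signals the populations differ
    enough that the model should be revalidated before reuse."""
    total = 0.0
    for e, a in zip(expected_pct, actual_pct):
        e = max(e, eps)  # guard against empty bins before taking the log
        a = max(a, eps)
        total += (a - e) * math.log(a / e)
    return total
```

Identical distributions give a PSI of zero, while heavily shifted bin weights push it well above the usual 0.25 warning threshold, flagging the model for revalidation.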

You've just got a completely different base of individuals there that it won't work for; it might not be safe, and it might not be very performant. And the third is differential privacy, a data science technique that adds noise to your data set so you can get answers to a question without being able to trace those answers back to any one individual.

So it kind of preserves the privacy of that information. It does require a lot of expertise at this point; you need to be a data scientist to make it happen. But people really like differential privacy, because it gives a mathematical guarantee that something is or is not differentially private.
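As a minimal textbook illustration of that guarantee (a standard example, not something from the episode), the classic Laplace mechanism answers a counting query by adding noise with scale 1/epsilon, since one individual can change a count by at most one:

```python
import math
import random

def dp_count(values, predicate, epsilon=1.0):
    """Answer a counting query with Laplace noise.

    A count has sensitivity 1 (adding or removing one person changes it by
    at most 1), so Laplace noise with scale 1/epsilon yields an
    epsilon-differentially-private answer."""
    true_count = sum(1 for v in values if predicate(v))
    # Sample Laplace(0, 1/epsilon) noise via the inverse CDF.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Smaller epsilon means more noise and a stronger privacy guarantee; real systems also have to track the cumulative privacy budget spent across many queries.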

But, I learned from them, synthetic data is unfortunately a really bad use case if what you want to learn about is the base distribution of the population itself, because if you keep training on synthetic data, data that looks like the original data, you end up overfitting your models. And if it's about people, a classifier or regressor trained mostly on person A just becomes a person-A classifier or regressor, right?

It's just like if you train something on 20 pictures of Debra, one picture of Kris, and one picture of Anthony: it's going to come out looking a lot like me, right? Because you overtrained it on too much of the same thing. And that could also be a privacy problem, right?

In that example, if it spits out something that looks like a real human being.

Kris Brown

That, and it shouldn't look like me anyway, right?  

Debra Farber

Yeah. So, I learned a lot. And there's this huge overlap with AI recently, where a lot of privacy lawyers, and privacy folks generally, are being asked to also look at ethical AI if they're doing privacy work; it's not a huge stretch. And so even the IAPP, the International Association of Privacy Professionals, has a new AI certification. Personally, I think it's pretty early to have something like that, but it's trying to pave the way to make privacy folks AI-capable, able to advise in the AI space.

And so, for me, that was something I learned from them that I thought was pretty interesting. Because if you listen to the marketplace, everyone's talking about how synthetic data is the way forward for LLMs, and I think a lot of that is just companies trying to get investment money.

A lot of the hype, a lot of the AI hype. That conversation brought it back down to earth and cut through the hype for me. The other interesting thing was learning from Jared Coseglia of TRU Staffing Partners, who has been watching the hiring market for privacy generally, though I had him on to talk more about privacy engineering.

He's been tracking the numbers and has empirical data on hiring for privacy, and how it really goes in two-year cyclical waves. Every two years, it's, oh no, let's say GDPR is coming down the pike, and every company needs to staff up right now; everyone's looking for a GDPR person, and everybody says they're a GDPR expert when it's only been around for a hot second, right?

And then there's this boom in hiring. And then, two years later, people are getting laid off. We're in that part of the cycle right now; look at tech, everyone's getting laid off, not just in privacy but in tech generally, for other reasons we won't go into, and that has been shaping things.

You still have the need within the organization to do privacy work, but you don't have the full-time resources anymore. So then they go to contract hires, and all of a sudden contract hiring is really big. Then that goes away, and everyone wants full-time hiring again.

So there's this cycle that, based on historical data, seems to run about every two years. And his tidbit, the takeaway, was that full-time hiring for privacy engineers won't really increase until around Q4 of this year. That was his prediction at the beginning of Q4 last year, which was hard to hear. Privacy engineers were gaining momentum, right? We're starting to see them all over Google, all over Meta.

Unfortunately, they're at a lot of the big tech companies where there have been a lot of privacy snafus; that's where all this hiring for them is. But by the same token, good, that's where there should be attention, right? So if they're laying off all those people, what happens to the momentum of the privacy engineering profession?

Are we going to slip backwards, where people move into other things? Maybe they go into AI or into security, and then when we need them again, they're not there to pull from. So that was interesting to hear as well. We have so many interesting guests, but those were two things that surprised me.

Kris Brown

Yeah, the cyclical thing is kind of interesting. You go to full-time hires, you get to the point where it's cost cutting, you drop out and go to contracts, and then finance looks down and sees these expensive contracts, because the people disappeared from the industry. And the thing is, well, there weren't a lot of jobs, so they had to go find something else. And tech people in general, you see this, morph a little into what's happening at the time; you've got other interests, and you move. So it builds up and then falls out, and builds up and falls out again. Until there's a level of maturity in the industry, it's difficult for people to stay.

Even in our space, we watch governance people almost fall into the industry. It's like, oh, I started over here and I've fallen into it. We see a lot of that at conferences: oh, I'm in data governance, and now I've picked up the privacy portfolio as well.

Right? There'll be plenty of listeners who'll say, well, yeah, I've fallen into this privacy space. They start, and then they get dragged up with it as it booms inside the organization. And then, the second they decide to specialize, you have this bust again, where it's like,

I might go back to my roots, or I might pick up something else, as you say, AI. I don't know that that's a bad thing. If you have a bunch of privacy professionals moving into other tech areas, I'm sure that'll help with the shifting-left element.

Debra Farber

I do think that. I just think it's really early in the AI space to say that you're an expert in it, even if you're a privacy person moving into it.

But hey, c'est la vie, it's the flavor of the week, and it's not going away. AI is definitely being rolled out. It's generative AI that I roll my eyes at a little bit, because I think we're all just experiments to the people rolling it out right now. Those base models should have gone through more rigorous testing before they were deployed the way they have been, in my opinion, the same way we do it in healthcare, right?

We don't just start experimenting on people, but that's what we're doing right now.  

Kris Brown

It's an interesting take on things.  

Anthony Woodward

Yeah. Going back to some of your podcast episodes, you've had a lot of conversation around the fundamentals of shifting left, things like the IEEE standards. And going back even a few years, people were talking about changes to the metaverse and how we would embed privacy in that, but we really haven't seen a lot of progress there. We have seen a lot of progress in shifting left, and in new development tools and techniques around how applications and websites collect data. But one of the things Kris and I talk about a little bit on this podcast is that we haven't really seen it fundamentally occur in the data itself.

So, I agree with your point on synthetics, although I'd take one slight difference on the reality of synthetic data, and even differential privacy. It's a strong mathematical technique, but it's still not perfect: it's still hackable, and it can still be reverse engineered with the right techniques.
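Anthony's caveat can be made concrete. Differential privacy's Laplace mechanism adds calibrated noise to a query result; the guarantee is statistical, not absolute, so risk comes from composing many releases or from implementation flaws rather than any single release. A minimal sketch (the epsilon, sensitivity, and count values here are arbitrary, illustrative choices):

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count with epsilon-differential privacy via the Laplace mechanism."""
    return true_count + laplace_noise(sensitivity / epsilon)

# One noisy release: close to the truth on average, but never exact.
noisy = dp_count(true_count=100, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the protection erodes as an attacker observes more releases over the same data, which is the kind of weakness the conversation alludes to.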

So, when do you think we're going to see that next step? What are you talking about and seeing in the world where we actually start to embed true privacy practices in the data itself? Because that's when the machine learning and the AI, that's when we're going to be able to really work with it: when we can tell it particular signals about what to use and what not to use, and decorate that data and respect those rules.

Debra Farber

Yeah, so I don't have answers to all things, but I will tell you that I did have a really great conversation. It wasn't on my podcast; I was actually a guest on a data mesh podcast. And I learned a lot from that conversation. It sent me down a rabbit hole of learning about this new architectural approach for creating data as a product within organizations, called data mesh.

Now, I don't feel like I'm prepared to talk about it any more than I just did, but we need more data architects talking about how we can create processes within organizations to embed these things. In that particular conversation, I was talking about what the potential privacy issues with the data mesh architecture might be, and I was prepared for that conversation at the time.

But I think we're going to start seeing more innovation coming from data teams about how we can consistently ship our data, curate our data, provide our data to other parts of the organization in a way that is repeatable and usable, and that also has privacy and security built in. Data mesh is just the only particular architecture I can think of, but actually, that's not entirely true.

I think there's a lot to be said for privacy via architecture, because the way you architect something provides those initial constraints. For example, if you prevent inputs into your LLM that are personal data, you can make it easy for the business to automate sending data to their LLM for training, because you've already architected it so that it does not send personal data there.
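A toy sketch of that kind of architectural gate, assuming a hypothetical `redact` step sitting in front of an LLM training pipeline. The patterns below are deliberately naive, invented for illustration; a real system would use data contracts, allowlists, or NER-based detection rather than a few regexes:

```python
import re

# Hypothetical illustration: strip obvious personal data before anything
# reaches an LLM training pipeline. Pattern names and coverage are assumptions.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace recognized personal-data patterns with placeholder tokens."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

redact("Contact jane@example.com or 555-867-5309")
# → "Contact [EMAIL] or [PHONE]"
```

The design point is Debra's: because the gate sits in the architecture rather than in a policy document, downstream teams can automate freely without each one re-deciding what counts as personal data.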

But I also want to point out that I'm seeing a lot of looking at what security has done in this space and how we can do it for privacy. And you're starting to see a lot of that with adversarial testing, everything from starting with threat modeling for privacy, which is really just thinking about who the threat actors are that would create privacy harms.

All you're doing is saying: well, a security harm is different from a privacy harm, so let's sit there and think about what the potential threat actors are, what the potential vulnerabilities are, and what that modeling looks like. And that's going to be different for different companies. I'm seeing a lot of that.

There are now privacy red teams at companies like Google and Meta. I hope to see that at a lot more companies. They're trying to find vulnerabilities that could potentially be exploited to create privacy harms, which are a different kind of harm. And I know Rebecca Balebako in the EU is working in that space; she's done a lot around privacy threat modeling and adversarial testing.

There are a few privacy threat model frameworks. In the EU, one of the universities, I forget which one, came up with the LINDDUN model. The LINDDUN model discusses several different privacy threat types. There's a lot of research that came out of that university, and it's now the leading privacy threat modeling approach out there. LINDDUN stands for the different threat types: linking, identifying, non-repudiation, detecting, data disclosure, unawareness or unintervenability, and non-compliance.

It goes into each of those and gives examples, and I highly recommend people take a look at it. There's also a woman in the EU, Isabel Barberá, who created PLOT4AI, the Privacy Library Of Threats for Artificial Intelligence, that's what PLOT4AI stands for, where she took the LINDDUN privacy threat model and right-sized it for AI.

She's written a really great, freely available paper on this, and it's the one I refer to when I'm looking at or thinking about threat modeling for AI. It's not like I'm currently applying it to things, but you're going to start seeing a lot more of these approaches that are very similar to security.

I sit on the advisory board, for instance, of Privado, and I'm mentioning Privado specifically for two reasons. One, they're the sponsor of my podcast, so thank you, Privado. Two, they do static code analysis for privacy. They're embedding in the engineering workflow: they're looking at the API calls, looking at the engineer's workflow, and surfacing risks within it so that engineers can address privacy issues and fix them before they ever ship to production.

I mean, that's huge. So, again, that is a privacy test, but it's following along with what security does. And so, engineering teams know what to do with that now. They're like, oh, we know what static code analysis is for security; this is it for privacy, great. We don't have something for that yet, but we know where it fits in.
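As a sketch of the idea (this is not how Privado actually works; the sink names and sensitive-field list below are invented for illustration), a privacy lint can walk a program's syntax tree and flag personal-data-looking identifiers flowing into outbound calls:

```python
import ast

# Hypothetical lists: field names that suggest personal data, and functions
# treated as outbound "sinks" (analytics, logging, tracking).
SENSITIVE_NAMES = {"email", "ssn", "phone", "dob", "address"}
SINKS = {"send_to_analytics", "log", "track"}

def find_privacy_risks(source: str) -> list:
    """Return (line, sink, field) triples where a sensitive name reaches a sink."""
    risks = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SINKS:
                for arg in ast.walk(node):
                    if isinstance(arg, (ast.Name, ast.Attribute)):
                        name = arg.id if isinstance(arg, ast.Name) else arg.attr
                        if name.lower() in SENSITIVE_NAMES:
                            risks.append((node.lineno, node.func.id, name))
    return risks

snippet = "send_to_analytics(user.email)\nsafe_call(user.id)"
find_privacy_risks(snippet)
# → [(1, "send_to_analytics", "email")]
```

This is the fit Debra describes: the check runs over source before it ships, in the same slot of the pipeline where security's static analysis already lives.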

I think we're going to see a lot more of privacy shifting left into where security has traditionally had tools and processes and techniques and approaches and frameworks, all that good stuff. Standards, too: hopefully we'll start seeing standards, but I expect that to take a lot longer, just by the nature of the collaboration you need for standards.

Anthony Woodward

I guess that does bring up an interesting topic when we start to talk about standards. There's a lot of technology we've talked about, a lot of different mathematical techniques around managing data, shifting left in the pipeline, and doing static analysis. But this really is a regulatory problem at the end of the day, right?

I mean, to loop this back to the legal elements and the statutory requirements: one of the big issues is that the legislation is going to take time to catch up, as you say. What's your prediction? Is this industry, the notion of privacy in this form, going to exist in 20 years? Will the legislation just catch up?

Will it be: okay, everybody now just drives on the left-hand side of the road, we're all good, everybody's sorted, off we go? Or is this going to be an ongoing battle?

Debra Farber

It is really interesting. I mean, I've looked at that in my own career, too, because I started out in law, right, and then I moved to D.C. and was working for IBM on government contracts for privacy, and things moved so slowly, so terribly slowly. And I think there are reasons we don't have a federal privacy law in the United States, and none of them have to do with privacy. They all have to do with things we can't control, like: will it open the floodgates of litigation?

Can you have a right to sue or not? So, it ends up being pro-business versus pro-consumer tensions. Does it cover the federal government, or just commercial entities? It's all these conversations, and it's now an election year, so people are going to grandstand and nothing's going to pass.

I think what we're eventually going to see is that you can't just legislate and expect this stuff to happen. I think it's now a market problem, right? If companies keep having privacy problems, it's at the point where consumers understand. This isn't just us surveilling them and them not getting cookies, you know; time has gone on.

They understand surveillance capitalism. They don't like it. They are voting with their feet on a lot of products. Other times they don't know, but they do get harmed. So, there's a lot of suspicion about products and services. And I do believe that, eventually, companies are going to win on trust.

Privacy is part of trust and safety, right? If you don't have trust and safety, consumers won't go there. The law is too slow. I mean, we need the law, but it's too slow to keep up with the current pace of innovation. And I think engineering teams do not like having to constantly change: like, wait a minute,

now we do this, now we do that. Now the lawyer says I need to do this; now the lawyer says I need to do that. Which is why we can no longer have just a compliance culture around privacy. As we know, even with security: if you do security right, compliance is a byproduct of that, right? But if you just aim for compliance, which very often ends up being just having a bunch of policies and procedures in place, that maybe are being followed, maybe are not, and certainly aren't embedded into the business, or into people's workflows, or the way of working, then we're not going to make any change, and everything's going to stay kind of like this: I don't know what to do.

Privacy is still a blocker. And so, the smart companies are realizing that if you address privacy up front with just good privacy practices, you will be on the road to all that compliance, but you'll also be ahead of where future regulations are going, because you're really addressing the fundamental issues.

For instance, issues of transparency or issues of accountability. It's like, well, then have a method and a process that makes people accountable, right? I do have a very positive outlook on the future of privacy. But I also know from my own history that I can see where privacy is going, and it still takes 10 times longer to get there.

And that is frustrating for me, because it's so clear where it's going and where it needs to go. But businesses have been kicking and screaming about investing in it. I think they're at the point now where they realize they have to invest in it; it's table stakes. That's a good thing, but at the same time, they don't necessarily know how to effectuate it, which is why, when you see layoffs, you're also seeing layoffs of privacy folks. Like, who's going to manage this now?

I don't know any inside information, but Google just laid off its Chief Privacy Officer. What is that about? He'd been there 13 years. There are things where I'm like, I don't understand, this makes no sense. And he was very well liked; everyone thought well of him. It's not that he was incapable of doing the job well.

Anthony Woodward

But it's interesting you say it's table stakes, because that's what we see apparently happening in the industry. When push comes to shove, I think this is being seen as more of a nice-to-have than a must-have in order to operate.

Debra Farber

I still do think that there are people who feel that way, especially investors. I mean, that's the problem I see: with all these startups that come out, there's no one in the investor community telling them that this is what you need, not only security, but you need to embed privacy into this. Otherwise,

your product's not going to be as good, or you're going to run into snafus. And truly, having lived in Silicon Valley for eight years, and now a little further north, near the Portland, Oregon area, I feel like that's a hole that's unplugged. There's very much an attitude among initial investors of: just get your product out there, get the traction, and we'll fix any privacy issues later.

We'll just bolt privacy on later, but get the traction, get the money, and then when we raise the Series A or the Series B, we'll cash in and let somebody else deal with the risk. We'll sell our Series A stock, whatever, make lots of money, someone else can deal with the risk then, and then we'll bring in a privacy officer.

And I'm not saying everyone needs a chief privacy officer when you're a three-person startup, but you can't not think of privacy as a table-stakes requirement when you're building a product or service that handles personal data. That is still going on today. That is a problem, and I don't know how to fix it.

I don't know if the SEC in the US, the Securities and Exchange Commission, needs to have a rule that you disclose not just security breaches but also potential privacy vulnerabilities, and that you need to get that vetted. I don't know. But that is where I see it: we're still bleeding. We still have a problem with ethics in our innovation, at least in the United States.

You know, I can't really comment on other countries.  

Anthony Woodward

I can comment on Australia. I could tell you it's the same here.  

Debra Farber

Yeah, incentives aren't aligned right now, right? Honestly, I believe in capitalism, but capitalism unchecked is sociopathy. If you don't care about the harm to people, and that's not a thing that's embedded into your products and services,

I don't really think that's ethical, or that you're really thinking about people. You're just thinking about how much money you can make. There needs to be something that reins that in, and I don't necessarily have the answers to all of that.

Anthony Woodward

I do think there's a real risk, though, with a lot of the focus still on the technology industries, because the largest risk is not actually going to be in technology.

It's going to be in healthcare providers, in insurers, in other places. And we're beginning to see the early signs of that. I think that's where society is waking up. It's actually not Google so much as it is other players that have data that is far more consequential.

Debra Farber

Oh yeah. There's definitely life-and-death data in healthcare, and there are also automated decisions being made about people, which is why the EU regulates automated decision-making about fundamental, sensitive issues.

It has to be vetted, and you have to have an extra privacy risk assessment around it, because of the nature of it and how it can negatively affect people if it's wrong. It's that constant tension: the EU has a rights-based way of approaching privacy, and the US has more of a consumer-protection-based way. Like, if you don't have an actual physical harm or something, then you don't need any sort of

Anthony Woodward

No harm, no foul, right? I guess,  

Kris Brown

But I think you mentioned this earlier, and it's a thing that we like here on FILED and at RecordPoint as well, which is the data trust element, Debra. To throw a bit of a spanner in this part of the conversation: is it us? And I'm not asking about the professionals; I mean us, the people. Again, we're pretty lax with it.

We sign up to all these tools, we want to use them for nothing, we're happy to give away our data, and then there's just an expectation that this stuff's not going to leak out. Is there an element that we need to be less trusting of organizations? And how do we change that mentality? Because I feel that I'm probably relatively prepared for this stuff.

I'm a technologist, and I've spent my career in products and services around technology. So, I'm on a VPN here, I do all of the right things, I have a password for everything, all of those pieces. But the second I get maybe two degrees of separation away from me and my family, the people I talk to are very ill-prepared in the way they interact on the internet.

They're sending credit card details to people over email, and it's like: you can't do those things and not expect some form of consequences. Is it education and trust that we have a problem with? It's almost like, it's big business, so therefore we should trust it.

It's big tech, so therefore we should trust it. And we've got almost this naive element of: it's fine, it's only my email address and my phone number over here, and it's only my address and my birth date over there. But the aggregation of those things is the technology piece that most people simply do not understand.

And I know it all comes back to: we've got to shift left, and the people building those things should own that. But I also think there needs to be a grain of salt sprinkled on the rest of the people. It's like, hey, you just can't play this way. You have to be very, very mindful.

Perhaps, pre-standards, is there a trust score that's needed? This is a new business; their score is going to be low just by virtue of the fact that they are new. That's your investment risk. The way to put this back on the investors is: well, their score is going to be low until they do something about it, and here are the five steps they need to take.

And if you've had a breach recently, well, here's the hit you're going to take on that trust score. I'm making up this brand-new Trustpilot version 2, or whatever it might be, as I go, but I think there's an element of: we, the people, need to do a little bit of a better job with how we're sprinkling this data out there.

And in some cases, the horse has already bolted, right?

Anthony Woodward

Kris has clearly just walked into a whole debate on the Fourth Amendment and the implications associated with it. Yeah,

Kris Brown

I know, but it's just like, as I said: the EU, it's rights-based; the US, it's very much about consumer protection.

And if there wasn't a foul, well, cool, go and make your money. But there's this element of: we need to do better at it, even ourselves. We're making it too easy.

Debra Farber

Yes. And you've really pointed out

Kris Brown

Another podcast, all of its own, right? Right. Yeah.  

Debra Farber

I don't know; I haven't thought about the exact way to fix the education problem.

It feels like a really big one. It feels like something we should be talking about in schools, educating kids on. I don't know what to do about older folks right now. Even my own mom, I can't get her to sit down and learn about the technology in a certain way. And she's actually a little afraid of it, right?

You know, maybe afraid of touching the wrong button, or "what am I doing?" or "what did I just agree to?" Right? So, there's definitely a lot of research being done on privacy policies and what people understand. It's impossible for people to read them. They're not even written in a way that's easy to understand.

They're written more as legalese, when they really should be clear and concise and informative about what companies are doing. There's sleight of hand in some of them. You have identified the problem, but I don't think we should blame the users. It's like the thinking in security: oh, you people are the weakest link.

It's like, yes, but that's not a mentality that's helpful in solving any problems. We need to understand that people's thought processes are not set up to be able to make these decisions, right? So, in some ways, businesses are exploiting that weakness by doing this. I do think it's a lot easier to regulate,

and say, don't do these things, or definitely do this thing, than it is to say that each person who comes to your website needs to have an understanding of what you're doing. I mean, that's really hard. How do you gauge that? But I do think you're right: there's no way to gauge one website against another, or one app. How do you know an app isn't, say, the Chinese government surreptitiously stealing things off your phone, right?

You don't. So, in that respect, I would think that for apps in an app store, it should be the app store's responsibility, as the one that hosts all those apps, to make sure there's some level of privacy and trust in there.

They don't want that responsibility, because if it's their responsibility, then they have to do due diligence; they have to invest a lot more money. But really, if you're looking at the community, if you want a robust, healthy economic system there, and I'm no economist, it seems that that is the perfect spot to do the most good, have the most influence, and make everyone the most safe.

So, I do think there are different areas, for apps at least, where we should be putting some sort of requirements and obligations on those who provide these platforms or spaces.

Kris Brown

I like the comment "do the most good," right? I think that actually puts me in my place in terms of the conversation, because it's like, yeah, where can we do the most good?

And education probably is in the too-hard basket. There are a lot better places where we could do the most good in the first instance. We shouldn't ignore it, but there are lots of places to do the most good.

Debra Farber

I'm also working with a company called Privaini; I'm officially on their advisory board. And to your point of how people could look at all these websites: is there a trust score?

What do you do? Well, they have brought to market a product that is basically an outside-in view of your web brands. They're looking at thousands of data points, everything from whether your website has been found on the dark web spilling data, to whether, if someone opts out of something on your site, it actually effectuates the opt-out.

This is being used by people to get full monitoring of their privacy posture, not just for their company but for their entire business network. So, if you're, I don't know, Walmart, and you have 10,000 vendors, and you don't know which vendors to look at to say which ones are the riskiest, you can use this. It's kind of like a security scorecard, if you're familiar with the kind of security company that monitors what's publicly available, whether it's your app, your website, or public financial statements. They bring it all into one view, give a trust score, and are able to say: hey, wait, here's the privacy policy of this vendor you use, or this partner of yours that's the front-facing partner for something you're doing.

Here are all of the public problems they have. And so, I'm hoping, personally, because I'm working with this company and I have some equity in them, but also because I really think it could shine a light on some of the problems out there, that this is kind of creating a FICO score for businesses, not for individuals, so you can understand which businesses have their ducks in a row, because you can't possibly do this yourself.

How would any individual be able to analyze a thousand data points about one company, right? But to do it across your entire business network so you can better manage risk? And hopefully this will be better for the individuals using these services, because regulators will be able to use these tools, too.

They'll be able to use these tools to find those who are non-compliant with the various laws and then go after them easily, in an automated way, right? So, hopefully that brings the industry along, and it also shines a light on areas companies thought they had in compliance but don't. It's not going to be a panacea for everything, but it is an ongoing monitoring tool that I'm hoping will spur widespread changes to privacy postures, which will benefit everybody, from a rising-tide-lifts-all-boats perspective.

Anthony Woodward

No, that's awesome. And look, there's so much ground we've covered today. It was a fantastic conversation, and I suspect we could continue on. But if we could leave the podcast with one thought: what do you think the next 12 months look like? We'd love to have you on again in 12 months.

We'll do a reassessment. What would your prediction be that we come back and loop back on?

Debra Farber

Wow. I think in the next 12 months we're going to see a lot of the AI hype cycle come back down to earth a little. I think there are going to be a lot of companies that were invested in that end up failing.

Dissipating, yeah. In 12 months, I'm looking at AI, right? Because in privacy, I'm not sure there's one thing I can point to. I'm hoping it's more mature, but what does that mean? I think AI is really the one thing I can cite. And I'm hoping we shake out a lot of the ridiculous claims about what generative AI can do, especially given the fact that it is always an estimation.

We can never have guaranteed truthful output from AI this way; it's always just looking at patterns and trying to figure things out based on patterns. It's great to use for privacy and security, finding anomalies, maybe even drug discovery, all these things. But in terms of generative AI, I think a lot of the hype will be taken out of it, and we'll see that there are really good use cases in certain areas.

And a lot of the rest of it was just a bunch of hot air. And I think maybe we'll be talking about what those good use cases are. It remains to be seen. That's the thing I'll be looking at, for sure.

Anthony Woodward

That's a great forecast. We now formally invite you to come back in 12 months. We'd love to review that and talk about it some more. It was really fantastic.

I appreciated it very much, and there's so much depth to get into. Thanks for listening. I'm Anthony Woodward. If you enjoyed this podcast, please give us a rating on your podcast platform of choice, and share this podcast on social media. There's a whole bunch of information out there, and we're trying to get to a whole bunch more people like you.

Debra Farber

Thanks for having me.  

Kris Brown

And thanks, Anthony and Debra. I'm Kris Brown. We'll see you next time on FILED.
