Kirsten Martin is a faculty member in Notre Dame’s Mendoza College of Business and a faculty affiliate of the Notre Dame Technology Ethics Center. Recognized nationally for her expertise in privacy, technology, and corporate responsibility, she joined the University this fall and holds the William P. and Hazel B. White Center Chair of Technology Ethics.
Kirsten’s research, including studies of online privacy and the ethics of data aggregation, has been funded by the National Science Foundation, and she serves as the technology and business ethics editor for the Journal of Business Ethics. Earlier this year, Columbia University Press published her co-authored book The Power of AND: Responsible Business Without Trade-Offs.
Kirsten and host Ted Fox started their conversation with what does and does not tend to bother people about their data being gathered through websites and apps—and how most of us don’t realize the extent to which that’s happening. They spent some time on her research on location data in particular.
From there, they talked about things we, as consumers, can do to at least mitigate the spread of our personal data, and why she’s skeptical of any argument from industry that contends governmental regulation would be too costly. They wrapped up with a question inspired by her new book—namely:
When it comes to business, who is a business in business for?
- Kirsten’s TEDx Talk: “It’s Not Their Story to Tell: Why Companies Should Respect Privacy Online”
- Kirsten’s Coauthored Book: The Power of AND: Responsible Business Without Trade-Offs
*Note: We do our best to make these transcripts as accurate as we can. That said, if you want to quote from one of our episodes, particularly the words of our guests, please listen to the audio whenever possible. Thanks.
Ted Fox 0:00
(voiceover) From the University of Notre Dame, this is With a Side of Knowledge. I'm your host, Ted Fox. Before the pandemic, we were the show that invited scholars, makers, and professionals out to brunch for informal conversations about their work. And we look forward to being that show again one day. But for now, we're recording remotely to maintain physical distancing. If you like what you hear, you can leave us a rating on Apple Podcasts or wherever you're listening. Thanks for stopping by.
Kirsten Martin is a faculty member in Notre Dame's Mendoza College of Business and a faculty affiliate of the Notre Dame Technology Ethics Center. Recognized nationally for her expertise in privacy, technology, and corporate responsibility, she joined the University this fall and holds the William P. and Hazel B. White Center Chair of Technology Ethics. Kirsten's research, including studies of online privacy and the ethics of data aggregation, has been funded by the National Science Foundation, and she serves as the technology and business ethics editor for the Journal of Business Ethics. Earlier this year, Columbia University Press published her coauthored book, The Power of AND: Responsible Business Without Trade-offs. Kirsten and I started our conversation with what does and does not tend to bother people about their data being gathered through websites and apps--and how most of us don't realize the extent to which that's happening. We spent some time on her research on location data in particular. From there, we talked about things we as consumers can do to at least mitigate the spread of our personal data and why she's skeptical of any argument from industry that contends governmental regulation would be too costly. We wrapped up with a question inspired by her new book--namely, when it comes to business, who is a business in business for? (end voiceover)
Kirsten Martin, welcome to With a Side of Knowledge.
Kirsten Martin 2:09
Thank you so much. It's great to be here.
Ted Fox 2:11
So you gave a talk a couple of years ago at TEDx Charlottesville titled "Why Companies Should Respect Our Privacy," and I'm going to share that talk in the episode notes so people can check it out. It's a great TED Talk. But in it, you reference "the secondary use of information [by businesses] outside of any bargain that we struck." And I'm wondering as we start here, what is that bargain that we've legitimately struck and can legitimately expect as citizens of an online world? And how has business been pushing the envelope in terms of their secondary uses of our personal information?
Kirsten Martin 2:48
That's a good question only because we don't often talk about--we always talk about what is not in the bargain, but we don't ever really talk about what is in the bargain, right? And so I think, I can say what consumers expect, based on what we've asked them in surveys or given like little vignettes and asked them their expectations. They readily expect websites, apps, to take their data, gather it, and use it to improve their products and services, recommend products like on Amazon, take it for product development--anything that improves the service for the consumer or for even just, selfishly, the company. They kind of expect it to be in a closed bubble, if that makes sense, within that first exchange. And so they do expect some of their data to be gathered and used.
What appears to be well outside is either what data is being collected--so they expect their activities on that website and probably demographic data that they've handed over to be gathered--so what data is being collected in addition to then who gets access to it and what they do with it. So those are the two kind of areas where companies seem to be pushing the boundaries of what people expect online. And around the "what" data would be things like gathering your contacts; gathering your location data; you know, tracking you when you're not on the app or on the website; getting your phone numbers of your contacts; getting your outside information, so maybe purchasing it from a data aggregator about what your last purchases were; get your grocery data. You know, all that type of data that's out there, your public records, they can find out what your house is worth, they can estimate your income, you know, they're going to get all that type of data and then pull it in together. And people don't expect that, even if it's in public records. We did a study around public records, and people were fine with the records being public, like marriage, court records, residential--like, so house ownership records--and they were okay with, like, someone being able to go down and open up the file. But they really got upset when it was, like, in the hands of a data aggregator, when it became slippery and was able to run around. And so that's a type of data.
And then what they don't realize is how many trackers are on the websites and the apps, and how porous the website and the app is to giving the data over to third parties. And right now we really don't hold those consumer-facing companies, the websites that we're engaging with or the apps that we're engaging with, they're not really held responsible for who they give access to us. You know, so they have these third parties that they almost give a window and say, Look, you can track them all you want, and you can give me some money, and I'll let you get their data. And right now, we don't really hold those first parties responsible for that. So the third parties are the ones that kind of are peering in or peeking in and watching what we're doing. And then there's no limitations on what they do with that data. So they can target us for diabetes medicine later on, they can, you know, give us ads that we don't realize that we might want, they can use it for political purposes. There's all sorts of stuff that they can use.
Ted Fox 5:48
So when we talk about the trackers or the cookies or the beacons, are those things then--I guess my question is, how do those work? Are those put there by a third-party aggregator? Basically, they're working with the forward-facing company to say, Hey, we want to place these trackers so they're on your site, and then that forward-facing company is saying, Okay, yeah, we get X amount of dollars for allowing you to do this. Is that basically how that transaction works?
Kirsten Martin 6:16
Yeah, so some trackers are there to, like, place ads, so they both place the ad and then they--or feed a video. So like, they're actually behind the mechanism of, like, if you go on, I'm making [this up], Wall Street Journal or Washington Post, and there's a video embedded in there that's associated with the Wall Street Journal or The Washington Post, there's usually an ad that runs first. So those types of trackers are sometimes there in order to see who you are, and whether or not you belong there, and then feed you the video or the ad with the ad that you should be getting. So they are paying for that service, to be able to serve you the ad, and the website is actually getting money from those companies to be able to have the trackers on that website. Sometimes those trackers are the same as the data aggregators. Sometimes the trackers are there and then sell the data to the data aggregators.
So that is actually one of the complicating factors around being online. It's not--we tend to focus on, like, Facebook, which is almost a closed system of gathering your data, storing it, and then using it for placement or our newsfeed or whatever it might be, ad placement. But the vast majority of your data on the internet is actually not in a closed system. It's lots and lots of companies. So you have these trackers and these beacons who then pass the information on to a data aggregator or a data broker, and there's actually like a backend--like, they're sharing data so that they each can sell the data and actually get a better picture of who you are. So they might sell it to someone that specializes in income, you could have another data aggregator that really specializes in employment, right? They don't really compete in the same market, so they buy and sell each other's data, and then they pass it on to an ad network. So it's actually not like one company. The trackers aren't the evil genies of the, you know, of the ad network world. I mean, they're one step of it, if that makes sense.
But it's actually like a chain of events, almost like a supply chain, that makes it a lot more complicated. And it makes it more complicated to govern, you know? One of the issues is that if you had only one company that was doing everything, just like if you had one company that was doing all the pollution back in the '70s, it was really easy to just focus on U.S. Steel and Pittsburgh and say, You've got to stop polluting the rivers. But it's not the way it is right now with data online. There's a lot of different companies out there.
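A rough way to picture the multi-party chain described here--trackers on a first-party site feeding aggregators that trade partial profiles and sell on to an ad network--is a toy sketch like the following. Every name and field is invented for illustration; no real company, dataset, or API is being described.

```python
# Toy model of the data "supply chain" described above: a third-party
# tracker embedded on a first-party site logs an event, aggregators each
# hold a partial profile (one specializing in income, one in employment),
# they trade data because they sell in different markets, and an ad
# network ends up with a fuller picture than any single party collected.

def observe(tracker_events, visitor_id, site, action):
    """A third-party tracker embedded on a first-party site logs an event."""
    tracker_events.append({"visitor": visitor_id, "site": site, "action": action})

def merge_profiles(*partial_profiles):
    """Aggregators combine partial profiles keyed on the same visitor."""
    merged = {}
    for profile in partial_profiles:
        merged.update(profile)
    return merged

# One hypothetical broker specializes in income, another in employment.
income_broker = {"visitor": "abc123", "est_income": "75-100k"}
employment_broker = {"visitor": "abc123", "employer_type": "university"}

events = []
observe(events, "abc123", "news-site.example", "read diabetes article")

# The ad network buys from both brokers and the tracker, and can now
# target on a combination no single party observed on its own.
ad_profile = merge_profiles(income_broker, employment_broker,
                            {"recent_content": events[0]["action"]})
```

The point of the sketch is only that the final profile contains fields no single party collected, which is why regulating any one link in the chain is hard.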
Ted Fox 8:40
I know you mentioned a couple of minutes ago the idea that in terms of that bargain, we as consumers, there are certain things that we are okay with, or at least subconsciously I guess we're okay with. And I know you had another recent study in the Berkeley Technology Law Journal with a colleague from Cornell Tech about how consumers view the tracking of their location specifically, and there were some interesting findings there. Can you run us through kind of when people thought it was okay, and when it started not being okay?
Kirsten Martin 9:13
Yeah. That was Helen Nissenbaum out of Cornell, and it was the third in like a series of studies we did around public data, and that was our initial--going in is like, well, what do people, even when something's public, and I'm putting this in air quotes, what are their privacy expectations? Our understanding and our theorizing--she has separate theorizing around contextual integrity--around the fact that we always have privacy expectations, even if something is quote-unquote public or released in the wild, kind of given to a third party. So we first have the public records. And then we looked at other public data that was out there like sensitive information, and then we came up--like, we were waiting to do location, and so we did this location data study. There were a couple of interesting findings.
First, we had to decide, how do you even talk about location data? What does that mean when you say "location data"? Does it mean GPS coordinates? Does it mean the street address? Do people think it means just the city that I'm in--like, I'm in Berkeley, California, or I'm in South Bend, Indiana? And it turns out, so we had to first do a study on that and just see, when we change the way we describe location, does it matter to people? If we just say location data, or if we in the same little story, if we use the words "GPS location," or if we use "street address," "city," you know, kind of like the block that we're on. And it turns out, when you say location data, people get that that's GPS location.
They understand that it's actually pretty specific, and they're not sugarcoating it. So those were interchangeable, "GPS" and "location data." And so that let us go to the next part of the study, which was to try to see, does it matter who collects the data, how it's collected, like what technology is being used? Because it's common for companies to ask, Can I collect your location data? And for someone to say, No, you can't. And then what they do is they get it in another way. So they won't track you directly using the app. But they'll get it from a data aggregator or use your Bluetooth or use it based on where you say you are for Twitter, so there's all sorts of other ways that they actually get the location data. So it was important for us to identify if the technology used [mattered]--facial recognition, license plate reader, I'm trying to think of the other ones--Bluetooth, GPS location, kind of where you said you were on social media, phone tracking through like triangulation of the cell phone data. Those are the major ones that I can think of, but that kind of ran the breadth of how people's location could be actually gathered. So who, how it was collected, and then what it was used for--where did we locate you? So what place did we locate you at?
And so the interesting thing, people were actually fine if you just said, Location was being gathered by almost any technology, and even if you said it was the FBI or your family or a commercial data aggregator, they were much more okay with that when they didn't know how it was going to be used. So when you just ask someone, Do you mind if I collect your location, they don't think about that you're going to be able to place them at a liquor store or at a protest rally or at work or at home--like, they're not thinking about that, they're thinking about whether or not that piece of data should be gathered by you. So we first ran it without telling them what we were going to figure out from the location data. And then we added in the place, so on the National Mall, at a restaurant, at work, at home, at a liquor store--I'm trying to think of where else we located them, those are the main ones where we--oh, voting.
So we added in a voting booth. And then, to run it like a fourth time, we added on another statement that actually linked back to the place to say, This is what we could figure out about you. So for the restaurant, for example, we could figure out who your friends were with you. Or when you're at the National Mall, we could figure out if you were at a protest or not. If you were at the liquor store, we could figure out, like, what your drinking habits were. So we added on kind of what you could infer about the person based on the place. And every level of detail that we added on--the place and then adding the inferring information, the inferences drawn--it just like, if you could imagine, the degree that it was okay just kind of plummeted down. It [was] just kind of in this direct graph down of people becoming more and more upset. I'd say across the board, commercial data aggregators were never okay to gather the data, which is interesting because they're probably the worst offenders--like, they're gathering much more data than the FBI is on a regular basis of all of us. And so it was interesting that that was the least okay, and consistent. They did not care what the commercial data aggregator did with the data, why it was being gathered, what they could infer--they just didn't like it. It was just kind of that actor being in there.
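The design being described is a factorial vignette survey: respondents rate short scenarios built by crossing who collects the location, how it is collected, where you were placed, and what could be inferred. A minimal sketch of generating such vignettes follows; the factor levels are paraphrased from the conversation, not the study's exact instrument.

```python
# Sketch of a factorial vignette design: cross every level of every
# factor to produce the scenarios respondents would rate. Wording and
# levels here are illustrative paraphrases of the study described above.
from itertools import product

actors = ["the FBI", "your family", "a commercial data aggregator", "your employer"]
methods = ["GPS", "Bluetooth", "facial recognition", "a license plate reader",
           "cell-tower triangulation", "a social media check-in"]
places = ["at home", "at work", "at a restaurant", "at a liquor store",
          "on the National Mall", "at a voting booth"]
inferences = ["who your friends are", "your drinking habits",
              "whether you attended a protest", "how you voted"]

def vignette(actor, method, place, inference):
    return (f"{actor} used {method} to determine you were {place}, "
            f"from which they could infer {inference}.")

# Each respondent would see a sample of these, rating each scenario.
vignettes = [vignette(*combo) for combo in product(actors, methods, places, inferences)]
```

Crossing the factors this way is what lets the researchers separate the effect of the actor, the collection technology, the place, and the inference on people's privacy judgments.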
They were pretty nuanced if work was trying to figure out if you were at work, your employer was trying to figure out if you were at work, it was more okay; it was still just barely positive, but it was definitely more okay than a commercial data aggregator trying to figure out if you were at a protest rally or not. And one of the more interesting things for me, at least--we did this study, so it was just published, but we probably designed it in like 2017, it was right after the 2016 elections. And two things had come up from that--I mean, lots of things came up from that election--but two things came up specifically around data. And one was, people were able to be tracked at polling places; there was a story about that in the news. And so we added voting in there, whether or not you were voting in that. And the other thing that was coming out were protests and that all sorts of people were attending protests who had never attended protests before. So we added both of those in. Both protected under the First Amendment, right? Or they're both protected acts, I should say, in the Constitution.
So what was interesting was that people, not even knowing that, like, definitely thought it was less okay if people were trying to be placed at voting or a protest. Like, that seemed to be a protected place for them that they didn't care who the actor was, they did not want to be tracked if the inferences drawn were around voting or protesting. Which I just thought was interesting, which tracks with what you would expect if you're, like, a law scholar. (both laugh) But it doesn't always--[it's] not always [that] what law scholars think is what people think, too, right? So it doesn't always kind of go together. So that was another interesting thing that people instinctually understood that those should have been protected places and should not be tracked at those locations.
Ted Fox 15:28
So I know from having watched your TEDx talk that you don't advocate that we all withdraw from society ...
Kirsten Martin 15:36
(laughing) Oh no, yeah.
Ted Fox 15:37
Or even all hunker down from the internet. But are there any--and this may come up a little further in our conversation talking about [the] role of businesses and lawmakers--but is there anything as an end-user consumer, anything tangible that we can do to, I'm sure there's no way to completely stop it, but mitigate some of the things that are being collected about us?
Kirsten Martin 15:59
You know, one is, going back to the initial one, which is like these certain groups like Facebook that collect a lot of data about you, but then it's used within that closed space--like, I'm not on Facebook.
Ted Fox 16:09
Me neither. (laughs)
Kirsten Martin 16:10
And none of my kids are, either. I actually don't know very many teenagers that are on Facebook; it's almost like the MySpace of this generation, or aol.com.
Like, they think that this is an old person's thing, and they don't want any part of it. But there's other things like that, you know, you just have to be wary about Instagram and those types of social media just in general that's gathering all this data about you and what they're going to do with it. So that's one place to look. I use Ghostery--ghost-ery, almost like Casper the Friendly Ghost. And that blocks a lot of trackers on your website so that it's unable to track you as you're going around the internet. So it's almost like--it's not like being in private browsing, but it's very, very similar. And so it's actually an extension that you put on your browser. I find that it actually does mess up sometimes when you're trying to watch a video, so if there's, you know, a video on The Washington Post that I wanted to see, I might have to turn it off for a moment and then reload the page. But to me, it's worth just putting a speed bump in how much data people can collect about me. I do get messages from companies that will say, We would like it if you turned off your ad blocker, but you usually can find a way to say, No thanks, I'm just going to continue on. It's just they have a design where it's like a big button that says okay, and then in like gray, faded font on the left, in small font (laughs), it says, No thanks. Which is part of the nudging problem of online. I never say okay, you know, because my answer even though they can't hear me is, Then stop tracking me. It's really simple. Those are easy, like in your room, you can do that, those aren't really hard to do at all.
The other thing is, in a perfect world, we'd be advocating for it. But that's really, really hard because advocating for these types of rules and regulations, unless they are coming up--I mean, if you're in Wisconsin, and they have a facial recognition bill, realize that's really important, and so vote based on it. California has a new privacy rights [law]. Those state-level things really work, you know, they set the tone for what companies have to do, and realize that it's important, especially when the data gets in the hands of third parties. And if possible, you know, advocate as much as possible when these things come up. You know, that means calling congresspeople, calling state representatives, that type of thing. But I have to say I'm pretty realistic. Advocacy is a really hard thing to get people to do because it takes time. And usually it has to be something where, I mean, there's a lot of theory around public policy, but one of the theories around public policy is that people do it when it's easy and when the cost of the regulation is really strong. And I don't think most people understand what's being done with their data. So it's not exactly realistic for me to think that people are going to go and call their senator and say, Where's my privacy bill?
And it's because they're busy, and they have lots of stuff to do, so this is just not one of them. And they have a lot of other things they have to worry about, like a pandemic. So I mean, the two big ones are watching social media that you're on, and then second, look at anti-tracking devices. Not anti-ad--anti-ads are one thing. But the things that are ad blockers still track you. And so the ads--you not seeing the ads is not really, I'm not as concerned about the ads being placed for you. Like, the question is whether or not you have tracking devices tracking you as you're online. And the name of it is Ghostery, it's like a purple ghost.
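At its core, a tracker blocker of the kind mentioned here works by checking each outgoing request against a list of known tracker domains and cancelling the ones that match. A highly simplified sketch follows; the domains are invented, and this is not Ghostery's actual implementation, which uses large curated lists and more sophisticated rules.

```python
# Minimal sketch of a tracker-blocking decision: block a request if its
# host is a known tracker domain or a subdomain of one. The blocklist
# entries here are made-up placeholder domains.
from urllib.parse import urlparse

BLOCKLIST = {"tracker.example", "beacon.example", "ads.example"}

def should_block(url):
    host = urlparse(url).hostname or ""
    # Match the domain itself and any subdomain of it.
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)
```

This is also why such extensions sometimes break embedded videos, as described above: the video or its preceding ad is served through a domain on the blocklist, so the request never goes out.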
Ted Fox 19:36
So I wanted to pivot a little bit here now because I know you just had--I mean, it's all related--but kind of looking at it from the other side now, from the side of business. I mean, you're a professor in the Mendoza College of Business. You have a new book out as of this June with a couple of coauthors from the University of Virginia on "The Power of AND."
And there is a tendency in the business world--and I'm talking here very intentionally in very general terms, I'm not looking to point a finger at any one company--but as I've seen you note, there's a tendency with issues like this to say, Well, you can't regulate it because it's going to cost us too much business. What is your counter to that?
Kirsten Martin 20:19
They always say that.
I mean, that's honestly, that's just what they say. I mean, I always tell people who aren't in business schools that you just have to ignore them saying that and just keep advocating the way that you need to advocate because business will always say that. They said--I mean, there's two good examples. One is pollution, right? I mean, the language around the EPA was very similar to any federal legislation around privacy--[for example], The industry is so big, you don't understand how many jobs are going to be lost. They'll tell you the size of the industry by billions of dollars, that they won't be able to compete in the global marketplace if you handcuff us by putting EPA regulations on us. I mean, it was all the same. And, you know, they figured it out. They fought, we had to criminally prosecute a few companies to get them to comply with the EPA back in the '70s, and '80s, and then they started to comply. And now it's--I mean, no one remembers it because now it's considered a competitive advantage to have, you know, good environmental practices because they realize that the consumer actually cares about it. And it's not only the consumer. There's like three big markets for companies. One is consumers--this is markets as in, they have to compete for resources--consumers, employees, and stock, like investment. So those are like the three, I'd say, sensitive points of pressure to put on a company. And they realized not only do consumers care about environmental practices, but employees do, they want to go work for a company that has good environmental practices. And then finally, the stockholders ended up caring about it, and the investors, because it ended up being a costly endeavor if you ever annoyed the consumers and the employees. So I'm not saying that stockholders were on the cutting edge of this, but they ended up being like-- they're risk-based, right? So you need to curtail this bad behavior.
The other interesting example is back before the SEC was stood up, and the fighting around buying and selling of stocks, and that did not need to be regulated, and if you regulated us, the stock market's going to crash and, like, nothing's gonna work, and you're going to slow us down and all this other stuff. Now we take the SEC as completely a given. But that's not how it was perceived at the time. So every time we make a big inroads to say, Look--labor laws, You can't, there's no way we can pack 1,000 people in a small garment industry. You know what I mean? Like, and of course, there's going to be fires, but we can't do anything about it. So there's a long history of the industry, especially industry mouthpieces--it's not always the individual companies-- the industry mouthpieces saying, If you regulate us, the industry is going to die. And I've not seen that. You know what I mean? Like, just history does not suggest that. Because businesses are really smart. And they take, once the regulation is a given, they work around it, and they innovate, and they compete, and they figure out something to do with it. So I mean, think about the car--I could go on and on--but think about the car companies with Ralph Nader, you know, with "Unsafe at Any Speed," and how hard they fought any safety. They said the same thing, you know, Oh there's no way we can handle, no, it's too expensive, we can't do seatbelts, mandatory seatbelts. You know, they said the exact same thing, and now we have lots of safety. You know, and we have some company like Volvo that actually competes on it. Anyway, there's lots of evidence of companies saying that, and then they figure out a way around it. And I almost, I really do see the issue of data flowing so freely to these third parties that have no interest in line with mine, right? Like, there's no need for them to not harm me with that data. That's just such a strange anomaly. You know, that's just not normal. 
We haven't had that before, where companies were able to take our data and then use it against us. Usually, we have to trust somebody before we would give them our data. And so I just have to believe that there's going to be a course correction, similar to the steel industry, you know, the car industry, you go back to the stock exchange back in the early 1900s. You know, there has always been course corrections when things have gotten out of control.
Ted Fox 24:11
It's almost like the internet--this is not a profound thing to say at all--but the internet was such a disruption and changed things so dramatically, so quickly, that now we have this scenario like you're talking about where one of those things, too, that makes it so hard to regulate or to get people energized around is, if I was saying, you know, Okay, car companies should have seatbelts. Well, I can see a car, and I can understand kind of, intellectually, Well, maybe I need to be restrained when I'm in that car so I don't get hurt. But when there's all these levels in this chain, and they're all kind of unseen, it is a lot harder to, once you move beyond that forward-facing company, to think about all these actors kind of behind the scenes. It really is kind of a hard thing to even put your finger on.
Kirsten Martin 24:53
Oh, I totally agree. And I think that is the complicating factor because everything that I just mentioned to you, you'd be regulating, like, individuals or single companies. Pollution, you know, labor standards, cars and the development of cars--there's an easy point to focus on about who's the wrongdoer in this. And who do we need to make sure has the safety and security of the consumer in mind--you know, we could focus on that. And so it does make it more complicated. But to that point, so one of my more recent papers is recommending that we actually hold consumer-facing companies responsible for the third parties that they bring in. So, like, to think of a university that brings in companies to recruit students, you know, they hold the students as a captive audience. It's based on the trust of the university that the students are there, they trust career services, and then career services in the university bring in these third parties to interview and look at students. But we hold the university responsible for who they bring in; they are expected to check the person or the company, make sure their labor standards are fine. You know, universities get in a lot of trouble when there was a lot of unpaid internships and they get mad about it.
So in a similar way, we'd hold consumer-facing companies responsible for those third-party trackers. I do think that's one of the points that we will start doing. I think that they'll start to have to justify, Why do you let that third-party tracker in to track me? Because once you say it out loud, that doesn't make sense that we're not holding The New York Times and The Washington Post responsible for the third parties they invite in. They're almost like what they term a honeypot--they're the sweet nectar that we go after, Washington Post and New York Times, and then they invite in these third parties to track us. You know, the way the internet works is, based on how long internet companies can keep us looking, that's how long the trackers can see us. So in that way, we should really hold those consumer-facing companies more responsible for who they allow to track us but then also target us later. So I do actually, I think that that's where it'll go next. To your point about needing a point, a single point of responsibility to say, Who's responsible for all this? I think it'll end up being those consumer-facing companies.
Ted Fox 27:03
And I'm glad you made the point earlier, too, that it's not that you're anti-ad. Because I mean, and I know in the TEDx talk, you talked about how another part of that bargain that we've made is--New York Times and Washington Post, not the best examples in this because they're not free content, they are still subscription services, but it is certainly much cheaper to access it all online. And part of the trade-off is, well, we have ads served to us because we have so much more access to content that in the past, you either had to pay $2 for the physical copy of the newspaper, and they're businesses, and they need to survive. But it certainly seems like there is a middle ground there in between tracking your every movement in order to micro-target you and just kind of the base ability to be able to serve you ads. Because those ads, like you said, it's a honeypot of the number of people looking at those pages, those ads are going to be, they have a certain value no matter what just because of the amount of traffic going there.
Kirsten Martin 28:01
Right, you know, and the interesting thing about it is it's not actually clear who benefits from the current system. What I mean by that is, Catherine Tucker, who's out at MIT, and her colleagues have done work on how much targeting actually benefits click-through rates or actual purchases. And there's a really small window in which hyper-targeting is actually useful; too early or too late in the process, it's actually harmful. So it's not necessarily beneficial for the company whose ad is being placed. And it's not necessarily helpful for the consumer-facing website, which has to pay more to have it placed. Another person, Alessandro Acquisti, who's at Carnegie Mellon, did a study where he tried to look at, well, who's benefiting from this current system of hyper-targeting? So the hyper-targeting of ads, but also this collection of data aggregators. And it turns out it's these hidden third parties--the data aggregators and the ad networks. They benefit--they're able to charge more, they have an industry--but they don't feel any of the negativity of either the consumer being upset or the company's product not being purchased. That's not actually an issue they face. And so it's interesting to always ask, Who's benefiting from this? And it turns out it's just these hidden ad networks and data aggregators. It's almost like they're self-perpetuating this--like, We have to do this, we can't change any other way. And you're kind of like, Well yeah, of course you're going to say that. You have a hundred-billion-dollar industry; no doubt you're going to say that.
But it turns out it might not actually be any more beneficial to target me personally versus just knowing something general about me, like my age, and targeting me that way. And I do think, as a side note, we just haven't put in these types of regulations. But if you were to say, Look, we're not going to allow the data to be in the hands of these third-party data aggregators, I actually believe that computer scientists and engineers would figure out a way to place pretty good ads without the hyper-targeted use of our data. They would figure out a way to keep the data hidden and only access it to place the ad, but never actually hold the data, if that makes sense. They already can do that for other things, like running statistics: you keep the data in the organization's hands--say a hospital's--and run queries off of it remotely, even combining data sets, without ever holding onto the data. So there's a way to keep privacy and still use the data. In other areas of our life, we've figured this out--when there are rules and regulations about maintaining the privacy and security of data we still want to use, we've figured out ways around it, even technical solutions. And so I do think that if we actually had these types of rules, we'd have better ads, even while maintaining privacy.
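The remote-query pattern Kirsten describes--an organization using its data, or letting others use it, without the data ever leaving its hands--can be sketched in a few lines. This is a minimal illustrative toy, not any real system or library; the names `DataHolder`, `Record`, and `count_in_age_range` are invented for the example.

```python
# Sketch of the pattern described above: the data holder (say, a
# hospital or a publisher) keeps the raw records internally and
# answers only aggregate queries, so an outside party (e.g., an ad
# placer) never receives user-level data.

from dataclasses import dataclass


@dataclass
class Record:
    user_id: str
    age: int


class DataHolder:
    """Holds raw records; exposes only aggregate query results."""

    def __init__(self, records):
        self._records = records  # never returned to callers

    def count_in_age_range(self, lo, hi):
        # Only a count leaves the organization, not the records.
        return sum(lo <= r.age <= hi for r in self._records)


# An outside party can size an audience (e.g., for a general,
# age-bracket ad) without ever holding the underlying data.
holder = DataHolder([Record("a", 17), Record("b", 34), Record("c", 41)])
audience_size = holder.count_in_age_range(25, 45)
print(audience_size)  # prints 2
```

Real privacy-preserving systems go much further (query auditing, differential privacy, secure multi-party computation), but the design choice is the same one Kirsten points to: the query travels to the data, and only the answer travels back.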
Ted Fox 30:51
The full title of the book is "The Power of AND: Responsible Business Without Trade-Offs." And in it, you and your coauthors propose a new model of business based on five key concepts. We don't need to run through all five of them here, but one that stood out to me--and you've mentioned this, but I just want to circle back to it--is this idea of creating value for stakeholders as well as shareholders. Because so often, even from an outsider's perspective, I might see those two words, stakeholder and shareholder, and they might kind of blend into the same thing to me.
Kirsten Martin 31:24
Ted Fox 31:24
I'm not even talking about, What is the purpose of a business? So when you're talking about that and asking how a business should be oriented towards the world, what are you all getting at there?
Kirsten Martin 31:38
There's a tension in both business and law around, for whom does the organization exist? And it's really clear at the end of an organization's life who's primary: if you're in bankruptcy, or if you're being taken over--if in any way the firm is not going to exist any longer--at that point, the people to whom you owe money and the stockholders are king. They're the ones you need to worry about the most, because they get the residual value of the firm. Whatever is left over after you pay your employees and your suppliers, they get the chunk of money that's left at the end, because they're actually the last paid. So it's kind of odd--they're the last paid, so in that sense they're really not the most important, and yet at the end they're the ones who come first.
Ted Fox 32:24
Kirsten Martin 32:25
But there is a principle in the law that at the dissolution of the firm, we need to worry about the executives. Because you could imagine the executives saying, Well, let me just pay myself a whole bunch of bonuses, pay my friends a bunch of bonuses, pay my suppliers dollar-for-dollar what they're owed, and then the stockholders get nothing. So the stockholders rightly say, Look, you can't do that; you have to act in our interest as well. But that's at the end of the firm's life.
It's still an open question how the firm should be run when it's an ongoing entity trying to create value. And usually, the vast majority of people think the long-term interest of the firm is what you need to have as your mission, your guiding principle. You're supposed to be an ongoing entity, to keep going, to be healthy in that way. And there are two different ideas around this. One is that if you maximize value for your shareholders, that's in the long-term interest of the firm. But that's not how it works in practice. I think people who advocate the shareholder-wealth-maximization argument conflate it with the long-term interests of the firm. But there are lots of issues--say, teenage vaping. That's very good for shareholder wealth maximization: you could get teenagers addicted to your product, and, at least back in the day, there were no laws against it. You can see why the tobacco companies went down that route. Or addictive games online. You know what I mean? Any way you could harm a stakeholder that's nonetheless in the long-term interest of the shareholder is a tension that becomes problematic if you talk about maximizing shareholder wealth. We argue that in the long term, it's actually better for the firm to think about all the stakeholders, and that would include the users--the people who are hurt or harmed by your product or technology. And I think we increasingly have to talk about even marginalized groups--think about content moderation at technology companies. It's really hard to talk about shareholder wealth maximization when you're taking down revenge porn. How do you use shareholder wealth maximization when the person being harmed by the content isn't even a user of your platform? They're not even always considered a stakeholder.
I mean, they are a stakeholder, they're influenced by your actions, and they should influence you, but shareholder wealth maximization as an ongoing operating principle is really narrow and myopic. You know what I mean?
Ted Fox 34:52
Kirsten Martin 34:52
And it's really short-term. So our argument is that it's best for the long-term interests of the firm if, on any given decision, you actually think about, Who are the stakeholders to this decision? And how do I best maximize the value for all of those stakeholders, including the firm? So you're not doing Kumbaya, and you're not being altruistic, where you only worry about others. Your employees are stakeholders. The firm itself is a stakeholder, along with suppliers and the community, and we have to extend that to talk about users who aren't consumers. Because the old way of thinking about stakeholders was people with whom you had a financial interest--consumers, suppliers, community, government, stockholders; those are kind of the big five. But with technology nowadays--and it's not even that recent--we really need to talk separately about users, and separately about people who are impacted by the technology without even being users. Stakeholder theory actually includes the stockholder as one of those stakeholders, but it doesn't make them the be-all and end-all such that you have to maximize shareholder wealth on an ongoing basis.
And it's somewhat of an academic problem, because if you talk to actual managers, no one decides about revenge porn or violent-content takedowns based on shareholder wealth maximization. Sometimes it can get executives off track because the shareholders are the loudest voice. But no one wants to go to work for that company, honestly. It ends up being a company that no one wants to work for or do business with. But it is a shorthand when people want to do the wrong thing--I would say that shareholder wealth maximization is kind of a cloak that covers a bad decision. That's the one thing you have to be wary of when people talk about, We need to maximize profits. An alarm bell should go off in your head: Well, wait, what are you trying to do? Because maximizing profits all the time is not the right answer; it leads us astray. It makes us very short-term, and it makes us very narrow. So yeah, if I were going to leave you with one thing, it'd be this: if someone starts talking about profit maximization or shareholder wealth maximization, you should have an alarm bell go off and think, What are you doing? Because that's not a justification that I like very much.
Ted Fox 37:01
Kirsten Martin, this was great. And welcome to Notre Dame, officially. Thank you for making time to do this.
Kirsten Martin 37:06
Oh, thanks so much. Yeah, this is a great conversation. Thank you.