Inside Privacy Law: The Regulation of Personal Data

Podcast: In the Public Interest

In This Episode

In today’s interconnected world, personal information has never been more broadly collected and analyzed by governments and corporations alike, making it imperative that we understand, enforce and update privacy laws in order to safeguard individual privacy. In this episode, WilmerHale podcast co-host and Partner John Walsh welcomes two national leaders in privacy law of all kinds: Partner Kirk Nahra and guest Stacey Gray.

Nahra is the co-chair of WilmerHale’s Big Data and Cybersecurity and Privacy Practices. Among his other accolades, he is the winner of the 2021 Vanguard Award from the International Association of Privacy Professionals (IAPP)—one of the most prestigious in the privacy field—which recognizes one IAPP member each year who demonstrates exceptional leadership, knowledge and creativity in privacy and data protection. Gray is a Senior Counsel at the Future of Privacy Forum (FPF) and leads FPF’s engagement on federal and state privacy legislation. Prior to leading FPF’s legislative engagement, she spent several years focusing on the privacy implications of data collection in online and mobile advertising, platform regulation, cross-device tracking, smart homes, and the Internet of Things.

Nahra and Gray talk with Walsh about the challenges of having states with different privacy laws and why that may ultimately drive a single national law. They also dive into consumer consent and the complexities of regulating the collection and sharing of personal data. The episode wraps up with tips for individuals and companies to keep in mind when considering privacy and personal data.


Episode Transcript

    Speakers: Brendan McGuire, John Walsh, Kirk Nahra and Stacey Gray

    McGuire: I’m Brendan McGuire.

    Walsh: And I’m John Walsh. Brendan and I are partners at WilmerHale, an international law firm that works at the intersection of government, technology and business. In today’s interconnected world, information about who we are has never been more broadly collected by governments and corporations alike. Digital privacy continues to be complicated, with different privacy laws taking different approaches, whether it’s the European Union’s General Data Protection Regulation, known as GDPR, or the California Consumer Privacy Act, known as CCPA. For that reason, it is crucial that we understand, enforce and update privacy laws in order to safeguard our individual privacy.

    McGuire: Today, John is joined by two national leaders in privacy laws of all kinds, Stacey Gray and Kirk Nahra. Stacey is a senior counsel at the Future of Privacy Forum and leads FPF’s engagement on federal and state privacy legislation. Prior to leading FPF’s legislative engagement, she spent several years focusing on the privacy implications of data collection in online and mobile advertising, platform regulation, cross-device tracking, smart homes and the Internet of Things. At FPF, she has authored many agency public filings and published extensive work related to the intersection of emerging technologies and federal privacy regulation and enforcement. And finally, Stacey has served as a member of the Civil Rights Division of the Institute for Public Representation. Kirk, a partner here at WilmerHale, has been a leading authority on privacy and cybersecurity matters for more than two decades. In other words, well before it was cool. He counsels clients across industries on implementing domestic and international privacy and data security requirements, with a particular focus on health care privacy. Kirk recently received the Vanguard Award from the International Association of Privacy Professionals. This prestigious award is given to a select few working at the pinnacle of the privacy profession, recognizing those who’ve scaled new heights in leadership, knowledge and creativity in the field of privacy. Kirk is also a founding member and long-time board member of the International Association of Privacy Professionals and currently serves as a fellow with the Cordell Institute for Policy in Medicine and Law at Washington University in St. Louis and at the Institute for Critical Infrastructure Technology. In addition to his practice and professional commitments, Kirk is an adjunct professor at the Washington College of Law at American University and at Case Western Reserve University, teaching all of this to the next generation of privacy leaders. And with that, I’ll hand it over to John.

    Walsh: Stacey and Kirk, thank you both for being here today.

    Nahra: Thanks for having us.

    Walsh: So, let’s just start out with a really basic question about privacy. We hear so much about digital privacy, or maybe better put, the lack of digital privacy in the press these days. When we click to accept the terms and conditions of an app or an online service, we’re effectively giving up all sorts of data, sometimes for things and uses that we don’t even realize exist. So, Kirk, let me just start out with that background. Can you give us an overview of what the law does and doesn’t protect from a privacy point of view?

    Nahra: Sure, let me give you the minute version of a semester-long course in privacy law. So, in the United States, and let’s focus on the United States, we have lots of laws. But we have categories of laws. We have what I think of as sector laws: financial services, health care, education. We have laws that deal with particular kinds of practices, some of the things that people might be more familiar with, like the Do Not Call list or the CAN-SPAM law that deals with email marketing. And then we’re starting to have laws that deal with particular categories of data, like facial recognition and biometrics and location data and genetic information. And so, those are all specific laws. On top of that, we also have regulators, law enforcement agencies, who are looking at general consumer protection; that’s the Federal Trade Commission and State Attorneys General. So, on the one hand, you could paint the picture that there are lots and lots of laws out there.

    Walsh: What jumps out at me from that description, Kirk, is that it is so specific to the particular use and the particular type of data that, from the point of view of consumers, for example, it is really hard to know at any given moment what laws cover what you’re doing.

    Nahra: Absolutely. And I could certainly build the case that that’s getting worse and worse with each new law, and there are lots of academics in the field who say that’s never going to be fixed. We need to figure out how the law works without making consumers be the ones who have to bear the burden of that. And so, that’s one of the topics in the national privacy law debate.

    Walsh: So, Stacey, I’m going to dial up the complexity one additional step and ask you to comment on the role that states and state law play versus federal law in this whole realm.

    Gray: Sure, John. Well, that’s a huge area of debate now too, right, in addition to everything that Kirk mentioned. States have always been at the forefront; state common law evolved to protect things like libel, invasion of privacy and rights to publicity. And, more than that, I think states have been at the forefront in the last, say, ten to twenty years because they’re usually able to respond faster to new technologies and new business practices. So, we’ve seen a lot of states passing new privacy laws regulating things like drones and distribution of non-consensual images, laws aiming to protect student privacy, educational records privacy laws, and now, of course, in the last year or two, there’s been a big push for states seeking to pass what we call comprehensive consumer privacy laws, meaning that they would regulate all uses of personal information collected in commercial contexts: online, mobile apps, retail, things that fill in the gaps between all of those sectors that Kirk mentioned. But, important to note, states can really only do this in the areas where the federal government hasn’t already acted and chosen to preempt that field for federal regulation. So federal laws around things like health care, finance and education are not preemptive; they set a baseline and states can go further. But other laws, like our federal law regulating email marketing (CAN-SPAM) or our federal law around children’s privacy, are very preemptive, so states really can’t pass laws in those fields. So, it gets pretty complex pretty fast.

    Nahra: A lot of the attention today is on what Stacey called these sort of comprehensive state laws, and California’s CCPA was the first one of those. We now have a law in Virginia. Personally, how I look at those laws, and I think Stacey used this word also, is that they’re really gap-filling laws. They’re not comprehensive laws, because the California law, for example, exempts all kinds of entities that are covered by other existing laws at this point. The Virginia law is even broader than that. And to go back to a point that you made earlier about consumers, one of the facts that I’ve been using about the California law a lot is that if you are a California consumer, your health care information after the CCPA is now regulated by at least six different regulatory frameworks in California. You have HIPAA for health care, you have the California health care privacy law, you have the Common Rule for medical research, you have CCPA, which fills in some gaps, but it doesn’t fill in gaps for non-profits, and it doesn’t deal with employee information at all. So, I don’t know how a consumer could possibly make sense of that, and while consumers at least get some protection provided by this law sort of automatically, it’s not clear and it’s not comprehensive and it’s not consistent, which is, from my perspective, not good for consumers. But I think we’re also getting to a point where it’s not really good for businesses either to have that many sets of rules applying to similar kinds of data.

    Walsh: Following up on that point, when I hear that there are all these different state laws with different state requirements, the thing that immediately jumps to mind is: is there room for a national law that would preempt some of this and make it consistent across the country in a way that people could follow no matter where they were doing business?

    Gray: The issue of having a single set of uniform federal rules is, for a lot of reasons, the most important thing driving the momentum behind this sort of political push for a federal privacy law right now, mostly from a business perspective. But, you’re right, I mean it’s also a good thing from a consumer perspective, to a certain extent, right, that there be a uniform set of rules around at least certain aspects. So, for example, data subject access requests. There’s sort of a growing legal norm that consumers, and individuals acting as consumers, have the right to know what data has been collected about them, and they can get access to that data and have it either deleted or ported to a different service. It’s going to quickly become impossible to comply if what is included in that access request is different in different states. Some people in some states are going to have what you think of as more rights over their data, and from a business compliance perspective, it’s just impossible to comply, especially when you start talking about data that is less easily identifiable. When you sign up for an account with a particular service like your Amazon, or your Spotify, or your Facebook, that account is pretty well authenticated. But what do you do if you are just a single website publisher and you’re collecting IP address logs, and I have to prove to you with any sort of confidence that it really is my IP address, right, and not somebody else’s that I’m trying to spy on? Those sorts of issues are really hard when you’re talking about things that are still major privacy interests but harder to authenticate. With IP addresses, advertising IDs, you know, MAC addresses, Wi-Fi network data, things like that, it becomes really hard really fast.

    Nahra: There are lots of reasons to drive a national law. There’s pressure internationally, there’s pressure because of security breaches and various privacy abuses. There are reasons to pass a national law to give people a baseline of protection. That’s one component of the discussion. At the same time as we’re thinking about a national law, the two biggest issues that are dominating the political debate at the national level are (a) whether there’s going to be a private cause of action, so consumers can sue under that law, and (b) that question of preemption. Is the national law going to become a floor and a ceiling, or just a floor? And that’s very much part of the debate, and you can take different positions on that. Right now, I would say that that issue is actually stalling passage of a national law because we haven’t been able to get a consensus on that. My personal view is that as the states start to pass their own laws, and those laws are not entirely consistent because we don’t have a specific model at the state level yet, I think there’s going to be more pressure from business to have a national law, with that national law preempting. At the same time, whatever the baseline is for that national law, I think, grows each time there’s a state law. So, we started off with a pretty low floor for a national law, we’re going to keep going up, and if we have ten laws, there’s going to be both enormous pressure to have a single law and what that single law is going to say is going to be a more privacy protective law than you might have today.

    Walsh: Stacey, do you agree with that, that the states, in effect, are driving the direction, or at least the minimum level of protection, for any potential federal law?

    Gray: Yeah, before that, it was driven by the GDPR, right, so ten years ago, I think most of industry in the United States would not even necessarily classify some of the data that we’re talking about like mobile identifiers as personal information. And the GDPR really changed that on a global stage and then the CCPA set a bar that now any federal law is going to have to exceed.

    Walsh: For our audience’s benefit, when you say GDPR, you’re talking about the European Union’s comprehensive privacy law and CCPA is California’s, which went into effect just a couple of years ago.

    Gray: It went into effect because there was a ballot initiative that was imminently threatening to be on the ballot. Privacy is a very popular issue. There was no question that Californians were going to vote for it, and so the California legislature was incentivized to pass the CCPA relatively quickly. So, there was a bit of a window of time where the CCPA could have been the minimum for whatever we might have passed federally, and that might not have been a bad outcome, actually, for businesses, because even though the California Consumer Privacy Act in 2018 was new for a lot of U.S. businesses, it was not a terribly high standard; we’re talking about basic consumer rights to access and delete data and one very strong right to request that companies not sell your data. But that’s not nearly as far as a federal law could go. I mean, there are federal proposals that would require companies to have comprehensive in-house data privacy and security programs, to appoint privacy and security officers, to enact things like prohibitions on certain types of collection and use, to require affirmative consent for certain types of collection and use, not just letting people opt out of it but actually requiring that you get the permission in the first instance. So the CCPA set a bar, and it’s only gone up from there. Two years later, there was another ballot initiative in California, this past November, that raises the bar a little bit further for California, and other states are taking approaches that go farther.

    Walsh: You mentioned models, and one of the things that I think we all, as consumers, try to get our heads around is, what does privacy even mean in a digital sense? And we’re all familiar with one model, which is the consumer choice model, because we hear about it all the time: you can choose to opt in or opt out, or be sure to look at your privacy settings, because we at fill-in-the-blank company are giving you control over that. But the question I have for both of you is, given the complexity of what you’re describing, does that really end up being an effective way to regulate privacy?

    Gray: So part of me wants to just say no, but I think what privacy advocates are grappling with is the fact that notice and choice, for better or worse, has been the dominant model for the last 50 years, or however long we’ve been regulating this. Choice makes good sense for a lot of things where the risk is relatively low and where people actually do have different preferences. Do I want to sign up for this email marketing list? Do I want to pay ten dollars a month for Spotify versus get the version with ads? Do I want to sign up for this local grocery store program? Okay, fine. The problem is that consent has become the legal compliance tool by which anything and everything is now justified, sort of irrespective of what people actually understand or what they want. So, you consent to sharing a certain type of data with a mobile app, for instance, sharing my location with a weather app so that I can get local weather alerts, and that app now says, okay, by consenting to this you also consent to us sharing and monetizing the data and giving it to our partners, right. Maybe the use case is even something good; location data can be used for all sorts of beneficial things, and transportation analytics is a socially good use case. In an ideal world, you would say no and you could access the app anyway using some other method. The harder question is, when is it okay for businesses to say, sorry, we monetize data so that we can give you this app, and if you don’t consent you can’t use this service or you can’t use this app, so go somewhere else, or, okay, you can pay $3.99? That works if there are enough choices out there, but the fact is we now have a lot of dominant business models, especially in the advertising space, unfortunately (even though I think too much of the oxygen gets sucked up by advertising questions). Especially in advertising, which relies solely on the collection and use of data, there’s really no other business model, so if people don’t consent, or if enough people opt out under an opt-out model, that’s an existential threat to businesses, and that’s why we see it becoming this sort of check-box compliance tool that people don’t understand.

    Nahra: You’re absolutely right that this is an increasing debate. I get a lot of commentary about the notice and choice model failing. One of the options that I think some folks are considering, and I’m certainly one of them, is whether we can take a lesson from my favorite of the current rules, which is the HIPAA rules. What they do under the HIPAA standard is they basically design the rule so that the typical things you do in the health care industry, you go to get treated, you want your claim to get paid, and the business needs to generally operate (the administrative operations of running your business), those things are automatic. There isn’t even a pretense of consent. The companies are regulated and they’re limited to those activities, but you’re not asked for consent. We’re not pretending to get your consent for that. It’s just that you’re allowed to do it. Where your consent comes in is when they want to do something outside of the norm. And so consent is very particular and very specialized and very individualized. And so, that’s a really interesting model that I’d like to see exported. The challenge, I think, is we can define what’s normal for the health care system, but that normal for the health care system doesn’t make any sense for a bank, it doesn’t make any sense for a retailer, it doesn’t make any sense for a technology company. And so, we have to come up with a structure in the law that would have a way of defining that norm.

    Walsh: One of the things that recently came to light via some investigative reporting by the New York Times was how easy it is to take anonymized GPS location data and kind of patch it together from multiple sources to effectively identify an individual person, someone who thought, maybe they checked a box and said, fine, I’ll submit my anonymized location data to whatever app it is. But when you put a lot of different databases together, you can pretty easily, with the computing power we have these days, identify individuals, and the New York Times was able to track and identify individual people: where they went, what store they went to before they went to work (when we were still going back to work). What jumps out at me is the point you were just making a moment ago, Kirk, and that is that a consumer doesn’t really know how that data is necessarily going to be used by those downstream businesses. Now maybe we don’t want to get rid of those downstream businesses, and yet at the same time there is this potential, that I think people feel on a sort of gut level, for a real invasion of privacy down the road from conglomerating all of this different data. Thoughts on that: is there a way in law to deal with those sorts of considerations?

    Nahra: The challenge is maybe to get beyond the consent model, right. Consent is largely irrelevant in those downstream models, because if I go to a store and they collect my data, I don’t really have any idea what’s going to happen to that data after it leaves that store. I may know that the store is going to use it to send me its own marketing and to give me a discount on this and to do that, but if the location vendor who knows that I went to the store gets it downstream for some other purpose, I don’t have any idea about that. And, similarly, there’s no realistic means for that downstream vendor to get my consent, because I don’t know who they are. I’ve never dealt with them. I’ve never seen them. And so, that package of issues is pretty complicated. There’s also the technology. I mean, the example you gave about location data, I think a lot of people thought of that as, really, how can you follow this thing that moves around? But there aren’t that many people who go from my house to my office every day. And so, it’s actually not necessarily that hard to make that connection. And so, that’s one of the challenges in privacy law: what personal data is keeps changing on a regular basis. I mean, one of my favorite elements of the CCPA is that it regulates the collection of olfactory information. Now, I don’t know that there are yet companies that are tracing you based on your personal smell, but California was trying to anticipate that, in case there is some point in the future where we can do that. And so, trying to future-proof what personal information is is itself a real challenge.

    Walsh: So, Stacey, I want to build on what Kirk was just saying about that downstream set of uses. Have you seen any proposals or any potential legal regulation that might kind of address that in a way that would protect people’s privacy from companies they didn’t know were going to end up getting their data?   

    Gray: I think all major proposals would address this in some fashion, right. Geolocation data, for example, is usually considered sensitive data, so it requires affirmative consent even now under just the FTC’s jurisdiction; they’ve published guidance saying consent is the appropriate mechanism there. And then, if you are a downstream player in that market, and you can’t make the case that you’re a bona fide processor or service provider, then you are going to have to comply with either the limits of what that consent was or any future opt-outs or objections to it. So, this is where we really need guidance from federal regulators on what the appropriate scope of consent means. For example, I think you can make a good argument even under today’s laws that consenting to data collection, even for advertising or transportation analytics or whatever other use case, does not involve consenting to that data being sold to ICE for immigration enforcement. That’s out of scope; that’s both an unexpected use and incompatible with the original purpose for which the data was collected. There are certain things that we clearly recognize as out of scope and out of context, and that comes into play in every privacy proposal. The differences are just about whether it’s consent-based or opt-out-based or something else.

    Walsh: You said something really interesting there which is guidance on what the scope of consent really means. In other words, if I understood you correctly, that there might be a law that says you can ask a consumer to consent to this much, but only this much and not beyond that, basically. Is that a fair description or would it operate some other way?

    Gray: There are some proposals that would seek to ban certain uses; the Center for Democracy and Technology has a proposal kind of like that, which would just ban certain uses of geolocation data for things like probabilistic cross-device matching. There’s a very specific geolocation use case where you could say, oh, this watch belongs to the same person as the browser and the phone. Okay, some proposals would seek to just ban that. But I do think we need more guidance on consent, and this is a very robust field in the European Union. The basic principle is not that you can’t consent to some things, although maybe we should be talking about that, but it’s more that what you do consent to ought to be clear and understandable to the user, and specific. It can’t just be, hey, I’m a mobile app, I’m requesting your consent for analytics. What does that mean? It can’t be, I’m just sharing with my partners to give you a better personalized experience. What does that mean?

    Nahra: But each of those points then has its own stream of issues. For example, location data: you are typically asked when location data is being collected, and you can say yes or no. But what you don’t really know at that point, other than that, you know, you’re trying to get directions to the restaurant for dinner, is what’s going to happen to that data after that. Similarly, there are uses of that data where we may say, you know what, it’s okay if a bank uses it for this purpose or a retailer uses it for this purpose, but we don’t want an immigration regulator to use it for a different purpose. So, there are advocates in the health care industry who would like to see a very different consent model. They would like to see consent be required for any kind of use of your health information. I don’t know how that would work. You’d be asked fifty times a day to give your consent for something. We’re going to have that challenge under the Virginia law. The Virginia law that just got passed seems to require consent for all processing of any health care information. Either that’s going to be you agree to everything, right, or it’s going to be here’s a list of 875 things, go through and you can check off 1, 16 and 47, but nobody can actually do that. And so, that’s again where we come back to the idea that the consent model, where we are putting a burden on consumers, just may not really be the right way to think about it.

    Gray: I think the GDPR provides a good alternative. Look, in the United States, I hear a lot of people saying the GDPR requires consent. That’s just not true at all. I think we see a lot of the consent-based cookie banners based on the ePrivacy Directive, which requires consent for cookie placement. But the GDPR itself, the broader privacy law, relies not just on consent, but on having one of a list of lawful bases in order to collect data. Consent is one way you can collect data, if it meets the very, very high standard of being specific and informed and something people can understand and easily, you know, reject later on. That’s actually really hard, if not impossible, for a lot of companies to operationalize. You know, there are just some contexts where it’s literally not possible to get consent even if you wanted to. For instance, the collection of any passive ambient signals in our offline environment: walking up to a billboard or a bus stop sign that is now digital and is displaying information based on signals received from my phone or my device. That kind of thing involves personal information, but there’s no user interface, so it wouldn’t be possible to get consent even if that were desirable. Technologies like a video camera on your electric vehicle, your connected car, are going to collect information from passersby in order to work. That’s not a context where it’s possible to get consent. You have to rely on other safeguards. The GDPR, I think, does this very well. It’s got six lawful bases to collect data, and consent is one of them. Another way to collect data is through something called legitimate interests. And it’s something that gives both advocates and companies a lot of anxiety sometimes in the United States, right, because companies look at it and they say, oh, that’s not clear, I need to know what the rules are in order to be able to comply. Consumer advocates look at it and they say, I don’t know, that looks like companies are going to be able to do whatever they want. But applied with some enforcement power and strength, it can actually be a useful, powerful tool. So, a legitimate interests legal analysis involves asking, do you have a legitimate interest in collecting this data? Okay. Is it outweighed by the impact on the rights and freedoms of the data subject? My favorite example involves automatic license plate reader information at gas stations. There is an EU case involving this kind of data collection: the gas station was tracking license plates in order to detect people who were pulling up to the gas station, filling up their tank and then leaving without paying. License plates, that’s not something you can get consent for, right, so they couldn’t rely on consent as a lawful basis to collect that data, and they tried to use legitimate interests. But in that case the court said it is outweighed by the impact on the rights and freedoms of people, because you’re tracking them and you have an alternative, and your alternative is just to ask people to pay up front. Just change the way you’re doing business to avoid having to collect the data in the first place. And I love that example because it’s so common sense and it takes into account people’s rights and freedoms. It doesn’t involve affirmative consent or opting out. It just says, think more clearly about your business practices and avoid having to collect it in the first place.

    Nahra: One of the things that I think is worth some consideration is that right now we have enforcement in the gaps from the FTC and state attorneys general, who have general consumer protection authority. And I do wonder if there is some vehicle that would essentially utilize the standards that are in those laws as a baseline. I mean, you would have to give the FTC some additional authority that they don’t have, but I don’t know that you necessarily need all that much definition to get to the point where we’re having those same arguments about is this right, is this not right, how did you draw your balances, etc. So, that’s another model. I mean, if I was in charge of Congress today, I could easily see a relatively straightforward privacy law that dealt with a couple of baseline issues and then delegated everything else to the FTC to write a regulation. That means you’re still several years away from a final law, but they know what they’re doing. They have a lot of sense in this area. They have a lot of experience. I could easily see them dealing with the intricacies. I’m just not highly confident in Congress’s ability to navigate all of this; we’ve touched on two percent of the issues in this conversation and already that’s really hard. I’m not sure how you deal with that full array of issues in a single piece of legislation.

    Walsh: I think you both have persuaded me that this is a really complicated area, and one that’s going to require a lot of work over time at the state level and at the federal level. Let me just end with this, first to Stacey. Do you have any advice to consumers based on how the law is and how they should be taking steps to protect their own privacy and what do you say to our listener who’s wondering where their data is going?

    Gray: Well, it’s funny, because we’ve just had this great conversation about how hard it is to understand where your data is going. That said, it just makes good sense, I think, to be engaged in an ongoing pursuit of knowledge about how your devices work and what kind of data is out there, right. So, check out the privacy settings of your smartphone; turn on Limit Ad Tracking to keep your advertising identifier from being shared with apps; do an audit of what apps are collecting location data and whether you really need them. It just makes good sense to be aware of that and exercise some control over the things we have control over.

    Walsh: Kirk, as a follow-up, for businesses operating nationally, what’s the best approach?

    Nahra: I end up spending a lot of time with clients and, you know, with audiences in the field, just getting them to think about privacy earlier on in their business strategy. And I could build that argument as a matter of prudence, I could build that argument as a way to stay out of regulatory trouble, I could build that argument as a way, you know, if you’re a startup, to make your company more viable for a future acquisition. A lot of this is just planning for what you’re doing and thinking about your activities going forward. Building privacy considerations into the front end of your business planning, you know, sometimes that’s called privacy by design in our field, is a really smart way to navigate and anticipate all these issues going forward. Really, whatever they are, whatever your business is, whatever your industry is, the lesson of all of this is to just be conscious of how important these issues are for your overall business planning.

    Walsh: Well, thank you very much, Kirk, and Stacey, thank you so much for joining us. I suspect that we may need to do a follow-up episode in a year or two as we see how all of this develops. Thank you all very much for your help today. Thank you to our listeners for tuning in on this podcast and we look forward to talking to you soon.

    McGuire: Thanks very much, Stacey, Kirk and John. That was such an enlightening discussion and I certainly agree with John that it will be interesting to see how this evolves over the next couple of years, and I’m certainly relieved that we have experts like Stacey and Kirk to help us understand these complex topics. I’m going to check the privacy settings on my phone right now. Thank you to everyone for joining us on this episode of In the Public Interest. If you enjoyed this podcast, please take a minute to share it with a friend and subscribe, rate and review us wherever you get your podcasts. We hope you will join us for our next episode and see you then.
