
Decoding Cyber Risks: Regulatory Landscape

About this webinar

Decoding Cyber Risks is an informative advisory series that features McGriff's market-leading experts in partnership with industry subject matter specialists. In this installment, Morrison Mahoney attorneys Daniel Marvin and Alex D'Amico join McGriff for a discussion on the current regulatory landscape as it relates to cyber risk.

Regulatory Landscape

Lisa F.: [00:00:00] I would like to welcome everyone to McGriff's cyber instruction series. This is the first in our series of cybersecurity, privacy, and data protection webinars, and today we are going to discuss the cybersecurity regulatory landscape with attorneys from Morrison Mahoney. First, we have Daniel Marvin, who is a partner in the New York office of Morrison Mahoney and a co-leader of the firm's cybersecurity, privacy, and data protection team, where he advises clients on a range of litigation and advisory matters. Some of his advisory services relate to implementing written information security plans, employee data security policies, third-party vendor contracts, and corporate governance policies.

He also advises clients in all aspects of data breach prevention, detection, and mitigation. So, we are very excited to have Dan with us today. We also have Alex D'Amico, who is an associate in the New York office of Morrison Mahoney. Alex concentrates his practice in the areas of data privacy and insurance fraud law. He is a member of the International Association of Privacy Professionals and has earned his designation as a Certified Information Privacy Professional in both Europe and the United States. From the McGriff team today, we have Aarti Soni, who is McGriff's Cyber Director and Product Innovation Counsel in the New York office, and I am Lisa Frist, a vice president of claims in McGriff's Atlanta office. So, we are very excited to discuss this topic today. And with that, I am going to turn it over to Aarti, who is going to first talk about the importance of knowing your data and technology with Dan and Alex.

Aarti S.: [00:02:15] Thank you, Lisa. And thank you, Dan and Alex, for joining us today. We're very excited to talk about these topics, as they come up regularly in our interactions with clients and prospects. So, I think we'll just start at the very beginning: the importance of knowing your data and technology. What I've seen on the broker side is that a number of companies, ranging from large, sophisticated companies to smaller companies that don't have risk management offices per se, don't necessarily know where their data is and where their networks are. So, they don't have a good handle on their data and network inventory. And since that's a fundamental part of understanding what their risk and exposure is from an insurance perspective, this is something we want them to look at.

So, if you have a company, or you see a company, where the general cyber policies and general hygiene are not really up to par, and they call to ask you to help them get things in order, how might you help them through those very first steps of the process?

Dan M.: [00:03:32] Hi, this is Dan Marvin and thank you for having us today.

That's a very good question. We tell organizations that it's virtually impossible for an organization to be compliant with the various regulations and statutes dealing with cybersecurity and data privacy without knowing where its critical data is stored, both the physical data, meaning the physical infrastructure, as well as the online data.

Without knowing where the data is, you can't know how an attacker might try to compromise that information. So, we tell our clients that almost every data privacy regulation requires that organizations have, and the buzzwords here are, administrative, technical, and physical safeguards which are reasonable under the circumstances to protect information.

And it all starts with doing a risk assessment and an asset inventory. We tell clients the easiest way to approach an asset inventory involves two steps. First, you have to know what your physical assets are, meaning your computers, your laptops, your mobile devices, and anything that can connect to the internet.

And you can do that part of the inventory by simply writing those things down. You write down all the different pieces of technology you have, when they were last patched, what operating systems they're using, and what licenses they may have, and you constantly keep that updated. That way you know what physical assets may be compromised.

The next thing you can do is classify both your computer systems and your information. You can use colors and say, we have red systems, which are restricted and available only to employees with need-to-know access to those systems.

And there might be green systems, which are open to all employees. It's very fact intensive, and every company might have a different way of classifying its data or its computer systems, but it's very important to do. In the same way you classify systems, you also want to classify the data on those systems, to know which data is restricted, which is confidential, and which is public information, because without knowing where information lives on your systems, you can't really protect it.

And I'll give you a perfect example we're seeing a lot now with this pandemic: many companies being victimized by phishing scams, where the bad guys are breaking into email accounts. A lot of these email accounts have, buried somewhere in them, emails that contain PII, personally identifiable information, but companies aren't keeping track of what information is in those email accounts.

So, when the companies call us and say, well, what do we do, we may have a breach here which could have accessed information in the email accounts, the first question we ask is, well, what PII might you have in those accounts? And they just don't know. And it's oftentimes a very tough scramble for them to find out quickly so we can properly scope and remediate the issue. So that's, in a nutshell, the initial step we ask companies to undertake when they're starting from the very beginning: do that risk assessment.
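As a rough illustration of the asset inventory and classification exercise Dan describes, the sketch below shows one way to record physical assets, their patch status, and red/green system and data classifications. It is only an illustrative sketch in Python; the record structure, field names, and example entries are assumptions for illustration, not a prescribed format or anything the speakers endorse.

    from dataclasses import dataclass, field
    from datetime import date
    from enum import Enum

    class SystemClass(Enum):
        RED = "restricted"    # need-to-know access only
        GREEN = "open"        # available to all employees

    class DataClass(Enum):
        RESTRICTED = "restricted"
        CONFIDENTIAL = "confidential"
        PUBLIC = "public"

    @dataclass
    class Asset:
        """One entry in the physical asset inventory described above."""
        name: str                   # hypothetical identifier, e.g. "hr-file-server"
        asset_type: str             # laptop, server, mobile device, ...
        operating_system: str
        last_patched: date
        licenses: list = field(default_factory=list)
        system_class: SystemClass = SystemClass.GREEN
        data_held: dict = field(default_factory=dict)  # description -> DataClass

    inventory = [
        Asset("hr-file-server", "server", "Windows Server 2019", date(2020, 5, 1),
              system_class=SystemClass.RED,
              data_held={"employee PII": DataClass.RESTRICTED}),
        Asset("marketing-laptop-07", "laptop", "Windows 10", date(2019, 11, 15),
              data_held={"published brochures": DataClass.PUBLIC}),
    ]

    # Keeping the inventory updated lets you spot assets that may be exposed,
    # for example anything not patched in the last 90 days.
    today = date(2020, 6, 24)
    stale = [a.name for a in inventory if (today - a.last_patched).days > 90]
    print("Assets not patched in 90+ days:", stale)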

Alex D.: [00:07:07] This is Alex. I also want to thank you for having us today. I just wanted to add that, as outside counsel, which is the role we would be playing in coming in to help a company, a lot of times we'll have one or two point people. It might be someone from the legal or compliance department or from the IT department, and they would be well familiar with the various procedures and policies that might be in place, if any. But in addition to speaking with those people, it's also important to know what's happening throughout the company at the ground level. The IT person might say, look at this great policy we have that outlines all of the data we have and all of our systems. And then if we were to speak with the marketing person or someone from the finance department, they might know nothing of it. Or one of them might have the policy from 2017, and the other one has a policy from 2019.

So, it's important to do a very thorough review, as Dan was describing, in terms of the inventory, and to get information from various sources in order to make sure there's no hidden information or devices that we're not aware of.

Aarti S.: [00:08:33] Yeah, I think you sort of predicted my question there, which was going to be, recognizing that different business structures, whether you're working with a municipality or what have you, have different roles, who would you recommend? Who are the key people within a company or an organization that should be involved in this process?

I mean, you mentioned that different people aren't necessarily aligned, and we've seen that as well, just because some of those people haven't had to work together, because they haven't really assessed cyber exposure. And we always say that it's become more of a board issue now rather than an IT issue. But who do you talk to, and who do you think should be involved from the company side in this process and in decision-making?

Dan M.: [00:09:27] Well, there are several people. First, if a company has a CISO, a Chief Information Security Officer, it's always remarkably easy for us to work with that person and put together a plan. We need IT involved to implement a plan, because IT generally knows how to categorize systems as well as the information, and they know where the information lives.

If there's a compliance person or team, or an attorney or a general counsel, we work with that person so we can draft a program which is in compliance with regulations. So those are the people we generally work with at the ground level. The board of directors and senior management of companies are usually a little more hands-off, allowing the people in the know to implement the program. But of course, they are generally the people we speak to, to learn about the business and about the risks that they envision the business facing, so we can best put together a good policy for them.

Aarti S.: [00:10:43] Yeah, I think that makes sense. I think one of the challenges, and again, we've seen this on the claims side as well, is making sure that companies have the right players involved throughout the process, right? So, whether it's in shoring up their IT security or in procuring insurance, or both, the right people are involved from the company side. So, just switching gears a little bit, I think the meat of this conversation really is about the data privacy regulations.

And it's really hard, I think, for everyone, I'm sure for you guys, for us as brokers, and particularly for our clients, to keep track of the data privacy regulations as they change and expand. Every day there seems to be a new one, right? So, we just wanted to talk a little bit about what you're seeing.

So, Alex and Dan, a lot of the laws like HIPAA or the Fair Credit Reporting Act may seem familiar to non-lawyers because we hear about them all the time, but you're dealing with them in detail every day, right? This is the substance of your job. So, in your experience, are any of the laws that look foreboding to a corporation or organization not as threatening as they appear? And conversely, do any of these laws in practice have sharper teeth than they appear to at first glance, so that we maybe need to be spending a little more time and paying a little more attention to them?

Dan M.: [00:12:30] Well, that's a question that can take a long time to answer, but let me try to give a broad overview of where we are.

Yes, there are, I think, more than a dozen states that have their own data privacy regulations, and of course all 50 states have breach notification statutes. There's the GLBA, the Gramm-Leach-Bliley Act. There are the DFS regulations. There's HIPAA. I think one thing that organizations can look to for comfort is that almost all of these statutes essentially say the same thing.

They want companies to have a reasonable cybersecurity program based upon a risk assessment, and many of these statutes, though not all of them, are grounded in the NIST (National Institute of Standards and Technology) cybersecurity guidance for critical infrastructure: identify, protect, detect, respond, and recover, all dealing with risks.

So, you identify risks, and you protect against, detect, respond to, and recover from risks. Like I mentioned earlier, we're dealing with administrative, technical, and physical safeguards. On the technical side, we're talking about having firewalls in place and making sure systems are patched.

With the physical safeguards, we're talking about how systems are physically secured. We're talking about having an employee training program to make sure your employees, who are the weakest link in any organization, are properly trained. We're talking about having a program in place to monitor and to regularly test your systems.

And underlying all of this, as we mentioned earlier, is having a risk assessment in place to continually monitor your risks, look out for new risks, and evaluate how they're changing. I will say, and we will do a deeper dive into some of these statutes, the one statute which I think scares people the most, because it's been talked about the most, is the California Consumer Privacy Act.

And I think the one difference between that act and every other statute we're talking about is that the CCPA is not really a data security act. There are no data security regulations in the statute. It is strictly a consumer privacy act, where California was trying to put privacy rights back into the hands of consumers by allowing consumers to control how companies collect, use, and sell their data, in a very comprehensive way. So, I think the CCPA has to be looked at separately from all these other statutes. And the good thing is that the way the statutes are evolving and being amended, they're closing the gap on each other, and when you're in compliance with one statute, in many instances you're necessarily compliant with other statutes. What I counsel clients is that cybersecurity shouldn't be dependent on a statute. It's really sound business practice, and it's a best business practice. If your organization is engaging in best cybersecurity business practices, you will necessarily also be in compliance with all data privacy statutes.

That's really the biggest piece of advice I give to clients, and I hope it's the one big takeaway people who are listening take away from this session.

Alex D.: [00:16:16] Yeah. Even when there's not a statute, there's still always the threat of a negligence claim that a company might face, should they have a data incident of some kind. And a negligence claim is based on breaching a duty, where you owe a duty of reasonable care to someone. So, what actually constitutes reasonable care? Does it mean having a firewall? Does it mean having some particular protections or policies in place? That's very fact specific.

But as Dan said, the best bet is to aim for a top level of care. So, if someone alleges that you failed to provide reasonable care, you can say, look, no, I satisfied the NIST standard. And as you said, whether or not a statute dictates it, there's always the threat of negligence looming in the background.

Dan M.: [00:17:21] And just to piggyback on that very good thought, one scary trend for organizations, and this is a topic for a different day: if you look at the many, many class action and negligence claims that have been brought over the past five years or so, you'll see a trend within the past year of courts being much, much more receptive to negligence claims. Courts had previously been denying negligence claims on various grounds, but there are really two shifts we've seen. One is that courts are much more receptive to negligence claims being brought by employees against employers for a failure to safeguard information.

And also, I think a scarier trend for organizations is that where an organization has previously suffered a data breach, the courts are much more likely to find negligence the second time around, if a plaintiff is alleging that there was a prior breach and then the information was compromised in a secondary breach.

And that's a very interesting trend we've seen. And not just negligence, courts are becoming much more receptive to these common law claims, and even contract claims to a lesser extent, in the past year or so.

Aarti S.: [00:18:46] That actually goes directly to your point, then, of not being so focused on the specific laws but looking holistically at your cyber hygiene or your cyber fitness. Right? And that seems to be the way some of these courts, I think, are looking at it.

Dan M.: [00:19:05] That's definitely right. And we tell clients, you want to do what's best for your consumers, what's best for your clients. You want to protect yourself against reputational harm. These are all things that go beyond what statutes or litigation risk may require.

It's just good business practice, and you'll cover yourself in other ways just by engaging in those best practices.

Alex D.: [00:19:32] I'll add, with respect to the part of the question regarding statutes or regulations that have teeth a bit more so than others, that whenever a law or regulation has a private right of action, it becomes particularly scary for a company or entity that has data in its possession. That's one of the sources of the fear surrounding the CCPA, the California law.

You know, the fear is that there could be a flurry of private actions brought against the company. Then on the other hand, there are some other statutes, federal or state, where there is no private right of action, and the ability to bring court actions against the company rests with a government entity, typically the attorney general or someone else.

And in that case, you have an attorney general's office that's very busy. Maybe they want to focus some of their time on data privacy, but they have a lot else on their plate. So sometimes, under a law that leaves it to the attorney general to bring the cases, some of the smaller cases might get lost in the sauce a little bit.

Whereas the private actions would be very likely to be aggressively pursued by the plaintiffs' bar.

Dan M.: [00:20:51] And I will also add that the scariest private rights of action are those that don't require the plaintiff to show any harm, and those are out there in some of these statutes.

Aarti S.: [00:21:02] Right. That's really interesting. And I wanted to ask, as brokers, we guide our clients through the insurance procurement process, so we try to watch this legislation very closely, because we want to make sure that the insurance policies reflect coverage for whatever new laws are popping up.

And obviously it's a little bit hard to keep up, right, as these laws are being enacted every day. I sort of have in mind things like biometrics, which I know you and Lisa are going to dig into a little bit more, and geolocation. So, as you've seen this expansion, what do you think is the next frontier in privacy laws?

What is something novel that we probably haven't thought about yet, that we might start thinking about now?

Dan M.: [00:21:57] Well, I think something we are starting to think about now is what employers are doing with people working from home so much, which may become, and for many employees will become, a new normal.

And in terms of insurance, I think some of the risks we may see evolve are around how a computer system is defined in policies. Now we have computer systems which seem to be expanding into employees' homes. So, questions will come up, depending on policy language: are those computers covered by insurance policies, and what has to be done on those computers to have them covered? Does it have to be a work computer? Could it be a personal computer that logs in through a VPN? I think those are very important issues which are going to have to be fleshed out. And I'll add something to that which is very, very interesting.

If an employee is using a personal computer and logging in through a VPN, and there is a breach, and we're now in a position where a forensic evaluation has to be done on that computer, we're dealing with a situation where that employee might have to turn over their computer, including what's on the personal side of that computer, for that evaluation. That's something an employee may not want to do, and it's something employers would, I think, have to inform the employee about beforehand in their policies, that that's an eventuality which may come to pass. And then we have to deal with all of the fallout from what would happen if an employee turns over personal information which is now out in the open and in some way gets compromised.

So, I think the COVID-19 work-from-home situation that many employers find themselves in now is something we're going to have to deal with. It's not novel in the sense of being forward-looking, because it's already happening, but I don't think many policies are ready for it yet.

Aarti S.: [00:24:13] Yeah, I think it's hard to avoid the involvement of COVID-19 in really any aspect of business at this point. But thank you, those were really enlightening comments. And Lisa, with that, I'm going to turn it over to you.

Lisa F.: [00:24:27] Yeah. That's a good point at which we can jump into some of the more specific statutes.

I know we've talked about the CCPA, which was the first of its kind in the United States after we had been looking at the GDPR for a few years. And so, I was hoping you could walk us through an overview of the CCPA for our clients. I know that the compliance date was January 1st of this year.

And we can talk in a little bit about what enforcement is going to look like coming up. But maybe if you could just walk us through the most important definitions and a general overview of the statute, that would be helpful.

Dan M.: [00:25:17] Sure. As I mentioned earlier, the CCPA essentially puts control over consumers', really California residents', information in their own hands. These residents have the right to know, under the statute, what personal information about them is being collected, how it's being used, whether it's been sold or otherwise disclosed, and to whom. And these consumers also have the right to opt out of allowing a business to sell their personal information.

They have a right to contact a business and ask that business to delete their information. And also, the California privacy act says a company can't discriminate against a person, in terms of its service offerings, because that person exercises their rights under the act. Now, you mentioned the definitions. The definition of personal information under the CCPA is extremely broad, probably almost as broad as the definition under the GDPR. It defines personal information essentially as any information which can be associated with a particular consumer or household. So not only name, address, and phone number, but geolocation information, Twitter handles, email addresses, IP information.

As long as information can be traced back to a consumer or household, it's considered personal information under the CCPA. Other statutes which have been amended in the past year or two, I think in Washington, Massachusetts, and a few other states, have also broadened the definition of personal information, but that broadening really only brings in things like email addresses, maybe biometric information, debit card numbers, and the like.

California really has rethought and redefined the definition of personal information. So that's something corporations really need to be on the lookout for, because they're the companies actually collecting this information. It all goes back to the beginning: when you're doing your risk assessment, if you're covered under the California Consumer Privacy Act, meaning if you collect information on California consumers, you really need to have a very good handle on what information you're collecting.

And just as an aside, in terms of what corporations have to do, in terms of their responsibility, they have to give very explicit notice to consumers at the point of collection as to what information is being collected. They have to have notices on their websites, if they have a website, letting consumers know what information is being collected and how it's being used. And these corporations also have to set up a procedure for a consumer to be able to easily contact the company to make requests to have their information deleted, and for those requests to be handled expeditiously.

And this is one area we found that companies really weren't ready for. They knew about the CCPA and they knew what information they had to be on the lookout for, but they really didn't create any sort of internal mechanism to respond to consumer requests. And I think you'll see a great many organizations out there that are collecting information on California residents who don't have an infrastructure in place, and these companies are all in violation of the statute.

Lisa F.: [00:29:02] That's funny, because my next question for you was going to be, what are some of the biggest hurdles you've seen as far as companies trying to get into compliance with the CCPA before the January 1st deadline this year? So, any other hurdles that you've seen with companies struggling to get into compliance before the enforcement date?

Dan M.: [00:29:29] Well, one hurdle is that the CCPA only applies to information collected on California residents. So, we see businesses that have nationwide customer bases and clients, and in many instances it's difficult for them to really identify which of those people are actually in California.

And even if they do that, you're now dealing with a situation where they're putting different privacy controls in place for just those residents, and that's very difficult. One thing some corporations have done, I think Microsoft has done it and a few others, is they've extended the California consumer privacy rights to everyone.

And they've done that in a way that, I guess, makes them good corporate citizens, but if you look at it from a devil's advocate position, it's a lot easier for them to set up one compliance protocol company-wide as opposed to setting up more than one. But for many companies, smaller companies that aren't Microsoft, it's difficult to set up more than one infrastructure to deal with consumer requests.

So that's really a big struggle for small businesses: having to bifurcate their systems and processes so that, on the one hand, they handle only consumer requests from California, or, on the other hand, opening up those requests nationwide, which gives them a whole host of other problems, including having to answer requests from people all over the country, which could open up the company to more liability than they might want to undertake.

Alex D.: [00:31:15] I'd like to add a couple of points as well. First, with regard to the hurdles, one is the verification process that should occur when a request for information is received. So, let's say your company is subject to the California Consumer Privacy Act and you want to do the right thing: you want to provide consumers' information to them when they request it. But if you think about it, this could easily be a system that's exploited by hackers submitting fraudulent data requests, posing as California consumers. So, it's on the company to take the steps to verify that the person requesting the data is who they say they are, and that they're entitled to the data they request. I can't request the data of my buddy from college, but I can request my own data, and a parent can request their child's data. So it's important to have that verification process in place, which is also why you want to have an entire system for receiving and handling data requests in place proactively.

If you're handling them on an ad hoc basis as they come in, your company is not going to be positioned to handle them thoroughly and appropriately. The other thing I wanted to point out, with regard to hurdles, is that when the act first started, there was a lot of fear about how small businesses were going to deal with all these data requests, that it was going to put them out of business. In fact, there are a great many companies that are not subject to this act. It's a limited subset of companies that do business in California and meet one of three criteria: either they have gross revenue of $25 million or more per year;

they buy, sell, receive, or share for commercial purposes the personal information of 50,000 or more consumers, households, or devices; or they derive 50% or more of their revenue from selling the personal information of California residents. So, there's a great many companies that aren't going to meet those criteria. Your corner cobbler shop, ice cream store, mom-and-pop shop, they're not going to meet those criteria in all likelihood, and they won't have the burden of having to respond to data requests.
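As a rough illustration of the three applicability thresholds Alex lists (gross revenue of $25 million or more; personal information of 50,000 or more consumers, households, or devices; or 50% or more of revenue from selling personal information), here is a minimal sketch in Python. It is not legal advice; the function and variable names are hypothetical, and the figures simply restate the thresholds from the discussion above.

    def ccpa_likely_applies(annual_gross_revenue: float,
                            ca_consumers_households_devices: int,
                            share_of_revenue_from_selling_pi: float) -> bool:
        """Rough screen: does a business doing business in California meet at
        least one of the three thresholds discussed above? Not legal advice."""
        return (
            annual_gross_revenue >= 25_000_000
            or ca_consumers_households_devices >= 50_000
            or share_of_revenue_from_selling_pi >= 0.50
        )

    # A corner ice cream shop: small revenue, few records, no data sales.
    print(ccpa_likely_applies(400_000, 2_000, 0.0))      # False
    # A data broker: modest revenue, but most of it from selling personal info.
    print(ccpa_likely_applies(5_000_000, 60_000, 0.80))  # True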

Lisa F.: [00:33:58] Yeah, those are all great points. Thanks for sharing those. I wanted to ask you about the upcoming enforcement date, which I guess is next week. So, regarding the enforcement of the CCPA starting on July 1st, how is that being viewed, given the current state of affairs in California and across the United States?

Dan M.: [00:34:23] Well, the attorney general in California has indicated that the pandemic won't stop the July 1st date from taking effect. But I think we're likely to see enforcement much in the way the GDPR has been enforced, where initially the enforcement focused on the bigger companies. I think GDPR enforcement went after Facebook first and a few other very large companies.

So, I think we're likely to see the same effect happen in California. The rumor is that the California office in charge of enforcement is not very large, so there's thought to be a resource problem. So, I think smaller businesses who might not be in compliance yet might still have a little bit of time to get into compliance before the attorney general looks to them, unless there is a significant number of complaints. Of course, that doesn't shield them from a private right of action. But in terms of enforcement, there might be a bit of a honeymoon period for all but the largest corporations out there.

Lisa F.: [00:35:33] Great. Well, I think we'll probably have to get together in a few months and talk again about what we've seen as far as enforcement in California. I think we can turn now to New York, another state that has been active in privacy and data protection. I know there are two New York regulations and statutes we are going to talk about.

The first is the New York State Department of Financial Services cybersecurity requirements, and the second is the New York SHIELD Act, which I believe was signed last summer by Governor Cuomo. And so New York corporations potentially are going to have to understand both the regulation and the law, and comply with both of them.

So, if you could talk a little bit about each one individually, and then maybe how a company could comply with both of them, and how these two New York laws can work together.

Dan M.: [00:36:45] Sure. I guess first we'll talk about DFS 500. That was kind of the first big privacy law, I think, in the U.S., a really comprehensive privacy regulation.

But one thing that's important to remember is that the regulation only covers entities that are regulated by the New York State Department of Financial Services. That includes insurance companies, banks, mortgage brokers, check cashers, and a number of other types of entities that you can find on DFS's website.

In addition, there are also some exclusions. Under some sections of the regulation, there are exclusions for smaller businesses, such as those with fewer than 10 employees or those below certain New York gross revenue thresholds, so those entities aren't fully covered. The DFS regulation is based substantially on the NIST framework we talked about earlier, identify, protect, detect, respond, and recover, and it sets out six core things that companies need to do to be compliant. They need to identify, there's that word again, and assess their cybersecurity risks. They need to use defensive infrastructure and implement policies and procedures to protect information systems from unauthorized access. They need to have a system in place to detect cybersecurity events. They need to be able to respond to those detected cybersecurity events. They need to be able to recover from those cybersecurity events and put their business practices back into full working order. And they also need to fulfill regulatory reporting obligations.

If there is a breach, they need to report it to DFS, I believe within 72 hours. So that's a very brief overview of a big regulation which really requires a very comprehensive cybersecurity program for covered entities. And one thing I tell clients is that if they're DFS compliant, they're most likely in compliance with virtually every other privacy statute out there.
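As a rough illustration of the six core DFS 500 obligations Dan lists, the sketch below arranges them as a simple self-assessment checklist with a gap report. It is only an illustrative sketch in Python; the checklist structure and status values are assumptions, not a compliance tool, and the 72-hour reporting note carries over Dan's "I believe" qualifier above.

    DFS_CORE_REQUIREMENTS = [
        "Identify and assess cybersecurity risks",
        "Use defensive infrastructure, policies, and procedures to protect "
        "information systems from unauthorized access",
        "Detect cybersecurity events",
        "Respond to detected cybersecurity events",
        "Recover from cybersecurity events and restore normal operations",
        "Fulfill regulatory reporting obligations "
        "(per the discussion, report a breach to DFS, believed to be within 72 hours)",
    ]

    def gap_report(status):
        """Return the core requirements not yet marked as addressed.

        `status` is a dict mapping each requirement string to True/False.
        """
        return [req for req in DFS_CORE_REQUIREMENTS if not status.get(req, False)]

    # Example: everything in place except detection and regulatory reporting.
    current_status = {req: True for req in DFS_CORE_REQUIREMENTS}
    current_status[DFS_CORE_REQUIREMENTS[2]] = False
    current_status[DFS_CORE_REQUIREMENTS[5]] = False
    print("Outstanding items:", gap_report(current_status))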

Switching gears a little bit to the other statute in New York, the SHIELD Act, which went into effect in March of 2020. That's kind of a hybrid breach notification and data privacy act, and I think New York was one of the first states in the country to do it that way. That statute applies to any business, whether or not it's located in New York, that owns or licenses the private information of New York residents. The two big things I think the SHIELD Act did, as compared to Section 899-aa of the General Business Law, the prior breach notification statute, were, first, it added biometric data to the definition of private information, and I know we'll be talking about that a little bit later. But I think the most important thing it did, and I think this is very, very important, is that it really changed the definition of a data breach to unauthorized access, not just acquisition, but access to private information. And as far as I know, the SHIELD Act is the only statute that actually defines what access means. This is a problem every company deals with under every statute that requires some sort of remediation when there's been access to information: how do you know that information has actually been accessed? So, the SHIELD Act defines access, not very well, I mean, but it does attempt to say, well, you have to look for indications that the information has been viewed, and there's some other similar language in the statute. Those were the two main things I think the SHIELD Act has done.

And really the one main thing everyone should be aware of is how it defines access. I'll also mention that access is really a definition that courts have struggled with in negligence claims.

Lisa F.: [00:41:04] Yeah. I wanted to ask about another term in the SHIELD Act, which is the requirement that employers in possession of New York residents' PI develop, implement, and maintain reasonable safeguards to protect the security, confidentiality, and integrity of the PI. And I'm curious, what are reasonable safeguards, and what could a company do to ensure compliance with that term?

Dan M.: [00:41:32] That's a great question. It's one no court has ever answered, and it's a very fact-intensive evaluation. I think the reason that language is put into the SHIELD Act, as opposed to more specific language, is that what's reasonable for a mom-and-pop shop, like Alex was talking about, may not be reasonable for a 50-person business, and what's reasonable for a 50-person business is not reasonable for a Fortune 500 business.

So, it's a very fact-intensive evaluation. It's really one that has to be gone through with compliance, with risk, with IT, and with outside counsel to determine what's best for any given business. Now, I think there are certain baselines of reasonableness that you want to have. You want to have employees that are trained. You want to have operating systems that are patched. You want to have virus protection. You want to have firewalls. You want to have the minimum things that consumers, and really the courts when it comes down to it, would expect you to have under the circumstances. And I say under the circumstances, which is a bit of a cop-out, but again, it's really something that's very difficult for a court, or really anyone, to define, and it's something that has to be looked at on an individual basis.

Lisa F.: [00:43:05] And comparing California and New York, do New York residents have a private right of action under either of these?

Dan M.: [00:43:14] No, there's no private right of action in New York. It's left to the attorney general to enforce the SHIELD Act, and the DFS has jurisdiction over DFS 500.

Alex D.: [00:43:32] Well, one thing that a creative plaintiff's attorney might try to do, in bringing a negligence action on behalf of someone who suffered a breach, is to cite the statute and say, see, look at the things they say a company should be doing to protect data. This company that had a breach didn't do those things, so they were negligent. It will be interesting to see if we see more of that in the coming years.

Lisa F.: [00:44:01] Before we move on to the very interesting topic of biometrics, I just wanted to ask you one question, since we talked about New York and California. I'm wondering about McGriff and Morrison Mahoney clients who have operations in all 50 states, or in a majority of states. I think it can be intimidating with the ever-changing regulations and laws, so I'm curious, if you've got operations in many states, what can these companies be doing to stay compliant with ever-changing privacy laws?

Dan M.: [00:44:44] Well, I think in most instances it really comes down to a few words: comply with the strictest law. If you're complying with the strictest law, then you're covered for the rest. And like I said before, most of these statutes are reasonably similar, with the exception of the CCPA, and they all have the same goals in mind, worded a little bit differently. But it's always a good practice to comply with the strictest law. And again, it's the best business practice to have these measures in place, not just because these statutes require it.

One thing corporations should be on the lookout for in particular is breach response. If they're doing any of this in house, which they probably shouldn't, they should probably always consult outside counsel, the statutes require various notifications to state agencies: sometimes the attorney general, sometimes the state police, sometimes consumer reporting agencies. So, if you're collecting data on residents of different states, you want to know where those people live, and you want to know what those particular states require in terms of regulatory notification if there is a breach.

Lisa F.: [00:46:11] Great. Thank you. That's really great advice. I know that everybody finds biometrics very interesting, so I wanted to dig into that topic for a minute. Biometrics are a little bit different from some of the other things we've discussed because, say biometric data is stolen by a hacker, it's not like a stolen password that can be reset; biometrics are unique to individuals. And it impacts all of us, with our Alexa at home and all of the different devices and technology that we have now. So, Alex, I wanted to ask you about biometrics and particularly facial recognition.

I know that is a hot-button privacy issue, more so than a lot of other forms of data usage. Why do you think that is?

Alex D.: [00:47:10] You hit on it a little bit when you noted that you can't just easily change your face like you can change a username or password. The danger to victims of identity theft with respect to biometrics and facial recognition is really two things. You can't easily change your fingerprints or your face. And the cybercriminals can just wait: once they get the data, they can delay their use of it for years, because your fingerprint probably isn't going to change, and your face isn't going to change very much.

One thing, over the past decade, while we've seen these blockbuster big-box retailers that have had these huge breach cases, Target, Home Depot, we've seen that the damages in some of these cases have actually been less than what we might have originally expected, because credit card data was stolen, and credit cards can be canceled. You can call your card company and have fraudulent charges reversed, and they can reimburse you for the fraudulent charges. So, a lot of times a victim of identity theft, with respect to a credit card, will not actually be financially responsible for anything in connection with the breach or any fraudulent charges.

On the other hand, if your fingerprint is stolen, or your face is stolen in terms of facial recognition technology, you're going to have a hard time overcoming that 10 years later, should your face or fingerprint be used for something. The Supreme Court has actually signaled changing winds with respect to the way data privacy is evolving and some of the new types of data management.

There's a Supreme Court case from 2018 where there was a subpoena issued by the government to Sprint, the phone company, for cell-site location data, 127 days' worth of location data. Phone companies are subpoenaed regularly; it's not that out of the ordinary at all, but in this case it was a very broad amount of data. And what the Court said is that there's a big difference between the limited types of information that might have been requested decades ago and the massive troves of data that are collected today. You might have your nosy neighbor, for example, who might be called in as a witness; they may have observed, peering out their window, your comings and goings from your house. But Sprint is different from that. They're ever alert, and of course their memory is nearly infallible. So the Court thinks about that differently, and it signals that we should be thinking about it differently, in terms of retention of biometrics and these other new types of data, the amount of data, and the ways it can be used.

Lisa F.: [00:50:20] I have a question from the insurance broker standpoint. We have started seeing BIPA exclusions in our financial lines insurance policies, and I have received a question from some colleagues regarding BIPA coverage. If our employment practices liability policies are excluding BIPA due to privacy concerns, and our cyber policies are excluding BIPA due to exposure concerns, they're asking where does the BIPA coverage fall, and how do we obtain coverage for BIPA if the EPL and cyber policies are seeking to exclude it? Any thoughts on that?

Alex D.: [00:51:12] Sure. Yeah. I think it's very company-specific and scenario-specific. Obviously, I would highly recommend trying to find coverage, but as you're saying, that's easier said than done. We are seeing creative plaintiffs' attorneys and coverage counsel alleging that there's coverage under general liability policies.

There was actually a case we reported on in our April newsletter where coverage was found within the personal injury coverage of the policy. In that case, personal injury was defined to include a publication that violates a person's right to privacy. There was disclosure of somebody's personal biometric information to one third-party vendor, and the court ruled that, yes, sharing and disclosure of biometric data to one vendor constitutes publication, which under that definition of personal injury triggers coverage. So that's an example of a case where there was BIPA coverage, but obviously a cautious policyholder doesn't want to wait until litigation to find out whether they have coverage.

And all I can say is, work with the insurers to try to negotiate coverage. There are opportunities throughout. You mentioned EPL policies and cyber policies; general liability could come into play as well, and umbrella policies. So there are opportunities, and it's really just a matter of working with the insurance company to try to negotiate something.

In my experience, the insurance companies are still very much, as I said, taking a scenario-specific look at coverage, because so many of these scenarios are so new. So there is opportunity in a lot of situations for a real exchange of ideas: what's important to you, what are you willing to pay, and what language can we come up with to allow coverage?

Lisa F.: [00:53:25] Okay, great. That's helpful. Thank you. Before we wrap up biometrics, I know that BIPA in Illinois has obviously been at the forefront of this. Are there any other states that we need to be on the lookout for, outside of Illinois, which I know has had a number of class action lawsuits brought in the past few years?

Alex D.: [00:53:55] Well, I'll point out that the scope of the Illinois law isn't necessarily limited just to Illinois residents, Illinois companies, or Illinois courts. We've seen BIPA cases throughout the country. You do want to see a connection to Illinois in most cases, but we've seen California courts try BIPA cases, and I expect to see more of that. As Dan mentioned earlier, we are seeing biometrics creeping into amended breach notification statutes in other states. And I would expect to see more and more on biometrics as your average person, and some legislators, become increasingly aware of it.

Lisa F.: [00:54:43] Okay, great. I feel like we could probably talk about biometrics for another hour, but in the interest of time I think we will move to our last topic, which is regulatory and litigation trends. So, we talked about California and New York and Illinois a little bit; are there any other states that we should be watching right now?

I know you mentioned that all 50 states have breach notification requirements, but what about other states on the forefront of privacy and data protection right now?

Dan M.: [00:55:19] Well, last year when Texas amended its statute, they put together a commission to look at California and see how they can perhaps implement a similar type of statutory regime there. So, I think we will be able to look at Texas. One thing a little off the question, but not off topic, is what may be done federally. For years there have been bills introduced on a federal level on data privacy.

There were a few this year, and I think one may have just been introduced last week. And I think we are starting to see a kind of coherence or cohesion between the parties in really moving towards some sort of federal data privacy statute. I think in the next two years we may see it. One of the reasons I say that is that with COVID-19, we saw many employers collecting health information from individuals in order to determine if they should be allowed back into work, and Congress moved pretty quickly, in a bipartisan way, on two bills to protect the privacy of that information. So, there is some hope there. And there are many states looking at expanding how they view their data privacy statutes. In particular, I'll again mention the term personal information, which I think is continuing to broaden throughout the country. I think before long we'll see most states be on the same page, with the exception of California of course, in broadening that definition to include biometric information and other categories of information beyond what one traditionally thought of as PII.

Lisa F.: [00:57:22] Maybe if we have a follow-up webinar in a year or two, we will have a comprehensive federal law, like the GDPR, but we'll have to see about that. Does anybody have any last-minute thoughts or questions? Aarti?

Aarti S.: [00:57:42] No, I think this has really been a lot of good information, Dan and Alex. It's not easy to dig into some of these issues, and they're all sort of happening at the same time.

So, we really, really appreciate your insights.

Dan M.: [00:58:00] Thank you. It's great to talk about these topics, and really the truth is each topic lends itself to probably an hour or two of discussion on its own, but we hope we covered some of the basics and gave a broad view of these issues. And we hope that anyone listening who doesn't yet have some sort of cybersecurity program in place, or at least who hasn't looked at theirs in a long time, takes a good second look and does what they need to do to protect themselves.

Alex D.: [00:58:30] Yeah, thank you for having us.

Lisa F.: [00:58:32] Thank you, Dan, Alex, and Aarti. That was so informative, and I think that a lot of clients are going to find the information shared really helpful. So, thank you so much.

Alex D.: [00:58:46] Thank you.

Dan M.: [00:58:47] Thank you.

Copyright © 2024 Marsh & McLennan Agency LLC. All rights reserved. CA license #0H18131