Let’s face it, privacy is a busy space right now. Regulations, guidelines, best practices and privacy news are constantly developing. And sometimes, quite frankly, it’s hard to keep up with it all. Sure, we try to distill it down for you with our Daily Dashboard and weekly regional newsletters, but we thought it may help to distill things further for you by presenting the top-five most-read stories of the week. So here’s our inaugural list for this week, going back to last Friday (to be fair to that glorious end-of-the-week day).
Perhaps it’s a reflection of the flurry of privacy developments in the U.S. right now, but all five top news stories this week are U.S.-centric. No doubt, the California Consumer Privacy Act has rippled through the privacy community in the U.S. If this were April 2018, all five stories may well have been EU-centric because of the then-looming EU General Data Protection Regulation implementation.
1. Sen. Markey introduces ‘Privacy Bill of Rights Act’
2. Are GDPR-compliant companies prepared for the CCPA: An IAPP FAQ, by Caitlin Fennessy, CIPP/US
3. US state comprehensive privacy law comparison, by Mitchell Noordyke, CIPP/E, CIPP/US
4. The Privacy Advisor Podcast Special Edition: Edelson on his firm’s $925M privacy class-action win, by Angelique Carson, CIPP/US
5. Perspective: A milestone for privacy self-regulation, by Lou Mastria
The winds of regulatory oversight for artificial intelligence are blowing in the U.S. and Europe. The European Commission signed off on its Ethical Guidelines for Trustworthy AI earlier this month, the culmination of several months of deliberations by a select group of “high-level experts” plucked from industry, academia, research and government circles. In the advisory realm, the EU guidance joins forthcoming draft guidance on AI from a global body, the Organization for Economic Cooperation and Development.
Meanwhile, U.S. federal lawmakers want something on the books. A new bill proposed by Sen. Ron Wyden, D-Ore., Sen. Cory Booker, D-N.J., and Rep. Yvette D. Clarke, D-N.Y., would require large corporations to subject their algorithmic systems to automated decision system and data protection impact assessments. And, in February, U.S. representatives proposed their own guidelines for ethical AI in a House Resolution.
The EU’s guidance hinges on the notion of trustworthy AI, or AI that is lawful, ethical and robust. The fact that the detailed Brussels guidance went through a collaborative, and likely combative, multistakeholder tuning process is evident in its length and its complex Russian-doll-esque structure of components, principles and requirements. Still, coming from the same body that gave us the tectonic, privacy-plate-shifting GDPR, it could influence AI ethical standards across the globe.
Four principles, or “ethical imperatives,” call for AI systems to respect human autonomy, prevent harm, incorporate fairness and enable explicability. Another layer of guidance advises that AI respect human dignity, individual freedom, democracy, justice, the rule of law, equality, non-discrimination, solidarity and citizens’ rights. The document then translates those goals into concepts that apply directly to more technical considerations for AI, such as resilience to attack, data quality and privacy, avoidance of unfair bias, auditability, and transparency and explainability for AI-based decisions. Finally, a list of more practical methods for implementation follows.
The meaning of lawful AI
The official guidelines make clear that “A number of legally binding rules at European, national and international level already apply or are relevant to the development, deployment and use of AI systems today.” This emphasis on “lawful” AI seems to be an important distinction between this final version of the document and an earlier draft.
“The emphasis on Lawful AI in the official guidelines indicates that the implementation of existing law is part of the practice of AI Ethics,” said High-Level Expert Group Member Gry Hasselbalch, a co-founder of Denmark-based data policy think-tank DataEthics, in an email sent to Privacy Tech.
We can expect lawyers and decision-makers to read the tea leaves sprinkled throughout this document to get a sense of whether the prominence of the lawfulness concept indicates that current European law, such as the General Data Protection Regulation, the EU Charter of Fundamental Rights and anti-discrimination directives, suffices to deliver protections against potential AI-enabled harms.
There still is debate as to whether the GDPR’s right to an explanation applies to AI and decisions made by autonomous technologies. However, Chris Hankin, co-director of the Institute for Security Science and Technology and a professor of Computing Science at Imperial College London, said he thinks that aspects of that privacy regulation already address ethical AI considerations, particularly the right to an explanation.
Bottom line, the question people in the AI industry or entities employing AI want answered is: “Will Europe regulate or establish new laws for AI?”
Hasselbalch said there’s no telling yet. “We are an independent expert group, so what the official EU system will do, we don’t know. But I assume that they will listen to the experts they appointed themselves.”
The EU will soon be joined by the OECD’s Committee on Digital Economy Policy, which is set to publish its own draft recommendations for intergovernmental AI policy guidelines in May, based on final guidance approved earlier this year by its global expert group representing policy, privacy and corporate tech. Those recommendations are expected to mirror some of those seen in the EU guidance, including human-centered values and fairness, transparency and explainability, robustness, safety and accountability.
Many of these same principles are present in other guidance for ethical AI from national governments and trade groups, including the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers.
Next step: EU policy
For the EU, establishing advisory guidance is just the first half of the process. As noted in the guidelines, “To the extent we consider that regulation may need to be revised, adapted or introduced, both as a safeguard and as an enabler, this will be raised in our second deliverable, consisting of AI Policy and Investment Recommendations.” That policy crafting is currently underway.
Hankin suggested the European Commission could use the upcoming policy component of the guidelines to influence discussion around a directive or potential regulation. “I wouldn’t be surprised to see the Commission pick up the policy document when it’s produced and use it to start a debate … but we’re at the start of a very long process,” he said.
Greetings from Portsmouth, New Hampshire!
It was another busy week here at IAPP headquarters as we gear up for our Global Privacy Summit in Washington. This one’s shaping up to be … yes, you guessed it … our biggest one ever. Hopefully, you’ve purchased your ticket already, because space is limited, and seats are going quickly. For those who are attending, don’t hesitate to say “hi” if you see me. I’d love to connect and hear about what you’re up to.
Clearly, we’re not the only ones who are busy. I think it’s safe to say that the privacy profession as a whole, and especially here in the U.S. right now, is frantically working through all the operational, compliance, technological and legal challenges that arise day in and day out.
Knowing that everyone is so busy, we’re always tinkering with ways to present you with content that is easy to consume (or in-depth, if you need it that way). That’s why for years we’ve published short news summaries in our Daily Dashboard and weekly regional newsletters. But we also realize that sometimes even those are too much. Maybe you just need quick-hit, what’s-big-in-privacy-news-this-week content.
To help, we’re offering up the week’s top-five most-read stories each Friday. I include mention of that here, not only to introduce you to our weekly recap, but also because this week’s top-five most-read stories all involve U.S. privacy developments.
There was clearly lots of interest in Sen. Ed Markey’s proposed “Privacy Bill of Rights Act.” But not far behind were two resources on frequently asked questions about the CCPA and a comparison table of U.S. state comprehensive privacy laws. For the former, IAPP Senior Privacy Fellow Caitlin Fennessy focused on FAQs related to CCPA compliance for companies that are already GDPR compliant. For those interested, there are some super-practical answers in her post. Westin Privacy Fellow Mitchell Noordyke has been diligently tracking all things U.S. privacy law and, this week, offered a comparison table focusing only on comprehensive state privacy laws. We’ll continue to update this chart as more comprehensive bills come forth.
It was also yet another busy week for Privacy Advisor Editor Angelique Carson as she posted not one, but two, podcasts, the first of which was a special edition podcast with Jay Edelson on his firm’s big $925 million privacy class-action win last Friday. For those going to Summit in a couple of weeks, she’ll continue the conversation with Jay, along with Doug Meal, in a follow-up live podcast. A not-to-be-missed conversation for sure.
Last Friday, we also featured a write-up from the DAA’s Lou Mastria, who highlighted the self-regulatory work of the Advertising Self-Regulatory Council of the Council of Better Business Bureaus. Coming off their 100th public compliance action, Lou described what goes into these self-regulatory activities, arguing that it’s a “tested and proven approach to managing safeguards in dynamic marketplaces.”
OK, that’s enough from me for now, but in the meantime, hopefully you get a bit of time to relax before next week’s onslaught.
Greetings from Brussels!
CNIL President Marie-Laure Denis gave her first official interviews this week coinciding with the release of the French authority’s 2018 activity report. If you want an overview of the report’s main points, you can find an English summary here. As you will know, there was a change of guard at the CNIL in early February with the appointment of Denis. It was certainly welcome to see her inaugural interviews in “Le Monde” and “La Tribune.”
She stated that the GDPR has facilitated increased French awareness around data protection. The CNIL website registered a staggering 8 million unique visitors last year, according to one of her recent interviews, which is almost an 80% increase on the year before. This enhanced awareness was also reflected in a record number of complaints to the CNIL, up by 33%, with a third of the complaints centered around the dissemination of personal data on the internet, as well as 20% concerning marketing and commercial practices.
Denis also said there are approximately 17,000 DPOs acting on behalf of more than 51,000 organizations; notably, this number reflects internal, outsourced and “shared” DPO functions. With more than 4 million companies in France, Denis acknowledged that while not all organizations will require a DPO by default, she reconfirmed the CNIL’s estimate that 80,000 DPOs will be needed under the provisions of the GDPR.
There is still much uncertainty around the handling of complaints as the CNIL interacts with other EU member state DPAs. The CNIL continues to receive many questions involving corporate obligations around data protection and breach notifications. It is also interesting to note that breach notifications increased in 2018 after the GDPR came into effect. Conversely, leading French privacy pros informed me that in 2019 to date, breach notifications to the CNIL have dropped off significantly.
Denis stated that the authority will continue to support French organizations, acknowledging that not all entities have the same capacity to meet their obligations. However, given that the GDPR was adopted in 2016, Denis felt that sufficient time and support have been afforded to organizations to comply with the regulation, and it is now time to show more determination and firmness on controls and enforcement.
For 2019, priorities are clear: ensuring that data subject rights, such as access, rectification and deletion, are respected. More focus will be directed at processors and subcontractors as keys to personal data flows. Finally, the CNIL will also step up its monitoring of the new rights of minors and parental consent for children under 15.
At the macro level, Denis sees an expanded role ahead for the CNIL as an effective digital regulator, one that anticipates and innovates for the social and economic changes of the digital era. On the international scene, the CNIL intends to maintain a leading role within the EDPB, and outside the EU, it will continue to prioritize cooperation that fosters the convergence of data protection principles worldwide.
This year heralds new energy and promise in Marie-Laure Denis, and I think it is fair to say we have another strong and confident leader taking the reins of the French authority.
Greetings, fellow privacy professionals.
Hong Kong was privileged to host an all-star ISACA Asia Pacific conference in early April, where the IAPP was present and able to spread the word on the ongoing importance of privacy. Fellow panelists and I discussed the convergence of cybersecurity and privacy, its impact and the areas of growing concern.
One such area relates to the massive growth of internet-of-things devices and the cybersecurity and privacy risks they pose to all. With the significant drop in production costs of sensors and cameras, we have seen IoT devices with embedded sensors and cameras become ubiquitous in everyday life, from wearables and household gadgets to children’s toys. Just this week, an Australian company that sells smartwatches allowing parents to remotely monitor their children had to shut down its service after hackers were able to exploit vulnerabilities to track and change a child’s location while gaining full access to the parent’s personally identifiable information. Also this week, one of Hong Kong’s top celebrity couples became embroiled in a scandal over a driver who made a racy recording of the pop icon with another woman (not his wife…) in the back passenger seat. The hot debate now includes the rights of the passengers (the victims) and whether the driver who recorded the video was in breach of Hong Kong laws.
In other news, the IAPP will hold our very own Data Ethics KnowledgeNet Meeting & Privacy After Hours, Thursday, 23 May, with a focus on ethical accountability and artificial intelligence. Data ethics has become a hot topic recently, and for a good introduction to data ethics, I encourage you to view an IAPP web conference recording here. The Privacy Commissioner for Personal Data, Hong Kong office, will also be hosting an event of its own on “Data Ethics in Action” in May, with a brief synopsis here: “We revised the Best Practice Guide on Privacy Management Programme … last year to assist organisations in constructing a comprehensive PMP. We recently released a study report on the implementation of PMP by data users in Hong Kong. Against this background, we have assembled corporations and industry leaders from different professions and industries to speak at the Symposium on their insights and experience on how their organisations develop their PMP and adopt data ethics to enhance accountability and trust with their customers.” If you are in Hong Kong and interested in attending, then please click here to register.
Finally, great to see the IAPP Asia Pacific Forum website live now with all the details and schedule for the event in July in Singapore. Looks like there will be some really great sessions and speakers from all over the globe, and I look forward to seeing you there. I will be moderating a panel called “GDPR (1 Year Later): Lessons From Data Breach Victims and Security Professionals,” so please come by!
Keep safe, keep secure.
Redmond hopes to lure Uncle Sam’s spy agencies, military away from Amazon
Microsoft has set up two new Azure cloud regions in the US – dubbed Azure Government Secret regions – to store data involving American national security. The services are in private preview, and are pending official government accreditation.
The Windows giant hopes the pair of regions will obtain a Dept of Defense Impact Level 6 badge, which would allow it to store and process information classified as secret. It is also looking for Intelligence Community Directive (ICD 503) accreditation.
Each region consists of at least two availability zones, and each availability zone lives on its own individual server farm.
The Azure Government Secret data centers are so secret Microsoft doesn’t disclose their location, only stating on Thursday that they are located more than 500 miles apart.
The new regions join Microsoft’s six existing Azure Government regions, which have now been certified as IL5, which means they are suitable for controlled but unclassified information.
“With our focus on innovating to meet the needs of our mission-critical customers, we continue to provide more PaaS features and services to the DoD at IL5 than any other cloud provider,” said Lily Kim, general manager for Azure.
The tech titan claims its cloud services are used by nearly 10 million people toiling for Uncle Sam, across more than 7,000 government agencies.
So what makes a data center fit for restricted and secret government info? Microsoft said it’s down to secure, native connections to classified networks, hardware encryption and storage of cryptographic keys, storage and compute isolation capability – with every virtual machine sitting on its own physical node – and personnel consisting of security-cleared US citizens, among other things.
The announcement this week comes at a time when the US government is working hard to consolidate and modernize its IT footprint, in line with the requirements of the Federal Technology Acquisition Reform Act (FITARA) and its extension, the Data Center Optimization Initiative (DCOI).
Since 2014, these initiatives have helped 24 federal agencies close 6,250 data centers – although the definition of a data center, in this case, is any room with at least one server in it.
More recently, the 2018 ‘Federal Cloud Computing Strategy — Cloud Smart’, the first cloud policy update in seven years, promoted public cloud as a more than adequate alternative to on-premises data centers run by government agencies.
“To keep up with the country’s current pace of innovation, President Trump has placed a significant emphasis on modernizing the federal government,” said Suzette Kent, federal CIO.
“By updating an outdated policy, Cloud Smart embraces best practices from both the federal government and the private sector, ensuring agencies have capability to leverage leading solutions to better serve agency mission, drive improved citizen services and increase cyber security.”
Another cloud vendor competing for government secrets is AWS: Microsoft beat its competitor to the punch when it became the first hyperscale cloud vendor to obtain Impact Level 5 provisional authorization, but it was AWS that managed to open the first cloud data centers with provisional IL6 authorization, in late 2017.
Both cloud behemoths are competing for JEDI, the controversial ten-year contract to provide cloud services to the Pentagon, worth up to $10bn and designed for just one vendor. Amazon has been widely seen as the front-runner in the race, while IBM and Oracle both complained that the contract was anti-competitive; Oracle even challenged it in a federal court. ®
The California Consumer Privacy Act is top of mind for many privacy professionals across the U.S., who are working to leverage their GDPR preparation to build CCPA-compliance programs. They are learning that while their recent GDPR preparation is helpful, the CCPA has nuanced requirements that go beyond the GDPR. Emphasis is often placed on the novel “Do Not Sell My Personal Information” link.
After listening to two useful web conferences comparing the CCPA and GDPR (available here and here, in case you missed them), I wondered if companies outside of the U.S. have realized that California-specific adjustments will be needed. If not, the CCPA’s private right of action and what many consider the litigious nature of the U.S. might soon draw the attention of foreign C-suites.
Participants in the two recent web conferences posed a number of questions, highlighting the challenges privacy pros face in understanding the developing legal landscape. We thought addressing the following 10 frequently asked questions might be helpful to privacy professionals around the globe. A general caveat, though: The following should not be construed as legal advice.
Question: Does the CCPA apply to companies based outside of California?
Answer: Yes, the CCPA will generally apply to a business doing business in California and collecting the personal information of California residents. The CCPA grants a “consumer” various rights with regard to personal information held by a “business,” including the rights of notice, access, deletion, portability and reasonable security. It also requires a business that “sells” “personal information” to “third parties” to provide a clear and conspicuous link on the business’s internet homepage, titled “Do Not Sell My Personal Information,” to an internet webpage that enables a consumer or a person authorized by the consumer to opt out of the sale of the consumer’s personal information, among other requirements. The definitions cited above are critical to understanding the law’s jurisdictional and material scope and are explained in response to the following FAQs.
Q: Does the CCPA apply only to consumers or also to employees and other individuals?
A: While the CCPA refers throughout to a “consumer,” currently, the law’s definition of “consumer” is not limited to those engaged in commercial activity, but rather is tied to the residency status of a natural person. The bill’s definition of consumer is “a natural person who is a California resident, as defined in Section 17014 of Title 18 of the California Code of Regulations, as that section read on September 1, 2017, however identified, including by any unique identifier.” Further, the bill’s definition of personal information includes “professional or employment-related information.”
Q: Does a California resident have rights under the CCPA if outside of California? Does the CCPA apply to nonresidents visiting California?
A: The bill’s definition of a California resident is provided in Section 17014 of Title 18 of the California Code of Regulations, as that section read Sept. 1, 2017. This provides, in part:
[quote]“The term ‘resident,’ as defined in the law, includes (1) every individual who is in the State for other than a temporary or transitory purpose, and (2) every individual who is domiciled in the State who is outside the State for a temporary or transitory purpose. All other individuals are nonresidents.”[/quote]
The CCPA also states that:
[quote]“The obligations imposed on businesses … shall not restrict a business’s ability to … [c]ollect or sell a consumer’s personal information if every aspect of that commercial conduct takes place wholly outside of California. For purposes of this title, commercial conduct takes place wholly outside of California if the business collected that information while the consumer was outside of California, no part of the sale of the consumer’s personal information occurred in California, and no personal information collected while the consumer was in California is sold. This paragraph shall not permit a business from storing, including on a device, personal information about a consumer when the consumer is in California and then collecting that personal information when the consumer and stored personal information is outside of California.”[/quote]
The CCPA further provides that a covered business will not be “required [to place] links and text on the homepage that the business makes available to the public generally, if the business maintains a separate and additional homepage that is dedicated to California consumers and that includes the required links and text, and the business takes reasonable steps to ensure that California consumers are directed to the homepage for California consumers and not the homepage made available to the public generally.”
Taken together, these provisions highlight the decisions both U.S. and foreign businesses will face when collecting and sharing personal information that might include the data of Californians. Should businesses provide the required “Do Not Sell My Personal Information” link and protections globally, only in the U.S., or just to Californians?
Q: Does the CCPA govern nonprofit organizations?
A: The CCPA defines “business” as a sole proprietorship, partnership, limited liability company, corporation, association or other legal entity that is organized or operated “for the profit or financial benefit of its shareholders or other owners,” that collects consumers’ personal information, or on the behalf of which such information is collected and that alone or jointly with others determines the purposes and means of the processing of consumers’ personal information, does business in the state of California, and satisfies certain revenue or data-processing thresholds.
Q: Does the CCPA apply to small businesses?
A: The CCPA applies to “businesses” that meet one or more of the following thresholds:
- Has annual gross revenues in excess of $25,000,000, as adjusted pursuant to Paragraph 5 of Subdivision A of Section 1798.185.
- Alone or in combination, annually buys, receives for the business’s commercial purposes, sells or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households or devices.
- Derives 50% or more of its annual revenues from selling consumersâ€™ personal information.
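Since the thresholds are disjunctive, meeting any single one brings an otherwise qualifying for-profit entity into scope. A minimal sketch of that size test follows; the function and parameter names are my own illustrations, it deliberately ignores the law's other applicability conditions (for-profit status, doing business in California), and it is of course not legal advice:

```python
def ccpa_thresholds_met(annual_gross_revenue: float,
                        consumer_records_handled: int,
                        share_of_revenue_from_selling_pi: float) -> bool:
    """Rough sketch: the CCPA's size test is met if ANY ONE of the
    three statutory thresholds above is satisfied."""
    return (
        annual_gross_revenue > 25_000_000           # gross revenues over $25M
        or consumer_records_handled >= 50_000       # PI of 50,000+ consumers/households/devices
        or share_of_revenue_from_selling_pi >= 0.5  # 50%+ of revenue from selling PI
    )

# A small company under all three thresholds is out of scope on size alone.
print(ccpa_thresholds_met(1_000_000, 10_000, 0.0))  # False
# But crossing just the records threshold is enough.
print(ccpa_thresholds_met(1_000_000, 60_000, 0.0))  # True
```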
Q: Does “selling” include sharing personal data with affiliated organizations?
A: Answering this question requires understanding several definitions under the CCPA, including not only “selling,” but also “business,” “business purposes,” “service provider,” “personal information” and “third party.”
“Sell” is defined as “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.”
There are exceptions to this general definition, including when the individual directs the transfer of information or the business shares information pursuant to a contract with a service provider for a business purpose of which the consumer has been informed.
Under the CCPA, sharing personal information with an affiliate for valuable consideration would generally be considered a sale if the affiliate is considered a third party. This would be the case unless the affiliate controls or is controlled by the “business” and shares common branding with the business or is a service provider with whom the information is shared pursuant to a contract for a business purpose of which the consumer has been informed.
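That carve-out logic can be sketched as a simple decision helper. The function name and boolean inputs are hypothetical simplifications of the statutory definitions, intended only to make the branching visible, not to capture every exception:

```python
def transfer_is_sale(for_valuable_consideration: bool,
                     affiliate_under_common_control: bool,
                     shares_common_branding: bool,
                     is_contracted_service_provider: bool,
                     consumer_informed_of_purpose: bool) -> bool:
    """Sketch of when sharing PI with an affiliate counts as a 'sale'."""
    if not for_valuable_consideration:
        return False  # no valuable consideration, no sale under the definition
    # An affiliate under common control that also shares common branding
    # is treated as part of the "business," not a third party.
    if affiliate_under_common_control and shares_common_branding:
        return False
    # Disclosed, contracted service providers are also excepted.
    if is_contracted_service_provider and consumer_informed_of_purpose:
        return False
    return True
```

Note that common control without common branding still leaves the affiliate a third party under this reading, so the transfer would remain a sale.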
Q: If a business has entered into a controller-processor data protection agreement with a vendor, is it reasonable to assume the transfer of data to that vendor will not be considered a “sale” under the CCPA? How should businesses categorize and manage service providers differently from third parties in terms of contracts and CCPA obligations?
A: To avoid a data transfer to a vendor being considered a “sale” under the CCPA, the transfer must be to fulfill a business purpose of which the consumer has been informed, and it must be governed by a written contract. The contract with a “service provider” to process personal information on behalf of the “business” must prohibit the service provider “from retaining, using, or disclosing the personal information for any purpose other than for the specific purpose of performing the services specified in the contract for the business, or as otherwise permitted by [the CCPA], including retaining, using, or disclosing the personal information for a commercial purpose other than providing the services specified in the contract with the business.”
Q: Does the required “Do Not Sell My Personal Information” link need to go on mobile sites/apps, as well?
A: A business that sells personal information about the consumer to third parties must provide a clear and conspicuous link on the business’s internet “homepage,” titled “Do Not Sell My Personal Information.” In the case of an online service, such as a mobile application, “homepage” means the application’s platform page or download page; a link within the application, such as from the application configuration, “About,” “Information,” or settings page; and any other location that allows consumers to review the notice required before downloading the application.
Q: What are key differences between the CCPA and GDPR on which businesses adapting a GDPR-compliance program to CCPA should focus?
A: While there are numerous differences, described in great detail here, areas to focus on include:
- The definition of personal data under the CCPA, which is broad and explicitly includes household data. It states, in part: “‘Personal information’ means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household …”
- New consumer rights and associated business requirements, particularly those associated with the “sale” of personal data, including the requirement that a business that sells a consumer’s personal information to third parties provide a clear and conspicuous link on its internet homepage, titled “Do Not Sell My Personal Information,” to an internet webpage that enables the consumer to opt out of the sale of the personal information.
- The categorization of information shared for “business purposes” versus “commercial purposes” and required disclosures pursuant to a consumer access request.
- Required contractual terms when sharing personal data with “service providers.”
Q: Does the CCPA enter into force Jan. 1, 2020, or July 1, 2020?
A: The CCPA enters into force Jan. 1, 2020. On Aug. 31, 2018, the California State Legislature passed SB-1121, which:
- Extended the deadline for the attorney general to adopt the law’s implementing regulations from Jan. 1, 2020, until July 1, 2020.
- Delayed the attorney general’s ability to enforce the bill until six months after the publication of those regulations or July 1, 2020, whichever comes sooner.
Given those amendments, the attorney general could enforce the new law prior to July 1, 2020, only if the attorney general adopts implementing regulations prior to Jan. 1, 2020. The fact that the amendments provided the attorney general additional time beyond Jan. 1 to adopt the regulations makes it unlikely (though not impossible) that they will be adopted sooner.
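That timing rule lends itself to a worked example. Here is a minimal sketch in Python; the function name and the 182-day approximation of “six months” are our own, purely illustrative:

```python
from datetime import date, timedelta

def ccpa_enforcement_date(regs_published):
    """Earliest date the attorney general may enforce the CCPA:
    six months after the implementing regulations are published,
    or July 1, 2020, whichever comes sooner."""
    six_months_later = regs_published + timedelta(days=182)  # ~six months
    return min(six_months_later, date(2020, 7, 1))

# Regulations adopted before Jan. 1, 2020: enforcement could begin early.
print(ccpa_enforcement_date(date(2019, 12, 1)))  # 2020-05-31
# Regulations adopted after Jan. 1, 2020: the July 1 backstop applies.
print(ccpa_enforcement_date(date(2020, 3, 1)))   # 2020-07-01
```

As the answer above notes, only adoption of the regulations before Jan. 1, 2020, would move enforcement earlier than July 1, 2020.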
It’s like talking to my children, sighs marketing bigwig
Huawei top brass took to the stage in Shenzhen this week to insist that everything was fine and dandy in the company’s world, despite the shrieking from US lawmakers.
In front of an audience of 750, deputy chairman Ken Hu described 2018 as an “eventful” year for the company and thanked the assembled media for “paying so much attention” to the Chinese outfit.
US: We’ll pull security co-operation if you lot buy from Huawei
If 2018 was eventful, it’ll be interesting to see how Hu describes 2019. The US has ramped up the rhetoric by threatening its pals with a withdrawal of security cooperation if they buy kit from the company, and the UK’s Huawei Cyber Security Evaluation Centre (HCSEC) gave the company a good kicking over some decidedly whiffy coding practices.
On the plus side for Hu, at least the likes of Germany have managed to resist the increasingly shrill demands from the US to ditch the company’s gear.
Still, Hu was happy with how the whole 5G thing was going and flung up slides showing that in the first year of the technology, there were 100,000+ 5G base stations and 40+ phones. Except there aren’t.
Handsets remain a scarce commodity. Huawei’s own 5G flagship, the foldable Mate X, was conspicuous by its absence (although as things have turned out, that might not be such a bad thing). Catherine Chen, president of the company’s Public Affairs and Communications Department, explained that Hu was really talking about contracts signed with telcos, and told The Register that Huawei had actually shipped 70,000 base stations, and the deployment of those were up to the telcos concerned.
Exaggeration and hyperbole about 5G? Say it ain’t so!
The orange elephant in the room did, however, have to be addressed, and Hu stated the company believes that “trust or distrust should be based on fact”, pointing to the company’s new transparency centre in Brussels and the multibillion-dollar transformation plan to deal with its dodgy code. He also congratulated the European Union on its privacy efforts while pointedly ignoring the US.
Let’s talk security turkey
John Suffolk, Huawei’s security boss and former UK government IT bigwig, was a little more blunt.
While he accepted that Huawei’s code contained a lot of “clutter” that had built up over the years, he felt the company was being singled out for special attention “because we’re a Chinese company, the spotlight will always be on us”.
He also promised that the transformation plan, details of which have been infuriatingly limited up to now, would be presented in the coming months.
Suffolk, of course, has the final veto on Huawei’s products from a security standpoint, with the company’s Independent Cyber Security Lab (ICSL) reporting to him with data from internal testing.
The company also allows its code to be inspected (as by HCSEC), although as for open-sourcing the whole lot and being done with it, Suffolk scoffed: “Do you honestly expect we’re going to open-source our crown jewels?”
Though Suffolk insisted the company will comply with every certification requirement and standard set by its customers, and that the company would be “as open as possible”, he said: “Some people you’re never going to convince.” He went on to say that countries such as the US were doing their citizens a “disservice” by barring Huawei from the marketplace.
As well as putting an America-sized dent in the giant’s revenues.
Certainly, the US government presents a challenge. Chief marketing officer Peter Zhou described explaining the technology to officials as similar to how he would explain it to his children, resorting to PlayStation metaphors to get the point across.
Zhou also pointed out that the barring of Huawei from the US marketplace would not make the country a leader in 5G. The spectrum allocation, for one thing, will make international roaming a tad tricky without phones becoming more complicated (and expensive).
However, the furore generated by the US, which Chen said after a decade of rumbling turned “radical” during the Trump presidency, has brought some benefits. She reckoned that the controversy had done much to publicise and raise awareness of 5G and increase the size of the market.
While Washington’s shenanigans were “not the biggest problem” faced by the company over its 30-year history, Chen said Huawei would still very much like in on the US market, even though many telco contracts have now been signed with other 5G providers.
The company is therefore putting its faith in the lumbering US judicial system, which Chen described, without a trace of irony, as “fair, just and transparent”.
In the meantime, with regard to continuing accusations of Chinese government interference, the company continues to trot out its corporate line, Jerry Maguire-style: “Show us the evidence.” ®
The European Data Protection Board has released a version of its Guidelines 2/2019 on processing personal data under Article 6(1)(b) of the EU General Data Protection Regulation for public consultation. The guidelines cover the applicability of data processing “in the context of contracts for online services, irrespective of how the services are financed.” The EDPB document offers an analysis of Article 6(1)(b), how the article interacts with “other lawful bases for processing,” and its applicability in certain situations. The EDPB announced it will accept comments on the guidelines until May 24.
On Tuesday, April 2, 2019, the Texas House Committee on Business and Industry held a public hearing (testimony starts at the 4:13:55 mark) on House Bill 4390, also known as the Texas Privacy Protection Act, and House Bill 4518, also known as the Texas Consumer Privacy Act. An overview of both bills as filed is available here. Both bills received a mix of testimony in support and opposition. In addition to the verbal testimony heard in committee, several other stakeholders registered their testimony for or against each bill.
Testimony on HB 4390, aka the Texas Privacy Protection Act
Rep. Giovanni Capriglione, R-Texas, the bill’s author, began by presenting it to the committee – no notes, nothing rehearsed, just a candid conversation on why he authored this legislation. Capriglione started by pointing out that another representative was using hand sanitizer and that now we will surely all see advertisements for hand sanitizer in the next few days. He added, “What my bill aims to do is to provide a little bit more regulation, a little bit more oversight, into the information that is being collected on us, about us, every single day without our knowledge – a lot of times without our permission – and is being used in ways that can negatively affect our credit scores, our health insurance premiums, our car insurance premiums, and even what kind of cars and hotels you’ll be able to get into.”
Stephen Scurlock, testifying neutral on the bill on behalf of the Independent Bankers Association of Texas, stated that other entities should be subject to regulations similar to the Gramm-Leach-Bliley Act, just as banks are. He went on to testify that “any entity keeping personally identifiable information should be subject to the same rigorous standards we live under, and while breaches happen, both consumers and banks who do bear the brunt of losses from these breaches, would be much better off if that standard were adhered to.” John Heasley, testifying in favor of the bill on behalf of the Texas Bankers Association, also emphasized that other entities should be subject to regulations similar to the GLBA.
Sarah Matz, director of state government affairs for the Computing Technology Industry Association, expressed opposition to the bill, stating the legislation, as filed, is modeled after the California Consumer Privacy Act and “while these laws were developed with the best of intentions, they are more likely to result in significant compliance costs and stifle innovation, rather than improving consumer safety.”
She suggested that instead of moving forward with legislation at the state level, Texas should continue to study the evolution of the CCPA and continue a dialogue with impacted stakeholders.
Deborah Giles, testifying against the bill on behalf of the Texas Technology Consortium, conveyed that data privacy regulation should be handled at the federal level and not at the state level. She added, “We’ve already seen signs of every state doing everything differently, and we look at the internet and data privacy, and feel like if 50 states have 50 different laws, what it could do as far as the consequences of costing business would be exorbitant.”
In response, Rep. Michelle Beckley, D-Texas, raised concern that federal legislation could take too long, stating that “right now, [federal legislation] seems like it’s just an idea, whereas we have something we can vote on.” Ms. Giles jovially responded, “I completely agree with you – it’s the other states that scare the ‘begeebees’ out of us.”
James Hines, who also opposed the bill on behalf of the Texas Association of Business, stated that “we do urge that we work with our congressional members on a national framework, and the reason we are in favor of a single national framework is because of the nature of the internet – the internet has no borders.”
Rep. Jared Patterson, R-Texas, responded, “…we can see what works in the different states. Texas a lot of times leads the way on that, and then other states adopt what works best… I think we can probably do it better than DC.” Beckley also added that states adopting their own data privacy laws could be the very thing that leads to a single national framework.
The final witness, Chris Humphreys, testifying in favor of the bill on behalf of himself and the Anfield Group, stated, “It’s inevitable, [the EU General Data Protection Regulation] or GDPR-lite is going to be here at the federal level eventually. We are delaying the inevitable by not doing anything now.” Humphreys said the bill presents an opportunity for Texas to handle data privacy its own way, and that federal legislation may ultimately land somewhere between Texas and California.
Capriglione closed on the bill by respectfully disagreeing that Congress will reach consensus on federal legislation anytime soon, adding that Texas has an opportunity to implement these protections now. He also clarified that this legislation is not modeled after the CCPA.
Testimony on HB 4518, aka the Texas Consumer Privacy Act
The committee received far less testimony on HB 4518, but this was in large part due to the fact that most witnesses did not want to repeat similar testimony just laid out on HB 4390.
Rep. Martinez Fischer, D-Texas, said, “I fully appreciate and recognize that there might be ‘higher-ups’ in the federal government that could grade our papers on this, and come up with a solution that can be applied to the entire nation. But unless and until that happens, I think we can’t just sit on our hands and watch time go by.”
The only additional witness who testified on HB 4518 but did not testify previously on HB 4390 was David Foy, testifying against the bill on behalf of RELX/Lexis Nexis. Foy stated that this bill is very close, if not identical, to the CCPA and that California is now on the third iteration of that legislation. He stated that “there are currently 40 bills filed right now in California being discussed to go back and fix and try to address what happened in 2018.” He added that, to date, there have been about 30 such bills filed nationwide and that California is the only state to have adopted one.
He reiterated that experts from the privacy and business industries need to collaborate to get this right before implementing legislation too quickly.
Next legislative steps
Now that both bills have received public testimony, they await a committee vote and then would head to the full Texas House of Representatives. From there, the bills still need to clear committee and floor votes in the Texas Senate before heading to the governor’s desk. As the 86th Texas Legislative Session ends May 27, both bills are racing against the clock.
The European Commission released a 255-page study on EU General Data Protection Regulation data protection certification mechanisms, examining Articles 42 and 43 of the Regulation, according to Hunton Andrews Kurth’s Privacy & Information Security Law Blog. Meanwhile, the European Data Protection Board is now accepting comments on Guidelines 2/2019 on the processing of personal data under Article 6(1)(b) GDPR in the context of the provision of online services to data subjects. Comments will be accepted until May 24.
Following the implementation of the EU General Data Protection Regulation nearly a year ago, European regulators are struggling to keep pace with data breach notifications, The Wall Street Journal reports. IAPP President and CEO J. Trevor Hughes, CIPP, said, “There’s been a real concern about the sheer volume of work that is arriving at the regulators’ doorsteps.” Zurich Insurance Group AG Global Head of Cyber Risk Lori Bailey expects that with time, there will be more predictability for how regulators will react and fine companies for incidents but noted, “we’re just not there yet.” Meanwhile, Insurance Europe released a position paper in response to the European Commission stocktaking exercise on the application of the GDPR. (Registration may be required to access this story.)
In this week’s Privacy Tracker global legislative roundup, learn about the European Commission’s study on data protection certification mechanisms under the EU General Data Protection Regulation. EU Commissioner for Justice, Consumers and Gender Equality Věra Jourová explains why the U.S. government needs to adopt stronger privacy laws, Hong Kong may be preparing to alter its position on the enforcement of data breaches, and the Office of the Privacy Commissioner of Canada announced it plans to revise its policy position on trans-border data flows under the Personal Information Protection and Electronic Documents Act.
California Senate Bill 561, the proposed bill created to expand the private right of action under the California Consumer Privacy Act, has been referred by the state’s Senate Judiciary Committee to the Senate Appropriations Committee by a 6-2 vote.
Czech Republic President Miloš Zeman signed the Data Protection Act and the Accompanying Act into law.
The European Commission released its study on data protection certification mechanisms under the EU General Data Protection Regulation.
The European Data Protection Board announced it will accept comments on Guidelines 2/2019 on the processing of personal data under Article 6(1)(b) of the EU General Data Protection Regulation until May 24.
Ghana Minister of Communication Ursula Owusu-Ekuful announced the country’s government intends to launch a Cyber Security Authority to handle cyber incidents.
Amendments to Massachusetts’ Data Breach Notification Law went into effect April 11.
Sen. Ed Markey, D-Mass., introduced the “Privacy Bill of Rights Act.”
A Russian court fined Facebook approximately $47 for violations of the country’s data localization rules.
TheÂ U.K.Â Information Commissioner’s Office has proposed standards that would prohibit children from “liking” posts on social media platforms.
In this post for Privacy Tracker, Marval, O’Farrell & Mairal Partner Diego Fernández looks at the findings of Argentina’s Agency of Access to Public Information Annual Report 2018.
In this post for Privacy Tracker, GCA Advogados Partner Ana Carolina Cagnoni, CIPP/E, looks at the current state of Brazilian data protection rules and the upcoming future of the country’s General Data Protection Law.
Almost a year into the implementation of the EU General Data Protection Regulation, Falck A/S Legal Counsel Natalija Bitiukova, CIPP/E, CIPM, FIP, offers a look at Lithuania’s implementation efforts and discusses ongoing and upcoming investigations scheduled for the year ahead.
In this article for The Privacy Advisor, Santa Clara University’s Cyrus Borhani, CIPP/US, and Justin Banda report on Spain’s approval of a controversial data protection law intended to align the country’s rules with the EU General Data Protection Regulation, and the concern that its application has deviated from the GDPR’s intended effect.
Australia’s Parliamentary Joint Committee on Law Enforcement has asked the government to be more transparent regarding plans to allow state and territory law enforcement access to the country’s face-matching service.
In a blog entry for Ince Gordon Dadds, Managing Associate Simon Cheng, CIPM, and Trainee Solicitor Suki Fung speculate that Hong Kong is preparing to alter its position on the enforcement of data breaches.
The Office of the Privacy Commissioner of Canada announced it plans to revise its policy position on trans-border data flows under the Personal Information Protection and Electronic Documents Act.
The Canadian government is “actively considering” regulations against social media companies.
In an interview with Digiday, European Data Protection Supervisor Giovanni Buttarelli discussed the landscape of EU General Data Protection Regulation adoption among media and advertising businesses.
Ireland’s Data Protection Commission determined Irish citizens do not have the “absolute right” to have their names spelled correctly in public records.
At the 12th annual Data Protection Practitioners’ Conference, U.K. Information Commissioner Elizabeth Denham said implementation of the EU General Data Protection Regulation is at a “critical stage.”
In a post for Politico, EU Commissioner for Justice, Consumers and Gender Equality Věra Jourová explains why the U.S. government needs to adopt stronger privacy laws.
Sens. Cory Booker, D-N.J., and Ron Wyden, D-Ore., sponsored the Algorithmic Accountability Act, with Rep. Yvette Clarke, D-N.Y., sponsoring a House of Representatives equivalent.
A bill introduced by Sens. Mark Warner, D-Va., and Deb Fischer, R-Neb., would prohibit the use of alleged deceptive practices by social media companies.
The U.S. Department of Justice announced the release of a white paper on the Clarifying Lawful Overseas Use of Data Act.
The U.S. House of Representatives passed a restoration bill to reinstate net neutrality rules with a 232-190 vote.
U.S. lawmakers held hearings regarding content moderation as the conversation continues around how to regulate big tech while protecting free speech.
A coalition of advertising trade groups has joined forces to work with Congress on a federal U.S. privacy bill.
As the U.S. Federal Trade Commission holds a series of hearings to review privacy regulations, CreativeFuture has raised concern over adopting laws similar to the EU General Data Protection Regulation.
As files pile up, customer numbers grow, storage systems spread, it’s only going to get worse
Sponsored One customer, one customer order, right? Wrong.
Sales will have a copy of the original, as will shipping, who have probably copied it to a desktop. That’s two or three, right there. Credit control will receive a copy, via email, that might get stored on a network drive with a copy then sent to accounts receivable. Then the backup and archival process starts.
Repeat this, every day, and even the smallest company is soon swimming in copies of the same document.
Welcome to the world of mass data fragmentation – copying, slicing, dicing and then storing, in a multitude of locations, something that’s termed “secondary data.” That is, the data that sits outside the transaction systems in production databases. We’re thinking about data like backups, file and object storage, non-production test and development files, search and analytics. Archived data, too.
Why should you care? One reason is the hidden cost of storing those duplicates. If, and when, it comes to consolidating, you won’t know where to begin. And consolidate you should, for how else can you be sure that everybody has the absolute latest and definitive view or understanding of the customer?
Stuart Gilks, systems engineering manager at data management company Cohesity, reckons mass data fragmentation is a function of data volume, infrastructure complexity, the number of physical data locations and cloud adoption. And guess what? They’re all growing at an astounding rate.
IT systems aren’t becoming any less complex, thanks to a combination of organic and inorganic IT growth. A succession of different project owners and IT teams layer different IT systems atop each other over the years, each of which contains secondary data and few of which talk to each other easily.
As far as data volume goes, enterprises are producing data more quickly than they can manage it. Last year, Cohesity surveyed 900 senior decision makers from companies across six countries, with 1,000 employees or more. Ninety eight per cent of them said their secondary storage had increased over the prior 18 months, and most said that they couldn’t manage it with existing IT tools.
Not content with generating more data than they can handle, companies are starting to fling it around more. They started by storing it with single cloud providers, but quickly gravitated to hybrid cloud and multi-cloud systems. Eighty five per cent are using multi-cloud environments, says IBM.
These multi-cloud environments spread data over different domains, each of which usually has its own data management tool. Oh, joy.
“You’ve got a proliferation of locations and you’ve got a proliferation of silos that have a specific purpose. You’re almost generating a problem in three dimensions,” Gilks says. “This makes it difficult to manage systems, risk and efficiency and deliver business value, especially at a time when budgets aren’t going up.”
Not all this data duplication is haphazard, mind. Organisational and legal drivers sometimes force companies to fragment their secondary data. Compliance or security concerns may make it necessary to draw hard lines between different departments or customers, serving each with different copies of the same data.
Multi-tenancy is a good example. You may provide a service to one company or department but be forced to isolate their data completely from company B in the same computing environment, even if some of it is identical. Other reasons may stem from office politics. Server huggers lurk in every department. We said there might be well-understood reasons for creating data silos, but we didn’t say they were all good.
This mass data fragmentation problem creates several impacts that can cripple a business.
The first is a lack of visibility. This secondary data is valuable because there is a wealth of corporate value locked up inside it. Analytics systems thrive on data ranging from call centre metadata to historical sales information. If data is the new oil, then carving it up into different silos chokes off your fuel supply.
The second is data insecurity. Much of that secondary data will be sensitive, including personally-identifiable customer information. Someone who stumbles on the right piece of secondary data in your organization could find and target members of your skunkworks product research team, or customer list, or email everyone in your company with a list of senior management salaries. None of these outcomes are good.
The third, linked impact is compliance. GDPR was a game-changing regulation that made it mandatory to know where your data is. When a customer demands that you reproduce all the data you hold on them, you’d better be able to find it. If it’s smeared across a dozen corporate systems and difficult to identify let alone retrieve, you’re in trouble.
Bloat and drag
Then, there’s the effect on business agility. Developing new systems invariably means supporting and drawing on secondary data sources. The Cohesity survey found that 63 per cent of respondents had between four and 15 copies of the same data, while 10 per cent had 11 copies or more. Those files aren’t just located on a company’s premises; they’re also stored off-site.
Developing new systems while ensuring integrity across all of those file copies might feel like pulling a garbage dump up a mountain. It would not only affect IT’s agility to support business requirements, but would bloat development budgets too.
The numbers bear this out. Forty eight per cent of those answering Cohesity’s survey spent at least 30 per cent (and up to 100 per cent) of their time managing secondary data and apps. The average IT department spent four months of the working year grappling with fragmented secondary data.
This leaves IT employees feeling overworked and underappreciated. Over half of all survey respondents said that staff were working 10 hours of overtime or more to deal with mass data fragmentation issues. Thirty eight per cent are worried about “massive turnover” on the IT team.
Mass data fragmentation also affects a company’s immediate ability to do business. Ninety one per cent fretted about the level of visibility that the IT team had into secondary data across all sources. That translates directly into customer blindness. If the IT team can’t pull together customer data from different silos, then how can they draw on it for operations like CRM or customer analytics?
Taming in action
So much for the data fragmentation problem. Now, how do you solve it?
The least drastic version involves point systems that manage secondary data for specific workloads. Dedicated backup or email archiving systems are one example. They do one thing really well, although you may well end up needing more than one of them to cope with different departmental silos. In any case, according to Gilks, they don’t handle all of the workloads you might want to apply to secondary data. Instead, you need different software for different things.
Another option is a middleware or integration platform that makes the data accessible at a lower level, for consumption by a variety of applications. These products allow architects to create mappings between different systems. They can program those mappings to extract, transform and filter data from one location before loading it into another.
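To make that extract-filter-transform-load pattern concrete, here is a toy mapping in Python; the schemas and field names are invented for illustration, not taken from any particular integration product:

```python
def etl(source_rows, transform, keep):
    """Extract rows from a source system, filter them, then reshape
    each one into the target system's schema, ready for loading."""
    for row in source_rows:
        if keep(row):
            yield transform(row)

# A CRM silo and a billing silo describe the same customers differently.
crm_rows = [
    {"cust_name": "Acme", "region": "EU"},
    {"cust_name": "Globex", "region": "US"},
]

# Map the CRM schema onto the billing schema, EU customers only.
billing_rows = list(etl(
    crm_rows,
    transform=lambda r: {"customer": r["cust_name"]},
    keep=lambda r: r["region"] == "EU",
))
print(billing_rows)  # [{'customer': 'Acme'}]
```

Note that the mapping only moves and reshapes data between silos; every copy it loads into the target is yet another copy to manage.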
Gilks still sees problems. “Even if it’s completely successful, I still have 11 copies of my data,” he says. “At best, middleware is an effective Band-Aid.”
Ideally, he says, you’d want to consolidate those 11 copies down to a smaller number, whittling away those that weren’t there purely for security and compliance reasons.
“I’d probably want two or three resilient copies at most,” he continues. “I might think about a copy for my primary data centre, a copy for my secondary data centre, and a copy for the cloud.”
Some companies have had success with hyperconvergence in their primary systems. This approach simplifies IT infrastructure by merging computing, storage, and networking components. It uses software-defined management to coordinate commodity compute, storage and network components in single nodes that scale out.
Squeezing data into a collapsed compute-storage-network fabric has its pros and cons. While the hyperconverged kit has no internal silos, it might become its own silo, presenting barriers to the non-hyperconverged infrastructure in the rest of the server room.
Hyperconverged infrastructure also often needs you to scale compute and storage together, and it is typically difficult for non-virtualised legacy applications to access the virtual storage on these boxes.
Perhaps most importantly in this context, you’re unlikely to store the bulk of your secondary data on these systems, especially the archived stuff.
Cohesity applied the hyperconvergence approach to secondary data management, using software-defined nodes that can run on hardware appliances, or on virtualised machines on customer premises or in the cloud. It slurps data and then deduplicates, compresses and encrypts it to produce a smaller, more efficient dataset.
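The deduplication step can be pictured with content hashing: store each unique payload once and keep only pointers for the copies. A rough sketch in Python – our own illustration, not Cohesity’s actual implementation:

```python
import hashlib

def dedupe(files):
    """Keep one stored copy per unique payload, keyed by its SHA-256
    digest; each file path retains only a pointer to that copy."""
    store = {}  # digest -> the single stored copy
    index = {}  # path   -> digest (the "pointer")
    for path, payload in files.items():
        digest = hashlib.sha256(payload).hexdigest()
        store.setdefault(digest, payload)  # store the payload only once
        index[path] = digest
    return store, index

# Eleven departmental copies of the same customer order...
order = b"order #1042: 3 widgets for Acme"
copies = {"/dept%d/order.txt" % i: order for i in range(11)}
store, index = dedupe(copies)
# ...collapse to a single stored object.
print(len(copies), "->", len(store))  # 11 -> 1
```

Real systems deduplicate at the block or chunk level rather than whole files, but the principle is the same: identical bytes are stored once, however many places they appear.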
The company then offers scale-out storage for access via standard interfaces including NFS, SMB and S3, and provides a range of services via the platform ranging from anti-ransomware and backup/recovery through to an analytics workbench and App Marketplace.
Whichever approach you choose, fighting the multi-headed data beast now will save you budgetary woes later and free your IT department up to be more sprightly in future developments. If you can’t entirely slay the monster, then at least try to tame it a little.
Sponsored by Cohesity
In the UK, it seems, someone is trying to think of the children
Analysis The famous “Like” button may be on the way out if a new code for social media companies, published by the UK’s Information Commissioner’s Office (ICO), has its way.
Among the 16 rules in the consultation document [PDF] is a proposed ban on the use of so-called “nudge techniques” – where user interfaces and software are specifically designed to encourage frequent, daily use – as well as gather information that can then be sold on.
“Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use,” the code states, among a range of other measures that social media giants like Facebook, Twitter and Snapchat are going to hate.
The code is specifically all about the children, with the head of the ICO, Elizabeth Denham saying in a statement: “We shouldn’t have to prevent our children from being able to use [the internet], but we must demand that they are protected when they do. This code does that.”
Many of the changes would require companies like Facebook to make adjustments to their software and back-end systems to work. And some would directly impact social media companies’ bottom line as they would cut off access to vast quantities of personal information, which the companies repackage and sell to advertisers.
The code is just the latest push by the UK government – also reflected across Europe – to bring social media companies in line with what have been long-established norms and make them more responsible for removing damaging and illegal content, as well as limit the amount of personal data they compile.
It comes a week after the UK government published a White Paper on “Online harms” that argued for new, restrictive laws on social media and amid a global sense among lawmakers that the era of self-regulation in the internet space is over. The code has also been published just before a new law that requires adult content websites to verify the age of UK consumers before providing them with access to their material comes into effect.
One of the key drivers for the new code, film director and children’s rights campaigner Baroness Beeban Kidron said in a statement: “For too long we have failed to recognize children’s rights and needs online, with tragic outcomes.”
She went on: “I firmly believe in the power of technology to transform lives, be a force for good and rise to the challenge of promoting the rights and safety of our children. But in order to fulfill that role it must consider the best interests of children, not simply its own commercial interests.”
Some of the rules are general to the point of vagueness – such as the first requirement that a social media company make “the best interests of the child a primary consideration.”
But others are firm and threaten to have a significant impact on not just the design but also the business model used by such companies. The code makes it plain that unless the companies enact age-verification systems, the UK government expects them to extend all the changes to all users, regardless of age.
One key change is for default settings to be set to “high privacy” – something that Facebook famously gets around by constantly changing its own systems and forcing users to rediscover and reapply content controls. A high-privacy default would significantly limit the amount of personal information that can be automatically gathered through such a service.
Another is the key concept of “data minimization” – which is present in Europe’s GDPR data privacy legislation – where companies are expected to only gather the information they need to provide their service and no more.
And the code says that location tracking should be turned off by default and there should be “an obvious sign” if it is turned on. It also says that making user location visible to others “must default back to off at the end of each session.”
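Taken together, those defaults (high privacy, location tracking off, visibility resetting at the end of each session) can be sketched as a settings model. This is purely illustrative: the class and field names below are invented for the example, not anything the ICO's code actually prescribes.

```python
from dataclasses import dataclass

@dataclass
class ChildAccountSettings:
    # Defaults start at their most protective values, per the code.
    profile_visibility: str = "private"        # "high privacy" by default
    location_tracking: bool = False            # off unless explicitly enabled
    location_visible_to_others: bool = False

    def enable_location(self) -> None:
        # The code requires "an obvious sign" in the UI when this is on.
        self.location_tracking = True
        self.location_visible_to_others = True

    def end_session(self) -> None:
        # Visibility "must default back to off at the end of each session".
        self.location_visible_to_others = False
```

The point of the sketch is that privacy-eroding states are opt-in and transient, while the protective state is the one the system falls back to on its own.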
Clear and concise? What madness is this?
And in a clear poke in the eye to Facebook, the code insists that users are provided with “‘bite-sized’ explanations about how you use personal data at the point that use is activated” and that those explanations be “concise, prominent and in clear language suited to the age of the child.”
The code is quite clearly aimed at banning all the questionable practices that social media companies have introduced in order to gain access to as much personal data as possible, and uses the fact that different laws exist around the protection of children and their information to push the changes.
Somewhat predictably, those companies are not happy, although they are currently treading a diplomatic line – in public at least. In its response [PDF] to the ICO’s initial call for feedback, Facebook made it plain that it was not happy with the direction the regulator was taking and basically claimed that it was already doing enough.
It even strongly implied that the regulator was patronizing kids by insisting on such controls when “we know that teenagers are some of the most safety- and privacy-conscious users of the internet.” It adds that “age is an imperfect measure of maturity” and that the proposals risk “dumbing down” controls for children “who are often highly capable in using digital services.”
And in a word-perfect summary of Facebook and its culture, the organization notes that when it comes to its systems “the design journey is never over” and that it is “highly committed to improving people’s experience of its own services.”
The code is out for public review and comment until May 31. ®
In an interview with Digiday, European Data Protection Supervisor Giovanni Buttarelli discussed the landscape of EU General Data Protection Regulation adoption among media and advertising businesses. On the topic of consent, Buttarelli said companies must take an active approach, rather than rely on checkboxes and opt-outs. “Even ticking a box does not necessarily mean consent is freely given,” Buttarelli said. “Unambiguous consent means it must not only be explicit but meaningful, not a case of pre-ticked boxes or a case where you have no alternative but to continue through to a website.” Buttarelli also discussed fines handed out by the U.K. Information Commissioner’s Office and France’s data protection authority, the CNIL, against Facebook and Google respectively.
Multiple providers leaving storage cookies up for grabs
US-CERT is raising alarms following the disclosure of a serious vulnerability in multiple VPN services.
A warning from the DHS cyber security team references the CMU CERT Coordination Center’s bulletin on the failure of some VPN providers to encrypt the cookie files they place onto the machines of customers.
Ideally, a VPN service would encrypt the session cookies that are created when a user logs in to access the secure traffic service, thus keeping them away from the prying eyes of malware or network attacks. According to the alert, however, sometimes those keys were being kept unencrypted, either in memory or on log files, allowing them to be freely copied and re-used.
“If an attacker has persistent access to a VPN user’s endpoint or exfiltrates the cookie using other methods, they can replay the session and bypass other authentication methods,” the post explains. “An attacker would then have access to the same applications that the user does through their VPN session.”
To be clear, the vulnerable cookies are on the user’s end, not on the server itself. We’re not talking about a takeover of the VPN service, but rather an individual customer’s account. The malware would also need to know exactly where to look on the machine in order to get the cookies.
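To illustrate the class of attack the advisory describes, here is a hedged sketch of session-cookie replay. Everything in it is hypothetical – the log format, the cookie name and the helper functions stand in for whatever a particular vulnerable VPN client actually leaves on disk or in memory.

```python
from typing import Optional

# Hypothetical sketch: a vulnerable client has written its session cookie,
# in plaintext, into a local log file. Malware with access to that file
# only needs to know where to look.
def extract_session_cookie(log_text: str) -> Optional[str]:
    """Scan log text for a plaintext session cookie (hypothetical format)."""
    for line in log_text.splitlines():
        if "session_cookie=" in line:
            return line.split("session_cookie=", 1)[1].strip()
    return None

def build_replay_headers(cookie: str) -> dict:
    """Attach the stolen cookie so the gateway treats the request as the
    original, already-authenticated VPN session - bypassing any other
    authentication factors the user passed at login."""
    return {"Cookie": f"VPN_SESSION={cookie}"}
```

Encrypting the stored cookie – or never persisting it at all – is what breaks this chain, which is why the advisory singles out unencrypted storage rather than the VPN protocol itself.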
So far, vulnerable parties include Palo Alto Networks GlobalProtect Agent 4.1.0 for Windows and GlobalProtect Agent 4.1.10 and earlier for macOS, Pulse Secure Connect Secure prior to 8.1R14, 8.2, 8.3R6, and 9.0R2, and Cisco AnyConnect 4.7.x and prior. Palo Alto has already released a patch.
Check Point and pfSense, meanwhile, have confirmed they do encrypt the cookies in question.
Possibly dozens more vendors will be added to the list, however, as the practice is believed to be widespread. The advisory notes that over 200 apps have yet to confirm or deny that their session cookies are left unencrypted.
“It is likely that this configuration is generic to additional VPN applications,” the notice explains. ®
ICO says case involving 34.4 million records ‘unprecedented’
Updated The Information Commissioner’s Office has fined commercial pregnancy and parenting club Bounty some £400,000 for illegally sharing personal details of more than 14 million people.
The organisation, which dishes out advice to expectant and inexperienced parents, has faced criticism over the tactics it uses to sign up new members and was the subject of a campaign to boot its reps from maternity wards.
Now Bounty’s data protection practices have fallen under the gaze of the ICO: a probe found it collated personal information to generate membership registration, via its website, mobile app, merchandise pack claim cards and from new mums at hospital bedsides. Nothing new there.
But the business had also worked as a data brokering service until April last year, distributing data to third parties to then pester unsuspecting folk with electronic direct marketing. By sharing this information and not being transparent about its uses while it was extracting the stuff, Bounty broke the Data Protection Act 1998.
Bounty shared roughly 34.4 million records from June 2017 to April 2018 with credit reference and marketing agencies. Acxiom, Equifax, Indicia and Sky were the four biggest of the 39 companies that Bounty told the ICO it sold stuff to.
This data included details of new mothers and mothers-to-be, but also very young children’s birth dates and gender.
“The number of personal records and people affected in this case is unprecedented in the history of the ICO’s investigations into the data brokering industry and organisations linked to this,” said the ICO’s director of investigations, Steve Eckersley.
He claimed Bounty was “not transparent” to the millions of people whose data it sold, saying the consent given by people was “clearly not informed”, and Bounty’s actions were “motivated by financial gain given that data sharing was an integral part of their business model at the time”.
“Such careless data sharing is likely to have caused distress to many people since they did not know that their personal information was being shared multiple times with so many organisations, including information about their pregnancy status and their children,” Eckersley added.
Updated 12 April at 14:37 BST.
Bounty managing director Jim Kelleher has sent us a statement:
“In the past we did not take a broad enough view of our responsibilities and as a result our data-sharing processes, specifically with regards to transparency, were not robust enough. This was not of the standard expected of us. However, the ICO has recognised that these are historical issues.”
He said the business overhauled internal processes a year ago “reducing the number of personal records we retain and for how long we keep them, ending relationships with the small number of data brokerage companies with whom we previously worked and implementing robust GDPR training for our staff.”
Of course, if the data sharing had been done since 25 May 2018, Bounty would be facing a far greater fine, up to 4 per cent of annual turnover or €20m, whichever is greater. ®
Doctors in Ireland believe the Department of Health’s interpretation of the EU General Data Protection Regulation may halt their research, The Irish Examiner reports. The doctors’ concerns stem from the requirement to obtain explicit consent before processing patient data. In a review published in the Irish Journal of Medical Science, doctors reveal Ireland is the only country among the 28 EU member states that requires health researchers to get an express statement of consent. The review also states the explicit consent requirement is “problematic for most (if not all) ongoing research when the re-consenting of patients is required,” particularly with regard to research involving biobanks and patients who are unable to consent.
America tries to keep the pressure on Chinese biz
A US official has repeated his country’s threats against its allies over Huawei – stating that the US’s goal is a process that leads “inevitably to the banning” of the Chinese company’s products.
“We have encouraged countries to adopt risk-based security frameworks,” said Robert Strayer, speaking on a call with the world’s press on Wednesday, expressing the hope that such frameworks would “lead inevitably” to bans on Huawei.
Strayer, who is the American foreign ministry’s deputy assistant secretary for Cyber and International Communications and Information Policy, told journalists that his country may withdraw some co-operation with its allies on security matters if they install Huawei equipment on internet and phone networks.
“The most fundamental security standard, really, is that you cannot have this extrajudicial, non-rule of law-compliant process where a government can tell its companies to do something,” Strayer told the Bloomberg newswire. This appears to be a reference to China’s National Intelligence Law, which forces companies to co-operate with the nation’s spy agencies, which in substance is no different from Western laws mandating the same thing.
The US’s main fear appears to be that China will soon be in a position to exercise the same sort of global surveillance that the US does through its dominance of the worldwide tech sector, challenging American hegemony.
Bloomberg also reported that the French parliament is considering a bill that would, in effect, replicate Britain’s Huawei Cyber Security Evaluation Centre part-run by spies from eavesdropping agency GCHQ. HCSEC, also known as The Cell, inspects Huawei source code for evidence of state backdoors. The Chinese company has come under increasing fire from the British state for the pisspoor state of its software development processes.
America’s allies have varied in their responses to the country’s call for a ban. Australia, its closest Pacific ally, has enthusiastically taken up the cudgel. Germany, meanwhile, has pointedly chosen to do its own thing, taking the EU along with it on that path.