On Executive Order 12333
Mark Jaycox has written a long article on the US Executive Order 12333: “No Oversight, No Limits, No Worries: A Primer on Presidential Spying and Executive Order 12,333”:
Abstract: Executive Order 12,333 (“EO 12333”) is a 1980s Executive Order signed by President Ronald Reagan that, among other things, establishes an overarching policy framework for the Executive Branch’s spying powers. Although electronic surveillance programs authorized by EO 12333 generally target foreign intelligence from foreign targets, its permissive targeting standards allow for the substantial collection of Americans’ communications containing little to no foreign intelligence value. This fact alone necessitates closer inspection.
This working draft conducts such an inspection by collecting and coalescing the various declassifications, disclosures, legislative investigations, and news reports concerning EO 12333 electronic surveillance programs in order to provide a better understanding of how the Executive Branch implements the order and the surveillance programs it authorizes. The Article pays particular attention to EO 12333’s designation of the National Security Agency as primarily responsible for conducting signals intelligence, which includes the installation of malware, the analysis of internet traffic traversing the telecommunications backbone, the hacking of U.S.-based companies like Yahoo and Google, and the analysis of Americans’ communications, contact lists, text messages, geolocation data, and other information.
After exploring the electronic surveillance programs authorized by EO 12333, this Article proposes reforms to the existing policy framework, including narrowing the aperture of authorized surveillance, increasing privacy standards for the retention of data, and requiring greater transparency and accountability.
A Case Western Reserve University computer and data sciences researcher is working to shore up privacy protections for people whose genomic information is stored in a vast global collection of vital, personal data.
Erman Ayday is pursuing novel methods for identifying and analyzing privacy vulnerabilities in the genomic data sharing network known commonly as “the Beacons.”
Personal genomic data refers to each person’s unique genome (their genetic makeup), information that can be gleaned from DNA analysis of a blood or saliva sample.
Ayday plans to identify weaknesses in the beacons’ infrastructure and develop more complex algorithms to protect against people or organizations who, like him, could figure out a person’s identity or sensitive genomic information from publicly available data.
Doing that will also protect the public – people who voluntarily shared their genomic information with the hospitals where they were treated, on the understanding that their identity or sensitive information would not be revealed.
“While the shared use of genomics data is valuable to research, it is also potentially dangerous to the individual if their identity is revealed,” Ayday said. “Someone else knowing your genome is power – power over you. And, generally, people aren’t really aware of this, but we’re starting to see how genomic data can be shared, abused.”
Other research has shown that “if someone had access to your genome sequence – either directly from your saliva or other tissues, or from a popular genomic information service – they could check to see if you appear in a database of people with certain medical conditions, such as heart disease, lung cancer, or autism.”
Human genomic research
Genomics may sometimes be confused or conflated with genetics, but the terms refer to related, but different fields of study:
- Genetics refers to the study of genes and the way that certain traits or conditions are passed down from one generation to another. It involves scientific studies of genes and their effects. Genes (units of heredity) carry the instructions for making proteins, which direct the activities of cells and functions of the body.
- Genomics is a more recent term that describes the study of all of a person’s genes (the genome), including interactions of those genes with each other and with the person’s environment. Genomics includes the scientific study of complex diseases such as heart disease, asthma, diabetes, and cancer because these diseases are typically caused more by a combination of genetic and environmental factors than by individual genes.
There has been an ever-growing cache of genomic information since the conclusion of the Human Genome Project in 2003, the 13-year-long endeavor to “discover all the estimated 20,000-25,000 human genes and make them accessible for further biological study” as well as complete the DNA sequencing of 3 billion DNA subunits for research.
Popular genealogy sites such as Ancestry.com and 23andMe rely on this information – compared against their own accumulation of genetic information and analyzed by proprietary algorithms – to discern a person’s ancestry, for example.
Ayday said companies, government organizations and others are also tapping into genomic data. “The military can check the genome of recruits, insurance companies can check whether someone has a predisposition to a certain disease,” he said. “There are plenty of real life examples already.”
Scientists researching genomics are accessing shared DNA data considered critical to advance biomedical research. To access the shared data, researchers send digital requests (“queries”) to certain beacons, each specializing in different genetic mutations.
What are ‘the Beacons?’
The Beacon Network is an array of about 100 data repositories of human genome coding, coordinated by the Global Alliance for Genomics & Health (in collaboration with a Europe-based system called ELIXIR).
And while “queries do not return information about single individuals,” according to the site, a 2015 scientific study showed that someone could infer whether a specific individual’s genome was present in a particular beacon by sending that site an excessive number of queries.
“And then we used a more sophisticated algorithm and showed that you don’t need thousands of queries,” Ayday said. “We did it by sending less than 10.”
Then, in a follow-up study, Ayday and team showed that someone could also reconstruct the entire genome for an individual with the information from only a handful of queries.
“That’s a big problem,” Ayday said. “Information about one, single individual should not be that easily found out.”
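The query-counting attacks described above can be illustrated with a toy model. This is not the statistical test from the 2015 study or Ayday’s follow-up work (those use likelihood ratios over allele frequencies); it is a deliberately simplified sketch, and the database contents and variant positions are invented for illustration:

```python
# Toy beacon: each member genome is modeled as the set of rare-variant
# positions it carries. All data here is invented for illustration.
BEACON_DB = [
    {12, 340, 877},   # member 0
    {51, 340, 902},   # member 1
    {12, 204, 611},   # member 2
]

def beacon_query(position):
    """A beacon answers only yes/no: does ANY member carry this variant?"""
    return any(position in genome for genome in BEACON_DB)

def likely_member(target_variants, max_queries=10):
    """Query the beacon only at positions where the target carries a rare
    variant. Rare variants seldom co-occur by chance, so a 'yes' answer to
    every query strongly suggests the target's genome is in the database."""
    queried = sorted(target_variants)[:max_queries]
    return bool(queried) and all(beacon_query(p) for p in queried)

# The attacker needs only the target's variants (e.g., from a saliva sample).
print(likely_member({12, 340, 877}))   # member 0's variants -> True
print(likely_member({77, 480, 913}))   # variants absent from the beacon -> False
```

Even this crude version shows why a handful of well-chosen queries can suffice: each rare variant the beacon confirms multiplies the evidence that the specific target, not a coincidental match, is in the database.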
Edge computing is a foundational technology for industrial enterprises as it offers shorter latencies, robust security, responsive data collection, and lower costs, Frost & Sullivan finds.
In this hyper-connected industrial environment, edge computing, with its solution-agnostic attribute, can be used across various applications, such as autonomous assets, remote asset monitoring, data extraction from stranded assets, autonomous robotics, autonomous vehicles, and smart factories.
Multi-access edge computing market growth rate and revenue
Despite being in a nascent stage, the multi-access edge computing (MEC) market – an edge computing commercial offering from operators in wireless networks – is estimated to grow at an astounding compound annual growth rate of 157.4%, garnering a revenue of $7.23 billion by 2024 from $64.1 million in 2019.
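The quoted growth rate can be checked directly from the two endpoint figures in the forecast:

```python
# Compound annual growth rate implied by the Frost & Sullivan forecast:
# $64.1 million in 2019 growing to $7.23 billion by 2024 (5 years).
start, end, years = 64.1e6, 7.23e9, 5

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")
```

This yields roughly 157.3%, consistent with the reported 157.4% once rounding in the endpoint figures is accounted for.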
“The recent launch of the 5G technology coupled with MEC brings computing power close to customers and also allows the emergence of new applications and experiences for them,” said Renato Pasquini, Information & Communication Technologies Research Director at Frost & Sullivan.
“Going forward, 5G and MEC are an opportunity for telecom operators to launch innovative offerings and also enable an ecosystem to flourish in the business-to-business (B2B) segment of telecom service providers using the platform.”
Pasquini added: “From the perspective of the MEC ecosystem, software—edge application and solutions—promises the highest CAGR followed by services—telecom operators’ services, cloud providers’ infrastructure-as-a-service, and edge data center colocation services.”
Growth prospects for MEC market participants
It is predicted that approximately 90% of industrial enterprises will utilize edge computing by 2022, presenting immense growth prospects. To capitalize on them, MEC market participants should consider the following:
- Telecom operators should work on solutions and services to meet the requirements for connected and autonomous cars.
- System integrators should provide end-to-end solutions, which would be a significant value addition for enterprises because 5G requires specific skillsets.
- The combination of 5G and the new specialized hardware-based mobile edge compute technologies can meet the market’s streaming media needs now and in the future.
- Telecom operators must partner with cloud providers and companies with abilities related to artificial intelligence, machine learning, and computer vision to design solutions for autonomous cars, drone delivery, and others.
- Companies in the MEC space must capitalize on the opportunity for innovation and new developments that utilize 5G and MEC, such as augmented reality (AR) and virtual reality (VR), which can also be applied to games.
RiskIQ released a report analyzing the company’s internet-wide telemetry and massive internet data collection to reveal the true extent of the modern corporate digital attack surface.
Digital attack surface challenges
“Today, organizations are responsible for defending not only their internal network but also their digital presence across the internet and the cloud,” said Lou Manousos, CEO, RiskIQ.
“Bringing the massive scope of an organization’s attack surface into focus helps frame the challenges of extending cybersecurity outside the corporate firewall, especially as staff forced to work from home in response to COVID-19 push that boundary farther out.”
When brands understand what they look like from the outside-in, they can begin developing an attack surface management program that allows them to discover everything associated with their organization on the internet—both legitimate and malicious—and investigate the threats targeting them.
- The global attack surface is much bigger than you think: 2,959,498 new domains (211,392 per day) and 772,786,941 new hosts (55,199,067 per day) were observed across the internet over two weeks, each representing a possible target for threat actors.
- Sometimes hackers know more about your attack surface than you do: Looking at the attack surfaces of FT-30 companies, each organization had, on average, 324 expired certificates and 46 Web frameworks with known vulnerabilities.
- The hidden attack surface: In Q1 2020, 21,496 phishing domains across 478 unique brands were identified.
- The mobile attack surface: In 2019, 170,796 blacklisted mobile apps were discovered across 120 mobile app stores and the open internet.
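The per-day figures in the first bullet follow directly from the two-week totals; a quick check (assuming, as the report implies, a 14-day observation window and truncation rather than rounding):

```python
# Two-week totals reported by RiskIQ, divided over 14 days of observation.
domains_total, hosts_total, days = 2_959_498, 772_786_941, 14

domains_per_day = domains_total // days
hosts_per_day = hosts_total // days
print(domains_per_day, hosts_per_day)  # 211392 55199067
```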
Many organizations are starting to realize the benefits of increased scale and velocity of application deployment in their businesses, according to F5 Networks.
This value, however, can bring significant complexity as organizations maintain legacy infrastructure while increasingly relying on multiple public and private clouds, implement modern application architectures, and face an evolving and sophisticated threat landscape.
At the same time, organizations are adopting more application services designed to accelerate deployment in public cloud and container-native environments, like service mesh and ingress control.
App services requirements evolving
Survey data indicates this trend will accelerate as organizations become proficient in harnessing the data their application ecosystem delivers—creating advanced analytics capabilities and better business outcomes.
The survey shows that as companies manage legacy, multi-cloud, hybrid-cloud, and modern architectures to deliver applications, their requirements for app services are also evolving.
To address limited skill sets and integration challenges, organizations are choosing open ecosystems that offer standardization. Respondents prize application services that are both secure and easy to use.
Matured IT and business process optimization initiatives
80% of organizations are executing on digital transformation – with increasing emphasis on accelerating speed to market. As organizations progress through digital transformation initiatives, IT and business process optimization initiatives mature.
Many organizations have moved beyond the basics of business process automation and are now scaling their digital footprint with cloud, containers, and orchestration. This in turn is driving the creation of new ecosystems and massive growth in API call volumes.
87% of orgs are multi-cloud, most still struggle with security
Organizations are leveraging the public cloud to participate in industry ecosystems, take advantage of cloud-native architectures, and deliver applications at the speed of the business.
However, organizations are much less confident in their ability to withstand an application-layer attack in the public cloud versus an on-premises data center. This discrepancy illustrates a growing need for easy-to-deploy solutions that can ensure consistent security across multiple environments.
73% of orgs are automating the network to boost efficiency
Unsurprisingly, given the primary drivers of digital transformation – IT and business process optimization – the majority of organizations are automating the network.
Despite challenges, organizations are gaining proficiency and moving toward continuous deployment with more consistent automation across all key pipeline components: app infrastructure, app services, network, and security.
69% of orgs using 10 or more application services
As newer cloud-native application architectures mature and scale, a higher percentage of organizations are deploying related app services such as ingress control and service discovery both on premises and in the public cloud. A modern application landscape requires modern app services to support scale, security, and availability requirements.
IT operations still responsible for app services
63% of organizations still place primary responsibility for app services with IT operations, yet more than half of those surveyed are also moving to DevOps-inspired teams.
Operations and infrastructure teams continue to shoulder primary responsibility for selecting and deploying application services. However, as organizations expand their cloud- and container-native app portfolios, DevOps groups are taking more responsibility for app services.
The developments in the area of cybersecurity are alarming. As the number of smart devices in private households increases, so do the opportunities for cyber criminals to attack, TÜV Rheinland reveals.
Uncontrolled access to personal data undermines confidence in the digital society. The logistics industry and private vehicles are increasingly being targeted by hackers.
“From our point of view, it is particularly serious that cybercrime is increasingly affecting our personal security and the stability of society as a whole,” explains Petr Láhner, Business Executive Vice President for the business stream Industry Service & Cybersecurity at TÜV Rheinland.
“One of the reasons for this is that digital systems are finding their way into more and more areas of our daily lives. Digitalization offers many advantages – but it is important that these systems and thus the people are safe from attacks.”
Uncontrolled access to personal data could destabilize the digital society
In 2017, Frenchwoman Judith Duportail asked a dating app company to send her any personal information they had about her. In response, she received an 800-page document containing her Facebook likes and dislikes, the age of the men she had expressed interest in, and every single online conversation she had had with all 870 matching contacts since 2013.
That Judith Duportail received so much personal data after several years of using a single app underscores how challenging data protection has become. In addition, this example shows how little transparency there is about securing and processing data that can be used to gain an accurate picture of an individual’s interests and behavior.
Smart consumer devices are spreading faster than they can be secured
Smart speakers, fitness trackers, smart watches, thermostats, energy meters, smart home security cameras, smart locks and lights are the best-known examples of the seemingly unstoppable democratization of the “Internet of many Things”.
Smart devices are no longer just toys or technological innovations. The number and performance of individual “smart” devices is increasing every year, as these types of devices are quickly becoming an integral part of everyday life.
It is easy to see a future in which the economy and society will become dependent on them, making them a very attractive target for cyber criminals. Until now, the challenge for cybersecurity has been to protect one billion servers and PCs. With the proliferation of smart devices, the attack surface could quickly increase hundreds or thousands of times.
The trend towards owning a medical device increases the risk of an internet health crisis
Over the past ten years, personal medical devices such as insulin pumps, heart and glucose monitors, defibrillators and pacemakers have been connected to the internet as part of the Internet of Medical Things (IoMT). At the same time, researchers have identified a growing number of software vulnerabilities and demonstrated the feasibility of attacks on these products. This can lead to targeted attacks on both individuals and entire product classes.
In some cases, the health information generated by the devices can also be intercepted. So far, the healthcare industry has struggled to respond to the problem – especially when the official life of the equipment has expired.
As with so many IoT devices of this generation, networking was more important than the need for cybersecurity. The complex task of maintaining and repairing equipment is badly organized, inadequate or completely absent.
Vehicles and transport infrastructure are new targets for cyberattacks
Through the development of software and hardware platforms, vehicles and transport infrastructure are increasingly connected. These applications offer drivers more flexibility and functionality, potentially more road safety, and seem inevitable given the development of self-driving vehicles.
The disadvantage is the increasing number of vulnerabilities that attackers could exploit – some with direct security implications. Broad cyberattacks targeting transport could affect not only the safety of individual road users, but could also lead to widespread disruption of traffic and urban safety.
Hackers target smart supply chains and make them “dumb”
With the goal of greater efficiency and lower costs, smart supply chains leverage IoT automation, robotics and big data management, both within a company and with its suppliers.
Smart supply chains increasingly represent virtual warehousing, where the warehouse is no longer just a physical building, but any place where a product or its components can be located at any time. Nevertheless, there is a growing realization that this business model considerably increases the financial risks, even with only relatively minor disruptions.
Smart supply chains are dynamic and efficient, but they are also prone to disruptions in processes. Cyberattacks can manipulate inventory information, so that components end up not being where they are supposed to be.
Threats to shipping are no longer theoretical but a reality
In 2017, goods with an estimated weight of around 10.7 billion tons were transported by sea. Despite current geopolitical and trade tensions, trade is generally expected to continue to grow. There is ample evidence that states are experimenting with direct attacks on ship navigation systems.
At the same time, attacks on the computer networks of ships used to extort ransom have been reported. Port logistics offers a second, overlapping area of vulnerability.
Many aspects of shipping can be vulnerable to attack, such as ship navigation, port logistics and shipboard computer networks. Attacks can originate from states and activist groups, which makes monitoring and understanding a key factor in modern maritime cybersecurity.
Vulnerabilities in real-time operating systems could herald the end of the patch age
It is estimated that by 2025 there will be over 75 billion networked devices on the IoT, each using its own software package. This, in turn, contains many outsourced and potentially vulnerable components.
In 2019, Armis Labs discovered eleven serious vulnerabilities (called Urgent/11) in the real-time operating system (RTOS) Wind River VxWorks. Six of these flaws exposed an estimated 200 million IoT devices to the risk of remote code execution (RCE) attacks.
Weaknesses at this level are a major challenge because they are often deeply embedded in a large number of products, and organizations may not even notice that the vulnerabilities exist. In view of this, the procedure of always installing the latest security updates will no longer be effective.
As state houses and Congress rush to consider new consumer privacy legislation in 2020, Americans expect more control over their personal information online and are concerned with how businesses use the data collected about them, DataGrail research reveals.
In a OnePoll online survey of 2,000 people aged 18 and above, 4 out of 5 Americans agreed there should be a law to protect their personal data, and 83 percent of people expect to have control over how their data is used at a business.
The request for more control over their personal data comes after many Americans experienced, first-hand, existing protections not working – 62 percent of people continue to receive emails from a company after unsubscribing.
In addition, more than 82 percent of people have concerns about businesses monitoring or collecting data from their phone’s microphone, laptop webcams, home devices (such as Google Home, Alexa, etc.), or mobile devices (phone, laptop, etc.) with location tracking.
Consumers do not feel safe from privacy infringements
Further, the research shows consumers do not feel safe from privacy infringements wherever they may be: 85% of those polled said they were concerned that businesses could be monetizing their laptops’ location.
In response to Americans’ demands, state regulators are listening. Several states have developed their own regulations, including California, Nevada and Maine, with Washington, New York and several other states following suit.
The California Consumer Privacy Act (CCPA), which went into effect Jan. 1, 2020, is one of the most consumer-forward, comprehensive and prominent data privacy laws. However, only 24 percent of Americans are familiar with it or have even heard of it.
“As people put more of themselves online, they expect to have more control and transparency over their personal information,” said Daniel Barber, CEO of DataGrail.
“The good news is that businesses are responding. Brands are already making big moves to show their dedication to privacy, and it’s paying off. Those that proactively update preferences and consent will end up with a more loyal customer-base.
“However, we still have a lot of education to do. It’s clear people want the regulations. Our research shows that 50% of people would exercise at least one right under the CCPA.”
Control personal data: Data security over affordability
If all Americans were given the rights included in the CCPA:
- 65% of people would like to know and have access to what information businesses are collecting about them.
- 62% of people would like the right to opt-out and tell a business not to share or sell personal information.
- 58% of people would like the right to protections against businesses that do not uphold the value of their privacy.
- 49% of people would like the right to delete their personal data held by the business.
People are also more than willing to take their wallets elsewhere, even if it means breaking their shopping habits, if they discover their private data is not protected or is being sold. The survey found that 77% would not shop at their favorite retailer if they learned it did not keep their personal data safe.
Additionally, consumers said they would be willing to pay more for better privacy protections: 73% of people polled said they would pay more to online services companies (retailers, ecommerce, and social media) to ensure they didn’t sell their data, show them ads, or use their data for marketing or sales purposes.
Only 32% of students agree they are aware of how their institution handles their personal data, compared to 45% who disagree and 22% who neither agree nor disagree, according to a Higher Education Policy Institute (HEPI) survey of over 1,000 full-time undergraduate students.
Perceptions about university data security
Just 31% of students feel their institution has clearly explained how their personal data are used and stored, compared to 46% who disagree and 24% who neither agree nor disagree.
When students were asked whether they are concerned about rumors of university data security issues, 69% of students stated they are concerned. Around one-fifth of students (19%) are unconcerned and 12% are unsure.
65% of students say a higher education institution having a poor security reputation would have made them less likely to apply, compared to around a third (31%) who say it would have made no difference and 4% who said it would have made them more likely to apply.
Only 45% of students feel confident that their institution will keep their personal data secure and private, while 22% are not confident. A third (33%) are unsure.
93% of students agree they should have the right to view any personal information their higher education institution stores about them, 5% neither agree nor disagree and only 2% disagree.
Keeping private data private
When it comes to sharing health or wellbeing information with a student’s parents or guardians, almost half (48%) of respondents say it would be fine for institutions to do so. A further 19% neither agree nor disagree and a third (33%) disagree.
Comparatively, only around a third (35%) of students are supportive of parents or guardians being contacted about academic performance issues at university, compared to almost half (48%) who are opposed and 17% who do not take a stance on this issue.
Rachel Hewitt, HEPI’s Director of Policy and Advocacy, said: “Students are required to provide large amounts of data to their universities, including personal and sensitive information. It is critical that universities are open with students about how this information will be used.
“Under a third of students feel their university has clearly explained how their data will be used and shared and under half feel confident that their data will be kept secure and private. Universities should take action to ensure students can have confidence in the security of their data.”
Michael Natzler, HEPI’s Policy Officer, said: “Students are generally willing for their data to be used anonymously to improve the experience of other students, for example on learning and mental wellbeing. Around half are even happy for information about their health or mental wellbeing to be shared with parents or guardians.
“However, when it comes to identifiable information about them as individuals, students are clear they want this data to be kept confidential between them and their institutions. It is important that universities keep students’ data private where possible and are clear with students when information must be shared more widely.”
Organizations are starting to take a much more considered approach to data protection as high-profile regulatory action for data mishandlings has raised both the stakes and interest in data privacy operations.
Since the EU General Data Protection Regulation (GDPR) came into force in May 2018, data protection has risen to the top of the news agenda. Simultaneously, the GDPR has raised the profile and highlighted the importance of the Data Protection Officer (DPO) internationally as, under this legislation, certain entities are under legal obligation to appoint a DPO.
Noncompliance with the GDPR carries hefty fines and is generally associated with a wave of negativity when public trust is compromised. Moreover, there is a growing global awareness that data protection matters, and people expect organizations to handle their personal data with care. It is for this reason that legislators around the world are actively seeking new ways to protect the security and privacy of personal data.
Organizations should strive for ethical handling of personal data
The global movement for an ethical handling of personal information is multidimensional. Investor activism and customer scrutiny – over the way their data is collected, processed and used – is putting the pressure on organizations to act ethically and on legislators to enact laws that effectively deal with rapid technological changes. Issues related to corporate governance and accountability are at the center of this movement.
Every day at HewardMills we speak with more and more organizations recognizing the value of in-depth knowledge and the need for total autonomy in this area. Businesses understand that their reputation is closely tied to the privacy and data protection processes they have in place. As a result, clearer lines are being drawn around departmental responsibilities to better operationalize data protection regulations.
Similar to other data specialist skill sets, demand for qualified and experienced DPOs is rising. This is a result of the role being both legally required for certain entities and organizations realizing the value of fostering a data protection culture.
The DPO role is a cornerstone
The DPO can be internal or external, but they must be allowed to function independently. They are the link between the organization, the supervisory authorities and the data subjects. Thus, it is important that the DPO strike a careful balance to meet their own obligations toward all parties involved.
DPOs play a pivotal role in an organization’s data management health and are required to report directly to the highest level of management. Some tasks that fall under the DPO role include advising on issues around data protection impact assessments (DPIAs), training, overseeing the accuracy of data mapping and responding to data subject access requests (DSARs). These things are all mandated under the GDPR.
Even the best intentions fall flat without the right execution
Organizations may have good intentions to achieve best practices and meet their legal obligations, but the data protection process does not stop there. Practical knowledge on how to operationalize legal obligations is the key to success. For example, if an organization is not adequately prepared to respond to DSARs, it may miss the one-month GDPR deadline or respond in an incomplete manner.
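To make the one-month clock concrete, here is a minimal sketch of the deadline arithmetic under GDPR Art. 12(3), which allows one month from receipt, extendable by two further months for complex requests. This is a simplified illustration, not legal advice: real workflows must also handle identity verification, the notice required for an extension, and national-law nuances, and the month-end clamping rule below is one common interpretation of “calendar month.”

```python
import calendar
from datetime import date

def dsar_deadline(received: date, extended: bool = False) -> date:
    """Approximate response deadline for a data subject access request.

    One calendar month from receipt (GDPR Art. 12(3)); an extension adds
    two further months. If the target month is shorter than the receipt
    day allows, the deadline is clamped to the month's last day.
    """
    months = 3 if extended else 1
    years_over, month_index = divmod(received.month - 1 + months, 12)
    year, month = received.year + years_over, month_index + 1
    # Clamp the day to the target month's length (e.g., 31 Jan -> 28/29 Feb).
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(dsar_deadline(date(2020, 1, 31)))        # 2020-02-29 (leap year)
print(dsar_deadline(date(2020, 3, 15), True))  # 2020-06-15
```

Missing a date like this, or meeting it with an incomplete response, is exactly the kind of operational failure the preceding paragraph warns about.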
Since the GDPR came into effect, supervisory authorities have actively sought greater transparency. This means that there is a particular focus on accurate privacy notices, data protection impact assessments and legitimate interest assessments. Given the global trend toward accountability, it is safe to argue that investing in data protection and privacy will win the trust of individuals, be they customers or employees. Organizations that foster a culture of integrity are at a competitive advantage in a world where privacy and data protection matter. For those that do not, the financial, legal and public opinion risks can be significant.
Getting ahead of the risks
Being responsive to GDPR data subject requests helps to build trust with individuals and demonstrates a serious dedication to data protection obligations. The DPO is the contact point for data subjects who are exercising their rights. As such, DPOs must be easily accessible, be it by telephone, mail or other avenues. Lack of resources is not an excuse for neglecting legal obligations and denying data subjects their rights. A consultant or outsourced DPO role can provide a cost-effective way to fill this gap.
DPOs help organizations to prioritize risks. While they must address the highest-risk activities first themselves, they must also educate controllers on how DPIA conclusions are reached, so that controllers know which activities to prioritize and are kept informed about the risks attached to different processing activities. For instance, the DPO could flag the need for data protection audits or enhanced security measures, or gaps in staff training and resource allocation.
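A simple way to make such prioritization concrete is a likelihood-times-severity risk score per processing activity, a common DPIA screening heuristic. The activity names and scores below are purely illustrative:

```python
# Hypothetical processing activities with likelihood/severity scores (1-5),
# as a DPO might record them during a DPIA screening exercise.
activities = [
    {"name": "marketing analytics",   "likelihood": 4, "severity": 3},
    {"name": "payroll processing",    "likelihood": 2, "severity": 4},
    {"name": "biometric access logs", "likelihood": 3, "severity": 5},
]

def prioritize(items):
    """Rank activities by a simple risk score (likelihood x severity),
    highest first, so controllers see what to address before anything else."""
    return sorted(items,
                  key=lambda a: a["likelihood"] * a["severity"],
                  reverse=True)
```

Here `prioritize(activities)` would surface the biometric access logs (score 15) ahead of marketing analytics (12) and payroll (8), matching the intuition that sensitive-data processing deserves attention first.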
The insurance policy of an autonomous partner
To maintain the level of autonomy needed to act as an independent body, job security has been built into the DPO appointment. The DPO can be disciplined or even terminated for legitimate reasons. However, they cannot be dismissed or penalized by the controller or processor as a result of carrying out their duties. In other words, the organization cannot direct the DPO or instruct them to reach a certain desired conclusion. The DPO must also be given the resources required to achieve this level of independence and carry out their duties. Typically, these resources are budget, equipment and staff.
One of the benefits of using an external DPO is that conflicts of interest are less likely. Organizations should strive to give the DPO the necessary autonomy to successfully act as a bridge between data subjects, the organization and the supervisory authorities. The DPO should not be assigned tasks that would put them in a position of “marking their own homework”. Used correctly, the DPO is a partner that helps navigate the organization toward an ethical handling of personal data.
Faced with meeting strict obligations under GDPR, organizations controlling and processing personal data must empower and embrace their DPOs and work closely with them. Organizations should view DPOs as a type of insurance policy for data risk and not think of them as the regulators’ undercover watchmen.
With the advent of laws like the EU’s GDPR and California’s CCPA, which are sure to be portents of things to come (i.e., more and better data privacy legislation), companies with a global presence are starting to think about whether they should implement different user data privacy protection regimes for each region or whether it would be easier to globally comply with the strictest of the existing laws.
Microsoft, for example, chose the latter course of action. In May 2018, the company announced that it would extend the rights at the heart of the GDPR to all of its consumer customers worldwide. More recently, it decided to honor California’s digital privacy law throughout the U.S.
For companies like Microsoft, which offer services to enterprise clients, the decision is a no-brainer: they get a leg up on competitors as organizations look for solutions with compliance with the most progressive data privacy laws baked in by default.
Data collection balancing act: Privacy and trust
More and more companies are using privacy as a selling point by offering products that are privacy-friendly, says Cassandra Moons, Data Privacy Officer at TomTom, the Dutch multinational company developing location and navigation technology for both the consumer and business market.
Apple has, for example, made every effort to stand out from the competition in all arenas by incorporating privacy-preserving features from the get-go in many of their products.
Take Apple Maps, for example: in summer 2018, the company detailed its efforts to rebuild and improve the web mapping service and explained how, even though it collects navigation data from iPhone users, it manages not to intrude on users’ privacy.
Apple Maps also still uses data collected by third parties like TomTom but, as Moons notes, anonymizing location data is the foundation of TomTom’s relationship with its customers.
“It’s crucial that we retain our customers’ trust. For example, we need them to know that we only use their data to deliver meaningful improvements, not to sell them ads or direct them past a sponsor restaurant. That’s why we anonymize all data by disconnecting the link with the customer and their GPS traces,” she told Help Net Security.
“TomTom internally performs Privacy Impact Assessments (PIAs), a framework for deciding what data we truly need to gather and how to prioritize user privacy (privacy by design). The PIA also governs our data-sharing relationships with third parties, ensuring we’re not just compliant with the GDPR but that we’re being truly transparent with (and protective of) our users by vetting those third parties.”
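The anonymization step Moons describes, severing the link between a customer and their GPS traces, can be sketched roughly as follows. This is a simplified illustration, not TomTom’s actual pipeline: all field names are hypothetical, and real-world location anonymization also has to deal with harder problems such as identifying home/work endpoints and re-identification across traces:

```python
import secrets

def anonymize_trace(trace: dict) -> dict:
    """Sever the link between a customer and a GPS trace: drop direct
    identifiers (e.g. the account ID), replace them with a random
    per-trace token, and coarsen coordinates and timestamps to make
    re-identification harder."""
    return {
        "trace_id": secrets.token_hex(8),       # random, unlinkable ID
        "points": [
            {"lat": round(p["lat"], 3),         # ~100 m spatial blur
             "lon": round(p["lon"], 3),
             "ts": p["ts"] - p["ts"] % 3600}    # truncate to the hour
            for p in trace["points"]
        ],
    }
```

The key property is that nothing in the output record can be joined back to the customer account, while the coarsened points remain useful for aggregate purposes such as traffic modelling.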
What’s the right amount of data collection?
The “right” amount of data depends on the sensitivity of the data, the volume and what you want to use the data for.
“GDPR recognizes the principle of data minimization, which means one should only collect personal data ‘adequate, relevant and limited to what is necessary in relation to the purposes for which the personal data are processed’. In the end, though, it’s on the individuals who had their data collected to determine the ‘right’ amount. If a company is able to explain to individuals why its data collection is in line with this principle, it’s fair to assume it has collected the ‘right’ amount of data,” she pointed out.
But the issue of privacy should never be addressed in the Terms and Conditions, she feels, because no one ever reads those.
“Privacy should always be a standalone communication. It should be completely clear what a user is signing up to. A user should be well-informed about which data is being collected, should have control over which data can be used and be aware of the purposes for which a company uses their data,” she opined.
True ethical data management can be a business practice for a company that relies on user data in order to run and improve its products, she says. “When a company has embedded ethical values such as preventing user discrimination and putting the user first when it comes to privacy, ethics and big data collection will align and move together in the same direction.”
How will the collection of data for driver apps evolve?
As apps move from phones to cars and power the connected driving experience, driving apps will rely much more on the community to keep them updated, she says.
“To be successful, driving apps need to be trustworthy and reliable, they must protect user data, and they should be completely transparent about how this information is being used. Moving the traditional data gathering and use model from mobile apps to driving apps simply won’t work,” she added.
“In order to maintain reliability, app developers need to work with the community to be sensitive about their legitimate concerns, and show that they are using their data securely and wisely to bring services that add real value to drivers everywhere. The collection of data always needs to comply with relevant privacy laws, including appropriate user control standards, no matter the type and volume of personal data.”
Most U.S. adults say that the potential risks they face because of data collection by companies (81%) and the government (66%) outweigh the benefits, but most (>80%) feel that they have little or no control over how these entities use their personal information, a recent Pew Research Center study on U.S. digital privacy attitudes has revealed.
The post Most Americans feel powerless to prevent data collection, online tracking appeared first on Help Net Security.
As organizations continue to collect customer and employee data, chief audit executives (CAEs) are increasingly concerned about how to govern and protect it. Gartner conducted interviews and surveys across its global network of client organizations to identify the biggest risks facing boards, audit committees and executives in 2020. Data governance has risen to the top spot of CAEs’ audit concerns, up from second place in last year’s report, replacing cybersecurity preparedness.
The post Top concerns for audit executives? Cyber risks and data governance appeared first on Help Net Security.