Growth of cloud-native apps and containerization to define 2021

Scality announced its data storage predictions for 2021, focusing on the rapid growth rate of cloud-native apps and containerization. According to IDC, by 2023, over 500 million digital apps and services will be developed and deployed using cloud-native approaches. That is the same number of apps developed in total over the last 40 years. “The accelerated growth of next-generation cloud-native digital apps and services will define new competitive requirements in … More


How do I select a data storage solution for my business?

We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices. Unfortunately, 58% of organizations make decisions based on outdated data.

While enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads.

To select a suitable data storage for your business, you need to think about a variety of factors. We’ve talked to several industry leaders to get their insight on the topic.

Phil Bullinger, SVP and General Manager, Data Center Business Unit, Western Digital

Selecting the right data storage solution for your enterprise requires evaluating and balancing many factors. The most important is aligning the performance and capabilities of the storage system with your critical workloads and their specific bandwidth, application latency and data availability requirements. For example, if your business wants to gain greater insight and value from data through AI, your storage system should be designed to support the accelerated performance and scale requirements of analytics workloads.

Storage systems that maximize the performance potential of solid state drives (SSDs) and the efficiency and scalability of hard disk drives (HDDs) provide the flexibility and configurability to meet a wide range of application workloads.

Your applications should also drive the essential architecture of your storage system, whether directly connected or networked, whether required to store and deliver data as blocks, files, objects or all three, and whether the storage system must efficiently support a wide range of workloads while prioritizing the performance of the most demanding applications.

Consideration should be given to your overall IT data management architecture to support the scalability, data protection, and business continuity assurance required for your enterprise, spanning from core data centers to those distributed at or near the edge and endpoints of your enterprise operations, and integration with your cloud-resident applications, compute and data storage services and resources.

Ben Gitenstein, VP of Product Management, Qumulo

When searching for the right data storage solution to support your organizational needs today and in the future, it’s important to select a solution that is trusted, scalable to secure demanding workloads of any size, and ensures optimal performance of applications and workloads both on premises and in complex, multi-cloud environments.

With the recent pandemic, organizations are digitally transforming faster than ever before, and leveraging the cloud to conduct business. This makes it more important than ever that your storage solution has built-in tools for data management across this ecosystem.

When evaluating storage options, be sure to do your homework and ask the right questions. Is it a trusted provider? Would it integrate well within my existing technology infrastructure? Your storage solution should be easy to manage and meet the scale, performance and cloud requirements for any data environment and across multi-cloud environments.

Also, be sure the storage solution gives IT control over how they manage storage capacity needs and delivers real-time insight into analytics and usage patterns, so they can make smart storage allocation decisions and maximize the organization’s storage budget.

David Huskisson, Senior Solutions Manager, Pure Storage

Data backup and disaster recovery features are critically important when selecting a storage solution for your business, since no organization is immune to ransomware attacks. When systems go down, they need to be recovered as quickly and safely as possible.

Look for solutions that offer simplicity in management, can ensure backups are viable even when admin credentials are compromised, and can be restored quickly enough to greatly reduce major organizational or financial impact.

Storage solutions that are purpose-built to handle unstructured data are a strong place to start. By definition, unstructured data is unpredictable data that can take any form, size or shape, and can be accessed in any pattern. Purpose-built platforms can accelerate small, large, random or sequential data, and consolidate a wide range of workloads on a unified fast file and object storage platform. Such a platform should maintain its performance even as the amount of data grows.

If you have an existing backup product, you don’t need to rip and replace it. There are storage platforms with robust integrations that work seamlessly with existing solutions and offer a wide range of data-protection architectures so you can ensure business continuity amid changes.

Tunio Zafer, CEO, pCloud

Bear in mind: your security team needs to assist. Answer these questions to find the right solution: Do you need ‘cold’ storage or cloud storage? If you’re looking to only store files for backup, you need a cloud backup service. If you’re looking to store, edit and share, go for cloud storage. Where are the provider’s storage servers located? If your business is located in Europe, the safest choice is a storage service based in Europe.

Best case scenario – your company is going to grow. Look for a storage service that offers scalability. What is their data privacy policy? Research whether someone can legally access your data without your knowledge or consent. Switzerland has one of the strictest data privacy laws globally, so choosing a Swiss-based service is a safe bet. How is your data secured? Look for a service that offers robust encryption in-transit and at-rest.

Client-side encryption means that your data is secured on your device and is transferred already encrypted. What is their support package? At some point, you’re going to need help. Prefer a data storage service whose support package is included for free and answers within 24 hours.
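
To make the client-side encryption idea concrete, here is a minimal Python sketch using the third-party cryptography package’s Fernet primitive. It illustrates the general concept only and is not how pCloud (or any particular provider) implements it; the upload call is a hypothetical placeholder.

```python
# Minimal sketch of client-side encryption: data is encrypted on the device,
# and only ciphertext is ever handed to the storage provider.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # stays on your device / in your key manager
cipher = Fernet(key)

plaintext = b"quarterly financials"          # illustrative data
ciphertext = cipher.encrypt(plaintext)

# upload_to_cloud("reports/q3.bin", ciphertext)   # hypothetical provider API

# Later, on any device that holds the key:
assert cipher.decrypt(ciphertext) == plaintext
```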

AWS adds new S3 security and access control features

Amazon Web Services (AWS) has made available three new S3 (Simple Storage Service) security and access control features:

  • Object Ownership
  • Bucket Owner Condition
  • Copy API via Access Points

Object Ownership

Object Ownership is a setting that can be enabled on an S3 bucket to enforce the transfer of ownership of newly uploaded objects to the bucket owner.


“With the proper permissions in place, S3 already allows multiple AWS accounts to upload objects to the same bucket, with each account retaining ownership and control over the objects. This many-to-one upload model can be handy when using a bucket as a data lake or another type of data repository. Internal teams or external partners can all contribute to the creation of large-scale centralized resources,” explained Jeff Barr, Chief Evangelist for AWS.

But with this setup, the bucket owner doesn’t have full control over the objects in the bucket and therefore cannot use bucket policies to share and manage them. If the object uploader needs to retain access, the bucket owner has to grant additional permissions to the uploading account.

“Keep in mind that this feature does not change the ownership of existing objects. Also, note that you will now own more S3 objects than before, which may cause changes to the numbers you see in your reports and other metrics,” Barr added.
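
As a rough illustration of how this looks in practice, here is a hedged boto3 sketch: the bucket owner enables the new ownership setting, and a contributing account uploads with the bucket-owner-full-control ACL so ownership transfers. Bucket and key names are made up, and error handling is omitted.

```python
# Hedged boto3 sketch of S3 Object Ownership (bucket-owner-preferred).
import boto3

s3 = boto3.client("s3")

# Bucket owner: opt the bucket into the new ownership behavior.
s3.put_bucket_ownership_controls(
    Bucket="shared-data-lake",   # placeholder bucket name
    OwnershipControls={"Rules": [{"ObjectOwnership": "BucketOwnerPreferred"}]},
)

# Contributing account: uploads must grant full control to the bucket owner
# for ownership of the new object to transfer.
s3.put_object(
    Bucket="shared-data-lake",
    Key="partner-a/2020-10-05.csv",
    Body=b"example,data\n",
    ACL="bucket-owner-full-control",
)
```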

Bucket Owner Condition

Bucket Owner Condition lets users verify that a bucket is owned by the expected AWS account when they create a new object or perform other S3 operations.

AWS recommends using Bucket Owner Condition whenever users perform a supported S3 operation and know the account ID of the expected bucket owner.

The feature eliminates the risk of users accidentally interacting with buckets in the wrong AWS account. For example, it prevents situations like applications writing production data into a bucket in a test account.
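
A minimal boto3 sketch of how this is expressed, assuming a made-up bucket name and account ID: the ExpectedBucketOwner parameter causes the request to fail if the bucket belongs to a different account.

```python
# Hedged boto3 sketch of Bucket Owner Condition: the request is rejected if
# the bucket is not owned by the given account ID, so an identically named
# bucket in a test account cannot be written to by mistake.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="prod-billing-exports",            # placeholder bucket name
    Key="2020/10/invoices.json",
    Body=b"{}",
    ExpectedBucketOwner="111122223333",       # account that should own the bucket
)
```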

Copy API via Access Points

S3 Access Points are “unique hostnames that customers create to enforce distinct permissions and network controls for any request made through the access point. Customers with shared data sets […] can easily scale access for hundreds of applications by creating individualized access points with names and permissions customized for each application.”

The feature can now be used together with the S3 CopyObject API, allowing customers to copy data to and from access points within an AWS Region.
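
A hedged boto3 sketch of such a copy, with placeholder access point ARNs for the source and destination; both access points are assumed to be in the same Region, as the feature requires.

```python
# Hedged sketch: copying an object where source and destination are addressed
# through S3 Access Point ARNs (values shown are placeholders).
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

s3.copy_object(
    CopySource={
        "Bucket": "arn:aws:s3:us-east-1:111122223333:accesspoint/analytics-read",
        "Key": "datasets/events.parquet",
    },
    Bucket="arn:aws:s3:us-east-1:111122223333:accesspoint/reporting-write",
    Key="datasets/events.parquet",
)
```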

Businesses prioritize security and collaboration tools to manage sustained remote work environments

77 percent of IT professionals believe they were prepared to manage the rapid shift to remote work during the COVID-19 outbreak, according to TeamViewer.


Among those surveyed, the percentage working from home had abruptly jumped from 28 percent prior to the pandemic to 71 percent during the outbreak. The survey included more than 200 IT executives in the U.S. across various industries.

Manage remote work: High productivity, effectiveness and morale

IT professionals identified many challenges in their response to COVID-19, but felt that their productivity, effectiveness and morale remained high. Eighty-four percent of respondents believed that the “survival” of their companies depended on “providing a stable work environment” during and after the pandemic.

Seventy-eight percent said that technical support requests had also increased. Even so, 49 percent indicated that their volume of work “stayed the same” with another 32 percent noting that it was “higher than usual.”

Most IT professionals surveyed believe they were “very effective” (57 percent) or “somewhat effective” (40 percent) at solving urgent problems that arose during the pandemic. Only 3 percent believed their response was “not effective.”

Seventy-nine percent said it took up to three weeks to establish a stable work environment, but only 41 percent were confident they had sufficient VPN capacity.

Video conferencing as the most effective tool

As part of the initial “work from home” response, video conferencing topped the list as the most effective tool (66 percent), followed by cloud storage (59 percent), device management (49 percent) and collaboration (47 percent), according to respondents.

“Businesses capably managed the rapid transition to remote work in response to the COVID-19 pandemic,” said Gautam Goswami, CMO at TeamViewer. “But it’s critical that IT professionals remain focused on strengthening their infrastructure to guarantee business continuity by putting a range of secure remote connectivity solutions in place.”

“Work from home” concerns

Respondents also identified other concerns as they continue to manage through the pandemic’s extended “work from home” arrangements.

  • Planning for a new normal: On average, IT executives expect that it will take more than seven months to return to “business as usual.” As businesses fortify their infrastructure, 85 percent “agree” or “strongly agree” that their organization will be prepared to manage a future coronavirus outbreak.
  • Security is a top priority: Security remains a top priority for 57 percent of the IT executives surveyed, particularly in response to employees using their own devices and moving from private company networks to the public internet with more access points and increased vulnerabilities.
  • Remote work will continue to trend: Eighty percent of IT leaders say they expect more employees to permanently work remotely, but only 38 percent are sure they have the training needed to handle the rise in remote work.
  • Budget increases: Sixty-nine percent of organizations channelled new funds to IT in the wake of the pandemic, and 80 percent say they expect to need additional budget during the next year.

Attackers exploit Twilio’s misconfigured cloud storage, inject malicious code into SDK

Twilio has confirmed that, for eight or so hours on July 19, a malicious version of their TaskRouter JS SDK was being served from one of their AWS S3 buckets.


“Due to a misconfiguration in the S3 bucket that was hosting the library, a bad actor was able to inject code that made the user’s browser load an extraneous URL that has been associated with the Magecart group of attacks,” the company shared.

Who’s behind the attack?

Twilio is a cloud communications platform as a service (CPaaS) company, which provides web service APIs developers can use to add messaging, voice, and video in their web and mobile applications.

“The TaskRouter JS SDK is a library that allows customers to easily interact with Twilio TaskRouter, which provides an attribute-based routing engine that routes tasks to agents or processes,” Twilio explained.

The misconfigured AWS S3 bucket, which is used to serve public content from the domain twiliocdn.com, hosts copies of other SDKs, but only the TaskRouter SDK had been modified.

The misconfiguration allowed anybody on the Internet to read and write to the S3 bucket, and the opportunity was seized by the attacker(s).
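
For context, here is a hedged boto3 sketch of the sort of guardrail that catches this class of misconfiguration: it flags ACL grants to “everyone” and turns on S3 Block Public Access. The bucket name is a placeholder, and this is generic hardening rather than Twilio’s actual remediation.

```python
# Hedged sketch: audit a bucket's ACL for grants to "everyone", then enable
# S3 Block Public Access. Suitable for buckets that should never be public;
# content meant for public consumption is better served through a CDN in
# front of a non-public bucket (the direction Twilio describes below).
import boto3

s3 = boto3.client("s3")
BUCKET = "example-cdn-origin"   # placeholder bucket name

acl = s3.get_bucket_acl(Bucket=BUCKET)
for grant in acl["Grants"]:
    uri = grant["Grantee"].get("URI", "")
    if uri.endswith(("AllUsers", "AuthenticatedUsers")):
        print("World-accessible grant found:", grant["Permission"])

s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```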

“We do not believe this was an attack targeted at Twilio or any of our customers,” the company opined.

“Our investigation of the javascript that was added by the attacker leads us to believe that this attack was opportunistic because of the misconfiguration of the S3 bucket. We believe that the attack was designed to serve malicious advertising to users on mobile devices.”

Jordan Herman, Threat Researcher at RiskIQ, which detailed previous threat campaigns that used the same malicious traffic redirector, told Help Net Security that, because misconfigured Amazon S3 buckets are so easy to find and grant attackers such a high level of access, attacks like this are happening at an alarming rate.

Om Moolchandani, co-founder and CTO at code to cloud security company Accurics, noted that there are many similarities between waterhole attacks and the Twilio incident.

“Taking over a cloud hosted SDK allows attackers to ‘cloud waterhole’ into the victim environments by landing directly into the operation space of victims,” he said.

The outcome

Due to this incident, Twilio checked the permissions on all of their AWS S3 buckets and found others that were misconfigured, but those stored no production or customer data and had not been tampered with.

“During our incident review, we identified a number of systemic improvements that we can make to prevent similar issues from occurring in the future. Specifically, our teams will be engaging in efforts to restrict direct access to S3 buckets and deliver content only via our known CDNs, improve our monitoring of S3 bucket policy changes to quickly detect unsafe access policies, and determine the best way for us to provide integrity checking so customers can validate that they are using known good versions of our SDKs,” the company shared.

They say it’s difficult to gauge the impact of the attack on individual users, since the “links used in these attacks are deprecated and rotated and since the script itself doesn’t execute on all platforms.”

The company urges those who have downloaded a copy of the TaskRouter JS SDK between July 19th, 2020 1:12 PM and July 20th, 10:30 PM PDT (UTC-07:00) to re-download it, check its integrity and replace it.
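
A minimal sketch of such an integrity check in Python: hash the copy bundled with your application and compare it to the hash of a freshly re-downloaded, known-good copy. The file path and reference hash are placeholders, not values published by Twilio.

```python
# Minimal integrity check: compare the SHA-256 of a local SDK copy against a
# hash taken from a freshly re-downloaded, known-good copy.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

KNOWN_GOOD_SHA256 = "<hash of freshly re-downloaded taskrouter.min.js>"  # placeholder

if sha256_of("vendor/taskrouter.min.js") != KNOWN_GOOD_SHA256:
    print("Local TaskRouter SDK does not match the known-good copy -- replace it.")
```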

“If your application loads v1.20 of the TaskRouter JS SDK dynamically from our CDN, that software has already been updated and you do not need to do anything,” they pointed out.

Cloud IT infrastructure spending grows, non-cloud investments plunge

Vendor revenue from sales of IT infrastructure products (server, enterprise storage, and Ethernet switch) for cloud environments, including public and private cloud, increased 2.2% in the first quarter of 2020 (1Q20) while investments in traditional, non-cloud, infrastructure plunged 16.3% year over year, according to IDC.


Pandemic as the major factor driving infrastructure spending

The broadening impact of the COVID-19 pandemic was the major factor driving infrastructure spending in the first quarter. Widespread lockdowns across the world and the staged reopening of economies triggered increased demand for cloud-based consumer and business services, which in turn drove additional demand for the server, storage, and networking infrastructure used by cloud service provider datacenters.

As a result, public cloud was the only deployment segment escaping year-over-year declines in 1Q20, reaching $10.1 billion in IT infrastructure spend, up 6.4% year over year. Spending on private cloud infrastructure declined 6.3% year over year in 1Q20 to $4.4 billion.

The pace set in the first quarter is expected to continue through the rest of the year, as cloud adoption gets an additional boost from demand for more efficient and resilient infrastructure deployment.

For the full year, investments in cloud IT infrastructure will surpass spending on non-cloud infrastructure and reach $69.5 billion or 54.2% of the overall IT infrastructure spend.

Spending on private cloud infrastructure expected to recover

Spending on private cloud infrastructure is expected to recover during the year and will compensate for the first quarter declines leading to 1.1% growth for the full year. Spending on public cloud infrastructure will grow 5.7% and will reach $47.7 billion representing 68.6% of the total cloud infrastructure spend.

Disparity in 2020 infrastructure spending dynamics for cloud and non-cloud environments will ripple through all three IT infrastructure domains – Ethernet switches, compute, and storage platforms.

Within cloud deployment environments, compute platforms will remain the largest category of spending on cloud IT infrastructure at $36.2 billion, while storage platforms will be the fastest growing segment, with spending increasing 8.1% to $24.9 billion. The Ethernet switch segment will grow 3.7% year over year.

Vendor revenues by region

At the regional level, year-over-year changes in vendor revenues in the cloud IT Infrastructure segment varied significantly during 1Q20, ranging from 21% growth in China to a decline of 12.1% in Western Europe.

Long term, spending on cloud IT infrastructure is expected to grow at a five-year CAGR of 9.6%, reaching $105.6 billion in 2024 and accounting for 62.8% of total IT infrastructure spend.

Public cloud datacenters will account for 67.4% of this amount, growing at a 9.5% CAGR. Spending on private cloud infrastructure will grow at a CAGR of 9.8%. Spending on non-cloud IT infrastructure will rebound somewhat in 2020 but will continue declining with a five-year CAGR of -1.6%.

Cloud-native security considerations for critical enterprise workloads

Since the advent of the public cloud as a viable alternative to on-premise systems, CIOs and CISOs have been citing security as one of the top concerns when it comes to making the switch.

While most of their worries have abated over the years, some remain, fuelled by the number of data leak incidents, mainly arising from misconfiguration.


Johnnie Konstantas, Senior Director, Security Go to Market at Oracle, says that the main reason we are seeing so many headlines around sensitive data leaks and loss is that there are almost too many security tools offered by public cloud providers.

Making cloud security administration less person-intensive and error-prone

“Public clouds are, by and large, homogeneous infrastructures with embedded monitoring capabilities that are ubiquitous and have centralized security administration and threat remediation tools built on top,” Konstantas told Help Net Security.

But cloud customers must train anew on the use of these tools and be properly staffed to leverage them and to coordinate amongst the various security disciplines – and this is hard to do while cybersecurity expertise is in historically short supply.

“Customers don’t want more tools, they want the benefit of cloud service provider innovation and expertise to address the challenge. At this point, we need reliable, accurate security configuration management and threat response that is automated,” Konstantas opined.

This is the direction in which she expects cloud-native security to go in the next five years. She believes we are likely to see a shift away from discussions about the shared responsibility model and more toward making customers cloud security heroes through automation.

“Automation really is central to effective cloud security. Just take the example of data and consider the volume of data flowing into cloud-hosted databases and data warehouses. Classifying the data, identifying PII, PHI, credit cards, etc., flagging overly permissioned access, and requiring additional authorization for data removal – all these things have to be automated. Even the remediation, or prevention of access, needs to be automated,” she noted.
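
To give a flavor of what automated classification means in practice, here is a minimal Python sketch that tags records containing obvious PII patterns (email addresses and Luhn-valid payment card numbers). Production cloud tooling is far more sophisticated; the patterns and sample record are illustrative only.

```python
# Toy data classifier: flag records containing email addresses or candidate
# payment card numbers that pass the Luhn check.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number: str) -> bool:
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(d if i % 2 == 0 else (d * 2 - 9 if d * 2 > 9 else d * 2)
                for i, d in enumerate(digits))
    return total % 10 == 0

def classify(record: str) -> list:
    tags = []
    if EMAIL.search(record):
        tags.append("email")
    if any(luhn_ok(m) for m in CARD.findall(record)):
        tags.append("payment-card")
    return tags

print(classify("jane@example.com paid with 4111 1111 1111 1111"))
# -> ['email', 'payment-card']
```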

Cloud providers will have to break through customers’ fear that automated security means breaking business by over-reacting to false positives, but those that find a way to excel in using machine learning, model tuning and artificial intelligence for effective and accurate automated threat prevention will deservedly earn customer confidence – and not a moment too soon.

Is it safe to put critical enterprise workloads in the public cloud?

Without a doubt, the public cloud has proven a worthy alternative to private data centers by offering high resilience to threats and rapid security incident recovery. But not all public cloud providers are the same when it comes to expertise or built-in security.

Organizations with sensitive data and workloads must find those that will offer adequate security, and can do so by asking many questions and evaluating the answers.

Konstantas proposes the following (though the list isn’t exhaustive):

  • What are your data protection, detection, response and recovery capabilities for both structured (database) and unstructured (object storage) data?
  • How do you protect against hypervisor-based attacks, cross tenant infection, hardware-based attacks?
  • Which customer-controlled security functions are built into your cloud and are they charged for?
  • Which parts of security configuration, detection and threat remediation are automated on your platform and to which services do they apply (i.e. IaaS, PaaS, SaaS)?

For the CISO who has to work with the CIO to lead a massive migration of the organization’s data to the cloud, her advice is to get as much visibility into the project as possible.

“CISOs need to prepare answers for how the organization will meet its regulatory and compliance obligations for the data during the migration and once fully operational in the cloud,” she explained.

Again, there are many questions that must be answered. Among them are:

  • How will security coverage look after the migration as compared to what is being done on premises?
  • How will security posture visibility and effectiveness increase?
  • What cost savings will be incurred on security spend by adopting built-in cloud security?
  • How will holistic cloud security posture be communicated to the CIO and board of directors?

“If the CISO is working with a cloud security provider that understands critical enterprise workloads, they will have ample support and guidance in preparing and documenting these answers because enterprise-focused CSPs have deep experience with the specific requirements of global companies, complex enterprise applications and data residency and sovereignty requirements. Enterprise-focused CSPs staff teams ready to share those insights and furnish the proof points customers require,” she concluded.

Unsecured databases continue leaking millions of records

UK ISP and telecom provider Virgin Media confirmed on Thursday that one of its unsecured marketing databases had been accessed on at least one occasion without permission (though the extent of the access is still unknown).


The database, containing contact and service details of approximately 900,000 customers, was not technically breached.

“The incident did not occur due to a hack but as a result of the database being incorrectly configured,” Virgin Media said. Access to it was not secured and the database was accessible online for 10 months.

There were no financial details or passwords in it, but the potentially compromised information is enough for skilled phishers to mount attacks via email or phone, trying to get the affected customers to give out additional sensitive information that could be used to steal their identity.

Also on Wednesday, Comparitech revealed that, in January, its security research team discovered a similarly unsecured and exposed database with 200 million records containing a wide range of property-related data on US residents.

“The largest portion of the data is a mix of personal, demographic, and property information,” shared Comparitech’s Paul Bischoff.

The records are pretty thorough – they contain individuals’ names, addresses, email addresses, age, gender, ethnicity, employment info, credit rating, investment preferences, income, net worth, as well as information on their habits (e.g., whether they travel, donate to charity, have pets, etc.) and their property (market value, mortgage amount, tax assessment info, etc.).

“The detailed personal, demographic, and property information contained in this dataset is a gold mine for spammers, scammers, and cybercriminals who run phishing campaigns. The data allows criminals not only to target specific people, but craft a more convincing message,” Bischoff pointed out.

Interestingly enough, they were unable to discover who the owner of the database is. As it was hosted on an exposed Google Cloud server, they alerted the company and the database was taken offline on March 4.

The problem with data in the cloud

Time and time again, unsecured databases hosted in the cloud end up accessed by unauthorized parties due to configuration mistakes.

Eldad Chai, CEO and co-founder of data protection and governance firm Satori Cyber, says that happens because today’s model for data security is completely inadequate for the cloud.

“For years, data has been couched in layers of security, from network security to application security, end-point security to anomaly detection. This approach ensured that gaps were more or less covered and significantly limited the real threat of a data leak. Unfortunately, this layered security approach has failed to be implemented as companies migrate to the cloud—and nothing else has taken its place,” he noted.

“Relying on cloud configuration management alone cannot keep companies safe from data leaks and is many steps short of keeping big data stores safe. It is enough for one employee to replicate a VM housing sensitive data to an environment that is not configured to hold it to bring the whole [thing] down.”

While necessary, cloud configuration management shouldn’t be the last line of data security defense, he says, because it is not isolated from environment changes, not simple to configure and enforce, and not transparent and universal (i.e., able to run in any environment).

Vulnerability allows attackers to register malicious lookalikes of legitimate web domains

Cybercriminals were able to register malicious domains under generic top-level domains (gTLDs), as well as subdomains, imitating legitimate, prominent sites, because Verisign and several IaaS services allow the use of specific characters that look very much like Latin letters, according to Matt Hamilton, principal security researcher at Soluble.


To demonstrate the danger of these policies, he registered 25+ domains that resemble a variety of popular domains by using a mix of Latin and Unicode Latin IPA homoglyph characters.

“This vulnerability is similar to an IDN Homograph attack and presents all the same risks. An attacker could register a domain or subdomain which appears visually identical to its legitimate counterpart and perform social-engineering or insider attacks against an organization,” he pointed out.

Some homograph domains had already been registered

During this research he also discovered that, since 2017, more than a dozen homograph domains imitating prominent financial, internet shopping, technology, and other Fortune 100 sites have had active HTTPS certificates – meaning they had already been registered.

“There is no legitimate or non-fraudulent justification for this activity (excluding the research I conducted for this responsible disclosure),” Hamilton noted, and posited that this technique was used in highly targeted social-engineering campaigns.

He also discovered that Google, for example, allows the registration of bucket names that use Unicode Latin IPA Extension homoglyph characters. In fact, it even allows the registration of subdomains which contain mixed scripts (e.g., Latin and Cyrillic characters), which should also be a no-no.
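
As a rough illustration of the kind of check Hamilton argues registries and cloud providers should perform, the following standard-library Python sketch flags hostname labels that are already Punycode-encoded or contain non-ASCII lookalike characters. The example names are illustrative.

```python
# Toy homograph check: flag DNS labels that are Punycode-encoded or contain
# non-ASCII characters (e.g., Cyrillic or Latin IPA Extension lookalikes).
import unicodedata

def label_flags(label: str) -> set:
    """Return reasons a single DNS label looks like a possible homograph."""
    flags = set()
    if label.startswith("xn--"):
        flags.add("punycode")
    for ch in label:
        if ch.isascii():
            continue                            # ASCII letters/digits/hyphens are fine
        name = unicodedata.name(ch, "")
        if name.startswith("LATIN"):
            flags.add("latin-lookalike")        # e.g. Latin IPA Extension homoglyphs
        elif name.startswith("CYRILLIC"):
            flags.add("cyrillic-lookalike")
        else:
            flags.add("non-ascii")
    return flags

def suspicious(hostname: str) -> bool:
    return any(label_flags(label) for label in hostname.split("."))

print(suspicious("exаmple.com"))   # the 'а' is Cyrillic -> True
print(suspicious("example.com"))   # pure ASCII -> False
```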

Mitigation and remediation

Hamilton contacted Verisign (which runs the .com and .net domains) and Google, Amazon, Wasabi and DigitalOcean (IaaS providers) in late 2019 and shared his discovery.

Everyone confirmed the receipt of the responsible disclosure report, but only Amazon and Verisign (so far) did something about the problem.

“Safeguarding the stability, security and resiliency of the critical infrastructure we operate is our top priority. While the underlying issue described by Mr. Hamilton is well understood by the global Internet community – and is the subject of active policy development by ICANN – we appreciate him providing additional timely details about how this issue may be exploited,” a Verisign spokesperson noted.

“Although we understand that ICANN has been on a path to address these issues globally, we have also proactively updated our systems and obtained the necessary approval from ICANN to implement the changes to the .com and .net top-level domains required to prevent the specific types of confusable homograph registrations detailed in Mr. Hamilton’s report.”

Amazon changed its S3 bucket name validation policy to prevent registration of bucket names beginning with the punycode prefix “xn--”, preventing the use of these and all other Unicode homoglyphs.

Hamilton also pointed out that any TLD which allows Latin IPA characters is likely affected by this vulnerability, but that the majority of the most popular sites on the internet use gTLDs (namely .com).

He advises users who discover that someone has registered a homograph of one of their domains to submit an abuse report to the appropriate organization.

He has also promised to soon make available a tool that will help organizations generate homographs for their domains and discover whether they’ve been registered in the last few years.

Microsoft OneDrive gets a more secure Personal Vault, plus additional storage options


Microsoft is launching a new layer of security for users of its OneDrive cloud storage service. OneDrive Personal Vault is a new section of your storage that’s accessed through two-step verification, or a “strong authentication method,” although Microsoft didn’t define the latter term.

Microsoft notes that fingerprinting, face scans, PINs, and one-time codes by email, SMS, or an authenticator app are among the acceptable two-step verification methods. And you’ll automatically get de-authenticated after a period of inactivity—that’s the key to Microsoft’s special security argument here. Two-factor authentication using text or email is less secure than other options. Using the more heavy-duty face or fingerprint verification will require the appropriate hardware, such as a device with Windows Hello.

Personal Vault also has options for transferring physical documents into the OneDrive mobile app. You can scan documents or take photos directly into the Personal Vault section without needing to store the file in a less secure part of your device first.

Users will have to be patient about this update, because Personal Vault will be getting a gradual rollout. The company said in its press release that Australia, New Zealand, and Canada will be getting the service “soon,” and all users will have it by the end of 2019. Personal Vault is coming to OneDrive on the Web, the OneDrive mobile app, and on Windows 10 PCs.

OneDrive does have standard security in place for all users even without the extra oomph of Personal Vault, such as file encryption both in Microsoft Cloud servers and in transit to your device. The tighter security option seems intended to give Microsoft customers more peace of mind for backing up very sensitive or important personal information.

The debut of Personal Vault is the big development, but Microsoft has minor items from its storage team that are also welcome news. The OneDrive standalone storage plan is being increased from 50GB to 100GB without any change in cost. This change will be happening soon and won’t require any action from users.

For those of you accessing OneDrive as an Office 365 subscriber, you’ll also have the option to add more storage to the 1TB you already have. Additional storage can be added in chunks of 200GB starting at $1.99 per month. If you’re managing a truly massive file situation, you can buy 1TB of extra storage for $9.99 a month. Additional storage can be increased and decreased at any time. Microsoft said it will be making this update in the coming months.