With database attacks on the rise, how can companies protect themselves?

Misconfigured or unsecured databases exposed on the open web are a fact of life. We hear about some of them because security researchers tell us how they discovered them, pinpointed their owners and alerted them, but many others are found by attackers first.

It used to take months to scan the Internet looking for open systems, but attackers now have access to free and easy-to-use scanning tools that can find them in less than an hour.

“As one honeypot experiment showed, open databases are targeted hundreds of times within a few hours,” Josh Bressers, product security lead at Elastic, told Help Net Security.

“There’s no way to leave unsecured data online without opening the data up to attack. This is why it’s crucial to always enable security and authentication features when setting up databases, so that your organization avoids this risk altogether.”
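
For example, here is a minimal elasticsearch.yml sketch with authentication enabled and HTTPS on the client interface. The keystore path is a placeholder and exact settings vary by Elasticsearch version, so treat it as an outline rather than a drop-in configuration:

    # elasticsearch.yml: minimal security-related settings (sketch)
    # Require authentication and authorization for every request
    xpack.security.enabled: true
    # Encrypt client traffic so credentials and data never travel in the clear
    xpack.security.http.ssl.enabled: true
    xpack.security.http.ssl.keystore.path: certs/http.p12   # placeholder path
    # Bind to a private interface; never expose an unsecured node publicly
    network.host: 127.0.0.1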

What do attackers do with exposed databases?

Bressers has been involved in the security of products and projects – especially open-source – for a very long time. Over the past two decades, he created the product security division at Progeny Linux Systems, managed the Red Hat product security team, and headed security strategy in Red Hat’s Platform Business Unit.

He now manages bug bounties, penetration testing and security vulnerability programs for Elastic’s products, as well as the company’s efforts to improve application security and to add new security features and improve existing ones as customers need or request them.

The problem with exposed Elasticsearch (MariaDB, MongoDB, etc.) databases, he says, is that developers often leave them unsecured by mistake and companies don’t discover the exposure quickly.

“The scanning tools do most of the work, so it’s up to the attacker to decide if the database has any data worth stealing,” he noted, and pointed out that this isn’t hacking, exactly – it’s mining of open services.

Attackers can quickly exfiltrate the accessible data, hold it for ransom, sell it to the highest bidder, modify it or simply delete it all.

“Sometimes there’s no clear advantage or motive. For example, this summer saw a string of cyberattacks called the Meow Bot attacks that have affected at least 25,000 databases so far. The attacker replaced the contents of every affected database with the word ‘meow’ but has not been identified, nor has the purpose of the attack been revealed,” he explained.

Advice for organizations that use clustered databases

Open-source database platforms such as Elasticsearch have built-in security to prevent attacks of this nature, but developers often disable those features in haste, or without understanding that doing so can put customer data at risk, Bressers says.

“The most important thing to keep in mind when trying to secure data is having a clear understanding of what you are securing and what it means to your organization. How sensitive is the data? What level of security needs to be applied? Who should have access?” he explained.

“Sometimes working with a partner who is an expert at running a modern database is a more secure alternative than doing it yourself. Sometimes it’s not. Modern data management is a new problem for many organizations; make sure your people understand the opportunities and challenges. And most importantly, make sure they have the tools and training.”

Secondly, he says, companies should set up external scanning systems that continuously check for exposed databases.

“These may be the same tools used by attackers, but they immediately notify security teams when a developer has mistakenly left sensitive data unlocked. For example, a free scanner is available from Shadowserver.”
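
As a rough illustration of such a check, the following Python sketch probes a list of your own hosts for Elasticsearch instances that answer on the default HTTP port without credentials. The host list is a placeholder, and a real scanner would cover more ports and database types:

    # check_exposure.py: minimal sketch of an exposure check against your own hosts
    import requests  # third-party: pip install requests

    HOSTS = ["db1.example.internal", "db2.example.internal"]  # placeholder inventory

    for host in HOSTS:
        url = f"http://{host}:9200/"  # 9200 is Elasticsearch's default HTTP port
        try:
            resp = requests.get(url, timeout=5)
        except requests.RequestException:
            continue  # unreachable or connection refused: not exposed on this port
        # An unsecured node returns cluster metadata; a secured one returns 401
        if resp.status_code == 200 and "cluster_name" in resp.text:
            print(f"WARNING: {host} answers unauthenticated Elasticsearch requests")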

Elastic offers information and documentation on how to enable the security features of Elasticsearch databases and prevent exposure, he adds, pointing out that security is enabled by default in the company’s Elasticsearch Service on Elastic Cloud and cannot be disabled.

Defense in depth

No organization will ever be 100% safe, but steps can be taken to decrease a company’s attack surface. “Defense in depth” is the name of the game, Bressers says, and in this case, it should include the following security layers:

  • Discovery of data exposure (using the previously mentioned external scanning systems)
  • Strong authentication (SSO or usernames/passwords)
  • Prioritization of data access (e.g., HR may only need access to employee information and the accounting department may only need access to budget and tax data; a role-definition sketch follows this list)
  • Deployment of monitoring infrastructure and automated solutions that can quickly identify potential problems before they become emergencies, isolate infected databases, and flag them to support and IT teams for next steps
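
To make the access-prioritization layer concrete, here is a hedged sketch that uses Elasticsearch’s role API to define a read-only role scoped to a hypothetical employees index, so HR accounts never touch budget or tax data. The cluster address, credentials, role name and index name are all illustrative:

    # create_hr_role.py: sketch of a least-privilege role via Elasticsearch's role API
    import requests

    ES = "https://localhost:9200"   # placeholder cluster address
    AUTH = ("elastic", "changeme")  # placeholder admin credentials

    role = {
        # Read-only access to employee data and nothing else
        "indices": [
            {"names": ["employees*"], "privileges": ["read"]}
        ]
    }

    # PUT /_security/role/<name> creates or updates a named role
    resp = requests.put(f"{ES}/_security/role/hr_read", json=role, auth=AUTH)
    resp.raise_for_status()
    print("Role created:", resp.json())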

He also advises organizations that don’t have the internal expertise to set security configurations and manage a clustered database to hire service providers that can handle data management and have a strong security portfolio, and to always have a mitigation plan in place and rehearse it with their IT and security teams, so that when something does happen, they can execute a swift and intentional response.

How important is monitoring in DevOps?

The importance of monitoring is often left out of discussions about DevOps, but a Gartner report shows how it can lead to superior customer experiences.

The report provides the following key recommendations:

  • Work with DevOps teams during the design phase to add the instrumentation necessary to track business key performance indicators and monitor business metrics in production (a minimal instrumentation sketch follows this list).
  • Automate the transmission of embedded monitoring results between monitoring and deployment tools to improve application deployments.
  • Use identified business requirements to develop a pipeline for delivering new functionality, and develop monitoring into a practice of continuous learning and feedback across stakeholders and product managers.
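
As an illustration of the first recommendation, here is a minimal Python sketch that uses the Prometheus client library to expose a business KPI (a hypothetical completed-orders counter) alongside the usual technical metrics; the metric name and port are assumptions:

    # kpi_metrics.py: sketch of exposing a business KPI for production monitoring
    import time

    from prometheus_client import Counter, start_http_server

    # A business metric, not an infrastructure one: completed orders
    ORDERS_COMPLETED = Counter(
        "orders_completed_total",
        "Number of successfully completed orders",
    )

    def complete_order(order_id: str) -> None:
        # ... application logic would run here ...
        ORDERS_COMPLETED.inc()  # increment the KPI on every completed order

    if __name__ == "__main__":
        start_http_server(8000)  # Prometheus scrapes :8000/metrics
        complete_order("demo-1")
        time.sleep(60)  # keep the endpoint alive long enough to be scraped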

While the report focuses on application monitoring, the benefits of early DevOps integration apply equally to database monitoring, according to Grant Fritchey, Redgate DevOps Advocate and Microsoft Data Platform MVP: “In any DevOps pipeline, the database is often the pain point because you need to update it alongside the application while keeping data safe. Monitoring helps database developers identify and fix issues earlier, and minimizes errors when changes are deployed.”

Optimizing performance before releases hit production

Giving development teams access to live monitoring data during database development and testing, for example, can help them optimize performance before releases hit production. They can see immediately whether their changes introduce operational or performance issues, and drill down to the cause.

Similarly, database monitoring tools can be configured to read and report on deployments made to any server and automatically deliver an alert back to the development team if a problem arises, telling them what happened and how to fix the issue.
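
A rough sketch of that feedback loop follows, assuming a monitoring system with an HTTP API for deployment annotations and an error-rate metric; every endpoint, metric name, and threshold below is hypothetical:

    # deploy_feedback.py: sketch of marking a deployment, then watching for regressions
    import time

    import requests

    MONITOR = "https://monitor.example.internal"  # hypothetical monitoring API

    def record_deployment(server: str, version: str) -> None:
        # Put the deployment on the monitoring timeline
        requests.post(f"{MONITOR}/api/annotations",
                      json={"server": server, "event": f"deploy {version}"})

    def check_for_regression(server: str, threshold: float = 0.05) -> None:
        # Poll a (hypothetical) error-rate metric for five minutes after deploying
        for _ in range(10):
            rate = requests.get(f"{MONITOR}/api/metrics/error_rate",
                                params={"server": server}).json()["value"]
            if rate > threshold:
                print(f"ALERT: error rate {rate:.2%} on {server} after deployment")
                return
            time.sleep(30)
        print(f"{server}: no regression detected after deployment")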

This continuous feedback loop not only reduces time spent manually checking for problems, but speeds up communication between database development and operational teams. Most importantly, this activity all takes place on non-production environments, meaning fewer bad customer experiences when accessing production data.

This increased focus on monitoring is prompting many high-performing DevOps teams to introduce third-party tools that offer more advanced features, like the ability to integrate with the most popular deployment, alerting and ticketing tools.

The advantages

A good example is the financial services sector. Redgate’s report revealed that 66% of businesses in the sector now use a third-party monitoring tool, outpacing all other sectors. And while 61% of businesses deploy database changes once a week or more, compared to 43% across other sectors, issues with deployments are detected faster and recovered from sooner.

The Gartner report states: “By enabling faster recognition and response to issues, monitoring improves system reliability and overall agility, which is a primary objective for new DevOps initiatives.”

Many organizations are discovering there are big advantages in including the database in the monitoring conversation as well.

Database monitoring improves DevOps success for financial services orgs

The financial services sector is outperforming other industries, both in its adoption of database DevOps, and its use of monitoring to track database performance and deployments, a newly released edition of Redgate’s 2020 State of Database Monitoring Report has revealed.

Respondents were surveyed in April 2020, while most were in lockdown due to COVID-19. Those responses form the foundation of the report and reveal significant adoption of third-party database monitoring tools in financial services, which may reflect a situation in which many dispersed IT teams are working remotely. This has increased the need to monitor database environments, particularly when zero downtime is now expected – often demanded – in the sector.

Key findings

The report shows that 61% of those in financial services deploy database changes once a week or more, compared to 43% across other sectors, and 52% deploy multiple times per day or week, up from 35% in other sectors.

Server estates are also larger for financial services, with 36% having between 50 and 500 instances against 26% in other sectors. Notably, the biggest increase has been in estates with over 1,000 instances, which are up eight percentage points year-on-year.

These results have likely contributed to the 66% of companies in financial services reporting that they use a paid-for monitoring tool, compared to only 39% of respondents across other sectors.

To further complicate the picture, the cloud is changing the nature of those estates. 39 percent of those in financial services already host some or all of their databases in the cloud, and the report shows that migrating to and integrating with the cloud is the biggest challenge facing the sector in the next 12 months.

Yet, despite the far higher rate of database deployments and bigger, more mixed estates to manage, failed deployments are detected earlier and recovered from faster. 49 percent of failed deployments are detected within 10 minutes and 32% recover from those failed deployments in 10 minutes or under. In other sectors this falls to 39% and 24%, respectively.

For Grant Fritchey, Microsoft Data Platform MVP and Redgate Advocate, this is where the real value of advanced, third-party monitoring tools lies. “With faster deployments and large, hybrid estates, it’s no longer enough to monitor the usual suspects like CPU, disk space, memory and I/O capacity,” says Fritchey.

“Sectors like financial services – and Healthcare and IT – have recognized they need customizable alerts for the operational and performance issues they face, and every deployment displayed on a timeline alongside key SQL Server metrics. That way, when a bad deployment occurs, they can dive into the details, investigate the cause and remedy it immediately. If you can’t do that, frankly, you’ll have a hard time doing DevOps.”
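
For illustration, here is a small Python sketch that samples two SQL Server counters from the sys.dm_os_performance_counters view, the kind of metric that sits beyond basic CPU and disk checks. The connection details are placeholders, and a real monitoring tool collects far more than this:

    # sqlserver_metrics.py: sketch of sampling SQL Server counters beyond CPU and disk
    import pyodbc  # third-party: pip install pyodbc

    CONN_STR = (  # placeholder connection details
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sql01.example.internal;DATABASE=master;"
        "UID=monitor;PWD=secret"
    )

    # Note: the */sec counters are cumulative, so a true rate needs two samples
    QUERY = """
        SELECT RTRIM(counter_name) AS counter_name, cntr_value
        FROM sys.dm_os_performance_counters
        WHERE counter_name IN ('Batch Requests/sec', 'User Connections')
    """

    with pyodbc.connect(CONN_STR) as conn:
        for name, value in conn.cursor().execute(QUERY):
            print(f"{name}: {value}")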

Financial services leading the way in adopting DevOps, but hurdles remain

Businesses in financial services are ahead of the government sector in adopting DevOps to increase their speed of development and free up developer time, but hurdles still remain, according to Redgate.

Adopting DevOps and overcoming hurdles

“At the heart of what makes the financial services sector so interesting is its willingness to adopt a generative culture, which focuses on breaking free of siloes and promoting a proactive, collaborative atmosphere,” notes Kendra Little, Redgate DevOps Advocate and author of the report.

“It’s what makes DevOps adoption work so well. This is a far cry from the culture and structure we saw from respondents in other sectors, which were often marred by little to no cooperation, and an inability to adapt to change.”

Nowhere is this more important than in financial services. Features need to be delivered faster to meet heightened expectations from customers looking for added value, while regulatory requirements must be complied with and data privacy protected.

Key findings

  • 77% of respondents have adopted DevOps across all or some projects, or have a PoC in place, compared to 69% across other sectors
  • 34% of respondents think increased speed of delivery is the biggest driver for automating the delivery of database changes compared to 26% across other sectors
  • 66% of respondents believe the move from traditional database development to a fully automated process for deployment can be achieved in a year or less. This rises to 69% in the US and compares to 61% across other sectors
  • Financial services businesses are ahead of every other sector in adopting automation across the database development process
  • 54% of financial services businesses now deploy database changes daily, weekly, or on demand, compared to 49% across other sectors

“What’s fascinating about the deeper sector analysis is the way businesses in financial services are innovating faster and further than any other sector,” added Little.

“The data shows they face the same obstacles and challenges, but most are still surging ahead. Those who are lagging behind may need to revisit their digital transformation plans in order to remain competitive.”