Companies that ransomware-hit US organizations hire to facilitate ransom payments risk breaking US sanctions, running afoul of the US Department of the Treasury’s Office of Foreign Assets Control (OFAC) regulations, and may end up paying millions in fines.
These include financial institutions, cyber insurance firms, and companies involved in digital forensics and incident response.
What is the OFAC?
The Office of Foreign Assets Control of the US Department of the Treasury administers and enforces economic and trade sanctions based on US foreign policy and national security goals.
Sanctions can be enforced against foreign countries/regimes, organized groups and individuals that “threaten the national security, foreign policy or economy of the United States”. Ransomware-wielding gangs fall in that category.
In a security advisory published on Thursday, OFAC listed several malicious cyber actors designated under its cyber-related sanctions program: the developer of Cryptolocker, Iranian supporters of SamSam ransomware-wielding gangs, the Lazarus Group (a North Korea-sponsored cybercriminal organization that used the WannaCry ransomware), and Evil Corp, a Russia-based cybercriminal organization that wields the Dridex malware.
The advisory’s salient points
“Ransomware payments made to sanctioned persons or to comprehensively sanctioned jurisdictions could be used to fund activities adverse to the national security and foreign policy objectives of the United States. Ransomware payments may also embolden cyber actors to engage in future attacks. In addition, paying a ransom to cyber actors does not guarantee that the victim will regain access to its stolen data,” the OFAC explained.
“OFAC encourages victims and those involved with addressing ransomware attacks to contact OFAC immediately if they believe a request for a ransomware payment may involve a sanctions nexus. Victims should also contact the US Department of the Treasury’s Office of Cybersecurity and Critical Infrastructure Protection if an attack involves a US financial institution or may cause significant disruption to a firm’s ability to perform critical financial services.”
OFAC might issue a special license allowing the victim (or the company facilitating the payment) to make the transaction, but each application “will be reviewed by OFAC on a case-by-case basis with a presumption of denial.”
It also won’t matter whether the payer knew that the ransomware gangs involved were from countries under US sanctions or under sanctions themselves.
“OFAC may impose civil penalties for sanctions violations based on strict liability, meaning that a person subject to US jurisdiction may be held civilly liable even if it did not know or have reason to know it was engaging in a transaction with a person that is prohibited under sanctions laws and regulations administered by OFAC,” the advisory pointed out.
To pay or not to pay?
It would be best, of course, if a ransomware-hit organization didn’t have to pay the ransom to quickly recover its IT capabilities and return to normal operation, but sometimes paying up is the only option if it wants to stay afloat and/or keep providing vital services.
In and of itself, paying a ransom is not against the law, but if the payment is made to an entity or individual under US sanctions, the action is technically illegal.
But according to Dissent Doe, FBI and Secret Service officials who attended a panel at the Privacy + Security Forum in Washington, D.C., a year ago confirmed that the US government has never prosecuted any victim for paying ransom.
The same panel, which also included private sector lawyers and a representative of a consulting firm, unanimously confirmed that in the overwhelming majority of cases, victims end up getting the decryption key and their data back after paying up.
“So although the public isn’t told this clearly because the government wants to discourage it, I will repeat what I have been saying for quite a while: for some entities, paying ransom will just be a business decision based on how much money they will lose if they cannot function due to the ransomware attack,” Doe noted.
A (potential) fine levied by the US government then becomes just a factor in that equation.
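As a back-of-the-envelope illustration of that business equation, the choice reduces to comparing expected costs on each side. All figures and probabilities below are hypothetical assumptions for the sketch, not data from the article:

```python
# Hypothetical comparison of paying vs. not paying a ransom.
# Every number here is an illustrative assumption.

def expected_cost_of_paying(ransom, fine, p_fine, p_decryption_works,
                            residual_downtime_loss):
    """Expected cost if the victim pays: the ransom itself, a possible
    OFAC fine, and remaining downtime losses if the key never arrives."""
    return (ransom
            + p_fine * fine
            + (1 - p_decryption_works) * residual_downtime_loss)

def expected_cost_of_not_paying(downtime_loss_per_day, recovery_days):
    """Expected cost if the victim restores from backups instead."""
    return downtime_loss_per_day * recovery_days

pay = expected_cost_of_paying(
    ransom=500_000, fine=2_000_000, p_fine=0.05,
    p_decryption_works=0.9, residual_downtime_loss=3_000_000)
no_pay = expected_cost_of_not_paying(
    downtime_loss_per_day=400_000, recovery_days=14)

print(f"pay: ${pay:,.0f}  vs  don't pay: ${no_pay:,.0f}")
```

Under these made-up inputs the expected cost of paying (including a small chance of a fine) comes out far lower than two weeks of downtime, which is exactly the calculus Doe describes some entities making.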
The financial services sector is outperforming other industries, both in its adoption of database DevOps and in its use of monitoring to track database performance and deployments, a newly released edition of Redgate’s 2020 State of Database Monitoring Report has revealed.
Respondents were surveyed in April 2020 while most were in lockdown due to COVID-19. Those responses form the foundation of the report and reveal the significant adoption of third-party database monitoring tools in financial services, which may reflect the ongoing situation where many disparate IT teams are working remotely. This has increased the need to monitor database environments, particularly when zero downtime is now expected – often demanded – in the sector.
The report shows that 61% of those in financial services deploy database changes once a week or more, compared to 43% across other sectors, and 52% deploy multiple times per day or week, up from 35% in other sectors.
Server estates are also larger for financial services, with 36% having between 50 and 500 instances against 26% in other sectors. Notably, the biggest increase has been in estates with over 1,000 instances, which are up eight percentage points year-on-year.
These results have likely contributed to the 66% of companies in financial services reporting that they use a paid-for monitoring tool, compared to only 39% of respondents across other sectors.
To further complicate the picture, the cloud is changing the nature of those estates. 39% of those in financial services already host some or all of their databases in the cloud, and the report shows that migrating to and integrating with the cloud is the biggest challenge facing the sector in the next 12 months.
Yet, despite the far higher rate of database deployments and bigger, more mixed estates to manage, failed deployments are detected earlier and recovered from faster. 49% of failed deployments are detected within 10 minutes and 32% recover from those failed deployments in 10 minutes or under. In other sectors this falls to 39% and 24%, respectively.
For Grant Fritchey, Microsoft Data Platform MVP and Redgate Advocate, this is where the real value of advanced, third-party monitoring tools lies. “With faster deployments and large, hybrid estates, it’s no longer enough to monitor the usual suspects like CPU, disk space, memory and I/O capacity,” says Fritchey.
“Sectors like financial services – and Healthcare and IT – have recognized they need customizable alerts for the operational and performance issues they face, and every deployment displayed on a timeline alongside key SQL Server metrics. That way, when a bad deployment occurs, they can dive into the details, investigate the cause and remedy it immediately. If you can’t do that, frankly, you’ll have a hard time doing DevOps.”
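The idea Fritchey describes, surfacing deployments on a timeline next to database metrics so a bad release is easy to spot, can be sketched in a few lines. This is an illustrative toy (the function name, timestamps, and thresholds are all invented for the example, not Redgate's API):

```python
# Illustrative sketch: correlate each deployment timestamp with metric
# samples in a short window afterwards to flag likely bad deployments.

from datetime import datetime, timedelta

def deployments_near_spike(deployments, metric_samples,
                           threshold, window_minutes=10):
    """Return names of deployments followed within `window_minutes` by a
    metric sample exceeding `threshold` (e.g. query duration in ms)."""
    window = timedelta(minutes=window_minutes)
    flagged = []
    for dep_time, dep_name in deployments:
        for sample_time, value in metric_samples:
            if dep_time <= sample_time <= dep_time + window and value > threshold:
                flagged.append(dep_name)
                break
    return flagged

deps = [(datetime(2020, 4, 1, 9, 0), "release-42"),
        (datetime(2020, 4, 1, 14, 0), "release-43")]
samples = [(datetime(2020, 4, 1, 9, 5), 950),   # spike shortly after release-42
           (datetime(2020, 4, 1, 14, 5), 120)]

print(deployments_near_spike(deps, samples, threshold=500))  # ['release-42']
```

A real monitoring tool does this continuously across many metrics, but the principle is the same: a deployment immediately preceding a metric spike is the first place to investigate.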
Only 10% of organizations are using data effectively for transformational purposes, according to NTT DATA Services.
While 79% of organizations recognize the strategic value of data, the study concludes their efforts to use it are hindered by significant challenges including siloed islands of data across the organization and lack of data skills and talent.
The study analyzes the critical role of data and analytics in helping businesses and organizations pivot from disruption to transformation, an imperative as they respond to today’s global economic climate.
Organizations starting to prioritize a data-driven culture
The study shows only 37% are very effective at using data to adopt or invent a new business model, and only 31% are using data to enter new markets. These different use cases show that organizations have started prioritizing a data-driven culture, but many are still lagging in the most basic aspects of data management and governance.
“Our study reinforces that organizations who act quickly and decisively on their data strategies – or Data Leaders – will recover from the global crisis better and even accelerate their success,” said Greg Betz, Senior Vice President, Data Intelligence and Automation, NTT DATA Services.
“C-suite executives must be champions for the vital role strong data governance plays in resolving systemic process failures and transitioning to new business models in response to the crisis.
“To rebound effectively, corporations, organizations and government agencies must shift to next-generation technologies and create contactless experiences, increased security, and scalable hybrid infrastructures – all reinforced by quality, integrated data.”
Data crisis: Organizations struggle to use data for transformation
The financial services (FS) sector accounts for 25% of the data leaders, making this the sector with the most data leaders. The survey shows that 59% of FS organizations report being aware of and fully prepared for new data regulations.
34% report data is shared seamlessly across the enterprise; however, they are the least likely to report they have clear data security processes in place.
The manufacturing sector boasts the second-highest number of data leaders in the study. More than eight out of 10 respondents say they can act swiftly if there is a data privacy breach; however, as with other sectors, when they attempt to derive value from their data, manufacturers struggle with data silos (24%), and they lack the necessary skills and talent to analyze their data (19%).
Among healthcare respondents, 60% say they’re aware and fully prepared for new and upcoming regulations, and approximately eight out of 10 say they’re confident they can comply with data privacy regulations.
However, this sector ranks first in its lack of data literacy skills — about a fifth of respondents report they don’t understand how to read, create and communicate data as information.
Lack of data talent and skills in the public sector
The public sector has the highest number of data laggards at 37%. Like other sectors, lack of data talent and skills is one of the public sector’s biggest barriers when attempting to understand and derive value from data.
Insurance companies are among the most likely to report they’re aware and fully prepared for new data regulations (58%) and have clear processes in place for securely using their data (50%).
However, when it comes to deriving value from data, insurance companies, like manufacturers, struggle with data silos and lack the right technologies to analyze their data.
“This study validates that many of the top data challenges organizations face today are decades old,” said Theresa Kushner, Consultant, AI and Analytics, NTT DATA Services. “The 2020 pandemic is a wakeup call for businesses at any scale, and a reminder that in today’s global economic climate the time to address data challenges and chart a new path is now.”
Businesses in financial services are ahead of the government sector in adopting DevOps to increase their speed of development and free up developer time, but hurdles still remain, according to Redgate.
Adopting DevOps and overcoming hurdles
“At the heart of what makes the financial services sector so interesting is its willingness to adopt a generative culture, which focuses on breaking free of siloes and promoting a proactive, collaborative atmosphere,” notes Kendra Little, Redgate DevOps Advocate and author of the report.
“It’s what makes DevOps adoption work so well. This is a far cry from the culture and structure we saw from respondents in other sectors, which were often marred by little to no cooperation, and an inability to adapt to change.”
Nowhere is this more important than in financial services. Features need to be delivered faster to meet heightened expectations from customers looking for added value, while regulatory requirements must be complied with and data privacy protected.
- 77% of respondents have adopted DevOps across all or some projects, or have a PoC in place, compared to 69% across other sectors
- 34% of respondents think increased speed of delivery is the biggest driver for automating the delivery of database changes compared to 26% across other sectors
- 66% of respondents believe the move from traditional database development to a fully automated process for deployment can be achieved in a year or less. This rises to 69% in the US and compares to 61% across other sectors
- Financial services businesses are ahead of every other sector in adopting automation across the database development process
- 54% of financial services businesses now deploy database changes daily, weekly, or on demand, compared to 49% across other sectors
“What’s fascinating about the deeper sector analysis is the way businesses in financial services are innovating faster and further than any other sector,” added Little.
“The data shows they face the same obstacles and challenges, but most are still surging ahead. Those who are lagging behind may need to revisit their digital transformation plans in order to remain competitive.”
Phishers are trying to trick investment brokers into sharing their Microsoft Office or SharePoint login credentials by impersonating FINRA, a non-governmental organization that regulates member brokerage firms and exchange markets.
Phishers target investment brokers with malicious emails
The “widespread, ongoing phishing campaign” takes the form of emails purportedly sent by FINRA VPs Bill Wollman and Josh Drobnyk from @broker-finra.org email addresses.
They can contain an attached document or a malicious link, though occasionally they are simply a way to elicit a response and gain the recipient’s trust before a follow-up email arrives with an infected attachment or link, or a request for confidential firm information.
FINRA has asked the Internet domain registrar to suspend services for the broker-finra.org domain, but attackers can easily register another convincing one, so securities firms and brokers are advised to remain on the lookout for suspicious emails and to verify their legitimacy before responding or interacting with them.
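One simple automated check firms can layer on top of that human vigilance is flagging sender domains that merely *contain* a trusted name, the pattern behind "broker-finra.org" impersonating "finra.org". A minimal sketch, assuming a hypothetical allowlist (this is not FINRA guidance, and real mail filtering would also check SPF/DKIM/DMARC):

```python
# Minimal sketch: flag lookalike sender domains that embed a trusted
# brand name (e.g. "broker-finra.org") without being on the allowlist.

TRUSTED_DOMAINS = {"finra.org"}  # assumed allowlist for the example

def sender_domain(address: str) -> str:
    """Extract the domain part of an email address, lowercased."""
    return address.rsplit("@", 1)[-1].lower()

def is_suspicious(address: str) -> bool:
    """True if the sender's domain contains a trusted brand name
    but is not itself on the allowlist."""
    domain = sender_domain(address)
    if domain in TRUSTED_DOMAINS:
        return False
    return any(trusted.split(".")[0] in domain for trusted in TRUSTED_DOMAINS)

print(is_suspicious("bill.wollman@broker-finra.org"))  # True
print(is_suspicious("contact@finra.org"))              # False
```

A check like this catches the exact domain used in this campaign, and any similar lookalike the attackers register next, without needing to know the new domain in advance.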