Finnish Data Theft and Extortion

The Finnish psychotherapy clinic Vastaamo was the victim of a data breach and theft. The criminals tried extorting money from the clinic. When that failed, they started extorting money from the patients:

Neither the company nor Finnish investigators have released many details about the nature of the breach, but reports say the attackers initially sought a payment of about 450,000 euros to protect about 40,000 patient records. The company reportedly did not pay up. Given the scale of the attack and the sensitive nature of the stolen data, the case has become a national story in Finland. Globally, attacks on health care organizations have escalated as cybercriminals look for higher-value targets.

[…]

Vastaamo said customers and employees had “personally been victims of extortion” in the case. Reports say that on Oct. 21 and Oct. 22, the cybercriminals began posting batches of about 100 patient records on the dark web and allowing people to pay about 500 euros to have their information taken down.

Senators want answers about algorithms that provide black patients less healthcare

A recent blockbuster study found that software used in healthcare settings systematically provides worse care for black patients than white patients, and two senators want to know what both the industry and regulators are going to do to fix the situation.

Senators Cory Booker (D-N.J.) and Ron Wyden (D-Ore.) on Tuesday issued letters to the Federal Trade Commission, the Centers for Medicare and Medicaid Services (CMS), and the five largest US health insurers asking about bias in the algorithms used to make healthcare decisions.

“In using algorithms, organizations often attempt to remove human flaws and biases from the process,” Booker and Wyden wrote. “Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in. Without very careful consideration, the algorithms they subsequently create can further perpetuate those very biases.”
