The 'ID-on't renounce my freedom' website contains articles and news related to the growing threat to our personal freedom and privacy.
info@id-ont.org
The state of Michigan has introduced a bill to prevent employers from forcing their employees to accept microchip implants. A company requiring employees to have a monitoring microchip implanted under their skin might sound like a movie script, but it is no longer confined to fiction.
Digital security advocates have expressed outrage at revelations that Australian law enforcement agencies have been potentially abusing mandatory metadata retention laws and receiving data explicitly excluded from the legislation.
Remember that 50 million euro fine that Google got slapped with in France for failing to comply with the General Data Protection Regulation last year?
SAN FRANCISCO (Reuters) - Google is planning to move its British users’ accounts out of the control of European Union privacy regulators, placing them under U.S. jurisdiction instead, sources said.
The latest version of the European Commission's white paper on AI does not include a temporary ban on remote facial recognition.
The data includes sensitive personal information and encrypted passwords.
Ireland’s data protection authority has announced a new probe into Google’s handling of location data, specifically “the legality of Google’s processing of location data and the transparency surrounding that processing.”
The Stasi, East Germany’s state security service, may have been one of the most pervasive secret police agencies that ever existed. It was infamous for its capacity to monitor individuals and control information flows. By 1989, it had almost 100,000 regular employees and, according to some accounts, between 500,000 and two million informants in a country with a population of about 16 million. Its sheer manpower and resources allowed it to permeate society and keep tabs on virtually every aspect of the lives of East German citizens. Thousands of agents worked to tap telephones, infiltrate underground political movements, and report on personal and familial relationships. Officers were even positioned at post offices to open letters and packages arriving from or heading to noncommunist countries. For decades, the Stasi was a model for how a highly capable authoritarian regime could use repression to maintain control.
Dating app Tinder is the latest tech service to find itself under formal investigation in Europe over how it handles user data.
Liberty, the human rights organisation, and Privacy International today announced a joint legal action against MI5, following revelations that the intelligence agency has systematically broken surveillance laws for years and kept this secret from the surveillance watchdog.
China has included the internet industry for the first time in an envisioned overhaul of its anti-monopoly laws, potentially giving regulators the power to rein in the country’s increasingly dominant technology giants.
The latest in a long line of privacy scandals broke last week, after Google was found to have been pulling unredacted data from one of America's largest healthcare providers for use in one of its projects. Google has given assurances that it won't use this information to bolster its ad business, but that's not the real issue. How was Google able to acquire this data in the first place?
Professor Sandra Wachter is an expert in law, data and AI at the Oxford Internet Institute at the University of Oxford. She says that every time your data is collected, "you leave something of yourself behind." Anyone, she adds, can use your online behavior to "infer very sensitive things about you," like your ethnicity, gender, sexual orientation and health status.
It's bad enough when companies use those inferences for targeted ads, but it gets a lot worse when they gain access to genuinely private data. For instance, would you feel comfortable if Google started displaying ads for fertility treatments in your emails after a trip to the doctor? Or if your healthcare provider could access your browser history, without your knowledge, to determine how suitable you are for insurance?
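Wachter's point about inference is easy to demonstrate. In the hypothetical sketch below, a handful of innocuous behavioral signals are enough for a simple classifier to guess a sensitive attribute well above chance; the features, data and model are all invented for illustration and imply nothing about what any real company does.

```python
# Illustrative only: ordinary behavioral signals can be enough to infer
# a sensitive attribute. All features, data and weights are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical signals: visits to health sites, late-night browsing
# share, fitness-app sessions (standardized).
n = 1000
X = rng.normal(size=(n, 3))

# Synthetic "ground truth": the sensitive attribute correlates only
# weakly and noisily with two of the signals -- enough to exploit.
y = (0.9 * X[:, 0] + 0.6 * X[:, 1] + rng.normal(scale=1.0, size=n)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)

# Even this toy model recovers the attribute well above the 50% chance
# level -- the subject never disclosed it, yet it is inferable.
print(f"inference accuracy: {clf.score(X_test, y_test):.2f}")
```

The uncomfortable part is that none of the inputs is sensitive on its own; the sensitivity emerges from the correlations.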
Last week, we heard that Google has pulled vast amounts of unredacted and unanonymized data from healthcare provider Ascension. The files included test results, diagnoses and hospitalization records from tens of millions of patients.
These records, Google said, were made available to researchers on its Project Nightingale team as part of plans to build software that might help improve care in healthcare environments. It also said that access to the records was tightly controlled and limited to staffers who had been vetted by Ascension. That hasn't stopped Congress and the Department of Health and Human Services from opening investigations.
How was Google able to grab this data without the consent of the people involved? In the US, it's legal under HIPAA, the Health Insurance Portability and Accountability Act, and Google and Ascension followed the law. At least, within the letter of the law, which allows cross-company data flows under certain conditions. But this isn't just a failing with the law in the US.
"I don't think we could rule something like this out in the EU," says technology lawyer Neil Brown of decoded.legal. "There are no absolute prohibitions in the GDPR," he said, referencing the European General Data Protection Regulation, which covers the European Union and the wider European Economic Area.
"We don't have time to read a 600-page privacy policy. Nobody has the time to do that, and everybody knows that nobody has the time to do that."
Brown says that, instead, the GDPR is "a series of controls or standards which companies must meet if they want to operate in a compliant manner. One of these conditions is that processing is necessary for scientific research purposes, so what Google is doing here may meet the requirements of that." With no case law yet to test that reading, though, we're in a gray area.
As data-hungry companies eye healthcare, the only way to stop these quiet deals from happening could be reform. Wachter says the first step should be removing those long, often-ignored terms-and-conditions scrolls. "Consent is a very bad tool," she said. "We don't have time to read a 600-page privacy policy. Nobody has the time to do that, and everybody knows that nobody has the time to do that."
Another issue, Wachter wrote in 2018, is that data protection laws focus too much on the moment data is collected and not enough on what happens after it has been obtained. The GDPR's data-minimization principle, which forces companies to limit the data they hold on people, at least helps there. But even if you gave informed consent at collection time, you can't control what conclusions are drawn from the data afterwards: if a company decides you're a bad credit risk, you have no way to challenge that inference.
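Data minimization is one of the few GDPR principles concrete enough to sketch in code. The hypothetical example below drops direct identifiers, pseudonymizes the record ID and coarsens quasi-identifiers before a record is shared for research; the field names and salted-hash scheme are illustrative assumptions, not a compliance recipe and not how Ascension or Google actually handled the data.

```python
# A minimal sketch of data minimization before sharing records for
# research. Field names and the pseudonymization scheme are assumptions,
# not a statement of what any provider or regulation actually requires.
import hashlib

SALT = b"rotate-and-store-me-separately"  # hypothetical secret salt

def minimize(record: dict) -> dict:
    """Drop direct identifiers, pseudonymize the ID, coarsen the rest."""
    return {
        # One-way pseudonym: re-identification requires the salt.
        "pseudonym": hashlib.sha256(
            SALT + record["patient_id"].encode()).hexdigest()[:16],
        # Coarsen quasi-identifiers instead of passing them through.
        "age_band": f"{(record['age'] // 10) * 10}-{(record['age'] // 10) * 10 + 9}",
        "region": record["zip"][:3] + "**",  # truncated postal code
        # Keep only the fields the research purpose actually needs.
        "diagnosis_code": record["diagnosis_code"],
    }

raw = {"patient_id": "A123", "name": "J. Doe", "age": 47,
       "zip": "48201", "diagnosis_code": "E11.9"}
print(minimize(raw))  # the name never leaves; age and zip are coarsened
```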
Such inferred conclusions are often the biggest issue, especially in areas where machine learning has been implemented. That's why Wachter believes that now is the time to shift the onus from the individual to the entity hoarding all of that data. She wants to "make it an obligation or responsibility" of whoever is collecting the data to handle it in a responsible and ethically acceptable way.
Wachter also feels that a one-size-fits-all model for data privacy doesn't work in a world where information is so crucial. "You want to have stricter rules when it comes to financial regulation," but potentially looser ones if you're "doing cancer research in a university." But it would be up to each institution, body or company to demonstrate that they deserve that trust.
A key plank of Wachter's reform proposals is the notion that, like the right to be forgotten, we need a right of "reasonable inferences." This would, for instance, allow us to learn what data influenced a decision about us and what assumptions were drawn from that data when it was gathered.
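There is no settled mechanism for such a right, but for simple models the mechanics are easy to imagine: expose which inputs pushed a decision, and by how much. The sketch below does this for a hypothetical linear scoring model; the weights, feature names and threshold are all invented for illustration.

```python
# Illustrative sketch of a "reasonable inferences"-style explanation for
# a linear decision model. The weights, feature names and threshold are
# invented; real scoring systems are rarely this transparent.

# Hypothetical creditworthiness model: one weight per input feature.
WEIGHTS = {"late_payments": -1.5, "account_age_years": 0.4,
           "inferred_income_band": 0.8}
THRESHOLD = 0.0

def explain_decision(features: dict) -> dict:
    """Return the decision plus each input's signed contribution."""
    contributions = {name: WEIGHTS[name] * value
                     for name, value in features.items()}
    score = sum(contributions.values())
    return {
        "decision": "approve" if score >= THRESHOLD else "decline",
        "score": round(score, 2),
        # This per-feature breakdown is what a right of reasonable
        # inferences would let the data subject see and contest.
        "contributions": {k: round(v, 2) for k, v in contributions.items()},
    }

print(explain_decision({"late_payments": 2, "account_age_years": 3,
                        "inferred_income_band": 1}))
# -> a "decline" driven mostly by late_payments, which the subject
#    could then challenge.
```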
We've reported on this before -- cases where data collection agencies look at our online activity and make totally wrong assumptions. When I used my GDPR rights to ask one of the biggest US data companies what, and who, they thought I was, there were major errors in the data. They had even ignored basic facts available as a matter of public record, like my age and marital status, in favor of algorithmic conclusions.
This is going to be an issue both now and in the future, especially as organizations trust machines to draw inferences on their behalf. Facial analysis is already being used to infer a candidate's employability beyond what's written on their resume. Even Facebook uses facial recognition as a form of security, despite numerous catastrophic data breaches.
In Europe, experts are already urging lawmakers to ban the more advanced forms of these social credit schemes. And in the US, there are calls for tougher privacy laws in the spirit of Europe's GDPR. But without specific action to prevent companies from pulling vast amounts of sensitive data and running it through their own machine learning, there's even more trouble ahead.
Source: yahoo.com
China has reportedly ordered all foreign PC hardware and operating systems to be replaced in the next three years, intensifying an ongoing tech war
The US criminal legal system uses predictive algorithms to try to make the judicial process less biased. But there’s a deeper problem.
There is a stretch of highway through the Ozark Mountains where being data-driven is a hazard.
Intelligence agencies stopped the practice last year
Legislation gives government power to order social media sites to put warnings next to posts authorities deem to be false
Legal action brought against Google for allegedly tracking the personal data of four million iPhone users can go ahead in the UK, three judges have ruled.
Judges have ruled against a shopper who brought a legal challenge against police use of automated facial recognition (AFR) technology.