Ofcom’s age verification proposals risk privacy and security https://www.openrightsgroup.org/press-releases/ofcoms-age-verification-proposals-risk-privacy-and-security/ Tue, 05 Dec 2023 08:21:55 +0000 https://www.openrightsgroup.org/__preview__/press_release/19272 Open Rights Group has responded to the publication of Ofcom guidelines on age verification for pornography websites. The guidelines outline how sites and apps that display or publish pornographic content must fulfil duties under the Online Safety Act to prevent children and young people from encountering pornography.

ORG’s Programme Manager for Platform Power, Abigail Burke, said:

“Open Rights Group agrees that it is important that children are protected online; however, Ofcom’s proposed guidelines create serious risks to everyone’s privacy and security.

“Age verification technologies for pornography risk sensitive personal data being breached, collected, shared, or sold. The potential consequences of data being leaked are catastrophic and could include blackmail, fraud, relationship damage, and the outing of people’s sexual preferences in very vulnerable circumstances. 

“It is very concerning that Ofcom is solely relying upon data protection laws and the ICO to ensure that privacy will be protected. The Data Protection and Digital Information Bill, which is progressing through parliament, will seriously weaken our current data protection laws, which are in any case insufficient for a scheme this intrusive. Specific and clear privacy rules are needed, given the vast amount of sensitive data that will potentially be processed. 

“Additionally, the ICO has proven itself to be one of the weakest data regulators in Europe and is in urgent need of reform. Ofcom must go further in setting out clearer standards and guidelines to ensure users’ data will be protected from the substantially increased risk of fraud and cybercrime that comes with invasive age verification technologies.” 

ORG has outlined its concerns about the impact of age verification on free speech and privacy in a joint briefing with the EFF.

We are also campaigning against the government’s attempt to grab data powers through the Data Protection and Digital Information Bill, which is expected to be passed in early 2024. Read our latest DPDI briefing on how this Bill will take away the power we have over our data and give more power to government departments and corporations.

DPDI Bill: New ‘welfare surveillance’ proposals target vulnerable people https://www.openrightsgroup.org/press-releases/dpdi-bill-new-welfare-surveillance-proposals-target-vulnerable-people/ Tue, 28 Nov 2023 16:31:59 +0000 https://www.openrightsgroup.org/__preview__/press_release/19225

The UK government is proposing to give the Department for Work and Pensions (DWP) powers to force financial institutions to hand over personal information belonging to people who claim benefits from the state. The plans have been presented as an amendment to the Data Protection and Digital Information (DPDI) Bill, which returns to the House of Commons for its report stage and third reading on Wednesday 29 November.

The proposals would remove safeguards and accountability requirements for requesting financial data from institutions, meaning that the DWP could bypass the protections enshrined in UK data protection law and gain arbitrary access to UK residents’ bank accounts. ORG believes that this data could easily be misinterpreted and benefits sanctions incorrectly imposed.

Migrants, refugees, and people who are disabled, sick, or in need of care may encounter additional challenges when navigating the welfare system. This policy treats these vulnerable populations as potential criminals rather than as individuals in need of support.

ORG’s Policy Manager Mariano delli Santi commented:

“These proposals could see people’s private banking information being shared with DWP. Welfare surveillance further stigmatises people who receive benefits, many of whom already face discrimination and negative stereotyping.

“It could lead to some of the most vulnerable people facing unjust accusations of fraud, and potentially having their benefits removed and their lives destroyed.

“This is not just a hypothetical risk. In 2021 the Dutch government resigned amid an escalating scandal over child benefits in which more than 20,000 families were wrongly accused of fraud by the tax authority.”

Netherlands benefits scandal

The Netherlands benefits scandal saw thousands of people unjustly accused of fraud and having their benefits incorrectly withdrawn after errors involving data sharing and automated decision-making. If the amendment is passed, the DWP could find itself embroiled in a similar scandal, with benefits unfairly revoked and individuals wrongly convicted of fraud.

Impact on migrants and refugees

Refugees, who may have endured years of waiting without the right to work or access to public funds, often rely on benefits to embark on their new lives. Subjecting refugees to continuous scrutiny of their bank accounts poses a risk of discriminatory behaviour, potentially triggering red flags on their accounts without valid reason.

For example, red flags triggered in the context of transferring money overseas could disproportionately affect migrants and refugees engaged in international transactions for legitimate reasons, such as sending money to family members abroad. This may result in unwarranted suspicion and discrimination based on financial activities that are entirely legal.

The fear of constant surveillance and potential errors in the welfare system may also deter eligible migrants and refugees from accessing the support they require. This could lead to increased poverty, homelessness, and other adverse outcomes.

Insufficient time for scrutiny

The amendment is just one of hundreds of new additions to the DPDI Bill, which MPs are expected to discuss tomorrow. Mariano delli Santi added:

“The government has added hundreds of amendments, making it almost impossible for MPs to scrutinise these proposals properly, even though many of them could seriously impact the constituents they serve. The Bill could also lead to the UK adequacy decision being revoked, something that would cost UK businesses £1.2 billion in legal fees alone.

“We urge the government to allow our parliamentarians sufficient time to do their duty and assess the impact of these proposals.”

Briefing on the DPDI Bill

Read ORG’s latest briefing on the key issues with the Bill.

Find out more

Notes to editor:

The DPDI Bill, which will amend the UK GDPR, is a problematic piece of legislation that will remove some of the powers that individuals have over their data and hand them to government departments and corporations.

The UK adequacy decision was adopted by the European Commission in 2021 and allows the free flow of personal data to and from the EU. Conservative estimates found that the loss of the adequacy agreement would cost UK businesses £1 to £1.6 billion in legal fees alone, not including the costs resulting from disruption to digital trade, investment, and the relocation of UK businesses to the EU.

Parliamentary Briefing – Data Protection and Digital Information Bill https://www.openrightsgroup.org/publications/parliamentary-briefing-data-protection-and-digital-information-bill-report-stage/ Mon, 20 Nov 2023 17:03:24 +0000 https://www.openrightsgroup.org/__preview__/publication/19150 Briefing for the Report Stage, November 2023

DATA BILL WILL SET BACK UK ECONOMY AND RIGHTS

The Data Protection and Digital Information (DPDI) Bill will have its report stage in Parliament on 29 November 2023. In this briefing, we explain how the Bill weakens data rights, lowers scrutiny and accountability, unduly expands Government powers, and harms the UK economy and relations with the EU.

The Bill will weaken UK data protection rights, reduce accountability for private businesses and the Government, and have a negative impact on the UK economy: In an increasingly digitalised and data-driven world, existing data protection laws provide legal protection for the public against predatory commercial practices and the increased use of algorithmic decision-making across public services, law enforcement and employment. The Bill will take away the controls the public has over its data and hand more power to government bodies and corporations.

The Bill lowers standards and removes protections and redress mechanisms against harmful uses of artificial intelligence (AI): AI systems are built, trained, and sustained by access to the huge amounts of data that companies and governments collect about us. AI and automated decision-making systems have been found to replicate and amplify biases that exist in everyday life. We need strong data rights protection to ensure that our data is not misused by AI systems.

The Bill will impact marginalised people, for whom data protection is extremely important: For example, refugees and asylum seekers must share data with the authorities in order to apply to live in the UK. If, as proposed, their personal data could be more easily shared with their country of origin or with UK law enforcement or national security bodies, they could be at risk of persecution or harm. This may undermine their trust in the authorities and discourage them from seeking help or accessing needed services, such as healthcare, legal aid, or social support programmes.

The Bill is set to undermine the UK adequacy decision, which allows the free flow of personal data from the EU to the UK and underpins important cooperation initiatives with the EU in trade, law enforcement and research. The loss of the adequacy agreement would cost UK businesses £1 to £1.6 billion in legal fees alone. These are the risks the Government is taking in order to reduce its accountability and allow bad-faith companies to test dangerous technologies on your constituents.

Download the full briefing

ORG briefing on Data Protection and Digital Information Bill (Report Stage).

Download now

Weakened data protection rights

New barriers to exercising data protection rights (Clause 8)

  • The Bill lowers the threshold that allows organisations to deny or charge for a data rights request, such as a request to access or delete personal data, from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’. The new term is open to interpretation and will lead to more requests being refused.

Lower protections around AI and automated decision-making (Clauses 8, 12)

  • Clause 12 removes the right to say no to automated decision-making. Although individuals would retain a right to appeal automated decisions, this would be of little use, as individuals would lack access and resources to scrutinise and challenge how an AI system works.

It will take longer to obtain redress against injustices (Clauses 9, 41, 42)

  • Victims of data abuses will have to wait longer for a rights request to be processed and undergo a privatised complaints procedure with the offending organisation before lodging a complaint with the ICO. In turn, complaints could routinely take 20 months or longer to resolve.
  • Also, the Bill will expand the ICO’s discretion to dismiss complaints, condoning rather than addressing its poor track record on handling complaints from the public.

Less public scrutiny and accountability

Weakened accountability framework (Clauses 15, 16, 18 and 19)

  • The Bill removes important accountability requirements, such as the requirements to keep Records of Processing Activities, carry out Data Protection Impact Assessments, and appoint Data Protection Officers, and replaces them with less robust requirements that only need to be fulfilled in limited circumstances.
  • The Bill also removes the requirement to consult with people affected by high-risk data processing, thus making these assessments less reliable and objective.
  • The Bill threatens responsible AI governance by removing existing accountability rules. It’s important to have standardised documentation and practice for assessing risks throughout the AI lifecycle, ensuring effective enforcement by the ICO, and increasing overall transparency. However, the Bill removes “prescriptive requirements” of the UK GDPR, making this documentation less coherent, less useful and more prone to misrepresentation.

Reduced accountability for businesses

The Bill makes it easier for companies and organisations to circumvent legal data protection requirements by:

  • Misclassifying personal data as anonymous data (Clause 1).
  • Allowing personal data to be used for commercial purposes under the guise of “research purposes” (Clauses 2, 3 and 10).
  • Removing cookie consent requirements for online tracking and personalised advertising (Clause 83).
  • These changes are particularly concerning when seen in relation to the training of AI. The DPDI Bill extends the lower regulatory standards set forth by research provisions to “commercial research”. However, research exemptions were meant to underpin public interest research, not the deployment of commercial products with practical applications that will impact people’s lives.

Undemocratic expansion of government powers

Politicising the ICO (Clauses 29 and 30, 33)

  • The Bill will give the Secretary of State new powers to issue instructions to the ICO and to interfere with how it functions. For instance, the government will be given the power to issue the ICO with a statement of strategic priorities and require the regulator to respond in writing as to how it will address them. The ICO will also have to seek the approval of the Government before issuing Codes of Practice. The ICO plays a key role in the oversight of the Government’s handling of data so it is vital that it is completely independent from Government.

Removing critical oversight of biometrics use and surveillance (Clauses 111, 112, 113)

  • The Bill abolishes the role of the Biometrics and Surveillance Camera Commissioner. A report by the Centre for Research into Surveillance and Privacy warns that, “plans to abolish and not replace existing safeguards in this crucial area will leave the UK without proper oversight just when advances in artificial intelligence and other technologies mean they are needed more than ever”.

Lowered protections for personal data transferred abroad (Schedule 5)

  • The Secretary of State will be able to approve international transfers to countries with weak data protection and a lack of enforceable rights and effective remedies. In particular, the new “data protection test” gives arbitrary discretion to the UK government to consider, as a justification for authorising international data transfers, “any matter which the Secretary of State considers relevant”.

Expanding government control over data (Clauses 5 and 6)

  • The Secretary of State will be given additional powers to introduce (without meaningful democratic scrutiny) new grounds for processing data and new exemptions that would legitimise data uses regardless of the impact this may have on individuals. The list of exemptions is overly broad and vague. For instance, it includes “crime detection”, “national security” or “disclosures to public authorities”. The UK government is given broad powers to amend this list at any time and without meaningful limits to their discretion.

Negative impact on the UK’s economy and EU relations

Harming UK businesses

  • Numerous businesses have spoken out about the negative impacts of the Bill’s proposals. Some startups are already leaving the UK in anticipation of this reform. Navigating multiple data protection regimes will significantly increase costs and create bureaucratic headaches for businesses. A separate data protection regime creates barriers between the UK and its closest trading partner.

Undermining adequacy and threatening relationships with the EU

  • Losing the UK adequacy decision would introduce significant frictions in trade, undermine the competitiveness of UK businesses, and threaten important relationships with the EU, including cooperation on law enforcement and research and the Windsor Framework. The European Commission noted in a written statement that the powers given to the Secretary of State and the proposed changes to the Information Commissioner’s Office “raise questions with respect of the level of protection” for personal data in the UK. Likewise, the European Parliament found that the Bill raises significant concerns over the implementation of the EU-UK Trade and Cooperation Agreement. Several EU civil society groups have already demanded that the UK adequacy decision be scrapped if this Bill is passed.

GET IN TOUCH

For more information on this Bill, get in touch with mariano@openrightsgroup.org and james.baker@openrightsgroup.org.

About Open Rights Group (ORG): Founded in 2005, Open Rights Group (ORG) is a UK-based digital campaigning organisation working to protect individuals’ rights to privacy and free speech online. ORG has been following the UK government’s proposed reforms to data protection since their inception. In June 2022, we organised an open letter signed by a coalition of over 30 organisations that highlighted the failure of the DCMS to properly engage with civil society groups about the proposed reforms, and in March 2023, we delivered a letter signed by 25 CSOs to Michelle Donelan, highlighting our serious concerns with the Government’s draft legislation.

Imprint: Published by Open Rights, a non-profit company limited by Guarantee, registered in England and Wales no. 05581537. The Society of Authors, 24 Bedford Row, London, WC1R 4EH. (CC BY-SA 3.0).

King’s Speech: Investigatory Powers Act reforms threaten security https://www.openrightsgroup.org/press-releases/kings-speech-investigatory-powers-act-reforms-threaten-security/ Tue, 07 Nov 2023 12:54:33 +0000 https://www.openrightsgroup.org/__preview__/press_release/18630 Open Rights Group has responded to the proposed amendments to the Investigatory Powers Act, announced in the King’s Speech today.

The amendments could mean that global tech companies are forced to get permission from the UK government before making changes to the security features of their products and services. This is likely to mean further attacks on end-to-end encryption, which keeps our communications and transactions safe. At the time of the consultation on these changes, Apple responded by saying that if it were forced to weaken the security of products such as FaceTime and iMessage, it would simply remove them from the UK.

Abigail Burke, Platform Power Programme Manager for Open Rights Group, said:

“End-to-end encryption keeps our data and our communications safe and secure. The proposed reforms to the Investigatory Powers Act are the government’s latest attack on this technology.

“If enacted, these reforms pose a threat to companies’ ability to keep our data safe and increase the risk of criminal attacks. We urge the government to engage with civil society and tech companies, and to reconsider these potentially dangerous proposals.”

Read Open Rights Group’s submission on the consultation into the reform of the Investigatory Powers Act.

Response to Consultation on revised notices regimes in the Investigatory Powers Act 2016 https://www.openrightsgroup.org/publications/response-to-consultation-investigatory-powers-act-2016/ Tue, 07 Nov 2023 12:46:06 +0000 https://www.openrightsgroup.org/__preview__/publication/18620

Introduction

ORG would like to express concern regarding the proposed changes, which seem to be squarely aimed at reducing the possibility of the introduction of encryption to protect user data from unwanted access.

The Home Office appears to regard encryption, especially end-to-end encryption (E2EE), where a service provider is unable to see the contents of the communications it facilitates, as a threat to its capabilities and, by extension, to national security. It appears to be seeking to extend its powers to prevent E2EE from being used at scale, despite its benefits to users and vendors.

E2EE is a significant protection against everyday criminality, abuse and intrusion. E2EE also protects vendors from becoming a vector for potentially massive data loss, which can have traumatic consequences where the data is especially personal. The spread of encryption should be seen as a benefit, as has been highlighted by the former head of the NCSC.
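
As a purely illustrative sketch of why E2EE puts content beyond a provider’s reach: in an end-to-end encrypted exchange, the keys exist only on the users’ devices, so the service in the middle relays ciphertext it cannot read. The Python example below uses the PyNaCl library; the names and the notion of a relaying “server” are hypothetical and do not describe any particular provider’s system.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# Alice and Bob each hold a private key that never leaves their device; the
# relaying service only ever sees ciphertext.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at 6pm")

# The service stores and forwards these bytes. Holding no private key,
# it cannot decrypt them.
relayed = bytes(ciphertext)

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(relayed)
assert plaintext == b"meet at 6pm"
```

The property at issue is visible in the last lines: without one of the endpoint private keys, the relayed bytes are useless, to the provider and to anyone who compromises it alike.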

Where encryption has been introduced or is sought to be introduced, it is typically because of a threat to individuals’ data that would have high impacts if accessed by criminals; or else because abuse or interference in that data is already routinely taking place, but is difficult to enforce against.

In either case, limiting the roll-out of encryption because of law enforcement considerations risks enabling criminal or unlawful behaviour, and places large numbers of users at risk, in order to give the Home Office the easiest means of addressing questions of lawful access, which can usually be addressed by other means.

Large numbers of UK professionals – doctors, lawyers, accountants, management consultants, IT experts – deal in personal data which they are required, under data protection legislation, to exchange securely with others. They exchange information via encrypted services and encrypted file attachments. The current proposals undermine these routes, with significant implications for the UK’s service economy. The services industry amounts to 79% of UK GDP (House of Commons Library, 14/07/2023).

While the government may have particular reasons to seek access to data and systems in certain limited circumstances, it should neither assume that all data should be easily accessible nor seek legal regimes to ensure that data is kept easily accessible. We reiterate that encryption does not prevent lawful access per se. It may require law enforcement to covertly access a device, or to seize it and demand passwords; however, these approaches are likely to be more proportionate than simply preventing security measures from evolving for the population at large.

Finally, it is worth noting that the consultation does not provide a meaningful explanation of what changes the government is planning to introduce and why. In particular, the consultation lacks details concerning the safeguards and conditions that would apply to the issuing of these notices under the (unclear) revised terms. This lack of detail also affects respondents’ ability to comment on the effectiveness of the Judicial Commissioner’s oversight under the new regime, which will depend on the breadth of discretion being given to the Secretary of State for the issuing of these notices. The overall approach to this consultation appears dubious and wholly inadequate, and raises more questions about what is being omitted than about what is being proposed.

Proposed objectives of the changes

The objectives outlined are concerning, especially when taken together. Under Objective 1 – Strengthening the notice review process, the consultation states that: “When giving a notice for the first time the Secretary of State has a statutory obligation to engage in a consultation period with the relevant operator. … during a review period the operator is not required to comply with the notice, so far as referred, until the Secretary of State has concluded the review … Where an operator is seeking to make changes to their system that would have a detrimental effect on a current lawful access capability, this could create a capability gap during the review period, which is an issue we believe should be addressed.

This could be done through a general requirement to maintain the status quo through this period, ensuring that our lawful access to data is maintained.”

Likewise, Objective 2 – Timely and informative responses asks that “there should be an obligation placed on the operators to cooperate with the consultation process before the decision to give a notice is made, and with any subsequent review process, and to provide relevant information as necessary and within a reasonable time.”

These objectives appear to be designed to impose a “freeze” on changes to the service while consultations are taking place. The intention appears to be to stop improvements to user security from being rolled out.

While this objective may appear to be reasonable, it would allow the Home Office to prevent secure services from launching in the UK, even where they are rolled out elsewhere. This provision would allow the Home Office to place itself in a position of power over the provider as soon as it hears about the possibility of data being less accessible than it is currently. This situation would take place without reference to an independent authority to assess the rationale or proportionality. Such a move might not be proportionate, for instance, if the security technology had already been introduced safely and with demonstrable benefits to users in other parts of the world.

We understand that the Home Office has already intervened in the use of encrypted technologies that are widespread in other parts of the globe, but are either delayed or less available in the UK. The change in powers suggested would make it easier for the Home Office to routinely intervene to delay roll out of security improvements, making the UK a far less attractive place for digital businesses as well as less safe for its citizens.

At present, we believe that some companies have complied with Home Office requests to negotiate over security changes. The Home Office is able to request that the company comes to an agreement with them about a suitable way forward, and not to implement any changes until this is done, under threat of a Technical Capability Notice should they proceed without agreement. Since agreement is often not possible without compromising the premise of encrypted communications, the company is left unable to conclude its negotiations. This conundrum nevertheless suits the Home Office, as preventing the proposed security improvements is its goal.

With the proposed power, should a company call the Home Office’s bluff, and say that in fact it will proceed with its security improvements, and that it might simply withdraw from the UK market should it be issued with a TCN, then the Home Office could simply start the TCN process, in order to freeze the technology for UK users. The company would find it much harder to conclude the process satisfactorily.

Under Objective 4 – notification requirements, the Home Office proposes: “to make changes that would support cooperation between government and industry by setting clear expectations about the circumstances in which operators might be expected to notify the Secretary of State of planned changes to their service that could have a negative impact on investigatory powers and, where necessary, mandating notification of planned changes. … we propose to introduce a requirement for the Secretary of State to consider the necessity and proportionality of imposing a requirement to notify, including taking into account the impact on the business or businesses to whom it will apply as well as the likely benefit of early notification. This would avoid placing burden on those telecommunications operators whose data is of minimal operational importance. …

Additionally, we intend to develop a series of thresholds that would also trigger the notification requirement, for example, if a technical change could substantively impact existing IPA capabilities or the availability of communications and communications related data for a certain number of users or a certain percentage of the market. We welcome comments from respondents on this approach, including potential thresholds.”

It is unclear whether the notification requirement would exist without a TCN being in place, or whether one might be issued to a company to provide broad access to data without any specific changes being required, and then be used to limit any changes that would improve user security in the future. The requirement appears to be aimed at ensuring that any large digital provider would be obliged to explain if any of its services might be made more secure, so that the Home Office is able to prevent security measures from being put in place.

Combined with the first three objectives, this requirement forms a strategy whereby the Home Office seeks to know about any possible secure service being introduced, and then intervenes to stop it from being rolled out.

Compromising technology and companies

A further result of this strategy is that companies may be asked to lie about the security of their products, after compromising them to introduce ‘back doors’, or may be asked to provide services they believe are substandard and produce risks to their users. There are reputational issues to making such compromises, should they become apparent, and the UK will become known as a substandard place for digital business. This strategy is unreasonable, and may lead to companies leaving the UK market. At best, it may mean that improved security available to people elsewhere is specifically unavailable in the UK.

Extraterritoriality

It must also be made clear that the UK cannot and should not purport to be able to “freeze” the development and deployment of products outside of the UK. Other countries are governed by the rule of law and have their own, potentially conflicting, requirements and allowances. The UK should not purport to be able to veto the development and deployment of security technologies that could be beneficial to millions of users outside of the UK.

Proportionality of the Home Office strategy

It is unclear that it is possible to make the Home Office’s overall strategy proportionate, if the intended outcome is to limit the introduction of E2EE or other security technologies and thereby reduce the security of millions of users.

As the strategy relies on coercive powers, to start processes which would stall the deployment of security technologies with the intention of preventing or compromising their introduction, the proportionality test would need to be made both at the start of the process, and particularly at the level of the Home Office’s overall strategy of preventing the use of these security technologies.

Unfortunately, as these powers seem in practice to be exercised through the informal threat of a TCN, to dissuade companies from taking steps the Home Office does not like, their use to prevent the deployment of E2EE and other security improvements is often not subject to any proportionality test whatsoever.

Finally, the government proposes “to introduce a requirement for the Secretary of State to consider the necessity and proportionality of imposing a requirement to notify.” While the wording of this proposal appears rather vague, it is reasonable to assume that the government intends to introduce a requirement in legislation to “have regard to” necessity and proportionality. This change, however, would represent a soft requirement that fails to provide a legally binding safeguard: the Secretary of State could even decide to breach proportionality and necessity when issuing a notice, insofar as they had “regard” to them. As such, this change would fail to ensure the proportionality of the regime and would instead create an arbitrary and unaccountable power in the hands of the Secretary of State.

Minimum change needed

At a minimum, if companies are to be subject to coercion, such as temporarily making no changes to their technologies as set out in Objectives 2 and 3, or notifying the government of intentions to change technologies, as set out in Objective 4, then these requests should be subject to the ‘double lock’, and be tested for proportionality.

An alternative strategy

The Home Office should stop treating the increasing use of encrypted technology as a threat, and should instead see it as a protection against routine criminality. It should assess the specific alternative methods of access it has available. It should improve the investigative abilities of law enforcement.

Published by Open Rights, a non-profit company limited by Guarantee, registered in England and Wales no. 05581537. The Society of Authors, 24 Bedford Row, London, WC1R 4EH. (CC BY-SA 3.0).

About Open Rights Group (ORG): Founded in 2005, Open Rights Group (ORG) is a UK-based digital campaigning organisation working to protect individuals’ rights to privacy and free speech online. ORG has been following the UK government’s proposed reforms to data protection since their inception.

Hands Off Our Data https://www.openrightsgroup.org/campaign/hands-off-our-data/ Wed, 01 Nov 2023 15:05:40 +0000 https://www.openrightsgroup.org/__preview__/campaign/18304 Your data will be used against you and you’ll have less ability to do anything about it.

The Data Protection and Digital Information Bill gives the State and companies more access to our data with fewer limitations. What rights remain over our personal data are made harder to enforce.

The Data Grab Bill will:

  • Create powers to snoop on the bank accounts and financial assets of anyone who receives any benefit, including the State Pension!
  • Make it harder to access your data by giving organisations more powers to refuse requests
  • Increase automated decision-making
  • Expand exemptions for data sharing, use and reuse
  • Increase political interference with the ICO without parliamentary oversight
  • Remove the need to carry out impact assessments
  • Create new powers to approve international data transfers

JOIN OUR CAMPAIGN

Your data rights are under attack. Tell the government to get their hands off your data and stop the Data Grab Bill.

TAKE ACTION

Why data protection matters

Data protection rights protect people across all aspects of life – at work, using NHS services, applying for jobs or renting a flat.

As the use of flawed and biased decision-making algorithms increases across every sector, data protection is particularly important for protecting the rights of marginalised groups, including migrants, LGBTQ people and people from racialised backgrounds.

The UK GDPR has been used to challenge unfair dismissals in the workplace, private companies’ use of public health data, and illegal profiling by advertising companies. The Data Grab Bill will undermine your ability to have control over how your data is processed.

FIND OUT MORE

Our analysis on how the Data Grab Bill worsens the power imbalance between us, the State and corporations.

Read now

What’s wrong with the Data Grab Bill?

The Data Protection and Digital Information Bill will erode data subjects’ rights and corporate accountability mechanisms. It expands the Secretary of State’s powers in numerous ways, creating a greatly weakened data protection structure.

The Bill damages proper oversight of data processing, jeopardises sensitive information about UK residents, and creates new opportunities for discrimination against vulnerable groups.

  • Data subject rights and corporate accountability

Changes to Data Protection Impact Assessments remove the requirement for organisations to consult with data subjects who are affected by high-risk data processing. Additionally, the Bill lowers the threshold for organisations to refuse a Subject Access Request and removes individuals’ right not to be subject to solely automated decision-making.

  • Oversight of data processing

The independence of the Information Commissioner’s Office (ICO) will be reduced. As the ICO plays a key part in the oversight of the government’s use of data, this is extremely problematic.

  • Expansion of data processing

The Bill creates new grounds for processing data and more exemptions to the limits that restrict how data can be processed. The Secretary of State could make changes to increase the ways that data is used and reused without meaningful parliamentary oversight.

  • Data transfers

The Secretary of State would have the discretion to approve international data transfers to countries with insufficient data protection standards.

GET INVOLVED

Download our campaigning toolkit for you to take action to protect your data rights.

Find out more
Black History Month 2023 https://www.openrightsgroup.org/blog/black-history-month-2023/ Tue, 31 Oct 2023 17:16:05 +0000 https://www.openrightsgroup.org/__preview__/blog/18291 As Black History Month draws to a close, we’d like to recognise some of the individuals, organisations and initiatives, who inspire us through their work challenging racism within tech, policing and our wider society.

Kids of Colour

Kids of Colour is a project for young people of colour aged 25 and under to explore ‘race’, identity and culture. Their advocacy on behalf of the Manchester 10 raised awareness of how online communications and social media can be misinterpreted and weaponised to feed into already discriminatory criminal justice practices, such as certain conspiracy charges and joint enterprise. This practice is increasingly used by prosecutors to form gang narratives and disproportionately impacts young Black men. There’s still lots to do and we encourage you to follow their work, especially as 3 of the Manchester 10 are having their sentencing appeals heard on 3 November.

Joy Buolamwini

In 2018, Joy Buolamwini and Timnit Gebru co-wrote Gender Shades, which exposed how commercial facial recognition systems often failed to recognise the faces of Black and brown people, especially Black women. Her work has led big tech companies, including Google, IBM and Microsoft, to improve their software and reduce its bias, and deterred them from selling their technology to law enforcement. Watch the excellent documentary Coded Bias to understand more.

Abeba Birhane

Abeba Birhane is a cognitive scientist and one of the people who blew the lid on how harmful data from the darkest corners of the internet is being used to train large language models. She is the keynote speaker at the 2023 AI and Society Forum and Time Magazine has named her one of the 100 most influential people in AI.

Tracey Gyateng

Tracey Gyateng is a quantitative social researcher who has used her experience in the social sector and in the access, collection and management of quantitative data to inform decision making. Her projects have included supporting The Legal Education Foundation to develop The Justice Lab.

Patrick Williams

Patrick Williams’ outstanding work around gangs databases and tech use in policing exposes how racial biases can be embedded within institutions with power and reinforce the marginalisation of communities. Here’s one article everyone should read.

Holding our Own

Only by moving away from policing as a response to social problems can we tackle systemic racism. Holding our Own, produced by Liberty and other partners, is a guide to tackling serious youth violence in ways that don’t harm communities but give all children a chance to thrive.

Timnit Gebru

Timnit Gebru used to co-lead Google’s ethical AI team before the company forced her out over a paper she wrote about the risks associated with large language models, which brought up tensions with a core line of the company’s own research.

Awate/UNJUST UK action

Musician Awate and UNJUST – a small non-profit tackling unjust policing practices – took on the UK’s largest police force and helped expose the legal flaws of the Gangs Matrix – a biased, pre-crime database that subjects a disproportionate number of young Black males to surveillance. They joined forces with Liberty, and the Met is now being forced to overhaul the database.

Habib Kadiri/StopWatch

Habib Kadiri, Executive Director of StopWatch, is shining a light on stop and search, and campaigning against the over-policing of marginalised communities. StopWatch has been a vital partner for Open Rights Group and many others who are challenging over-policing. They have been among those keeping pace with, and challenging, harmful advances in police tech. Read this StopWatch article on facial recognition in policing.

Campaigners urge schools not to rush to report pupils to Prevent in wake of escalated Israel-Palestine conflict https://www.openrightsgroup.org/press-releases/prevent-duty-israel-palestine-conflict/ Tue, 31 Oct 2023 17:12:54 +0000 https://www.openrightsgroup.org/__preview__/press_release/18278 It is right that young people will want to discuss issues of terrorism, human rights and humanitarian obligations in the context of the current crisis in Israel and Palestine. Many will have strong opinions and feelings about the matter which will need to be handled sensitively and professionally by teachers.

Open Rights Group and Prevent Watch have serious concerns that current government advice to schools will place pupils and their teachers at risk rather than enabling discussion and proper debate of the topics. These concerns have been exacerbated by disturbing reports that the Met are increasing intelligence-gathering at schools.

Earlier this month, Education Secretary Gillian Keegan wrote to both schools and universities about the crisis and its impact on children and young people. Both letters remind educational establishments of their responsibilities under the Prevent duty, which requires them to report students if they are concerned that they could be drawn into terrorism.

Prevent is not about unlawful ideas, but about lawful ideas that are judged to be extreme. Instead of any issues of concern being addressed within the classroom, or in private discussion, by a teacher familiar with the student, they become a matter for external authorities, including counter-terrorism police. This will be distressing for the young person and it will be something that is recorded and potentially shared with other agencies, even if the referral does not proceed further.

Prevent is a flawed programme that undermines freedom of expression. It has been shown disproportionately to impact Muslims, something that will be accentuated in the context of Israel/Palestine. Since 2015 there have been more than 45,000 Prevent referrals. The overwhelming majority were never progressed to a Channel deradicalisation intervention, either being dropped or subsumed into other safeguarding interventions.

We believe schools need to be supported in how they encourage freedom of expression, rather than being compelled to surveil and potentially report their pupils.

In addition, Keegan’s letter to schools acknowledges the horror of the Hamas attacks and how this might impact students, but makes no reference to the similar consequences of Israel’s attacks on Gaza, which are estimated to have killed over 8,000 people and wounded many thousands more. Schools need to ensure that children feel safe to talk about all aspects of world events that affect them.

Sophia Akram, Programme Manager at Open Rights Group, highlighted the ongoing risks from Prevent referrals:

“When teachers and others working with young people and children report them to Prevent, they are unlikely to appreciate the potential ramifications of that action for that young person or child.

“The reality is that the referral becomes a stain on their record, which could be shared across multiple databases held by police forces, local authorities and other bodies – possibly indefinitely. That’s a heavy charge on someone in their formative years who is simply attempting to process the calamitous world around them.”

Dr Layla Aitlhadj, Director of Prevent Watch, said:

“Young people should be able to develop their ideas and have them discussed and challenged in an environment that supports them. Having different opinions, questioning government policies and being moved by humanitarian crises should be encouraged, rather than discouraged. They should not face the threat of being interviewed by counter-terrorism police. Young Muslims should not be made to feel that their voices should not be heard.”

Resist Pre-Crime

Data and content are being weaponised to criminalise people without cause.
Find Out More
Resist Pre-Crime
Letter to Rishi Sunak: AI Summit is dominated by Big Tech and a ‘missed opportunity’ https://www.openrightsgroup.org/press-releases/letter-to-rishi-sunak-ai-summit/ Mon, 30 Oct 2023 12:32:14 +0000 https://www.openrightsgroup.org/__preview__/press_release/18236
  • More than 100 UK and international organisations, experts and campaigners sign open letter to Rishi Sunak
  • Groups warn that the “communities and workers most affected by AI have been marginalised by the Summit.”
  • ‘Closed door’ event is dominated by Big Tech and overly focused on speculative risks instead of AI threats “in the here and now”
  • Signatories to letter include leading human rights organisations, trade union bodies, tech orgs, leading academics and experts on AI


In an open letter to Prime Minister Rishi Sunak, the groups warn that the “communities and workers most affected by AI have been marginalised by the Summit” while a select few corporations seek to shape the rules.

The letter has been coordinated by the TUC, Connected by Data and Open Rights Group and is released ahead of the official AI Summit at Bletchley Park on 1 and 2 November.

The full letter and signatories can be found here and include:

  • Major and international trade union confederations – such as the TUC, AFL-CIO, European Trade Union Confederation and International Trade Union Confederation – representing tens of millions of workers worldwide
  • International and UK human rights orgs – such as Amnesty International, Liberty, Article 19, Privacy International, Access Now
  • Domestic and international civil society organisations – such as Connected by Data, Open Rights Group, 5Rights, Consumers International
  • Tech community voices – such as Mozilla, AI Now Institute and individuals associated with the AI Council, Alan Turing Institute and British Computer Society
  • Leading international academics, experts, and members of the House of Lords

Highlighting the exclusion of civil society from the Summit, the letter says:

Your ‘Global Summit on AI Safety’ seeks to tackle the transformational risks and benefits of AI, acknowledging that AI “will fundamentally alter the way we live, work, and relate to one another”. Yet the communities and workers most affected by AI have been marginalised by the Summit.

The involvement of civil society organisations that bring a diversity of expertise and perspectives has been selective and limited. This is a missed opportunity.

Open Rights Group Policy Manager for Data Rights and Privacy, Abby Burke, said:

“The government has bungled what could have been an opportunity for real global AI leadership due to the Summit’s limited scope and invitees. The agenda’s focus on future, apocalyptic risks belies the fact that government bodies and institutions in the UK are already deploying AI and automated decision-making in ways that are exposing citizens to error and bias on a massive scale.

“It’s extremely concerning that the government has excluded those who are experiencing harms and other critical expert and activist voices from its Summit, allowing businesses who create and profit from AI systems to set the UK’s agenda.

“The organisations and individuals who signed this letter represent the interests of millions of people both across the UK and globally. We hope the government heeds this call to democratise the future of AI.”

Senior Campaigns and Policy Officer for Connected by Data, Adam Cantwell-Corn, said:

“The open letter is a powerful, diverse and international challenge to the unacceptable domination of AI policy by narrow interests.

“AI must be shaped in the interests of the wider public. This means ensuring that a range of expertise, perspectives and communities have an equal seat at the table. The Summit demonstrates a failure to do this.

“Beyond the Summit, AI policy making needs a re-think – domestically and internationally – to steer these transformative technologies in a democratic and socially useful direction.”

TUC Assistant General Secretary Kate Bell said:

“It is hugely disappointing that unions and wider civil society have been denied proper representation at this Summit.

“AI is already making life-changing decisions – like how we work, how we’re hired and who gets fired. But working people have yet to be given a seat at the table.

“This event was an opportunity to bring together a wide range of voices to discuss how we deal with immediate threats and make sure AI benefits all. It shouldn’t just be tech bros and politicians who get to shape the future of AI.”

OPEN LETTER ON THE AI SAFETY SUMMIT

Over 100 signatories send a warning to the UK Prime Minister.

Read in full

Notes to editors:

The letter has been coordinated by the TUC, Connected by Data and Open Rights Group.

Each organisation will be speaking at the AI and Society Forum taking place at the Wellcome Collection, London NW1 2BE, on Tuesday 31 October.

The list of signatories and the open letter can be found here: https://ai-summit-open-letter.info/

About Open Rights Group: Open Rights Group (ORG) is a UK-based campaigning organisation working to protect digital rights. We exist to promote rights like privacy and free expression online and to challenge threats to our rights through public campaigns, media commentary, legal action, and policy interventions.

About the TUC: The Trades Union Congress (TUC) exists to make the working world a better place for everyone. We bring together the 5.5 million working people who make up our 48 member unions. We support unions to grow and thrive, and we stand up for everyone who works for a living.

About Connected by Data: Connected by Data is a campaign to democratise data and AI, by ensuring a powerful say for communities over decisions that affect them.

Digital Rights, Israel and Palestine https://www.openrightsgroup.org/blog/digital-rights-israel-and-palestine/ Thu, 26 Oct 2023 13:27:12 +0000 https://www.openrightsgroup.org/__preview__/blog/18210 It has been over two weeks since the Hamas attacks and the start of the Israeli offensive on Gaza. To date, 1,400 Israelis and over 5,000 Palestinians have been killed, and the death toll is expected to rise as Israel continues to carry out airstrikes and block supplies to Gaza, including fuel and water, and as attacks on Israel by Hamas and others continue. As a human rights organisation, we condemn the targeting of civilians by all parties, both through indiscriminate terror attacks on Israelis and what the UN has called the “collective punishment” of Palestinians.

As a UK digital rights organisation, we also condemn how these events are being used to infringe the civil liberties of people in this country. This is predominantly targeted at Palestinians and people advocating for a ceasefire and objecting to Israel’s occupation of the Palestinian territories. But we are also seeing the protest rights of Jewish people being undermined. Hate crimes against both Jewish and Muslim people are rising sharply.

Below are some of our key concerns and observations:

Internet restrictions

Internet restrictions in Gaza are contributing to the suffering felt by people there, denying people the ability to communicate, to access life-saving information, and to document what is happening. We join organisations such as Access Now, who have called for the Israeli authorities to stop targeting telecommunication infrastructure and for steps to be taken to restore telecommunications services. The restrictions also impact the flow of information out of Gaza and the world’s understanding of what is happening there. These impacts are felt by people in the UK attempting to find out what is happening to their friends and relatives, heightening their worry and distress.

Misinformation and disinformation

There has been a proliferation of misinformation and disinformation online.

There is a concern that the UK and other governments will push companies to remove content, suppressing legitimate debate among people concerned with the situation in Gaza. The EU Commission has already called for companies to remove disinformation within 24 hours, wrongly asserting that such content is unlawful and wrongly claiming that companies have a duty to remove it.

Rather, companies need to step up their systems while ensuring they are not removing content that is necessary for public debate. In this situation, it is all too easy for tech giants to remove information and suppress discussion, especially in non-European languages.

Sooner or later we will need to address the centralised power of platforms to spread misinformation and to censor content incorrectly. Now is not that time, but in the short term we need to hold them accountable for suppressing legitimate debate, while also combating hate speech and disinformation. We can recognise that this is difficult, but also that their attention-based business models and desire for low costs do not help.

Hate speech, censorship and freedom of expression online

7amleh – The Arab Center for the Advancement of Social Media – documented 19,000 violent tweets out of 23,000 Hebrew tweets on X, formerly known as Twitter, between 7 and 11 October 2023. 7amleh and other human rights organisations have also highlighted the “significant and disproportionate censorship of Palestinian voices” and have called on tech companies “to adhere to business and human rights principles as well as international human rights laws in safeguarding freedom of expression violations”. Instagram users have also accused the platform of censoring pro-Palestine posts.

404 Media reported that Instagram’s “see translation” feature for user bios was auto-translating phrases that included “Palestinian” and “alhamdulillah” into “Praise be to god, Palestinian terrorists are fighting for their freedom.”

Meta has apologised for this mistake, but it highlights how easy it is for errors in algorithms and software to erroneously label people as terrorists, exacerbating racism and discrimination. It also forewarns of the false-positive problems that will arise when legislation like the Online Safety Act is implemented.

Undermining the right to protest

Last week, the Home Secretary, Suella Braverman, wrote a letter to police chiefs which stated: “Behaviours that are legitimate in some circumstances, for example the waving of a Palestinian flag, may not be legitimate such as when intended to glorify acts of terrorism.” Braverman also invited police to “consider whether chants such as ‘From the river to the sea, Palestine will be free’ should be understood as an expression of a violent desire to see Israel erased from the world, and whether its use in certain contexts may amount to a racially aggravated section 5 public order offence.”

This is political interference in policing, which the police seem to be largely resisting. Dame Lynne Owens, the Deputy Commissioner of the Metropolitan Police, clarified in a letter to London’s Jewish communities that “an expression of support for the Palestinian people more broadly, including flying the Palestinian flag, does not, alone, constitute a criminal offence.”

While protests against the occupation of Palestine have taken place across the country, Christian Action Against Anti-Semitism say that they were pressured by police to cancel a “Pray for Israel and the Jewish People” event in North London this weekend. The Met also requested that Campaign Against Antisemitism (CAA) volunteers turn off video vans showing images of Israeli children kidnapped by Hamas. In both cases, the Met have claimed that the decision was made to protect the safety of the protestors. But the police need to protect everyone’s rights to freedom of expression. Our colleagues at Liberty have written a useful explainer of the rights of people in the UK to protest.

Online ‘terrorist’ content

The UK now has dangerous laws within the Online Safety Act that can, once commenced, be used to order companies to remove illegal content, including content related to terrorism. The scope of this content can also be easily extended, while Ofcom can also order companies to employ technologies to find and remove that content before it is posted.

Braverman’s letter to police chiefs emphasised that her directives applied to online content as well as physical protests. It is not unimaginable that Braverman may wish companies to remove content featuring flags which she believes represent terrorism from platforms. This, or other terms or slogans, could be used as a means to “identify” content that is “probably” unlawful.

The potential for suppression of debate is obvious. In the coming months, debates on what constitutes “accurate” identification of terrorist content will begin, through public consultations run by Ofcom.

Scanning private messages

The Online Safety Act gives Ofcom the powers to force tech companies to monitor private messages for child abuse material, even though the government admitted that these powers would not be used because the technology does not exist to do this safely. As ORG warned when the Bill was debated, it would be unsurprising if there were mission creep, with these powers extended to look for terror-related content. While the Online Safety Act does not contain such powers today, extending duties to scan is a matter of politics, as the legal frameworks are largely in place.

Private messages are vital for human rights defenders, ever more so in Israel and Palestine. The UK, however, could become a leader in undermining their ability to work freely.

Prevent duty

Education Secretary Gillian Keegan has written to both schools and universities about the crisis and its impact on children and young people. While the letters recognise the impact of the Hamas attacks on students, they make no reference to the impact of Israel’s bombardment of Gaza and the number of Palestinians who have been killed. Schools need to ensure that all harms are recognised and that all children feel safe to talk about world events that affect them.

The letters also remind educational establishments of their responsibilities under the Prevent duty, which requires them to refer students if they are concerned that they could be drawn into terrorism.

Prevent is a flawed programme that undermines people’s right to be presumed innocent, and which has been shown to disproportionately impact Muslims. Since 2015 there have been more than 45,000 Prevent referrals. The overwhelming majority of these referrals resulted in no further action taken by police, but the data still languishes in police, local authority and other institutional systems, which could have long-term impacts on the individuals affected.

Keegan’s directive could encourage schools and universities to refer students rather than engage in difficult and challenging conversations. This in itself could be counterproductive, confirming fears about the surveillance state and leading affected people to look for more extreme views. We believe schools need to be supported in how they encourage freedom of expression rather than being compelled to act as an arm of state surveillance.

Migrants’ rights

We are concerned that the crisis will be used to dehumanise migrants and further undermine their rights. Earlier this week, the Telegraph reported that Immigration Minister Robert Jenrick has asked civil servants to look into revoking the visas of people who ‘praise Hamas’. At Prime Minister’s Questions, Sajid Javid echoed Jenrick when he asked Rishi Sunak to keep foreign nationals who commit antisemitic and other hate crimes out of the UK. While the government is right to tackle hate crimes, it is not clear whether a conviction would be required before a visa is revoked. This could have a chilling effect on migrants, who might feel that they are unable to comment on the crisis or attend protests for fear of having their visa revoked.

Such decisions could easily be partially automated, or may rely on evidence that is flimsy, such as social media posts. Rapid changes in systems and laws made in crises are rarely fair and balanced. They usually serve the purpose of ensuring that governments are seen to act, rather than genuinely tackling problems with workable solutions.
