Harnessing Innovation and Growth Opportunities from AI Foundation Models

25 March 2025

How UK data protection, competition and consumer protection laws support a diverse range of open and closed-access foundation models

Stephen Almond, ICO Executive Director, Regulatory Risk
Will Hayter, CMA Executive Director, Digital Markets

From spurring scientific advances to enhancing productivity for people and businesses, foundation models (FMs)1 can solve complex problems, drive economic growth and transform our lives for the better. Their adaptability across different contexts and ability to scale can support businesses and consumers in accessing new and improved products and services.

The CMA and ICO have taken forward extensive work to support FM development and deployment in line with our regulatory objectives and remits: 

  • The CMA published an initial review of FMs in 2023 followed by an update report in April 2024. Following extensive consultation, these papers set out six principles to guide the development and deployment of FMs, to promote a thriving AI ecosystem and positive outcomes for competition and consumer protection.2 These principles align with the CMA’s broader objectives to promote fair, competitive and open markets, taking action where necessary to achieve these aims.3 
  • The ICO undertook a series of consultations over the course of 2024 focussing on how specific aspects of data protection law apply to generative AI systems. The ICO’s response to these consultations details its policy positions on the issues raised4, and flags where further work and engagement with industry is needed to develop and inform ICO thinking. Consistent with its regulatory approach, the ICO will provide support and guidance to help organisations comply and innovate, while taking steps to protect people where organisations disregard their responsibilities or cause harm.5

Through the Digital Regulation Cooperation Forum, the CMA and ICO have worked closely for a number of years to enhance regulatory coherence and clarity where our regimes interact. In this article, we clarify our shared positions on open and closed-access FM approaches. In doing so, we outline the business actions we believe will support competitive and innovative FM markets while protecting consumers and respecting people’s information rights.  

What do we mean by open and closed-access FMs? 

There are many ways to release FMs along an open-access or closed-access spectrum, as set out below. Different approaches can entail different benefits and risks.6 

  • An open-access release approach refers to an FM having some or all of its key assets – such as its architecture, code, weights and biases, and data – available for anyone to view, modify and use when the model is released. This can reduce barriers to accessing FMs and allow models to be adapted to a wide variety of use cases across the economy. This approach has similarities to open-source software and can help increase the transparency and auditability of an FM via external reviews that can highlight issues and risks. This could help to support deployers in complying with their own consumer protection responsibilities, by ensuring that information generated by an AI model as part of the promotion or supply of a product to consumers is not misleading.
  • A closed-access release approach involves few or none of the FM’s key assets being open or publicly accessible; instead, access is controlled and shared only to the extent that the FM developer chooses. Businesses may deploy closed-access models for their own internal use and operations, or make them available to external parties – such as on a paid-for basis via an Application Programming Interface (API) – while maintaining control over the FM.

The question of whether an FM follows an open or closed-access approach will not have a binary answer, nor necessarily stay consistent over time. FMs can have different levels of access and subsequent controls. The release and access approach a developer chooses to use for an FM sits on a spectrum of openness according to the operating model and the objectives of the FM developer.

Joint CMA and ICO positions on open and closed-access 

In our engagement with business, we sometimes hear a perception that certain types of release approach are favoured by the CMA or ICO. This is not the case. All released models carry their own benefits and risks wherever they sit on the spectrum of openness. It is for firms to choose the approach that is most appropriate to their specific situation.

The CMA and ICO agree that existing UK data protection, competition, and consumer protection laws can support both open and closed-access models. If all regulatory requirements are complied with, we do not endorse or oppose a particular approach for releasing FMs. Indeed, a diversity of models supports a broad range of deployers to utilise FMs, supporting innovation, and providing the best outcomes for competition and consumers.

What is important is that those who are developing and releasing FMs ensure that, whatever their chosen approach, they have appropriate risk mitigations and safeguards in place to support effective data protection compliance7 and protection for consumers.8

What does this mean for developers and deployers?

As an example of appropriate mitigations, developers releasing an open-access FM trained on personal data should consider the implications of not having control over the downstream use of their model.9 To manage downstream risks where personal data is used, effective technical and organisational measures need to be developed and implemented, in line with a data protection by design and default approach.10 These developers may consider using licences or terms of use to ensure that deployers are using their models in a compliant way. To rely on these types of measures, FM developers will need to demonstrate that these documents and agreements contain data protection requirements and safeguards that are effective in practice.11

Developers releasing closed-access FMs trained on personal data may be able to rely more heavily on technical controls, for example APIs, that help them monitor and guard against data misuse downstream. These technical controls support data protection compliance and can help build consumer confidence. In this case, it is crucial that developers also provide appropriate transparency for users by giving FM deployers the right information about the model, its training data and its risks.
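To make the idea of API-level technical controls concrete, the sketch below shows a minimal, hypothetical gateway in front of a closed-access model endpoint, combining per-key rate limiting with an audit log that a developer could review for signs of downstream misuse. The `ModelAPIGateway` class, its parameters and its placeholder response are illustrative assumptions only, not a design endorsed by either regulator:

```python
import time
from collections import defaultdict, deque

class ModelAPIGateway:
    """Hypothetical gateway wrapping a closed-access FM endpoint.

    Illustrates two technical controls a developer might apply:
    per-key rate limiting and an audit trail of requests, which
    together support monitoring for downstream misuse.
    """

    def __init__(self, max_requests: int, window_seconds: float):
        self.max_requests = max_requests
        self.window = window_seconds
        self._history = defaultdict(deque)  # api_key -> request timestamps
        self.audit_log = []                 # retained for compliance review

    def handle(self, api_key: str, prompt: str) -> str:
        now = time.monotonic()
        history = self._history[api_key]
        # Drop timestamps that have fallen outside the sliding window.
        while history and now - history[0] > self.window:
            history.popleft()
        if len(history) >= self.max_requests:
            self.audit_log.append((api_key, "REJECTED", prompt[:50]))
            raise PermissionError("rate limit exceeded")
        history.append(now)
        self.audit_log.append((api_key, "ALLOWED", prompt[:50]))
        # A real deployment would forward the prompt to the model here;
        # a placeholder response is returned for illustration.
        return f"model response to: {prompt[:50]}"
```

In practice a developer would layer further controls on top of this pattern, such as content filtering and usage-policy checks; the point is that API mediation keeps the model's key assets under the developer's control while still generating the records needed for oversight.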

For both open and closed-access models, transparency about how an FM has been developed is necessary to support deployers in making informed decisions about personal data processing and to help them demonstrate accountability for their own data protection and consumer protection compliance.

Coherence between competition, consumer protection and data protection regimes

Our 2021 Joint Statement12 sets out how our regulatory regimes interact on a number of areas. Many of these are relevant to FMs. 

  • Promoting user choice and control: Those developing and deploying FMs and FM-related products and services need to be open and honest with people about how their personal data is collected and used. Without transparency about how their personal data is processed, it will be hard for people to exercise their information rights and make informed decisions as consumers, such as choosing between different products and services.
  • Creating a level playing field for data access: We agree that access to data is crucial to promoting competition and innovation in FM markets and that data protection law can facilitate fair and proportionate data-sharing and access. We would be concerned if businesses sought to interpret data protection laws as being unduly restrictive of access to personal data in order to shield themselves from competitive pressure and distort competition.
  • Allocating accountability across the supply chain: Proper allocation of accountability and responsibility is key to incentivising businesses to prevent harms and will help foster a market that businesses and people continue to trust. In particular, it is important that developers share sufficient information with deployers about how an FM has been developed, so that deployers can meet their legal duties, including their consumer protection and data protection obligations.

We will continue to work together on these issues to support innovation and investment in FMs.

Next steps

We welcome further engagement with stakeholders on their experiences of AI in general, and on the issues raised in this article in particular. As the development and use of FMs matures, we will provide updated information where new issues arise or our views change. Through dialogue with business and other interested parties, we can help ensure FMs drive benefits for people, businesses and the whole UK economy through greater innovation, productivity and growth.

We at the CMA and ICO will continue to collaborate on a broad range of projects to identify the interactions between our regulatory remits and ensure clarity for stakeholders. This includes working together on online advertising issues, as well as collaboration on Strategic Market Status investigations under the CMA’s new digital markets competition regime.

If stakeholders are interested in engaging with us, please get in touch at:

digitalregulationcooperation@ico.org.uk

[1] Foundation models are base models for AI systems that are trained on large amounts of data. They can generate outputs such as text, images and audio, and be adapted to a range of tasks. 

[2] The six AI principles are: access; diversity; choice; fair dealing; transparency; and accountability. See full detail in AI Foundation Models technical update report.

[3] CMA Prioritisation Principles - GOV.UK (www.gov.uk).

[4] The consultation response does not cover the entirety of our regulatory expectations, which are covered in more detail in our core guidance on AI. See: Artificial intelligence | ICO  

[5] ICO25 – Our regulatory approach

[6] For further analysis and discussion on open and closed-access FMs see: CMA Technical update report (pp. 43-52) and Generative AI fifth call for evidence: allocating controllership across the generative AI supply chain | ICO.

[7] The ICO has set out positions on the allocation of controllership and accountability for data protection compliance across the FM supply chain. This includes consideration of open-access and closed-access approaches. Whatever approach is used for developing and releasing FMs, if an organisation is processing personal data, then data protection law will be applicable. See: Generative AI fifth call for evidence: allocating controllership across the generative AI supply chain | ICO

[8] The CMA’s AI Foundation Models: Initial Report (in particular paragraphs 6.7 to 6.12) sets out the CMA’s general views on compliance with the CPRs and other consumer protection legislation (e.g. unfair terms law).

[9] See the ICO blog “Generative AI fifth call for evidence: allocating controllership across the generative AI supply chain | ICO” for more information on these implications.

[10] Data protection by design and default is a legal requirement. This means that when organisations develop FMs (whether open or closed-access) they must adopt appropriate measures to protect people’s rights from the outset, before they start processing personal data. See: Engineering individual rights into generative AI models | ICO; Data protection by design and default | ICO

[11] The lawful basis for web scraping to train generative AI models | ICO

[12] Our Joint Statement set out our commitment to continue working together to enhance regulatory coherence and clarity where our regimes interact. See: Competition and data protection in digital markets joint statement.
