Intellectual Property and the EU AI Act: Implications for the Space Sector

Posted on November 19, 2024

The European Union’s AI Act, which came into force on 1 August 2024, introduces substantial new requirements for generative AI, in particular regarding the disclosure of copyrighted works used to train these systems. Here’s what this means for AI developers and providers in the space sector.

Background on the AI Act Amendments

The amendments to the AI Act specifically target generative AI, requiring providers to publicly disclose details of the copyrighted works used to train their models. This addresses growing concerns about copyright infringement in the training of generative AI systems. These systems, capable of creating text, images, and other media, have seen explosive growth, exemplified by OpenAI’s ChatGPT and Microsoft’s significant investment in AI.

For the space sector, where AI and machine learning are increasingly used for satellite data analysis, mission planning, and autonomous navigation, these new requirements mean that developers must be transparent about the data sources used to train their models.

New Obligations for Generative AI Providers

  1. Policy for Compliance: Providers of general-purpose AI models must establish policies to comply with Union copyright law, including honouring rights holders’ opt-outs from the EU’s commercial text and data mining (TDM) exception using state-of-the-art technologies (one way of operationalising this is sketched after this list).
  2. Global Scope: The AI Act requires compliance with EU copyright law even if the model is trained outside the EU. This ensures a level playing field by preventing providers from gaining a competitive advantage through lower copyright standards outside the EU.
  3. Disclosure Requirements: Providers must publicly disclose details of the content used for training their models. This includes listing main data collections or sets, such as large private or public databases, while protecting trade secrets and confidential business information. No exceptions are made for open-source models, and those fine-tuning general-purpose models must disclose new training data sources.
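
By way of illustration only, the sketch below shows one way a provider might record training data sources and machine-readable rights reservations so the results can feed a public training-content summary. It is a minimal sketch under stated assumptions: the Act does not prescribe any particular format or tooling, robots.txt is used here merely as one example of an opt-out signal, and the names (TrainingSource, robots_allows_crawling, build_training_summary) are hypothetical.

```python
# Minimal sketch (not legal advice): recording training data sources and
# rights-reservation checks so they can feed a public training-content summary.
# robots.txt is only one possible machine-readable opt-out signal; real
# compliance will also need licence, contract, and TDM reservation checks.
import json
from dataclasses import dataclass, asdict
from datetime import date
from urllib import robotparser
from urllib.parse import urlparse


@dataclass
class TrainingSource:
    name: str               # e.g. "open satellite imagery archive" (illustrative)
    url: str                # landing page or API root of the collection
    licence: str            # licence or legal basis relied on for training
    opt_out_detected: bool  # result of the machine-readable rights check
    checked_on: str         # ISO date of the check, for the audit trail


def robots_allows_crawling(url: str, user_agent: str = "my-training-crawler") -> bool:
    """Return True if the site's robots.txt permits fetching this URL.

    A disallow rule is treated as a rights reservation to be respected;
    if the signal cannot be read, fall back to 'not allowed'.
    """
    parsed = urlparse(url)
    rp = robotparser.RobotFileParser()
    rp.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    try:
        rp.read()
    except OSError:
        return False  # be conservative on network or parsing errors
    return rp.can_fetch(user_agent, url)


def build_training_summary(sources: list[TrainingSource]) -> str:
    """Serialise the source list into a JSON summary suitable for disclosure."""
    return json.dumps([asdict(s) for s in sources], indent=2)


if __name__ == "__main__":
    url = "https://example.org/open-satellite-archive"  # hypothetical source
    source = TrainingSource(
        name="Example open satellite imagery archive",
        url=url,
        licence="CC BY 4.0 (assumed for illustration)",
        opt_out_detected=not robots_allows_crawling(url),
        checked_on=date.today().isoformat(),
    )
    print(build_training_summary([source]))
```

In practice, a provider would keep this audit trail alongside the published summary, so that trade secrets can be withheld from the public version while the full record remains available to demonstrate compliance.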

Broader IP Considerations

The AI Act aims to balance innovation with the protection of fundamental rights, including IP rights. How that balance will be struck in practice remains to be seen, but the legislation emphasises protecting the intellectual property of AI system developers, such as trade secrets, while mandating public disclosure under certain conditions. For the space sector, this means suppliers of AI tools must provide the information their customers need for compliance without compromising their own IP or trade secrets, and the authorities applying the AI Act must likewise protect those interests.

Impact on the UK and Future Developments

Post-Brexit, the UK’s approach to AI regulation has diverged from the EU’s. The UK has not adopted the EU’s commercial text and data mining exception; its own TDM exception is limited to non-commercial research. While the UK Government is consulting on broader exceptions, it has yet to propose obligations for AI providers to disclose the copyrighted works used in training.

The UK aims for a lighter regulatory touch to foster innovation within the AI space. However, this approach may face challenges in a globalised market where the EU’s stringent regulations could set a de facto standard. The UK’s narrower exceptions and different regulatory frameworks could impact its competitiveness in AI development compared to the EU.

Practical Steps for AI Developers in the UK

UK-based AI developers in the space sector should collaborate closely with IP lawyers to navigate the complex landscape of training data and copyright compliance.

Key steps include:

  1. Identifying necessary works and applicable copyright restrictions.
  2. Evaluating the extraction of protected data and potential infringement risks.
  3. Considering jurisdiction-specific strategies for training and hosting models.
  4. Implementing safeguards like filters and human oversight to prevent infringing outputs (see the sketch after this list).
  5. Documenting compliance efforts meticulously to adhere to regulatory requirements.
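
On step 4, one very simple safeguard is to compare generated text against known protected reference material and hold close matches back for human review. The sketch below, built on Python’s standard difflib, is illustrative only: the 0.8 threshold, the tiny in-memory corpus, and the function names are assumptions, and production systems would rely on far more robust similarity, provenance, and licensing checks.

```python
# Minimal sketch of an output safeguard: flag generated text that is
# suspiciously similar to known protected reference material for human review.
# The 0.8 threshold and the in-memory corpus are illustrative assumptions.
from difflib import SequenceMatcher

PROTECTED_REFERENCES = [
    # In practice this would be a managed corpus of material the model must
    # not reproduce (e.g. licensed mission documentation, third-party reports).
    "Example protected passage about orbital mechanics and mission design.",
]


def similarity(a: str, b: str) -> float:
    """Rough character-level similarity in [0, 1] using difflib."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def flag_for_review(generated: str, threshold: float = 0.8) -> bool:
    """Return True if the output should be held back for human oversight."""
    return any(similarity(generated, ref) >= threshold for ref in PROTECTED_REFERENCES)


if __name__ == "__main__":
    draft = "Example protected passage about orbital mechanics and mission design."
    if flag_for_review(draft):
        print("Output withheld pending human review.")
    else:
        print("Output released.")
```

Logging each review decision also supports step 5, since it produces a contemporaneous record of the compliance effort.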

Compliance Deadlines

This landmark regulation takes a risk-based approach, categorising AI applications into levels: minimal, limited, high, and unacceptable risk. Compliance deadlines vary by risk category: bans on “unacceptable” applications take effect six months after entry into force (from 2 February 2025), while obligations for general-purpose AI models, supported by a Code of Practice, apply from twelve months (2 August 2025). Most other obligations will apply two years after the Act’s entry into force (from 2 August 2026), allowing time for companies and national authorities to align their practices with the new regulations.

Conclusion

  • The EU AI Act represents a significant step towards regulating the use of generative AI and protecting intellectual property (IP) rights.
  • Companies in the space sector must adapt to these new requirements, carefully considering where and how they train their models to ensure compliance with EU law. This will involve detailed documentation, transparent disclosure of training data, and robust compliance policies to navigate the complex landscape of AI and IP.
  • The UK’s different regulatory approach adds another layer of complexity, emphasising the need for thorough legal guidance and strategic planning in AI development and deployment.

If you need help navigating this intellectual property ‘space’, feel free to contact ip21.
