(29 Apr 2025) With their ability to regulate how information is accessed, used, shared and created, copyright laws have a potentially major role in deciding not just how new AI models are trained, but also how they can be used. Bearing in mind libraries’ wider ambitions for – and reservations about – AI, a new IFLA Statement, prepared by the Advisory Committee on Copyright and other Legal Matters and approved by the Governing Board, sets out recommendations for governments and libraries alike.
IFLA’s statement on copyright and artificial intelligence has been prepared by IFLA’s Advisory Committee on Copyright and other Legal Matters, in collaboration with IFLA’s Artificial Intelligence Special Interest Group and others. It sets out key considerations and recommendations, beginning by underlining both the potential of AI to support libraries in achieving their missions and the growing efforts to restrict AI training.
The Statement also notes that some countries already have copyright laws flexible enough to allow for the development of new technologies, and highlights the importance of looking beyond copyright law for solutions to the challenges that do exist.
In terms of recommendations, it calls for advocacy in favor of limitations and exceptions that enable text-and-data mining of legitimately acquired or accessed content, as well as for access to the widest possible datasets as a defense against bias and error.
Libraries too can support this by making collections AI-ready with appropriate licenses and by respecting the FAIR and CARE principles. Crucially, the statement suggests that copyright should not be used as a blunt instrument for addressing ethical issues, and it endorses the ICOLC statement on artificial intelligence and licensing.
The statement underlines the range of practical steps that governments and libraries alike can take, from building capacity and awareness to making ethical choices around AI tools and developing guidance and monitoring capacity.
The statement can be accessed here.
The announcement is here.