The aim of this discussion is to identify the main issues surrounding the AI Act requirements related to open source software, as well as their possible solutions.
The AI Act sets out specific rules for open source artificial intelligence, while exempting it from certain obligations under defined conditions.
In general terms, there are no exceptions for open source AI systems that fall into the high-risk or unacceptable-risk categories. Likewise, providers of general-purpose AI (GPAI) models with systemic risk released under a free and open source licence are not exempt from the obligations of the Act.
However, third parties can be excluded from the scope of the regulation if they make certain elements, other than GPAI models, available under a free and open source licence.
Although the rules seem straightforward at first glance, they raise several issues that may give rise to doubt and debate, such as:
- The exemption of third parties conditional on non-monetisation of their products
- The exemption for GPAI models intended for research
- Complexity of decentralized contributions from individual developers in open source projects
- Legal risk and uncertainty for developers