What characterises an artificial intelligence system is its ability to develop elements of reasoning autonomously, that is to say independently of human beings. To do so, data are fed into the system to serve as the basis for its reasoning. Logically, Big Data is often used, since it provides an extremely large body of data. For example, Tesla drew on Big Data to design Autopilot, a driving-assistance feature developed as part of its Autonomous Cars project[i].
Big Data also contains personal data, which Article 2 of the amended French Data Protection Act of 6 January 1978 defines as « any information relating to a natural person who is identified or who can be identified, directly or indirectly, by reference to an identification number or to one or more elements specific to that person. In determining whether a person is identifiable, all means of identification available to or accessible by the data controller or any other person should be considered »[ii].
Moreover, anonymising personal data is far from easy, since Big Data exponentially increases the possibilities of cross-referencing between data sets, and consequently the chances of identifying a person through his or her data[iii].
Nevertheless, Big Data will now have to contend with the new European General Data Protection Regulation (GDPR), which was adopted on 27 April 2016 and will apply from 25 May 2018. The aim of this text is to ensure that European States adapt their legal frameworks to the new realities arising from digital developments. It rests on three pillars: strengthening the rights of individuals, making those who process data accountable and, finally, lending regulation greater credibility, in particular through better cooperation between data protection authorities, including the CNIL[iv]. These authorities have several important powers, among them the power to issue warnings, to serve formal notice on a company and to order the limitation, rectification or erasure of data.
We should also highlight the possibility for these authorities to impose extremely heavy penalties on entities that fail to comply with the Regulation: fines of up to EUR 10 million or EUR 20 million or, where the offender is an undertaking, up to 2% or 4% of its worldwide annual turnover, whichever is higher, depending on the provisions infringed.
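The two-tier ceiling described above can be sketched as a small calculation. This is purely illustrative: the function name and interface are hypothetical, and only the thresholds (EUR 10 million / 2% and EUR 20 million / 4%, whichever is higher for an undertaking) come from the Regulation.

```python
def fine_ceiling(worldwide_turnover_eur: float, severe: bool) -> float:
    """Return the theoretical maximum fine in euros for an undertaking.

    The ceiling is the *higher* of the fixed amount and the percentage
    of worldwide annual turnover, with the tier depending on which
    provisions were infringed.
    """
    cap = 20_000_000 if severe else 10_000_000   # fixed amount per tier
    rate = 0.04 if severe else 0.02              # turnover percentage per tier
    return float(max(cap, rate * worldwide_turnover_eur))

# An undertaking with EUR 2 billion worldwide turnover, upper tier:
print(fine_ceiling(2_000_000_000, severe=True))   # 80000000.0 (4% exceeds EUR 20M)
# A smaller undertaking, lower tier: the fixed amount is the higher figure.
print(fine_ceiling(300_000_000, severe=False))    # 10000000.0
```

The `max(...)` call captures the "whichever is higher" rule: for large companies the percentage dominates, which is what gives the GDPR its deterrent force.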
During the IA Paris event on 7 June 2017, the lawyer Alain Bensoussan explained to companies that they would have to « empty the Data Lake of personal data that is not accompanied by the explicit consent of the persons concerned, otherwise there is a very serious risk of conviction »[v].
To avoid severe criticism from the data protection authorities, a number of rules have been imposed on data controllers, that is, on all persons who determine the purposes and means of processing.
First of all, they must apply the principle of « privacy by design », in other words build the protection of personal data into the product or service from the design stage onwards.
Furthermore, companies responsible for processing will be subject to an accountability principle, which means that they will have to establish a form of data governance so as to be able to account for their actions to the protection authorities on request. It follows from this principle that the prior administrative declarations once required for processing operations that do not put individuals' privacy at risk are no longer relevant under the Regulation.
Processing operations that pose risks to privacy are those concerning racial or ethnic origin, political, philosophical or religious opinions, trade-union membership, health or sexual orientation, or the genetic or biometric data of the individual concerned. The controller of this type of processing will be obliged to carry out a privacy impact assessment setting out « the characteristics of the processing, the risks and the measures adopted ». The protection authorities will have a say where the risk is high and will even be able to oppose the processing.
Data controllers must also handle the data they collect securely and confidentially, and are obliged to notify the competent protection authority in their country if they discover a personal data breach. This notification must be made within 72 hours of becoming aware of the breach.
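The 72-hour window is a simple deadline calculation that a controller's incident-response tooling might track. A minimal sketch, assuming only that the clock starts when the breach is detected (the function name is hypothetical):

```python
from datetime import datetime, timedelta

# The 72-hour notification window set out in the Regulation.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(detected_at: datetime) -> datetime:
    """Latest moment at which the supervisory authority must be notified."""
    return detected_at + NOTIFICATION_WINDOW

# A breach detected on 1 June 2018 at 09:30 must be notified
# by 4 June 2018 at 09:30 at the latest.
detected = datetime(2018, 6, 1, 9, 30)
print(notification_deadline(detected))  # 2018-06-04 09:30:00
```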
It is also worth underlining the obligation for certain data controllers to designate a Data Protection Officer (DPO). Under the GDPR, this appointment becomes mandatory in three cases: when the controller operates within the public sector, when its activities entail regular and systematic large-scale monitoring of individuals, or when it processes so-called sensitive data or data relating to criminal convictions or offences[vi].
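The three appointment cases amount to a simple decision rule. As a hedged sketch, with entirely hypothetical field names standing in for the legal criteria:

```python
from dataclasses import dataclass

@dataclass
class Controller:
    # Hypothetical flags mirroring the three GDPR triggers described above.
    public_sector: bool
    large_scale_monitoring: bool
    sensitive_or_criminal_data: bool

def dpo_required(c: Controller) -> bool:
    """True if any of the three triggers for a mandatory DPO applies."""
    return (c.public_sector
            or c.large_scale_monitoring
            or c.sensitive_or_criminal_data)

# A private company that systematically monitors individuals at scale:
print(dpo_required(Controller(False, True, False)))  # True
```

The disjunction matters: a single trigger suffices, so a private company with no sensitive data still needs a DPO if it monitors individuals at scale.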
The DPO's mission will be to inform and advise the controller that employs them, to ensure compliance with the law on the protection of personal data and, finally, to ensure cooperation with the data protection authority.
How the provisions of the GDPR are to be reconciled with the needs of artificial intelligence will therefore be a significant issue in the coming months.