Dr Gemma Galdon-Clavell is a leading voice on technology ethics and algorithmic accountability. She is the founder and CEO of Eticas Consulting, where she leads the management, strategic direction and execution of the Eticas vision. Gemma is also the head consultant for health data protection within the Horizon 2020 MOOD project, a role that is central to safeguarding privacy, since privacy and data protection are fundamental rights.
One of the sectors where AI has been embedded most rapidly, especially in the past few years, is healthcare. The potential advantages in time management and efficiency are enormous, so enormous that we risk missing the point: caring.
Applying ethics and oversight is critical when handling individuals' data that is this sensitive and impactful. At the MOOD science webinar on July 25th, Gemma explored the complexity behind these data and systems and shared best practices with the project partners.
To achieve data protection, the EU's General Data Protection Regulation (GDPR) advises respecting two principles when building AI systems: data minimisation and data protection, neither of which should ever be used to hide bias or avoid accountability.
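As a concrete illustration of data minimisation, the hypothetical Python sketch below (field names and the salt handling are assumptions for illustration, not MOOD specifics) keeps only the fields an analysis actually needs and pseudonymises the direct identifier with a salted hash:

```python
import hashlib

# Assumption: in a real system the salt would live in a secrets store, not in code.
SALT = "replace-with-a-secret-salt"

def minimise(record, needed_fields):
    """Keep only the fields required for the analysis (data minimisation)."""
    return {k: v for k, v in record.items() if k in needed_fields}

def pseudonymise(record, id_field="patient_id"):
    """Replace the direct identifier with a salted hash (pseudonymisation)."""
    out = dict(record)
    token = hashlib.sha256((SALT + str(out[id_field])).encode()).hexdigest()
    out[id_field] = token
    return out

raw = {"patient_id": "P-1042", "name": "Jane Doe", "age": 54, "diagnosis": "flu"}
clean = pseudonymise(minimise(raw, {"patient_id", "age", "diagnosis"}))
# 'name' never enters the pipeline, and 'patient_id' is no longer a direct identifier
```

Note that under the GDPR pseudonymised data is still personal data; this only reduces, rather than removes, the risk to data subjects.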
As for privacy, it should be safeguarded by data governance, which ensures accuracy and representativeness and also protects individuals and allows them to manage their own personal data. Appropriate data protection can also help build trust, both in data sharing and in the uptake of sharing models. When considering these topics, it is important to keep an eye on the ethical issues raised by AI systems' processing of personal data and their use of non-personal data.
To achieve these ethical requirements, Privacy and Data Governance should follow these guidelines (extract from “Ethics By Design and Ethics of Use Approaches for Artificial Intelligence” by the European Commission):
- The AI systems MUST process personal data in a lawful, fair and transparent manner.
- Appropriate technical and organizational measures MUST be set in place to safeguard the rights and freedoms of data subjects.
- Strong security measures MUST be set in place to prevent data breaches and leakages.
- Data should be acquired, stored and used in a manner which can be audited by humans.

All EU-funded research must comply with relevant legislation and the highest ethical standards. This means that all Horizon Europe beneficiaries must apply the principles enshrined in the GDPR.
That includes projects such as the Horizon 2020 MOOD project.
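The guideline that data use should be auditable by humans can be made concrete with an access log. The sketch below is a minimal hypothetical Python example (the function and field names are assumptions, not part of any MOOD system): every read of a record leaves a human-readable trail of who accessed it, when, and for what purpose.

```python
import datetime
import json

# Assumption: a real audit log would be append-only, tamper-evident storage.
AUDIT_LOG = []

def audited_read(store, record_id, user, purpose):
    """Return a record while logging who accessed it, when, and why."""
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "record_id": record_id,
        "purpose": purpose,
    })
    return store.get(record_id)

store = {"P-1042": {"age": 54, "diagnosis": "flu"}}
record = audited_read(store, "P-1042", user="analyst1", purpose="outbreak model")
print(json.dumps(AUDIT_LOG[-1], indent=2))  # the trail a human auditor would review
```

Requiring a stated purpose at the point of access is what links the technical log back to the GDPR's lawfulness and transparency principles.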
Watch the full video here to learn more!