More than one-third (34%) of compliance professionals in Ireland say artificial intelligence is making it more challenging for financial institutions to safeguard customer and other sensitive data, while just 7% feel it has made data protection easier.
The findings come from a new survey carried out by the Compliance Institute, Ireland’s professional body for compliance practitioners.
The survey, which gathered responses from approximately 150 compliance professionals working primarily across Irish financial services organisations, explored views on the impact of AI on data protection, as well as the steps firms are taking to comply with new requirements under the EU AI Act to ensure staff have an appropriate level of AI literacy.
Under Article 4 of the EU’s AI Act[1], which came into effect in February 2025, providers and deployers of AI systems are required to ensure that staff, and others acting on their behalf, have a sufficient level of AI literacy to support the responsible use of those systems. This should take into account their technical knowledge, experience, education and training, as well as the context in which the AI systems are used.
When asked whether their organisation has put measures in place to meet this requirement, one in three (32%) compliance professionals said their organisation has fully done so, while 15% indicated that only some steps have been taken. A further one in five (22%) reported that measures are currently being implemented, and three in ten (31%) said no action has yet been taken.
Michael Kavanagh, CEO of the Compliance Institute, commented on the findings:
“AI is increasingly being used in day-to-day operations across the sector, and that is changing how organisations think about governance, oversight and capability. What the results really show is a period of adjustment, where firms are actively building and strengthening the frameworks needed to support the safe and effective use of these technologies alongside their existing regulatory responsibilities.
In practice, AI is often being used across a range of functions at the same time, which can make it more difficult to maintain full visibility over how data is being accessed, processed and shared. As a result, we are seeing firms place much greater focus on governance structures, data mapping and ongoing monitoring, so they can retain the right level of oversight as adoption increases.”
Mr Kavanagh continued:
“Just one in three organisations has already put measures in place to ensure a satisfactory level of AI literacy amongst staff, which indicates that a significant amount of implementation work is still progressing across the sector.
This is very much a transition phase and, while some firms have moved quickly to put structures and processes in place, others are still in the process of developing and embedding the necessary AI literacy frameworks to support compliance in a way that works within their existing governance and risk models.
The focus now is on ensuring that implementation is consistent, practical and fully integrated into day-to-day operations, so that organisations can meet the requirements of the Act in a structured and sustainable way.”