Frequently Asked Questions
-
Here are the key factors that determine whether GDPR applies to your business:
You are subject to GDPR if:
Your business is established in the EU/EEA, regardless of where your data processing takes place
Your business is outside the EU but offers goods or services to people in the EU (even for free)
Your business is outside the EU but monitors the behaviour of people in the EU (e.g. tracking, profiling, cookies)
You process personal data on behalf of an EU-based organisation (as a data processor)
Personal data includes:
Names, email addresses, phone numbers
IP addresses, cookie identifiers, location data
Health, financial, biometric or genetic data
Any information that can identify a living individual, directly or indirectly
GDPR applies regardless of:
The size of your business (no minimum size threshold)
Whether processing is automated or manual (if data is held in a structured filing system)
Whether you charge for your services or offer them free of charge
Limited exemptions may apply if:
You process personal data for purely personal or household activity
You are a small business processing only employee data with no EU customers or users (though other national laws may still apply)
Bottom line: If you collect, store, use or share any personal data relating to people in the EU — GDPR almost certainly applies to you.
GDPR Applies to AI in Full
GDPR does not contain a carve-out for artificial intelligence. If an AI system processes personal data about individuals in the EU — at any stage, whether during training, testing, or live deployment — GDPR applies in full. The regulation is technology-neutral by design, and AI is no exception.
Using AI Tools in Your Business
Many organisations are now using AI tools — chatbots, document analysis tools, AI-assisted recruitment platforms, customer service automation, and more. Each of these involves personal data and triggers GDPR obligations. Before deploying any AI tool, you must identify what personal data it processes, on what legal basis, for what purpose, and with what safeguards in place.
Your Vendor is Likely a Data Processor
Where you use a third-party AI tool that processes personal data on your behalf, that vendor is a data processor under GDPR. You are the data controller. This means you must have a Data Processing Agreement (DPA) in place with the vendor before any processing begins, and you remain responsible for ensuring the vendor meets GDPR standards. Do not assume compliance — ask for evidence.
Legal Basis is Required for Every Use
Every AI processing activity involving personal data requires a lawful basis under Article 6. Depending on the context, this might be legitimate interests, contract performance, or consent. Special category data — such as health information, biometric data, or data revealing racial or ethnic origin — processed by AI systems requires both an Article 6 basis and a separate condition under Article 9. This is a high bar and must be carefully assessed.
A DPIA is Almost Always Required
Deploying AI that processes personal data — particularly at scale, or involving profiling, automated decision-making, or sensitive data — will typically constitute high-risk processing under Article 35. This triggers a mandatory Data Protection Impact Assessment before the system goes live. The DPIA must identify risks, assess their severity, and document the measures taken to mitigate them.
Automated Decision-Making Carries Specific Rules
Article 22 of GDPR gives individuals the right not to be subject to decisions made solely by automated means that produce legal or similarly significant effects — such as loan decisions, recruitment screening, or insurance pricing. Where AI is used in this way, you must either obtain explicit consent, establish contractual necessity, or rely on a specific legal authorisation. In all cases, individuals must be able to request human review of the decision.
Transparency is Non-Negotiable
Individuals must be informed when AI is being used to process their data, what it is being used for, and — where automated decision-making applies — meaningful information about the logic involved. Vague references to "automated systems" or "algorithms" in a privacy policy do not satisfy this requirement. The explanation must be clear enough for an ordinary person to understand what is happening and why.
Data Minimisation Applies to AI
AI systems have an appetite for data — more data often means better performance. GDPR pushes back against this. You must only use the personal data that is actually necessary for the specific purpose. Feeding an AI system with excessive or irrelevant personal data because it might improve results is not compatible with the data minimisation principle under Article 5.
Individuals' Rights Are Technically Challenging
Where personal data has been used to train an AI model, individuals' rights — particularly the right of access and the right to erasure — become technically complex. If a model has been trained on personal data and that data cannot easily be extracted or deleted, you face a significant compliance problem. This must be considered at the design stage, before training begins.
The EU AI Act Runs Alongside GDPR
The EU AI Act, which is being phased in from 2024 to 2027, adds a separate but complementary layer of obligations. It classifies AI systems by risk level — from minimal to unacceptable — and imposes strict requirements on high-risk systems including transparency, human oversight, accuracy, and robustness. GDPR and the AI Act must be considered together; compliance with one does not guarantee compliance with the other.
Data Protection by Design
Article 25 of GDPR requires data protection to be built into AI systems from the earliest design stage — not addressed as an afterthought. This means choosing privacy-preserving approaches where possible, limiting data collection, implementing strong access controls, and conducting privacy reviews throughout development and deployment.
AI does not exist outside GDPR — it sits squarely within it. Businesses using AI must identify their legal basis, carry out DPIAs, put vendor agreements in place, respect individuals' rights, and ensure transparency. With the AI Act also coming into force, the regulatory landscape around AI is becoming more demanding, not less. Building compliance in from the start is far more effective than retrofitting it later.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to your business, contact us now.
-
The short answer is: probably yes, if your website uses cookies beyond those that are strictly necessary to make it function.
Cookie banners are required under ePrivacy legislation (in Ireland, the ePrivacy Regulations 2011, which implement the EU ePrivacy Directive and work alongside GDPR). This law requires that you obtain prior, informed consent before placing non-essential cookies on a user's device.
Strictly necessary cookies — such as those that keep a user logged in or remember items in a shopping cart — do not require consent. However, you cannot use this exemption broadly; it applies only to cookies that are technically essential for the service the user has requested.
Any other type of cookie requires consent before being set. This includes analytics cookies (e.g. Google Analytics), advertising or tracking cookies, social media cookies, and performance or personalisation cookies. Consent must be freely given, specific, informed, and unambiguous — a pre-ticked box or continued browsing does not count.
Your cookie banner must give users a genuine choice. Regulators across the EU, including the Irish Data Protection Commission (DPC), have been clear that accepting cookies must be as easy to do as refusing them. A banner that only offers an "Accept All" button, with no equally prominent option to reject, is not compliant.
You should also provide a Cookie Policy on your website that explains what cookies you use, their purpose, who sets them, and how long they last.
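The consent rule above comes down to simple gating logic: a cookie may only be set if it is strictly necessary, or if the user has already given consent for its category. A minimal illustrative sketch (the category names and the may_set_cookie helper are our own, not from any standard or library):

```python
# Non-essential cookie categories always require prior consent.
NON_ESSENTIAL = {"analytics", "advertising", "social_media", "personalisation"}

def may_set_cookie(category: str, consented: set[str]) -> bool:
    """Return True only if the cookie is strictly necessary, or the
    user has given prior consent for this category."""
    if category == "strictly_necessary":
        return True  # e.g. login session, shopping-cart cookies
    return category in consented

# Before the user interacts with the banner, nothing non-essential is set:
print(may_set_cookie("strictly_necessary", set()))  # True
print(may_set_cookie("analytics", set()))           # False
# After the user ticks "analytics" in the banner:
print(may_set_cookie("analytics", {"analytics"}))   # True
```

The point of the sketch is the default: absence of consent means the cookie is not set, which is why pre-ticked boxes and "continued browsing" fail the test.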
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to your business, contact us now.
-
GDPR gives regulators significant powers to fine organisations that breach the rules. There are two tiers of administrative fines, depending on the seriousness of the infringement.
The lower tier covers less severe breaches — such as failing to maintain proper records of processing activities, not having appropriate data processing agreements in place, or failing to notify a data breach in time. Fines at this level can reach up to €10 million, or 2% of global annual turnover, whichever is higher.
The upper tier applies to more serious violations — such as breaching the core principles of data processing, failing to have a lawful basis for processing, violating individuals' rights, or transferring personal data internationally without adequate safeguards. These fines can reach up to €20 million, or 4% of global annual turnover, whichever is higher.
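The "whichever is higher" rule is worth spelling out, because for large companies the percentage cap quickly exceeds the fixed figure. A short illustrative calculation (these are the statutory maximums only; actual fines are set case by case by regulators):

```python
def max_fine_eur(turnover_eur: float, upper_tier: bool) -> float:
    """Illustrative GDPR administrative fine cap: the fixed amount or a
    percentage of global annual turnover, whichever is higher."""
    if upper_tier:
        # Serious infringements: up to €20 million or 4% of turnover
        return max(20_000_000, 0.04 * turnover_eur)
    # Less severe infringements: up to €10 million or 2% of turnover
    return max(10_000_000, 0.02 * turnover_eur)

# A group with €2bn global turnover: 4% (€80m) exceeds the €20m figure.
print(max_fine_eur(2_000_000_000, upper_tier=True))  # 80000000.0
# A small firm with €5m turnover: the €20m fixed cap is the higher figure.
print(max_fine_eur(5_000_000, upper_tier=True))      # 20000000
```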
Fines are not automatic. Regulators consider a range of factors before imposing a penalty, including the nature and severity of the breach, whether it was intentional or negligent, the number of people affected, the steps taken to mitigate harm, and how cooperative the organisation was during the investigation.
Beyond fines, regulators have other enforcement powers. These include issuing warnings and reprimands, ordering organisations to stop processing data, requiring data to be deleted, and imposing temporary or permanent bans on processing.
Individuals also have rights under GDPR. A person who suffers material or non-material damage as a result of a GDPR breach — such as financial loss, distress, or reputational harm — can seek compensation through the courts, independent of any regulatory action.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to your business, contact us now.
-
Under GDPR, you must have a valid legal reason — known as a "lawful basis" — before you can process anyone's personal data. Processing without one is unlawful, regardless of your intentions. There are six lawful bases in total, and you must identify the most appropriate one before processing begins, as you cannot swap between them after the fact.
Consent means the individual has freely given a clear, specific, and unambiguous agreement to their data being processed. It must be as easy to withdraw as it is to give, and pre-ticked boxes or assumed agreement do not qualify. Consent is not always the most appropriate basis and is often over-relied upon by organisations.
Contract applies where processing is necessary to perform a contract with the individual, or to take steps at their request before entering into one. For example, processing a customer's address in order to deliver goods they have purchased.
Legal obligation covers situations where you are required to process personal data to comply with a law or regulation. For example, retaining employee payroll records to meet tax obligations.
Vital interests is a narrow basis that applies where processing is necessary to protect someone's life. It is intended for emergency situations and is rarely the appropriate basis for routine business processing.
Public task applies to public authorities or organisations carrying out tasks in the public interest, where those tasks are laid down in law. It is generally not relevant to most private businesses.
Legitimate interests is the most flexible basis and can apply where you have a genuine and proportionate business reason to process personal data, provided that reason is not overridden by the rights and interests of the individual. It requires you to carry out and document a Legitimate Interests Assessment (LIA) to demonstrate the balance has been considered.
Choosing the right basis matters. It affects the rights individuals can exercise against you — for example, the right to erasure applies differently depending on which lawful basis you rely on. You should document your chosen basis in your Records of Processing Activities and communicate it clearly in your Privacy Notice.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to your business, contact us now.
-
Under GDPR, individuals — referred to as "data subjects" — have a set of rights over their personal data. These rights give people greater control over how their information is collected, used, and stored. As an organisation that processes personal data, you are obliged to respect and facilitate these rights.
The right to be informed means individuals must be told how and why their personal data is being used, usually through a clear and accessible Privacy Notice. This information must be provided at the time the data is collected.
The right of access — commonly known as a Data Subject Access Request (DSAR) or Subject Access Request (SAR) — allows individuals to request a copy of the personal data you hold about them, along with information about how it is being used. You generally have one month to respond, and in most cases you cannot charge a fee.
The right to rectification entitles individuals to have inaccurate or incomplete personal data corrected without undue delay. If you have shared that data with third parties, you must also inform them of the correction where possible.
The right to erasure — sometimes called the "right to be forgotten" — allows individuals to request the deletion of their personal data in certain circumstances, such as where the data is no longer necessary for the purpose it was collected, or where consent is withdrawn and there is no other lawful basis to continue processing.
The right to restrict processing allows individuals to request that you limit how you use their data in certain situations — for example, while the accuracy of data is being contested, or where they have objected to processing and a decision is pending.
The right to data portability allows individuals to receive their personal data in a commonly used, machine-readable format, and to have it transferred directly to another organisation where technically feasible. This right applies only where processing is based on consent or contract, and is carried out by automated means.
The right to object gives individuals the ability to object to processing based on legitimate interests or for direct marketing purposes. Where an objection to direct marketing is raised, you must stop processing the data for marketing purposes immediately; there are no exceptions.
Rights related to automated decision-making and profiling protect individuals from being subject to decisions made solely by automated processes — including profiling — where those decisions have a significant or legal effect on them. In such cases, individuals have the right to request human intervention and to challenge the decision.
You must have clear processes in place to handle these requests. Rights requests should be responded to within one month, though this can be extended by a further two months in complex cases. You cannot ignore requests, and failing to respond adequately can lead to complaints to the Data Protection Commission and potential enforcement action.
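The one-month deadline above is a calendar-month deadline, which creates an edge case: a request received on 31 January cannot have a "31 February" deadline. A small sketch of the date arithmetic, using only the Python standard library (how exactly "one month" is counted is itself a legal question — this simply adds calendar months and clamps to the end of shorter months, which is a common illustrative reading):

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of shorter months."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

def response_deadline(received: date, complex_case: bool = False) -> date:
    """One month to respond, extendable by two further months in
    complex cases (three months in total)."""
    return add_months(received, 3 if complex_case else 1)

print(response_deadline(date(2025, 1, 31)))                     # 2025-02-28
print(response_deadline(date(2025, 1, 15), complex_case=True))  # 2025-04-15
```

Note that if you rely on the extension, you must still tell the individual within the first month that more time is needed and why.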
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to your business, contact us now.
-
Handling a Data Breach Under GDPR
Identify and Contain
As soon as a breach is discovered, act immediately to contain it — isolate affected systems, stop the source of the breach, and preserve evidence. Every hour counts, as your 72-hour notification clock starts from the moment you become "aware" of the breach.
Assess the Risk
Evaluate what personal data was affected, how many individuals are impacted, and the likely consequences (e.g. identity theft, financial loss, reputational harm). Not every breach requires notification — only those likely to result in a risk to individuals' rights and freedoms.
Notify the DPC (within 72 hours)
If the breach poses a risk to individuals, you must notify your supervisory authority — in Ireland, the Data Protection Commission (DPC) — within 72 hours of becoming aware. If you can't provide full details in time, submit an initial notification and follow up. Late notifications must include a reason for the delay.
Notify Affected Individuals (if high risk)
If the breach is likely to result in a high risk to individuals (e.g. sensitive data exposed, financial data stolen), you must also notify those individuals directly — without undue delay. The communication must be clear, plain-language, and explain what happened and what steps they can take.
Document Everything
Under Article 33(5), you are required to document all breaches — even those you decide don't require DPC notification. Your internal breach register should record the facts, effects, and remedial actions taken.
Review and Remediate
Once the immediate crisis is managed, conduct a post-breach review. Update your security measures, retrain staff if human error was a factor, and revise your incident response procedures to prevent recurrence.
Key timeframes to remember:
72 hours — notify the DPC
Without undue delay — notify affected individuals if high risk
Immediately — begin containment and internal documentation
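One trap in the 72-hour rule is that it is measured in hours, not working days, so the clock keeps running over weekends and holidays. A tiny sketch of the calculation (illustrative only; "awareness" is a legal question, not a timestamp your systems produce):

```python
from datetime import datetime, timedelta

def dpc_notification_deadline(aware_at: datetime) -> datetime:
    """The 72-hour clock starts when you become 'aware' of the breach
    and runs continuously, including over weekends."""
    return aware_at + timedelta(hours=72)

# Breach discovered late on a Friday afternoon...
aware = datetime(2025, 6, 6, 17, 30)
# ...means the DPC deadline lands on Monday evening, not mid-week.
print(dpc_notification_deadline(aware))  # 2025-06-09 17:30:00
```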
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to your business, get our free download 10 steps here or contact us now.
-
The short answer is yes: GDPR applies fully to any processing of personal data, regardless of the technology used. If an AI system or large language model (LLM) processes, generates, stores, or analyses personal data about EU individuals, GDPR applies. The regulation does not exempt AI — it is technology-neutral by design.
Training Data is a Key Risk Area
LLMs are trained on vast datasets that may contain personal data scraped from the internet — names, emails, opinions, and more. Collecting and using this data for training constitutes "processing" under GDPR and requires a valid legal basis under Article 6. This is one of the most contested compliance issues in AI today.
AI Outputs Can Constitute Personal Data
If an AI model can generate or reproduce information that identifies a living individual — even indirectly — that output may itself be personal data. This creates obligations around accuracy, data minimisation, and individuals' rights.
Individuals Have Rights Over Their Data
Under GDPR, individuals have the right to access, rectify, and erase their personal data (Articles 15–17). Applying these rights to AI systems is technically challenging — particularly the right to erasure ("right to be forgotten") where personal data may be embedded within a model's weights.
Automated Decision-Making Rules Apply
Article 22 GDPR gives individuals the right not to be subject to solely automated decisions that have significant effects on them. If an AI system makes decisions about people — in hiring, lending, insurance, or similar — specific safeguards and human oversight must be in place.
Data Protection by Design is Required
Article 25 requires data protection to be built into AI systems from the outset — not bolted on afterwards. This means minimising data collection, implementing access controls, and considering privacy implications at the design stage.
A DPIA is Likely Required
Where AI processing is likely to result in high risk to individuals — particularly when using sensitive data, profiling, or large-scale processing — a Data Protection Impact Assessment (DPIA) under Article 35 is mandatory before the processing begins.
The AI Act Adds Another Layer
Alongside GDPR, the EU AI Act introduces additional obligations based on the risk level of the AI system. High-risk AI systems face strict requirements around transparency, human oversight, and accountability. GDPR and the AI Act operate together and must both be considered.
Bottom line: GDPR applies to AI and LLMs in full. Businesses deploying or developing AI systems that touch personal data must assess their legal basis, conduct DPIAs where required, respect individuals' rights, and ensure data protection is embedded in the system from design through to deployment.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things AI and your business, contact us now.
-
What is a DPIA?
A Data Protection Impact Assessment (DPIA) is a structured process used to identify and minimise the data protection risks of a project or system before it goes live. It is a requirement under Article 35 of GDPR and is one of the key tools for demonstrating accountability — showing that your organisation has actively considered privacy risks and taken steps to address them.
What Does a DPIA Involve?
A DPIA should describe the processing activity, assess its necessity and proportionality, identify the risks to individuals, and set out the measures you will put in place to mitigate those risks. It is not a one-off tick-box exercise — it should be a living document reviewed as the project evolves.
When is a DPIA Mandatory?
Under Article 35, a DPIA is required where processing is "likely to result in a high risk" to individuals. The GDPR specifically requires a DPIA in three scenarios: large-scale systematic monitoring of public areas, large-scale processing of special category data, and automated decision-making that produces significant effects on individuals.
Other Triggers for a DPIA
The Data Protection Commission (DPC) has published a list of processing activities that always require a DPIA in Ireland. Beyond those, a DPIA is generally needed when you are profiling individuals at scale, processing children's data, using new technologies, combining datasets in ways individuals would not expect, or processing data that could result in physical, financial, or reputational harm.
Does AI Trigger a DPIA?
Almost always, yes. Deploying an AI or machine learning system that processes personal data — particularly one involving profiling, automated decisions, or sensitive data — will typically meet the high-risk threshold and require a DPIA before deployment.
What if the Risks Cannot be Mitigated?
If your DPIA identifies high risks that cannot be reduced to an acceptable level through technical or organisational measures, you must consult the DPC before proceeding. This is known as prior consultation under Article 36. Proceeding without doing so is a breach of GDPR.
Who Should Carry Out the DPIA?
The Data Controller is responsible for ensuring the DPIA is carried out. Your Data Protection Officer (DPO), if you have one, must be consulted as part of the process. In practice, DPIAs are often completed by a team including legal, IT, and operational staff, with the DPO providing oversight.
When Should You Do a DPIA?
A DPIA must be completed before the processing begins — not after. Starting a high-risk project without a DPIA in place is itself a GDPR violation, regardless of whether any harm actually occurs.
If you are starting a new project, deploying a new technology, or significantly changing how you process personal data — and there is any possibility of high risk to individuals — carry out a DPIA first. It protects individuals, demonstrates accountability, and could save your organisation from significant regulatory scrutiny.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things Data Protection and your business, contact us now.
-
Why Cross-Border Transfers Are Regulated
GDPR restricts the transfer of personal data to countries outside the European Economic Area (EEA) to ensure that the level of protection afforded to individuals under EU law travels with their data. Simply put, you cannot export personal data to a third country and leave individuals' rights behind.
Adequacy Decisions
This is the simplest route. The European Commission can formally decide that a third country provides an adequate level of data protection. Where an adequacy decision exists, data can flow freely without any additional safeguards. Countries with adequacy decisions include the UK, Japan, Canada (partially), and Switzerland. The EU-US Data Privacy Framework, adopted in 2023, currently provides adequacy for certified US organisations.
Standard Contractual Clauses (SCCs)
Where no adequacy decision exists, the most commonly used mechanism is Standard Contractual Clauses — pre-approved contract terms issued by the European Commission that bind both the sender and receiver to GDPR-equivalent protections. Updated SCCs were issued in 2021 and cover a range of transfer scenarios including controller-to-controller and controller-to-processor transfers.
Transfer Impact Assessments (TIAs)
Since the Schrems II ruling in 2020, organisations using SCCs must also carry out a Transfer Impact Assessment — an analysis of whether the laws of the destination country could undermine the protections offered by the SCCs. If they do, supplementary measures must be put in place or the transfer should not proceed.
Binding Corporate Rules (BCRs)
Multinational organisations can adopt Binding Corporate Rules — internal codes of conduct approved by a supervisory authority — to govern transfers of personal data within a corporate group. BCRs are a robust but time-consuming mechanism, more suited to large enterprises with complex international data flows.
Other Lawful Mechanisms
In limited circumstances, transfers can be based on other grounds under Article 49, including explicit consent of the individual, necessity for the performance of a contract, or important reasons of public interest. These derogations are intended to be exceptional and should not be used as a routine basis for transfers.
Cloud Services and SaaS Tools
Many businesses transfer data outside the EU without realising it — by using US-based cloud platforms, CRM systems, email services, or analytics tools. If your vendor processes personal data on servers outside the EEA, that is a cross-border transfer and must be covered by an appropriate mechanism, typically SCCs incorporated into a Data Processing Agreement.
Document Your Transfers
Your Record of Processing Activities (ROPA) under Article 30 should identify all transfers to third countries and the legal mechanism relied upon. The DPC may request this information, and being unable to demonstrate a lawful transfer basis is a significant compliance risk.
Before transferring personal data outside the EEA, identify the destination country, check whether an adequacy decision exists, and if not, put Standard Contractual Clauses and a Transfer Impact Assessment in place. Review your cloud and software vendors — cross-border transfers are often hidden in plain sight within everyday business tools.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things Data Protection and your business, contact us now.
-
What is the GDPR Omnibus?
The GDPR Omnibus refers to a package of proposed amendments to GDPR put forward by the European Commission as part of its broader drive to reduce regulatory burden on businesses — particularly small and medium enterprises (SMEs). It forms part of the EU's Competitiveness Compass and Omnibus simplification agenda, announced in early 2025.
What Changes Are Being Proposed?
The Commission has proposed targeted but meaningful changes to GDPR, including reducing the obligation to maintain Records of Processing Activities (ROPA) for organisations with fewer than 500 employees unless the processing is high risk. It also proposes simplifying Data Protection Impact Assessments and streamlining cooperation mechanisms between supervisory authorities for cross-border cases.
SME Relief is a Central Theme
A core aim of the proposals is to ease the compliance burden on smaller organisations. Many SMEs have struggled with the administrative weight of GDPR since its introduction in 2018, and the Omnibus seeks to recalibrate proportionality — ensuring that compliance obligations more accurately reflect the actual risk posed by an organisation's processing activities.
What is Not Changing
The fundamental principles of GDPR — lawfulness, fairness, transparency, data minimisation, purpose limitation, accuracy, storage limitation, and accountability — are not being altered. Individuals' core rights remain intact. The Omnibus is a simplification exercise, not a weakening of data protection standards.
Where Are the Proposals Now?
As of early 2026, the proposals are working their way through the EU legislative process, requiring agreement from both the European Parliament and the Council of the EU. This process typically takes considerable time, and the final text may differ significantly from the Commission's initial proposals. No changes are in force yet.
The EU AI Act Intersects Here Too
Separately from the Omnibus, the EU AI Act is being phased in throughout 2025 and 2026, adding a parallel layer of obligations for organisations using AI systems. While not an amendment to GDPR, it interacts closely with it — particularly around automated decision-making, high-risk AI, and data governance requirements.
Should You Change Your Compliance Approach?
Not yet. Until proposals are formally adopted and transposed, current GDPR obligations remain fully in force. Organisations should continue to meet their existing requirements while monitoring developments. Those who have invested in solid compliance frameworks are well placed — the fundamentals are not going away.
Change is coming to GDPR, but it is evolutionary rather than revolutionary. The Omnibus aims to reduce administrative friction for smaller organisations without dismantling the core rights and protections that GDPR provides. Watch this space for updates, but keep complying in full in the meantime.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things Data Protection and your business, contact us now.
-
What is a DPO?
A Data Protection Officer is a formally designated role under GDPR responsible for overseeing an organisation's data protection strategy and ensuring compliance with the regulation. The DPO acts as an internal expert, adviser, and point of contact — both for staff within the organisation and for the supervisory authority (in Ireland, the DPC).
When is a DPO Mandatory?
Article 37 of GDPR requires organisations to appoint a DPO in three specific circumstances:
1. where the organisation is a public authority or body;
2. where the core activities involve large-scale, regular, and systematic monitoring of individuals (such as behavioural tracking or surveillance); or
3. where the core activities involve large-scale processing of special category data or data relating to criminal convictions.
What Does "Core Activities" Mean?
Core activities are the primary business processes of the organisation — not incidental or support functions like payroll or HR. A hospital processing patient medical records, an insurer processing health data, or an advertising platform tracking user behaviour online would all likely meet the threshold. A small retail business processing customer orders generally would not.
What if I Am Not Required to Appoint One?
If mandatory appointment does not apply to you, it is still good practice to designate someone internally with responsibility for data protection compliance. This does not need to be a formal DPO — it could be a compliance manager, legal counsel, or a senior person with appropriate knowledge — but having a named, accountable individual strengthens your overall governance.
Can the DPO Be External?
Yes. GDPR expressly permits organisations to appoint an external DPO — an individual or service provider contracted to fulfil the role. This is a practical and cost-effective option for smaller organisations that require a DPO but do not have the resources to employ one full-time. The external DPO must still be accessible to staff and the DPC.
What Are the DPO's Key Responsibilities?
The DPO's role includes informing and advising the organisation on its GDPR obligations, monitoring compliance, advising on DPIAs, cooperating with the supervisory authority, and acting as the first point of contact for individuals exercising their rights. The DPO must be provided with the resources needed to carry out these tasks effectively.
DPO Independence is Essential
A DPO must be able to perform their duties independently. They cannot be dismissed or penalised for performing their role and must not hold a position that creates a conflict of interest — for example, a DPO should not simultaneously act as Head of Marketing or Chief Technology Officer, where data protection decisions could conflict with commercial objectives.
Notify the DPC of Your DPO
Where a DPO is appointed — whether mandatory or voluntary — their contact details should be published and communicated to the DPC. In Ireland, this can be done through the DPC's online notification system.
A DPO is mandatory if your organisation is a public body, conducts large-scale systematic monitoring, or processes special category data at scale. If you fall outside these criteria, formal appointment is not required — but designating someone with clear data protection responsibility remains a sound and advisable practice.
Note: We are not lawyers, and this is general guidance only. We offer an outsourced Data Protection Officer as a service; click here for more. For specific advice in relation to all things Data Protection Officer and your business, contact us now.
-
The Core Definition Under Article 4 of GDPR, personal data is any information relating to an identified or identifiable living individual. That individual is referred to as the "data subject." The definition is deliberately broad — if information can be used, directly or indirectly, to identify a person, it is personal data and GDPR applies.
Direct Identifiers Some data obviously identifies a person — a name, email address, phone number, home address, or national identity number. These are direct identifiers and are straightforwardly personal data. Most organisations recognise these immediately.
Indirect Identifiers Data that does not name a person but can still be used to identify them is also personal data. This includes IP addresses, device identifiers, location data, cookie IDs, vehicle registration numbers, and employee reference numbers. On their own they may seem anonymous — combined with other data they can single out an individual.
Online Identifiers Count GDPR explicitly brought online identifiers within scope. IP addresses, cookie identifiers, advertising IDs, and similar digital traces have all been found to be personal data where they can be linked back to an individual. This has significant implications for website analytics, digital advertising, and any business with an online presence.
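One common risk-reduction technique in web analytics is to coarsen IP addresses before storing them. The sketch below, using only Python's standard `ipaddress` module, zeroes the host portion of an address, similar to the IP-masking features some analytics tools offer. Note this reduces identifiability rather than guaranteeing anonymity: depending on context, even a truncated IP may still be personal data.

```python
import ipaddress

def anonymise_ip(raw_ip: str) -> str:
    """Coarsen an IP address before logging it, so the stored value no
    longer points at a single household or device.

    Illustrative approach only: keep the /24 network for IPv4 (zero the
    last octet) and the /48 prefix for IPv6 (zero the remaining bits).
    """
    addr = ipaddress.ip_address(raw_ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymise_ip("203.0.113.77"))         # 203.0.113.0
print(anonymise_ip("2001:db8:abcd:12::1"))  # 2001:db8:abcd::
```

Whether a truncated address is sufficiently de-identified depends on what other data you hold alongside it, which is exactly the combined-identifiers point made above.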
Special Category Data — Higher Protection Certain categories of personal data are treated as especially sensitive and attract stricter rules under Article 9. These include health data, racial or ethnic origin, religious beliefs, political opinions, trade union membership, genetic data, biometric data, and sexual orientation. Data relating to criminal convictions and offences is governed separately by Article 10 and also attracts enhanced protection. Processing special category data requires both a lawful basis under Article 6 and a separate condition under Article 9.
Pseudonymised Data is Still Personal Data If data has been pseudonymised — for example, names replaced with codes — it remains personal data under GDPR if it is still possible to re-identify the individual using additional information. Only truly anonymous data, where re-identification is genuinely impossible, falls outside the scope of GDPR.
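The distinction can be made concrete with a minimal sketch. Below, names are replaced with random codes, but a separate key table maps codes back to identities; because re-identification remains possible while that key exists, the coded records are pseudonymised, not anonymous, and GDPR still applies. The record fields and names here are invented for illustration.

```python
import secrets

def pseudonymise(records: list[dict]) -> tuple[list[dict], dict]:
    """Replace names with random codes, returning the coded records and
    the key table. The key table must be stored separately and secured —
    and as long as it exists, the coded data remains personal data."""
    key_table = {}  # code -> real name
    coded = []
    for rec in records:
        code = secrets.token_hex(4)
        key_table[code] = rec["name"]
        coded.append({"subject_code": code, "order_total": rec["order_total"]})
    return coded, key_table

records = [{"name": "Aoife Byrne", "order_total": 120.50}]
coded, key = pseudonymise(records)

assert "name" not in coded[0]                       # working dataset shows no name...
assert key[coded[0]["subject_code"]] == "Aoife Byrne"  # ...but the individual is still identifiable
```

Only if the key table (and any other means of re-identification) were irreversibly destroyed could the remaining data plausibly be treated as anonymous and fall outside GDPR.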
Data About Employees and Sole Traders Personal data is not limited to customers. Employee records, job applicant information, and data about sole traders or individual contractors all constitute personal data and must be handled in compliance with GDPR. Many organisations overlook internal HR data as a significant area of compliance obligation.
The "Relates To" Test A useful way to assess whether information is personal data is to ask whether it relates to an individual — either in its content, its purpose, or its likely effect. If the information is used to evaluate, influence, or make decisions about a person, it almost certainly qualifies as personal data regardless of its form.
Personal data under GDPR is far broader than most people assume. If information can identify a living individual — directly or indirectly, online or offline, in records or in systems — it is personal data and your obligations under GDPR apply in full. When in doubt, treat it as personal data.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things Data Protection and your business, contact us now.
-
There Are Six Lawful Bases — Not Just Two A common misconception is that GDPR requires consent for all data processing. In fact, Article 6 of GDPR provides six lawful bases for processing personal data. Consent is one. Legitimate interests is another. Neither is superior to the other — the key is identifying which basis is most appropriate for your specific processing activity.
What is Legitimate Interests? Legitimate interests (Article 6(1)(f)) allows you to process personal data without consent where you have a genuine, legitimate purpose, where the processing is necessary to achieve that purpose, and where your interests are not overridden by the rights and interests of the individuals whose data you are using. It is the most flexible lawful basis but requires careful assessment before it can be relied upon.
The Three-Part Test Before relying on legitimate interests, you must carry out a Legitimate Interests Assessment (LIA) — an informal but important balancing exercise with three steps. First, identify a legitimate interest (commercial, operational, or societal). Second, confirm the processing is necessary to achieve it. Third, balance your interests against the individual's rights and reasonable expectations. If the individual's interests override yours, legitimate interests cannot be used.
What Qualifies as a Legitimate Interest? There is no exhaustive list, but recognised examples include fraud prevention, network and information security, direct marketing to existing customers, intra-group transfers for administrative purposes, processing for safeguarding purposes, and using client data to deliver a contracted service. The interest must be real and present — not speculative or hypothetical.
Legitimate Interests is Not a Shortcut Around Consent This is a critical point. Legitimate interests should not be chosen simply because obtaining consent is inconvenient or because you think individuals might refuse. The lawful basis you choose must genuinely reflect the nature of your processing. Selecting legitimate interests as a workaround for consent — particularly for intrusive or unexpected processing — is a misuse of the basis and leaves you exposed to regulatory challenge.
Where Legitimate Interests Cannot Be Used Legitimate interests alone is never sufficient for special category data, which additionally requires a condition under Article 9 (such as explicit consent). It also cannot be used by public authorities acting in their official capacity. And where individuals would not reasonably expect their data to be used in the way you intend — or where the processing could cause real harm — their interests are likely to override yours and legitimate interests will not apply.
Direct Marketing — A Common Use Case Recital 47 of GDPR specifically acknowledges direct marketing as a potential legitimate interest. However, this does not give blanket permission. Electronic direct marketing to individuals is separately governed by ePrivacy rules (in Ireland, the ePrivacy Regulations), which generally require consent for email and SMS marketing. Legitimate interests may support postal marketing or B2B communications in certain contexts, but always review both GDPR and ePrivacy obligations together.
Document Your Assessment If you rely on legitimate interests, you must document your Legitimate Interests Assessment. You must also inform individuals — typically in your privacy notice — that you are relying on this basis and that they have the right to object to the processing at any time under Article 21. If an individual objects and you cannot demonstrate compelling grounds to continue, you must stop.
We are experts in assisting companies to make the most of their data under the legitimate interests basis — but remember that while it is a valid and useful lawful basis, it is not a free pass. Used correctly, it offers flexibility; misused, it creates significant compliance risk.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things Legitimate Interest, your data and your business, contact us now.
-
Why a Privacy Policy Matters A privacy policy is not just a legal formality — it is how you fulfil your transparency obligations under GDPR. Articles 13 and 14 require organisations to provide clear, accessible information to individuals about how their personal data is being used. Failing to do so is a breach of GDPR in its own right, regardless of whether any other harm occurs.
Article 13 vs Article 14 — What's the Difference? Article 13 applies where you collect personal data directly from the individual — for example, through a website form, account registration, or job application. Article 14 applies where you obtain personal data about an individual from a third party or another source — for example, purchasing a marketing list or receiving data from a partner organisation. Both require you to provide privacy information, but the timing and method of delivery differ.
What Must Your Privacy Policy Include? Under Articles 13 and 14, your privacy policy must contain:
1. the identity and contact details of your organisation, and of your DPO if you have one;
2. the purposes and legal basis for each processing activity, including details of any legitimate interests relied upon;
3. who you share data with and why;
4. details of any transfers outside the EEA and the safeguards in place;
5. how long you retain data;
6. individuals' rights and how to exercise them, including the right to withdraw consent where consent is the lawful basis and the right to lodge a complaint with the DPC; and
7. whether providing data is a contractual or statutory requirement.
Plain Language is a Legal Requirement Your privacy policy must be written in clear, plain language that is easily understood by your intended audience. Lengthy, legalistic text buried in small print does not satisfy GDPR's transparency requirements. If your policy is aimed at the general public — or at children — the language must be pitched accordingly. Concise, layered notices that summarise key points upfront with links to fuller detail are considered best practice.
When Must You Show It? Where data is collected directly from individuals (Article 13), privacy information must be provided at the time of collection — not afterwards. Where data is obtained from a third party (Article 14), you must provide the information within one month of obtaining it, or at the point of first contact with the individual if you intend to communicate with them.
A Single Privacy Policy is Rarely Enough Many organisations rely on a single privacy policy page on their website and consider the job done. In practice, GDPR requires that privacy information is provided in context — at the point where data is collected. This means a pop-up or notice at a sign-up form, a layered notice on a contact page, or specific information provided during a job application process. A generic link to a privacy policy in a website footer does not always satisfy the at-the-time-of-collection requirement.
Keep It Current Your privacy policy must accurately reflect your actual data processing activities. If you introduce a new system, share data with a new third party, or change your retention periods, your privacy policy must be updated accordingly. Outdated policies that no longer reflect reality are a common finding in regulatory investigations.
Cookies and Your Privacy Policy Your privacy policy should address your use of cookies and similar tracking technologies — but a separate, standalone cookie policy is also advisable. Cookie consent and the information provided around it is governed by both GDPR and ePrivacy rules and warrants dedicated, specific disclosure beyond what a standard privacy policy typically covers.
Your privacy policy must be comprehensive, accurate, clearly written, and delivered at the right time — not just published and forgotten on a website footer. Think of it as an ongoing commitment to transparency rather than a one-time document. Review it regularly, update it when your processing changes, and ensure individuals can find it easily whenever their data is being collected.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things Privacy Notices and Privacy Policies, your data and your business, contact us now.
-
GDPR Applies to AI in Full GDPR does not contain a carve-out for artificial intelligence. If an AI system processes personal data about individuals in the EU — at any stage, whether during training, testing, or live deployment — GDPR applies in full. The regulation is technology-neutral by design, and AI is no exception.
Using AI Tools in Your Business Many organisations are now using AI tools — chatbots, document analysis tools, AI-assisted recruitment platforms, customer service automation, and more. Each of these involves personal data and triggers GDPR obligations. Before deploying any AI tool, you must identify what personal data it processes, on what legal basis, for what purpose, and with what safeguards in place.
Your Vendor is Likely a Data Processor Where you use a third-party AI tool that processes personal data on your behalf, that vendor is a data processor under GDPR. You are the data controller. This means you must have a Data Processing Agreement (DPA) in place with the vendor before any processing begins, and you remain responsible for ensuring the vendor meets GDPR standards. Do not assume compliance — ask for evidence.
Legal Basis is Required for Every Use Every AI processing activity involving personal data requires a lawful basis under Article 6. Depending on the context, this might be legitimate interests, contract performance, or consent. Special category data — such as health information, biometric data, or data revealing racial or ethnic origin — processed by AI systems requires both an Article 6 basis and a separate condition under Article 9. This is a high bar and must be carefully assessed.
A DPIA is Almost Always Required Deploying AI that processes personal data — particularly at scale, or involving profiling, automated decision-making, or sensitive data — will typically constitute high-risk processing under Article 35. This triggers a mandatory Data Protection Impact Assessment before the system goes live. The DPIA must identify risks, assess their severity, and document the measures taken to mitigate them.
Automated Decision-Making Carries Specific Rules Article 22 of GDPR gives individuals the right not to be subject to decisions made solely by automated means that produce significant legal or similarly significant effects — such as loan decisions, recruitment screening, or insurance pricing. Where AI is used in this way, you must either obtain explicit consent, establish contractual necessity, or rely on a specific legal authorisation. In all cases, individuals must be able to request human review of the decision.
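One way to honour the human-review requirement is to gate the automated outcome so it is never final once a person asks for intervention. The sketch below assumes a decision with significant effects (a loan-style approval); the scoring function, threshold, and field names are invented for illustration, and a real Article 22 analysis would cover far more than this routing logic.

```python
def score_application(app: dict) -> float:
    """Stand-in for a model: returns an approval score in [0, 1].
    (Purely illustrative — not a real credit model.)"""
    return 0.9 if app.get("income", 0) >= 30000 else 0.4

def decide(app: dict) -> dict:
    """Route an application: automated approval is possible, but any
    request for human intervention (Article 22) overrides it, and
    borderline or negative outcomes go to a human rather than being
    auto-declined."""
    score = score_application(app)
    if app.get("human_review_requested"):
        return {"outcome": "pending_human_review", "score": score}
    if score >= 0.7:
        return {"outcome": "approved", "score": score}
    return {"outcome": "pending_human_review", "score": score}

print(decide({"income": 45000}))
print(decide({"income": 45000, "human_review_requested": True}))
```

The design choice here is that automation only ever produces favourable or deferred outcomes; adverse decisions always involve a person, which is one pragmatic way to stay on the right side of "solely automated".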
Transparency is Non-Negotiable Individuals must be informed when AI is being used to process their data, what it is being used for, and — where automated decision-making applies — meaningful information about the logic involved. Vague references to "automated systems" or "algorithms" in a privacy policy do not satisfy this requirement. The explanation must be clear enough for an ordinary person to understand what is happening and why.
Data Minimisation Applies to AI AI systems have an appetite for data — more data often means better performance. GDPR pushes back against this. You must only use the personal data that is actually necessary for the specific purpose. Feeding an AI system with excessive or irrelevant personal data because it might improve results is not compatible with the data minimisation principle under Article 5.
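In practice, minimisation can be enforced at the boundary: strip every field not needed for the stated purpose before a record ever leaves your systems for an AI vendor. The sketch below is a minimal illustration; the purposes, field names, and allow-lists are invented, and you would define your own based on your documented processing purposes.

```python
# Purpose -> fields actually necessary for that purpose (illustrative only).
ALLOWED_FIELDS = {
    "support_triage": {"ticket_text", "product"},   # route a support ticket
    "invoice_summary": {"line_items", "currency"},  # summarise spend
}

def minimise(record: dict, purpose: str) -> dict:
    """Return only the fields necessary for the given purpose, dropping
    everything else (names, emails, account IDs, etc.) before the
    record is sent to an external AI tool."""
    allowed = ALLOWED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

ticket = {
    "customer_name": "Seán Murphy",
    "email": "sean@example.com",
    "product": "Billing",
    "ticket_text": "My invoice shows the wrong VAT rate.",
}
payload = minimise(ticket, "support_triage")
print(payload)  # only 'ticket_text' and 'product' would leave your systems
```

An allow-list (keep only what is named) fails safer than a block-list (drop what you remembered to name): a new field added upstream is excluded by default rather than leaked by default.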
Individuals' Rights Are Technically Challenging Where personal data has been used to train an AI model, individuals' rights — particularly the right of access and the right to erasure — become technically complex. If a model has been trained on personal data and that data cannot easily be extracted or deleted, you face a significant compliance problem. This must be considered at the design stage, before training begins.
The EU AI Act Runs Alongside GDPR The EU AI Act, which is being phased in from 2024 to 2027, adds a separate but complementary layer of obligations. It classifies AI systems by risk level — from minimal to unacceptable — and imposes strict requirements on high-risk systems including transparency, human oversight, accuracy, and robustness. GDPR and the AI Act must be considered together; compliance with one does not guarantee compliance with the other.
Data Protection by Design Article 25 of GDPR requires data protection to be built into AI systems from the earliest design stage — not addressed as an afterthought. This means choosing privacy-preserving approaches where possible, limiting data collection, implementing strong access controls, and conducting privacy reviews throughout development and deployment.
AI does not exist outside GDPR — it sits squarely within it. Businesses using AI must identify their legal basis, carry out DPIAs, put vendor agreements in place, respect individuals' rights, and ensure transparency. With the AI Act also coming into force, the regulatory landscape around AI is becoming more demanding, not less. Building compliance in from the start is far more effective than retrofitting it later.
Note: We are not lawyers, and this is general guidance only. For specific advice in relation to all things AI/GDPR, your data and your business, contact us now.