The GDPR applies to every piece of data your institution collects about a prospect or student
Since 25 May 2018, the General Data Protection Regulation (GDPR, Regulation 2016/679) has governed all personal data processing in the European Union and the European Economic Area. For a university or college, that scope extends well beyond enrolment records: contact forms, chatbot interactions, website analytics, open day registrations, academic results, health data, and even photographs taken at campus events.
Non-compliance is not a theoretical risk. In 2025, the ICO and several European data protection authorities issued fines against educational organisations for inadequate legal bases and excessive data collection. The maximum penalty, EUR 20 million or 4% of annual global turnover (whichever is higher), focuses minds.
This guide covers the concrete obligations for private higher education institutions: data categories, legal bases, consent, data subject rights, DPO requirements, and the implications of the AI Act for admissions tools and chatbots.
Categories of personal data processed by a university
Prospect data (pre-enrolment)
Data collected before enrolment forms the first GDPR perimeter for any institution. This includes:
- Identification data – name, email address, phone number, collected through contact forms, the chatbot, or open day registration
- Browsing data – pages visited, time spent, acquisition source, gathered by Google Analytics or an equivalent tool
- Conversational data – questions posed to the chatbot, conversation history, language used
- Application data – CV, personal statement, transcripts, identity documents
89% of prospects ask about tuition fees and 78% enquire about work placements (Source: analysis of 12,000 Skolbot chatbot conversations, Sept 2025 – Feb 2026). These exchanges constitute personal data the moment an identifier (name, email) is linked to the conversation.
Enrolled student data
Once enrolled, a student generates a significantly larger volume of data:
- Academic data – marks, attendance, progression, degree certificates
- Financial data – tuition fees, payment schedules, bursaries
- Campus life data – building access (ID card), catering, partner accommodation
- Sensitive data – disability, social circumstances, health records (campus health service)
Sensitive data (Article 9 of the GDPR) demands enhanced protections: a specific legal basis, strict access limitation, and a prohibition on automated decision-making save for narrow, explicit exceptions.
Alumni data
Processing alumni data (directory, donations, networking events) requires a distinct legal basis from the one used during studies. Consent given for enrolment does not automatically cover post-graduation engagement.
Applicable legal bases in higher education
The 6 GDPR legal bases and their application to universities
The GDPR (Article 6) defines six legal bases for processing personal data. In higher education, four are primarily used:
- Performance of a contract (Article 6.1.b) – The strongest basis for data linked to enrolment, study, and billing. The student contract justifies processing the data necessary for its performance.
- Legitimate interest (Article 6.1.f) – Applicable to recruitment marketing (sending prospectuses, follow-ups) and website analytics. It requires a documented balancing test: the institution's interest must not override the individual's rights. The EDPB (European Data Protection Board) recommends formal documentation of this balance for each processing activity.
- Consent (Article 6.1.a) – Required for marketing newsletters, non-essential cookies, and data sharing with partners. Consent must be freely given, specific, informed, and unambiguous. A contact form with a pre-ticked box reading "I wish to receive communications" does not constitute valid consent.
- Legal obligation (Article 6.1.c) – Covers data transmission to authorities (UCAS, OfS, HESA) and the retention of degree certificates for a statutory period.
Common mistake: consent as the default basis
Many institutions use consent as the sole legal basis for all processing. This is a strategic error. Consent can be withdrawn at any time (Article 7.3), meaning that if a student revokes consent, the institution loses the right to process their data โ including data required for their studies.
The correct approach: use contract performance for processing linked to education, legal obligation for regulatory submissions, legitimate interest for recruitment (with a documented balancing test), and consent only for marketing and cookies.
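This mapping can be kept alongside the Article 30 record of processing activities. A minimal sketch, assuming hypothetical activity names and a simplified register structure (a real register would also record purposes, recipients, and retention periods):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProcessingActivity:
    name: str
    legal_basis: str  # one of: contract, legitimate interest, consent, legal obligation
    article: str      # GDPR Article 6.1 reference

# Illustrative entries mirroring the mapping described above
REGISTER = [
    ProcessingActivity("Enrolment and billing", "contract", "6.1.b"),
    ProcessingActivity("Prospectus follow-ups", "legitimate interest", "6.1.f"),
    ProcessingActivity("Marketing newsletter", "consent", "6.1.a"),
    ProcessingActivity("HESA submissions", "legal obligation", "6.1.c"),
]

def activities_needing_balancing_test(register):
    """Legitimate-interest processing requires a documented balancing test (EDPB guidance)."""
    return [a.name for a in register if a.legal_basis == "legitimate interest"]
```

Encoding the register this way makes it trivial to flag which activities still need a documented balancing test, and which would survive a consent withdrawal.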
Consent in the educational context
Consent for minors
The GDPR (Article 8) sets the digital consent threshold at 16, but each Member State may lower it to as low as 13. In the UK, the Data Protection Act 2018 sets the threshold at 13, and the Age Appropriate Design Code applies additional protections. Most higher education prospects are adults, but foundation year and BTEC programmes may include 16- and 17-year-olds.
For minor prospects: parental consent is required for any consent-based processing (marketing newsletter, marketing cookies). Forms must include a verification mechanism (parental email, double opt-in).
Consent and AI chatbot
An AI chatbot that collects personal data must inform the prospect before the conversation begins:
- That they are interacting with an artificial intelligence (AI Act transparency obligation, Article 50)
- What data is collected and why
- How to exercise their rights (access, rectification, erasure)
- How long conversations are retained
An information banner at the chatbot launch, with a link to the privacy policy, fulfils this obligation. The chatbot must not condition access to information on providing personal data: a prospect should be able to ask about programmes without giving their name or email.
Data subject rights
The 8 rights your institution must guarantee
The GDPR confers eight fundamental rights on data subjects (prospects, students, alumni). Your institution must have operational procedures to respond to each within one month:
- Right of access (Article 15) – The student may request a copy of all data you hold about them.
- Right to rectification (Article 16) – Correction of inaccurate or incomplete data.
- Right to erasure (Article 17) – The "right to be forgotten". Limited by statutory retention obligations (degree certificates, accounting records).
- Right to restriction (Article 18) – Freezing of processing while a complaint is investigated.
- Right to data portability (Article 20) – Transfer of data in a structured, machine-readable format to another institution.
- Right to object (Article 21) – Refusal of processing based on legitimate interest, including marketing profiling.
- Right not to be subject to automated decision-making (Article 22) – Fundamental for admissions tools that use AI.
- Right to withdraw consent (Article 7.3) – At any time, without justification.
Cascading erasure: a technical challenge
When a prospect exercises the right to erasure, all data concerning them must be removed from every system: CRM, chatbot, email tool, named analytics, backups. Acquiring an enrolled student costs between GBP 2,400 and GBP 3,200 in the UK (Source: estimates based on EAIE, StudyPortals, EAB, British Council data), so each erasure request also represents a lost marketing investment – all the more reason to minimise data collection from the outset.
Deletion must be effective within one month. A documented cascading erasure process, tested regularly, is essential.
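A cascading erasure run can be sketched as a loop over connected systems with an audit log, so the one-month deadline can be evidenced. The system names and `delete` callables below are placeholders for real CRM, chatbot, and analytics APIs:

```python
from datetime import datetime, timezone

def cascade_erasure(subject_id: str, systems: dict) -> dict:
    """Delete one data subject from every connected system and record the outcome.

    `systems` maps a system name to a callable that deletes the subject's
    records in that system. Failures are logged, never silently swallowed,
    so incomplete erasures surface in the audit trail.
    """
    log = {
        "subject": subject_id,
        "started": datetime.now(timezone.utc).isoformat(),
        "results": {},
    }
    for name, delete in systems.items():
        try:
            delete(subject_id)
            log["results"][name] = "erased"
        except Exception as exc:  # one failing system must not abort the rest
            log["results"][name] = f"FAILED: {exc}"
    return log
```

Running this on a test subject on a schedule is one way to satisfy the "tested regularly" requirement: any system added to the stack without a delete hook shows up immediately in the log.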
The DPO: role and obligations for universities
When is DPO designation mandatory?
The GDPR (Article 37) makes DPO designation mandatory when processing is carried out by a public body, or when the controller's core activities require regular and systematic monitoring of data subjects on a large scale.
For a private university, the ICO considers that tracking hundreds or thousands of prospects and students constitutes large-scale processing. DPO designation is therefore, in practice, almost universally required for higher education institutions.
Internal or external DPO?
Both options are valid. An internal DPO understands institutional processes better but risks a conflict of interest if they also hold a decision-making role (IT director, legal director). An external DPO brings specialist expertise and guaranteed independence, but needs time to understand the specific context of higher education.
The DPO must have direct access to senior management, cannot be penalised for carrying out their duties, and must be given adequate resources (budget, time, tools).
The AI Act and its implications for universities
Classification of AI systems in education
The EU AI Act (Regulation 2024/1689) classifies artificial intelligence systems by risk level. For higher education, two categories are relevant:
High risk (Annex III) – AI systems used for admissions, application assessment, or automated exam grading are classified as high risk. They require:
- A documented risk management system
- High-quality, representative training datasets, with measures to detect and correct bias
- Effective human oversight (AI recommends, human decides)
- Full transparency towards data subjects
- Registration in the EU database for high-risk AI systems
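The "AI recommends, human decides" oversight requirement can be enforced structurally rather than by policy alone. A minimal sketch, with a hypothetical `admissions_decision` function and an arbitrary 0.7 score threshold for illustration:

```python
def admissions_decision(ai_score: float, reviewer_decision: str, reviewer_id: str) -> dict:
    """Record an admissions outcome where the AI only recommends.

    The final decision field can only ever be populated by an explicit,
    attributed human choice; the model score is stored separately for audit.
    """
    if reviewer_decision not in {"admit", "reject", "waitlist"}:
        raise ValueError("a human reviewer must record an explicit decision")
    return {
        "ai_recommendation": "admit" if ai_score >= 0.7 else "review",
        "final_decision": reviewer_decision,  # always human-made
        "decided_by": reviewer_id,            # named reviewer, for accountability
    }
```

Because the function signature has no path to a final decision without a reviewer identity, the design choice itself documents the human-oversight control.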
Limited risk (Article 50) – Pre-admissions information chatbots fall under limited risk. The primary obligation is transparency: the prospect must know they are interacting with AI. No conformity assessment, no registration, but a clear information duty.
Implementation timeline
The AI Act applies progressively. Prohibitions on unacceptable-risk systems have applied since February 2025. Obligations for high-risk systems apply fully from August 2026. Universities using AI tools for application screening must prepare now.
Country-specific obligations
United Kingdom โ ICO
The ICO (Information Commissioner's Office) enforces UK GDPR post-Brexit. Key UK-specific considerations:
- The Age Appropriate Design Code adds obligations for services likely to be accessed by under-18s
- Data transfers to/from the EU require appropriate safeguards (standard contractual clauses)
- The EU has granted the UK an adequacy decision, but its renewal must be monitored for changes
- HESA data submissions have specific retention and processing requirements
France โ CNIL
The CNIL is the French supervisory authority. Beyond the GDPR, French national law (the loi Informatique et Libertés) imposes additional obligations, including digital consent at 15 (versus the GDPR default of 16) and enhanced cookie requirements. UK institutions recruiting in France should be aware of these differences.
Germany โ BfDI
The BfDI (Bundesbeauftragter für den Datenschutz und die Informationsfreiheit) oversees federal data protection, but each Land has its own supervisory authority for education. The Landesdatenschutzbeauftragte sometimes apply divergent interpretations, complicating compliance for multi-campus institutions in Germany.
Spain โ AEPD
The AEPD (Agencia Española de Protección de Datos) has published education-specific guides, including guidelines on digital platform use in teaching and image capture in educational establishments.
Netherlands โ AP
The AP (Autoriteit Persoonsgegevens) pays particular attention to the Dutch education sector. Fines have been issued against institutions for using online exam proctoring software without an adequate legal basis.
Portugal โ CNPD
The CNPD (Comissão Nacional de Proteção de Dados) applies the GDPR within the specific framework of Law 58/2019. Portuguese institutions must pay particular attention to graduate data retention, governed by national statutory periods.
Data security: technical and organisational measures
The principle of data minimisation
Article 5.1.c of the GDPR requires collecting only the data strictly necessary for the stated purpose. For a chatbot, this means: not requiring name, email, or phone number to answer a question about programmes. Identifier collection is only justified when the prospect wishes to be contacted.
Essential technical measures
- Encryption – In transit (TLS 1.3) and at rest (AES-256) for all personal data
- European hosting – Servers within the EU/EEA, in line with EDPB recommendations on international transfers
- Pseudonymisation – Separation of direct identifiers from behavioural data
- Access logging – Traceability of who accesses which data, and when
- Encrypted backups – With regular restoration testing
- Automated deletion – Purge of data beyond the defined retention period
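Automated deletion depends on retention rules being machine-readable. A minimal sketch, using the illustrative periods given later in this guide (3 years for marketing prospect data, 1 year for rejected applications) and a hypothetical `records_to_purge` helper:

```python
from datetime import date, timedelta

# Illustrative retention periods; real values come from the processing register
RETENTION = {
    "prospect_marketing": timedelta(days=3 * 365),
    "rejected_application": timedelta(days=365),
}

def records_to_purge(records, today=None):
    """Return ids of records whose retention period has elapsed.

    Each record is a (record_id, category, last_contact_date) tuple;
    categories without a defined retention period are never auto-purged.
    """
    today = today or date.today()
    return [
        record_id
        for record_id, category, last_contact in records
        if category in RETENTION and today - last_contact > RETENTION[category]
    ]
```

Run on a schedule, this turns the retention periods recorded in the processing register into an enforced purge rather than a policy document.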
Data Protection Impact Assessment (DPIA)
Article 35 of the GDPR requires a DPIA before any processing likely to result in a high risk. For a university, this includes:
- Deploying an AI chatbot that collects personal data
- Using AI tools for application assessment
- Campus CCTV surveillance
- Prospect profiling for marketing purposes
The DPIA must describe the processing, assess its necessity and proportionality, identify risks, and propose mitigation measures.
FAQ
Is an AI chatbot GDPR-compliant?
Yes, provided four obligations are met: informing the prospect that they are interacting with AI (AI Act transparency), collecting only strictly necessary data (minimisation), offering easy access, rectification, and erasure (data subject rights), and hosting data within the EU/EEA. A compliant chatbot informs before collecting, and does not condition access to information on providing personal data.
How long can you retain a non-enrolled prospect's data?
The ICO and EDPB recommend a maximum of 3 years after the last contact for marketing prospecting data. For a prospect who never responded: erasure after 3 years. For a rejected applicant: dossier retention for 1 year (potential litigation), then erasure. These periods must be recorded in the processing register.
Does the AI Act ban AI use for admissions?
No. The AI Act does not ban it, but classifies it as a high-risk system (Annex III). This imposes enhanced obligations: risk management, training data quality, effective human oversight, transparency towards applicants, and registration. AI may recommend, but the final admissions decision must remain with a human.
Must a university with 500 students appoint a DPO?
In practice, yes. The ICO and EDPB consider that processing data from hundreds of students (marks, financial data, health records) constitutes large-scale processing. DPO designation is therefore almost universally required for higher education institutions, regardless of size. The DPO may be shared between several institutions or outsourced.
How do you handle an erasure request from a graduated student?
Erasure cannot be total: the institution has a legal obligation to retain proof of degree award (legal obligation, Article 6.1.c). Financial data is subject to statutory accounting retention periods (typically 6-7 years in the UK). However, campus life data, browsing data, and marketing communications must be erased. Document the response in writing, detailing which data was erased and which was retained with its legal basis.
GDPR compliance is not a one-off project. It is a continuous process that touches every department of your institution โ admissions, registry, marketing, IT, and senior leadership. Institutions that build compliance into their tools from the outset (privacy by design) protect their students and protect themselves.
To understand how the AI Act specifically changes obligations for universities, read our article on the AI Act and higher education. For technical protection measures, see our guide on protecting prospect data.
See how Skolbot protects your prospects' data