
Recruitment Data Privacy Terms Glossary for HR Tech
Data privacy in recruiting is not a compliance afterthought. Every candidate record your team creates, every tag your automation applies, and every AI score your system generates carries legal obligations under an expanding web of global regulations. The recruiting professionals who understand these terms — not just their legal teams — build systems that are both high-performance and defensible. This glossary defines the core concepts shaping how HR technology must handle candidate data, and connects each term directly to your dynamic tagging architecture in Keap™ for HR and recruiting automation.
Core Regulatory Frameworks
The two regulations that define the floor of recruiting data compliance for most organizations are GDPR and CCPA/CPRA. Understanding them precisely — not roughly — determines whether your HR tech stack is compliant or merely policy-adjacent.
General Data Protection Regulation (GDPR)
GDPR is a regulation enacted by the European Union that governs how personal data of EU residents is collected, stored, processed, and deleted — by any organization, anywhere in the world, that handles that data.
For recruiting operations, GDPR is not a European problem. Any organization that accepts applications from EU residents, sources candidates from EU-based talent pools, or operates ATS and CRM platforms that touch EU resident data must comply. Key obligations under GDPR include:
- Lawful basis documentation: Every data processing activity must rest on a documented legal basis — consent, legitimate interest, contract, legal obligation, vital interests, or public task. “We’ve always collected this” is not a lawful basis.
- Data subject rights: Candidates have enforceable rights to access their data, correct it, restrict processing, and request deletion. Your automation platform must be able to execute all four on demand.
- 72-hour breach notification: If candidate data is compromised, GDPR requires notification to the relevant supervisory authority within 72 hours of becoming aware of the breach. This is a recruiting operations timeline, not just an IT one.
- Automated decision-making restrictions (Article 22): Decisions that produce significant effects on individuals — including AI-assisted candidate scoring and automated screening — trigger specific rights for candidates, including the right to human review.
- Data Processing Agreements (DPAs): Every vendor that touches candidate PII on your behalf — your CRM, ATS, email platform, background check provider — requires a signed DPA.
Teams building tag-based recruiting workflows in Keap™ should audit every tag trigger against these obligations. A tag that fires based on an AI-inferred attribute may constitute automated decision-making under Article 22.
California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA)
CCPA and its successor CPRA are California state laws that grant individuals — including job applicants — specific rights over their personal data collected by businesses operating in or targeting California residents.
CPRA explicitly extended protections to HR data, closing a gap in the original CCPA. For recruiters, the operative rights are:
- Right to know: Applicants can request disclosure of what personal information you collected, how it was used, and with whom it was shared.
- Right to delete: Applicants can request deletion of their personal information, subject to limited exceptions.
- Right to correct: Applicants can request correction of inaccurate personal information.
- Right to opt out of sale or sharing: If your recruiting platform shares candidate data with third parties in ways that qualify as “sale” or “sharing” under CPRA, candidates must be able to opt out.
- Sensitive personal information restrictions: CPRA creates a separate, higher-protection category for sensitive data — including Social Security numbers, biometric data, and precise geolocation — with its own opt-out right.
Operationally, CPRA compliance means your Keap™ workflows must be able to locate, surface, and purge a California applicant’s full data record — tags, custom fields, email history, and automation logs — within the statutory response window (45 days, extendable once by an additional 45 days).
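The purge obligation above is a system capability, not a policy statement. The sketch below models it against a simplified local stand-in for a CRM contact (the record shape, field names, and receipt format are illustrative assumptions, not the real Keap data model); the point is that every data surface — tags, fields, history, logs — is cleared in one operation, and a receipt survives for the audit trail.

```python
from dataclasses import dataclass, field

@dataclass
class CandidateRecord:
    """Simplified stand-in for a CRM contact record; field names are illustrative."""
    contact_id: str
    tags: set = field(default_factory=set)
    custom_fields: dict = field(default_factory=dict)
    email_history: list = field(default_factory=list)
    automation_logs: list = field(default_factory=list)

def purge_candidate(record: CandidateRecord) -> dict:
    """Clear every data surface of a record; return a count-only receipt
    (no personal data) for the compliance audit trail."""
    receipt = {
        "contact_id": record.contact_id,
        "tags_removed": len(record.tags),
        "fields_cleared": len(record.custom_fields),
        "emails_deleted": len(record.email_history),
        "logs_deleted": len(record.automation_logs),
    }
    record.tags.clear()
    record.custom_fields.clear()
    record.email_history.clear()
    record.automation_logs.clear()
    return receipt
```

A real implementation would fan out across every connected system holding the record, but the design point holds: deletion must be a single auditable action, not a series of manual steps.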
Foundational Privacy Principles
Regulations are enforced. Privacy principles are designed. The following concepts determine how well your HR tech stack performs under regulatory scrutiny.
Personally Identifiable Information (PII)
PII is any data that can be used — alone or in combination with other data — to identify a specific individual.
In a recruiting context, PII is broader than most teams assume. It includes:
- Direct identifiers: Full name, email address, phone number, home address, Social Security number, passport number, date of birth.
- Indirect identifiers: IP address, device ID, cookie identifiers, precise geolocation data.
- Inferred identifiers: AI-generated candidate scores, behavioral engagement signals, or demographic inferences — when linked to an individual record — qualify as PII in most regulatory frameworks.
- Biometric data: Voice recordings, facial recognition data, and fingerprint scans collected during screening processes carry the highest protection tier under CPRA and most GDPR interpretations.
Every field in your ATS intake form, every custom field in Keap™, and every tag that encodes candidate attributes is potentially a PII container. Understanding what qualifies as PII is the starting point for designing the essential Keap™ tags HR teams use to automate recruiting without creating compliance exposure.
Sensitive Personal Information (SPI)
Sensitive Personal Information is a distinct, higher-protection subcategory of PII. GDPR’s Article 9 “special categories” include data revealing racial or ethnic origin, political opinions, religious beliefs, trade union membership, genetic data, biometric data, health data, and data concerning sex life or sexual orientation. CPRA’s sensitive-data category overlaps substantially but is enumerated differently — adding, for example, Social Security numbers and precise geolocation.
In recruiting, SPI surfaces in contexts that may not be immediately obvious: disability accommodation requests, diversity self-identification questionnaires, health-related screening questions, and certain background check categories. Any automation workflow that tags, scores, or routes candidates based on SPI requires explicit consent and, in many jurisdictions, a Data Protection Impact Assessment (DPIA).
Data Minimization
Data minimization is the principle that organizations should collect and retain only the personal data that is strictly necessary for the defined purpose of the processing activity.
This is an active design discipline, not a passive policy. For recruiting operations, data minimization means:
- Auditing every field in your application forms and removing any that cannot be justified by a specific hiring decision requirement.
- Configuring resume parsing rules to extract only role-relevant attributes, not every available data point.
- Building Keap™ custom fields with defined retention periods so data is automatically purged when no longer needed.
- Resisting the instinct to tag candidates with “nice to know” attributes that have no operational use in the current or near-term pipeline.
Data minimization also reduces breach exposure. Every data point you do not hold is a data point that cannot be compromised. Gartner research consistently identifies data sprawl — collecting more data than the organization can govern — as a primary driver of compliance failures in HR technology deployments.
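Auditing fields against justifications, as described above, can be reduced to a simple mechanical check. A minimal sketch, assuming a hypothetical inventory that maps each intake field to its documented hiring-decision justification (the field names and justifications here are invented for illustration):

```python
# Hypothetical field inventory: each application-form field mapped to its
# documented hiring-decision justification (None = no justification on file).
FIELD_JUSTIFICATIONS = {
    "full_name": "identity verification and interview scheduling",
    "email": "primary contact channel",
    "current_salary": None,
    "date_of_birth": None,
    "portfolio_url": "work-sample review for design roles",
}

def minimization_audit(justifications: dict) -> list:
    """Return the fields that fail the data minimization test:
    collected, but with no documented purpose."""
    return sorted(f for f, why in justifications.items() if not why)
```

Fields the audit flags are candidates for removal from the form, not for a retroactive justification exercise.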
Purpose Limitation
Purpose limitation requires that personal data collected for one defined purpose not be repurposed for a different, incompatible use without a new lawful basis and, in many cases, new consent.
In recruiting, purpose limitation becomes operationally relevant when organizations want to:
- Use a candidate’s application data from one role to consider them for a different, unrelated role.
- Add former applicants to marketing or employer brand email sequences.
- Share candidate profiles with a third-party staffing partner not disclosed in the original collection notice.
Each of these scenarios requires a documented legal basis. Automation workflows that re-tag or re-route candidate records for new purposes without a fresh consent or legitimate interest assessment are a common, often invisible source of GDPR violations.
Privacy by Design
Privacy by Design is the principle that data protection should be engineered into systems and processes from the outset, not retrofitted after the fact. Under GDPR Article 25, Privacy by Design and Privacy by Default are legal requirements for organizations processing personal data.
For HR technology teams, Privacy by Design means:
- Building tag taxonomies in Keap™ that structurally exclude protected-class attributes before any automation is deployed.
- Setting field-level access restrictions by user role so that sensitive candidate data is visible only to authorized team members.
- Designing retention and purge automation into every workflow at build time, not as a later cleanup task.
- Defaulting to the least-invasive data collection method that still serves the operational purpose.
This connects directly to the guidance in our post on AI bias risks in automated candidate screening — the same structural discipline that produces ethical AI screening produces compliant data architecture.
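Structurally excluding protected-class attributes from a tag taxonomy, as the first bullet above describes, can be enforced with a lint check run before any tag is created. A minimal sketch — the denylist here is a short illustrative sample, and a production list would be reviewed by counsel:

```python
import re

# Illustrative, incomplete denylist of terms that suggest a protected-class
# attribute; a real list would be jurisdiction-specific and legally reviewed.
PROTECTED_TERMS = {"age", "gender", "race", "religion", "disability", "ethnicity"}

def lint_tag_taxonomy(tags):
    """Flag proposed tag names that encode protected-class attributes.
    Matches whole words, so 'Stage-Interview' does not trip on 'age'."""
    flagged = []
    for tag in tags:
        tokens = set(re.split(r"[^a-z]+", tag.lower()))
        if tokens & PROTECTED_TERMS:
            flagged.append(tag)
    return flagged
```

Running the lint at taxonomy design time, before any automation fires, is the Privacy by Design move: the problematic tag never exists, so no workflow can act on it.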
Consent and Lawful Basis
Collecting candidate data without a valid lawful basis is a GDPR violation regardless of intent. Two lawful bases dominate recruiting use cases.
Consent
Consent under GDPR must be freely given, specific, informed, and unambiguous. For recruiting, this means:
- Pre-ticked boxes do not constitute consent.
- Consent cannot be bundled into general terms and conditions.
- Candidates must be able to withdraw consent as easily as they gave it — and withdrawal must trigger deletion or cessation of processing.
- Consent for one purpose (e.g., applying for a specific role) does not extend to another purpose (e.g., talent pool nurturing) without a separate consent action.
Legitimate Interest
Legitimate interest allows processing without explicit consent when the organization has a genuine, proportionate business need that does not override the individual’s rights and freedoms. A three-part test applies: the purpose test (is the interest legitimate?), the necessity test (is the processing necessary to achieve it?), and the balancing test (do the individual’s interests and freedoms override the organization’s interest?).
Legitimate interest is commonly used for:
- Retaining unsuccessful candidate data for a defined period to defend against discrimination claims.
- Internal talent pipeline management for roles similar to the one applied for.
- Security and fraud prevention processing.
Organizations relying on legitimate interest must document the three-part assessment and make it available to data subjects on request.
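Because the three-part assessment must be documented and producible on request, it helps to capture it as a structured record rather than prose in an email thread. A minimal sketch of such a record — the field names are illustrative, and this is a documentation aid, not legal advice:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LegitimateInterestAssessment:
    """Documented three-part test for a single processing activity."""
    purpose: str
    purpose_is_legitimate: bool       # purpose test
    processing_is_necessary: bool     # necessity test
    individual_rights_override: bool  # balancing test

    def passes(self) -> bool:
        """Legitimate interest holds only if all three parts resolve favorably."""
        return (self.purpose_is_legitimate
                and self.processing_is_necessary
                and not self.individual_rights_override)
```

A frozen record per processing activity gives you exactly what a supervisory authority asks for: the assessment as it stood when processing began.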
Rights, Obligations, and Technical Terms
Data Subject Rights
Data subject rights are the legally enforceable entitlements individuals hold over their personal data under GDPR and equivalent regulations. In recruiting, the most operationally impactful are:
- Right of access (Subject Access Request / SAR): The candidate can request a copy of all personal data held about them, including tags, scores, and processing history.
- Right to rectification: The candidate can demand correction of inaccurate data.
- Right to erasure (Right to be Forgotten): The candidate can request deletion of their data where the lawful basis no longer applies or consent is withdrawn. Your automation platform must execute a full purge across all connected systems — not just the primary record.
- Right to restrict processing: The candidate can request that processing be paused while a dispute is resolved.
- Right to data portability: The candidate can request their data in a structured, machine-readable format.
- Right to object to automated decision-making: Where AI or automation produces decisions with significant effects, the candidate can request human review.
Each right requires a technical response capability inside your HR tech stack. A policy document that names these rights but lacks the system architecture to execute them is non-compliant. See also our guidance on using Keap™ tags to capture deeper candidate insights beyond keywords — the same tag depth that enables insight must also be fully erasable on demand.
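The access and portability rights above both reduce to the same technical capability: serialize everything held on a candidate into a structured, machine-readable format. A minimal sketch, assuming an illustrative dictionary layout for the record (not the real Keap schema):

```python
import json

def export_subject_data(record: dict) -> str:
    """Serialize everything held on a candidate — tags, scores, processing
    history — into machine-readable JSON suitable for answering a Subject
    Access Request or a portability request. Record layout is illustrative."""
    export = {
        "contact_id": record["contact_id"],
        "tags": sorted(record.get("tags", [])),
        "scores": record.get("scores", {}),
        "processing_history": record.get("history", []),
    }
    return json.dumps(export, indent=2, sort_keys=True)
```

If any category of held data cannot be reached by an export function like this, the right of access exists on paper only.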
Data Processing Agreement (DPA)
A Data Processing Agreement is a contract between a data controller (the employer or recruiting firm) and a data processor (any vendor that handles personal data on the controller’s behalf) that defines the scope, purpose, security requirements, and obligations of the data processing relationship.
Every HR tech vendor your team uses — CRM, ATS, background check provider, email automation platform, AI screening tool — must have a signed DPA in place before processing candidate PII. Operating without one is a direct GDPR violation on the part of the controller, regardless of the vendor’s own compliance posture.
Data Protection Impact Assessment (DPIA)
A DPIA is a structured risk analysis required by GDPR Article 35 when a processing activity is likely to result in high risk to individuals’ rights and freedoms. In recruiting, DPIAs are typically required for:
- Systematic and large-scale processing of sensitive personal data (e.g., health data in pre-employment screening).
- Automated processing that produces decisions with significant effects on individuals — including AI-driven candidate scoring systems.
- Large-scale systematic monitoring of candidates (e.g., behavioral tracking across a career site).
A DPIA documents the nature of the processing, its necessity and proportionality, the risks identified, and the mitigations applied. It is not a one-time exercise — it should be revisited when the underlying system changes materially.
Data Retention and Purge Schedules
Data retention schedules define how long each category of personal data is kept and what triggers its deletion. In recruiting, retention periods vary by data type and jurisdiction:
- Active candidate records: retained for the duration of the hiring process.
- Unsuccessful applicant data: legal advisors commonly recommend 6–12 months post-process to enable response to discrimination claims, though this varies by jurisdiction and sector.
- Hired employee data: governed by employment law retention requirements, which differ significantly by country.
- Background check data: subject to Fair Credit Reporting Act (FCRA) requirements in the US and equivalent regulations elsewhere.
Purge schedules should be automated where possible. A manual deletion process is an operational risk — tasks that depend on human memory fail. Building time-triggered tag removal and contact archival into your Keap™ workflows is the operational implementation of a retention schedule.
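A time-triggered purge, as described above, is just a retention policy plus a date comparison. A minimal sketch, with an illustrative policy table (the category names and the 365-day figure are assumptions drawn from the retention guidance above, not fixed legal limits):

```python
from datetime import date, timedelta

# Illustrative retention policy in days per data category.
# None means "retain for the life of the process" (no time trigger).
RETENTION_DAYS = {
    "unsuccessful_applicant": 365,  # upper end of the 6-12 month guidance
    "active_candidate": None,
}

def is_due_for_purge(category: str, process_closed_on: date, today: date) -> bool:
    """True when a record's retention clock has run out."""
    limit = RETENTION_DAYS.get(category)
    if limit is None:
        return False
    return today - process_closed_on > timedelta(days=limit)
```

A nightly job that evaluates this check and archives or deletes the records it flags is the automated replacement for the human-memory deletion process the paragraph above warns against.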
Data Breach
A data breach is any security incident that results in unauthorized access, disclosure, alteration, or destruction of personal data. In a recruiting context, breaches can include:
- Unauthorized access to an ATS or CRM containing candidate records.
- Accidental transmission of candidate data to the wrong recipient.
- Loss of a device containing unencrypted candidate files.
- Ransomware attacks that encrypt or exfiltrate HR data.
Under GDPR, breaches that pose risk to individuals must be reported to the relevant supervisory authority within 72 hours. Breaches that pose high risk must also be communicated directly to affected data subjects. Recruiting operations teams — not just IT departments — need a documented incident response plan that includes these timelines.
Encryption and Pseudonymization
Encryption converts personal data into an unreadable format that can only be decoded with the correct key. Pseudonymization replaces direct identifiers with artificial identifiers, so that data can only be re-linked to an individual with access to a separate key file. Both are recognized under GDPR as technical safeguards that reduce risk; in the event of a breach, rendering the data unintelligible to unauthorized parties can remove the obligation to notify affected individuals directly, though notification to the supervisory authority may still be required.
For HR technology, encryption at rest and in transit for all candidate data is a baseline requirement. Pseudonymization is particularly relevant for analytics and reporting use cases — aggregated talent pipeline reporting should use pseudonymized or anonymized data wherever possible.
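One common pseudonymization approach is a keyed hash: the same identifier always maps to the same artificial identifier, so records stay joinable for pipeline analytics, but re-linking requires the key, which is stored separately. A minimal sketch using Python's standard library (the 16-character truncation is an illustrative choice, not a standard):

```python
import hashlib
import hmac

def pseudonymize(identifier: str, key: bytes) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable
    artificial identifier via HMAC-SHA256. Deterministic per key, so
    analytics can join on it; re-identification requires the key."""
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()[:16]
```

Note this is pseudonymization, not anonymization: whoever holds the key can re-link the data, so the output remains personal data under GDPR.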
Related Terms in the HR Tech Privacy Ecosystem
Anonymization vs. Pseudonymization
Anonymized data is data from which all identifying information has been irreversibly removed, such that re-identification is not possible. Anonymized data falls outside GDPR’s scope entirely. Pseudonymized data, by contrast, can still be re-linked to an individual with the right key — it remains personal data under GDPR and must be handled accordingly. The distinction matters for talent analytics: aggregate diversity reporting that uses truly anonymized data is not subject to GDPR; pseudonymized candidate scoring data is.
Cross-Border Data Transfers
Transferring personal data outside the EU to countries without an equivalent level of data protection requires a legal mechanism. The primary mechanisms post-Schrems II are Standard Contractual Clauses (SCCs), Binding Corporate Rules (BCRs) for intra-group transfers, and adequacy decisions where a country’s legal framework has been formally recognized as equivalent by the EU. For multinational recruiting operations using cloud-based HR tech with US-based servers, cross-border transfer compliance is a live obligation, not a theoretical one.
Fair Credit Reporting Act (FCRA)
The FCRA is a US federal law governing the collection, use, and disclosure of consumer report information — including employment background checks. It requires employers to obtain written consent before conducting a background check, provide a copy of the report and a summary of rights before taking adverse action, and follow a specific adverse action notification process. Integrating background check data into your recruiting CRM without FCRA-compliant consent workflows creates significant legal exposure.
Equal Employment Opportunity (EEO) Data Handling
EEO data — voluntary self-identification of race, ethnicity, gender, disability status, and veteran status — is collected by US employers to meet federal reporting requirements. This data must be kept strictly separate from hiring decision workflows. It should not be accessible to hiring managers, should not be used as input to any automated scoring or tagging system, and must be stored with access controls that enforce the separation. Mixing EEO data with operational candidate data in the same CRM record without architectural separation is both a legal risk and an ethical one.
For a deeper look at how AI-assisted recruiting tools must handle these boundaries, see our analysis of AI bias risks in automated candidate screening.
Consent Management Platform (CMP)
A CMP is a technology solution that manages the collection, storage, and enforcement of user consent preferences across digital touchpoints. In a recruiting context, CMPs are used to manage cookie consent on career sites, track consent to specific data uses (talent pool inclusion, marketing communications, interview recording), and provide an auditable record of consent that can be produced in response to regulatory requests. Integration between your CMP and your recruiting CRM is a privacy-by-design requirement, not an optional enhancement.
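The auditable consent record a CMP maintains is, at its core, an append-only event log where the latest event for a given contact and purpose wins — which is what makes withdrawal enforceable. A minimal sketch with an illustrative event schema (a real CMP would also capture the consent text shown and its version):

```python
from datetime import datetime, timezone

def record_consent(log: list, contact_id: str, purpose: str, granted: bool) -> dict:
    """Append an auditable consent event; the log is the record a regulator
    would ask to see. Event schema is illustrative."""
    event = {
        "contact_id": contact_id,
        "purpose": purpose,  # e.g. "talent_pool_inclusion"
        "granted": granted,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    log.append(event)
    return event

def has_consent(log: list, contact_id: str, purpose: str) -> bool:
    """The most recent event for this contact and purpose wins,
    so a withdrawal overrides an earlier grant."""
    for event in reversed(log):
        if event["contact_id"] == contact_id and event["purpose"] == purpose:
            return event["granted"]
    return False
```

Automation workflows should call a check like has_consent at trigger time, not rely on a consent state captured once at intake.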
Common Misconceptions in Recruiting Data Privacy
Several widespread misunderstandings about data privacy regularly produce compliance failures in HR technology deployments.
Misconception 1: “Our data is in the cloud, so our vendor handles compliance.” Cloud hosting does not transfer regulatory responsibility. The employer is the data controller and bears primary accountability for compliance. Vendors are processors — they must meet the standards the controller sets in the DPA, but they do not absorb the controller’s obligations.
Misconception 2: “GDPR only applies to European companies.” GDPR applies to any organization that processes the personal data of EU residents, regardless of where the organization is based. A US recruiting firm that sources candidates from Germany is subject to GDPR for those candidates’ data.
Misconception 3: “If a candidate applied to us, we can keep their data forever.” Applying for a role does not create a perpetual license to hold a candidate’s data. Consent lapses, legitimate interest must be re-evaluated, and retention limits apply. Holding data beyond defined retention periods is a direct violation of the storage limitation principle under GDPR.
Misconception 4: “Anonymized data is the same as pseudonymized data.” They are legally distinct. Pseudonymized data remains personal data under GDPR. Only true anonymization — where re-identification is impossible — removes data from GDPR’s scope.
Misconception 5: “Our AI screening tool handles its own compliance.” AI vendors may have their own compliance programs, but the employer remains accountable for the lawfulness of automated decision-making applied to candidates. Article 22 obligations are the controller’s, not the AI vendor’s.
Putting the Glossary to Work in Your HR Tech Stack
These definitions are operational inputs, not reference material to file away. Every term in this glossary maps to a concrete decision inside your recruiting technology: which fields to collect, which tags to apply, how long to retain records, which vendors require DPAs, and which workflows require human oversight.
The most effective place to apply this knowledge is in the architecture phase — before workflows are built and before data is collected. Teams that have already deployed their recruiting automation should conduct a structured audit: map every data point against a lawful basis, verify every vendor DPA is in place, and confirm that every data subject right can be executed technically, not just described in a policy document.
For guidance on structuring the underlying tag taxonomy to support both performance and compliance, see our posts on Keap™ tag naming and organization best practices for HR and preserving candidate data intelligence during a Keap™ migration. And for the full strategic framework that connects tagging architecture to AI-driven recruiting, the parent resource on dynamic tagging architecture in Keap™ for HR and recruiting automation provides the operational blueprint.
Privacy compliance and recruiting performance are not in tension. The systems rigorous enough to satisfy a GDPR audit are the same systems rigorous enough to produce reliable, scalable hiring outcomes. Build them together from the start.