Effective May 2026
DPIA Template
A Data Protection Impact Assessment (DPIA) helps you decide whether and how to deploy BuiltAI workflows in a way that respects the UK GDPR and the Data Protection Act 2018. This template is sized for SME procurement teams - it is not a substitute for legal advice, but a practical starting point covering the questions BuiltAI deployments typically raise.
1. Engagement context
- Engagement name and pack - the BuiltAI pack being deployed (Commercial Control Kit, Bidroom-in-a-Box, RAMS Factory, etc.).
- Business owner - the named person inside your organisation accountable for the deployment.
- Data Protection Officer - if your organisation has one, name them; if not, name the senior manager handling DP enquiries.
- Date of assessment - and the date of the next scheduled review (we recommend reviewing annually, or sooner on any significant scope change).
2. Data flows
Describe the personal data, if any, that will flow through the BuiltAI workflow. For most engagements this will be business-contact data (name, work email, role) for client and supplier representatives. Note any free-text fields (instructions, variation narratives, RAMS notes) that could incidentally include personal data, and reference BuiltAI's Red-data classification rules: personal, security-sensitive or regulated data is hard-blocked from AI processing in the MVP.
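The hard block described above can be sketched as a pre-flight gate: free text is classified before any AI call, and a Red result refuses the request outright rather than redacting. This is a hypothetical illustration only - the pattern names, categories and `RedDataError` below are assumptions, not BuiltAI's actual (unpublished) classification rules.

```python
import re

# Illustrative Red-data patterns - NOT BuiltAI's real rule set.
RED_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ni_number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),  # UK National Insurance number shape
}

class RedDataError(Exception):
    """Raised when Red-classified data would otherwise reach an AI service."""

def classify(free_text: str) -> list[str]:
    """Return the Red-data categories detected in a free-text field."""
    return [name for name, pat in RED_PATTERNS.items() if pat.search(free_text)]

def gate_for_ai(free_text: str) -> str:
    """Hard block: refuse (never silently redact) if any Red category is present."""
    hits = classify(free_text)
    if hits:
        raise RedDataError(f"blocked before AI processing: {', '.join(hits)}")
    return free_text
```

The key design point is that the gate sits in front of the AI service and fails closed: a variation narrative that mentions "jane.doe@acme.co.uk" never leaves the boundary, while purely commercial text passes through unchanged.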
3. Lawful basis
- Contract - delivering services to a named client under a written engagement.
- Legitimate interest - typical for internal commercial workflows, with a documented LIA (legitimate interests assessment) for any borderline use case.
- Special category data - if your workflow involves health and safety records that touch on individual circumstances, identify the Article 9 condition you rely on (e.g. employment law obligations).
4. Necessity and proportionality
Document why the data flows above are necessary for the engagement and proportionate to its purpose. BuiltAI workflows are built around the principle of minimum necessary data - if a workflow can be designed against anonymised or aggregated data, it should be. Capture any trade-offs you have made and the rationale.
5. Risks identified
- Confidentiality - unauthorised access to commercial or operational data.
- Integrity - data altered without authorisation, or AI output mistaken for a human-authored record.
- Availability - inability to access business-critical records when needed.
- Individual harm - any direct harm individuals could suffer if the workflow misfires.
6. Mitigations
- Role-based access controls; client users only see their own client account.
- Hard block on Red-classified data reaching any AI service.
- Audit log of every AI invocation, every QA decision and every data export.
- Two-stage QA gate (QA1 + QA2) on every AI-assisted deliverable before issue.
- Sub-processor list is published and notified on change - see /sub-processors.
- Encryption at rest and in transit. See /data-security for full technical and organisational measures.
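The audit-log mitigation above can be sketched as an append-only, hash-chained record: each entry commits to the previous one, so editing any historical AI invocation or QA decision invalidates everything after it. The field names and chaining scheme below are illustrative assumptions, not BuiltAI's actual schema.

```python
import hashlib
import json
from datetime import datetime, timezone

def append_entry(log: list, event: str, actor: str, detail: dict) -> dict:
    """Append a tamper-evident entry; `event` might be 'ai_invocation',
    'qa1_decision', 'qa2_decision' or 'export' (names are hypothetical)."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "event": event,
        "actor": actor,
        "detail": detail,
        "prev": prev_hash,
    }
    # Hash the entry body (including the previous hash) to form the chain.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

def verify(log: list) -> bool:
    """Recompute every hash; any edited or reordered entry breaks the chain."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["prev"] != prev or recomputed != e["hash"]:
            return False
        prev = e["hash"]
    return True
```

A log like this supports the DPIA's integrity risk in section 5: AI output cannot be silently substituted for a QA-approved record without the chain failing verification.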
7. Data subject rights
BuiltAI supports the standard rights under UK GDPR - access, rectification, erasure, restriction, portability and objection. Subject access requests are handled within the statutory one month; if your organisation receives a request, contact BuiltAI via the standard route and we will respond inside that window with the data we hold on your behalf.
8. Consultation
Document any consultation with the ICO, your DPO or affected individuals where the residual risk is high. For most BuiltAI deployments residual risk is low and no ICO consultation is required - but the assessment should explain why you reached that conclusion.
9. Sign-off
- Business owner sign-off (name, role, date).
- DPO or senior manager sign-off (name, role, date).
- Date of next review.
Procurement reviewing?
Request the signed summary