Processing personal data on behalf of others is part of everyday business: cloud hosting, newsletter distribution, support desks, payment gateways, AI labeling, CRM operation. As soon as personal data is processed on behalf of a controller, Art. 28 GDPR requires a robust data processing agreement (DPA). The DPA is not a boilerplate attachment, but the legally binding bridge between technology, organization and liability. This article bundles the mandatory contents of Art. 28 para. 3 GDPR into a structured continuous text, demonstrates practical clause mechanics for sub-processors, audits, TOMs (technical and organizational measures) and data portability, and delineates typical error patterns. Stylistically, the text builds on the compact knowledge articles on itmedialaw.com, but goes deeper at the operational level for SaaS, agency, games and AI setups.
Starting point and scope of application
Processing on behalf of a controller means the processing of personal data "on behalf of" that controller. The decisive factor is not the label "service provider", but the fact that the processor is bound by instructions and purpose limitation: the processor does not act for its own purposes, but exclusively within the documented instructions of the controller. Typical constellations are hosting, e-mail dispatch, analytics, payroll accounting, ticket systems, content moderation or data enrichment. The distinction from joint controllership (Art. 26 GDPR) and independent controllership is mandatory: whoever co-determines purposes is no longer a "mere processor". This legal clarity protects against role conflicts and prevents data subjects' rights, information obligations, erasure periods or security levels from being misallocated. A look at the compact itmedialaw introductions shows the practical relevance of the topic in the contract and cloud context.
A DPA must specify the subject matter and duration of the processing. This means not only the heading “Hosting”, but also a meaningful description of the processing operations covered by the contract (storage, retrieval, transmission, deletion, backup, testing, training). The duration follows the lifetime of the main contract, but is differentiated for backups, log retention and follow-up processes. Without this differentiation, deletion and release remain purely theoretical.
The type and purpose of the processing must be stated. For example, anyone operating a CRM processes master, contact and interaction data for customer management; a DSP/AdTech partner processes pseudonymous IDs for campaign management; an annotation service processes image, text or audio segments for AI training. The purpose limitation is strict; a "purpose update" for the processor's own internal analytics is not permitted without a separate legal basis.
The types of personal data and categories of data subjects must be specified. The categories are not described generically, but on a project-specific basis (e.g. customers, leads, employees, creators, player accounts; data types such as identification, communication, contract, usage, support, payment or health data). The more sensitive the data (Art. 9 GDPR), the greater the required TOM depth and the tighter the sub-processor control.
The DPA enshrines the rights and obligations of the controller, in particular the right to issue instructions. Instructions are documented in writing or in a ticketing system; the processor reviews instructions for manifest unlawfulness and reports any concerns. The instruction regime includes emergency instructions where security incidents require immediate action.
The core of the processor obligations are confidentiality, TOMs and security level. Every person with access to data is obliged to maintain confidentiality; the TOMs are based on Art. 32 GDPR and are kept as a dynamic annex. This includes taking into account the state of the art, implementation costs, type, scope, circumstances and purposes. The DPA not only refers to “ISO certificates”, but also describes access controls, encryption, key management, network segmentation, hardening, logging/monitoring, backup/restore, vulnerability management, MFA obligations, role and rights concepts, pseudonymization/minimization, test and staging isolation and regular effectiveness checks.
The processor supports the controller with data subject rights (access, rectification, erasure, restriction, data portability, objection), security incidents (Art. 33/34), DPIA (Art. 35) and communication with supervisory authorities – in each case with clear response times and process descriptions. Without SLA cycles, deadlines become a mere hope.
Once processing has ended, the personal data, including archive, backup and log copies, is deleted or returned, unless legal retention requirements provide otherwise. The DPA regulates export formats and verification mechanisms, such as random deletion checks or hash-based comparison procedures, in order to avoid "deletion fictions".
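The hash-based comparison idea can be illustrated in a few lines: the controller and processor agree on a keyed hash over record identifiers, so deletion can be verified without the raw IDs or content ever being exposed. This is a minimal sketch under assumed conventions (the salt handling, identifier naming and verification flow are illustrative, not a prescribed procedure):

```python
import hashlib
import hmac

def record_fingerprint(record_id: str, salt: bytes) -> str:
    """Keyed hash of a record identifier; the raw ID never leaves the controller."""
    return hmac.new(salt, record_id.encode("utf-8"), hashlib.sha256).hexdigest()

def verify_deletion(deleted_ids, remaining_ids, salt: bytes):
    """Return fingerprints of records that should be gone but still appear
    in the processor's systems (empty list = deletion verified)."""
    expected_gone = {record_fingerprint(r, salt) for r in deleted_ids}
    still_present = {record_fingerprint(r, salt) for r in remaining_ids}
    return sorted(expected_gone & still_present)

# Hypothetical example data
salt = b"per-audit-shared-secret"
leftover = verify_deletion(
    deleted_ids=["cust-001", "cust-002"],
    remaining_ids=["cust-002", "cust-999"],  # cust-002 was not deleted
    salt=salt,
)
print(leftover)  # non-empty -> deletion incomplete
```

In practice the processor would run the fingerprinting over its remaining stores (including backups and cold storage) and return only the hash set, keeping trade secrets and third-party data out of the verification step.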
Finally, verification and audit obligations must be anchored. The processor provides all information required to demonstrate compliance and enables audits – by the controller or by independent auditors – with appropriate safeguards. The DPA balances confidentiality, frequency and costs and allows remote audits, combined audit weeks, recognition of reports (e.g. ISO/SOC) and follow-up audits after major incidents.
Important for drafting: electronic form is sufficient (Art. 28 para. 9 GDPR). The DPA can therefore be signed electronically or concluded via a portal; active change logs and version statuses ensure evidence in an audit.
Subprocessors: permission, chain, control
No modern setup works without sub-processors. Nevertheless, Art. 28 para. 2/4 GDPR requires prior authorization and the flow-down of all obligations to the sub-processors. Two models are common: specific individual authorization, or general authorization with a sub-processor register and a right of objection. General authorization with clear notice periods for changes and additions, differentiated into "critical" (storage, core compute, identity) and "non-critical" (e.g. e-mail delivery), is standard practice. A register lists the company, function, country, data categories and the relevant TOM anchors. Flow-down means: the same data protection obligations apply throughout the chain, including audit cooperation and incident reporting channels. Additional safeguards are agreed for third countries (standard contractual clauses, a transfer impact assessment where applicable). A lean but resilient process prevents "shadow sub-processors" and addresses change scenarios without blocking operations.
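A register of the kind described above is, at heart, a small structured dataset. The following sketch shows one possible shape under assumed field names and an assumed 30-day notice period (both are illustrative and would be set by the actual contract):

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class SubProcessor:
    company: str
    function: str
    country: str
    data_categories: list
    critical: bool                 # storage, core compute, identity -> critical
    announced_on: date
    notice_period_days: int = 30   # assumed notice period; set per contract

    def objection_deadline(self) -> date:
        """Last day on which the controller can object to this addition."""
        return self.announced_on + timedelta(days=self.notice_period_days)

# Hypothetical register entries
register = [
    SubProcessor("ExampleCloud GmbH", "IaaS storage", "DE",
                 ["master data", "usage data"], critical=True,
                 announced_on=date(2024, 5, 1)),
    SubProcessor("MailRelay Inc.", "e-mail delivery", "US",
                 ["communication data"], critical=False,
                 announced_on=date(2024, 5, 1)),
]

entry = register[0]
print(entry.objection_deadline())  # 30 days after the announcement
```

The point of the structure is auditability: criticality, country (and thus third-country safeguards) and the running objection window are queryable facts rather than prose buried in an annex.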
Audit rights without downtime: evidence, remote verification, recognition logic
Audits are not a license to interrupt operations. The DPA defines how audits are carried out: preferably remote audits, inspection of policies, TOM systems, risk and action registers, random sampling of tickets/incidents, review of pen test summaries and certification reports. On-site audits are reserved for security-critical cases and announced slots. A recognition mechanism clarifies which external evidence partially fulfills the audit obligation (e.g. ISO 27001, SOC 2 Type II) without undermining the statutory inspection rights. Deadlines, audit day quotas and confidentiality safeguards (clean rooms, view-only access) prevent data leaks and minimize trade secret risks.
Set up TOMs correctly: Art. 32 level, but operational
TOM annexes often degenerate into lists of keywords. The TOM annex becomes effective when technical and organizational measures are categorized, measurable and verifiable. These include identity and rights management (RBAC/ABAC, least privilege, JIT admin), MFA for internal and external access, key management (KMS/HSM, rotation, separation of duties), encryption at rest and in transit, network segmentation and zero trust principles, security logging with tamper-proof storage, backup/restore including regular recovery tests, vulnerability and patch processes, a secure SDLC (code reviews, SAST/DAST, secrets scanning, build integrity), data lifecycle management (minimization, pseudonymization, retention), supply chain controls (SBOM, dependency monitoring), home office/BYOD rules and awareness programs. Effectiveness is tested cyclically and after major changes; results flow into the risk register. The supervisory authorities remind us that TOMs do not necessarily have to be fully spelled out in the DPA itself, but must be verifiably evaluated and documented; operationally, versioning in an annex with a change log makes sense.
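Versioning the TOM annex with a change log boils down to one audit question: which TOM version applied on a given day? A minimal sketch, with hypothetical version entries:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class TomVersion:
    version: str
    effective_from: date
    changes: str   # human-readable change log entry

# Hypothetical version history of a TOM annex
tom_history = [
    TomVersion("1.0", date(2023, 1, 1), "Initial annex: RBAC, TLS 1.2+, daily backups"),
    TomVersion("1.1", date(2023, 9, 1), "MFA mandatory for all admin access"),
    TomVersion("2.0", date(2024, 4, 1), "Key rotation every 90 days; quarterly restore tests"),
]

def tom_version_at(history, day: date) -> TomVersion:
    """Return the TOM version that was in force on a given day."""
    applicable = [v for v in history if v.effective_from <= day]
    return max(applicable, key=lambda v: v.effective_from)

print(tom_version_at(tom_history, date(2023, 10, 15)).version)  # "1.1"
```

Kept in this form, the annex can be updated without renegotiating the contract each time, while every past state remains reconstructible for an audit.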
Data portability, exit strategy and proof of deletion
Even the cleanest DPA loses value if the exit remains unclear. A portability clause therefore defines export paths: formats (CSV, JSON, Parquet), schemas, API access, timelines, review loops and cost logic. For complex client data (e.g. in SaaS systems), a "read-only phase" is agreed after the end of the contract, during which access is still possible but no new processing takes place. Deletion does not only mean removal from production systems; the DPA covers backups, snapshots, cold storage, crash dumps and log data. Random deletion checks or hash comparisons ensure that proof is provided without exposing company secrets. In chain relationships, the processor obliges sub-processors to perform synchronous erasure and documents their evidence in the register.
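The format question in a portability clause is less abstract than it sounds: the same records must round-trip through whichever formats the contract names. A minimal sketch for the CSV and JSON cases, with hypothetical records and field names:

```python
import csv
import io
import json

# Hypothetical client records covered by the export clause
records = [
    {"id": "cust-001", "email": "a@example.com", "created": "2023-04-02"},
    {"id": "cust-002", "email": "b@example.com", "created": "2023-06-17"},
]

def export_json(rows) -> str:
    """Structured export with stable key order for diffable re-exports."""
    return json.dumps(rows, indent=2, sort_keys=True)

def export_csv(rows) -> str:
    """Flat export; the header row doubles as the agreed schema."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(export_csv(records).splitlines()[0])  # the schema line: "created,email,id"
```

The contractual value lies in the schema being fixed and testable: an importing controller can verify field names and completeness mechanically instead of discovering gaps after the read-only phase has ended.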
Practical examples from SaaS, AI and games
SaaS operation of a CRM: the controller uses a hosted CRM. The DPA describes the processing operations (collection, storage, segmentation, sending, deletion), data types (master data, communication data, usage data), data subject groups (leads, customers), the TOM level (including encryption, RBAC, MFA), sub-processors (IaaS provider, e-mail relay), the audit mechanism (remote, SOC reports, an annual slot check) and the exit (full export, data deletion, log retention). The controller obtains copies for data subjects via the SaaS export; the processor provides support where system fields cannot be exported 1:1. In this way, the data protection foundation often outlined in the itmedialaw knowledge base is translated into product practice.
AI annotation and fine-tuning: a labeling service processes image and text data. The DPA specifies strict purpose limitation, confidentiality, isolated VDI environments, watermarks for synthetic data, the handling of hallucinated content and a specific process for supporting data subject rights in training datasets. SCCs are put in place for third-country transfers; sub-processor changes are announced 30 days in advance. DPIA support receives its own response windows. Measurable TOM criteria (no BYOD storage, copy/paste blockers, clipboard protection) move up into the audit scope.
Games live ops with an external cloud: telemetry data flows into an analytics pipeline. The IaaS provider is a sub-processor, as are an event processing service and an A/B testing tool. The DPA requires data minimization (only necessary events), separate pseudonyms, deletion propagation to sub-processors and clear A/B data retention. Since live ops is fast-moving, the DPA provides an express instruction window for hotfix changes to event schemas so that the purpose limitation is not "overtaken".
Typical mistakes – and how to avoid them
Undefined scope: "The processor may process data to fulfill the contract" describes neither processing operations nor purposes. This invites extensions through the back door. Remedy: a meaningful scope including a catalog of processing operations and a specimen matrix that sets the framework without contractually freezing a rigid technology stack.
Empty TOM keywords: “Encryption, logging, backup” without procedures and audit trails are not sufficient. A TOM annex specifies procedures (e.g. TLS versions, KMS operation, rotation cycles), controls (role assignment, recertification) and checks (recovery tests, pen test frequency).
Shadow sub-processors: agencies bring in "little helpers" without updating the register. Solution: general authorization with a register, thresholds for "critical", notification windows, an objection option and exit variants for mandatory changes.
Unrealistic audit rights: "anytime, unannounced, full system access" is not practical in scaling multi-tenant environments. A smart DPA couples remote audits, report recognition and slot audits with escalation levels for incidents.
Exit without portability: “Deletion after end” without export formats leads to lock-in. A portability section defines formats, deadlines, interfaces and cost logic so that data can actually be transferred.
Art. 26/28 confusion: joint controller arrangements are incorrectly labeled as DPAs. The result: incorrect information obligations, unclear responsibility for data subject rights, liability shifts. The distinction turns on purpose: whoever co-determines purposes is at the wheel, not in the passenger seat.
"Data protection later" in agile roll-outs: feature teams go live, the DPA follows later. That is risky. Contracts and TOM processes are enabling requirements up front – not subordinate formalities.
Copied templates without data reality: copy-and-pasting external clauses ignores real data flows, sub-processor chains and deletion logic. This takes its toll in an audit or after an incident. Practical guidelines therefore remind you to anchor the mandatory content in concrete terms.
Mini clause logic in words – without template aesthetics
A scope section specifies processing operations, purposes, data types, data subject categories and the operational systems. It contains a dynamic appendix “Processing activities” with versioning. An instruction section describes channels (ticket, portal, email signature), priorities, response windows and the obligation to object to manifestly unlawful instructions. The TOM section refers to an updatable TOM appendix with technical and organizational measures, test and review cycles; changes are documented and announced without having to renegotiate each time. The subprocessor section selects the general approval, provides a register, regulates announcement deadlines, objection and flow-down including audit cooperation. The audit section allows remote audits, recognizes ISO/SOC reports, maintains client and confidentiality protection and makes on-site an exception with lead time. The support section for data subject rights, DPIA and incidents contains SLA windows (e.g. 48 h for initial response to data protection incidents; 5 working days for data subject rights-related work). The data transfer section clearly classifies third country transfers and SCC and links transfer impact assessments to subprocessor changes. Portability and exit define export formats, test steps, deletion propagation and evidence. Liability/contractual penalties remain moderate and close to causality; they are not a substitute for good TOMs, but flank due diligence obligations.
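The SLA windows mentioned above (e.g. 48 h initial response for data protection incidents, 5 working days for data subject rights work) only bite if the deadlines are computed consistently. A minimal sketch of that arithmetic, assuming Mon–Fri working days and ignoring public holidays for brevity (both are simplifications; the example figures mirror the text, not a statutory requirement):

```python
from datetime import datetime, timedelta

def incident_response_deadline(reported_at: datetime) -> datetime:
    """48 h initial-response window for data protection incidents (example SLA)."""
    return reported_at + timedelta(hours=48)

def dsar_deadline(received_on: datetime, working_days: int = 5) -> datetime:
    """Add working days (Mon-Fri), skipping weekends; holidays ignored for brevity."""
    day = received_on
    added = 0
    while added < working_days:
        day += timedelta(days=1)
        if day.weekday() < 5:  # 0-4 = Mon-Fri
            added += 1
    return day

# Hypothetical: incident reported and DSAR received on Friday, 2024-03-01
print(incident_response_deadline(datetime(2024, 3, 1, 9, 0)))  # 2024-03-03 09:00
print(dsar_deadline(datetime(2024, 3, 1)))                      # 2024-03-08 00:00
```

Note how the two clocks differ: the incident window runs in calendar hours (a weekend does not pause a breach), while the data subject rights window runs in working days, which is exactly the kind of distinction an SLA section should make explicit.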
Interlocking with the main contract
The DPA does not exist in isolation, but refers to the main contract. Service descriptions and SLAs must reflect the data protection obligations: If 24/7 support is agreed, incident processes need 24/7 availability; if nearshore/offshore teams are working, the subprocessor register and transfer protection must reflect this setup. Price blocks take data protection costs (e.g. export costs or additional audit days) into account transparently instead of “hiding” them. Change processes provide for a data protection check (privacy by design/default) so that new features do not fail due to Art. 28. The result is a consistent approach that can be recognized on itmedialaw.com in the cloud and contract environment: data protection is seen as an integral part of the product performance, not as a disruptive add-on.
Operational implementation: Governance beats form
A signed DPA is just the beginning. Controllers check whether sufficient guarantees are in place before starting to use a service provider – not only after an incident. This includes assessing the TOMs, reviewing certificates and reports, risk scoring and documentation in the vendor register. This status is updated regularly during the term, and ad hoc in the event of major changes and incidents. The processor runs awareness programs, renews confidentiality commitments, controls access rights and maintains the sub-processor list. The common thread is verifiability: anyone who can show in three clicks which TOM version applied, when a sub-processor was added and when the last restore test succeeded will pass every audit – contractually and de facto. This governance line follows from the basic obligations of Art. 28 (1) and (3) GDPR and the verification logic set out therein.
Short answer for the search intent "DPA / data processing agreement sample"
What belongs in a DPA? A clear processing scope with duration, purposes, data types and data subjects; a documented instruction regime; robust TOMs in accordance with Art. 32; transparent sub-processor chains with authorization, notice periods and flow-down; audit and verification mechanisms; SLA-backed support for data subject rights, DPIA and incidents; exit and portability rules; deletion with verification; third-country safeguards; and consistent integration with the main contract. If you anchor these elements at the level of procedures, deadlines and evidence, you not only have "a DPA", but a verifiable data protection architecture. Practical checklists show the same core – the decisive factor is the translation into your own tech stack.
Conclusion
A good DPA is precise, auditable and operationally feasible. The mandatory content of Art. 28 GDPR forms the foundation; sub-processor control, audit mechanics, TOM versioning and portability make the contract suitable for everyday use. Samples help – the decisive factor remains the adaptation to real data flows, systems and responsibilities. For cloud, SaaS, games and AI projects, the DPA is both product and process law. Understanding it in this way reduces risks, speeds up audits and creates trust among customers, partners and regulators.
Recommendation: Check the specific data reality before concluding a contract and in the event of major changes – and seek advice.