Private AI use in the company

24. October 2025
in Law on the Internet, Data protection Law
Reading time: 9 minutes

Using private accounts on ChatGPT & Co. for corporate purposes is a gateway to data protection breaches, trade secret leaks and employment law conflicts. Anyone who wants to use AI in the company needs either clear prohibitions or a properly set up “secure enablement” with technical, contractual and behavioral rules.

Contents
1. Why private AI accounts become a compliance risk in the corporate context
2. Legal framework: GDPR, GeschGehG, employee data, co-determination
3. Typical risk scenarios – and how they arise
4. Prohibit or allow in a controlled manner? – A governance model that works
4.1. Clear ban on the private use of AI for business purposes
4.2. “Secure enablement” – controlled authorization, but the right way
5. Model modules for guidelines, contracts and technology
5.1. Policy principle
5.2. Contract modules
5.3. Company agreement
5.4. Technical protective measures
5.5. Training & communication
6. Conclusion

Why private AI accounts become a compliance risk in the corporate context

Many teams have long been working with AI assistants – often not via company licenses, but via private accounts. This is where the liability issues begin:

a) Loss of control over data
Depending on the provider, it is no longer possible to fully control what happens to content entered into a prompt. Without a contractually secured opt-out from training and clear deletion periods, neither the principle of purpose limitation (Art. 5 para. 1 lit. b GDPR) nor storage limitation (Art. 5 para. 1 lit. e GDPR) can be reliably demonstrated. Accountability under Art. 5 para. 2 GDPR fails in practice if entries are made via private accounts for which there are no logs, no binding guidelines and no data processing agreements (Art. 28 GDPR). (EUR-Lex)

b) Unlawful international data transfers
Many AI providers process data outside the EU. Without a reliable transfer basis in accordance with Art. 44 et seq. GDPR, there is a risk of fines. Although the EU-US Data Privacy Framework is (once again) a viable adequacy decision, it only applies to certified US companies – and only if they are correctly integrated. Private use circumvents any transfer due diligence by the company.

c) Trade secrets without safeguards
Trade secrets are only protected if “appropriate confidentiality measures” have been taken (Section 2 No. 1 b GeschGehG). Tolerating private AI channels counteracts precisely these measures: there is no contractually secured confidentiality standard, no technical access control and no audit trail. In the event of a dispute, protection is lost – with considerable consequences for any claims.

d) Pitfalls under employment and works constitution law
As soon as the use of AI tools is controlled, monitored or evaluated, the works council is usually involved: Section 87 (1) no. 6 BetrVG (technical equipment for monitoring behavior/performance) regularly imposes a co-determination obligation here – regardless of the intention, because the objective suitability for monitoring is sufficient.

e) Liability for incorrect content and rights chains
Hallucinated facts, license ambiguities in generated code or images and unauthorized use of confidential information can trigger contractual and tortious liability. Without approval processes and source documentation, it is difficult to prove that work/services have been provided with due care.

Interim conclusion: Private AI accounts are convenient from an organizational point of view, but legally they mean flying blind: no one knows what data goes where, who accesses it, how long it is stored – or whether its use is compatible with the GDPR, the GeschGehG or the company’s own confidentiality architecture.

Legal framework: GDPR, GeschGehG, employee data, co-determination

a) GDPR obligations of the controller

  • Legal basis & purpose (Art. 5, 6 GDPR): Corporate data processing needs a viable legal basis and a clear purpose. Private accounts are not subject to this control.
  • Special data (Art. 9 GDPR): Even harmless prompts can contain health, trade union or biometric data. There is no protection architecture for private use – a categorical exclusion of such content is not reliable without technical control.
  • Order processing (Art. 28 GDPR): If an external provider is used, a data processing agreement is mandatory – with minimum content (subject matter, duration, type of data, TOMs, etc.). With private accounts, there is no effective contract between the controller and provider.
  • Security (Art. 32 GDPR) and DPIA (Art. 35 GDPR): Depending on the process and risk, a data protection impact assessment is required; in any case, appropriate TOMs must be implemented – which is technically impossible if employees use private tools without any control.
  • International transfers (Art. 44 et seq. GDPR): Without corporate governance (DPF certification, standard contractual clauses, transfer impact assessment), there is no viable transfer compliance package.

b) Employee data protection
Specific requirements apply to employee data. Section 26 BDSG is interpreted more narrowly in some cases following the ECJ ruling C-34/21; the general legal bases of the GDPR must often be applied instead. For private AI use, this means that consent in the employment relationship is only voluntary to a limited extent; legitimate interest (Art. 6 para. 1 lit. f GDPR) requires careful balancing and technical protective measures.

c) Trade secrets
Protection under GeschGehG requires proactive measures: policies, training, access restrictions, technical barriers. Private AI channels undermine these elements. Anyone who tolerates private use considerably weakens their own claim position (Sections 2, 3 GeschGehG; willful violations may result in criminal prosecution, Section 23 GeschGehG).

d) Co-determination under the BetrVG
The introduction and use of AI tools, logging, proxy blocks or DLP rules are typically subject to co-determination (Section 87 (1) Nos. 1 and 6 BetrVG). Without a works agreement, both bans and “permission with conditions” can be challenged.

e) EU AI Act (outlook)
The AI Act regulates the obligations of providers, distributors and users (“deployers”) of high-risk AI. The first prohibitions have been in force since February 2025; obligations for general-purpose AI and further requirements will take effect in stages from August 2025/2026. For companies, this means that processes for model labeling, risk assessment, logging and incident handling will become standard – improvised private use does not fit into this compliance framework.

Practical anchor: The EDPB ChatGPT task force emphasizes transparency, legal bases, data accuracy and minimization – precisely the fields that are structurally undermined in private use.

Typical risk scenarios – and how they arise

Scenario 1: “Just a quick check”
An account manager copies customer data into a prompt to obtain a tonality check. Problem: personal references, possibly special categories of data, no data protection basis, unknown transfer paths. Result: violation of Art. 5, 6, 28 and 32 GDPR; confidentiality at risk.

Scenario 2: Pitch concept with confidential figures
A creative director validates price sheets, margins and the product roadmap via a private AI account. This information regularly qualifies as trade secrets. Without appropriate measures (Section 2 No. 1 b GeschGehG), protection lapses – the company undermines its own claims.

Scenario 3: Code snippets and Git links
A developer has code explained via a private tool and attaches Git links for contextualization. In addition to possible license/copyright risks, the link itself can reveal secrets (repo structure, branch names, tickets). Depending on the provider, meta/access data may be sent to third countries.

Scenario 4: HR texts with employee data
HR generates employment references via a private account and feeds in internal performance data. Employee data is subject to strict rules; consent in the employment relationship is problematic, especially if it is not clear where the data ends up.

Scenario 5: Monitoring “by mistake”
IT tries to prevent private use but activates proxy logging that records entries without a works agreement. This is a technical device within the meaning of Section 87 (1) No. 6 BetrVG – tricky without co-determination.

Scenario 6: False statements in the customer project
A privately used AI tool hallucinates technical content. Without a documented source/review obligation and without versioning, diligence cannot be proven; contractual liability risks escalate.

Prohibit or allow in a controlled manner? – A governance model that works

There are two viable options: (A) a clear ban with technical enforcement or (B) “secure enablement” via approved company accounts. Mixed forms create friction.

Clear ban on the private use of AI for business purposes

Objectives: Protection of personal data, protection of business secrets, compliance with co-determination and contractual chains.

Building blocks:

  1. Policy: General ban on the use of private AI accounts for company purposes; ban on entering personal data, customer data, source code, confidential documents and non-public roadmaps into external tools. Reference to Art. 5, 6, 28, 32, 44 et seq. GDPR and Section 2 No. 1 b GeschGehG.
  2. Technology: DNS/proxy blocks for known AI domains; DLP rules (copy-paste blocking of sensitive classes); secrets scanner; browser policies; mobile device management – a minimal pre-filter sketch follows this list.
  3. Organization: training with negative/positive examples; reporting channel for incidents; defined approval process for exceptions.
  4. Labor law: enforcement via the right of direction (Section 106 GewO) plus contractual clauses; coordinated with the works council (works agreement pursuant to Section 87 (1) Nos. 1 and 6 BetrVG).
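
To make building block 2 more tangible, here is a minimal sketch of what a DLP-style prompt pre-filter could look like. The patterns and data classes are purely illustrative assumptions – a production rule set would be far broader, maintained by information security and enforced at the proxy or endpoint level rather than in application code.

```python
import re

# Illustrative patterns only; a real DLP rule set would be far broader
# and maintained centrally by information security.
BLOCK_PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}(?:\s?[A-Z0-9]{4}){3,8}\b"),
    "API key / secret": re.compile(r"(?i)\b(api[_-]?key|secret|token|passwor[dt])\b\s*[:=]"),
    "phone number (+49)": re.compile(r"\+49[\s\d/()-]{6,}"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the names of all blocked data classes found in the prompt."""
    return [name for name, pattern in BLOCK_PATTERNS.items() if pattern.search(prompt)]

if __name__ == "__main__":
    hits = check_prompt("Tonality check please: max.mustermann@example.com, +49 30 1234567")
    if hits:
        print("Prompt blocked; contains:", ", ".join(hits))
```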

Pros & cons: A ban is legally secure and can be communicated quickly, but inhibits innovation and efficiency.

“Secure enablement” – controlled authorization, but the right way

Goals: Utilize productivity gains without sacrificing data protection and confidentiality.

Building blocks (minimum standard):

  1. Approved providers & licenses
    Only enterprise contracts with DPAs in accordance with Art. 28 GDPR, documented TOMs (Art. 32), opt-out from training, clear data residency, clear deletion periods and support SLAs. For US providers: DPF certification or SCCs + TIA (Art. 44 ff. GDPR).
  2. Identities & access
    SSO/MFA, role-based authorizations, tenant isolation, logging, key management; no private accounts.
  3. Use case catalog
    Permitted: generic text optimization without personal reference, boilerplates, code explanations with synthetic examples.
    Prohibited: Personal data, customer dossiers, confidential financial figures, unresolved IP assets, health data, company/trade secrets.
    Yellow zone (only with approval/DPIA): internal evaluations with pseudonymization, production-related prototypes.
  4. Prompt hygiene & output review
    Mandatory instructions against sharing sensitive content; red flag list; dual-control approval for external use; source references and versioning. The EDPB guidance (transparency, accuracy) is thus anchored in the organization. A minimal gating sketch follows this list.
  5. Company agreement
    Rules on use, logging, purpose limitation, deletion periods, training, incident processes, co-determination; clear demarcation from performance/behavioral monitoring (no “micro-monitoring”).
  6. DPIA & risk register
    Pre-assessment (Art. 35 GDPR) for each sensitive use case; assignment of responsibilities; annual re-certification of providers.
  7. AI Act readiness
    “AI-supported” labeling, risk assessments, logging, data source transparency – tailored to the relevant obligations and transition periods.
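
To illustrate points 2 and 4, the sketch below shows how requests to an approved enterprise model could be gated and logged: every prompt passes the pre-filter sketched above, and only pseudonymized metadata – not the prompt content – is written to the log. The module name dlp_prefilter and the enterprise_client object with its complete() method are hypothetical placeholders for whatever the approved provider’s SDK actually offers.

```python
import hashlib
import logging
from datetime import datetime, timezone

# Hypothetical module name: the DLP pre-filter sketched in the ban model above.
from dlp_prefilter import check_prompt

logger = logging.getLogger("ai_usage")

def pseudonymize(user_id: str) -> str:
    """Data-minimizing log key: a truncated hash instead of a clear-text user ID."""
    return hashlib.sha256(user_id.encode("utf-8")).hexdigest()[:16]

def ask_approved_model(prompt: str, user_id: str, enterprise_client) -> str:
    """Gate every request through the DLP pre-filter, then log only metadata."""
    hits = check_prompt(prompt)
    if hits:
        raise ValueError(f"Prompt rejected; blocked data classes: {hits}")

    # Hypothetical call into whatever SDK the approved enterprise provider offers.
    answer = enterprise_client.complete(prompt)

    # Log who used the tool and when, but never the prompt content itself,
    # to stay clear of individual performance/behavior monitoring (Section 87 BetrVG).
    logger.info(
        "ai_request user=%s at=%s prompt_chars=%d",
        pseudonymize(user_id),
        datetime.now(timezone.utc).isoformat(),
        len(prompt),
    )
    return answer
```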

Pros & cons: High security while remaining usable, but with implementation costs (technology, contracts, works agreement).

Model modules for guidelines, contracts and technology

Note: Formulations are intended as practical building blocks and must be adapted to the size of the company, sector, works council situation and existing policies.

Policy principle

  1. Scope and objective
    This policy regulates the business use of AI systems. Employees’ private accounts may not be used to process company information or personal data. The aim is to ensure compliance with data protection (in particular Art. 5, 6, 28, 32, 35, 44 ff. GDPR) and the protection of business secrets (Section 2 No. 1 b GeschGehG).
  2. Categorization of information
    Information is divided into the classes public, internal, confidential and strictly confidential. Entries into AI systems are only permitted for the “Public” and “Internal” classes, provided they contain no personal references; “Confidential” and “Strictly confidential” are generally excluded (see the gate sketch after this list).
  3. Prohibited content
    It is prohibited to enter personal data (including special categories within the meaning of Art. 9 GDPR), customer data, source code, passwords, access tokens, financial/price lists, roadmaps, internal legal documents or confidential third-party data into AI systems.
  4. Permitted use
    Permitted are generic formulation, structuring and ideation aids without personal references, using approved company licenses with a training opt-out.
  5. Approval procedure
    Use cases that are not covered require prior approval from data protection, information security and – where relevant – the works council (check DPIA obligation).
  6. Review and labeling
    Content created by AI is always reviewed by qualified staff; external use is labeled where this is required by law or contractually agreed.
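
The classification rule in points 2 and 3 can be expressed as a simple gate. This is a minimal sketch that assumes the information class and the presence of personal references have already been determined (by the user or a DLP tool); the class names mirror the policy above.

```python
from enum import Enum

class InfoClass(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    STRICTLY_CONFIDENTIAL = 4

def may_enter_into_ai(info_class: InfoClass, contains_personal_data: bool) -> bool:
    """Policy rule 2: only 'Public' and 'Internal', and only without personal references."""
    if contains_personal_data:
        return False
    return info_class in (InfoClass.PUBLIC, InfoClass.INTERNAL)

# Quick self-check of the rule.
assert may_enter_into_ai(InfoClass.INTERNAL, contains_personal_data=False)
assert not may_enter_into_ai(InfoClass.INTERNAL, contains_personal_data=True)
assert not may_enter_into_ai(InfoClass.CONFIDENTIAL, contains_personal_data=False)
```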

Contract modules

Data processing agreement (Art. 28 GDPR) – minimum points vis-à-vis the AI provider:

  • Subject matter/type/purpose of processing; categories of data/data subjects; duration.
  • TOMs (including encryption at rest/transport, client separation, key management, role models, incident handling, sub-processor approval).
  • Sub-processors: list, pre-approval procedure, information obligations in the event of changes.
  • Data deletion/return: deadlines, formats, proof.
  • Audit and information rights; support with data subject rights, DPIA, notifications.
  • Third country transfers: DPF certification or SCCs + TIA, supplementary measures.

Tip: Many AI enterprise offerings provide a training opt-out, data residency and zero-retention modes. Confidential data must not be processed without these options.

Company agreement

  1. Purpose and scope: Efficiency gains through defined AI applications; no performance/behavior profiling.
  2. Permitted tools/use cases: Whitelist, change management.
  3. Data protection/TOMs: logging scope, pseudonymization, deletion concept, access only for defined roles.
  4. Transparency/information: informing the workforce, documentation, training.
  5. Monitoring/reporting: Aggregated usage reporting, no individual monitoring; procedure for violations; incident management.
  6. Evaluation: Review after 12 months or in the event of legislative changes (take AI Act Roadmap into account).

Technical protective measures

  • Identities: SSO/MFA, conditional access, role-based approvals.
  • Data flow control: DLP rules in the browser/end device, clipboard control for sensitive classes, secret scanner in IDEs/repos.
  • Network: Proxy access only for whitelisted domains of approved providers; block known public AI endpoints.
  • Client protection: Separate tenants, key sovereignty; logging with data-minimizing pseudonymization.
  • Sandboxing: Internal “AI sandboxes” with synthetic/depersonalized data for experiments.
  • Lifecycle: Version control for prompts/outputs, binding review checklists, archiving in line with retention periods – a minimal audit-log sketch follows this list.
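
For the lifecycle bullet, the sketch below shows a minimal append-only audit record per AI interaction. Storing only hashes of prompt and output is an assumption made here to keep the log data-minimizing; whether full text may be retained, and for how long, depends on the deletion concept agreed in the works agreement.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

AUDIT_FILE = Path("ai_audit_log.jsonl")  # illustrative location; use a protected store in practice

def record_interaction(prompt: str, output: str, reviewer: str, approved: bool) -> None:
    """Append one versioned, reviewable record per AI interaction (hashes only)."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prompt_sha256": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "output_sha256": hashlib.sha256(output.encode("utf-8")).hexdigest(),
        "reviewer": reviewer,
        "approved": approved,
    }
    with AUDIT_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```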

Training & communication

  • Case studies instead of walls of legalese: what is allowed in prompts – and what isn’t?
  • “Red flags”: personal references, customer lists, pricing models, source code, secret agreements, health information.
  • Alternative courses of action: Internal templates, pseudonymization, synthetic dummies, secure enterprise models.
  • Reporting channels: Low-threshold incident reporting (“false prompt”), an error culture free of fear – but clearly regulated remedial action.

Conclusion

Allowing the private use of AI for corporate purposes creates a whole host of legal and security risks: missing data processing agreements, unclear third-country transfers, loss of confidentiality, conflicts under works constitution law and a lack of verifiable due diligence. Two approaches are viable: a consistent ban (with technical and training support) or controlled enablement via company licenses, clean contracts, TOMs, works agreements and clear use-case limits. In both models, the same principles apply: operationalize data protection, actively shape trade secret protection and keep AI Act readiness in view – then productivity remains possible without collateral damage to compliance.

 

Author: Marian Härtel

Marian Härtel is a lawyer and certified specialist in IT law with more than 25 years of experience as an entrepreneur and consultant in the fields of games, esports, blockchain, SaaS and artificial intelligence. In addition to IT law, his advisory focus is on copyright, media law and competition law. He primarily advises start-ups, agencies and influencers, supporting them on strategic questions, complex contractual matters and investment projects. His advice is characterized by an interdisciplinary approach that combines legal expertise with many years of entrepreneurial experience. The aim of his work is always to offer clients practice-oriented solutions and to provide legally sound support in implementing innovative business models.
