Private AI use in the company

24. October 2025
in Law on the Internet, Data protection Law
Reading Time: 9 mins read

Private accounts on ChatGPT & Co. for corporate purposes are a gateway to data protection breaches, leaks of secrets and labor law conflicts; if you want to use AI in your company, you need clear prohibitions or a properly set up “secure enablement” with technical, contractual and behavioral rules.

Content
1. Why private AI accounts become a compliance risk in the corporate context
2. Legal framework: GDPR, GeschGehG, employee data, co-determination
3. Typical risk scenarios – and how they arise
4. Prohibit or allow in a controlled manner? – A governance model that works
4.1. Clear ban on the private use of AI for business purposes
4.2. “Secure enablement” – controlled authorization, but the right way
5. Model modules for guidelines, contracts and technology
5.1. Policy principle
5.2. Contract modules
5.3. Company agreement
5.4. Technical protective measures
5.5. Training & communication
6. Conclusion
6.1. Author: Marian Härtel

Why private AI accounts become a compliance risk in the corporate context

Many teams have long been working with AI assistants – often not via company licenses, but with private accounts. This is where the liability issues begin:

a) Loss of control over data
Depending on the provider, what happens to content entered into a prompt can no longer be fully controlled. Without a contractually secured opt-out from training or clear deletion periods, neither the principle of purpose limitation (Art. 5 para. 1 lit. b GDPR) nor storage limitation (Art. 5 para. 1 lit. e GDPR) can be reliably demonstrated. Accountability under Art. 5 para. 2 GDPR fails in practice if entries are made via private accounts for which there are no logs, no binding guidelines and no data processing agreements (Art. 28 GDPR) (EUR-Lex).

b) Unlawful international data transfers
Many AI providers process data outside the EU. Without a reliable transfer basis in accordance with Art. 44 et seq. GDPR, there is a risk of fines. Although the EU-US Data Privacy Framework is (once again) a viable adequacy decision, it only applies to certified US companies – and only if they are correctly integrated. Private use circumvents any transfer due diligence by the company.

c) Trade secrets in the protection gap
Trade secrets are only protected if “appropriate confidentiality measures” have been taken (Section 2 No. 1 b GeschGehG). Tolerating private AI channels counteracts precisely these measures: there is no contractually secured confidentiality standard, no technical access control and no audit trail. In the event of a dispute, protection falls away – with considerable consequences for any follow-on claims.

d) Pitfalls under employment and works constitution law
As soon as the use of AI tools is controlled, monitored or evaluated, the works council is usually involved: Section 87 (1) no. 6 BetrVG (technical equipment for monitoring behavior/performance) regularly imposes a co-determination obligation here – regardless of the intention, because the objective suitability for monitoring is sufficient.

e) Liability for incorrect content and rights chains
Hallucinated facts, license ambiguities in generated code or images and unauthorized use of confidential information can trigger contractual and tortious liability. Without approval processes and source documentation, it is difficult to prove that work/services have been provided with due care.

Interim conclusion: Private AI accounts are convenient from an organizational point of view, but legally they amount to flying blind: no one knows what data goes where, who accesses it, how long it is stored – or whether its use is compatible with the GDPR, the GeschGehG or the company’s own confidentiality architecture.

Legal framework: GDPR, GeschGehG, employee data, co-determination

a) GDPR obligations of the controller

  • Legal basis & purpose (Art. 5, 6 GDPR): Corporate data processing needs a viable legal basis and a clear purpose. Private accounts are not subject to this control.
  • Special data (Art. 9 GDPR): Even harmless prompts can contain health, trade union or biometric data. There is no protection architecture for private use – a categorical exclusion of such content is not reliable without technical control.
  • Processing on behalf of the controller (Art. 28 GDPR): If an external provider is used, a data processing agreement is mandatory – with minimum content (subject matter, duration, type of data, TOMs, etc.). With private accounts, there is no effective contract between controller and provider.
  • Security (Art. 32 GDPR) and DPIA (Art. 35 GDPR): Depending on the process and risk, a data protection impact assessment is required; in any case, appropriate TOMs must be implemented – technically not possible if employees use private tools without control.
  • International transfers (Art. 44 ff. GDPR): Without company-level governance (DPF certification, SCCs, TIA), there is no compliant transfer package.

b) Employee data protection
Specific requirements apply to employee data. Section 26 BDSG has been interpreted more narrowly in some respects since the ECJ ruling in case C-34/21; the general legal bases of the GDPR must often be applied instead. For private AI use, this means that consent given in the employment relationship is only voluntary to a limited extent; legitimate interest (Art. 6 para. 1 lit. f GDPR) requires careful balancing and technical protective measures.

c) Trade secrets
Protection under GeschGehG requires proactive measures: policies, training, access restrictions, technical barriers. Private AI channels undermine these elements. Anyone who tolerates private use considerably weakens their own claim position (Sections 2, 3 GeschGehG; willful violations may result in criminal prosecution, Section 23 GeschGehG).

d) Co-determination under the BetrVG
The introduction and use of AI tools, logging, proxy blocks or DLP rules are typically subject to co-determination (Section 87 (1) No. 1, 6 BetrVG). Without a works agreement, both bans and “permission with conditions” can be challenged.

e) EU AI Act (outlook)
The AI Act regulates the obligations of providers, distributors and users (“deployers”) of high-risk AI. The first prohibitions have been in force since February 2025; obligations for general-purpose AI and further requirements follow in stages from August 2025 and 2026. For companies, this means that processes for model labeling, risk assessment, logging and incident handling will become standard – improvised private use does not fit into this compliance framework.

Practical anchor: The EDPB ChatGPT task force emphasizes transparency, legal bases, data accuracy and minimization – precisely the fields that are structurally undermined in private use.

Typical risk scenarios – and how they arise

Scenario 1: “Just a quick check”
An account manager copies customer data into a prompt to obtain a tonality check. Problem: personal reference, possibly special categories, no legal basis, unknown transfer paths. Result: violation of Art. 5, 6, 28, 32 GDPR; confidentiality at risk.

Scenario 2: Pitch concept with confidential figures
A creative director validates price sheets, margins and the product roadmap via a private AI account. This information regularly constitutes trade secrets. Without appropriate measures (Section 2 No. 1 b GeschGehG), protection lapses – the company undermines its own claims.

Scenario 3: Code snippets and Git links
A developer has code explained via a private tool and attaches Git links for contextualization. In addition to possible license/copyright risks, the link itself can reveal secrets (repo structure, branch names, tickets). Depending on the provider, meta/access data may be sent to third countries.

Scenario 4: HR texts with employee data
HR generates employment references via a private account and feeds in internal performance data. Employee data is subject to strict rules; consent in the employment relationship is problematic, especially if it is not clear where the data ends up.

Scenario 5: Monitoring “by mistake”
IT tries to prevent private use, but activates proxy logging without a works agreement, which records entries. This is a technical device within the meaning of Section 87 (1) No. 6 BetrVG – problematic without co-determination.

Scenario 6: False statements in the customer project
A privately used AI tool hallucinates technical content. Without a documented source/review obligation and without versioning, diligence cannot be proven; contractual liability risks escalate.

Prohibit or allow in a controlled manner? – A governance model that works

There are two viable options: (A) a clear ban with technical enforcement or (B) “secure enablement” via approved company accounts. Mixed forms create friction.

Clear ban on the private use of AI for business purposes

Objectives: Protection of personal data, protection of business secrets, compliance with co-determination and contractual chains.

Building blocks:

  1. Policy: General ban on the use of private AI accounts for company purposes; ban on entering personal data, customer data, source code, confidential documents and non-public roadmaps into external tools. Reference to Art. 5, 6, 28, 32, 44 et seq. GDPR and Section 2 No. 1 b GeschGehG.
  2. Technology: DNS/proxy blocks for known AI domains; DLP rules (copy-paste blocking for sensitive classes); secrets scanners; browser policies; mobile device management (a minimal egress-check sketch follows after this list).
  3. Organization: training with negative/positive examples, whistleblower interface for incidents; defined approval process for exceptions.
  4. Labor law: Enforcement via the right of direction (§ 106 GewO) plus contractual clauses; coordinated with the works council (works agreement under § 87 para. 1 nos. 1, 6 BetrVG).
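
For illustration only – a minimal sketch in Python of the kind of egress check a proxy or DLP rule enforces, assuming hypothetical domain lists; in a real deployment this logic lives in the proxy, DNS filter or DLP tool, not in application code.

```python
# Minimal egress-check sketch: block known public AI endpoints, allow only the
# approved enterprise tenant. Domain names below are placeholders (assumptions).
from urllib.parse import urlparse

BLOCKED_AI_DOMAINS = {"chat.example-ai.com", "api.example-ai.com"}   # hypothetical public endpoints
ALLOWED_ENTERPRISE_DOMAINS = {"ai.internal.example.com"}             # hypothetical approved tenant

def egress_allowed(url: str) -> bool:
    """Return True if a request may leave the corporate network."""
    host = urlparse(url).hostname or ""
    if host in ALLOWED_ENTERPRISE_DOMAINS:
        return True   # approved enterprise endpoint
    if host in BLOCKED_AI_DOMAINS:
        return False  # known public AI endpoint: block and log
    return True       # everything else falls through to the normal proxy policy

print(egress_allowed("https://chat.example-ai.com/prompt"))   # False
print(egress_allowed("https://ai.internal.example.com/v1"))   # True
```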

Pros & cons: A ban is legally secure and can be communicated quickly, but inhibits innovation and efficiency.

“Secure enablement” – controlled authorization, but the right way

Goals: Utilize productivity gains without sacrificing data protection and confidentiality.

Building blocks (minimum standard):

  1. Approved providers & licenses
    Only enterprise contracts with DPAs in accordance with Art. 28 GDPR, documented TOMs (Art. 32), opt-out from training, clear data residency, clear deletion periods and support SLAs. For US providers: DPF certification or SCCs + TIA (Art. 44 ff. GDPR).
  2. Identities & access
    SSO/MFA, role-based authorizations, tenant isolation, logging, key management; no private accounts.
  3. Use case catalog
    Permitted: generic text optimization without personal reference, boilerplates, code explanations with synthetic examples.
    Prohibited: Personal data, customer dossiers, confidential financial figures, unresolved IP assets, health data, company/trade secrets.
    Yellow zone (only with approval/DPIA): internal evaluations with pseudonymization, production-related prototypes.
  4. Prompt hygiene & output review
    Mandatory instructions against sharing sensitive content; red-flag list; dual-control approval for external use; source references and versioning (a red-flag check sketch follows after this list). EDPB guidance (transparency, accuracy) is thus anchored in the organization.
  5. Company agreement
    Rules on use, logging, purpose limitation, deletion periods, training, incident processes, co-determination; clear demarcation from performance/behavioral monitoring (no “micro-monitoring”).
  6. DPIA & risk register
    Pre-assessment (Art. 35 GDPR) for each sensitive use case; assignment of responsibilities; annual re-certification of providers.
  7. AI Act readiness
    “AI-supported” labeling, risk assessments, logging, data source transparency – tailored to the relevant obligations and transition periods.
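
To make the red-flag list from building block 4 tangible, here is a minimal pre-submission check in Python; the patterns and keywords are illustrative assumptions, not a complete DLP classifier.

```python
# Minimal "red flag" prompt check: refuse to send prompts that look like they
# contain personal data or secrets. Patterns and keywords are assumptions.
import re

RED_FLAG_PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "IBAN-like string": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"),
    "possible API key": re.compile(r"\b(?:sk|key|token)[-_][A-Za-z0-9]{16,}\b"),
}
RED_FLAG_KEYWORDS = ("confidential", "roadmap", "salary", "customer list")  # assumed list

def red_flags(prompt: str) -> list[str]:
    """Return the reasons why a prompt should not be sent to an external tool."""
    hits = [name for name, rx in RED_FLAG_PATTERNS.items() if rx.search(prompt)]
    hits += [f"keyword: {kw}" for kw in RED_FLAG_KEYWORDS if kw in prompt.lower()]
    return hits

# Example: both an e-mail address and an IBAN-like string are flagged.
print(red_flags("Rephrase the offer for max.mustermann@example.com, IBAN DE44500105175407324931"))
```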

Pros & cons: High security combined with usability, but implementation costs (technology, contracts, works agreement).

Model modules for guidelines, contracts and technology

Note: Formulations are intended as practical building blocks and must be adapted to the size of the company, sector, works council situation and existing policies.

Policy principle

  1. Scope and objective
    This policy regulates the business use of AI systems. Employees’ private accounts may not be used to process company information or personal data. The aim is to ensure compliance with data protection (in particular Art. 5, 6, 28, 32, 35, 44 ff. GDPR) and the protection of business secrets (Section 2 No. 1 b GeschGehG).
  2. Categorization of information
    Information is divided into the classes public, internal, confidential and strictly confidential. Entries into AI systems are only permitted for the “Public” and “Internal” classes, provided there is no personal reference; “Confidential” and “Strictly confidential” are generally excluded (a minimal gate sketch follows after this list).
  3. Prohibited content
    It is prohibited to enter personal data (including special categories within the meaning of Art. 9 GDPR), customer data, source code, passwords, access tokens, financial/price lists, roadmaps, internal legal documents or confidential third-party data into AI systems.
  4. Permitted use
    Permitted are generic formulation, structuring and ideation aids without personal reference, used via approved company licenses with a training opt-out.
  5. Approval procedure
    Use cases that are not covered require prior approval from data protection, information security and – where relevant – the works council (check DPIA obligation).
  6. Review and labeling
    Content created by AI is always reviewed by experts; external use is labeled where required by law or by contract.
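
The gate from item 2 can be expressed as a very small decision function – a sketch under the assumption that the information class and a personal-data flag are already known (e.g. from document labels); names are hypothetical.

```python
# Policy item 2 as code: only "public" or "internal" content without any
# personal reference may be entered into an AI system.
ALLOWED_CLASSES = {"public", "internal"}

def ai_entry_permitted(info_class: str, contains_personal_data: bool) -> bool:
    """Apply the classification rule from the policy."""
    return info_class.lower() in ALLOWED_CLASSES and not contains_personal_data

print(ai_entry_permitted("internal", contains_personal_data=False))      # True
print(ai_entry_permitted("confidential", contains_personal_data=False))  # False
```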

Contract modules

Data processing agreement (Art. 28 GDPR) – minimum points vis-à-vis the AI provider:

  • Subject matter/type/purpose of processing; categories of data/data subjects; duration.
  • TOMs (including encryption at rest and in transit, tenant separation, key management, role concepts, incident handling, sub-processor approval).
  • Sub-processors: list, pre-approval procedure, information obligations in the event of changes.
  • Data deletion/return: deadlines, formats, proof.
  • Audit and information rights; support with data subject rights, DPIA, notifications.
  • Third country transfers: DPF certification or SCCs + TIA, supplementary measures.

Tip: Many AI enterprise offerings provide a training opt-out, data residency options and zero-retention modes. Without these options, confidential data must not be used.

Company agreement

  1. Purpose and scope: Efficiency gains through defined AI applications, no performance/behavior profiling.
  2. Permitted tools/use cases: Whitelist, change management.
  3. Data protection/TOMs: logging scope, pseudonymization, deletion concept, access only for defined roles.
  4. Transparency/information: informing the workforce, documentation, training.
  5. Monitoring/reporting: Aggregated usage reporting, no individual monitoring; procedure for violations; incident management.
  6. Evaluation: Review after 12 months or in the event of legislative changes (take AI Act Roadmap into account).

Technical protective measures

  • Identities: SSO/MFA, conditional access, role-based approvals.
  • Data flow control: DLP rules in the browser/end device, clipboard control for sensitive classes, secret scanner in IDEs/repos.
  • Network: Proxy access only for whitelisted domains of the approved providers; blocking of known public AI endpoints.
  • Tenant protection: Separate tenants, key sovereignty; logging with data-minimizing pseudonymization (see the pseudonymization sketch after this list).
  • Sandboxing: Internal “AI sandboxes” with synthetic/depersonalized data for experiments.
  • Lifecycle: Version control for prompts/outputs, binding review checklists, archiving according to retention periods.
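
A sketch of the data-minimizing log pseudonymization mentioned above, assuming a keyed HMAC with a key held in a KMS; field names and the key handling are hypothetical.

```python
# Pseudonymize user identifiers in usage logs so that activity can be
# aggregated without storing clear names or e-mail addresses.
import hashlib
import hmac

PSEUDONYM_KEY = b"replace-with-key-from-your-kms"  # placeholder: store and rotate via a KMS

def pseudonymize(user_id: str) -> str:
    """Return a stable, non-reversible pseudonym for log entries."""
    return hmac.new(PSEUDONYM_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

log_entry = {
    "user": pseudonymize("m.mustermann@example.com"),  # hex pseudonym instead of the address
    "tool": "approved-ai-tenant",
    "use_case": "text_optimization",
}
print(log_entry)
```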

Training & communication

  • Case studies instead of walls of legalese: what is allowed in prompts – and what isn't?
  • “Red flags”: personal references, customer lists, pricing models, source code, secret agreements, health information.
  • Alternative courses of action: internal templates, pseudonymization, synthetic dummies, secure enterprise models (a synthetic-dummy sketch follows after this list).
  • Reporting channels: low-threshold incident reporting (“wrong prompt”), a blame-free error culture – but clearly regulated remedial action.
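
As a minimal illustration of the “synthetic dummies” alternative, a fictitious record generator in Python; field names and value pools are invented and carry no real data.

```python
# Generate a synthetic dummy customer record to use in prompts instead of
# pasting real customer data. All values are fictitious.
import random

FIRST_NAMES = ["Alex", "Kim", "Sam", "Robin"]
CITIES = ["Berlin", "Hamburg", "Köln"]

def synthetic_customer() -> dict:
    """Return a fictitious customer record that is safe to use in prompts."""
    return {
        "name": f"{random.choice(FIRST_NAMES)} Example",
        "city": random.choice(CITIES),
        "customer_id": f"DEMO-{random.randint(1000, 9999)}",
    }

print(synthetic_customer())
```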

Conclusion

Allowing the private use of AI for corporate purposes creates a whole host of legal and security risks: missing data processing agreements, unclear third-country transfers, loss of confidentiality protection, conflicts under works constitution law and a lack of verifiable due diligence. Two approaches are viable: a consistent ban (with technical and training support) or controlled enablement via company licenses, clean contracts, TOMs, works agreements and clear use case limits. In both models, the same applies: operationalize data protection principles, actively shape trade secret protection and keep AI Act readiness in view – then productivity remains possible without collateral damage to compliance.

 

Author: Marian Härtel

Marian Härtel is a lawyer and specialist attorney for IT law with more than 25 years of experience as an entrepreneur and advisor in the fields of games, esports, blockchain, SaaS and artificial intelligence. In addition to IT law, his advisory focus includes in particular copyright law, media law and competition law. He primarily advises start-ups, agencies and influencers, whom he supports in strategic questions, complex contractual matters and investment projects. His advice is characterized by an interdisciplinary approach that combines legal expertise with many years of entrepreneurial experience. The aim of his work is always to offer clients practice-oriented solutions and to provide legally sound support in implementing innovative business models.
