IT Law and Data Protection - Newsletter, 14 December 2023

Focus IT&C – 4th Quarter 2023

Find out what our IT law & data protection practice group has been working on - now regularly summarised for you at a glance! On a quarterly basis, we present the most important developments in IT law and data protection. In addition to informing you of the latest draft laws and developments in the field, we advise you on classic IT law, data protection law and new media. Please also feel free to contact us for audits, IT project support and consulting, including cloud computing, e-commerce topics and social media issues.

 

1. Milestone in AI regulation: EU reaches agreement on AI Act

2. Artificial intelligence in the workplace

3. ECJ on distance-selling subscriptions: consumers can only revoke contract once

4. CJEU on Art. 83 GDPR: a violation of the GDPR alone is not sufficient for a fine

5. EU Data Act adopted

6. IT security update

7. Draft German Digital Services Act: new regulations for digital intermediary services

8. CJEU judgement on Schufa scoring

 

 

1. Milestone in AI regulation: EU reaches agreement on AI Act

Following tough negotiations, the EU reached political agreement on the Artificial Intelligence Act ("AI Act") on the night of 8 December 2023. Depending on the date of its formal adoption, its requirements are expected to apply from 2026. As the world's first comprehensive AI law, the AI Act could set a global standard in the regulation of AI. It will bring greater clarity on the requirements for using AI systems, although it still leaves many uncertainties.

1. Negotiations on the AI Act

During the negotiations on the AI Act, the topics that remained highly controversial to the very end were the EU-wide ban on certain AI systems, the categorisation of AI systems into different risk groups and the increased minimum standards for high-risk AI. The regulation of foundation models, such as generative pre-trained transformer (GPT) models, was particularly controversial. Only time will tell to what extent the EU has succeeded in striking a balance between increasing confidence in the control of AI systems on the one hand and preventing overregulation detrimental to the development of AI on the other.

2. Who and what does the AI Act affect?

The AI Act regulates the market launch, commissioning and use of AI systems within the EU and imposes certain obligations on providers, manufacturers and distributors of AI systems. The Act therefore also applies to companies based outside the EU if they provide AI systems on the EU market. The greater the risk potential of the AI system, the stricter the requirements to be complied with (risk-based approach). The Act focuses on the regulation of so-called high-risk AI systems.

AI systems that are used exclusively for military or defence purposes, research or innovation purposes or for private purposes are excluded from the scope of application.

Texts, images and sounds created with the help of AI must be labelled as such. 

3. How can companies prepare for the AI Act now?

Companies should start bringing the implementation and operation of their AI systems into line with the AI Act now. Otherwise, they risk incurring high costs to retrofit their systems for legal compliance, or even having to shut them down, once the AI Act comes into force.

(a) Risk assessment

Every company should carry out and document a risk assessment of the AI systems used. The categorisation of the AI system as (i) AI with an unacceptable risk, (ii) high-risk AI or (iii) low/minimal-risk AI is decisive for the specific obligations; a short illustrative sketch follows the three categories below.

(i) AI with an unacceptable risk

AI systems that pose an unacceptable risk may not be placed on the market or used. Such an unacceptable risk exists in the case of systems that subliminally manipulate human behaviour to people's detriment or exploit the weaknesses of vulnerable persons. AI systems that classify people according to their trustworthiness, known as social scoring, may likewise not be marketed or used.

(ii) High-risk AI

High-risk AI systems are systems that pose a high risk to the fundamental rights, health or safety of natural persons. These include, first, systems that are intended to serve as safety components of products and are subject to corresponding certification obligations. In addition, the AI Act contains a list of AI systems that are always considered to be high-risk AI. These include, for example, systems in the law enforcement or justice sector and for biometric categorisation.

(iii) AI with a low/minimal risk

The category of AI with a low/minimal risk includes systems whose use neither leads to an unacceptable risk nor can be categorised as high-risk AI. Chatbots, for example, fall into this category.
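For documentation purposes, the outcome of this triage can be recorded in a structured way. The following minimal sketch (in Python) merely illustrates the three-tier logic described above; the trait labels are hypothetical stand-ins, and the AI Act's actual criteria are far more nuanced and require legal assessment in each individual case:

    from enum import Enum

    class RiskTier(Enum):
        UNACCEPTABLE = "prohibited - may not be placed on the market or used"
        HIGH = "permitted subject to the strict high-risk requirements"
        LOW_MINIMAL = "permitted subject to general transparency obligations"

    # Hypothetical trigger lists for illustration only - the AI Act's
    # categories are far more detailed and call for individual assessment.
    PROHIBITED_TRAITS = {"subliminal manipulation", "exploits vulnerable persons",
                         "social scoring"}
    HIGH_RISK_TRAITS = {"safety component", "law enforcement", "justice",
                        "biometric categorisation"}

    def triage(traits: set[str]) -> RiskTier:
        """Assign an AI system to a risk tier based on flagged traits."""
        if traits & PROHIBITED_TRAITS:
            return RiskTier.UNACCEPTABLE
        if traits & HIGH_RISK_TRAITS:
            return RiskTier.HIGH
        return RiskTier.LOW_MINIMAL

    # Example: a customer-service chatbot with no flagged traits
    print(triage({"chatbot"}))         # RiskTier.LOW_MINIMAL
    print(triage({"social scoring"}))  # RiskTier.UNACCEPTABLE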

(b) Transparency, documentation and information obligations

For all systems, the AI Act requires compliance with transparency, documentation and information obligations. To fulfil these obligations, developers should document the development and functionality of their systems in detail. The training data used and the results generated with it also need to be documented.

(c) Establishment of control mechanisms

Companies should establish responsibilities and procedures for the regular monitoring of AI systems and offer training for employees. The AI Act obliges companies to continuously monitor AI, as many systems are constantly accessing new data and evolving.

4. Conclusion

The EU has agreed on the AI Act and is making companies responsible for the use of AI systems. Violations of its provisions can be penalised with substantial fines of up to 40 million euros or up to seven percent of total global turnover in the preceding financial year. Implementing the individual obligations arising from the AI Act should therefore not be put on the back burner. Companies can already take action now.
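To make the order of magnitude tangible, here is a minimal arithmetic sketch. It assumes that the "whichever is higher" mechanism familiar from Art. 83 GDPR applies to the figures cited above; the final wording of the AI Act may differ:

    def max_ai_act_fine(global_turnover_eur: float) -> float:
        """Upper end of the fine corridor cited above: the higher of
        EUR 40 million or 7% of the preceding financial year's total
        global turnover (assumed GDPR-style mechanism)."""
        return max(40_000_000, 0.07 * global_turnover_eur)

    # A company with EUR 1 billion in global turnover:
    # 7% = EUR 70 million, which exceeds the EUR 40 million threshold.
    print(f"{max_ai_act_fine(1_000_000_000):,.0f}")  # 70,000,000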

We would be pleased to assist you in this respect.

Dr. Axel Grätz


2. Artificial intelligence in the workplace

1. AI in our daily work: a problem child or a beacon of hope?

According to a recent study by the digital association Bitkom, artificial intelligence ("AI") is being used by 15 percent of German companies - an increase of six percentage points compared to last year. Expectations of AI are also rising significantly: a good two thirds of the companies surveyed consider AI to be the most important technology of the future. AI can now be used not only to increase the efficiency of robotics on production lines or to optimise the distribution of stocks in logistics centres. Employees are using AI-supported applications to revolutionise the marketing of entire companies. Standardised letters are being drafted by ChatGPT. Presentations created with the help of AI can even be used in reports to the management board. AI also affects individual job candidates: with active sourcing, employers can search social networks for suitable candidates for job advertisements and subsequently contact them. The potential of AI in the workplace is therefore enormous.

But why do comparatively few German companies use AI today? On the one hand, this is doubtless because concrete practical AI applications are only being developed slowly. Above all, however, companies are put off by the unclear legal situation surrounding AI and the associated liability and reputational risks. This is hardly surprising: the intelligent helpers require large amounts of data in order to operate, and as soon as this data relates to a person, data protection law applies. There are also imminent regulatory projects specifically for AI: in addition to the European AI Regulation, the Machinery Regulation will reassess AI. Companies that already use AI or are planning to do so should start preparing for the entry into force of these regulations today. They otherwise face the risk of their internal processes suddenly no longer being legally compliant.

Nevertheless, the legally compliant use of AI to assist employees in the company is possible. Below, we outline some of the implications that need to be considered.

2. Implications under data protection law

Data is often referred to as the most valuable raw material of our time. AI, too, lives on big data, starting with the training of the systems and continuing through to their deployment. This data is often personal and is therefore subject to data protection law. The employer is responsible for complying with the applicable requirements.

2.1

Since the ECJ declared the Hessian federal state law equivalent of Section 26 (1) sentence 1 of the Federal Data Protection Act (Bundesdatenschutzgesetz, BDSG) to be contrary to European law (ECJ, judgement of 30 March 2023 - C-34/21), the European legal bases of the GDPR have to be applied in the employment context as well.

Besides the fundamental problem that the voluntariness of consent is questionable in many cases where employees' personal data is processed, AI causes further friction here. Consent must be based on an informed decision, so the employer has to provide the information required by Art. 13 GDPR. This is particularly difficult in the case of black box models, the functioning of which even experts do not yet fully understand. In addition, consent must be freely revocable, and it must consequently also be possible to implement this revocation. As most AI models used today are based on artificial neural networks and the Hebbian learning rule, the data pool used for training has long-lasting effects on the results once the system is in operation. Pseudonymisation and anonymisation techniques in particular can be of great help here.

It is not uncommon for a balancing of interests (Art. 6 (1) sentence 1 lit. f GDPR) to ultimately decide whether the use of AI in a company's data processing is permissible under data protection law. When selecting applicants using active sourcing, the choice of search platform can tip the balance in favour of the job advertiser: members of a network with a professional context are more likely to be interested in being approached about new jobs than visitors to platforms without such a context. Incidentally, Art. 6 (1) sentence 1 lit. b GDPR is of no further help with active sourcing, even though the approach can certainly be understood as a pre-contractual measure: this legal basis already fails because there is no request by the data subject.

2.2

AI is often considered autonomous, and hence it stands to reason that automated decision-making is involved. At a company, work orders can be assigned to individual employees in a fully automated manner; the creation of work schedules can also be transferred to a machine. In terms of data protection law, such processes have to be measured against Art. 22 GDPR. Accordingly, decisions based solely on automated processing that have a legal effect on the data subject or similarly significantly affect them are generally not permitted. In this context, the automated distribution of driving orders to taxi drivers in the Netherlands was recently assessed on the basis of Art. 22 GDPR, as this decision has a direct impact on the drivers' income and therefore has legal effect (Gerechtshof Amsterdam of 4 April 2023 - 200.295.747/01). Against this background, in commission-based transactions, for example, work orders should not be awarded on the basis of automated decision-making, as the award has a direct legal effect. However, AI can be used in the preparatory phase of decision-making. Where to draw the line is a question to be answered in the individual case; in any event, processes applied to date need to be analysed and reviewed. Furthermore, the transparency requirements under data protection law are increased in the case of profiling: according to Art. 13 (2) lit. f GDPR, the data subjects must be provided with meaningful information about the logic involved. Hence, here as well, the lack of transparency of AI models affects data protection compliance.

3. Conclusion

It is clear that the biggest data protection pitfall in the implementation and operation of artificial intelligence systems at companies is their much-cited black box nature. However, light is slowly being shed on this issue too. Explainable AI is a separate field of IT research in which steady progress is being made. This makes it easier to implement the transparency requirements of data protection law and, at the same time, builds trust in and broader acceptance of the systems. Future regulatory projects such as the AI Regulation address precisely this point with a risk-based approach.

Early accompanying legal advice not only prevents liability risks, but also ensures that the integrated systems can be used in a legally compliant manner in the long term. Please feel free to contact us with your project.

Dr. Axel Grätz


3. ECJ on distance-selling subscriptions: consumers can only revoke contract once

In October of this year, the ECJ clarified that consumers are only entitled to a one-off right of revocation, even in the case of distance contracts that begin with a free trial subscription. The consumer therefore has no new right of revocation when the subscription subsequently becomes subject to a charge or is automatically renewed. Exceptionally, however, a renewed right of revocation arises if the trader has not informed the consumer of the total price of the service in a clear, comprehensible and explicit manner.

In its judgement of 5 October 2023 (C-565/22), the ECJ interpreted Art. 9 (1) of Directive 2011/83/EU (the "Consumer Rights Directive").

The specific case

The original proceedings centred on the general terms and conditions of an Internet learning platform for students. When consumers take out a subscription for the first time, according to the platform's terms and conditions they receive a 30-day free trial subscription. If the consumer does not cancel or revoke the subscription within this period, the trial subscription is converted into a paid subscription.

In the opinion of the Austrian Association for Consumer Information (Verein für Konsumenteninformation, "VKI"), both the conversion of the subscription into a paid subscription and the subsequent automatic renewal give rise to a new right of revocation within the meaning of Art. 9 (1) of the Directive. By failing to provide information about this new right of revocation, the platform was in breach of its notification obligations under Art. 6 (1) (e) in conjunction with Art. 8 (2) of the Directive.

The referring court sought clarification as to how the phrase "distance contract" within the meaning of Art. 9 (1) of the Directive should be understood. It asked whether the norm was to be interpreted as meaning that the consumer is entitled to a repeated right of revocation in case of a distance selling service that initially runs as a free trial subscription and then becomes subject to a charge or is automatically renewed after expiry of a certain period - in other words, whether the right of revocation arises anew with each conversion.

Decision of the ECJ

The ECJ ruled that the consumer is generally only entitled to a one-off right of revocation in the case of distance selling. By way of exception, the consumer is granted a new right of revocation only if the trader has not informed the consumer in a clear, comprehensible and explicit manner about the total price due after the free trial period.

The ECJ based its decision primarily on the meaning and purpose of the right of revocation. The purpose of the consumer's right of revocation is to compensate for the disadvantage resulting from a distance contract: the 14-day right of revocation gives the consumer a reasonable period of reflection in which to examine and try out the ordered service. This enables the consumer to gain knowledge of all the features of the service in good time and to make an informed decision about the conclusion of the contract and all the associated conditions. An essential factor in the decision as to whether the consumer wishes to enter into a contract with a company is the total price, about which the trader must provide information in accordance with Art. 6 (1) (e) in conjunction with Art. 8 (2) of the Directive. In the case of a free trial subscription, this includes information to the effect that the service will be subject to a charge after the initial free period. If the company informs the consumer of this in a clear, comprehensible and explicit manner at the conclusion of the contract, the purpose of the right of revocation is fulfilled: the end of the trial period does not change the contractual conditions that were brought to the consumer's attention, and the right of revocation can therefore no longer fulfil its purpose upon conversion.

The case must be assessed differently if the company has not properly fulfilled its duty to provide information. If the information provided by the company about the contractual terms and conditions differs fundamentally from the actual circumstances, the consumer has a renewed right of revocation.

Conclusion

With its judgement, the ECJ has once again emphasised the importance of providing transparent information at the time of contract conclusion about the total costs incurred. In the specific case, the referring court must now review whether the platform user was clearly, expressly and comprehensibly informed of the total price of the service offered.

Companies that offer distance selling trial subscriptions should review their contractual terms and conditions. If the contractual terms and conditions inform the consumer inadequately or even incorrectly about the costs incurred after the trial subscription converts into a paid subscription or the automatic renewal, the company must allow the assertion of a new right of revocation. In addition, the company concerned is obliged to notify the consumer anew about the right of revocation.

Dr. Hanna Schmidt


4. CJEU on Art. 83 GDPR: a violation of the GDPR alone is not sufficient for a fine

1. Key statements of the CJEU judgement

In its ruling of 5 December 2023 (C-807/21), the European Court of Justice (CJEU) clarified the conditions for imposing GDPR fines on legal entities. In particular, the court emphasised that

  • a company can be the direct addressee of an administrative fine, without proof of an offence by a management body being required first; and
  • however, a negligent or intentional violation is required, although no action by, or knowledge on the part of, a management body of the legal person is necessary.

2. Background

The CJEU proceedings were prompted by a fine imposed on Deutsche Wohnen SE by the Berlin data protection authority for a violation of data protection law. Deutsche Wohnen SE challenged the fine in court and succeeded before the Berlin Regional Court. The Berlin data protection authority appealed against this to the Berlin Court of Appeal, which referred questions to the CJEU for a preliminary ruling on the interpretation of Art. 83 (4) to (6) GDPR.

3. What does this mean for companies?

In the event of a GDPR violation, companies cannot argue that a fine against the company first requires proof of a (specific) violation by a company manager in accordance with Sections 130 and 30 of the Act on Regulatory Offences (Gesetz über Ordnungswidrigkeiten, OWiG). Rather, companies are liable in the event of a violation insofar as they are responsible for the processing of data carried out by them or on their behalf. Companies are liable not only for violations committed by their legal representatives, managers or directors, but also for violations committed by any other person acting within the scope of the company's business activities and on its behalf. The CJEU ruled that, insofar as Sections 130 and 30 OWiG establish additional substantive criteria for a fine, they contradict the higher-ranking provisions of Art. 83 GDPR.

However, the CJEU also emphasises that a fine under Art. 83 GDPR clearly requires that the company is at fault for the violation. This means that a fine cannot be imposed simply because the data protection authority identifies a violation; the authority also has to establish attributable intentional or negligent behaviour. According to the CJEU, the culpability required for fines under the GDPR does not have to meet particularly high requirements: companies can be expected to carry out appropriate monitoring and must be able to demonstrate the compliance of their processing activities with the GDPR. The court also emphasises that specific knowledge of the violation on the part of the management bodies is not necessarily required; rather, it is sufficient that the company could not have been unaware of the infringing nature of its conduct. The CJEU confirmed these requirements in another decision from the same day (C-683/21 - Lithuanian Covid app).

The CJEU's ruling highlights the need for companies to implement compliance measures in order to ensure that all employees behave in accordance with data protection regulations and to minimise the risk of administrative fines.

Dr. Jürgen Hartung and Patrick Schwarze


5. EU Data Act adopted

The European Union finally adopted the Data Act on 27 November 2023.

After the European Commission submitted the proposal for a regulation on harmonised rules for fair access to and fair use of data ("Data Act") on 23 February 2022, the European Parliament and the Council of the EU reached a provisional agreement on the content of the Data Act on 28 June 2023. The regulation was then approved by the European Parliament on 9 November 2023 and by the Council on 27 November 2023.

What is the Data Act?

The Data Act - just like the AI Act and the Digital Services Act - is part of the European data strategy. The Data Act aims to standardise the rules for using and exchanging data. The main objective is to ensure fair access to and fair use of data.

Important aspects of the Data Act:

  1. Applicability: The Data Act is primarily applicable to (networked) products and connected services that generate data, such as networked household appliances, voice assistants, industrial equipment and cars connected to the Internet, as well as software for fitness watches. Like the GDPR, the Data Act applies to non-European companies if they operate in the EU, in particular if the products and services are placed on the market in the EU or if the data recipients are located in the EU.
  2. Provision of data: Users have the right to receive the data generated by the product or service and, if necessary, to pass it on to third parties. This means that a company must provide not only the actual user data (e.g. analysed data from a fitness watch), but also all raw data. The company must provide information on the type and manner of provision before the contract is concluded. The data holder is also obliged to pass on the data concerned to other companies in return for appropriate consideration. However, these obligations do not apply to small and micro-enterprises.
  3. Interoperability and data access: Data processing providers (especially cloud or edge services) are obliged to ensure a simple switch of service provider by means of interfaces and compliance with certain interoperability standards. The provider must support the transition to the new system for a period of 30 days after the end of the contract. After the Data Act comes into force, only a reduced switching or data transfer fee may be charged for this service. Three years after the Data Act comes into force, the support services will even have to be provided free of charge.
  4. Data access rights of public authorities: Public authorities have the right to request the necessary data from companies in exceptional circumstances, such as emergencies or disasters. The same applies if the public authority needs the data to fulfil a task in the public interest and cannot obtain this data elsewhere. The company must comply with a corresponding request for data access.
  5. Requirements for contractual agreements between providers and users: The Data Act sets out extensive requirements that must be observed when drafting contracts between providers and users: for example, liability cannot be completely excluded and the user can be restricted in the use of the data received.
  6. Trade secrets remain protected: Under the Data Act, there is expressly no obligation to disclose trade secrets. Rather, data owners are fundamentally allowed to take all necessary measures to protect their trade secrets before their disclosure. If this is not sufficient, it can also be contractually agreed that the user or data recipient is obliged to take further technical and organisational measures to protect the data.
  7. Penalties for violating the provisions of the Data Act: The Data Act provides for significant penalties in case of violations. Fines can amount to up to € 20,000,000 or up to 4 percent of annual global turnover. The amount of the fines is reminiscent of the GDPR.

When will the Data Act actually come into force?

The Data Act will be published in the Official Journal of the EU shortly and will enter into force on the twentieth day after its publication. However, it will not apply immediately, but only 20 months after its entry into force. The exception to this is Art. 3 (1) of the Data Act, which requires networked products and connected services to be designed and manufactured or provided in such a way that users can access their data easily, securely and, as a rule, free of charge. This obligation will, however, only apply to products that are placed on the market 32 months after the Data Act comes into force.

Conclusion:

The Data Act is a significant step towards comprehensive and standardised regulation in the handling of data. Affected companies may have to adapt their products and services in order to fulfil the requirements of the Data Act. The Data Act must also be taken into account when drafting contracts. Furthermore, it is advisable to develop internal guidelines on how providers should deal with requests from users, public authorities and third parties. The EU Commission has also announced that it will issue model clauses for contracts between data owners and users. In this regard, we recommend monitoring further developments regarding the implementation of the Data Act.

Patrick Schwarze


6. IT security update

Over the next few years, IT security requirements are to be comprehensively revised. The implementation of the "NIS-2 Directive" (EU) 2022/2555 with EU-wide minimum standards for cybersecurity affects at least 30,000 companies in Germany. The planned implementation of Directive (EU) 2022/2557 (CER Directive) contains specific obligations for the (physical) resilience of operators of critical entities. In addition, the obligations of the Digital Operational Resilience Act (DORA) are to be applied in the financial sector from the beginning of 2025.

Overview of NIS-2 implementation

On 27 September 2023, the Federal Ministry of the Interior and Home Affairs (Bundesministerium des Innern und für Heimat, BMI) published a Discussion Paper on the implementation of the NIS-2 Directive, in which both the scope of application and the obligations of companies have not changed significantly compared to the first draft that became public (for details, see our article "Cybersecurity in German companies: new obligations through planned new regulations of the BSIG").

In October, the BMI held an initial "workshop meeting" with industry representatives (interest groups), at which agreement was reached on the need to further amend the Discussion Paper. A further updated draft bill is expected to be published in the foreseeable future.

Forthcoming KRITIS Umbrella Act

On 28 July 2023, the draft bill of the act on the implementation of the CER Directive, the so-called KRITIS Umbrella Act (KRITIS-Dachgesetz), was published. Its addressees are operators of critical entities and their obligations relate to the (physical) resilience of the entities. The legislator proceeds on the basis that, besides the existing operators of critical infrastructures, only a few additional companies will be covered by this regulation.

Thus, although classification as an operator of a critical entity within the meaning of the KRITIS Umbrella Act will also lead to the applicability of the NIS-2 obligations (Art. 28 (1) No. 4 of the Discussion Paper on NIS-2 implementation), classification as a particularly important (essential) or important entity within the meaning of Art. 28 (1) and (2) of the Discussion Paper does not, conversely, lead to the applicability of the KRITIS Umbrella Act. The legislator is thereby pursuing a so-called "all-hazards approach" to strengthen resilience in various areas.

The fact that key obligations under the KRITIS Umbrella Act (in particular reporting and verification obligations) are not due to come into force until 2026 provides some relief for operators of critical entities.

Breaking news: at the end of December 2023, an updated draft bill of the KRITIS Umbrella Act was published. We will provide an updated overview in our next newsletter.

Christian Saßenbach


7. Draft German Digital Services Act: new regulations for digital intermediary services

Things are getting serious for providers of digital intermediary services: from 17 February 2024, Regulation (EU) 2022/2065, known as the "Digital Services Act" or "DSA", will no longer apply only to very large online providers, but to all providers of digital intermediary services and will be binding and directly applicable in all EU member states. The regulations of the DSA aim to promote a secure digital environment and establish a standardised legal framework for digital services across Europe. To this end, the DSA provides for extensive due diligence obligations for digital intermediary services such as online platforms and search engines.

Although the DSA as an EU regulation itself already contains comprehensive, clear and directly applicable regulations, it needs to be fleshed out by national legislators with regard to the enforcement of these regulations and the envisaged sanctions. At the beginning of August 2023, the Federal Ministry for Digital and Transport (Bundesministerium für Digitales und Verkehr, BMDV), which is the responsible body in Germany, presented a draft bill for a German Digital Services Act (Digitale-Dienste-Gesetz-Referentenentwurf, "DDG-RefE").

1. Regulatory content and status of the legislative process

On the one hand, the DDG-RefE focuses on the regulation of official responsibilities and procedures; on the other hand, in conjunction with the DSA, it contains regulations that will replace existing national regulations on digital services. For example, both the Telemedia Act (Telemediengesetz, "TMG") and the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, "NetzDG") will be repealed. It should also be noted that the national term "telemedia" from Section 1 (1) TMG will in future be replaced by the term "digital services", in line with the new European terminology under the DDG-RefE.

As the German Digital Services Act (Digitale-Dienste-Gesetz, "DDG") is currently only at the stage of a draft bill in the form of the DDG-RefE, certain changes to the content of the regulations are to be anticipated during the further legislative process. This applies in particular to the division of responsibilities, which has been criticised as too complex. It currently looks as if Germany will miss the implementation deadline of 17 February 2024 and the DDG will not be finally adopted before March at the earliest. Providers of digital intermediary services would be well advised to keep a close eye on developments in the legislative process.

2. Key regulations

a. Federal Network Agency as central coordinator

According to the DDG-RefE, the Federal Network Agency (Bundesnetzagentur) will become the competent authority for the enforcement of the DSA (cf. Section 12 DDG-RefE). To this end, the Federal Network Agency will set up a Coordination Centre for Digital Services (Koordinierungsstelle für digitale Dienste, "KDD"), which will act as a central point of contact for Internet users and online customers in the event of violations of the DDG or the DSA. Special responsibilities are also assigned to the Federal Centre for Child and Youth Media Protection (Bundeszentrale für Kinder- und Jugendmedienschutz, "BzKJ") in Section 12 (2) DDG-RefE, the Federal Commissioner for Data Protection (Bundesbeauftragter für den Datenschutz, "BfDI") in Section 12 (3) DDG-RefE and the Federal Criminal Police Office (Bundeskriminalamt, "BKA") in Section 13 DDG-RefE.

During the consultation with the federal states and associations, the distribution of official responsibilities emerged as a major point of criticism of the draft bill. How the responsibilities will be regulated in the final version of the DDG remains to be seen.

b. Repeal of the TMG and NetzDG

Against the background of the full harmonisation intended by the European legislator with the adoption of the DSA, the TMG and the NetzDG will be repealed when the DDG comes into force. However, the DSA and the DDG in combination should regulate the subject matter of the repealed acts to at least a comparable extent.

This can be seen, for example, from the fact that the legal notice obligation from Section 5 TMG can now be found in the same wording in Section 5 DDG-RefE and the liability privileges of Sections 8 et seq. TMG are now regulated in Art. 4 et seq. DSA, supplemented by the special national provisions on liability for anti-competitive conduct and the liability of WLAN operators in Sections 7, 8 DDG-RefE.

c. Sanctions 

Section 25 DDG-RefE, which contains comprehensive provisions on the various administrative offences and the resulting liability amounts, will be particularly relevant for providers of digital intermediary services. Paragraphs 1 to 4 describe possible violations of the DDG, the DSA and Regulation (EU) 2019/1150 (the "P2B Regulation"), such as violations of notification obligations and requests for information, the inadequate establishment of a complaints management system or advertising that does not comply with the requirements of the DSA.

Paragraphs 5 and 6 list possible fines, which can amount to up to 6 percent of a company's global annual turnover in the preceding financial year. Please take particular note of the fact that the financial year preceding the official decision is always taken as a basis here and not the financial year in which the infringement was committed. This can be particularly relevant for young companies in constellations in which official decisions are made at a much later date and relate to infringements during a start-up phase.
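A minimal sketch of this timing rule (with hypothetical figures) shows why the reference year matters:

    def max_ddg_fine(turnover_by_year: dict[int, float], decision_year: int) -> float:
        """Cap of 6% of global annual turnover in the financial year
        preceding the authority's decision - not the year in which
        the infringement was committed."""
        return 0.06 * turnover_by_year[decision_year - 1]

    # A start-up infringes in 2024 (EUR 0.5m turnover), but the decision
    # is only issued in 2027, after turnover has grown to EUR 80m in 2026.
    turnover = {2024: 500_000.0, 2026: 80_000_000.0}
    print(f"{max_ddg_fine(turnover, 2027):,.0f}")  # 4,800,000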

3. Conclusion

The application of the DSA from 17 February 2024 and the imminent entry into force of the DDG will entail extensive new obligations and liability risks for providers of digital intermediary services. Companies that have not already done so should therefore familiarise themselves promptly with the requirements affecting them, in particular the extensive due diligence obligations arising from the DSA. The entry into force of the DDG will also require references to the TMG or NetzDG, which are being repealed at the same time, to be updated (e.g. in the information given in the legal notice).

Tobias Kollakowski


8. CJEU judgement on Schufa scoring

On 7 December 2023 (C-634/21), the European Court of Justice (“CJEU”) ruled on the admissibility of Schufa creating a probability value for the creditworthiness of natural persons ("score"). While the judgement will have a direct impact on Schufa and other credit agencies, the effects on the clients of credit agencies are likely to be less intense.

The CJEU decided on a case arising from a citizen's subject access and erasure requests against Schufa and the subsequent legal dispute before the Wiesbaden Administrative Court, brought because the responsible data protection authority in Hesse had failed to remedy the situation.

The GDPR contains requirements for the use of such a score in Art. 22 GDPR, i.e. the provision on decisions based solely on automated processing - including profiling - which produce legal effects for the data subject or affect them in a similarly significant manner. This provision applies in addition to the need for a legal basis under Art. 6 or Art. 9 GDPR and compliance with the basic principles under Art. 5 GDPR.

The CJEU has now ruled that Art. 22 GDPR also applies to credit agencies that calculate such a score from the data available to them, even if they do not themselves use this score vis-à-vis the data subjects, e.g. when concluding a contract. In order to avoid legal loopholes in European and national data protection law, for example with regard to the rights of data subjects, the CJEU stated that the creation of such a score by a credit agency already constitutes automated decision-making with a significant impact on the data subjects because of its subsequent use by the credit agency's customers, at least where the score is decisive for those customers' decisions. Art. 22 GDPR therefore applies to the credit agency and not only to its customers.

However, the CJEU has not ruled that Schufa may no longer create such scores or that its customers are not entitled to use them. Art. 22 (2) GDPR contains various exceptions to the general prohibition of such automated decision-making in Art. 22 (1) GDPR. For example, a justification is possible via national exceptions such as Section 31 of the Federal Data Protection Act (Bundesdatenschutzgesetz, BDSG). The CJEU did not declare this provision null and void; rather, it left this examination to the German courts. But it established certain guidelines for national provisions: these must include safeguards for data subjects, such as the right to obtain human intervention. Member states are not allowed to modify the justification criteria, e.g. by stipulating a balancing of interests. In this respect, it is likely that Section 31 BDSG will not fully hold up against this test and that there will indeed be a need for action by the German courts and then by the German legislator.

However, for the customers of the credit agencies, i.e. companies that wish to carry out a credit check as part of a contract review with their customers, there will be no significant changes for the time being. They must measure automated decision-making against the standard set out in Art. 22 GDPR. Unlike the credit agencies, however, they can invoke the exception in Art. 22 (2) (a) GDPR if their automated decision is necessary for the conclusion or fulfilment of a contract and appropriate measures have been taken to safeguard the legitimate interests of the data subjects in accordance with Art. 22 (3) GDPR. This includes at least the right to obtain human intervention on the part of the controller in order to review the automated decision. Even if the CJEU's judgement is read to mean that a decision based exclusively on a score that cannot be explained further and is not transparent is questionable, in practice a credit check using automated processing can be permissible. Companies should, however, not rely exclusively on such a score, but also take other factors into account: for example, specific "hard" negative features (such as unpaid receivables or insolvency), their own specific knowledge or experience from the prior customer relationship, and the basic risk assessment of different acquisition channels or payment methods should be considered on top of such a score.
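Purely by way of illustration - the scale, thresholds and factor names below are hypothetical assumptions, not requirements derived from the judgement - a combined check along these lines could be sketched as follows:

    from dataclasses import dataclass

    @dataclass
    class Applicant:
        score: float                   # credit agency score on a hypothetical 0..1 scale
        unpaid_receivables: bool       # "hard" negative feature
        insolvency: bool               # "hard" negative feature
        good_prior_relationship: bool  # own experience with the customer

    def credit_check(a: Applicant) -> str:
        """Hypothetical combined check: the score is one factor among
        several, and borderline cases are referred to a human reviewer
        rather than being decided solely by automated means."""
        if a.unpaid_receivables or a.insolvency:
            return "decline (hard negative feature)"
        if a.score >= 0.8 or (a.score >= 0.6 and a.good_prior_relationship):
            return "accept"
        return "refer to human review"  # safeguard in the spirit of Art. 22 (3) GDPR

    print(credit_check(Applicant(0.65, False, False, True)))   # accept
    print(credit_check(Applicant(0.65, False, False, False)))  # refer to human review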

Thus, we currently do not expect credit checks by companies to become inadmissible, even if a score is used. However, it may be worthwhile re-evaluating the respective procedures.

Dr. Jürgen Hartung


 

Legal Tech Tools - Digital applications for more efficient solutions

Discover our wide range of legal tech tools!


Dr. Jürgen Hartung

Partner, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 643
M +49 172 6925 754

Email

LinkedIn

Dr. Marc Hilber
LL.M. (Illinois)

Partner, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 612
M +49 172 3808 396

Email

LinkedIn

Michael Abels

Partner, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 600
M +49 172 2905 362

Email

Dr. Angela Busche
LL.M. (CWSL)

Partner, Attorney

Am Sandtorkai 74
20457 Hamburg
T +49 40 808105 152
M +49 173 4135932

Email

Marco Degginger

Junior Partner, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 365
M +49 162 1313 994

Email

Tobias Kollakowski
LL.M. (Köln/Paris 1)

Junior Partner, Attorney, Legal Tech Officer

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 423
M +49 173 8851 216

Email

LinkedIn

Christian Saßenbach
LL.M. (Norwich), CIPP/E

Associate, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 115
M +49 151 1765 2240

Email

Dr. Hanna Schmidt

Junior Partner, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 613
M +49 172 1475 126

Email

Patrick Schwarze

Junior Partner, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 406
M +49 1520 2642 548

Email

LinkedIn

Dr. Axel Grätz

Associate, Attorney

Konrad-Adenauer-Ufer 23
50668 Cologne
T +49 221 2091 604
M +49 170 929 593 6

Email

LinkedIn