E-commerce


Revision as of 15 October 2010, 10:36

Draft submission for the public consultation on the future of electronic commerce in the internal market and the implementation of the Directive on electronic commerce (2000/31/EC), which is open until 5 November 2010:

Accessibility of company data protection officer

It is becoming more and more important that queries regarding personal data can be addressed directly to a company data protection officer where such a person has been appointed. Service providers should therefore provide the details of a company data protection officer, including his electronic mail address, which allow him to be contacted rapidly and communicated with in a direct and effective manner (article 5 to be amended).
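
Purely as an illustration of the information such an amendment would add to a provider's general information duties, the following minimal Python sketch renders a contact block that includes the data protection officer's e-mail address. The provider, names and addresses are hypothetical; the directive prescribes no particular format.

```python
from dataclasses import dataclass

@dataclass
class ProviderInfo:
    """General information a service provider makes available (cf. article 5 ECD)."""
    name: str
    address: str
    email: str
    dpo_name: str    # company data protection officer, where one has been appointed
    dpo_email: str   # direct e-mail address, as proposed by this submission

def contact_block(info: ProviderInfo) -> str:
    """Render a plain-text information block for the provider's website."""
    return (
        f"Provider: {info.name}\n"
        f"Address:  {info.address}\n"
        f"Contact:  {info.email}\n"
        f"Data protection officer: {info.dpo_name} <{info.dpo_email}>"
    )

if __name__ == "__main__":
    print(contact_block(ProviderInfo(
        name="Example Hosting Ltd.",            # hypothetical provider
        address="1 Sample Street, Exampletown",
        email="info@example.com",
        dpo_name="Jane Doe",
        dpo_email="privacy@example.com",        # address for direct queries about personal data
    )))
```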

Intermediary liability: Protect freedom of speech by stopping pro-active monitoring and "prevention"

Case study: Unacceptable situation in Germany

The German Federal Supreme Court (Bundesgerichtshof)[1] interprets the liability exemptions in the e-commerce directive as not covering applications for injunctive relief, i.e. claims that providers prevent illegal user activity.[2]

In consequence, German courts apply the most far-reaching doctrine of contributory "liability", called "accessory liability" (Störerhaftung). According to this doctrine, not only the wrongdoer himself (the direct infringer) and participants (effective promoters or helpers) can be subject to a claim to refrain from rights infringements, but also mere accessories, including providers of information society services. Responsibility for unlawful user action is extended to all persons who - without necessarily being wrongdoers or participants - deliberately and generally causally contribute to the infringement of a third party's right, provided they have the legal and effective means to detect and prevent the infringement.[3]

According to this doctrine, once an intermediary obtains knowledge of an infringement, it is not only obliged to remove the unlawful content but also to take all technically feasible and reasonable precautions to prevent future infringements. In other words, subject to the requirement of reasonableness, service providers are obliged to examine all user content as soon as they obtain knowledge of any unlawful content. The monitoring obligation is not limited to the detection of the unlawful content that was originally notified or to the original publisher of this content.[4]

In practice, this jurisprudence effectively forces providers to pro-actively monitor user-generated content, as this is the only way they can avoid sanctions.[5] Culpable breach of the monitoring duty is punished by a disciplinary fine or even a prison sentence.

A German court that finds a provider not to have taken sufficient steps will grant injunctive relief by prohibiting the provider from allowing users to re-publish the illegal content, without setting out what measures the court considers necessary to prevent user infringements.[6] The extent of the provider's pro-active obligations is determined only when the rights holder initiates a separate procedure for alleged violation of the injunction and applies for a disciplinary fine or a prison sentence to be imposed. Providers are thus told which steps they must take only in the judgement that imposes a fine on them. This jurisprudence causes unacceptable uncertainty for providers.[7] It is impossible for intermediaries to anticipate which measures the courts would consider "reasonable".[8]

This doctrine also has unacceptable repercussions on freedom of speech on-line, because providers threatened by fines will - often using automatic and broad filters - block and prevent any content that they consider a risk. This leads to the suppression of controversial but politically very valuable content, as intermediaries do not wish the battle over the legality of the content to be fought at their expense.

Furthermore, German courts tend to go very far in the measures they impose on providers to prevent future infringements by their users: it was held that providers must deploy technology to automatically scan all user-generated content for content that potentially violates an injunction.[9] Users that have generated illegal content in the past must be subjected to a manual examination of any future content they generate.[10] To this end, all users must be identified and the anonymous use of services must be disabled.[11] All user actions must be logged.[12] Content regarding issues that "provoke illegal reaction" must be examined manually.[13] All of these duties have been imposed not for the prevention of serious crime, but merely to prevent infringements of private rights, including commercial rights.
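
To make the operational burden of these duties concrete, here is a minimal, hypothetical Python sketch (not taken from any judgment or real service) of the kind of pre-publication pipeline a provider would effectively have to operate: every submission is tied to an identified user, every action is logged, the text is scanned against terms covered by past injunctions, and posts by users who have infringed before are held for manual review.

```python
import logging
import re
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ugc-monitor")

# Terms covered by past injunctions (hypothetical examples).
INJUNCTION_PATTERNS = [
    re.compile(r"counterfeit\s+widget", re.IGNORECASE),
    re.compile(r"defamatory\s+claim\s+X", re.IGNORECASE),
]

@dataclass
class User:
    user_id: str                      # anonymous use disabled: every post needs an identified account
    previously_infringed: bool = False

@dataclass
class Decision:
    published: bool
    reason: str

def moderate(user: User, text: str) -> Decision:
    """Pre-publication check of the kind the case law described above effectively demands."""
    log.info("user=%s submitted %d characters", user.user_id, len(text))  # log every user action
    for pattern in INJUNCTION_PATTERNS:                                   # scan all user-generated content
        if pattern.search(text):
            return Decision(False, f"blocked: matches injunction pattern {pattern.pattern!r}")
    if user.previously_infringed:                                         # manual review for flagged users
        return Decision(False, "held for manual review: user previously posted illegal content")
    return Decision(True, "published")

if __name__ == "__main__":
    print(moderate(User("alice"), "I really like these widgets"))
    print(moderate(User("bob", previously_infringed=True), "An unrelated post about the weather"))
```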

The concept of private policing has failed

These "preventive duties" imposed by the courts violate article 15 of the e-commerce directive, according to which Member States - including their courts - must not impose a general obligation on providers to monitor information which they transmit or store. Injunctions lead to a de facto obligation to monitor user-generated content[14], and thus amount to a general monitoring obligation.

At any rate, the entire concept of "specific" obligations to "prevent" illegal user action has failed. In a democracy, everybody can trust in the integrity of their fellow citizens, even if it is known that this trust is sometimes abused. Fundamental freedoms constitute the foundation of justice and peace in the world.[15] Freedom is the purpose of all law. Its benefits for every person, as well as for our society as a whole, by far outweigh the harm done by its abuse. A prevention society aimed at eliminating, as far as possible, all potential risks of human behaviour and life is not compatible with European values and freedoms.

According to general principles of civil law, only those who have created a source of hazard are required to prevent harm. The exchange of information is at the root of human nature. It is a fundamental right (article 11 of the Charter of Fundamental Rights) and can therefore not be considered a "source of hazard". The provision of telecommunications services does not create a greater risk of rights infringements than the provision of any other product or service. Typical, socially adequate and therefore legal risks do not make their originator responsible for intentional violations committed by other people.

Even prevention (or policing) technology that can reasonably be implemented or is industry standard must not be imposed on all service providers because of its devastating effects on freedom of speech. Filtering technology, for example, will by its nature suppress legal content that is merely similar to illegal content. This leads to the suppression of controversial but politically very valuable content, for example critical comments on companies and products or "fair use" of intellectual property. Policing is not the job of private companies.
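
The over-blocking effect can be illustrated with a deliberately naive similarity filter (a hypothetical Python sketch, not any real product): because it only measures how closely a new post resembles content previously found illegal, a lawful critical review is suppressed along with the unlawful original.

```python
# Naive word-overlap (Jaccard) filter: anything too similar to known illegal
# content is blocked. The point is that lawful, merely similar speech is
# suppressed along with it.
def jaccard(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

# Hypothetical sentence a court has found defamatory.
KNOWN_ILLEGAL = ["the ACME widget is a fraud and the CEO is a convicted thief"]
THRESHOLD = 0.3   # similarity above which content is suppressed

def blocked(text: str) -> bool:
    """Return True if the filter would suppress the post."""
    return any(jaccard(text, bad) >= THRESHOLD for bad in KNOWN_ILLEGAL)

if __name__ == "__main__":
    lawful_review = "in my experience the ACME widget is a fraud waiting to happen"
    unrelated = "the weather in Hamburg is lovely today"
    print(blocked(lawful_review))  # True: lawful criticism is suppressed as collateral damage
    print(blocked(unrelated))      # False
```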

Recommendation

The concept of "specific" obligations of intermediaries to "prevent" illegal user action that is rooted in article 14 (3) and recitals 47 and 48 of the e-commerce directive should thus be given up in its entirety. Intermediaries should only be required to remove illegal content upon notification. The deterrent effect of criminal sanctions is sufficient to "prevent" illegal user action.

Recommendation:

  • Service providers should not be required to "prevent" infringements. To this end, article 14 (3) ECD should read (with the words "or prevent" deleted): "This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate an infringement, nor does it affect the possibility for Member States of establishing procedures governing the removal or disabling of access to information." Recitals 47 and 48 should be deleted.

Intermediary liability: Judicial review is needed to protect freedom of speech

The current liability limitations do not sufficiently protect free speech for another reason: a service provider that is notified of allegedly illegal content is not exempted from liability under the current directive even if it has reason to believe that the content may well be legal. In practice, this situation leads to the removal of practically any notified content without a proper assessment of its legality, resulting in major damage to free speech on-line. The provider is put in the position of a judge, a role it cannot fill. If it removes content which the courts later consider legal, it can face high damage claims by its customer/user. If it refuses to remove content which the courts later consider illegal, it can face high damage claims by the rights holder.

A service provider that is notified of allegedly illegal content should therefore not be required to remove the content before its legality has been assessed by a judge in a preliminary procedure. Member States can design this procedure to be fast and effective. However, its cost must not be borne by the provider, as this would again have a chilling effect on free speech.
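
As a rough sketch of the workflow this recommendation implies (the states and names below are assumptions for illustration, not taken from the directive), notified content would remain online until a judicial finding of illegality is recorded:

```python
from enum import Enum, auto

class Status(Enum):
    ONLINE = auto()
    NOTIFIED = auto()            # a rights holder has complained
    REFERRED_TO_COURT = auto()   # preliminary judicial procedure under way
    REMOVED = auto()             # removed only after a finding of illegality

class HostedItem:
    """Life cycle of a piece of user content under the proposed notice-and-judicial-review rule."""

    def __init__(self, item_id: str):
        self.item_id = item_id
        self.status = Status.ONLINE

    def notify(self) -> None:
        # A complaint alone does not trigger removal under the proposed rule.
        if self.status is Status.ONLINE:
            self.status = Status.NOTIFIED

    def refer_to_court(self) -> None:
        if self.status is Status.NOTIFIED:
            self.status = Status.REFERRED_TO_COURT

    def court_decision(self, illegal: bool) -> None:
        # Only a judicial finding of illegality leads to removal.
        if self.status is Status.REFERRED_TO_COURT:
            self.status = Status.REMOVED if illegal else Status.ONLINE

if __name__ == "__main__":
    item = HostedItem("post-42")
    item.notify()
    item.refer_to_court()
    item.court_decision(illegal=False)
    print(item.status)  # Status.ONLINE: the content stays up because the court found it legal
```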

Recommendation:

  • Service providers should not be required to remove or disable access to information before it has been found illegal by a court of law. To this end, article 14 (1) (b) ECD should read (with the words "after it has been found illegal by a court of law" added): "the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information after it has been found illegal by a court of law."


(source: http://www.daten-speicherung.de/data/Forderungen_Telemedienrecht_26-02-2009_publ.pdf)

  1. BGH, NJW 2004, 3102 (3103).
  2. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/germany_12nov2007_en.pdf.
  3. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf, p. 51.
  4. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/germany_12nov2007_en.pdf.
  5. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf, p. 51.
  6. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf, p. 66.
  7. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf, p. 67.
  8. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf, p. 51.
  9. BGH, NJW 2007, 2636 (2639 f.).
  10. BGH, NJW 2008, 758 (762) – eBay.
  11. OLG Hamburg, judgment of 2 July 2008 – 5 U 73/07 – Rapidshare.
  12. OLG Hamburg, judgment of 2 July 2008 – 5 U 73/07 – Rapidshare.
  13. OLG Hamburg, judgment of 22 August 2006 – 7 U 50/06 – Heise-Forum.
  14. DG Internal Market, Study on the liability of Internet intermediaries, http://ec.europa.eu/internal_market/e-commerce/docs/study/liability/final_report_en.pdf, p. 51.
  15. European Convention on Human Rights, preamble.