
EDRi booklets

Data protection series - issue sheets

10 October, 2013

On 21 October the European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) will vote on a hugely important dossier: the General Data Protection Regulation.

This very long legislative document is intended to ensure that our rights to privacy and data protection can be effectively asserted in our everyday lives. One of the main purposes of the Regulation is to give citizens greater control over their personal information - maintaining the principles that were developed in the 1995 Data Protection Directive. Recent revelations have shown just how important this is.

Since more than 4,000 amendments have been proposed in the European Parliament, we are now releasing a series of one-page documents to explain the most important points of the Regulation. These issue sheets explain which amendments are good for data protection and which amendments might do a lot of harm.

Watch this space, we'll be publishing new documents every two working days! We will, of course, also share them with Members of the Committee.

About the Issue Sheets - an introduction by Prof. Douwe Korff
01 - Definitions
02 - Consent
03 - Legitimate Interest
04 - Data minimisation
05 - Right to be forgotten
06 - Data portability
07 - Profiling
08 - Export to third countries
09 - Data protection by design & by default
10 - Sanctions
11 - Data breach notification

About these Issue Sheets - an introduction by Prof. Douwe Korff

Download pdf

The Draft EU Data Protection Regulation that will replace the main EC Data Protection Directive (Directive 95/46/EC) is in the process of being adopted. The Regulation will have a major impact on the digital environment for citizens, businesses and public bodies. EDRi and its 35 member organisations in 21 European countries are concerned that any weakening of the European data protection rules and principles will undermine the rights and freedoms of European citizens, both within the EU and internationally, due to the EU’s leading role on privacy issues. The ongoing PRISM and Tempora revelations show how crucial high-level data protection rules are for individual citizens, non-governmental organisations and companies.

The Issue Sheets seek to provide Members of the European Parliament in particular with simple clarifications of the main issues; they are kept to one page on each topic.

There is one issue we should stress in this cover note: the wide “national security” exemptions in the main EU treaties. These exemptions are not absolute, and do not grant Member States total exemption from scrutiny in this regard. The EU Charter of Fundamental Rights, which explicitly demands full protection of personal data, cannot simply be ignored.

Ultimately, it is for the European Court of Justice to determine the scope of the exemption. It is already clear that the activities of the United States’ NSA are manifestly not limited to national security as defined in international law, and it would seem that the same applies to the activities of some EU Member States’ agencies, such as the UK’s GCHQ. Activities by Member States’ “national security agencies” that are not strictly limited to national security as defined in international law are not covered by the EU treaties’ exemptions, and this should be made clear in the Regulation.

Secondly, we urge the European Parliament to stress that any disclosure of personal data that are within the scope of EU data protection law, such as airline passenger data or data on individuals’ financial transactions, to any national security agency is and will remain subject to the EU rules on the processing of personal data: disclosure is expressly mentioned as a form of processing in both the current Data Protection Directive and in the Draft Data Protection Regulation, and is subject to these instruments. Even if the processing of such data by national security agencies after such a disclosure may be outside EU law (provided it is indeed for strictly-defined national security purposes), the disclosure itself is not: it is and must remain subject to the fundamental principles of European data protection law, including purpose-specification and -limitation, transparency, and the rules on transborder data flows. Moreover, the activities of EU Member States’ national security agencies must still always conform, at the very least, to the minimum European standards on State surveillance developed by the European Court of Human Rights (as set out in an Annex to a recent EDRi/FREE submission (pdf) on the matter).

01 - Definitions

Download pdf

What is needed?

The increasing possibilities to extract data from large datasets and to merge different types of data make it easier to identify and analyse individuals based on seemingly isolated pieces of data. The definition of “personal data” must take these possibilities into account. An incomplete definition will undermine the whole legislative proposal. If legislation on personal data does not cover all personal data, it is destined to fail.

Especially in an online environment, data about users is often not directly identifiable. Online tracking companies do not need or want to know the name of an individual, as they care about what a person is and not who the person is – users are being singled out as worthy of being sold a particular product or accepted for a particular service at a particular price... or not. This happens regardless of how the data is stored: creating a 'pseudonym' will not prevent the tracking and analysing of personal data or the taking of decisions based on these data.

In order to ensure thorough privacy protection, all personal data including data which is not directly identifiable (like a set of mobile phone numbers) must be given equal protection. So-called 'pseudonymous data' should not fall under a separate regime. The mere fact that data (such as your mobile phone number) does not directly identify you should not mean that it is not worthy of the same protection as your name or your address.
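The point about pseudonyms can be illustrated with a small sketch (hypothetical Python, not part of the issue sheets): replacing a phone number with a stable pseudonym, here a truncated hash, still lets a tracker single out and profile the same person across visits, even though no name is ever stored.

```python
import hashlib

# Hypothetical illustration: a tracker replaces a phone number with a
# pseudonym (a truncated SHA-256 hash). The pseudonym is stable, so the
# same person can still be singled out and profiled across visits.

def pseudonym(phone_number: str) -> str:
    return hashlib.sha256(phone_number.encode()).hexdigest()[:12]

profile: dict[str, list[str]] = {}  # pseudonym -> observed behaviour

def track(phone_number: str, event: str) -> None:
    profile.setdefault(pseudonym(phone_number), []).append(event)

# Two visits by the same (invented) number are linked to one profile,
# without the tracker ever knowing who the person is.
track("+32470123456", "visited travel site")
track("+32470123456", "searched flight prices")
assert len(profile[pseudonym("+32470123456")]) == 2
```

The pseudonym behaves exactly like a name for the purposes of singling out, which is why 'pseudonymous data' should receive the same protection as directly identifying data.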

Negative amendments

These fall into two categories. Firstly, there are amendments which limit the scope of what constitutes 'personal data'. Currently, all data linked to an individual are considered personal data. These amendments restrict the scope of personal data by establishing that data are only personal data for the organisation that processes them (the “controller”) or for a third party “working together with the controller”. Apart from these parties, data relating to a natural person will not be considered “personal data” and will not be protected. Amendments 715 and 716 (ALDE), 717 (EPP) and 720 (independent) fall into this category.

A second set of amendments seeks to give less protection to data which are “pseudonymous”, which means that a directly identifiable piece of data is replaced by a pseudonym. This lowering of protection includes all types of data that are 'pseudonymised', including data generated by profiling individuals' personalities in online social media, for example. Examples of such amendments are:

  • ALDE: 726, 728, 729, 732, 851, 887, 897, 904, 1542, 1568,
  • EPP: 730, 898, 921, 922, 1543, 1585, 1630.

Positive amendments

Positive amendments seek to clarify that the “singling out” of individuals produces personal data, while maintaining the main direction of the Commission's original proposal. Such amendments include:

  • ALDE: 714
  • S&D: 719
  • EPP: 721

Law enforcement access to data

The PRISM scandal shows us that pseudonymous data are at considerable risk of access and identification by anyone who has the ability to access multiple databases. As pseudonymised data are often used for the creation of detailed personality profiles by online advertising companies, the potential impact of re-identification of data is enormous. The fact that some of the data are only indirectly identifiable or pseudonymised does not in any way reduce the intrusion into each individual's privacy and freedom of communication.

02 - Consent

Download pdf

What is needed?

Consent is an important legal basis for the use of personal data. As it is one of the six legal bases for processing data, it is important to define "consent" properly.

Consent should be explicit, specific and informed in all circumstances. In practice, this means that consent should always be strictly linked to the processing that the user was informed about and not include other forms of use of personal data. A user must receive sufficient information to be able to understand the consequences before giving consent to the processing of their data.

In practice, data controllers should not be able to use "pre-ticked boxes" to gain users' consent for the processing of their personal data, nor infer their consent from other actions such as acceptance of general terms and conditions.

Negative amendments

A set of amendments tries to undermine the concept of consent by allowing for the possibility of implicit consent, instead of offering a real choice to data subjects.

  • Amendments 757 (ALDE), 758 (ALDE), 762 (ALDE)
  • 760 (S&D),
  • 764 (ECR) and
  • 765 (EPP) follow this approach.

Implicit consent offers less protection, as it can be assumed or included in general terms and conditions. Users will lack a genuine choice about how their data will be processed.

Other amendments introduce a new and unclear concept of "broad consent". This new notion could undermine legal certainty for controllers and would create a loophole, where data subjects agree to something they do not fully understand. Amendments

  • 3066 (EPP),
  • 3067 (ALDE), 3069 (ALDE), 3076 (ALDE), 3079 (ALDE)
  • 3068 (S&D) and 3084 (S&D) fall into this category.

Positive amendments

Amendments providing a clear connection between the consent and the purpose are very welcome and provide more legal certainty. These include:

  • 89 (Greens),
  • 763 (GUE-NGL) and
  • 1000 (EPP).

Clearly specifying that the consent should be both explicit and informed would also be a very good addition. See for example amendment 854 (S&D).

Consent should not only be limited to specific purposes, but also be automatically declared invalid when given in a general and abstract way to unspecified and unpredictable forms of data processing. This is proposed in

  • Amendment 1000 (EPP).

Law enforcement access to data

Recent revelations, such as in relation to the PRISM programme, show us that the existence of any database generates a risk of abuse of that database. It is therefore crucial for the decision about whether or not to accept such a risk to lie in the hands of each individual. Consent, in combination with data minimisation, is essential to allow data subjects to control the collection and use of their personal data.

03 - Legitimate Interest

Download pdf

What is needed?

It seems reasonably clear that, if businesses or other organisations wish to use personal data, any such use should be based on consent, necessity or a legal obligation. However, there may be cases where data processing is appropriate, but does not correspond to these criteria. In these cases, the 'legitimate interest' of the organisation that is processing the data may serve as a basis for data processing. However, such cases should be an exception rather than a rule. We therefore need such exceptional cases to be clearly defined and limited to circumstances that are predictable, well-regulated and sufficiently narrow. By doing so, “legitimate interest” will not become a “loophole” that undermines the credibility of the legislation and the trust of citizens. In particular, the legislation needs to prohibit data from being used for purposes that are not related to the original purpose of collection.

Negative amendments

The negative amendments on this issue seek to broaden further the scope of the Commission's proposal on this subject by explicitly extending the scope to include third parties (i.e. allowing third parties, possibly companies that a person has never been in contact with, to process data if such processing is in their interest). Furthermore, some amendments allow the processing of the data for purposes that are not compatible with the original purpose for which the data were collected. If the Regulation were to permit companies that a person had never heard of before to process data for reasons that the person was not aware of, it would eliminate most of the value of the regulation for the citizen. Amendment 890 from a group of EPP MEPs even suggests that personal data can be freely data mined and further processed on the sole basis that the data are online – regardless of whether the personal data are legally online or not.

  • ALDE: 873, 880, 887, 888, 895, 896, 897, 899, 904, 943, 948, 949.
  • EPP: 882, 883, 890, 898, 900, 901, 945.
  • ECR: 891, 892, 893, 894.
  • Independent: 947

Positive amendments

Some amendments aim to (with varying degrees of success) clarify the scope of the “legitimate interest” exception and to minimise or eliminate the use of data for purposes that are incompatible with the original purpose of the data collection.

  • S&D: 872, 875, 885, 940, 941,
  • ALDE: 879, 939,
  • GUE/NGL: 881, 942,
  • Greens: 100

Law enforcement access to data

It is crucial that the “legitimate interest” exception not be used as a justification for permitting foreign law enforcement authorities to circumvent democratically agreed data access procedures. There is a clear legal basis for law enforcement access to data in Article 6.1.c of the Regulation. Consequently, there is no justification for the use of the “legitimate interest” exception for providing access to personal data for this purpose. It should be clear that this legal ground cannot be used to comply with access requests, except where those meet the requirements laid down in Article 6.3.

04 - Data minimisation

Download pdf

What is needed?

The capacity of computer systems to store and process personal information has been constantly increasing in recent years. As the mass of personal information increases, so do resulting threats to privacy and security.

Data minimisation is an essential principle of data protection. It establishes that data collected and processed should not be retained or further used unless this is necessary for purposes that were clearly stated in advance. Before starting to process personal data, controllers should ask themselves: "Do we need this? What for? And for how long?" The existing 1995 Directive already contains the principle of "not excessive data collection", but it is not very well observed. This principle is being clarified and strengthened by Article 5 (c) of the proposed General Data Protection Regulation.

Negative amendments

Amendments 824, 825, 826 and 827 seek to delete the notion that collection of data should be limited to the minimum necessary and replace it by the wording "not excessive", which is more vague and which will not provide a much-needed improvement of the current legislation. The vague concept of “not excessive” is hard to enforce and opens the door for unlimited retention periods, unexpected re-use and other forms of abuse.

Positive amendments

Amendments 752 and 753 introduce new provisions to define “data protection by design and by default” and to clarify that these principles should also include data minimisation.

When it comes to personal and intimate information held by companies about our private lives, it is essential that these data are kept to a minimum; otherwise we risk more abuse and increase the loss of control over how our personal data are used, shared and sold.

Law enforcement access to data

The European data protection regime is applicable to all companies providing services in the EU. Data minimisation can therefore be a good step towards fewer data being stored, more control over what data are stored, and a corresponding reduction of the risk of surveillance by (foreign) governments.

05 - Right to be forgotten

Download pdf

What is needed?

Article 17 of the draft Regulation on the ‘Right to be forgotten and to erasure’ means that data controllers, such as social networks, will have to comply with users' requests to delete everything they have published about themselves online. This right already exists in the 1995 Directive and is very important for holding controllers accountable for correctly managing the data they process. It empowers data subjects to have control over their own data.

It is important to note that this proposal does not create open-ended rights to have newspaper articles or blogs deleted or to overturn legal obligations on companies to store certain data. There is a specific exception in Article 80 on freedom of speech, and this article can be strengthened and clarified to minimise any risk of misunderstanding. For clarity, the wording “erasure” should be used throughout, as “the right to be forgotten” has led to a great deal of confusion.

Negative amendments

Many amendments aim at restricting the right to erasure of the data subject and would undermine legal certainty. For example, a right to simply “request” deletion makes no legal sense. Moreover, the amendment to only delete data where “appropriate” only adds confusion. Some amendments make reference to an undefined “retention period” adding further confusion. The following is a small selection of the negative amendments tabled:

  • ALDE: 1386, 1388, 1390, 1391, 1393, 1395, 1396, 1397, 1399, 1400, 1401, 1405
  • EPP: 1392
  • ECR: 1389
  • S&D: 1398, 1402, 1418, 1419

Suggested exemptions for anonymous and pseudonymous data - such as amendment 1420 by the EPP - do not make sense; this could increase the liability of intermediaries and wrongly incentivise them to monitor and delete information for which they are not responsible. Furthermore, the drafters appear not to understand that anonymisation, by definition, cannot be undone.

Positive amendments

Some amendments have been tabled in order to clarify the title and change it into “erasure”.

  • These amendments are: ALDE: 1380, 1383; EPP: 1381; S&D: 1382

Some amendments have been tabled in order to delete Article 17, paragraph 2 since this provision, as proposed by the Commission, is unclear and could increase the liability of intermediaries and wrongly incentivise them to monitor, restrict or delete information over which they have no control.

  • These amendments are: ECR: 1411 - EPP: 1412 - ALDE: 1413, 1414

Law enforcement access to data

Recent revelations have shown that the existence of databases storing large amounts of data creates risks for individuals' privacy. It is clearly appropriate that individuals should always have the right to require their data to be deleted.

06 - Data portability

Download pdf

What is needed?

Data portability (Article 18) is the right to move one's data from one service to another. It is a key right to ensure effective control over personal data.

This right will make it easier for users to switch services, effectively preventing “lock-in” effects which are harmful for consumers and competition. For instance, if a user is dissatisfied with the service offered by a social network, they would be free to move their data to a different platform, instead of being forced to delete their profile and start all over again.

Data portability would help to stimulate competition by making market entry easier for new companies, as consumers will be more willing to try out these services if this is made easier than it is at the moment. It would also create opportunities for innovative new services, such as a service that would analyse a user's electricity usage to work out if another company would be cheaper or to determine how to consume energy more efficiently.

The formats in which data are provided should be interoperable. Otherwise the right to data portability would make little sense because data could not easily be transferred from service to service.
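As a hypothetical sketch (not part of the issue sheets), a structured, documented format such as JSON is the kind of interoperable serialisation that would let a user carry their data from one service to another without the receiving service having to reverse-engineer a proprietary dump. The field names below are invented for illustration.

```python
import json

# Invented example of a user's data as a service might hold it.
user_data = {
    "profile": {"display_name": "example_user", "language": "en"},
    "posts": [
        {"created": "2013-09-01T10:00:00Z", "text": "Hello"},
    ],
}

def export_portable(data: dict) -> str:
    # A stable, documented serialisation is what makes the export portable:
    # any other service can parse it with a standard JSON library.
    return json.dumps(data, indent=2, sort_keys=True)

def import_portable(blob: str) -> dict:
    return json.loads(blob)

# A competing service can reconstruct the data exactly from the export.
assert import_portable(export_portable(user_data)) == user_data
```

The round trip in the last line is the whole point of interoperability: without an agreed format, the "export" would be a file only the original service can read.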

Finally, it should be clarified that data controllers should not store data that are no longer needed only in order to be able to comply with a possible future request to move the data.

Negative amendments

Some amendments propose deleting the article on data portability entirely. This is acceptable only under the condition that this right is transferred to the article on the right to access (Article 15). Amendments that go in this direction include 1495 (S&D) and 1496 (ALDE).

A set of amendments propose vague exceptions that would allow controllers to circumvent this obligation. These include amendments 1497 (ALDE), 1498 (EPP) and 1505 (ALDE).

Requiring the data subject to pay a fee to be able to exercise data portability would create obstacles to exercise this right, thereby undermining its use in practice. Amendment 1500 (EPP) suggests this restriction.

Data portability should not be confused with interference with intellectual property rights or trade secrets such as in amendments 1504 (ECR) and 1512 (EPP).

The Regulation should not omit the public sector from this obligation as proposed by Amendment 1522 (EPP).

Positive amendments

Adding an interoperability criterion and expanding data portability beyond procedures based on contract or consent would allow data portability to be fully applicable. Amendments 1503 and 1511 (ALDE) are positive in this regard.

The Rapporteur also proposed in amendment 143 to clarify that data portability does not interfere with the obligation of deletion.

Law enforcement access to data

The companies involved in the PRISM scandal – all large US corporations – have been collecting and transferring the data of EU citizens to US intelligence services. If European citizens had the possibility to port their data to more secure services, they would have much more control over their personal data. A strong right to data portability would ensure that EU citizens are not locked into foreign services and are free to use local, privacy-preserving platforms of their choosing.

07 - Profiling

Download pdf

What is needed?

Profiling is a process whereby assumptions are made about individuals based on automated processing of data which has been collected about them. This type of profiling is most commonly used for business purposes (for targeted advertising or credit ratings, in particular) and for law enforcement purposes. Due to the ever-increasing scale of electronic data collection and the increasing complexity of profiling calculations, the types of data being generated are becoming more and more privacy-intrusive. Profiling tends to reinforce societal stereotypes and has a built-in acceptance of errors – the profile will guess who is a potential terrorist or a bad insurance risk. The fact that it might be wrong 5% or 10% of the time is not important from the perspective of the business, but of potentially huge importance to the individual citizen. Recent research from the University of Cambridge(1) shows that surprisingly extensive, sensitive and relatively accurate information can be obtained about an individual based on small amounts of data.

As a result, it is important that profiling be prohibited both online and offline unless certain strict conditions are met, including the consent of the individual. Strong safeguards should be put in place, including the right to be provided with meaningful information about the logic behind the profiling.

Negative amendments

Negative amendments seek to change profiling from an opt-in provision (where positive consent is needed) to an opt-out rule (where individuals can be subjected to profiling but have the right to object or request not to be profiled). In effect, such amendments would allow citizens to be tracked without their knowledge or consent.

  • ALDE: 1545, 1547, 1555, 1556, 1557, 1559, 1560, 1568, 1572
  • EPP: 1549
  • ECR: 1553
  • S&D: 1551
  • Independent: 1554

Positive amendments

Positive amendments seek to strengthen and clarify the rights of the individual.

  • EPP: 1546, 1548,
  • ALDE: 1550, 1561,
  • S&D: 1552, 1562, 1563

Law enforcement access to data

We have seen from the stockpiling and use of airline passenger name records (PNR), financial data (TFTP) and communications data (data retention) that big databases will inevitably be re-used – often illegally(2) – for “law enforcement” purposes. The types of data that can be generated by profiling include the most sensitive personal information (sexual orientation, political orientation, family relationships, health-related information, etc). The fact that these data will be stockpiled over months and years means that profiling companies will have a more profound insight into individuals' lives, trends and location than the individuals will have about themselves, particularly with regard to older information. This is a new and growing threat to personal privacy and security that must be treated with utmost seriousness.

It is absolutely crucial, both for privacy and to avoid a profound chilling effect on online communication, that profiling only be done with consent and with a right of erasure.

Footnotes: (1) http://www.cam.ac.uk/research/news/digital-records-could-expose-intima... (2) See https://www.aclu.org/national-security/european-officials-declare-us-f..., for example

08 - Export to third countries

Download pdf

What is needed?

A significant number of countries around the world, including the USA, have weaker data protection legislation than the European Union or no comprehensive legislation at all.(1) In order to re-establish trust, European citizens need to be sure that, if their data are transferred to a third country, their fundamental right to data protection and the level of protection provided by EU rules will be respected.

Weak controls on the export of data mean a weakening of the protection of data and an incentive to process data abroad to circumvent EU rules.

Export of data should only be possible when verifiable safeguards are in place that data will be processed in line with the minimum EU standards or better.

Negative amendments

A large number of amendments are trying to weaken the protection offered.

Some amendments propose weakening the new EU rules by automatically recognising the applicability of past adequacy decisions even though the rules would have changed (e.g. Amendment 2384 (ALDE)).

Some amendments propose weakening the standards when it comes to processing for historical, statistical and scientific research (for example amendments 3075 (EPP), 3077 (ALDE) and 3094 (ALDE)).

Some amendments seek to recognise an adequate protection when the controller has “adduced appropriate safeguards”, and give data controllers the right to unilaterally overrule a decision of non-adequacy by the European Commission (such as amendments 2145 (ALDE) and 2148 (ALDE)).

Positive amendments

The reintroduction of the article guaranteeing safeguards when it comes to disclosure to third countries by virtue of extra-territorial law is very important. (Amendment 256 (Greens))

The review of current adequacy decisions is also a good addition, as proposed by amendments 2412 (EPP) and 250 (Greens). Amendment 259 from the Greens seeks to reintroduce the “Article 42” provision that would explicitly prohibit the transfer of personal data to third country law enforcement authorities outside of agreed legal frameworks.

The prohibition of transfer to third countries when the laws allow processing that would be unlawful under the Regulation, as proposed in amendments 2385 (GUE/NGL) and 2386 (S&D) is also welcome.

Law enforcement access to data

The transfer of data by the companies involved in the PRISM scandal to the US is allowed under the EU-US Safe Harbour Agreement. The problem is that Safe Harbour does not offer enough protection for EU citizens, as US data protection is weaker than European standards.

The Safe Harbour agreement is so unclear that the European Commission believed that only Recital 90 of the draft Regulation was needed in order to clarify the situation. Meanwhile, the Irish data protection authority is clear that companies are completely within their rights to transfer European data to the US authorities for apparently any purpose, without information being provided to the citizen and regardless of the presence or absence of any safeguards in the United States.(2) This approach profoundly and fatally undermines the fundamental right of European citizens to protection of their personal data.

Footnotes: (1) See https://www.privacyinternational.org/global-data-protection-map (2) http://www.europe-v-facebook.org/Response_23_7_2013.pdf

09 - Data protection by design & by default

Download pdf

What is needed?

Data protection by design means that, when designing products and services, data protection requirements are taken into account from the outset. It ensures that companies and public bodies take a positive approach to protecting privacy, throughout the entire life cycle of design and implementation of technologies and services that require the processing of personal data.

The provisions must be further clarified, in order to indicate that data protection by design and default relate to both (a) technical measures relating to the design and architecture of the product or service and (b) organisational measures, which relate to operational policies of the controller.

Negative amendments

Some amendments introduce the so-called “context and risk-based approach” to the concept of privacy by design and by default. This is dangerous as it would only lead to legal uncertainty for companies and citizens and would, consequently, result in higher legal costs for companies.

Other amendments describe these principles as a burden for companies, despite the fact that this approach will help to avoid situations in which data protection requirements are an afterthought to the development process, which can result in both higher development costs for companies and lower protection for the data subject. Data protection by design can be more accurately seen as an investment that greatly reduces the privacy risks for both companies and citizens (e.g. costly data breaches).

Moreover, some of the suggested additions are vague and undermine legal clarity and opt for a more self-regulatory approach, which is not sufficient to ensure the implementation of strategies that protect privacy by design and by default.

Examples of such amendments include:

  • ALDE: 1710, 1715, 1721, 1726, 1729, 1730
  • EPP: 1711, 1720, 1723
  • N/A: 1717
  • Greens/EFA: 1725
  • S&D: 1728, 1731

Positive amendments

Some amendments add clarity and specificity to the Commission proposal. They work to strengthen privacy by design and by default, which ultimately strengthens the rights of the data subject.

These amendments are:

  • S&D: 1714, 1722, 1727
  • Greens/EFA: 1713

Law enforcement access to data

The PRISM scandal showed that several Western governments and large corporations have cooperated to implement mass surveillance, resulting in a loss of public trust. Public awareness regarding privacy enhancing technologies, such as encryption, has grown. Since the NSA leaks, companies that implement privacy by design and by default, such as the search engines Ixquick and DuckDuckGo, have seen a steep increase in traffic. It is clear, therefore, that the comprehensive adoption of data protection by design and by default can serve to restore citizens' trust in systems, governments and companies, while also providing the latter with a distinct competitive advantage.

10 - Sanctions

Download pdf

What is needed?

It is crucial to have sanctions that are effective, proportionate and dissuasive. Good legislation is not effective without good enforcement rules.

The level of sanctions is very low in most European countries, and therefore even large companies, whose business is the collection and processing of data, have little incentive to respect the legislation. The European Union needs an effective, uniform and predictable level of enforcement.

In an era of "big data", with all the risks that this implies for the fundamental rights of citizens, regulators and courts must have the power, when necessary, to impose appropriate levels of penalties. The establishment of comprehensive and streamlined remedies is an essential element of the Regulation.

Negative amendments

A set of amendments proposes eliminating any reference to specific levels of sanctions, or eliminating entirely the proposal to fine serious infringers a percentage of turnover. This would be a considerable step backwards for the effectiveness of the Regulation. Amendments that go in this direction are 2886 (EPP), 2887 (ALDE), 2890 (ALDE) and 2891 (ALDE).

Positive amendments

Some amendments propose an increase in the percentage of turnover that a serious infringer could be fined. Companies will be more willing to respect the rules if the punishments that could be imposed are commensurate with their ability to pay. It should not be forgotten that the maximum sanction would only be imposed in cases where the infringement of the law is extremely serious. Amendments that go in this direction include 2905 (GUE-NGL), 2921 (S&D), 2922 (S&D), 2925 (GUE-NGL).

11 - Data breach notification

Download pdf

What is needed?

There are no 100% secure systems. This was demonstrated in December 2012 by SNCB/NMBS Europe, the Belgian national railway operator, which leaked the personal data of 1.4 million customers. The leaked database contained personal data, including 5,682 email addresses from @ec.europa.eu (European Commission) and 1,668 European Parliament addresses. To date, the company has not notified any of the victims of the breach.

This incident shows quite clearly that the Commission proposals to introduce mandatory notifications to data protection authorities and to the victims of the breach should be supported. Existing laws in the US show that mandatory breach notifications are an effective tool to force companies and other organisations to quickly and comprehensively address breaches, as well as acting as an incentive for better security practices.

Moreover, supervisory authorities should maintain a public register of breaches. The safeguards against excessive notifications to victims and against excessive demands being made of the data controller (companies are only required to act “where feasible”) are clear and reasonable. Any weakening appears entirely unnecessary.

Negative amendments

Amendments have been tabled to weaken the notification obligation for companies and authorities so that security breaches would only need to be notified when the breach is “likely to adversely affect” the privacy of citizens or constitutes a “serious risk”. However, different controllers will define ‘adversely’ or ‘serious’ in different ways. In addition, it is naturally in a company's interest to underplay the impact of a breach on its customers for reputational and other reasons. These amendments are:

  • EPP: 1947, 1950, 1951, 1953, 1961, 2003
  • ALDE: 1952, 1955, 1956, 1959, 1998, 2000
  • ECR: 1997

Other amendments limit the breach notification to a narrow set of circumstances and suggest that processors should decide whether or not a breach needs to be notified to the data protection authority or to the victim of the breach. Some amendments even go as far as to delete the obligation to keep records of breaches. Amendments:

  • EPP: 1957
  • ALDE: 1964, 1967, 1968, 1975

Positive amendments

The introduction of a public register is a positive step, since it can help to educate the public about IT security and provide added insight into trends regarding breaches. It is also beneficial, for legal certainty, to introduce the obligation to request the opinion of the European Data Protection Board before adopting implementing acts.

Amendments:

  • S&D: 1989, 1994, 1996, 2009, 2013
  • GUE/NGL: 2004
  • Greens/EFA: 198, 202
 


With financial support from the EU's Fundamental Rights and Citizenship Programme.