Ethics and PIAs (Privacy Impact Assessments)


The recent Capgemini Research Institute report on ethics in organizations, “AI and the ethical conundrum: How organizations can build ethically robust AI systems and gain trust”, suggests, among other findings, that organizations that take ethics into account score higher on the Net Promoter Score, the most widely used index of customer loyalty. This direct relationship between ethics and increased sales makes it clear that customers have considerable influence over organizations, and it is something from which we can all collectively benefit.

That is why more and more organizations are seeking an appropriate way to evaluate the ethics of their use of data.

In practice, this means that many organizations weigh the ethical justification for how personal data is processed and, in doing so, question whether using the data is appropriate at all.

An impact assessment, known by its English acronym PIA (Privacy Impact Assessment), is an excellent instrument for carrying out this evaluation, since it allows organizations to balance the various interests against one another and, where necessary and possible, to adopt mitigating measures.

This balancing takes place within legal frameworks, but conducting a PIA will in many cases raise ethical issues (for example, the question of whether an interference is proportionate is addressed from the legal framework, yet it is in itself an ethical question: what is proportionate, and how is proportionality measured?). Consequently, the choice to conduct a PIA already involves an ethical consideration.

Conducting a PIA is therefore not a purely legal exercise; it can be an excellent way to reflect on the ethical interests involved. For this reason, those responsible within organizations should be encouraged to be aware of the risks that a data project entails for the rights and freedoms of natural persons and to assume responsibility for careful processing. In this way, organizations can put the citizen and “data subject” at the centre while raising privacy awareness across the organization.

Since PIAs do not have a fixed format, organizations can choose to configure a PIA in a way that is consistent with their own ethics. The following adaptations are examples of what can be adopted:

  • Formulate “check” questions that reflect the core values of the organization (e.g. to what extent do the risks associated with the processing affect the organization’s core values, and what can be done to act on the basis of those values?); a minimal sketch below illustrates how such questions might be encoded.
  • Create an ethics committee within the organization that can take part in ethical evaluations.
  • Make sure that the PIA process includes asking the ethics committee for advice in certain cases (for example, stipulating that the committee will always evaluate the selection criteria when profiling is involved).
  • Be precise in distinguishing and formulating the need, the means (the proposed processing) and the objective to be achieved, in order to better identify the interests at stake and, consequently, to weigh proportionality.
  • Maintain a regular dialogue with those involved in the PIAs about any ethical issue related to data processing. Any risks arising from this dialogue can be included in the PIA for the data controller to take into account in the final evaluation.
  • Consult the data subjects, for example through customer surveys or by consulting interest groups (a possibility also provided for in Article 35(9) of the GDPR), so that risks are not addressed only from a technical point of view, but the perceived risk or sense of security is also well represented.

Regularly reconsider the PIAs already carried out: society, context and technology, and with them the risks, are constantly changing.
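
As a purely illustrative aid, the sketch below (in Python) shows how the value-based “check” questions and the ethics-committee trigger described above might be encoded in a PIA workflow. Every name in it (CheckQuestion, PIARecord, needs_committee_review, the sample core values) is a hypothetical assumption made for this example, not a standard or a prescribed format.

    from dataclasses import dataclass, field

    # Hypothetical core values of the organization (illustrative only).
    CORE_VALUES = ["transparency", "fairness", "data minimisation"]

    @dataclass
    class CheckQuestion:
        """A value-based 'check' question asked during the PIA."""
        value: str            # the core value the question tests
        question: str         # the question put to the project team
        answer: bool = False  # True = the processing affects this value

    @dataclass
    class PIARecord:
        """A minimal PIA entry for one proposed processing operation."""
        need: str             # why the data is needed
        means: str            # the proposed processing
        objective: str        # what the processing should achieve
        involves_profiling: bool = False
        checks: list[CheckQuestion] = field(default_factory=list)

    def needs_committee_review(pia: PIARecord) -> bool:
        """Ask the ethics committee for advice when profiling is involved
        or when any check question flags a core value as affected."""
        return pia.involves_profiling or any(c.answer for c in pia.checks)

    # Usage: a proposed processing operation that relies on profiling.
    pia = PIARecord(
        need="reduce customer churn",
        means="score customers with a predictive model",
        objective="targeted retention offers",
        involves_profiling=True,
        checks=[CheckQuestion(v, f"Does the processing affect '{v}'?")
                for v in CORE_VALUES],
    )
    print(needs_committee_review(pia))  # True: committee advice is required

Writing the questions down as data rather than free prose makes it easier to apply them consistently across PIAs and to revisit them when values, context or technology change.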

PIAs are therefore a framework for data ethics, although they will not be the only one needed to achieve responsible processing of data.

Glossary:
AI: Artificial Intelligence.
PIA: Privacy Impact Assessment.
GDPR: General Data Protection Regulation.
