#PrivacyCamp23: Event Summary

In January 2023, EDRi gathered policymakers, activists, human rights defenders, climate and social justice advocates and academics in Brussels to discuss the criticality of our digital worlds.

We welcomed 200+ participants in person and enjoyed an online audience of 600+ people engaging with the event livestream videos. If you missed the event or want a reminder of what happened in a session, find the session summaries and video recordings below.

In 2023, we came together for the 11th edition of Privacy Camp, which is jointly organised by European Digital Rights (EDRi), the Research Group on Law, Science, Technology & Society (LSTS) at Vrije Universiteit Brussel (VUB), the Institute for European Studies at Université Saint-Louis – Bruxelles (IEE at USL-B), and Privacy Salon.

What is the topic that brought us together?

Invoking critical times often sounds like a rhetorical trick. And yet, in 2022, we witnessed the beginning of both an energy and security crisis, caused by the Russian invasion of Ukraine. In the meantime, the world is still dealing with a major health crisis, while increasingly acknowledging the urgency of the climate and environmental crisis. In fact, crises are situations where the relations in which we are entangled change, so that understanding and making an impact on how these relations change, and in favour of whom, becomes crucial.

In this context, we asked ourselves the following questions: How do digital technologies feed into and foster the multiple crises we inhabit? What do we need to consider when approaching the digital as a critical resource that we should nurture, so as to promote and protect rights and freedoms?

Check out the summaries below to learn about the role of digital technologies in ongoing world crises and the current efforts to build a people-centred, democratic society!

Each year, Privacy Camp is made possible thanks to the generous support of our partners and individual donors like you. Your donations help us to maintain the conference free of charge for all digital rights advocates, activists, academics and all other participants from Europe and beyond.


Reimagining platform ecosystems

Session description | Session recording

In the context of the upcoming implementation of the Digital Services Act (DSA) and Digital Markets Act (DMA), the panel discussed their vision of a sustainable and people-oriented platform ecosystem. Ian Brown pointed to the need for platform business innovation around interoperability. Chantal Joris expressed the vision of more open space for healthy debate, at the expense of the current monopoly of a few companies deciding what information users can access. Jon von Tetzchner prioritised the urgency of banning the data collection and profiling practices companies currently use. And Vittorio Berlota concluded that the future challenge we face is finding a middle ground between not over-regulating the internet and preventing companies from “conquering the world, and conquering knowledge”.

The panel spent some time discussing interoperability, noting that it can advance the goals of the General Data Protection Regulation (GDPR), i.e. improving data protection by empowering users to choose more privacy-friendly services, which in turn should create market pressure on all market participants. Concerns were expressed about the current regulatory approach giving more control to companies that tend to restrict users’ speech, and about companies’ ability to circumvent interoperability obligations.

The audience raised the question of how law enforcement bodies relate to interoperability. The speakers highlighted that a limited number of platforms would equally be beneficial for law enforcement, since it would ultimately facilitate control – whereas decentralisation and a larger number of players would make government control more difficult.

The rise of border tech, and civil society’s attempt to resist it, through the AI Act’s eyes

Session description | Session recording

The session focused on the currently negotiated Artificial Intelligence Act (AI Act) and how the legislation is failing to protect people on the move despite civil society’s recommendations. Simona De Heer highlighted the lack of clarity and transparency in the European Union’s and Member States’ development and use of technology in migration. This technology is used to profile, analyse and track people on the move at all times. Prof Niovi Vavoula explained that these AI systems are inherently biased, creating a power imbalance and a divide between foreigners and non-foreigners. While there is no ban on uses of tech in migration, discriminatory practices such as emotion recognition and behaviour analysis, predictive analytics, risk profiling and remote biometric identification are on the rise.

Alyna Smith shed light on the impact such uses of AI-powered technologies have on undocumented people, as ID checks have become part of a strategy to increase deportations. An underlying element of these practices is racial profiling and criminal suspicion against migrants: evidence shows that 80% of border guards say that ethnicity is a useful indicator. Hope Barker highlighted that the issue is not the tech itself but the use of tech for harmful ends. For example, reports show that Frontex officers have taken images and videos of people on the move without consent, and that drones collect vast amounts of data without people knowing how the information will be used, in countries like Croatia, Greece, North Macedonia, and Romania.

Workshop: Policing the crisis, policing as crisis: the problem(s) with Europol

Session description

The workshop engaged the audience with a series of questions about the nature and powers of Europol. Chris Jones pointed out that the last reform gave Europol many tasks that it was already executing, including processing personal data. The agency is becoming a hub for receiving data from non-EU countries and may be involved in “data laundering” (using data extracted from illegal actions such as torture). Laure Baudrihaye-Gérard added that the lack of scrutiny of how Europol processes data, and the unclear legal position on whether the agency can be sanctioned for misbehaviour, put thousands of people under mass surveillance without the right to fight back.

Fanny Coudert underlined that the new reform will make it very difficult for the European Data Protection Supervisor to limit the data processing by Europol, especially in the case of large databases. Saskia Bricmont revealed that there was no strong opposition in the European Parliament or the Council during the Europol reform negotiations. Civil society’s actions had an impact on amending and ensuring some scrutiny during the reform, which, however, was rejected in the end.

Sabrina Sanchez spoke about the attempt to make “prostitution” a European crime in the VAW Directive. Since the Europol system is very opaque, sex workers don’t know if they are in a database where organising as sex workers is criminalised. Romain Lanneau finished with a Dutch case in which the data of a leftist activist was used by Europol. The case is the tip of the iceberg, given the lack of scrutiny of Europol. Romain invited the audience to request their data from Europol. See more here.

Contesting AI & data practices. Practical approaches to preserving public values in the datafied society

Session description | Session recording

The session investigated the practical implementation of data ethics from academic, business and legal perspectives. Iris Muis kicked off the conversation by presenting the tools (Fundamental rights and algorithms assessment and Data Ethics Decision Aid) Utrecht University developed and how they were used in the government sector. Joost Gerritsen made the point that implementing data practices that preserve public values should be profitable. He stressed that it’s important to acknowledge that GAFAM is not representative of the whole of Europe as there are other companies relying on artificial intelligence.

Willy Tadema took us through the recent historical development of AI and how governments relate to using the technology. Willy noted that nowadays governments are reluctant to use AI without sound reasoning. The discussion raised the question of whether we should first consider the need to build an algorithmic system at all, and only then look at data ethics. The panel went on to discuss the issue of putting the responsibility and “burden of moral decision” on the tech team. Willy pointed out that we should have all key stakeholders in the room from the very beginning, including those affected by the algorithm.

Critical as existential: The EU’s CSA Regulation and the future of the internet

Session description | Session recording

Ella Jakubowska started the panel by introducing the debate surrounding the CSA proposal, which has been subject to criticism from a broad coalition of experts, ranging from legal experts to computer scientists. The debate kicked off with the participants articulating their understanding of ‘critical’ and ‘digital’: for Patrick Breyer, they imply understanding the opportunities and limits posed by technology; for Elina Eickstädt, they reflect key principles of the CCC, i.e. that computers can and should be used to do good, while mistrust in authority and decentralised systems remain key; for Corinna Vetter, they imply that digital policy is also social policy, and that we should thus be aware of existing power structures and their effects on vulnerable groups.

Elina Eickstädt stressed that the belief among lawmakers that it is possible to undermine encrypted communication in a secure way is an illusion. Encryption is binary: either you have it, or you don’t. Screening content before it is encrypted via client-side scanning would break the principle of end-to-end encryption, namely that users are in total control of who can read their messages.

Corinna Vetter highlighted the social and labour policy issues of the CSA proposal. Since no algorithm yet exists that can reliably detect CSA content, detection would have to be done by people who would get access to private communications. Furthermore, we already know from current content moderation practices that this work is done by workers who suffer from very poor working conditions and from the psychological impact of the content they have to review. Patrick Breyer presented how the CSA proposal would create a new and unprecedented form of mass surveillance.

Workshop: Police partout, justice nulle part / Digital police everywhere, justice nowhere

Session description

The workshop started by focusing on trends in tech surveillance and the harms to racialised communities. Dr Patrick Williams highlighted that the UK is currently facing crises in the police institution, and that in this context institutions are trying to distance themselves from the issue. Chris Jones spoke about the 2015-2016 period, when people on the move were coming to the European Union and the institutions resorted to controlling and tracking people in response. He underlined that the problem is not the technology but power, and pointed out that this approach materialises on an international scale as global policing strategies.

Itxaso Domínguez de Olazábal outlined the role of Israel in the development of the techno-solutionist approach, as it has built a leading surveillance industry, testing new technologies on Palestinian people. Itxaso explained that this system of extraction and testing underlines capitalist racism. Laurence Meyer explained that police and prisons are not effective because they are designed to enforce order, not to reduce crime or increase safety. What we have seen is that digital policing tools are both discriminatory and criminalising, impacting people who exist outside of hegemonic structures.

The discussion concluded with a strong call to action: to find cracks in the policing and criminal system and create our resistance there, and not to be paralysed in the face of these seemingly perfect and infallible technological structures. We can work towards non-reformist reforms, reducing the power of the police, going beyond technology and posing a political question about abolition.

In the eye of the storm: How sex workers navigate and adapt to real – and mythical – crises

Session description | Session recording

Kali Sudhra started the conversation by outlining the context that the COVID-19 pandemic created for sex workers. Sex workers were faced with less space in their community, increased police encounters and more people moving online to do their work. However, the online environment brought more risks, including financial discrimination by platforms, as sex workers have to abide by terms of service which are often discriminatory and require the disclosure of private information (e.g. PayPal). Sex workers also experience online censorship as a consequence of racist algorithms, meaning many cannot advertise their services, which pushes them further to the margins.

Yigit Aydinalp spoke about the role of private actors as enablers of harmful legal frameworks. In the Digital Services Act, the Greens introduced an amendment on non-consensual imagery, which would mean that hosts of content have to collect users’ phone numbers. This violates the data minimisation principle, especially when working with marginalised communities. Sex workers were not consulted on this. In the Child Sexual Abuse Regulation proposal, we also see an over-reliance on tech solutions in response to another crisis, resulting in more surveillance of marginalised communities.

Saving GDPR enforcement thanks to procedural harmonisation: Great, but how exactly?

Session description | Session recording

Lisette Mustert focused on the cooperation between the European Data Protection Supervisor (EDPS) and national authorities to highlight how cross-border handling of data occurs, and to express the need for a new set of rules that would clarify the existing uncertainties. To reach a consensual outcome, lead authorities need to assist each other. The cooperation process is complex and slow, and it may deprive parties of their procedural rights, undermining the protection of digital rights, administrative rights and defence rights under the GDPR system as designed.

Gwendal Le Grand explained that authorities do not always agree on the interpretation of the law and on decisions; in such cases, the dispute resolution mechanism is triggered. Romain Robert gave the example of the Meta complaint that EDRi member noyb submitted, and the many procedural issues along the way. Maria Magierska highlighted the capacity and resource limitations of Data Protection Authorities (DPAs), which should be seen as a structural problem rather than an individual one.

Asked what could solve the GDPR’s harmonisation issue, the speakers mentioned implementing the principle of good governance, implementing the law in full, having DPAs work as fast as the EDPS, and making the regulation as clear as possible.

Workshop: The climate crisis is a key digital rights issue

Session description

Jan Tobias introduced the discussion by outlining the question of the link between the climate crisis and digital infrastructure. Harriet Kingaby focused on the disinformation economy and climate disinformation, pointing out the harms advertising tools create for society. Narmine Abou Bakari presented empirical evidence of how tech companies negatively impact the environment. For example, in 2020 tech companies consumed 9 percent of global electricity, and they are currently far from net-zero. Another comparison showed that the energy consumption of cryptocurrency mining equals that of the entire country of Argentina.

Narmine also emphasised that we need repairable devices, urging us to advance people’s right to maintain their devices and choose their software, to demand transparent communication from companies, and to give users more control over data and software.

The speakers also discussed whose justice we prioritise when we speak about the climate crisis from a digital perspective, and how we can engage in these spaces.

Solidarity not solutionism: digital infrastructure for the planet

Session description | Session recording

The panel started with an overview of the historical relation between technology and the climate. Some examples link to the surveillance of movements and land defenders; access to information and the spreading of disinformation; and the extractive nature of corporate technology practices. Paz Pena focused on geopolitical ethics and the exploitation of nature by digitalisation and globalisation, asking which communities, as well as non-humans, are paying the price for the “solutions” to the climate crisis put forward by companies.

The conversation then turned to some of the false solutions offered for the climate crisis and their links to digital rights. Becky Kazansky spoke about carbon offsetting and how institutions, companies and individuals use it to compensate for their carbon footprint. Ecology is not a balance sheet, so it is important that we treat climate pledges by tech and other companies the same way as other digital rights issues. Lili Fuhr added that the way we define the problem dictates the solution – and what we are seeing is that tech is being brought in to ‘fix’ the climate and made to ‘fit’ the problem. The discussion concluded with several critical points suggesting that even though Big Tech contributes to the climate crisis, what we need to fight is the hegemonic logic of technocapitalism as a ‘solution’ to ecological crisis. A social justice problem cannot be distilled into a feel-good practice for European consumers.

The EU can do better: How to stop mass retention of all citizens’ communication data?

Session description | Session recording

Data retention is a topic that comes and goes. Plixavra Vogiatzoglou spoke about the case law on mass data retention and some of the challenges arising from it. The first challenge is distinguishing whether the interference is serious or not. Data retention in itself, regardless of any harm or the collection of sensitive information, constitutes interference, and this conclusion is even more justified when private and sensitive information is included. The interference should be assessed as serious because the vast amount of data collected significantly amplifies the information and power asymmetries between citizens and the government. The second challenge concerns the differentiation between public and national security, which seems to be made on the basis of immediate and foreseeable threats. The third challenge is striking a balance between allowing new technology in and avoiding the facilitation of mass surveillance.

In Belgium, there are currently three data retention laws, subjecting people to mass surveillance. In Germany, there is space to establish an alternative to the mass data retention approach. Furthermore, the panel discussed how the European Court of Justice reacts to the pressure coming from the Member States and what actions could be taken to defend fundamental digital rights.

Workshop EDPS Civil Society Summit: In spyware we trust. New tools, new problems?

Session description

Wojciech Wiewiórowski kicked off the discussion with a reflection on the way member states behave when it comes to national security, recognising that many states rely on tools like spyware. He highlighted that one of the major issues is that the national security exemption is not harmonised, and that Data Protection Authorities perform different duties at national level. Hence, the oversight of the European Data Protection Supervisor is very important.

Rebecca White pointed out that civil society is burdened with the task of proving that untargeted mass surveillance is not the way to ensure security. Eliza Triantafillou spoke about the Predator spyware investigations and the challenge of persuading the public that surveilling journalists is harmful. For example, recent news revealed that a journalist had been under surveillance by the Dutch secret services for 35 years, and now none of their protected sources wants to work with them. Bastien Le Querrec added that beyond Pegasus, there are many other uses of spyware that people are not familiar with.

The following part of the workshop took a fishbowl structure and allowed for intervention from the audience. The main issues that were raised were around national security, the efficacy of a ban on spyware, and commercial use of spyware.


Thank you to this year’s Privacy Camp sponsors for their support! Reach out to us if you want to donate to #PrivacyCamp24.


Author
Viktoria Tomova
Communications and Media Officer
Twitter: @tomova_viktoria

#PrivacyCamp23: Critical. Digital. Crisis. | Call for Panels (Deadline extended)

The 11th edition of Privacy Camp invites you to explore the criticality of our digital worlds.

Invoking critical times often sounds like a rhetorical trick. And yet, this year, we have witnessed the beginning of both an energy and security crisis, caused by the Russian invasion of Ukraine. In the meantime, the world is still dealing with a major health crisis, while increasingly acknowledging the urgency of the climate crisis. In fact, crises are situations where the relations in which we are entangled change, so that understanding and making an impact on how these relations change, and in favour of whom, becomes crucial.

Hence, these are some of the questions we want to ask. How do digital technologies feed into and foster the multiple crises we inhabit? What do we need to consider when approaching the digital as a critical resource that we should nurture, so as to promote and protect rights and freedoms?

Our call for panels aims at fostering a conversation in which the criticality of digital technologies can be read in two ways (following the Oxford English Dictionary). On the one hand, the critical as a situation having the potential to become disastrous. On the other hand, the critical as having a decisive or crucial importance in the success, failure, or existence of something. For instance, the critical nature of digital infrastructures rests on the importance of these infrastructures in all aspects of our lives: from public health to education, labour and services, from politics to intimate relations. At the same time, digital infrastructures and technologies become critically important in times of crises.

While not all critical infrastructure is digital, much of the digital infrastructure is becoming critical.  For example, digital technologies can contribute to reinforcing geo-political tensions, because of extractivist approaches to rare raw materials mining and the reliance on external powers for critical infrastructure.

Our ability to deal with multiple crises as societies is also quite dependent on the way the digital public sphere functions and how EU regulation is enforced, including the General Data Protection Regulation (GDPR) and the Digital Services Act. Ongoing debates about what kind of security European societies and institutions want to pursue contribute to shifting European and national authorities’ stance on what kinds of technologies law enforcement and migration control should rely on, and how these technologies should be governed. This is becoming particularly tangible in the approach to the regulation of border control technologies in the AI Act, increased powers and broader scope of the recent Europol’s mandate, as well as to the regulation of Child Sexual Abuse Material (CSAM) in the CSA Regulation.

If we are concerned with the future of rights, democracies and the planet, we are to remain critical of how both rights and digital infrastructures are organised and controlled, what role should the private sector have, or how all things digital impact our environment in terms of energy and ecology. This requires a broader conversation involving perspectives and approaches focusing on various rights, be they individual and collective, traditional or new.

In 2023, Privacy Camp invites you to participate in, and foster, a discussion about the critical state(s) of a world in which the digital is, itself, critical. What does it mean to regulate digital technologies and infrastructures in times of crises? Specifically, we invite submissions that answer the following questions:

  • What do digital rights look like during times of crises?
  • What is the long-term impact of border control digital technologies implemented during times of crises?
  • How does the crisis logic boost extractivist approaches with regard to data, natural resources and social justice?
  • What is the link between crisis politics and the rise of securitisation narratives in EU digital policy-making?
  • How do we avoid a techno-solutionist approach to solving crises?
  • How do we sustain legal standards and rule of law principles? Is the GDPR in the middle of an (enforcement) crisis, and how can we save it?

Submission guidelines:

  • Indicate a clear objective for your session, i.e. what would be a good outcome for you?
  • Include a list of a maximum of 4 speakers that could participate in your panel. Ensure you cover academia, civil society and decision-makers’ perspectives. Let us know which speaker(s) has/have already confirmed participation, at least in principle.
  • Make it as interactive as possible and encourage audience participation.
  • Support diversity of voices among panelists and strive for multiple perspectives.
  • Note that the average panel length is 50 minutes.

To submit a proposal, please fill in this form by 20 November 2022, 23:59 CEST. (Deadline extended)

We will review submissions and will notify you about the outcome of the selection procedure before 1 December 2022. Please note that we might suggest combining panel proposals if they are similar or complement each other.

About Privacy Camp

Privacy Camp is jointly organised by European Digital Rights (EDRi), the Research Group on Law, Science, Technology & Society (LSTS) at Vrije Universiteit Brussel (VUB), the Institute for European Studies at Université Saint-Louis – Bruxelles (IEE at USL-B), and Privacy Salon.

In 2023, Privacy Camp’s Content Committee are: Andreea Belu (EDRi), Gloria González Fuster (LSTS, VUB) and Rocco Bellanova (LSTS, VUB).

Privacy Camp 2023 will take place on 25 January 2023 in a hybrid format (in Brussels with online broadcast). Participation is free and registrations will open in December 2022.

For inquiries about the programme and/or to support the event organisation as a volunteer, please contact Andreea Belu at andreea.belu(at)edri(dot)org.

#PrivacyCamp22: Call for panels! (New deadline)

** UPDATE: The deadline is extended to 14 November 2021, 23:59 CEST.

Privacy Camp turns 10. It is time to celebrate. But Privacy Camp 2022 is also the occasion to reflect on a decade of digital activism, and to think together about the best ways to advance human rights in the digital age.

The 10th edition of Privacy Camp invites a forward-looking retrospective on the last decade of digital rights. This edition aims at building on the lessons of the past and at collectively articulating strategic ways forward for the advancement of human rights in the digital society.

Emerging intersections within the realms of regulating digitalisation, as well as within other broader social justice movements, suggest that – while some issues remain timeless – the power struggles ahead might happen on new terrain(s).

How can we adapt to these new terrains, while drawing on a decade’s worth of lessons? How can we organise with broader groups of people and other communities? What are the points of reflection we must focus on, to address the wider impact of the digital rights’ fight?

Concretely, we want to explore ideas for (1) putting rights at the centre of digital policies, and (2) bringing marginalised perspectives to the core of digital rights discussions. In this spirit, we call for solution-oriented panel proposals around the following themes:

1. Putting rights at the centre of digital policies

Too often, rights are an afterthought of digital policies. In the past decade we have seen, again and again, decision-makers decide first and think about the impact on digital rights later. How can this be changed, so that future policy decisions get rights right from the start, notably in relation to automated decision-making, the platform economy, data protection and privacy of communications, and the surveillance infrastructure?

Notably, we invite proposals tackling questions such as those below:

  • What can we learn from national and EU debates around digital rights, that will be relevant for current and upcoming challenges?
  • At EU level, has there been an evolution in terms of better integration of fundamental rights concerns into policy-making and socio-technical design?
  • Has the changing role of EU institutions in relation to fundamental rights affected their approach to digital policy? Does it depend on the EU institution?
  • Halfway through its term, where does the European Commission stand in terms of digital rights and policies?
  • How do debates about EU digital policies intersect with the power of Big Tech and national states?
  • How to make sure that rights remain a central priority when legal instruments have been adopted and what is needed is to guarantee their effective enforcement (e.g. GDPR enforcement)?

2. Bringing marginalised perspectives to the core of digital rights discussions

The digital rights agenda was never neutral. It has been shaped over the years by a predominantly reactive approach to digital policy debates. Importantly, it also has its own dynamics dependent on a rather specific set of priorities. This means that some perspectives on digital rights, notably those coming from the point of view of marginalised people and communities, have been themselves marginalised. What are the voices and issues that have been left out, heard less, or simply not amplified enough?

Notably, we invite proposals tackling questions such as those below:

  • How have digital rights strategies and approaches suffered from a limited perspective in the past?
  • How can the digital rights community better centre the voices of people disproportionately affected by exploitative digitalisation, such as women, LGBTQI+ communities, racialised communities and people from the global south, people with disabilities, working-class people?
  • What are the lessons learnt from creating broader coalitions with other actors such as workers’ unions, groups advocating for women’s rights, LGBTQI+ rights, anti-racism movements, or migrants’ rights defenders?
  • What can we learn from how marginalised groups have been affected by digitalisation, and what effects have legal frameworks had to counter this disproportionate impact?
  • How can we make sure that when we put rights at the centre of digital policies the concerns of marginalised people are given the necessary space?
  • How might the digital rights field incorporate transformative justice and decolonial perspectives into its work?

Deadline for panel proposal submissions: 7 November 2021

Background

The past decade brought the increased digitalisation of all aspects of our life. This process has led to a growing production of data in digital formats, be they personal or non-personal data, data related to content or metadata, and often sensitive data because of their nature or because of how they are processed.

In this context, corporate and government entities have gained unprecedented power. Internet services and digital technologies have developed in often inadequate and insufficient regulatory frameworks. As a result, many have been excluded from the benefits of the digitalisation process.

In the realm of the internet, but also beyond it, our societies have seen the rise and normalisation of government mass surveillance and surveillance capitalism, with Big Tech grabbing power across all areas of public life, including public services. Connected to this trend, public and political debates have often centred on securitisation arguments, with policy-making focused on counter-terrorism measures and on border and migration surveillance.

Against this tide, civil society, academia and some policymakers have worked together to curtail the harms of data exploitation and to promote regulatory frameworks that put human rights at their centre.

In the past 10 years, Privacy Camp has become a forum that facilitates discussion and debate, and that offers an occasion to coordinate and strategise better. It has foregrounded issues concerning EU data protection law, online content regulation and platforms’ power, the confidentiality of communications, and the regulation of emerging technologies such as Artificial Intelligence (AI), among others.

In 2020 and 2021, the public debate was dominated by the impact of harmful online content, the rise of biometric mass surveillance and, once again, the false dichotomy between rights and security. Furthermore, the COVID-19 pandemic highlighted our society’s dependence on digital technologies and on the actors that control them, as well as the role of individuals in facilitating or preventing access to data. It thus pushed legislators to focus on the need to further regulate digitalisation, and even re-ignited their aspirations to achieve a so-called ‘digital sovereignty’, whose contours remain unclear.

The digital rights field’s composition, organisational practices and methods, however, have often left the people most affected by harmful uses of technologies outside of policy, advocacy or litigation work. This has resulted in siloed approaches to human rights in the digital age, or in overlooking the impact of digital infrastructure on marginalised groups and on the planet itself.

With this edition of Privacy Camp, we want to move beyond empty calls to put ‘the (undefined) human at the centre’ and towards genuinely taking people into account in digital rights work.

Submission guidelines:

  • Indicate a clear objective for your session: what would a good outcome look like for you?
  • Include a list of a maximum of 4 speakers that could participate in your panel. Ensure you cover academia, civil society and decision-makers’ perspectives. Let us know which speaker(s) has/have already confirmed participation, at least in principle.
  • Make it as interactive as possible and encourage audience participation.
  • Support diversity of voices among panelists and strive for multiple perspectives.
  • Note that the average panel length is 50 minutes.

To submit a proposal, fill in this form by 14 November 2021, 23:59 CET.

After the deadline, we will review your submission and notify you of the outcome of the selection procedure before 29 November. Please note that we might suggest merging panel proposals if they are similar or complement each other.

About Privacy Camp

Privacy Camp is jointly organised by European Digital Rights (EDRi), Research Group on Law, Science, Technology & Society (LSTS) at Vrije Universiteit Brussel (VUB), the Institute for European Studies at Université Saint-Louis – Bruxelles (IEE at USL-B), and Privacy Salon.

In 2022, Privacy Camp’s Content Committee members are: Andreea Belu (EDRi), Gloria González Fuster (LSTS, VUB) and Rocco Bellanova (IEE, USL-B).

Privacy Camp 2022 will take place on 25 January 2022 online.

Participation is free and registrations will open in December 2021.

For inquiries, please contact Andreea Belu at andreea.belu(at)edri(dot)org.

#PrivacyCamp21: Event Summary

The theme of the 9th edition of Privacy Camp was “Digital rights for change: Reclaiming infrastructures, repairing the future” and included thirteen sessions on a variety of topics. The event was attended by 250 people. If you missed the event or want a reminder of what happened in a session, find the session summaries here.

Platforms, Politics, Participation: Save the Date and Call for Panel Proposals

Join us for the 7th annual Privacy Camp!

Privacy Camp will take place on 29 January 2019 in Brussels, Belgium, just before the start of the CPDP conference. Privacy Camp brings together civil society, policy-makers and academia to discuss existing and looming problems for human rights in the digital environment.

Take me to the call for panel submissions.
Take me to the call for user story submissions.

Platforms, Politics, Participation

Privacy Camp 2019 will focus on digital platforms, their societal impact and their political significance. Due to the rise of a few powerful companies such as Uber, Facebook, Amazon or Google, the term “platform” has moved beyond its initial computational meaning of technological architecture and has come to be understood as a socio-cultural phenomenon. Platforms are said to facilitate and shape human interactions, thus becoming important economic and political actors. While the companies offering platform services are increasingly the target of regulatory action, they are also considered allies of national and supranational institutions in enforcing policies voluntarily and in gauging political interest and support.

Digital platforms employ business models that rely on the collection of large amounts of data and the use of advanced algorithms, which raise concerns about their surveillance potential and their impact on political events. Increasingly rooted in the daily life of many individuals, platforms monetise social interactions and turn to questionable labour practices. Many sectors and social practices are being “platformised”, from public health to security, from news to entertainment services. Lately, some scholars have conceptualised this phenomenon as “platform capitalism” or the “platform society”.

Privacy Camp 2019 will unpack the implications of “platformisation” for the socio-political fabric, human rights and policy making. In particular, how does the platform logic shape our experiences and the world we live in? How do institutional actors attempt to regulate platforms? In what ways do the affordances and constraints of platforms shape how people share and make use of their data?

Participate!

We welcome panel proposals relating to the broad theme of platforms. Besides classic panel proposals we are also seeking short contributions for our workshop “Situating Platforms: User Narratives”.

1. Panel proposals

We are particularly interested in panel proposals on the following topics: platform economy and labour; algorithmic bias; democratic participation and social networks.

Submission guidelines:

  • Indicate a clear objective for your session: what would a good outcome look like for you?
  • Indicate other speakers that could participate in your panel (and let us know which speaker has already confirmed, at least in principle, to participate).
  • Make it as participatory as possible: think about how to include the audience and diverse actors. Note that the average panel length is 75 minutes.
  • Send us a description of no more than 400 words.

2. “Situating Platforms: User Narratives” submissions

In an effort to discuss situated contexts with regard to platforms, we will have a session on lived practices and user narratives. Individuals, civil society groups or community associations are welcome to contribute in the format of a short talk or show & tell demonstration. Details and the online submission form are here:

Submission form

Deadline

The deadline for all submissions is 18 November. After the deadline, we will review your submission and let you know by the end of November whether your proposal can be included in the programme. It is possible that we suggest merging panel proposals if they are very similar.

Please send your proposal via email to privacycamp(at)edri.org!

If you have questions, please contact Kirsten at kirsten.fiedler(at)edri(dot)org or Imge at imge.ozcan(at)vub(dot)be.

About Privacy Camp

Privacy Camp is jointly organised by European Digital Rights (EDRi), the Institute for European Studies of the Université Saint-Louis – Bruxelles (USL-B), the Law, Science, Technology & Society research group of the Vrije Universiteit Brussel (LSTS-VUB), and Privacy Salon.

Participation is free. Registrations will open in early December.

Press release: 6th annual Privacy Camp takes place on 23 January 2018

Tomorrow, on 23 January 2018, Privacy Camp brings together civil society, policy-makers and academia to discuss problems for human rights in the digital environment. In the face of what some have noted as a “shrinking civic space” for collective action, the event provides a platform for experts from across these domains to discuss and develop shared principles to address key challenges for digital rights and freedoms.

Themed “Speech, settings and [in]security by design”, the one-day conference at the Saint-Louis University in Brussels features panel discussions and privacy workshops led by experts in the fields of privacy, surveillance and human rights advocacy. The nonprofit, nonpartisan event draws privacy activists, civil society representatives, public servants and academics of all ages and backgrounds who are interested in improving privacy and security in communications and in working towards respect for human rights in the digital environment.

This year, Privacy Camp also features the “Civil Society Summit” of the European Data Protection Supervisor (EDPS).

Among the speakers at Privacy Camp 2018 are Giovanni Buttarelli, Wojciech Wiewiorowski, Fanny Hidvegi, Glyn Moody, Katarzyna Szymielewicz, Juraj Sajfert and Marc Rotenberg. The full programme can be accessed here.

Post-camp Party, Tuesday 23/1 from 7pm onwards

All good things must come to an end. But not all things end quite as dramatically and with as much suspense as this year’s PrivacyCamp!

Join our post-camp party at Smouss Bar! There will be complimentary snacks, free drinks (for those with a conference badge, so remember to register to get yours) and, most importantly, our legendary “Big Fat PrivacyCamp Quiz of 2018”.

Here are the directions for getting from the conference to the after-party (on foot, 18 minutes):

From 19:00 onwards
at Smouss Café
112 Rue du Marché au Charbon
1000 Bruxelles