Ethics in VIRT-EU is understood as values in action, as the ethics of material choices, as the ethics of navigating complex decision spaces.

What If Everyone In the World?

Work out the implications of your ethical challenges: assess the roots of each challenge and ideate options to address it.

Use when trying to understand ethical challenges and difficult decisions.
Takes 45 minutes to an hour.

Mapping options

Facilitate a discussion and evaluation of the ideas you came up with while working through the ethical challenges and/or the What If Everyone In the World exercise.

Use when assessing solutions.
Takes around 30 minutes.

Reflect on your values

Check in with your "North Stars" - the values you identified at the beginning of your ethical stack - and be realistic about how well you can align with them.

Use as a periodic check-in during the process of product development.
Takes around 20 minutes.


Gather your team to articulate your ethical values as individuals and as a group in relation to your product development.

Use at the beginning of product development.
Takes around 30 minutes.

All Ethical Issues

We have created an open spreadsheet collecting all of the value challenges and value clashes; you can refer to it here.

Technical Privacy Questionnaire

Go to the VIRT-EU service package for the full Privacy, Ethical and Social Impact Assessment (PESIA) self-assessment questionnaire, developed by Politecnico di Torino and the Open Rights Group.

Value Clashes

Take a look at scenarios where ethical values may clash with one another. Since ethics tends to come up when things go wrong, look now - before things go wrong! - at some difficulties you might face if you care about any of the values below. These clashes are the basis for the workshops and experiences we have built.

Wellbeing & Data

The marketing language around wellbeing devices often presents data collection and self-surveillance as a route to mental and physical health, reduced stress, greater self-control, higher levels of motivation, and an increased ability to switch off. There is an implicit assumption that you will not be "the best version of yourself" without the quantification-supported striving that buying and using these products enables.

Autonomy & Agency

A company developing an IoT radio cares deeply about openness (anyone can contribute) and inclusion (all kinds of people can participate). Some people with extreme viewpoints participate as loudly as possible. Can the company be "inclusive" if some users feel extremely uncomfortable and leave the platform? Should it stay "open and inclusive" as previously defined, or does it need to redefine those values?

Responsibility & Sustainability | Inclusion

A new connected lightbulb lets users control the brightness with voice commands. The company cares deeply about sustainability (producing as little electronic waste as possible) and responsibility (being responsible about how it provides value to the user and how it processes and uses data). It is releasing a software update with a new machine learning algorithm that will make it easier to process voice commands. The update can only run on a new microcontroller with local machine learning, so the company is releasing a complete new package of hardware and software. Is there a way to stay "sustainable" while remaining "responsible" in how the machine learning is implemented (locally rather than in the cloud)?
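One way to hold "sustainable" and "responsible" together is to ship software that degrades gracefully: new hardware gets the on-device model, while old hardware keeps working with the previous pipeline instead of becoming e-waste. A minimal Python sketch of such a capability check (all names and capability flags are hypothetical, not from any real product):

```python
# Hypothetical sketch: choose a voice-processing pipeline based on what the
# installed hardware can actually run, rather than forcing a hardware upgrade.

def choose_voice_pipeline(device_capabilities: set) -> str:
    """Pick the best voice pipeline the device can run."""
    if "local_ml" in device_capabilities:
        # New microcontrollers: run the new ML model on-device
        # (responsible: voice data never leaves the lightbulb).
        return "local-ml-v2"
    # Older microcontrollers: fall back to the previous rule-based parser
    # (sustainable: existing hardware stays useful instead of being recalled).
    return "rule-based-v1"
```

With a check like this, one software release can serve both generations of hardware, so users with the old microcontroller are not pushed toward a replacement they do not need.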

Sustainability & Security

If the security of one element of your device is compromised, how many more elements of your system would be compromised? Sustainability is linked to security: if you invest in the security of your system, you lower the likelihood of having to recall the hardware. This may mean making some hard choices about "green" materials if those materials are less secure. Bluetooth Low Energy (BLE) is a case in point: what you save in battery life and complexity comes at the price of easy discoverability and exploitability. While BLE does support security, that support is rarely implemented, and when it is implemented it is often done poorly. A companion app for such a device can collect a lot of personal and intimate data - accessing your phone or tablet's microphone, camera, and location - which means a lot of potentially intimate information could be hacked or shared without consent if something went wrong.

Inclusion & Privacy

If your device will only be beneficial if users give up sensitive data, consider that you are requiring them to open up more of their private lives than they are typically comfortable with.
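One way to soften this clash is data minimisation: collect only the fields a feature actually needs, and never upload sensitive fields by default. A hypothetical Python sketch (all field names are illustrative, not from any real product):

```python
# Hypothetical sketch of data minimisation: keep only the fields the feature
# needs, dropping everything else the sensors happen to produce.

REQUIRED_FIELDS = {"step_count", "timestamp"}   # what the feature needs
SENSITIVE_FIELDS = {"location", "heart_rate"}   # never uploaded by default

def minimise(reading: dict) -> dict:
    """Strip a sensor reading down to the required fields before upload."""
    return {k: v for k, v in reading.items() if k in REQUIRED_FIELDS}
```

For example, a raw reading containing step count, timestamp, location, and heart rate would be reduced to just the step count and timestamp before leaving the device, so users are not asked to give up more of their private lives than the feature requires.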

Security & Wearable || Monitoring || Health || Remote Control

Is it better to track your loved ones even though it may mean they are targeted, or exposed in terms of their security?

Openness & Responsibility || Accountability

If your product runs on or is based on open source software, consider what mechanisms you can put in place to ensure responsibility and accountability when things do not go as planned. What happens when a contributor's code generates so many bugs that your product becomes (at least temporarily) unusable? What happens if a problem occurs in your product and you cannot pinpoint how and why it occurred? How would you try to solve the problem?
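One concrete accountability mechanism is provenance tracking: record which contribution each part of the product came from, so that a failure can be traced back to a specific change and its author. A minimal Python sketch (module names, contributor handles, and commit identifiers are all hypothetical):

```python
# Hypothetical sketch: a provenance manifest mapping each module to the
# contribution it currently comes from, so failures can be traced.

provenance = {}

def register(module: str, contributor: str, commit: str) -> None:
    """Record where a module's current code came from."""
    provenance[module] = {"contributor": contributor, "commit": commit}

def trace(module: str) -> dict:
    """Given a failing module, return who last changed it and in which commit."""
    return provenance.get(module, {"contributor": "unknown", "commit": "unknown"})
```

In practice this kind of record usually already exists in version control history; the point of keeping an explicit manifest is that when a bug surfaces in production, you can answer "how and why did this happen?" without it falling to no one.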