Some Thoughts on Bias

A Little Story of Bias

A father was driving his two children to watch a football match when they were involved in a terrible accident. The driver was killed instantly, as was one of the boys. The younger child, who was sitting in the back in his car seat, survived the accident but was seriously injured.

The young child was taken to hospital and rushed into an operating theatre, where they hoped they could save his life.

The doctor entered the room, looked at the patient, froze and said, “I cannot operate on this boy, he is my son!”

Bias within Data

If you asked yourself how the boy could be the doctor’s son, you have fallen into a trap of bias. The doctor in the story is the child’s mother (obviously), but that may not be the first solution that comes to mind. In many societies we are brought up to see doctors as male and nurses as female. This has very big implications if we use computers to search for information, though: a search engine that draws on content generated by humans will reproduce the bias that unintentionally sits within that content.

The source of the bias could lie in how the system works. For example, if a company offers a face recognition service and uses photos posted on the internet (categorized in some way by Google, for example), there will be far more white men in the data than, say, women of Asian background. The results will be more accurate for the category with the largest presence in the database.
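To make that mechanism concrete, here is a minimal sketch in Python (using numpy and scikit-learn). It is a toy with invented numbers, not any real face-recognition system: a classifier trained on data where one group vastly outnumbers the other turns out noticeably more accurate for the over-represented group, even on a balanced test set.

```python
# Toy sketch, not any real face-recognition pipeline: a model trained
# on imbalanced data ends up more accurate for the over-represented
# group. All numbers here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def make_group(n, centre):
    # n two-dimensional "feature vectors" scattered around a
    # group-specific centre
    return rng.normal(loc=centre, scale=1.2, size=(n, 2))

# Imbalanced training set: 2000 samples for group A, only 100 for group B.
X_train = np.vstack([make_group(2000, [0, 0]), make_group(100, [2, 2])])
y_train = np.array([0] * 2000 + [1] * 100)  # 0 = group A, 1 = group B

model = LogisticRegression().fit(X_train, y_train)

# Balanced test sets expose the gap the skewed training data created.
acc_a = model.score(make_group(1000, [0, 0]), np.zeros(1000))
acc_b = model.score(make_group(1000, [2, 2]), np.ones(1000))
print(f"accuracy for majority group A: {acc_a:.1%}")  # noticeably higher
print(f"accuracy for minority group B: {acc_b:.1%}")  # noticeably lower
```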

If a banking system takes the case of a couple who declare an income together, it will presume that the man’s income is higher than the woman’s and treat the individuals accordingly, because the historical data shows that men’s incomes are higher than women’s, and this generalization will become part of the structure.

The problem with language is also easy to see. If the assumption behind the doctor riddle above can in some way be ‘seen’ in the vast amount of text analyzed and used for an algorithm, then proposals and offers will differ according to gender.

Let’s take how we describe ourselves for a moment. A male manager will use a set of descriptive terms that differ from those used by a woman: he might be assertive, while she is more likely to be understanding and supportive. A system that unwittingly uses a dataset based upon (or even referring to) the language used in job adverts and profiles of successful candidates will replicate a gender bias, because more proposals will be sent to people who use the language that reflects the current make-up of the employment situation.

In short: more men will be using the language that the system picks up on, because more men than women in powerful positions use that type of language. The bias will be recreated and reinforced.
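A deliberately crude sketch of that feedback loop might make it clearer. Everything below is invented for illustration (real systems learn such associations implicitly from data, not from hand-written word lists), but the dynamic is the same: profiles written in the historically dominant style score higher.

```python
# Deliberately crude, hypothetical sketch of the feedback loop
# described above. The word list and profiles are invented; real
# systems learn such associations implicitly rather than from
# hand-written lists.

# Language scraped from profiles of past (mostly male) successful hires:
past_hire_terms = {"assertive", "driven", "competitive", "decisive",
                   "leader", "ambitious"}

def match_score(profile: str) -> float:
    """Fraction of a profile's words that also appear in past hires'
    language: a crude proxy for how a matching system might rank people."""
    words = set(profile.lower().split())
    return len(words & past_hire_terms) / len(words)

profile_1 = "assertive decisive leader driven by competitive goals"
profile_2 = "understanding supportive collaborative and empathetic leader"

print(match_score(profile_1))  # high score: mirrors the historical language
print(match_score(profile_2))  # low score: a different style is penalised
```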

In 2018 the State of New York proposed a law related to accountability within algorithms (take a look at this short description), and in 2020 the European Commission released its white paper Artificial Intelligence – A European approach to excellence and trust. The argument might be more important than it first appears.

There is plenty of literature on this problem if you are interested; a quick online search will offer you plenty of food for thought.

Problems Counting People?

Implementing COVID Regulations

Here in the Netherlands the university system has just reopened after a short lockdown (again). There are still restrictions on how many people are allowed into rooms, however: a maximum of 75 in any single space. This ruling was introduced last year, and it led to some developments that might be of interest to technology fans (and privacy fans, I should add).

Counting people manually as they enter and leave a room is a time-consuming and expensive approach, so two universities had the idea of using cameras and artificial intelligence to check how many people are in buildings and individual spaces.
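The counting task itself is, in principle, trivial. Here is a minimal hypothetical sketch of the privacy-friendly core of the problem (pure entry/exit counting against the 75-person limit); it is not how either university's actual system works.

```python
# Minimal, hypothetical sketch of the privacy-friendly core of the
# task: pure entry/exit counting against the 75-person limit.
# This is NOT how either university's actual camera system works.

MAX_OCCUPANCY = 75  # the limit mentioned above

class RoomCounter:
    """Tracks how many people are in a single space."""

    def __init__(self) -> None:
        self.occupancy = 0

    def person_entered(self) -> None:
        self.occupancy += 1

    def person_left(self) -> None:
        # Never go below zero, e.g. if a sensor misses an entry.
        self.occupancy = max(0, self.occupancy - 1)

    def over_limit(self) -> bool:
        return self.occupancy > MAX_OCCUPANCY

# Example: a lecture hall filling up past the limit.
counter = RoomCounter()
for _ in range(80):
    counter.person_entered()
print(counter.occupancy, counter.over_limit())  # 80 True
```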

Utrecht University ran a trial, while Leiden placed 371 cameras on the walls above the doors to each space.

The Leiden approach, however, caused a bit of a stink. The cameras were all placed and set up while the students were locked out of the building: the ideal time, we might say, to have people up ladders in front of doors. But such an approach can also be seen as trying to do something without too many people noticing.

And that is how some of the students saw the cameras’ arrival, and a couple of them started to investigate for an article in the weekly university student magazine.

Counting entries and exits: well, nobody could be against that! The university has to do it by law. So discussion grew around the methods, the cameras and the data.

The university had bought 371 cameras from the Swiss manufacturer Xovis, at 600 euros apiece. So the question is: what can (and do) they register?

According to the company’s specifications, the system is capable of:

Counting students

Following their individual routes

Calculating an individual’s height

Estimating age

Suggesting mood (whether an individual is happy or angry)

Determining who is a staff member

Counting numbers in groups

Detecting incorrect facemask use.

Now, these types of cameras are already in use in airports and shopping centres, to minimize waits (among other things), to calibrate advertising, and to work out the actual moment that someone chooses to buy something. So such data does offer broad analytical possibilities.

The slogan used by the manufacturer maybe lets the cat out of the bag a bit: ‘Way more than people counting.’

The cameras can of course be set to different levels of data collection and privacy, from level 3, fully anonymous (just numbers of people), down to level 0, which is a live feed of the images.
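As an illustration of what such a setting might control, here is a hypothetical sketch. Only level 3 and level 0 are described here, so the intermediate levels are left unspecified, and this is certainly not the vendor's actual API.

```python
# Hypothetical illustration of the privacy levels described above.
# Only level 3 (anonymous counts) and level 0 (live video) are
# described in the article; the intermediate levels are deliberately
# left unspecified. This is NOT the vendor's actual API.
from enum import IntEnum

class PrivacyLevel(IntEnum):
    FULL_VIDEO = 0   # a live feed of the images
    COUNTS_ONLY = 3  # fully anonymous: just numbers of people

def data_retained(level: PrivacyLevel) -> str:
    """What the system hands over at a given privacy setting."""
    if level is PrivacyLevel.COUNTS_ONLY:
        return "aggregate head counts only"
    return "raw video frames (a live feed)"

print(data_retained(PrivacyLevel.COUNTS_ONLY))  # aggregate head counts only
```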

Some Questions

Now, I am no expert, but one problem seems to me to be that the system records lots of data, which at some point someone filters before providing the dataset to the customer. Who does this, when, under which circumstances, and who manages security of access? There are a lot of issues here. But they are not all negative. Such a system may be of use in a terrorist incident, for example, or other sorts of emergency. You can see why something more expansive might be chosen over a system that just counts movement. But there is a moral as well as a practical dilemma in choosing such an overkill solution to a simple problem.

The report the student investigators published in the weekly university magazine showed lots of security issues, and there were protests from the students who wanted the system taken down. Both Utrecht and Leiden have now stopped using the cameras.

But that is not a good result from a responsible innovation perspective. Lots of money was wasted, many people got upset, and two sides of an argument were constructed that are at loggerheads with each other.

A change in public participation techniques might have avoided all of this. A lesson to be learned, I feel: informing without debate doesn’t work.

You can read the student report here and a local newspaper report here. Both are in Dutch, though, so you might have to use some translation software.

OECD Conference on Technology in and for Society

In this post I would like to offer some take-aways and personal thoughts on the recent OECD Conference on Technology in and for Society, held on the 6th and 7th of December 2021.

Innovating Well for Inclusive Transitions

The conference rationale was Innovating Well for Inclusive Transitions, based upon the argument that the world faces unprecedented challenges in health, food, climate change and biodiversity, solutions for which will require system transition or transformation. The technologies involved may provoke fear of negative consequences and problems with public acceptance, as well as raise real issues of social justice (primarily of equal access; think of today’s COVID vaccination inequalities as an obvious starting point).

Good governance and ethics will therefore be necessary to harness technology for the common good.

Towards a framework for the responsible development of emerging technologies

The following is taken from the rationale page of the conference website:

The conference will explore values, design principles, and mechanisms that operate upstream and at different stages of the innovation value chain. Certain policy design principles are increasingly gaining traction in responsible innovation policies, and provide an organising structure for the panels in the conference:  

Inclusivity, diversity and stakeholder engagement

Stakeholder and broader public engagement can be means to align science and technology with societal values, goals and needs. This includes the involvement of stakeholders, citizens, and actors typically excluded from the innovation process (e.g. small firms, remote regions, certain social groups, e.g. minorities etc.). The private sector too has a critical role to play in governance. 

Goal orientation

Policy can play a role in better aligning research, commercialisation and societal needs. This implies investing in public and private sector research and development (R&D) and promoting “mission-oriented” technological transformations that better connect innovation impacts to public policy needs. At the same time, such innovation and industrial policies need to be transparent, open and well-designed so they foster deliberation, produce value for money, and do not distort competition.

Anticipatory governance

From an innovation perspective, governance approaches that engage at a late stage of the innovation process can be inflexible, inadequate and even stifling. More anticipatory kinds of governance — like new technology assessment methods, foresight strategies and ethics-by-design – can enhance the capacity to govern well.

The conference included round-table and panel events alongside institutional presentations, introductions and scene setting, as well as wrap-ups. Video of each event is available via the conference website, supported by an introductory paragraph and a series of questions.

One of the roundtables I attended may be of particular interest to Technology Bloggers readers as it was all about carbon neutrality:

Realising Net Carbon Neutrality: The Role of Carbon Management Technologies

Description

Reaching net carbon neutrality is one of the central global challenges we face, and technological development will play a key role. A carbon transition will necessitate policies that promote sustainable management of the carbon stored in biomass, but not exclusively so: technology is increasingly making it possible to recycle industrial sources of carbon, thus making them renewable. The idea of “carbon management” may capture the different facets of the answer: reduce the demand for carbon; reuse and recycle the carbon in the bio- and technosphere; and remove carbon from the atmosphere. But a reliance on technologies for carbon capture and usage (CCU) and carbon capture and storage (CCS) may present barriers for other more radical transformations.

● What knowledge is necessary to better guide national and international policy communities as they manage emerging technology portfolios for carbon management?

● What can more holistic approaches to carbon management offer for developing technology pathways to net carbon neutrality?

● What policies could ensure that one technology is not a barrier for implementation of another?

I took a lot of notes, including the following points:

What kind of technology and knowledge is necessary when steering the development of emerging technology?

There are both opportunities and challenges for finding the right mix between technology and policy

Carbon capture alone will not be viable; we have to reduce emissions

The energy transition will have to be dramatic but there is no international agreement on the phasing out of carbon fuels

There is an immediate need for investment, social acceptance and political will

Use technology that is available today rather than using language about innovation

Policy-makers have to see the whole picture; just cutting carbon from some of the big emitters will not be enough

Real structural change is necessary

The old economic sectors and the poor should not be those who pay

Success requires not only information, but communication

The truth about both economic and social costs should be available

Why not watch the video here? It’s just over an hour long.