Transparent algorithms won't solve the 'AI bias problem'; we need to understand how data is used

What we need is to send the algorithms to the data: make them open, decentralise them and democratise them.

We have begun collecting all human behaviour and putting the machine in between human beings and every transaction in life. In doing that, we are increasingly operating our society on a new basis. The omnipresent smartphones and digital businesses around us have caused a permanent shift in our societies. We are now operating with a new form of social science that we don't understand yet, but which we are already using in much the same way that physics was used to change European society in the 17th and 18th centuries. Those societies produced an industrial revolution that people didn't really understand yet, but they had a new science and they used it.

We are now using the phrases "Artificial Intelligence" and "Machine Learning" to mean pattern matching and pattern finding in immense quantities of human behaviour data. We are using this to control the market by changing advertising into a system that anticipates wants and intervenes in the near term to amplify some wants and reduce others.

Recall all those ads that target you after you have discussed something on Facebook. The consequence is that our understanding of democratic politics is no longer accurate, and things are changing in an out-of-control way. People who were told that they were taking back control by having these wonderful devices in their hands are beginning to feel that they are not in control but are being controlled.


Everybody wants to understand what this new technological existence means. Either from an exploitative point of view: how do we use this social science for our profit?

Or from a rights-based point of view: how do we protect people from this new social science?

Or from a political point of view: how do we perfect social control, either in a large-scale form like the Chinese Communist Party or a small-scale form like Google and Facebook, which have managed to revolutionise the advertising system to their advantage in a very short time?

Everyone is busy trying to wrap their heads around these changes.

We as human beings are always trying to understand the sources of power in the physical environment, in order to understand the shape of power in the social environment, in order to understand what law should do or how various entities should operate. In that process, we are discovering that inequality of knowledge is the source of power in the market, in the government and in politics. We are generating a new form of energy, based not on combustion but on knowledge: information about people. What do we know about people? We are really talking about the vast accumulation of unequal knowledge. This knowledge is power.

Currently, this knowledge is concentrated in the hands of a few large players; concentration, not obscurity, is our problem.

Algorithmic transparency is a phrase that we now use to express the desire to reduce the inequality between the human being and the machine. If only we knew what the machine was doing, we would be freer of bad consequences than we are now. For those of us who came from the free software movement, who have spent a generation believing that political liberty hinged on technological freedom, on people having control over their devices, this seems like a totally natural conclusion.

Unfortunately, it’s completely false.


In order to understand why it's completely false, we need to understand what we mean by algorithmic transparency. Apparently, what we mean is: show us the algorithm and we will know something. But for the systems that determine who should go to jail and for how long, who should get what parole, how to predict who is driving drunk so they can be stopped, or how to figure out which Hindus, Muslims or Uighurs hold which religious opinions, knowing the rules of the analysis behind a particular ad targeted at us or a video shown to us will not restore human control or freedom.

This is because the algorithm isn't what really matters. The algorithm is usually very simple. What we all should be demanding is, "Show us the data and then we will know what is really going on".

Nobody really talks about this in these terms, because capitalism and despotism and all other forms of power in the human race are being restructured in terms of who controls the data. Those in whose hands all this power is concentrated want to say: we could tell you about the software, but we can't tell you about the data, because it's the new oil and we don't give it away.

This is why what we need is self-explainable AI, notice that AI is being used, and an opportunity to be heard whenever AI is part of a decision, whether humans are in the loop or not. What we need is to send the algorithms to the data; make them open, decentralise and democratise them; hold the decision-makers accountable; and start experimenting with new business models, while looking at a due process larger than the one that the GDPR offers.
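The phrase "send the algorithms to the data" can be made concrete with a small sketch (entirely illustrative; none of these names come from the article). Instead of shipping everyone's raw records to a central party, each data holder runs an open, auditable algorithm on its own premises and shares only the aggregate result:

```python
# Illustrative sketch of "sending the algorithm to the data":
# the open algorithm travels to each data holder, runs on-site,
# and only aggregate results leave; raw records never do.
from statistics import mean

def average_age(records):
    # An open algorithm: anyone can audit exactly what it computes.
    return mean(r["age"] for r in records)

def run_locally(records, analysis):
    # The data holder executes the algorithm where the data lives.
    return analysis(records)

# Two data holders, each keeping its raw records to itself.
holder_a = [{"age": 34}, {"age": 29}]
holder_b = [{"age": 40}, {"age": 50}, {"age": 48}]

# Only the aggregates are shared with the outside world.
shared = [run_locally(h, average_age) for h in (holder_a, holder_b)]
print(shared)  # [31.5, 46]
```

Because the algorithm is open, anyone can inspect what it computes, while the unequal accumulation of raw data that the article describes never happens in the first place.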

Or else, be prepared to be lost in another seismic shift even before we have emerged from the whirlwind of fancy words.

The author is a technology lawyer and managing partner at Mishi Choudhary & Associates and founder of SFLC.in
