Sans Tabou is a 6-episode debate series created by Usbek & Rica and Microsoft France: an honest dialogue, guaranteed free of bullshit, social washing and quotes from General de Gaulle, to reflect seriously on the major challenges facing our societies and help bring about a more desirable future.
The dialogue always unfolds around Carlo Purassanta, the president of Microsoft France, and a guest invited to challenge him. In this second episode, the science fiction and fantasy author Catherine Dufour discusses the possible excesses of technology, and the measures to be taken now to guarantee a freer future.
Between the predictive capacities of certain calculations, facial recognition, artificial intelligence, spyware robots and the development of surveillance tools, it is time to ask the question...
Either we think about it after the fact, by forbidding ourselves certain things, or we think about it upstream, by trying to give digital technology an ethics, like Asimov's three laws of robotics: a robot may not harm a human being; it must obey humans, unless that violates the first law; and it must protect its own existence, unless that violates the first or second law. We have always tried to put a human ethic into the machines we produce. It is our great fear of the Frankenstein monster: the thing we build that escapes us.
But too often, digital technology is made the scapegoat for all evils. The scoring software that banks use to grant or deny credit doesn't like non-white people, women (especially divorced women), young people, the poor... Yet it only acts in terms of statistics. It is therefore racist, classist and ageist without even being an artificial intelligence. Digital technology often holds up a mirror image of ourselves that we deeply dislike. It's not its fault, it's ours! We are the ones who should be questioned.
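A minimal sketch of the mechanism described here, with purely hypothetical data and an invented approval threshold (not the actual banking software): a scoring rule that merely follows historical statistics, with no "intelligence" and no intent, still reproduces the bias baked into its data.

```python
# Hypothetical historical credit records: (group, repaid).
# Group "B" was historically denied opportunities, so its recorded
# repayment rate is lower -- the data itself carries the bias.
history = [("A", True)] * 90 + [("A", False)] * 10 + \
          [("B", True)] * 60 + [("B", False)] * 40

def repayment_rate(group):
    """Observed repayment frequency for one group."""
    outcomes = [repaid for g, repaid in history if g == group]
    return sum(outcomes) / len(outcomes)

def approve(group, threshold=0.75):
    # The rule is pure statistics: approve if the group's historical
    # repayment rate clears the threshold. No AI involved -- yet it
    # systematically denies credit to everyone in group B.
    return repayment_rate(group) >= threshold

print(approve("A"))  # True  (rate 0.90)
print(approve("B"))  # False (rate 0.60)
```

The software is only the mirror: the discriminatory outcome comes from the history it was fed, which is exactly the speaker's point.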
It is possible and necessary to reconcile digital technology and ethics. Artificial intelligence accelerates and strengthens the power of digital technology, but it is also a factor of production and a vector of economic value creation. It makes it possible to create or optimize products and services, and to increase the well-being of populations around the world. Technology is also essential for the fundamental research tackling the planet's greatest challenges: health, hunger, education. It really does help to create a better world.
But there can be excesses, because technology is like the women and men who create and manage it. It is therefore essential to do more to educate people about the power of digital technology, its construction and its use. We are not doing enough to teach children and the general public how to use it wisely. Businesses can think ethically and push themselves to do better, but the debate needs to come from society as a whole, so that the rules of the game are set at the same speed as technology evolves.
Should we conclude that we are not ready for such powerful tools, and that perhaps we should give them up?
If you put humanity in a room with a red button marked "above all, do not press", it won't work! I don't believe at all that we can refrain from continuing to seek, find and develop new technologies. We must manage to channel, to domesticate, our power of creation.
Companies can adopt principles and list things they will not do, but it is not easy. At Microsoft, we try to set rules: just because a business model is possible does not mean it is desirable. So there are things we don't do. We also helped create the "Partnership on AI", a collaborative initiative of the major producers of technology and artificial intelligence, to think about common rules, because if companies, governments and civil society define these principles together, technology can build algorithms that self-regulate.
Won't these technologies, and artificial intelligence in particular, reinforce inequalities and destroy a lot of jobs?
All digital technology encourages offshoring and social dumping. It is the announced destruction of a great many trades, a revolution from below. Before, it was the blue-collar professions; now it is the white-collar ones' turn: accounting, medicine, law...
It's funny to see how AI is brought into highly intellectual professions like surgery. A surgeon will read 10,000 or 100,000 X-rays over a career to find a fracture or a tumor. An artificial intelligence reads 5 million in less than a second. For the moment, it is still the surgeons who operate, but they do not like being told what to do. So the AI was injected with a dose of simulated humility so that the surgeons would accept its help. It's strange that we have to inject a simulacrum of humanity into the digital world to make it acceptable. But in the future, surgeons stand a good chance of being replaced by AIs capable of both diagnosing and operating. At that point, there will be even more need to simulate humanity.
Previous industrial revolutions have shown that each revolutionary factor of production creates more value and jobs than it destroys. The basic rule is that artificial intelligence should not replace human work but make it more relevant, more precise. You could say that an agricultural machine is bad because manual labor keeps people fit, but you have to look at the big picture. Are there fewer farmers than 200 years ago? Yes. Do farmers have higher value-added work and more responsibility? Yes, because they will have to feed the 9 billion human beings of 2050.
We must always ask ourselves the question of the result. The surgeon's profession will change. But don't we ultimately want the operation to be more successful? The important thing is to make the correct diagnosis everywhere in the world, not just in the three countries that have the best surgeons. Well used, artificial intelligence and digital technology are a way of distributing this value more equitably among 9 billion people.
There are currently 1 billion people starving, and half of humanity does not have access to clean water or private toilets. A lot of people have never so much as glimpsed a processor. Digital concerns are luxury concerns.
These are not luxury concerns! The goal is precisely to solve the problems of hunger and access to drinking water. NGOs working in these areas need technology. Digital technology is not only for privileged countries; it is a tool for bringing well-being to every country on the planet.
Can there be a digital ethic without public regulation, asks François, who is attending this debate?
Self-regulation is possible but not sufficient. Private companies will not move fast enough, and their regulation will not be homogeneous and consistent across all business models. The private/public debate is necessary, and state regulation is very important because it makes us think about the common good. The problem is that technological opportunity moves faster than regulation, which takes time to analyze and solve a problem. We cannot let ten years go by with potentially unethical models left unregulated. We must shorten the gap between innovation and regulation. And during this window, we must give citizens an additional critical capacity, so that they are able to understand these models and choose whether or not to adhere to them.
Bertrand asks: how can we force web companies to regulate themselves, as was done with the automobile, alcohol and tobacco industries?
Regulation comes from states; it is not the job of private entities. The question is how to enforce it. Anicet Le Pors [politician and senior French official, editor's note] had a simple answer: "Those who do not respect the law must be hit in the wallet." It's basic, but it's all we have found that works.
But saying "we need regulations" is not enough; they must be well made and well thought out. I strongly believe in collaboration and joint reflection between companies, public and private entities, think tanks, actors like the CNIL... It must be a collective reflection, initiated by a common need and backed by coercive means, which fortunately remain a state power.
Another question from Bertrand: how can we educate Internet users about reducing their digital consumption, so as to shrink the traces they leave?
It is very difficult, especially during a pandemic when everyone is in front of a screen. I think it's a good time to ask ourselves: can we live differently? Haven't we gone too far? There is a lot of research on Homo numericus and our techno-cocoon. But to achieve digital degrowth, we first need a slowdown in the growth of our digital dependence...
The power of technology must be used to solve global problems. But if it creates dependencies or dangers, it must be regulated. The basis must be individual responsibility and an understanding of the uses of technology. It is a question of education. I fundamentally believe that we must equip citizens, from an early age, to understand technology and develop their critical thinking. It is unbelievable that in 2021, in the midst of a digital industrial revolution, digital technology is barely part of education!
The digital hegemony of the GAFA is worsening with the health crisis. The European economy contracted by 7.8%, while Amazon's stock market value rose 72%. Is this cyclical, or is it the death of the old economy?
It is historic and catastrophic! For the first time, companies are so rich that they can subjugate state powers. Power that should be democratic is thus handed over to private interests: that is the very definition of dictatorship. What's more, this money is earned thanks to our personal data, this gold that we hand over to companies that have made mountains of profit and are not at all ready to give it back in the form of common goods: hospitals, roads, schools, nurseries... All this money, which belongs to you and to me, should serve the public interest, and it ends up serving private interests.
This is a fundamental distinction: there are factors of production, like the cloud, which improve a company's work, and there are businesses based on monetizing access to data. These are two fundamentally different ways of using technology.
On August 11, 2020, the London Court of Appeal ruled that the use of facial recognition by the Welsh police was illegal. Is this a major victory for our civil liberties, or a momentary slowdown of the surveillance society?
The boundary between private and public, between the need for security and the need to protect people from overly intrusive technology, is a fundamental subject, and facial recognition is part of it. Microsoft has called for regulation because it is necessary. In the meantime, we have clear principles: this technology must be auditable by a third party so that its operation can be understood. In a personal application, it must be transparent and require prior consent. If it is used in a public place, the public must be alerted to its use. But here too, the time between innovation and regulation is too long.
It is the end of individual freedom. Fortunately, humanity often manages to secrete antibodies. The mask helps to evade facial recognition, and there are forms of makeup that confuse it. But I trust the technology to overcome those antibodies. It will be unbearable to live in this panoptic society, where you are seen all the time without ever being able to see who is watching you.
Science fiction always ends up coming true... so read science fiction, because it is an attempt to anticipate, to put into words what awaits us tomorrow or the day after. But science fiction itself struggles to keep pace with today's science. So educate yourself, and don't hesitate to complain, protest and get involved when you feel your digital, physical and mental integrity is threatened. Scientific and technical progress is so rapid that it is difficult to anticipate the danger, but we must not let it go: the future is now, so above all, let's not let go!
I am optimistic; I think technology can create value. But we cannot justify every business model with the technology-value equation. You have to spend time acquiring a critical sense of technology: looking at the companies that provide it, understanding their business model and the mindset with which they produce their products and services. Adults must do this, but children must also begin to acquire this essential critical skill.