
Do Artificial Intelligence and Philosophy Have Anything in Common?


Photo by Markus Winkler on Pexels.com

We have all heard about the “new normal”.

The “new normal” has many interpretations and is talked about everywhere; it seems to be the new mantra. But do we really know what it means for us, beyond the question of health?

Is it now the “new normal” that we will be more socially (physically) apart than before?

Is it the “new normal” to forget office life and embrace the new home-office way of working?

Is it the “new normal” to attend online classes and forget the good old school days of meeting your friends and waiting for break time to conspire against the world (or just against the teachers) together?

Is it the “new normal” to transform our consumption habits from “going out shopping” to “staying home shopping”? I still remember the legendary phrase of the infamous Carrie Bradshaw in Sex and the City: “Shopping is my cardio!” Well, not anymore, dear. Nowadays you don’t even need to get out of bed: a couple of taps in an app and your clothes are delivered to your door.

If you are in the business world, you might agree with me on this: not a single day goes by without hearing the word “digitalization” or, even worse, “digital transformation”.

It is the topic of the century… not a single strategy meeting or corporate strategy goal fails to include these buzzwords.

Klaus Schwab, Executive Chairman of the World Economic Forum (WEF), seems very enthusiastic about the new digital era in his recent book “COVID-19: The Great Reset”. In fact, he predicts that “consumers may prefer automated services to face to face interactions” and adds that the “recession induced by the pandemic will trigger a sharp increase in labour-substitution, meaning that physical labour will be replaced by robots and intelligent machines, which will provoke lasting changes in the labour market”.

Before the pandemic, we all understood that the digital era was here to stay, that artificial intelligence was advancing by giant strides in many industries, and that it would indeed represent a big shift in our current way of living.

What is missing for me is the central question, the elephant in the room: are we ready for a transformation like this? Are we ready to introduce the so-called Internet of Things at such an unprecedentedly rapid pace and large scale? How far along are we, in our local and global policies, in defining the impact of AI on our human affairs? What will change in our human rights? Do we really want this, or is this change being imposed on us?

Is there a human rights law that protects you against being dismissed from your job in exchange for a robot? Do we need one? Is there a policy that regulates human-made data? To what extent can we claim property rights over our own information, and where is the limit when an algorithm picks it up and mixes it with millions of other data points to form a new pattern? Who owns the data?

If I am killed by my own self-driving car, can my family press charges against the car? Against the company that sold it?

And what does philosophy have to do with all of this?

Most people seem to perceive AI and “high tech” as a technical, science-based, mysterious geek subject, whereas philosophy sounds more like a subjective, poetic, people-of-words kind of subject. But if we look a bit more deeply, one cannot exist without the other.

I think philosophy indeed lies at the heart of artificial intelligence. By its common definition, philosophy is the fundamental study of human nature, and more specifically of the nature of knowledge and existence.

AI is here to stay, so what will it mean for our existence? If the life we used to know is fully transformed by a new digital world, how will this affect our consciousness, our epistemology?

Robots will increasingly perform the classic human tasks, and not only the tedious assembly jobs that are already fully automated in most industries. In the not-so-distant future we will see robot nurses, robot doctors (which already exist in some hospitals in China), self-driving cars, intelligent home devices that talk to you and do the work for you, and robots managing global supply chains. Even the arts cannot escape: algorithms learn so fast that they are now able to compose beautiful songs, drawing on millions of data patterns to create new bars of music.

So where do we fit in all of this?

Isaac Asimov, the famous author of “I, Robot”, described three fundamental rules that have become central to today’s artificial intelligence era:

1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The third rule bothers me a little. How can we program a robot to protect itself without causing any harm at all? How will a robot develop the kind of intuitive thinking we acquired over millions of years of evolution, in order to face the moral dilemma of deciding what is right and what is wrong?

Even we, after millions of years of evolution, are far from getting this right. As Asimov rightly said: “The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.”

I believe philosophy plays a more crucial role than ever before. We are living in unprecedented times, and we are urged to think harder and more consciously about where we are collectively going.

These are not times to be egotistical and worry only about our own good. These are times when we need to redefine our human philosophy. If a digital era is coming and changing our world (a change, by the way, that we ourselves triggered), what does it mean now to be human? What are the new roles in our society? Where do we stand, locally and globally?

How can we regulate this? Where are the values and morals that have defined us for thousands of years: the importance of the family as a nuclear entity, the importance of community, and the fundamental human need to transcend, to have a mission in life, to have a purpose?

“It is the obvious which is so difficult to see most of the time. People say ‘It’s as plain as the nose on your face.’ But how much of the nose on your face can you see, unless someone holds a mirror up to you?” (Isaac Asimov, I, Robot)

In the end, we are the only ones responsible for defining our own identity, aren’t we?
