Fear of AI is pretty irrational, in my opinion. You ask what becomes of humans when AIs can out-think us? I don't think it will ever get to that point, because I see humans uploading their consciousness into a digital world before we create sapient machines. First-generation mind-machine interfaces are already being prototyped, and once those mature, it's not that big a leap from interfacing with a machine to uploading your consciousness into one.
I also think people fear advancing technology simply because they fear change. Being able to upload our consciousness into a digital world would fundamentally change what it means to be human, and I think people don't want to give up their bodies yet. Of course, I don't understand why anyone wouldn't want to give up a body that ages and dies for a life of immortality. I think a big part of it is that the vast majority of people still believe our physical form defines who we are, instead of seeing our bodies for what they really are: mere vessels that carry our true selves around. What all this advancing technology will eventually allow us to do is upgrade our hardware, so to speak.
Assuming that all technology is good and necessary and that older ways are inferior and stupid is madness, in my opinion. Ultimately, after survival, basically everything we do is aimed at happiness (as long as it doesn't harm anyone) or the avoidance of unhappiness. I play video games because I enjoy them; I go to college for a better chance at my "dream" job, because that would make me happy. A very charitable person might volunteer most of their spare time because they get a kick out of helping people. Survival of self and species, along with happiness, are the goals, but that's easy to forget.
I'm reminded of the parable of the fisherman and the CEO. The fisherman lives near the beach, works a few hours a day, and spends the rest of his time relaxing. A wealthy CEO on vacation meets him and learns about the fisherman's life. The CEO tells the fisherman that he should work sunup to sundown, use the extra money to buy a bigger boat and more nets, and keep investing until he has millions. The fisherman asks why. "So you can live near the beach, work a few hours a day, and spend the rest of your time relaxing," the CEO says. He's completely missing the point.
And could a machine ever really "enjoy" anything? Moreover, if we upload our consciousnesses, might we just be committing suicide and creating a precise digital copy of ourselves? That copy could be no more "us" than a twin is, and we'd have no way of knowing.
To me, "life," or rather, existence, as a machine or program or packet of electronic data is completely devoid of meaning. What's the point? Why do anything? You don't need to eat or sleep, your "loved ones" are little more than glorified computer files, and you're missing out on a lot of physical sensations. Our brains have evolved to be parts of people and to grow up in a society of people, not machines. I, for one, am quite happy to be a human. I embrace it, and I like my body. It's an integral part of me. It is me. I love going for horseback rides and float trips, feeling the wind and sun, feeling the recoil and shock of a gunblast, and so on. I like that I'm fundamentally an animal which can communicate with and relate to other animals. These things are "practical" for me because they help me accomplish my goal of happiness.
Few things bother me more than the suggestion that all change is progress and improvement. Technology won't make us any happier. I'd argue that people had everything they needed to be truly happy thousands, if not hundreds of thousands, of years ago. You might counter that they also led shorter, unhealthier, more painful lives. That's a valid point, but I'd counter that if, hypothetically, there had been no disease, needless fighting, or oppression back then, life would have been pretty good. That's because humans, like other animals, can be pretty happy as long as certain requirements are met. Cats are happy with fresh water, fresh meat, and something to chase. Horses are happy with plenty of fodder, a place to run, and other horses to live with. People are happy if they get the basics and enough exercise, feel like valuable and accepted members of a community, love and are loved, have avenues of self-expression, and have something to wonder at and something to live for.
These broad categories have never changed. They may mean different things to different people, but they never go obsolete. People don't suddenly lose the need for social acceptance just because some new technology comes out, nor were all people everywhere miserable until the TV was invented. If that were the case, then we'd get happier with each passing year and each purchase of some new technology, but we don't, not necessarily. People are still nasty to each other, they can still feel lonely and without purpose, they still commit suicide, riot, complain, and suffer from depression. There are alternative means of being happy. The Amish seem no less happy than the rest of us, despite their simpler lifestyle. They don't need much technology to be happy. Strong social bonds and a sense of purpose and belonging are good enough for them. A lot of us lack one or both, and those are holes that no inventions can fill.
It seems to me that when our society changes, it tends to throw the baby out with the bathwater. Everything, good or bad, is replaced. I'd rather we focus more on fixing problems and removing the negatives and less on adding new abilities. Life with immersive virtual reality and omnipresent Wi-Fi might be nice, but life without infectious or genetic diseases would be better. And the more society changes, the more people get left behind. The elderly used to be valuable and respected members of society with plenty of good advice, but now society changes so rapidly that any advice they might have is probably totally outdated. They're seen as out-of-touch relics with absurd, quaint views, and it doesn't help that we shove them into retirement homes to go away and die rather than keeping them around to help raise the grandchildren. The younger generations mock them, seemingly unaware that they too will probably be ridiculed, ignored, and abandoned as everything they know and love disappears.
Anyway, I worry that people are powerfully attracted to new technologies without carefully considering whether they're ultimately beneficial in achieving the goals of survival and happiness. If "you" "lived" as a machine, would you really enjoy it? Could you enjoy it? Could you feel anything? Why would you "live"? What's the point? I live because it's the default choice, I see little need to change it, and even if I did, I'd have a hard time doing so. "Transcending," on the other hand, is a conscious decision, unlike being born. Why make that decision? I see little need for it. It'd probably just end up being the rich who transcend and live forever while everyone else continues to live and die in a world where people have been made obsolete. Making people obsolete at everything completely defeats the purpose of progress, but people will call it progress anyway and mock anyone who prefers human labor and human life. Eventually most people become machines, and those who don't are killed off since they can't compete. The human race goes extinct, having collectively committed suicide in an ill-conceived plan to turn into machines. Only machines and post-humans remain, existing for no reason.
It absolutely horrifies me. I lose sleep fearing that the whole human race will short-sightedly commit suicide in a desperate, blind, unreasoning quest for an artificial existence of artificial pleasure. Transhumanism/posthumanism/transcendence is one of my biggest fears by far, and I worry about it all the time. I mean it. It sneers at thousands of generations of humans and regards them as inferior; it sees them as unclean and primitive and tries to eliminate and replace them with "superior" models, without ever really stopping to think about what standards it's using or to what end it's aiming. People are already proudly predicting that there will be no need for human contact anymore, as though that's in any way a good thing, and that we'll all just lead "lives" of "endless pleasure." Think on this: if we could, should the whole human race just hook itself up to pods that put us in a dreamlike state and pump us full of drugs that artificially stimulate the parts of our brain controlling happiness, automatically making us "happy" no matter what? Is that really living? Is it really happiness? What's the point? How is transcendence any different? How is it any different from the human race doing nothing but masturbating endlessly?
To be fair, the idea of immortality is intriguing. But what would you do with a few dozen billion people who won't die? I'd like it best if there were some limited number of people who would get reincarnated. The population of the world would be big enough to be interesting and full of diversity, but small enough not to overtax resources or destroy the environment. We'd get to enjoy human life, we'd get to be kids again, and death wouldn't be a permanent end to everything. Too bad this is an impossible dream.
But why bother? People will just call me a stupid, primitive, old-fashioned, sentimental Philistine who's standing in the way of the Unstoppable March of Progress. Everything I love--nature, living with nature, some forms of tradition, people being valuable and accepted members of society--is being destroyed. There's no future. But what do you care? I'm just some stupid ape.