civvver
Deity
Joined: Apr 24, 2007
Messages: 5,855
https://www.ft.com/content/50bb4830-6a4c-11e6-ae5b-a7cc5dd5a28c
I was listening to NPR Tuesday; I'm not sure who the guest was, but they were talking about what's now called the surveillance economy, where companies like Facebook entice you into staying on as long as possible so they can gather and sell more data about you. A caller asked how free will relates to all this, and it made for a good discussion.
I googled it and found this article. It's not by the guest, but it's a good read.
Do you think we are headed for some dystopian future where big data predicts all the best outcomes and our will is taken away?
Think about all the decisions big data is already influencing. We have dating sites like Match and eHarmony that pair you up based on data. You still get the final choice, but the influence there is strong. Medical decisions are already heavily shaped by data, and genetic mapping is going to add even more weight to this. Oh, you're 90% predicted to develop a type of cancer based on this gene, so start this preventative treatment now. It's coming.
Then there's the stuff people consider harmless, like book and show recommendations and, at least until the Russian hacker Facebook stuff, news feeds.
If you've ever seen the movie Minority Report, you know how scary this could become. In that movie, instead of using big data to predict behavior, the government has found three siblings with the gift of precognition. They can see the future. They dream of murders yet to happen, and the police go arrest those people for crimes they haven't committed yet, but will.
I could absolutely see a government doing this with big data. Oh, based on your biology and genetics and upbringing and personality tests, you have a 98% chance of becoming a murderer, so we're going to lock you away. What's the sacrifice of a 2% chance of error vs the safety of the public? Many people will give up free will for that sense of security. And then whoever controls these algorithms will be in complete control of everything else.
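One wrinkle worth noting: even a "98% accurate" predictor falls apart when the thing it predicts is rare. Here's a quick base-rate sketch with made-up numbers (assuming, say, 1 in 10,000 people ever commits murder, and a predictor that's 98% accurate in both directions):

```python
# Base-rate check on a hypothetical "98% accurate" murder predictor.
# All numbers are illustrative assumptions, not real statistics.
base_rate = 1 / 10_000      # assumed P(person becomes a murderer)
sensitivity = 0.98          # P(flagged | murderer)
specificity = 0.98          # P(not flagged | not a murderer)

# Bayes' rule: what fraction of flagged people are actual future murderers?
p_flagged = sensitivity * base_rate + (1 - specificity) * (1 - base_rate)
p_murderer_given_flagged = sensitivity * base_rate / p_flagged

print(f"P(actual murderer | flagged) = {p_murderer_given_flagged:.3%}")
```

Under those assumptions, fewer than 1 in 200 people the system flags would ever actually commit murder, so the "2% error" framing wildly understates how many innocent people get locked up.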
In a sense we already do this, or want to, to a much lesser degree. We have background checks for buying guns, and we want to expand them further to include mental health checks. We decide, based on statistical evidence, that certain people are more likely to commit crimes with guns, and so we take away their freedom to choose to own a gun. Most people are fine with this, as am I, because they either don't see the necessity of owning a gun, or they weigh public safety much higher than an individual's right to own a gun, or they simply don't see it as a right at all. It seems easy to agree with in this context, but that scope could be taken so much further.
So what do you guys think? Is this along the lines of worrying about an AI-driven robot takeover? Because I do not believe that will ever happen. But I could very much see a society in which big data drives everything and choice and freedom are completely lost.
Or would that actually be a utopia? If all of your heart's desires are predicted and met ahead of time, and you're given everything you want for optimum happiness before or at the moment you know you want it, would that be amazing? The advances in genetics alone could save tons of lives. And maybe we wouldn't have to lock away the future criminals in a traditional jail; maybe they'd just have AI surveillance put on them so they can't do anything naughty. Or maybe they do go to jail, but jail is awesome, like a spa vacation. Because if everything is optimized, we could afford it. I think the big data proponents envision this type of future without considering how just a few people would control everything.