
[RD] The God Machine

Discussion in 'Off-Topic' started by yung.carl.jung, Aug 2, 2020.


Do you think the God Machine is a good thing overall?

  1. Yes: 0 vote(s) (0.0%)
  2. No: 10 vote(s) (100.0%)
  3. Neither, it is neutral: 0 vote(s) (0.0%)
  1. amadeus

    amadeus As seen on OT

    I have to doubt the premise; it's just who I am! :lol: (I'm sure I'm inviting some criticism as to whatever logical fallacy I'm committing here.)

    You define violent crime as though it were clear and unambiguous. I don't think it can be seen that way unless you take a kind of moral absolutist position and then build the system around that.

    Here’s a direct case: would a doctor be able to perform an abortion? Someone who is against legal abortion would consider it murder. Someone against restricting abortion would say it isn’t. What does the computer decide?

    Then you have indirect cases. The Ford Pinto had a known design flaw that resulted in more fatal crashes. In short, Ford did some statistical analysis and decided it was more economical not to fix the problem. The fix cost $11 per vehicle, an amount most people would individually call trivial in the purchase of a car.

    Would the computer intervene in the Ford Pinto's design flaw? What if instead of $11, it was $110? Or $1,100? Or $11,000? (That would triple the price of a car in the 1970s.)

    It kind of goes back to the question brought before the Supreme Court: what is pornography? The famous response was "I know it when I see it." I kind of take that view.
     
  2. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    I actually made that same criticism, but I didn't want to spoil it; I wanted people to play along :D

    The God Machine intervenes when someone is about to commit a violent or highly immoral act that would otherwise break the law. Seeing as the God Machine acts independently of states and such, I'd think it would simply do whatever it deems correct, or act according to the human rights charter. In the author's view it would likely allow the abortion, but that is just my speculation. A proper self-learning AI might actually not understand why anyone would want an abortion at all, nor why anyone would want to procreate, since it does not experience embodiment.

    No, the GM would likely do nothing. It would also do nothing about systemic issues, or psychological violence, or anything that isn't obviously black or white.

    This is essentially what I think in this regard: by forbidding ALL illegal violent acts, the GM makes systemic and subtle violence THE ONLY form of violence allowed, which inherently boosts all those people who are adept at using these forms of violence as means of control. We would see a sharp surge in this type of violence, yet no one could intervene, because words only do so much and violent revolution is forever off the table. The most powerful people would then be those who can exert power, control and violence in a more subtle way, i.e. politicians, elites, capitalists and sociopaths. It's not a violence-free world; it's a world sanitized of physical violence but seething with other forms of it. Certainly the GM would not help make people more moral, it would only make them LESS IMPULSIVE.
     
  3. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    Yes, exactly! If you follow the authors' idea, they think this conundrum solves itself. Personally I am not sure it does. Basically the argument we are currently having is about whether the illusion of free will is qualitatively the same as free will, which is a really difficult debate.

    Almost. The AI simply has all kinds of human data to learn from, but it learns in its own ways; it is not programmed or made by us, only informed by us.

    Not in this case, no, since it was not programmed. See my reply to Berzerker. The values of the AI are the ones it itself hand-picked from all of the human data that was available. I think we are even supposed to believe that the learning algorithm is itself not programmed by humans, hence the entire concept of GNMs.

    What you get in return, I suppose, is the confirmation that you will never, ever, physically harm anybody, irrespective of your state of mind. What you give up, however, is very big, imho.

    I genuinely do not understand a single thing you're saying. And no, Roko's Basilisk is different in many ways. This example is actually not at all about AI, but about morality. I will reveal soon what (I think) the point is/was.
     
  4. amadeus

    amadeus As seen on OT

    Wouldn't that itself adjust over time as behaviors change? All unambiguous violence is replaced with ostensibly nonviolent subversion, but the device itself makes decisions based on human behavior. Since the behavior changes, the function of the device should too.
     
  5. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    Thank you for all the answers so far, much appreciated.

    I'm not sure if McDonald's is indirect murder, seeing as people kind of wilfully kill themselves. I'd say that ads specifically targeting kids with Happy Meals and the US's dietary policy come close to grand-scale murder, though. No idea why nutrition isn't taught in schools; they should've done this like 50 years ago. And no, no food pyramid ****.

    But the GM likely does not care about any of that, since it is neither illegal nor physical violence.

    The GM is not intended to control the lives of people and make them healthy, only to stop violence and extremely immoral acts. People can still wilfully drink or eat themselves to death, smoke, and do just about anything besides murder or the like.

    That is a negative definition of freedom, which is definitely valid, but it's also not the only one. You can view freedom as the absence of constraint, or as the presence of some quality, or as a purely relational quality. In general, though, I definitely agree with your argument; in this respect the GM does make some people free. Good point!

    The GM does not intervene in people's sex lives; it would only stop violent rape (actually, only nonconsensual rape).

    Again, none of these concern physical violence or illegal activities, so the GM would not intervene at all. The GM certainly would not practice eugenics, and I don't understand what the rape of anyone's ancestors has to do with this?

    Some here have tried to view the GM as an "all knowing AI trying to lead humanity", but its only purpose is to stop violent crime or illegal acts that are highly immoral. I also do not understand why the GM would ever stop our reproduction, how did you arrive there?

    Of course you can think about it, you can fantasize about all kinds of violence, you just change your mind shortly before actually committing it. One could still write books or make movies with violence in it.

    The GM would also not be interested in rewriting any holy texts, just because they're violent, because reading or writing a violent text is not in itself violence. Its job is not to make humans a more peaceful people, nor to change our culture to a nonviolent one, but very simply to stop violence shortly before it happens.
     
  6. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    I think you're right, and this is where the example gets paradoxical:

    The machine is supposedly absolutely autonomous and self-learning, yet it is completely restricted by one rule imbued by humans: that it shall intervene only when actual violence or immoral illegal acts occur, and under no other circumstances.

    If you take the example as given, then what you say is entirely unproblematic: the GM simply does not evolve further, it does its job until all prisons are abolished and there is no violence anymore, and from that point on the GM might as well be shut off.

    If you try to apply some degree of logic or autonomy to the example, then of course the AI would change as it gets more data, and of course, if it were really autonomous, it would not even be bound to human rules in the first place, would it? But at that point you're probably taking it too far. The GM serves a specific purpose as an example and is supposed to be taken at face value, and I think the idea is that the GM actually rarely does anything; people have simply abandoned violence since (an optimistic presumption, I know).
     
  7. Gorbles

    Gorbles Load Balanced

    Yeah, this is how far my brain got into it before realising I needed to work today, and I gave up for the time being :p

    A tl;dr of my current thoughts: if we cannot know either way whether our decision was modified, we either have to accept the premise or reject it outright; otherwise we'd end up in a permanent state of "decision paralysis", so to speak. That would attach a low-level anxiety cost to things we, individually, perceive as pivotal purely on moral grounds. Not rape or murder, obviously, but the entire danger is the notion of human thought going "what if". What if the AI elected to modify lesser decisions? We wouldn't know. What if the AI grew beyond the original human programming constraints? We wouldn't know.

    I don't think it's something that solves itself, though I can see the logic that leads the authors to claim as much. That solution speaks to a world where people are happy and content with a basic state of affairs: not questioning, just accepting. People aren't like that! There will always be people, however ostracised, who will push boundaries, ask questions, and have the doubts that nobody else has. I don't know if I'm that kind of person, but I strongly believe those types of people can and should exist. I favour science fiction that isn't necessarily dystopian, but where the simulation models require a baseline of conflict for humanity to treat it as a believable world (Agents of Shield, in my possibly banal opinion, did quite a good job of this during its Season 5 run).
     
  8. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    really cool, an argument I actually haven't seen yet. I buy that.

    fully agree. I also do not think this is incidental in any way. media currently is a lot about performative rebellion or total subjugation, but both of these types of media are actually conformist; genuinely non-conformist media, be it music or TV or anything else, essentially only exists at the fringes (mostly of the internet, because actual, physical countercultural communities are pretty dead (better: fractured) compared to the 60s/70s/80s/90s). the biggest and most commendable counterculture we have is probably coming out of lgbtq+ activism currently. I feel like we could really use another Lou Reed or David Bowie right now :D
     
  9. TheMeInTeam

    TheMeInTeam Top Logic

    As written, no. There are implied dangers within that text, and they are pretty nasty.

    Having the state ultimately decide on genetic engineering should it become mainstream is a terrifying thought, and this is effectively a computer (programmed by human beings) doing the same thing.
     
  10. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    There is no state in this example which controls the GM, no computer (a self-learning AI is not a computer in any way, and the AI in the example is semi-biological), definitely not a computer programmed by humans, and the GM does not in any way do genetic engineering or anything of the like, as stated multiple times now. Clearly you did not even read the text in the OP :(
     
  11. Valka D'Ur

    Valka D'Ur Hosting Iron Pen in A&E Retired Moderator

    So... suicide by sleeping pills is still okay? (according to the machine; I'm not endorsing this). Romeo could still take poison and Juliet could still stab herself?

    :dubious:

    Rape is, by definition, non-consensual. It doesn't have to be violent. It just needs to be non-consensual.

    Nice to know the machine would stop it, though. It'd be nice if the machine would change the minds of people who think sex/"marriage" with underage girls is okay.

    So I'm guessing this would stop human trafficking?

    You have somewhat missed my point with this last paragraph. If your holy text says you must kill something (human or animal), and the machine forces you to change your mind about doing it, that would mean you wouldn't be following the rules of your faith (I'm talking about any religion that requires killing or sacrificing living things). Since you wouldn't be following the rules written in your holy text (whatever it might be), that holy text would need to be rewritten into some version that you could follow, without feeling frustrated or conflicted about not being able to follow particular rules or tenets.

    I'm not saying the machine would rewrite the text. I'm saying the affected humans would need to do that, for their own mental well-being. After all, what would be the point of having a text that says, "You must kill five ______ every full moon" - but you can't/don't want to, and you're not sure why the text would even say that, and by not doing it, you're breaking the rules?

    This is getting into "Captain Kirk defeats the AI by applying a feedback loop of illogic" territory, except the targets are humans.
     
  12. yung.carl.jung

    yung.carl.jung Hey Bird! I'm Morose & Lugubrious

    That is a grey area which the text does not touch. Personally I believe suicide is entirely morally permissible, but I am unsure the machine agrees. Very good input Valka, curious as to how I never thought of that! Thanks

    My point was more about "pretend rape" as role play between consenting adults vs actual rape. I agree that all rape is non-consensual sexual activity; that is also the definition I support. But the GM would not intervene in BDSM or any other kink. Some people specifically mentioned the GM policing someone's bedroom, which is why I made this distinction.

    It would stop human trafficking since it is both violent and illegal, exactly.

    I did actually completely miss the point, didn't I? :D

    But yes. If your belief or ideology is inherently violent, the machine will stop you from following it in practice. It will not change your mind about it being good or bad. People can still technically think the murder of brown people is justified, or that the stoning of homosexuals is god's will. They just cannot murder or stone anymore. Accordingly, maybe some holy texts would be rewritten to work in the age of the GM. Your logic is not far off at all; it is sound and easy to follow.
     
