Why assume what Christianity teaches?
I don't make any assumptions here; I am going by what I was taught by Catholic nuns and others.
If, as you put it, "The Bible is the Word of God," is that because Christianity said so, or because God said so?
I was taught that the Bible is God's communiqué to us. Without God's will for us to hear his word, the Bible wouldn't exist.
If hell is the demand, then we still have the choice to pay that demand or accept God's love.
Yes, God expects us to weigh this and to make a decision. If God didn't expect us to make this decision, he wouldn't have presented it to us.
I mean, there are a lot more expectations than that. Jesus himself said: "You therefore must be perfect, as your heavenly Father is perfect."
He also said things like "Love the Lord your God with all your heart and with all your soul and with all your mind and with all your strength" and "As obedient children, do not conform to the evil desires you had when you lived in ignorance. But just as he who called you is holy, so be holy in all you do."
These are all expectations; I don't know how you can say that they are not.
This just isn't true. Even if we accept that Jesus is divine, there's vastly more evidence that the Bible is not the Word of God than evidence that it is. If we believe in the Adversary, then I can definitively state that the Bible is probably the worst case of libel against God in the world.
We can snip out the teachings of Jesus if we want. He teaches a rather different story than what we see in the rest of the Bible.
I mean, of course the Bible is not actually the word of God. But that is what Christianity teaches: that God has guided the creation of this book using the Holy Spirit, and has allowed it, via his many all-powerful means, to be curated in a specific way and to survive up until this point.