Work is really good here, so is health and wellness. Everyone's experience is different.
I get a lot of time off (6 weeks of vacation, 10 holidays, 7 unscheduled days), plus other benefits. Pretty much all preventive-care medicine (including dental) is free of charge, and I can easily see any of my doctors at pretty much any time. When I need to see a specialist, I can usually get in within the same week. I'm paid well on salary and feel quite secure with my employer, who looks out not just for my wellbeing at work but for my personal life too. I work maybe 25-30 hours a week on average, and I get to work from home.
I was miserable living in Canada, and my family and friends still there complain about things all the time. My sister works a minimum wage job while raising three children. She and her husband (military) live in a rented townhouse that's falling apart (for example, they can't even open the garage). Most of my cousins are trapped in retail hell and gripe constantly about how they can never get in to see a doctor. Still, they're all convinced I made a huge mistake moving to the US and believe without a doubt that life here sucks. I just don't understand it.
I also don't get why foreigners like to rip on the US so much. They seem to believe their own countries don't have problems too, or maybe it's some kind of way to make themselves feel better?
One of the best decisions of my life was getting out of Canada and coming here. I'd never go back, and I'd never want to live anywhere else.