I have a 4K TV, though I'd never had a TV before that, just a 24" 1920x1200 monitor (which I still use on my computer). I use the TV to watch Blu-rays, DVDs, a bit of streaming video, and, a handful of times per year, actual over-the-air TV. It's a 48" Samsung, and I typically sit 7-8 feet away on a sofa. I've also played Rocket League on it a couple of times, but I've never been handy with a controller, so 99% of the time I play on my 24" Dell monitor with a mouse and keyboard.
IMO the 4K aspect isn't a big deal; I got a 4K set because by autumn 2016, most of the better TVs were 4K anyway. The two big reasons it doesn't matter much are the viewing distance that Ryika touched on, and the fact that there isn't much 4K content, most of which costs more than the 1080p version. I think the only video content I've watched in 4K is the film Interstellar, via Amazon Prime. From my couch, I couldn't see any increase in detail versus watching Spectre on Blu-ray or Game of Thrones in 1080p via HBO Now. It may have been better, but the difference was marginal. By contrast, if I watch a DVD, I do notice the drop in quality, at least for the first 5 or 10 minutes; by then I'm into the film and have adjusted enough that I don't think about it anymore.
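For anyone who wants to sanity-check the viewing-distance point, here's a rough back-of-the-envelope sketch in Python. It assumes a 48" 16:9 panel, about 7.5 feet of viewing distance, and the common rule of thumb that 20/20 vision resolves roughly 1 arcminute; those are just my setup plus a textbook figure, not numbers from a spec sheet.

```python
import math

# Rough check: how large does a single pixel look on a 48" 16:9 panel
# from ~7.5 feet away, compared with the ~1 arcminute resolving limit
# usually quoted for 20/20 vision? (All assumptions, not measurements.)

DIAGONAL_IN = 48.0
VIEWING_DISTANCE_IN = 7.5 * 12   # 7.5 feet in inches
ACUITY_ARCMIN = 1.0              # rough 20/20 acuity limit

width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)   # panel width, ~41.8"

for label, horizontal_px in [("1080p", 1920), ("4K", 3840)]:
    pixel_pitch_in = width_in / horizontal_px
    angle_arcmin = math.degrees(math.atan2(pixel_pitch_in, VIEWING_DISTANCE_IN)) * 60
    position = "above" if angle_arcmin > ACUITY_ARCMIN else "below"
    print(f"{label}: pixel subtends ~{angle_arcmin:.2f} arcmin ({position} the ~1 arcmin limit)")
```

Both resolutions come out under that rough 1-arcminute threshold at my distance (about 0.8 arcmin for 1080p and 0.4 arcmin for 4K), which lines up with not being able to pick out extra detail from the couch.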
For live sports, the signal is often still in 720p. When watching American football, there are times I wish there was a bit more detail or clarity. But that happens whether I'm watching on my 4K TV, a friend's 4K TV, or my parents' 1080i Sony Bravia from 2007. The signal is the limiting factor in all three cases.
The other difference I notice when comparing my 4K Samsung to my parents' 1080i Sony is that their Sony has much better sound quality, thanks to its greater thickness and its integrated, front-facing speakers that you can actually see. I wound up buying a soundbar for my Samsung; it made the difference between needing closed captions on most of the time because the audio was muddled, and having dialogue clear enough to understand almost everything, not to mention better sound quality in general.