You know, I have no idea why films still shoot at such low frame rates. Many times when watching action movies I can't follow everything because of the motion blur or camera shake. Higher frame rates make the action easier to follow. All the Hobbit films were shot and shown at 48 fps. When you first see them it's weird: everything looks super real and crisp instead of having that soft, movie-like look. But after a few minutes you adjust, and like I said, it's a lot better for action scenes.
But if you want to know why, that's probably it: games are going for lifelike realism with fantasy settings, while films are still trying to put you in a specific place and might not want it to look too real. There's also screen size and cost. If you want 60 or 120 fps in a game, you buy expensive hardware; for movies, the theaters and production companies didn't want to spend more for higher frame rates.
I don't get the obsession with trying to exceed 60 Hz, though. Maybe I'd have to see 120 Hz side by side with 60 in real life to compare, but 60 fps with vsync is super smooth as is. Even something like MSAA (multisample anti-aliasing) vs FXAA (fast approximate anti-aliasing) is extremely hard to tell apart while actually playing a game at full speed. I can usually only notice it in side-by-side screenshots, and MSAA requires something like four times the processing compared to FXAA.
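To put rough numbers on this, here's a quick back-of-the-envelope sketch. The "four times" figure assumes 4x MSAA at 1080p, counting coverage samples per pixel; real GPU cost varies a lot with hardware and scene:

```python
# Rough frame-time and sample-count arithmetic for the rates discussed above.

def frame_ms(fps: float) -> float:
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

# Film vs. game rates: 24 -> 48 fps halves per-frame time (~41.7 ms -> ~20.8 ms),
# while 60 -> 120 fps only shaves ~8 ms, which is why the gain feels smaller.
for fps in (24, 48, 60, 120):
    print(f"{fps:>3} fps -> {frame_ms(fps):5.2f} ms per frame")

# Anti-aliasing cost, very roughly: FXAA is a single post-process pass over
# 1 sample per pixel, while 4x MSAA rasterizes 4 coverage samples per pixel.
width, height = 1920, 1080           # assuming a 1080p frame
fxaa_samples = width * height * 1    # one sample per pixel
msaa4_samples = width * height * 4   # four samples per pixel
print(msaa4_samples // fxaa_samples)  # -> 4, the "four times" from above
```

This is just arithmetic, not a benchmark; in practice MSAA's shading cost is often below 4x because samples share fragment shading, but bandwidth and memory scale close to it.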