The Very-Many-Questions-Not-Worth-Their-Own-Thread Thread ΛΕ

^I was just going to ask whether they were the same people who pepper you with political opinions and ‘facts’.
I didn't much like the LotR films, with the storyline butchered by Peter Jackson, but the visuals were simply gorgeous.

It's not the storyline changes they complained about. It's the overuse of CGI (they claim over 90% of The Lord of the Rings is CGI), the length, and the fact that everything everyone complained about in The Hobbit films exists in The Lord of the Rings films too, yet for some reason no one complained about it at the time. In their view, The Lord of the Rings would've been better as a single, hour-and-a-half film. The problem with it, as with many modern films, is supposedly that the director has too much control; the producer should have absolute control over the making of a film, and films should be made as compressed and short as possible. They do allow some exceptions, namely Stanley Kubrick, Denis Villeneuve and Quentin Tarantino.
 
Did I miss the bit where this mysterious "they" was ever identified?
 
Okay, I looked this up just now, because I was curious.
https://en.wikipedia.org/wiki/Flicker_fusion_threshold#Display_frame_rate
tl;dr: The number of images per second you can perceive as distinct (the often-claimed 24) is not the same thing as whether you can see flickering in a movie.
Longer version: It seems most people can identify 10-12 pictures/second as single images (so, like a slide show), and at 24 it sort of becomes continuous. Above that, how much flickering between the images you can perceive depends on your individual eyesight; apparently for most people the threshold is 50-60 Hz. To get around that, films are apparently shown with every image flashed 3 times in a row, to reach an effective 72 flashes per second.
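
To put rough numbers on that triple-flash trick (a quick sketch; the 60 Hz threshold is just the typical figure from the article, not a hard limit):

```python
# Projector triple-flash arithmetic, as a quick sanity check.
# 60 Hz is a typical flicker-fusion figure, not a universal constant.

frame_rate = 24        # distinct images per second on the film
flashes_per_frame = 3  # a three-blade shutter shows each frame 3 times
fusion_threshold = 60  # Hz; above this most people stop seeing flicker

flash_rate = frame_rate * flashes_per_frame  # 72 Hz
print(f"Effective flash rate: {flash_rate} Hz")
print(f"Flicker-free for most viewers: {flash_rate > fusion_threshold}")
```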

I also used to think that really high frame rates don't make sense, but apparently our current standard doesn't make much sense either. It also seems a higher frame rate would actually make movies look better.

So their initial claim about the 24 images is technically correct, but it doesn't take into account that you can perceive more than you can consciously process.

Now it also makes sense why sitting in front of an old CRT at 40 Hz gives you figurative eye cancer (try it, it's horrible; 80 Hz was okay for me).
 
The first films were done at 12, IIRC, that being the bare minimum required to create a sense of movement (see for example ye olde pre-2016 10-15 fps animated gif). Cinematography being expensive at the time, using only 12 frames per second saved a lot of money on the reel negatives or whatever ancient 20th-century tech it was. Later it was upgraded to 24 because 12 was deemed not good enough. It has stayed at 24 out of tradition, or possibly just to keep older-but-still-good filming equipment in use after we moved to digital.
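
The cost angle is easy to illustrate (assuming standard 4-perf 35mm at 16 frames per foot; the price per foot here is a made-up placeholder):

```python
# Back-of-the-envelope film stock usage per minute of footage.
# 4-perf 35mm runs 16 frames per foot; the price is purely illustrative.

FRAMES_PER_FOOT = 16
PRICE_PER_FOOT = 0.50  # hypothetical dollars per foot of negative

def feet_per_minute(fps: int) -> float:
    return fps * 60 / FRAMES_PER_FOOT

for fps in (12, 24, 48):
    feet = feet_per_minute(fps)
    print(f"{fps:2d} fps: {feet:5.0f} ft/min, ~${feet * PRICE_PER_FOOT:.2f} of stock per minute")
```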

It's also stuck around because it's a lot easier to hide imperfections in a scene with 24 fps motion blur. You don't have to fix your screw-ups if the viewers can't spot them!
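
For anyone curious where that blur comes from: it's set by each frame's exposure time, which for the classic 180-degree rotary shutter works out like this (a sketch of the standard formula, nothing production-specific):

```python
# Motion blur per frame is set by the shutter's open time.
# With the classic 180-degree shutter, 24 fps exposes each frame for 1/48 s.

def exposure_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Seconds of exposure per frame for a rotary shutter."""
    return (shutter_angle / 360.0) / fps

for fps in (24, 48, 60):
    t = exposure_time(fps)
    print(f"{fps} fps @ 180 degrees: 1/{round(1 / t)} s per frame")
```

Longer exposure per frame means more smear, which is exactly what hides the seams.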
 
One could have expected the LotR films to be long and indulgent, because that's basically the books in question, but not The Hobbit, as that's a slim 300-page novel for children, rather than a sprawling 1100-page trilogy.
 
Attempts have been made to show films at a higher frame rate (notably The Hobbit, at 48 fps), and the reaction was largely negative - motion sickness was a common complaint.

Traditionally, technical and economic limitations were the main factor - projectors can only catch, display, and advance film so quickly, and shooting at a higher frame rate means more film stock, which means higher costs to produce a film, more storage demands for theaters, and more frequent reel changes. With the advent of digital film these limitations were largely removed. Films generally aren’t shot at 24 fps anymore; 30 is the industry standard today, as far as I know.
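
On the storage point, a rough upper-bound calculation (assuming uncompressed 4K DCI frames; real digital cinema packages are JPEG 2000 compressed, so actual sizes are much smaller):

```python
# Uncompressed storage per minute at various frame rates; an upper bound,
# since real digital cinema packages are compressed.

WIDTH, HEIGHT = 4096, 2160  # 4K DCI resolution
BYTES_PER_PIXEL = 3         # assuming 8 bits per RGB channel

def gb_per_minute(fps: int) -> float:
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * fps * 60 / 1e9

for fps in (24, 30, 48, 60):
    print(f"{fps} fps: ~{gb_per_minute(fps):.0f} GB/min uncompressed")
```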

Now it’s more about standards perpetuating themselves - directors and cinematographers are trained to think in terms of 30 fps, editors learn to edit at 30, and filmgoers are used to seeing films exhibited at 30. Changing these standards would be difficult on the production side and would be met with negative reactions from general audiences.

CGI is also something to think about in this day and age. More frames means more work for animators and editors, which would balloon production budgets.
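
That scaling is easy to sketch; hours-per-frame here is a pure placeholder, since real render times vary wildly by shot:

```python
# Render workload grows roughly linearly with frame count, all else equal.

HOURS_PER_FRAME = 2.0  # hypothetical machine-hours per finished CGI frame
SHOT_SECONDS = 10

for fps in (24, 48, 60):
    frames = fps * SHOT_SECONDS
    print(f"{fps} fps: {frames} frames, ~{frames * HOURS_PER_FRAME:.0f} "
          f"machine-hours for a {SHOT_SECONDS} s shot")
```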
 
I think I've mentioned to them before that animation tends to use 12 frames per second because fewer frames means less to draw. They keep saying that any frame rate above 30 has no benefits at all, and that anyone who claims to see benefits or have better reactions at a higher frame rate is delusional, since it's supposedly physically impossible to see beyond 24 frames per second.
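
The usual term for that is animating "on twos": each drawing is held for two projected frames, which maps out like this (a toy sketch, nothing more):

```python
# Animating "on twos": 24 fps playback reuses each drawing for 2 frames,
# so only 12 drawings are needed per second.

PLAYBACK_FPS = 24
HOLD = 2  # playback frames each drawing is held for

def drawing_for(frame: int) -> int:
    return frame // HOLD

print([drawing_for(f) for f in range(PLAYBACK_FPS)])
# [0, 0, 1, 1, 2, 2, ..., 11, 11] -> 12 drawings cover one second
```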

I was told that early films were made at a higher frame rate than 24 frames per second before the introduction of sound.

I'm told that the frame rate is still 24 frames per second. They also keep saying that because films are at 24 frames per second, games should be at 24 frames per second too, since that's good enough, and that anyone who complains is clearly an ungrateful idiot who is ruining the games industry.
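
Games aren't just played back, though; each frame is also the time budget for polling input and simulating the world, which a quick calculation makes obvious:

```python
# A game frame isn't just displayed; input, simulation, and rendering all
# have to fit inside it, so higher frame rates also mean lower latency.

for fps in (24, 30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps:3d} fps -> {budget_ms:5.1f} ms per frame for input, "
          f"simulation, and rendering")
```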

Another thing they say is that CGI shouldn't cost anything, because it's all made on computers instead of being built for real.
 
"They" are wrong then. Whoever they are or whyever they're talking to you about this.
 
What this all boils down to is that these people are not right about anything, but their personal worldview and self-image is of people who are. It's "I'm better than you, therefore I'm right about everything." They aren't right, and they aren't better than you. They are bullying, obnoxious, overprivileged, elitist jackasses, and you would really be better off if you treated everything they say as wrong.
 
They are better than me and they've proved it constantly over the past 19 years.
 
^I was just going to ask whether they were the same people who pepper you with political opinions and ‘facts’.
I didn't much like the LotR films, with the storyline butchered by Peter Jackson, but the visuals were simply gorgeous.

Mostly unrelated, but this post reminded me that there'll be a screening of The Fellowship of the Ring with a live orchestra playing the soundtrack in São Paulo on 6th July, as well as in other cities around the world on different dates.
 
They are better than me and they've proved it constantly over the past 19 years.

Are "they" not just you though? Seems to be a rhetorical device for posting your own opinions without attracting any personal reprobation.
 
No. They've proven that all my opinions are always wrong.


And yet everything you've posted here that they've told you has turned out to be wrong. And you are smart enough to sense that something in what they're saying is off, which is why you check with another source.
 