
Mistake in the trivia section (who would've thought, eh?)


Why do they let people who don't know anything about film, movie industry standards, or how things are displayed write these 'trivia' entries on IMDb? And why is it so difficult to correct them that we're practically DOOMED to have a very, very FAULTY trivia system throughout IMDb? Even Darth Vader was once spelled 'Darto Voder' (or was it 'Vador', I can't remember)!

This mistake also mostly reveals the stupidity and ignorance of the entry writer:

"Douglas Trumbull originally wanted to film this movie in "Showscan", a 60-frame-per-second widescreen process he'd developed, but the costs of retrofitting theaters to show it proved prohibitive. If the "Showscan" version had been made, each non-"Brainstorm" frame would have been printed twice to create a 30-frame-per-second "normal" film rate to compliment the cropped, non-widescreen shots. The intent was to create an experience similar to what the onscreen characters were "viewing.""

First of all, movies are not shown 'normally' at '30 fps'.

They're shown at 24 fps in theatres. This is such basic knowledge that pretty much EVERYONE who is the least bit interested in movies should know it, let alone someone who is writing a 'trivia' entry about this kind of detail.

Second of all, just printing each frame twice would of course not create a proper 30 fps version - I mean, technically you'd get more frames, but film that has been shot at 24 frames per second will play too fast and look unnatural if it's simply run at 30 fps.

Here's what they REALLY planned:

"There would have been 70mm 60fps prints, with the 35mm stuff skip-printed to 60 from 24 fps. Also, there would have been 65mm material drop-framed to 24fps for a normal 35mm release."

Why is this kind of simple stuff too difficult to get right in a movie's official IMDb 'trivia' section? It ceases to be TRIVIA if it's a LIE, you know.

In any case, I wish frame rates had been designed to be higher from the beginning. Interlaced television systems used to update at 50/60 fields per second, but most modern video cameras top out at 25/30 full frames per second. Of course there are some cameras that can shoot at higher rates, but it's still pretty common to be limited by this artificial 24/25/30 fps system.

The human eye can detect much faster movement than 30 fps, and very fast movement looks awful when slowed down or viewed as still frames - it becomes basically just a blur, and when that blur gets compressed and you watch it at a lower resolution.. well, the result is not pretty. But for some reason, the masses do not demand higher frame rates for -everything- (thank goodness at least computer games can utilize much higher framerates than televisions and video cameras usually allow).

Still, even high-resolution, curved televisions AND higher fps would not come close to showing things as the human eye normally sees them.

A smiley-face mark for anyone who can at this point immediately realize what I am talking about.

We would of course also need a proper, or at least passable, HDR system. Why are we still limited to what paper or film can do (in most cases, even more limited than that, because you can put a really bright light behind good-quality film and it will show a higher dynamic range than a computer monitor is capable of displaying, AND a much brighter picture)?

Lights don't look like lights when you watch a movie, or sometimes even play a computer game, on a modern monitor that is very dim compared to some of the small 1980s CRT monitors and television sets. And even if a display could produce a very bright light that actually looks like light (you always have to use white, since it's the brightest color, to depict 'light', but it still only looks bright by -comparison- to the darkness of the surroundings - it's impossible to depict a bright daylight scene and have a light in it look like a light - and yet we see exactly that every day in real life, at least during the summer: car headlights, the flashing lights of service vehicles, etc. still look like bright lights even in the middle of a summer day)..

.. the whole scene would still only have 256 different brightness levels. That's an incredibly small number of brightness levels.

The 24-bit image system that we use most of the time can only display 256 different brightness levels, because each of its three channels - Red, Green and Blue - gets just 8 bits, i.e. 256 levels: 256 * 256 * 256 = 16 777 216 unique colors in total.

Each channel has only 256 different 'shades', so no matter how you 'mix' the channels, there are never more than 256 brightness steps from black to full intensity (you can see this if you convert any 24-bit image to grayscale: it becomes an 8-bit picture, and while the hues are gone, all of the brightness information still fits into just 256 shades of grey).
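To make the numbers concrete, here's a small Python sketch (the 0.299/0.587/0.114 figures are the standard Rec. 601 luma weights - an assumption on my part, other weightings exist, but anything quantised back to 8 bits ends up just as limited):

    # A 24-bit RGB pixel stores 8 bits per channel, i.e. 256 levels each:
    levels_per_channel = 2 ** 8                      # 256
    total_colours = levels_per_channel ** 3          # 16 777 216
    print(levels_per_channel, total_colours)         # 256 16777216

    def to_grey(r, g, b):
        # Rec. 601 luma weights (assumed here); the result is rounded back
        # to an 8-bit value, just like a real greyscale conversion.
        return round(0.299 * r + 0.587 * g + 0.114 * b)

    # However the three channels are mixed, the grey result is an 8-bit value,
    # so at most 256 distinct brightness levels remain:
    greys = {to_grey(v, v, v) for v in range(256)}   # the neutral grey ramp
    print(len(greys), min(greys), max(greys))        # 256 0 255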

We live in 2016, soon to be 2017, and most people don't have HDR monitors, graphics cards or cameras. Most people don't even have passably bright monitors or television sets - not even as bright as the old CRT television sets often were.

And the masses are too ignorant of technology, too stupid and easily pleased by superficial crap and marketing hype, too entertainment-focused to realize what they're missing, and to DEMAND higher fps and proper HDR!

Heck, even 16-bit HDR would be enough. But we're now using an 8-bit (2^8 = 256 levels) dynamic range, which is INCREDIBLY limited and limiting.
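As a rough back-of-the-envelope comparison (this assumes plain linear coding; real systems use gamma or PQ curves, so treat the 'stops' figure as an illustration, not a display spec), this is how the number of levels and the usable dynamic range scale with bit depth:

    import math

    # With linear coding the usable contrast ratio is roughly (levels - 1):1,
    # because the smallest non-black code value sets the darkest tone you can show.
    for bits in (8, 10, 12, 16):
        levels = 2 ** bits
        stops = math.log2(levels - 1)    # dynamic range in photographic "stops"
        print(f"{bits:>2}-bit: {levels:>6} levels, ~{stops:4.1f} stops of linear range")

Under those assumptions, 8 bits gives you roughly 8 stops to play with, while 16 bits gives roughly 16, which is much closer to the range a real outdoor scene can span.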

I could write a lot more about all this, but I think I've ranted enough for now.

It's just sad how stupid and ignorant people are on this planet .. they hijack geek culture without understanding it, and then are mesmerized by marketing and think they have some real high-tech, while actually suffering from dim screens and low fps without ever REALIZING it ... this would never happen in a world where people actually understood technology and saw the truth about things.

I think one way people are kept ignorant of how dim their monitors are is the constant, forced use of white (255,255,255) everywhere. It's the brightest color any monitor can produce, so of course it's going to APPEAR bright. But try displaying a red or deep blue color and making it look like a bright light without using white, and you'll see that the monitor isn't actually that bright at all.
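You can even put rough numbers on that. Using the Rec. 709 luminance weights (my assumption for the arithmetic - actual panel primaries differ somewhat), a full-intensity red pixel carries only about a fifth of the luminance of a white pixel, and pure blue far less:

    # Relative luminance of the full-intensity primaries vs. white,
    # using the Rec. 709 weights (assumed; real panels vary a bit).
    WEIGHTS = {"red": 0.2126, "green": 0.7152, "blue": 0.0722}

    white = sum(WEIGHTS.values())    # the weights are defined so that white = 1.0
    for name, weight in WEIGHTS.items():
        print(f"pure {name:<5}: about {weight / white:.0%} of white's luminance")
    # pure red  : about 21% of white's luminance
    # pure green: about 72% of white's luminance
    # pure blue : about 7% of white's luminance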

But because monitor pixels consist of light, white really does look bright. And people still follow paper conventions - when they create pictures, they put them on a WHITE background. That creates an enormous contrast, because nothing on the screen is as bright as that white, so even the brightest object in the picture is going to look dim and be harder to look at when it's surrounded by white. It's like surrounding a painting with bright light bulbs just to disturb the viewer. NO ONE in their right mind would do that, and yet it's absolutely the norm when you look at pictures with a search engine - most of your results (especially if you're looking for some object you can buy) are going to have a distracting WHITE background.

Yeah, these display matters on this planet are so messed up, I could rant for hours and hours and not even scratch the surface all that much.

(I do realize it seems contradictory to WANT more brightness on the one hand and then, right after that, start ranting about white being too bright as a background, but anyone with enough intelligence can see there's really no contradiction - white works on PAPER only, because paper is not bright light, so anything darker than white forms a nice picture that's easy to look at. Monitor pixels, however, ARE light, so the background of any image you want the viewer to actually see should always be black, or at least very dark; otherwise the picture is competing against the white and fighting a losing battle.)

I just want lights to look like lights, and images to be viewable properly (without having to edit them first). Is that really too much to ask? To be able to look at things on a display and SEE them properly, in as realistic a light (no pun intended) as possible?

On this planet, that IS too much to ask!

Here's something to read about HDR:

http://www.hardwaresecrets.com/brightside-high-dynamic-range-display-technology/

http://www.trustedreviews.com/opinions/hdr-tv-high-dynamic-television-explained

By the way, it's kind of odd how they DID make relatively bright 'modern' monitors (around 500 cd/m²) for a short period of time, but nowadays you'd be very lucky to find one.

reply

Blah,blahblahblahblahblahblah......

reply

thanks for the great post.

I'd ask this about higher FPS, however: I believe the new Hobbit trilogy was shot at 48 fps, and to me it looks very fake. Every hair is perfectly sharp and clear, even in the midst of battle scenes. I think (though it's hard to know for sure) that's why newer CG movies and effects look so fake to me. I think I prefer some motion blur, and I don't really judge movies by how they look when slowed down frame by frame.

I get the impression you know a lot about this subject, and I'd like to hear your thoughts on something like the high frame rates in newer movies.

reply