
How dumb does Hollywood think we are..? Really!!?


I just saw the movie... Man... WTF was that..?
Anyone who's even slightly well-versed in computers, especially programmers, knows that more than half of that graphical sh!t isn't plausible or necessary.

And what's with the realtime camera tracking..? I mean, the amount of CPU / RAM it would take ANY supercomputer to recognize even a face (and match it to a person, as paranoid as R.I.P.L.Y seems to be) would be so incredibly massive that it's simply impossible - it's just so dumb! Not to mention that after "recognizing" the face, 'she' (R.I.P.L.Y) masters lip-reading.

Also, the time it would take, if she could even recognize faces, to sort out all the people walking or driving past the cameras - it's just so damn lame! We're talking multi-core, multi-RISC processors here with terabytes of RAM, all running at insane bus speeds. And then there are the hundreds upon hundreds of cameras that must be 'out there' (since every other one of them had a three-digit number), all being scanned in realtime. OK, for a pre-designated, non-populated area (for the sake of recognition) it might work, but not with any accuracy.
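Just to put some rough numbers on it (every figure below is a guess of mine for the sake of argument, nothing from the movie):

# Back-of-envelope estimate of the compute needed for city-wide,
# realtime face recognition. Every number here is an assumption I made
# up for the sake of argument, not a measurement.

cameras = 500            # assumed number of city cameras being scanned
fps = 15                 # assumed frames per second per camera
faces_per_frame = 3      # assumed average number of faces in view
ops_per_face = 2e9       # assumed operations to detect + match one face

total_ops_per_second = cameras * fps * faces_per_frame * ops_per_face
print(f"{total_ops_per_second / 1e12:.1f} trillion operations per second")
# With these made-up numbers that's ~45 trillion ops/sec, sustained
# around the clock - supercomputer territory in 2008, not a basement rack.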
Today a computer may recognize a face, but only from very close up, so forget the old analogue crappy-ass cameras with poor resolution (which, btw, looked very sharp in the movie - a flaw!).

Controlling traffic lights - Iiiiiii don't think so. They should be on a separate grid. Please feel free to correct me on this - I have no idea how it really works - but it seems highly unlikely...

Oh, there's so damn much that went so wrong with this movie.
I wish I could have been a consultant on it; most of this wouldn't have happened!

Naa, off to bed here.. C ya...

"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

By that logic, there shouldn't be any movie with flaws or fictional concepts. Should moviemakers wait another hundred years, until we can actually do this stuff, just to make it believable? It's a story, so just go with it.

Welcome to the war

reply

A little bit of fantasy is OK, but personally I'm fed up with Hollywood taking those of us who happen to know something about this area for complete and utter idiots.

Every fcking time someone, say a superhacker, is about to do something insanely huge, we (the audience) see graphics. That's the first flaw right there. A hacker doesn't give $2.50 and a cheese sandwich about graphics; it's the code that is 'da sh!t'. Also, I don't think any real hacker (or cracker, if the intent is to destroy - a distinction very commonly missed, especially when it comes to movies!!!) would even use a GUI (that means Graphical User Interface, like Windows, where everything is essentially ready - menus and such, all set to go for launching programs etc.), but rather the coldness of some kewl Linux distribution's black background, showing nothing but a blinking DOS-like prompt.

This can actually be seen quite often in some movies, but then they shift to some ridiculous graphics.

Also, one of the classic 'errors' or flaws they make, very often:
When they're about to do something - someone says something, or it's the result of some event (naturally!) - they start typing the crap out of the keyboard, and we (the audience) get to see a mouse pointer moving towards an OK button.

Hollywood should stop treating us as morons; that's not too much to ask for, I think - do you..?
I actually feel somewhat insulted when they make these errors, simply by being lazy, underbudgeted (which is sad and very unfortunate, but can't be helped I guess), ignorant or just blissfully unaware of the facts about the tech behind it.

Naa, time 4 bed here.. C ya

"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

And what's with the realtime camera tracking..? I mean, the amount of CPU / RAM it would take ANY supercomputer to recognize even a face (and match it to a person, as paranoid as R.I.P.L.Y seems to be) would be so incredibly massive that it's simply impossible - it's just so dumb! Not to mention that after "recognizing" the face, 'she' (R.I.P.L.Y) masters lip-reading.

Jesus, come out from under your rock - even my six-year-old PC can do that. It takes no supercomputer to match a face if you know which camera to look at.
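For what it's worth, here's roughly what single-camera face detection looks like with OpenCV's stock Haar cascade (the cascade path helper and webcam index are just the library defaults; note this only finds faces - matching a face to a specific person would need a separate recognition step):

import cv2

# Minimal single-camera face detection sketch using OpenCV's bundled
# Haar cascade. Detection only: identifying WHO the face belongs to,
# as R.I.P.L.Y does, would need an extra recognition model.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # assumed: default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()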

Also, the time it would take, if she could even recognize faces, to sort out all the people walking or driving past the cameras - it's just so damn lame! We're talking multi-core, multi-RISC processors here with terabytes of RAM, all running at insane bus speeds. And then there are the hundreds upon hundreds of cameras that must be 'out there' (since every other one of them had a three-digit number), all being scanned in realtime. OK, for a pre-designated, non-populated area (for the sake of recognition) it might work, but not with any accuracy.

First of all, RIPLY IS a supercomputer, and they do have real supercomputers in Washington, though those are operated by humans rather than being fully automated. Also, it doesn't need to track hundreds of cameras; it tracked where they were going and therefore only monitored their location. When they moved on to tracking Washington to get at that chief, they even retasked a satellite. Just because you have a 100 MHz box doesn't mean there aren't any supercomputers out there.

Today a computer may recognize a face, but only from very close up, so forget the old analogue crappy-ass cameras with poor resolution (which, btw, looked very sharp in the movie - a flaw!).

Omg, you must be high on drugs or something. We even have finger-mouses now: your camera sees you, recognises you, and then the computer acts depending on your movements.
As for crappy-ass cameras, true, but it's a movie for Christ's sake - would you prefer watching some pixelated image in a movie?

Controlling traffic lights - Iiiiiii don't think so. They should be on a separate grid. Please feel free to correct me on this - I have no idea how it really works - but it seems highly unlikely...

Well, traffic lights are controlled from the main grid, and if RIPLY got access to it, then why not.

Every fcking time someone, say a superhacker, is about to do something insanely huge, we (the audience) see graphics.

Once again, it's a movie - you've got to show the viewers something. Or would you prefer a couple of minutes of a processor cooler spinning on screen?

I don't think any real hacker (or cracker, if the intent is to destroy - a distinction very commonly missed, especially when it comes to movies!!!) would even use a GUI (that means Graphical User Interface, like Windows, where everything is essentially ready - menus and such, all set to go for launching programs etc.), but rather the coldness of some kewl Linux distribution's black background, showing nothing but a blinking DOS-like prompt.

He used his friend's computer, and his friend was a gamer, as we see in the beginning of the movie. They were both good at gaming, so maybe they were both gamers. Also, no one said he used Windows; you can set up Linux to look very much like Windows too.

Also, one of the classic 'errors' or flaws they make, very often:
When they're about to do something - someone says something, or it's the result of some event (naturally!) - they start typing the crap out of the keyboard, and we (the audience) get to see a mouse pointer moving towards an OK button.

OK, you definitely watched the wrong movie, because in this movie we never see a mouse pointer moving at all, and they did use the mouse while tapping at the keyboard in some scenes.

I actually feel somewhat insulted when they make these errors, simply by being lazy, underbudgeted (which is sad and very unfortunate, but can't be helped I guess), ignorant or just blissfully unaware of the facts about the tech behind it.

If you make a movie, you've got to show something; you can't have viewers just sit there watching code and sparking processors without any explanation of what's happening.


I guess this will be a hell of a discussion, cya.

reply

Hmmm... What kind of software would you use on your six-year-old PC..?
Are you retarded!!? It can't be done! Sure, with one computer and one camera it might be able to recognize something, but not with a hundred percent accuracy, and especially not with lo-res analogue cameras!

"If you have a 100mhz box it doesnt mean there arent any supercomputers there." - What the hell are you going on about?

"Omg, you must be high on drugs or something. we even have a finger-mouses, when your camera sees you, recognises you and then depending on your movement acts in computer.
As for crappy ass cameras, true, but its a movie for crist sake, would you prefer watching some pixelated image in movie?" -Finger-mouses? (Mice is the term in plural!) WTH is that?

I've been doing electronics since I was a kid (not in a professional capacity) and computers since '92; my first PC was an i8088 with a 20 MB MFM disk, etc. I'd say I'm pretty good with hardware nowadays.

Get a grip, will ya?
"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

Well, I use standard software on a 1.4 GHz processor that I've optimized (but not overclocked), and it works well enough. Yes, it does it with one camera, and with pretty good accuracy. But we're talking about supercomputers here; they've got terabytes of working memory and who knows what their processors are capable of. It's not some souped-up home PC we're talking about.

A finger-mouse is when a camera sees your finger move and moves your mouse pointer accordingly, so you can lie on the bed and control the PC with your fingers (very handy when watching a movie and so on).

Sorry for my grammar mistakes, btw, English is not my first language.

reply

My God, this is a movie. Do you expect 100% reality in this? Do you think the producers only wanted to please hackers with this movie? Yeah, let's show them typing some code on the screen, because EVERYONE will know what it means. This is entertainment; if it bothers you that much to watch, then shut off your TV and go immerse yourself in reality. This is a movie - don't go into it expecting it to be constrained by what is real. If movies did that, what would be the point?

reply

OK, so then what's the point of spending millions of dollars on a movie, to make everything look good, if it's not realistic..?

Understand this - it's a friggin' insult to those of us who happen to know about this kind of stuff.

It's like being more or less an expert, or just an enthusiast, in, say, cars. And this particular thing they show in this movie (titled yeah-whatever) is something you've done a thousand times without really thinking about it, and in the movie they show something that you know by heart won't make the slightest bit of difference, yet they claim it will increase the car's horsepower tenfold. Wouldn't that bug you to death..?

Same thing for me. Also, most companies seem to love dumbing things down for the average Joe. Wrong words for different things, like 'CPU stands' (those little computer holders with tiny wheels). It's been ages since a CPU actually needed a stand...

Also, software-wise, the user isn't allowed to actually learn anything nowadays without taking a course or going beyond the manual. Everything (Windows 9x/NT/2k/XP/Vista) has to be big, clunky and unreadable, showing nothing, until you change it to your liking.

Most users think it's so beautiful with icons as big as a barn, and files & folders you can't distinguish between (not showing the 8.3 format - not so much the 8 nowadays, but the .3 is needed).

Naa, time to eat something as I'm also quite tired, so bye bye.


"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

"Understand this - It's a friggen insult for us who happens to know about this kind of stuff. "

That's clearly not you, Tony, as demonstrated by the point-by-point rebuttal you got, so don't worry about it.

reply

And HTF would you know..?
Dumbass...

"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

Oh, I know - I've been in it for many years.
One thing I totally agree with you on is:

"Most users think that it's so beatiful with icons as big as a barn, files & folders which you can't distinct between (Not showing 8.3- format, not so much 8 nowadays, but the.3 is needed)."

Every goddamn time I log onto a PC I have to get rid of those icons! Nobody else seems to know what I'm talking about when I ask why the hell they use the default "large icon" view. Hide extensions - pfft. The default Windows settings are pretty much wrong in every way - everything that can be ticked or unticked is set wrong, imho.

reply

Exactly the same sh!t happens to me every god damn time...
It's amazing how dumb some ppl are that they feel comfy with that theme: lard-ass icons, hidden sh!t and insane graphics settings that annoy the hell out of me.

"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

Remember, this is a film for the general public. Showing lines of code in a terminal isn't exactly exciting.

reply

I understand why they do it the way they do, but it's still insulting to watch.

It's like saying they build real, life-size trains out of wood - it just doesn't add up.

"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

Maybe you understand PCs, but you sure don't understand English. My god, your grammar is so terrible I can barely read your posts. If English isn't your native language, fine, but otherwise go back to elementary school.

reply

OK. Instead of just spewing out that garbage, correct me if I'm that bad at it!

I'm all for improving my English, and grammar.

And for the record, I'm really not that bad at it, not in the way you mean.
I actually watch all movies (when possible) with English subs, I write in English on a near-daily basis, and I've never had any complaints, at least not like you describe.

Also, not everyone's an English-teacher either.


So, if you want to tell me I'm doing something wrong, do so, but in an orderly fashion, not this crap of just dissing me and telling me how bad it is.
Maybe you need to work on that giving-feedback thing a bit!


"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

But does anyone say "We're in!"? Somebody always has to say that in a hacker movie.

reply

But does anyone say "We're in!"? Somebody always has to say that in a hacker movie.


I say that every time I bang my gf.

reply

Hehe

Top of every forum:"View: ... | ... | ... | nest" <-Choose "nest"!!

reply

I read with interest the posts of the most vocal protesters of this movie, and struggled to get past all the misspellings / bad word usage / etc. that these overly perturbed folks used; now I'm putting in my 2 cents' worth:

1. First Rule of Thumb: If it is a sequel to a popular movie, and it goes straight to the video shelves, bypassing theaters...it's gonna stink like a hockey player that hasn't showered in 4 months!

2. Second Rule of Thumb: If it exhibits unrealistic technologies but purports to be accurate, it's bulls**t!

3. Third Rule of Thumb: If the best you can do is trash a movie that hundreds of folks worked on because the computer in the story doesn't fit your knowledge of same...GET OVER IT!!! It's a work of FICTION! You want things to be realistic, rent a frigging documentary!

4. Fourth Rule of Thumb: If you wish to make comments, your comments can be better understood if you learn to write/speak/spell intelligently/correctly!

Honestly, if I had a dime for every person that uses "your"/"you're", "their"/"they're"/"there" interchangeably, as though they all mean the same thing, I'd be rich enough to have Donald Trump spit-shining my boots, Oprah vacuuming my carpets, and Warren Buffett doing my taxes.

reply

OK... First off: I'm Swedish, and English is my "second" language. I do try my very best to be good at it, but on the other hand, when you're sitting there in the middle of the night / morning, it's easy to misspell things and / or get the grammar wrong. With that in mind, I'd like to see how good you are at Swedish, if the intent was to insult my language skills.

However, I do agree about your first and second "rule of thumb". =0)

But as for the other points, the third in particular: I find it insulting to watch something I have some knowledge about and see it twisted out of context, so to speak.

It's like having knowledge about engines and hearing someone babble about carburetors doing something you know they don't, like pumping raw oil into the cylinder instead of plain gas and air. Wouldn't that tick you off, even just a little?

All I'm saying is that a good movie starts from facts and keeps building on them, always with the facts in the loop.
A movie with proper research behind it stands, I think, a better chance of getting recognition than a movie based (almost) entirely on fiction.


And as for the "Honestly, if I had a dime for every person that uses "your"/"you're", "their"/"they're"/"there" interchangeably, as though they all mean the same thing, I'd be rich enough to have Donald Trump spit-shining my boots, Oprah vacuuming my carpets, and Warren Buffett doing my taxes. "

This is one thing that I really try to avoid, using the wrong words at the right time (Hmmm..?). But I guess that crap happens, especially when one's tired in the middle of the night.

"-For crying out loud..." *Jack O'Neill - Stargate SG1*

reply

FFS folks, was the M5, created by Daystrom, onboard the Enterprise realistic in the 1960s? Was Colossus realistic when Forbin created it and placed it in a mountain surrounded by radiation? Was HAL 9000 realistic, or just plain psychotic?

And your cell phone has more computing power than all of NASA had at its disposal back then. So take the raw tech arguments to another forum; this is about a movie.

reply

Although I do laugh at some of the Hollywood "Computer Hacker" movies, I understand why they would do it.
I am sure most people would not want to watch a Unix console with someone doing nmap scans and a flurry of text flying up the screen. It's confusing and boring for the average viewer.
It's for the "wow" factor that many movies exaggerate reality.
Treat it for what it is, a movie.

reply

What the heck does Hollywood have to do with it? This film was made in Quebec, Canada!

reply

I guess YOU are wrong about the possibilities of supercomputer tracking capabilities, seeing as it's now 2009 and we've all seen, LIVE: the Gulf War, the war in Afghanistan, plus the fly on a soldier's nose thousands of miles from home. Big Brother is ...and always has been... out there watching our every move. THEY! are, in fact, even tracking your everyday usage RIGHT NOW! ...on these message boards for subversive activity ...your buying & spending habits ...your financial transactions ...etc. If YOU don't believe that, then YOU are dumber than I first thought. Your post is laughable ...YOU may know a little about personal computers, but you know nothing else regarding their "TRUE" capabilities.

Most software doesn't utilize a computer's fullest capabilities & is not written with a minimalist approach but for average users, since not everybody is interested in a minimalist interface ...even the guys who program software use friendlier interfaces for expedience. The GUI lets them do a lot less actual typing in the multitude of scripting languages ...including their predecessor, MS-DOS. Script writing is a real pain in the a$$. PLUS not many people want to type hundreds & hundreds of characters of code each & every time they use their computers.

YOU really need to do your homework before speaking. Many cities in the USA & elsewhere have video monitoring. Washington DC has had video security for years, even before webcams became the newest fad ...and satellites have had tracking abilities for just about as long ...come out from under that rock you're under. What do YOU think we're doing in outer space ..Huh!!! Looking for rocks & debris?!?! ...WRONG!!! I'd tell you how I know this, but then I'd have to kill you :)

When my son was in the military, he told me that even he used training software very similar to a game he used to play on his home computer ...He said he laughed when he saw it. He and his friend used to spend very long afternoons killing off the bad guys. No rebuttal necessary ...just admit to yourself what YOU don't know!

Don' like wat U see! Don' like wat U heard! Don' like wat happen!
Go elsewhere or Turn movie off

reply

Oh, man... Where do I even begin..?
I really do know about computers, and I know what they're capable of. The main thing holding them back is the OSes - Microsoft's for certain.
I don't doubt that a supercomputer can do facial recognition, but in this movie even a sh!tty wall-mounted camera was turned into a hi-def cam, and the graphics (showing us how it recognized the faces) were crap. Just about everything was crap, crap and more crap. I know the software works from the distances between different points (eyes, chin, cheekbones, hairline and so on), but doing it simultaneously from thousands of cameras... Naa... Don't think so.
Not to mention filtering out debris and inanimate objects, pets, cars, etc.
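Just to show what I mean by "distances between different points", here's a toy sketch (the landmark coordinates are invented; a real system would get them from a detector and also normalize for scale, rotation and pose):

import math

# Toy illustration of landmark-based face matching: measure the
# distances between a few facial points and compare against a stored
# template. The coordinates below are invented for the example.
def pairwise_distances(landmarks):
    names = sorted(landmarks)
    return [math.dist(landmarks[a], landmarks[b])
            for i, a in enumerate(names) for b in names[i + 1:]]

stored = {"left_eye": (30, 40), "right_eye": (70, 40),
          "nose": (50, 60), "chin": (50, 95)}
seen   = {"left_eye": (31, 41), "right_eye": (69, 40),
          "nose": (50, 61), "chin": (51, 94)}

# Sum of differences between the two distance sets: small = likely match.
score = sum(abs(x - y) for x, y in
            zip(pairwise_distances(stored), pairwise_distances(seen)))
print(f"difference score: {score:.2f}")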

That wouldn't be plausible, not at that magnitude. It'd need a sh!tload of CPUs, terabytes of RAM, RAM disks (SSDs too slow?). Keep in mind, it's not only facial recognition that computer did, but also lip-reading, as well as voice recognition in realtime over the telephone network. Naa! BS!
It also uses OCR in realtime, then uses that data to search through databases and returns absolutes. OCR is never 100% reliable - never has been, never will be. One little number or letter wrong and someone else might bite the dust.
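A silly little example of what one misread character does (the "database" and the plates are invented):

# How one OCR misread flips the result: a made-up license-plate lookup
# where a single character was read wrong ('O' instead of '0').
plate_owners = {
    "ABC 120": "the actual suspect",
    "ABC 12O": "some completely innocent driver",
}

true_plate = "ABC 120"
ocr_read   = "ABC 12O"   # assumed misread: zero confused with letter O

print(plate_owners[true_plate])  # -> the actual suspect
print(plate_owners[ocr_read])    # -> some completely innocent driver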

Also, not to forget one important detail: A.I. in itself uses tons of RAM, CPU and bandwidth, and A.I. is far from being in real use. Some here would say some already exist - depends on how you look at it. Routines that make different choices based on input, that kind of "A.I." can be done. But real A.I. that makes its own decisions based on a gut feeling (after weighing facts, ruling out improbabilities and inconsistencies) and makes not only a logical choice but the right choice - that is yet to be made.

I'd like to see what this kind of supercomputer would look like in real life.

Top of every forum:"View: ... | ... | ... | nest" <-Choose "nest"!!

reply

-------------------------------------------------------------------------
Oh, man... Where do I even begin..?
-------------------------------------------------------------------------

You've given me a lot of information to cover here, Tony, so I'll try to keep each area of your discussion as brief as possible...

The first use of the word "computer" itself was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued to be used in that sense until the middle of the 20th century. From the end of the 19th century onwards, though, the word began to take on its more familiar meaning, describing a machine that carries out computations. [computer, n., Oxford English Dictionary (2 ed.), Oxford University Press, 1989, http://dictionary.oed.com/, retrieved 2009-04-10]... Keep that definition of what a computer actually does in mind. Our first functioning binary computer performed a simple computation of either "On" or "Off" and was essentially, and independently, a computer in and of itself. Furthermore, the "processor" was the first functioning multi-computer, later deemed a multiprocessor (probably by IBM for marketing purposes) or, in slang, a CPU. Both of these presumptions are misnomers for what basic computer functionality actually was, and still is. The theory of electrical circuitry explains this best, so I will not go into it for the sake of brevity.

Computer: http://en.wikipedia.org/wiki/Computer

"A computer is a programmable machine that receives input, stores and manipulates data, and provides output in a useful format. Although mechanical examples of computers have existed through much of recorded human history, the first electronic computers were developed in the mid-20th century (1940–1945). These were the size of a large room, consuming as much power as several hundred modern personal computers (PCs) [1]..."

Power requirements were lessened by the 1980s PCs & Apples, and had diminished by nearly six thousand times by 2008.

"(cont'd)...Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space (http://en.wikipedia.org/wiki/Computer#cite_note-1).";

The Intel® Atom™ Processor is Intel's smallest chip, built with the world's smallest transistors¹. The processor and transistor size comparison is based on Intel® architecture products in production at the time of product disclosure (March 2008): http://www.intel.com/technology/atom/index.htm?iid=tech_as+micro_atom

So now on to the processor, for which I will retain the description and definition of the actual computer when referring to any "multiprocessing" electrical device, never the box or board containing the peripherals.

SEE Roots of the Processor: Digital Logic and the Semiconductor: http://www.pcguide.com/ref/cpu/roots.htm

SEE Multicomputers and Supercomputers which were available as early as the 1960s and program-controlled ones since 1941 (http://en.wikipedia.org/wiki/Supercomputer).

Program-controlled "computer" was invented by Konrad Zuse [http://en.wikipedia.org/wiki/Konrad_Zuse].";

"The inventor of the program-controlled computer was Konrad Zuse, who built the first working computer in 1941 and later in 1955 the first computer based on magnetic storage [11]. This is a document in German??? of which I could not translate but is listed amongst the footnotes: Spiegel: The inventor of the computer's biography was published

SEE also Konrad Zuse [http://en.wikipedia.org/wiki/Konrad_Zuse#Zuse_the_entrepreneur]

"Konrad Zuse other notable contributions are the Z11, which was sold to the optics industry and to universities, and the Z22, the first computer with a memory based on magnetic storage [http://www.epemag.com/zuse/]. By 1967, the Zuse KG had built a total of 251 computers. Due to financial problems, it was then sold to Siemens."

This brings the discussion to when and where optical computing makes its entrance and plays an actual part in computing history. Since Konrad Zuse originally sold this technology to the optics industry, I would have to think said industry already had a use for it, albeit not quite the futuristic kind portrayed in the movie; but it was being utilized as early as 1949, and then 30 years later in 1978...

"Entirely optical computers are still some time in the future," says Dr. Frazier, "but electro-optical hybrids have been possible since 1978, when it was learned that photons can respond to electrons through media such as lithium niobate (http://science.nasa.gov/headlines/y2000/ast28apr_1m.htm).

"(con't'd)... Computers using vacuum tubes as their electronic elements were in use throughout the 1950s, but by the 1960s had been largely replaced by transistor-based machines, which were smaller, faster, cheaper to produce, required less power, and were more reliable. The first transistorised computer was demonstrated at the University of Manchester in 1953 [Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, ISBN 0902505018].

In the 1970s, integrated circuit technology and the subsequent creation of microprocessors, such as the Intel 4004, further decreased size and cost and further increased speed and reliability of computers. By the late 1970s, many products such as video recorders contained dedicated computers called microcontrollers, and they started to appear as a replacement to mechanical controls in domestic appliances such as washing machines. The 1980s witnessed home computers and the now ubiquitous personal computer. With the evolution of the Internet, personal computers are becoming as common as the television and the telephone in the household

Modern smartphones are fully-programmable computers in their own right, and as of 2009 may well be the most common form of such computers in existence"

The latter portion of the above is addressed later in my post; you may have some personal knowledge of the stated capabilities if you own a cellphone.

MORE ON MULTIPROCESSING specifically in the area of Cluster computing

"What is cluster computing? In computers, clustering is the use of multiple computers, typically PCs or UNIX workstations, multiple storage devices, and redundant interconnections, to form what appears to users as a single highly available system. Cluster computing can be used for load balancing as well as for high availability. Advocates of clustering suggest that the approach can help an enterprise achieve 99.999 availability in some cases. One of the main ideas of cluster computing is that, to the outside world, the cluster appears to be a single system... (TRUNCATED FOR BREVITY) ...Clustering has been available since the 1980s when it was used in DEC's VMS systems. IBM's sysplex is a cluster approach for a mainframe system. Microsoft, Sun Microsystems, and other leading hardware and software companies offer clustering packages that are said to offer scalability as well as availability. As traffic or availability assurance increases, all or some parts of the cluster can be increased in size or number[http://searchdatacenter.techtarget.com/sDefinition/0,,sid80_gci762034,00.html].. {TRUNCATED FOR BREVITY}...

There are those confusing terms again: multiprocessors, microcontrollers, central processing units and clustering, used interchangeably, yet all essentially meaning the same thing as a multi-core processor with multiple "computers" (by the actual definition of computer) acting as the "cores" of a whole system of computers. Multi-core processing is not some NEW term; it was probably adopted by IBM to give the illusion of chips more powerful than purported. It's called advertising, in case you were wondering, because their chips merely contained more miniaturized computers and transistors.

SEE Miniaturization: http://en.wikipedia.org/wiki/Miniaturization

"Miniaturization is the creation of ever-smaller scales for mechanical, optical, and electronic products and devices. Miniaturization is a continuing trend in the production of such devices. {TRUNCATED FOR BREVITY}...

Early development: (http://en.wikipedia.org/wiki/Miniaturization#Early_development)

"The trend can be traced back to ancient times both as an abstract science and as a physical practice, beginning with the atomic theories of the nature of matter and the use of early microscopes. These first instances of miniaturization eventually led to the creation of current sciences such as nanotechnology and molecular nanotechnology."... {TRUNCATED FOR BREVITY}...

"Nanotechnology and nanoscience got started in the early 1980s with two major developments; the birth of cluster science and the invention of the scanning tunneling microscope (STM). This development led to the discovery of fullerenes in 1985 and carbon nanotubes a few years later (http://en.wikipedia.org/wiki/Nanotechnology#Origins)."; {TRUNCATED FOR BREVITY}...

"Molecular nanotechnology (MNT) is the concept of engineering functional mechanical systems at the molecular scale.[1] An equivalent definition would be "machines at the molecular scale designed and built atom-by-atom. {TRUNCATED FOR BREVITY}... (http://en.wikipedia.org/wiki/Molecular_nanotechnology)";

What I found most enlightening in the above was computing technology and its usage with regard to OPTICS - the scanning tunneling microscope, to be precise.

Scanning tunneling microscope: http://en.wikipedia.org/wiki/Scanning_tunneling_microscope

A scanning tunneling microscope (STM) is a powerful instrument for imaging surfaces at the atomic level. Its development in 1981 earned its inventors, Gerd Binnig and Heinrich Rohrer (at IBM Zürich), the Nobel Prize in Physics in 1986. For an STM, good resolution is considered to be 0.1 nm lateral resolution and 0.01 nm depth resolution.[3]
[1] G. Binnig, H. Rohrer (1986). "Scanning tunneling microscopy". IBM Journal of Research and Development 30: 4.
[2] Press Release: The 1986 Nobel Prize in Physics:
http://nobelprize.org/nobel_prizes/physics/laureates/1986/press.html
[3] C. Bai (2000). Scanning tunneling microscopy and its applications. New York: Springer Verlag. ISBN 3540657150: http://books.google.com/books?id=3Q08jRmmtrkC&pg=PA345#v=onepage&q=&f=false

SEE Processor Manufacturing: http://www.pcguide.com/ref/cpu/char/mfg.htm

"Multiprocessing (http://www.pcguide.com/ref/cpu/arch/extSMP-c.html)is running a system with more than one processor. The theory is of course that you can double performance by using two processors instead of one. And the reality of course is that it doesn't work this well, although multiprocessing can result in improved performance under certain conditions. In order to employ multiprocessing effectively, the computer system must have all of the following in place:

* Motherboard Support: A motherboard capable of handling multiple processors. This means additional sockets or slots for the extra chips, and a chipset capable of handling the multiprocessing arrangement.
* Processor Support: Processors that are capable of being used in a multiprocessing system. Not all are, and in fact some versions of the same processor are while others are not.
* Operating System Support: An operating system that supports multiprocessing, such as Windows NT or one of the various flavors of UNIX"
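Just to make the multiprocessing point concrete, here is a minimal sketch of what using several processor cores looks like from the software side today (Python's standard multiprocessing module; the workload is a toy I made up):

from multiprocessing import Pool
import os

# Toy demonstration of spreading work across processor cores.
# The "work" (summing squares over ranges) is invented for illustration.
def chunk_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
    with Pool(processes=os.cpu_count()) as pool:
        partial = pool.map(chunk_sum, chunks)   # each chunk on its own core
    print(sum(partial))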

I believe I covered your first two areas adequately from the beginning, in that these technologies had existed since 1613, were well under way as noted directly below, and were certainly available in 2008 when this movie came out.

"Unix (often spelled "UNIX," especially as an official trademark) is an operating system that originated at Bell Labs in 1969 as an interactive time-sharing system. Ken Thompson and Dennis Ritchie are considered the inventors of Unix. The name (pronounced YEW-nihks) was a pun based on an earlier system, Multics. In 1974, Unix became the first operating system written in the C language. Unix has evolved as a kind of large freeware product, with many extensions and new ideas provided in a variety of versions of Unix by different companies, universities, and individuals [http://searchenterpriselinux.techtarget.com/sDefinition/0,,sid39_gci213253,00.html].";

I also know of actual usage as far back as 1969 with NASA's L.E.M. (Lunar Module). My father actually worked on the project when I was a young lad, and I was in the actual cockpit when it was decommissioned and newer versions were built with more computing power and information processing. This was the Lunar Module that was sent to the Moon and back. It was essentially a fully functional flying computer, with far more capabilities than were actually used or might have been necessary. Though mechanically it was not as sound, owing to the robotics technology of the day, computer-wise it was first-rate, top-of-the-line, multigenerational computing power in 1969. After all, computers only do what they are programmed to do.

Photonics & photonic computing were well under way by the 1970s, let alone optics & optical computing.

The term photonics developed as an outgrowth of the first practical semiconductor light emitters invented in the early 1960s and optical fibers developed in the 1970s {http://en.wikipedia.org/wiki/Photonics}. Computers work with binary, on or off, states. A completely optical computer requires that one light beam can turn another on and off. This was first achieved with the photonic transistor, invented in 1989 at the Rocky Mountain Research Center [http://en.wikipedia.org/wiki/Photonic_computing].

Optics technologies (http://en.wikipedia.org/wiki/Optics#Modern_optics), Image sensing (http://en.wikipedia.org/wiki/Image_sensor#Performance) and Contact Image Sensors (CIS) [http://en.wikipedia.org/wiki/Contact_image_sensor] had reached pretty accurate results by the 1990s [http://en.wikipedia.org/wiki/Image_sensor#Table_of_sensors_commercially_used_in_digital_cameras].


Optical character recognition, usually abbreviated to OCR, is the mechanical or electronic translation of images of handwritten, typewritten or printed text (usually captured by a scanner) into machine-editable text. It is used to convert paper books and documents into electronic files, for instance, to computerize an old record-keeping system in an office, or to serve on a website such as Project Gutenberg. By replacing each block of pixels that resembles a particular character (such as a letter, digit or punctuation mark) or word with that character or word, OCR makes it possible to edit printed text, search it for a given word or phrase, store it more compactly, display or print a copy free of scanning artifacts, and apply such techniques as machine translation, text-to-speech and text mining to it. OCR is a field of research in pattern recognition, artificial intelligence and computer vision (http://en.wikipedia.org/wiki/Optical_character_recognition).

A brief history of machine translation: the idea of machine translation may be traced back to the 17th century. In 1629, René Descartes proposed a universal language, with equivalent ideas in different tongues sharing one symbol. In the 1950s, the Georgetown experiment (1954) involved fully automatic translation of over sixty Russian sentences into English... Several papers on the topic were published at the time, and even articles in popular journals (see for example Wireless World, Sept. 1955, Cleave and Zacharov) [http://en.wikipedia.org/wiki/Machine_translation#History]... You can read the rest on your own; be sure not to leave out the footnotes section for validity.

Pattern recognition (http://en.wikipedia.org/wiki/Pattern_recognition#Uses) is "the act of taking in raw data and taking an action based on the category of the pattern"[1]. Most research in pattern recognition is about methods for supervised learning and unsupervised learning. Pattern recognition aims to classify data (patterns) based either on a prior knowledge or on statistical information extracted from the patterns. The patterns to be classified are usually groups of measurements or observations, defining points in an appropriate multidimensional space. This is in contrast to pattern matching, where the pattern is rigidly specified. [1] Richard O. Duda, Peter E. Hart, David G. Stork (2001) Pattern classification (2nd edition), Wiley, New York, ISBN 0-471-05669-3, [2] R. Brunelli, Template Matching Techniques in Computer Vision: Theory and Practice, Wiley, ISBN 978-0-470-51706-2, 2009 ([1] TM book). However I think this article would be a much shorter read... Pattern Recognition Info by Mauricio Orozco-Alzate : http://www.docentes.unal.edu.co/morozcoa/docs/pr.php
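To make "classify data based on prior knowledge" concrete, here's a tiny nearest-neighbour sketch (the points and labels are made up):

import math

# Tiny supervised pattern-classification example: label a new point by
# the closest labelled example (1-nearest-neighbour). Data is invented.
training = [((1.0, 1.2), "cat"), ((0.9, 1.0), "cat"),
            ((3.0, 3.1), "dog"), ((3.2, 2.8), "dog")]

def classify(point):
    return min(training, key=lambda ex: math.dist(ex[0], point))[1]

print(classify((1.1, 0.9)))  # -> "cat"
print(classify((2.9, 3.0)))  # -> "dog"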


Pattern Recognition (Journal of the Pattern Recognition Society): http://www.sciencedirect.com/science?_ob=ArticleListURL&_method=list&_ArticleListID=1221797413&_sort=r&_st=4&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=9694b4697c6474ef2126a22443015728

9. LAFTER: a real-time face and lips tracker with facial expression recognition
Pattern Recognition, Volume 33, Issue 8, August 2000, Pages 1369-1382
Nuria Oliver, Alex Pentland, François Bérard (CLIPS-IMAG, BP 53, 38041 Grenoble Cedex 9, France, Received 22 October 1998; accepted 15 April 1999):

"Abstract
This paper describes an active-camera real-time system for tracking, shape description, and classification of the human face and mouth expressions using only a PC or equivalent computer. The system is based on use of 2-D blob features, which are spatially compact clusters of pixels that are similar in terms of low-level image properties. Patterns of behavior (e.g., facial expressions and head movements) can be classified in real-time using hidden Markov models (HMMs). The system has been tested on hundreds of users and has demonstrated extremely reliable and accurate performance. Typical facial expression classification accuracies are near 100%."


Artificial intelligence: http://en.wikipedia.org/wiki/Artificial_intelligence

"The field of AI research was founded at a conference on the campus of Dartmouth College in the summer of 1956.[25] The attendees, including John McCarthy, Marvin Minsky, Allen Newell and Herbert Simon, became the leaders of AI research for many decades.[26] They and their students wrote programs that were, to most people, simply astonishing:[27] computers were solving word problems in algebra, proving logical theorems and speaking English [28]."

...I'm leaving the footnotes in because I'm getting tired of citing every portion of your claims with proportional factual evidence...

The Lisp machines (http://en.wikipedia.org/wiki/Lisp_Machine) were general-purpose computers designed (usually through hardware support) to efficiently run Lisp as their main software language. In a sense, they were the first commercial single-user workstations. Despite being modest in number (perhaps 7,000 units total as of 1988[1]), Lisp machines commercially pioneered many now-commonplace technologies — including effective garbage collection, laser printing, windowing systems, computer mice, high-resolution bit-mapped graphics, computer graphic rendering, and networking innovations like CHAOSNet.

See also : http://www.sciencedirect.com/science?_ob=ArticleListURL&_method=list&_ArticleListID=1221796493&_sort=r&_st=4&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=df573cc487d0731f820442794730dcd2

SEE, anybody can verify that the technology was already in place by the time this movie was made. Multi-core, multiprocessing, multitasking, hyperthreading - what a joke :) All this technology rolled into use at the level of the atom, yet somehow never conceived of at the macro level... All this computing power at our disposal as early as the '70s. I am being real.

reply

I sure hope you didn't kill your keyboard or give yourself an aneurysm writing that.

I read some of it, but I lost interest after a while (Go figure!!?).
Way, way too much text!

Top of every forum:"View: ... | ... | ... | nest" <-Choose "nest"!!

reply

--------------------------------------------------------------------------------
by - Tony_Hedlund on Wed Mar 3 2010 04:13:21

I sure hope you didn't kill your keyboard or give yourself an aneurysm writing that.

I read some of it, but I lost interest after a while (Go figure!!?).
Way, way too much text!
--------------------------------------------------------------------------------

Nope; as a matter of fact, I found it quite enjoyable... as did my keyboard. LOL. I enjoyed providing the information in response to your earlier post:


--------------------------------------------------------------------------------
by Tony_Hedlund (Thu Feb 11 2010 05:55:21)

Oh, man... Where do I even begin..?
...
--------------------------------------------------------------------------------

Most advanced technologies are not brought to the silver screen right away; only when these industries release the technology to the general public will you see any true advances. As you may or may not know, most funding for said technological advances is supplied by government contracts. I know this only because of a prior employment opportunity in the '70s: I did some defense contract work for ??? (so if I told ya, then I'd have to kill ya... only kidding), and besides, mentioning them isn't really important to this discussion. In any event, I've done actual board (drafting) work on some of the blueprints & schematics of said technology, so I did/do have the advantage of some prior knowledge. Also, as you may know, most of said technology had its birth in scientific and military applications first. Although I never saw the actual finished products back then, the schematics were clearly labeled as to what they were. I was fortunate enough, however, in recent years to see some of the finished components I had taken part in creating, once they were released to the public.

If you really think hard about it, Tony... somebody had to think all this up before you saw it on the silver screen.

How do you think they've put those rockets and explorers into space, and had them return safely home, since the mid-'40s? They weren't bottle rockets, you know; they were radio-controlled and computer-assisted machines. There was no CGI, but they really weren't just the work of science fiction either...

"The early era of space exploration was driven by a "Space Race" between the Soviet Union and the United States; the launch of the first man-made object to orbit the Earth, the USSR's Sputnik 1, on October 4, 1957, and the first Moon landing by the American Apollo 11 craft on July 20, 1969 are often taken as the boundaries for this initial period. The Soviet space program achieved many of the first milestones, including the first living being in orbit in 1957, the first human spaceflight (Yuri Gagarin aboard Vostok 1) in 1961, the first spacewalk (by Aleksei Leonov) in 1965, the first automatic landing on another celestial body in 1966, and the launch of the first space station (Salyut 1) in 1971. (http://en.wikipedia.org/wiki/Space_exploration)";

SEE ALSO: http://en.wikipedia.org/wiki/Space_exploration#History_of_exploration_in_the_20th_Century

--------------------------------------------------------------------------------
by Tony_Hedlund (Thu Feb 11 2010 05:55:21)

I'd like to see what this kind of supercomputer would look like in real life.

--------------------------------------------------------------------------------

So if you were alive back then, you've already seen it; you just didn't recognize it as such. I know I did.

Dislike what UR viewing _what UR hearing _whatever's happening! U could go elsewhere or turn it off

reply