Amusingly outdated


It fascinates me that even as recently as 1970, people were utterly stuck on the notion that a powerful computer simply HAD to be BIG....ENORMOUS...COLOSSAL. It never EVER occurred to them that circuit elements could shrink, or that a computer the size of a mountain or even larger would have VERY L O N G circuit pathways that would make it very slooooooooooow.

reply

Ho hum.

"Everyone is ignorant, only on different subjects". Will Rogers (1879-1935)

reply

Oh lord!

It's a movie.....

reply

It's built into a mountain so it can withstand nuclear attack. Also, inside is housed a nuclear reactor.

reply

It's not just built into a mountain. It's the size of a mountain. We clearly see how colossal the circuits are. Slow stuff lol

reply

... and the crew and staff need a place to sleep and eat to tend to its delicate circuitry... [snark!]

reply

They were stuck on the notion that it had to be so because at that time, ca.1968, supercomputers DID have to be big. The movie was made for a then-contemporary audience. It was supposed to be timely and relevant, tying directly in with the ongoing worries about nuclear Armageddon.

You seem to think that they were obligated to make a movie set in the future, and that they were somehow remiss in not accurately forecasting how technology would change in their future.

But that was in no way the filmmakers' intention. They were trying very hard to make a movie about their present and near-future.

And so, big honking computers it was.

reply

In terms of "big honking computers", it isn't outdated. Here is the supercomputer currently ranked the world's fastest, in an older picture:

http://phys.org/news/2013-06-tianhe-supercomputer-petaflops-title-contender.html

A short YouTube clip showing more of it here:

https://www.youtube.com/watch?v=GrM9WnE7J94

A picture of one of the other top supercomputers:

http://akihabaranews.com/2014/01/07/article/riken-institutes-next-supercomputer-exaflops-2020-351980900

Massively parallel processing speeds things up even with longer path lengths.

reply

I demanded no such obligation. I was simply observing how outdated their concept of the "future" was.

reply

well in 1970 a powerful computer HAD to be big...colossal.
1970 is not really recent in terms of electronics.
in fact a supercomputer today still has to be pretty damn big. what they showed in this movie isn't far from what supercomputers look like today in terms of size.

as far as your notion that the size would have much effect on its speed - that depends on how the computer functions. However, if properly designed and programmed, that would be insignificant compared with the processing capacity gained. electrons travel at the speed of light. traversing the distance from one side of the mountain to the other would take nanoseconds.

with parallel processing you don't need all the processes in the computer to communicate at the most optimal speed, yet you can still benefit immensely in performance. it's like assigning individual tasks to a number of workers who report back when they are done - they don't need to talk to each other while they are completing the tasks, as these are relatively independent functions.
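a rough toy sketch of that worker idea in Python (purely illustrative - the task here is made up, and it has nothing to do with how Colossus would actually be built):

from concurrent.futures import ProcessPoolExecutor

def crunch(chunk):
    # stand-in for one independent piece of work (hypothetical task)
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    # hand each worker its own chunk; they never talk to each other mid-task
    chunks = [range(i * 1_000_000, (i + 1) * 1_000_000) for i in range(8)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(crunch, chunks))  # results come back as each worker finishes
    print(sum(results))

the workers only communicate when a task is handed out and when the result comes back, so the latency between them hardly matters.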

apart from today's supercomputers, which pretty much all comprise multiple computers networked in a big datacenter, we also have equivalent cloud supercomputer networks that aren't necessarily all in one datacenter but can even span different continents, yet are interconnected and function over these huge distances.

reply

Actually, it’s the electromagnetic waves that travel fast, not the electrons themselves. Traveling over a distance of, say, a mile would take several microseconds, not nanoseconds. Also, comparing the distance over which today’s computers talk to each other with the relatively enormous distances in the CPU of Colossus is apples and oranges. The distance difference inside the CPUs alone would make Colossus’ internal processing thousands (if not millions) of times slower.
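A quick back-of-the-envelope in Python, just plugging in the speed of light (and real signals in copper or fiber are slower still, roughly 0.6-0.7 c):

c = 299_792_458          # speed of light in vacuum, m/s
mile = 1609.344          # metres in a mile
t = mile / c             # one-way travel time, best possible case
print(t * 1e6, "microseconds")   # about 5.4 microseconds one way, at absolute best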

reply

I was speaking loosely about electrons - that's widely used as shorthand for electricity. Electromagnetic waves move at the speed of light.
About the measure - I misspoke - I meant microseconds, not nanoseconds. That would be quite similar to the latencies in today's networked supercomputers.

Secondly, if you had read my post, parallel processing makes these distances/latencies largely irrelevant. You are also making assumptions about the "relatively enormous" distances in the CPU of Colossus - there is no mention of the detailed design of the supercomputer. Even in those days they knew about parallel processing, so the Colossus CPU was probably not ONE cpu but many distributed ones acting in coordination, or "in parallel". So minute latencies between them are not that relevant to its processing capability.

reply

You're right about parallel processing. I had the same thoughts and can't even begin to imagine how much computing power a complex the size of Colossus would have today. It truly would be impossible to out-think in the same situation.

reply

The larger size does not make the processing slower. It just means the queue is longer, and the latency to finish a particular command is higher. The commands can still be completed blazingly fast once queued up.
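A toy illustration of that point in Python (the per-stage figure is made up, not from any real machine):

latency_per_stage = 1e-9    # assumed: 1 ns per stage of the "pipe"

for stages in (10, 1000):
    latency = stages * latency_per_stage    # time for ONE command to get through
    throughput = 1 / latency_per_stage      # commands completed per second once the pipe is full
    print(f"{stages} stages: {latency * 1e9:.0f} ns latency, {throughput:.0e} commands/sec")

A longer pipe raises the wait for any single command, but commands still finish at the same rate once it's full.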

reply

I was a computer operator in the late 70s. I maintained the list of names of subscribers to a Publisher's various magazines so, later in the week, we could print thousands of labels to be glued to the monthly/weekly/daily magazine or newspaper. ('Footwear News' still cracks me up a bit.) Pretty dull job, but well-paying for not much actual 'work'. Besides, what girl in 1977 didn't go nuts for a tie-wearing computer nerd who hated disco?

The Honeywell 3000. It was not State-of-the-Art at the time; it was from 1969 or '70, about the timeframe of the movie. (As far as I remember - c'mon, it was the 70s and I was young!)

I was in a big, glass-enclosed HVAC room, with floors raised nearly 2 feet for all of the cables and air-conditioning/filtering conduits to keep two separate computers running 24/7. (The other was a more modern IBM, twice the size - hmmm, computer envy?) If the temperature rose above 78°, there was pandemonium.

With a row of about 7 of those James Bond, tall-as-a-person tape machines lined up in front, I sat at a console with a keyboard/printer and a Punched Card reader. The program about to run was fed into the reader, and so was any new Input data that was Sorted into the Master List.

We could only run One program at a time. Programmers begged us to bump their tests up on the schedule, so they could go home at a decent hour.

And yes, there was a 'Card Sorter'; the kind every G-Man worth his salt would lean over, snatch a single card out and proclaim, "Well, here's your man - pick him up, boys."

The main Memory Core on this Honeywell was held in 2 big metal housing units, about 10 or so feet long, 4 ft high, and 3 ft deep. (Think of the deli counter at your supermarket.)

This Memory Core (RAM - Random Access Memory) was composed of thin electric wires that were woven through the holes of many little magnetic 'donuts'. When One of the wires was electrified, the magnetic field of the little 'donut' ran in one direction; if Both wires were 'live', the field was reversed. A Third wire measured the direction: one way was ON (1), the other direction was OFF (0). Each one of these 'donuts' was an actual physical Bit of information. If you could open it up (you couldn't, only techs could), you could See Bits and Bytes of information that haven't been visible to us since the Silicon Chip!
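If you want to picture how a single 'donut' got picked out of the grid, here's a loose toy model in Python of the usual coincident-current addressing trick (simplified and generic, not anything Honeywell-specific): only the core where a half-powered X wire and a half-powered Y wire cross gets enough current to flip.

import itertools

# a tiny 4x4 plane of cores, all magnetized to 0
cores = {(x, y): 0 for x, y in itertools.product(range(4), range(4))}

def write_one(x_sel, y_sel):
    # each core sees 0, 1 (half-select) or 2 (full-select) units of current
    for (x, y) in cores:
        current = (x == x_sel) + (y == y_sel)
        if current == 2:
            cores[(x, y)] = 1   # only the fully selected core flips

write_one(2, 3)
print(cores[(2, 3)], cores[(2, 0)])   # 1 0 - only the addressed donut changed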

The storage capacity of these 2 monstrous units was a whopping . . . 64k of memory!

64,000 tiny magnets, criss-crossed with miles of copper wire, as big as a queen-size bed, that needed to be housed in its own climate-controlled environment for what today is an unimaginably small amount of data storage. The tablet on my lap holds 80 Gig. No big deal.

Now, imagine building a computer with "a brain the size of a planet" during those times. It's no wonder they needed to store it in a mountain!
Computers in the 70s were gigantic, petulant, power-sucking, fragile, finicky Baby Hueys. They did one job at a time, weren't connected to each other, and generated so much heat that that mountain would have exploded like a volcano.

When I think of how quickly technology has changed since then, and how fast all of us, whatever our age, have had to adapt to these changes, it's a wonder more of us are not blubbering technophobes.
But then I just think of my Grandmother, who was born before manned flight and the light bulb. She lived well into the 1990s, and she seemed to have managed just fine.



Now listen here, you mugs, nobody gets to say 'Meh' anymore unless you're Edward G. Robinson, see?

reply

lol

reply

Thanks for the best read I've had on these IMDB rant-boards in a long time, EtaoinShrdlu_56.


reply

@Dali-Fan:
Thanks!
I do tend to go on a bit at times, but I do have fun and try to make it interesting.
Might be cooler still, if I actually saw the movie and commented on that, instead of my old jobs!

Appreciate the note.
BTW: Also a big fan of Dali, . . . and Duchamp!

ETA
(Name is just letters, in order of use in English)


Now listen here, you mugs, nobody gets to say 'Meh' anymore unless you're Edward G. Robinson, see?

reply

Core memory units are fascinating devices. I got to handle and look at a few of those in a computing course I did years ago. Utterly fascinating pieces of hardware.

At least I thought so.

reply

I loved your post and I know exactly what you mean. I started in the tech industry around the time that a lot of the technology you speak of was being phased out. I am glad I got to see it, because it really makes me appreciate the kind of work that went into the world we live in now, which includes information on demand. None of it happened by magic and it didn't take place overnight. Many people take it for granted.

One of my first jobs was setting up the hardware and writing software that allowed government and corporate entities to communicate and retrieve information over great distances. I won't claim to have invented the internet, but I sure as hell helped to usher it in. I remember when the IMDB was a Usenet newsgroup before it was a "web site". The person who posted the comment does not really understand technology or appreciate the message about technology and human frailty in the Forbin Project.

If they do a remake, I hope they keep the focus on the awesome story and not try to wow everyone with a bunch of stupid special effects.

reply

Nice narrative. Thanks for the information.

reply

"Powerful" computers are still "colossal" - we just keep scaling up our definition of "powerful". So, data centers (will) still consume football fields.

To your point on circuit length, looks like Admiral Grace Hopper had you covered around the time of this film - https://m.youtube.com/watch?v=JEpsKnWZrJ8 .

Please notice she was elite before either of us were in diapers.

Best,
-Greg

reply

Not outdated at all. In fact, we are approaching a level of computer intelligence where the premise of this movie could be possible. And supercomputers are still relatively large.

reply

For a movie made 35 years ago, it is quite prescient.
IMO this movie is not outdated; it is underrated.

reply

In the 1970s, not really knowing what was involved, people imagined that it would take a "colossal" computer to intelligently manage the country's nuclear missile arsenal. Today, we know that a particularly high level of machine intelligence would not be required; a moderately sized control network would be more than adequate.

The whole reason for creating Colossus was that human judgment could not be trusted with the power to destroy the earth; a computer would evaluate situations logically and dispassionately. Then they turn around and give Colossus human intelligence, completely defeating the whole purpose of building it in the first place.

Further, they placed their trust entirely in Colossus, and didn't even have a contingency to retake control if something went wrong. Had Forbin never heard of CTRL-ALT-DEL? IMNSHO, they got what they deserved.

reply