LOL Wow do I "feel" old! My very first interaction with a "computer" was entering "data" into a Sperry Univac 1219, and seeing the simple result PRINTED on its terminal! ~ Circa 1982 LOL ;)
LOL ;-) you feel old!! As an assistant, I remember delivering a program on cards (probably Fortran) and then collecting the printout the following day in the late 60's.
That's a REALLY interesting question. I know they had a device that would let them put images like that onto a TV screen... it's what was used for things like test patterns. You could certainly use something like that: have a template slide along with one containing the text you wanted to display. That would work, but it would be interesting to know how they actually did it.
@darkwood777 Hi, and thanks for your service (in industry, not military). The jobs did disappear and were replaced by digital design professions: however, what I miss, is the professionalism, ethos, and precision of the past. That was irreplaceable.
@darkwood777 Hand draftsmen are still in demand, though not as necessary as they used to be, but computer draftsmen are in fairly heavy demand for designing 3D models and blueprints.
When I started college my plastic-cased calculator could do all that plus scientific functions (logs, trigonometry, etc.). Our computer had punch cards and was in a building on main campus. Our punch card data was fed to it over phone lines. Many of the code execution errors came from mistakes punched into the cards. It took about 6 to 7 hours to get your results. Who said those old machines were fast?
Never say "never" )) GANs, btw, had already been imagined by then in simplified mathematical form. You can find speculation about learning systems in Norbert Wiener's early works. Great vid, btw, thank you for sharing
Today's computers still basically work the same way, only a lot faster. Information is still encoded in bits, now by nanoscopic transistors. There are still registers and clocks doing exactly the same things for the same purposes. Optical and magnetic media still encode information onto the surface of something and read bits by passing a detector over it.
@@alexcarter8807 Wonder if anyone caught that. Computers have not only gotten smaller and more powerful, which has reduced staff, but are performing tasks once handled by enterprise managers and other MIS and CS workers. The last guys in the field with a future are the software engineers, and before long, virtualization, the cloud, AI and AGI will be replacing them too. When describing the limitations of the capabilities of computers in this film, they left out one word: _"YET"_
@@Schindlerphoto The difference between this and the airplane is that like cars and the internal combustion engine, this technology is designed to REPLACE the carbon-based entity that performed the operation previously, not enhance or aid them in any way. At first, sure: that's the sell job that gets people to buy in. Buy into their own replacement....
Modern civilization: Digital is a word no savage would understand. It's a new universe inside our computers! Ancient civilization: Let's count on our fingers. It's digital...
In the 1960s and probably a while into the early 1970s as calculators became affordable, high school math and science classrooms often had a giant-sized slide rule, on the order of seven feet long, on which the instructor could demonstrate needed calculations. Our high school chemistry teacher (whose precise writing caused some students to describe him as 'typing on the blackboard') had the uncanny ability to sort of shove the moving components with a toss and have them land very close to the desired place.
Just remember digital technology was developed in Bell Labs, starting with the transistor. The No. 5 ESS digital electronic switching system and fiber-optic digital switching were also developed in Bell Laboratories.
The Bell Telephone Company was at that time prohibited by the US Government from going into the transistor manufacturing business. They licensed the technology to Sony and others. Soon Japan totally wiped out the US consumer electronics manufacturing base.
The only problem is the "man's bidding" part; we need a computer to stop us when we start getting stupid and destructive. Computers are more dangerous than many conventional weapons. Look at the mess the internet and misinformation have caused at man's bidding.
@Sammy Reed That is not what varjagg meant. TVs from the 40s to the 90s were most definitely analog TVs. varjagg is referring to young people today using "analog" incorrectly to refer to some old fashioned things. For example, they might refer to a toaster from the 70s as an "analog toaster". A toaster from today with a microprocessor and digital display in it does not make it a "digital toaster". The only thing that would make a toaster digital would be if you could individually address small areas on the bread to burn a picture into it. ...and even then it might not be digital, because you could do that with analog electronics as well if the heating element were on some kind of X-Y plotter.
I remember paying $25.00 for my Pickett slide rule in 1970. This was just before the first 8 digit pocket calculator which was about $150.00. You now sometimes get these for free when you donate to a charity.
If my grandfather had seen this film, he would have been shocked. And yet he sensed that such devices would exist, and even more, he foresaw the computer's impact on the world 50 years after he died.
I hope the follow-on videos mentioned at the end show up. BTW, one of the more common computers in 1962 was the IBM 1620 that was, in fact, a decimal machine, not binary. (I'm currently learning its assembler.)
Welllll, strictly speaking, BCD (Binary Coded Decimal). At the bottom, each decimal digit was still represented in memory by six bits of binary data, four bits for the digit (or some other codes), a bit for a parity check, and a flag bit to mark sign or most significant digit. So while it presented and received decimal data, it was stored in a binary form, i.e., there were no ten-state components analogous to the two-state binary cores.
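That six-bit digit cell can be sketched in a few lines of Python. (The bit layout here, flag then parity then four data bits, is an illustration of the idea; the real 1620 hardware differed in details.)

```python
# Sketch of a BCD digit cell: 4 data bits + 1 parity bit + 1 flag bit.
# The layout (flag, parity, d8 d4 d2 d1) is illustrative, not the 1620's exact wiring.

def encode_digit(d, flag=False):
    """Pack one decimal digit 0-9 into a 6-bit cell with odd parity."""
    if not 0 <= d <= 9:
        raise ValueError("not a decimal digit")
    bits = d & 0b1111                      # four data bits
    ones = bin(bits).count("1") + flag     # 1-bits so far, counting the flag
    parity = 0 if ones % 2 == 1 else 1     # odd parity: total 1-bits must be odd
    return (flag << 5) | (parity << 4) | bits

def decode_digit(cell):
    """Unpack a 6-bit cell back into (digit, flag), checking parity."""
    bits = cell & 0b1111
    parity = (cell >> 4) & 1
    flag = (cell >> 5) & 1
    if (bin(bits).count("1") + flag + parity) % 2 != 1:
        raise ValueError("parity error")
    return bits, bool(flag)

# The number 42 as two digit cells, with the flag set on the first digit
# (the 1620 used flags to mark sign / most significant digit):
cells = [encode_digit(4, flag=True), encode_digit(2)]
print([decode_digit(c) for c in cells])   # [(4, True), (2, False)]
```

So each decimal digit lives in its own little binary cell, which is exactly the "decimal on the outside, binary underneath" point above.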
It was about that time in history when the U.S. government saw how powerful a computer could be. They gathered many experts, asked how much it would cost, and authorized a project to make the most powerful computer possible to answer the world's most difficult questions. With this task completed, they asked the computer the biggest question of all. "Is there a God?" The computer answered "There is now."
Funny thing: we technically do have flying cars in 2021 (a prototype was successfully tested between cities this year), even though this comment was a bit sarcastic..
Simple yet elegant concepts, conceived during WW2 with only what was available, relying on an invention from several decades before the transistor. Vacuum tubes: instant flow or interruption of electrons. I'm amused how people keep comparing and contrasting the enormous square yardage these machines needed, while taking the revolutionary idea of binary code oh so for granted. Fast marching over ground laid by visionary people. Paper kite rationale.
*Oh how well we all remember these goofy and tacky films from the 50's & 60's... with the dramatic trumpet music blasting out their dramatic tune like something really important was about to occur! lol.*
If you ever run across it, please post the US Navy film about Sercey (or is it Cercy) and how she promoted carelessness in setting Material Conditions of Readiness.
Watching this on my 256GB smartphone that can do more/store more data than all of the computers in the world when this film was made. Insane how far technology has come in 60 years. Also, never knew what "bits" stood for before this. Our entire world is based on binary code. The computers used in machinery to make your home goods, your cars, your beds, your showers: every single thing is based on something we always had access to before we even knew we had access to it 😳
@@makeart5070 Ahh yes, binary is easy to read. It's only 2 possible values. -_- Tell you what, explain how computers operate so that I can begin assembling my own machines from scrap electronics and coding on them in my own languages, then I'll accept that it's simple.
@@trajectoryunown Read up on the internals of the microprocessors of the 1970s--say 8080, 6502, Z80, 6800. It's pretty simple. It would take some time and money, but one man can build a simple CPU out of off-the-shelf logic chips so that it would work basically like one of those microprocessors, but slower. He would need basic understanding of electricity, like maybe from a college physics course or from an electronics-tech school. He could build an assembler, OS, and compiler too. It might take decades to do all that though.
Here I am watching this on my laptop that can do anything I want, stores the data on a 2.5 inch HDD that can hold 1TB of data. Yes I know about SSD and even more. But this is what I have right now. If those people in the video are alive today to see what we have now, I wonder what they would think.
Still applicable in today's world of computers. I don't recall ever seeing a video on computers when I was in programming class in the early 80's at an institution. They did have a card punch tucked into the corner... never used it, as magnetic data had caught on some years prior.
Computers do not compute by "counting." They add with successive addition of bits. They subtract with successive addition of bits. They multiply and even divide with... successive addition of bits.
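That "everything reduces to addition of bits" claim can itself be pushed one level further in a quick Python sketch: addition reduces to XOR plus carry propagation, and multiplication to repeated shifted addition. (A toy illustration, not how any particular CPU is wired.)

```python
def add(a, b):
    """Add two non-negative ints using only bitwise ops: XOR gives the
    sum-without-carry, AND shifted left gives the carry; loop until no carry."""
    while b:
        carry = (a & b) << 1
        a = a ^ b
        b = carry
    return a

def multiply(a, b):
    """Shift-and-add multiplication: for each set bit of b, add a shifted copy of a."""
    total = 0
    while b:
        if b & 1:
            total = add(total, a)
        a <<= 1
        b >>= 1
    return total

print(add(19, 23))       # 42
print(multiply(6, 7))    # 42
```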
The Hindu-Arabic numeral system is an Indian positional decimal numeral system, and is the most common system for the symbolic representation of numbers in the world. It was invented between the 1st and 4th centuries by Indian mathematicians. The system was adopted in Arabic mathematics (also called Islamic mathematics) by the 9th century. (per Wikipedia)
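Positional notation is the same trick in any base: the value of a digit string is a sum of digits times powers of the base. The film's binary is just the base-2 case. A minimal sketch:

```python
def value(digits, base):
    """Interpret a list of digits, most significant first, in the given base."""
    total = 0
    for d in digits:
        total = total * base + d   # shift everything one place left, add new digit
    return total

print(value([1, 9, 6, 2], 10))        # 1962 in decimal
print(value([1, 0, 1, 0, 1, 0], 2))   # 42 in binary
```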
Did a days work experience in the 60s delivering loose bundles of punch cards to and from offices and computer rooms. Had no idea what I was holding at the time. All I was told was “don’t drop it”.
I had a T-shirt custom made back in the early 80's while going to school. "Programmers do it with Logic" is what it said. Loved that shirt. Not quite the thought-provoking shirt that you liked though.
So the basic principles for programming were already well established in 1962, that's pretty neat. I wonder what kind of sorting algorithms they used back then
For punched cards, this was often done as an offline procedure using tabulation equipment like the IBM 083 Card Sorter. You would run the deck into the machine, it would sort one column's data into separate hoppers, then you would stack them together, and repeat the procedure on more significant columns. IBM had an educational film with Bob Newhart having cards explained to him by Herman Hollerith. th-cam.com/video/pfskp4R53Q0/w-d-xo.html
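That card-sorter procedure (sort on the least significant column first, restack the hoppers in order, repeat on more significant columns) is exactly least-significant-digit radix sort. A Python sketch with ten "hoppers":

```python
def card_sort(deck, columns):
    """LSD radix sort, mimicking an IBM 083-style card sorter: one pass per
    column, least significant first, distributing cards into ten hoppers."""
    for col in range(columns):
        hoppers = [[] for _ in range(10)]      # one hopper per digit 0-9
        for card in deck:
            digit = (card // 10 ** col) % 10   # the digit punched in this column
            hoppers[digit].append(card)
        # restack the hoppers in order 0..9 and feed the deck back in
        deck = [card for hopper in hoppers for card in hopper]
    return deck

deck = [802, 24, 1962, 75, 66, 170]
print(card_sort(deck, columns=4))   # [24, 66, 75, 170, 802, 1962]
```

Each pass is stable (cards keep their order within a hopper), which is why sorting the low columns first still works out.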
I remember in the early 70's my friend's dad was an Air Force officer and he brought home boxes and boxes of punch cards. We would build towers as high as we could reach.
I used them in college. One mistake and your entire program was useless! There was no monitor, only a teletype machine. Before that, you had paper punch tape, and before then, metallic punch tape. We used to make Christmas decorations and artificial flowers out of them.
Here's the issue: Tens of thousands of films similar to this one have been lost forever -- destroyed -- and many others are at risk. Our company preserves these precious bits of history one film at a time. How do we afford to do that? By selling them as stock footage to documentary filmmakers and broadcasters. If we did not have a counter, we could not afford to post films like these online, and no films would be preserved. It's that simple. So we ask you to bear with the watermark and timecodes. In the past we tried many different systems including placing our timer at the bottom corner of our videos. What happened? Unscrupulous YouTube users downloaded our vids, blew them up so the timer was not visible, and re-posted them as their own content! We had to use content control to have the videos removed and shut down these channels. It's hard enough work preserving these films and posting them, without having to spend precious time dealing with policing thievery -- and not what we devoted ourselves to do. Love our channel and want to support what we do? You can help us save and post more orphaned films! Support us on Patreon: www.patreon.com/PeriscopeFilm Even a really tiny contribution can make a difference.
I remember my childhood in one of the R&D cities of the USSR, Lviv. I played with parts of these kinds of machines as my father was swapping an old one out for one of the first Intels, received from Japan in the late 80s.
I studied computing at university on IBM machines similar to these. I also used a slide rule instead of a pocket calculator. I knew the telex and the fax. The mechanical typewriter, then the electric one, the electronic one, and printers... Fortunately I got to live through the era of the transition from vacuum tubes to transistors and from there to nanotechnologies, from analog to digital, from mechanical to electromechanical and then to electronic, from machines to robotics and mechatronics... hahahahahahaha
I love the typography in old government films. The "unclassified" title is so elegant.
Like 'I Love Lucy'..
@MrFattyfatfatboy "Man" has a long history of use as a generic. "The Wright Brothers" is a generic for the inventors of flight. No one remembers the one who took the first flight. They wanted it that way. "Sharing". Before the first moon landing, Buzz Aldrin made an issue of who would take the first step on the moon. Everybody knows Armstrong took the first step. Which rule are you playing by?
@@dpsamu2000 In fact, the Germanic word “mann” was originally gender-neutral. It simply means “human being”. You add a prefix to denote subdivisions, like “wifmann” which eventually became woman. The Germanic word for male was “were”, which still survives today in the word “werewolf”. And knowing is half the battle.
@@MrStephenRGilman Good point. Nobody says anything about werewolves inventing computers. Gotta wonder where's the beef these people have about man. Coulda been werewolves.
@MrFattyfatfatboy I’m 62 and remember for the first half of my life, man was understood as having two meanings. One was a short version of Human and the other the male perspective we choose to politically manipulate people with today. Never heard females complaining until around 1990 when it became another way to divide us to garner votes.
The US Army, post WW2, did a few things very well. Making training films that teach complex ideas in a simple and easy-to-understand way was one of them. They are beautifully simple but complete learning tools.
That's because back in the day, people actually understood how things worked. Unlike today. No one today understands anything. Most folks don't know how a TV, microwave, phone, radio or even a car works. They simply don't understand the basic things. The world is full of ignorant people.
These computers took up nearly a whole room and now I'm watching this film on a computer hundreds of thousands of times more powerful in the palm of my hand. Pretty cool to think about how far we've come with technology.
Without these, we'd have nothing.
@@captainamericaamerica8090 Tell that to someone in 1935.
The next big shifts after this stage were PCB-based computers and after that, hard drives that allowed permanent program storage. Home Computers became popular with the likes of the C-64 and later the IBM 5150 right at the threshold between those last two stages.
@@trekaddict I loved my C64 and later the Amiga...now we have windows 10 :-(
@@jonathont5570 To be fair, I've never had any issues with Win 10, and I'd rather have my Quad Core i7 CPU to play my games on. (FYI, 90% of issues with Win10 were caused by Microsoft's utterly broken upgrade system. A clean install from a Win10 disc that can/could be activated by any legit Win7/8 key works fine. Has for me on two different computers so far.)
When I first started working for Western Electric back in 1963, I entered a program in 1964 learning about digital electronic switching. Our first electronic switching system was called No. 1 ESS. Then my education continued: learning all about TSPS (Traffic Service Position System), along with ETS (Electronic Translator System), then No. 4 ESS, the all-digital electronic switching system, and then, with the Laboratories, No. 5 ESS, the first digital fiber-optic switching system. Many, many years in school, learning Unix, the operating system for the current systems. Contrary to what people think about mainframe computers, those are toys in comparison. 36 years of service with the Bell System; they were a fantastic company until, in my opinion, the breakup in 1984 by Judge Greene, breaking the system up into seven independent operating companies.
What language are you trying to speak? I can't understand this gibberish.
The AT&T break up ruined it for the telecommunication system. After the break up customer service took a nose-dive, the quality of voice transmission sunk to levels never before seen and the monthly service bill skyrocketed. The American public were the big losers because of that insane deal.
Remember, when sharing info, many people want it shoved down their throat with as little thinking as possible required on their part.
The breakup, as with most other actions taken in the business and financial sectors, is of course to allow large corporate entities to make massive amounts of profit through speculation.
This mindset, which seems to have begun in the 1970's, has severely damaged the stability of the economic structure and caused hardship for millions of people who counted on stable, dividend-producing companies like the old AT&T to generate retirement income.
It has now been carried to extremes in areas like pharmaceuticals, with the result that it is now impossible to find many drugs that are made in this country.
"Mankind has invented a machine that extended the capabilities of his hand & mind", best conclusion ever +
Everything is pretty well explained.
Don't most new inventions and machines do that?
VidkunQL well i guess so😀
@@VidkunQL
I think it was a misquote. The narrator said essentially that for thousands of years Man had invented machines that extended his muscles. Now, for the first time, Mankind has invented a machine that extends his mind.
@@VidkunQL It's basically the point of technology, to overcome your species' physical and mental shortcomings.
Maybe one day there would be a lack of distinction between the tool and hand that uses it.
I love these kinds of historical archives for learning how far technology has come through the years. Today every person, no matter their age, takes computers for granted, but computers are a true miracle, as fascinating as how the ancient pyramids of Egypt and Mexico were created. The human mind is a powerful tool, blessed to create such marvels; every person involved in the creation and development of modern computers deserves our eternal appreciation and gratitude. I hope technology can help us develop together instead of being a weapon of destruction (both physical and virtual, as we live every day with the internet full of hate and fake news). No political sides, no religious or race fights, just binary code.
Too bad people have forgotten how to speak and write English. Damn gibberish.
Sorry to rain on your parade, but the reality is that some of the biggest advancements in computing came about through efforts to win wars, whether it was the hot war of WW2 or the Cold War that followed. There are probably social and political reasons for this, and not all computing advances are the result of military work, but many of the most important and powerful are, including the internet we all use today for everything from online learning to fake news, from e-business to ransomware. As the video says, information technology amplifies the power of the human mind for good or ill.
@@SanjaySingh-oh7hv I am sure porn helped too!!!🫂😂
Quality programming right here. So interesting to watch and compare to where we are today. Thanks Periscope!
Actually, today computers still work the same way as back then only a lot faster.
@@hamobu In fact, IBM Z mainframes, which are still available, are to a large extent backwards compatible with System/360 computers from the 60s. Modern computers, however, can differ quite a bit as a result of refinement over the decades, such as RISC processors providing low-power, efficient performance for mobile devices.
@@hamobu RISC is an important difference. Processor designs like x86, which have a single 'MOV' instruction that can 'move' values around memory or internal registers as well as calculate array offsets, require a bunch of complexity to implement properly. RISC instead splits these into much simpler, more obvious instructions, like LOAD, SET, ADD, STORE, requiring less physical space to implement them, which lets you run them at higher clock rates to compensate for the overall higher number of instructions in a program.
@@kreuner11 yeah. Back then memory was expensive so they created complex instructions to make programs smaller
@@hamobu yes, another factor was speed of compilation; programming languages operate with higher-order instructions. It is easier to always translate '=' to 'mov'
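The MOV-splitting described in this thread can be shown with a toy Python model of the two styles: one CISC-like memory-to-memory move versus the same copy spelled out as simple register steps. (The machine model and mnemonics are invented for illustration, not any real ISA.)

```python
# Toy illustration: a CISC-style "MOV mem[dst+i], mem[src+i]" collapses address
# arithmetic, loading, and storing into one instruction; a RISC-style machine
# spells out the same work as simple LOAD/ADD/STORE steps through registers.
# The machine model and mnemonics here are invented for illustration.

mem = {100: 7, 101: 8, 102: 9}   # a small "array" stored at address 100
regs = {}

def cisc_mov(dst_base, src_base, i):
    """One complex instruction: compute both offsets and copy memory to memory."""
    mem[dst_base + i] = mem[src_base + i]

def risc_copy(dst_base, src_base, i):
    """The same copy as four simple steps through registers."""
    regs["r1"] = src_base + i        # ADD   r1, src_base, i   (address arithmetic)
    regs["r2"] = mem[regs["r1"]]     # LOAD  r2, [r1]
    regs["r3"] = dst_base + i        # ADD   r3, dst_base, i
    mem[regs["r3"]] = regs["r2"]     # STORE [r3], r2

cisc_mov(200, 100, 0)    # one instruction...
risc_copy(200, 100, 1)   # ...versus four simple ones, same effect
print(mem[200], mem[201])   # 7 8
```

Each RISC step is trivial to implement in hardware, which is the space-for-instruction-count trade the comment describes.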
Just had to mention this: I loved the perfectly clear and well-spoken English of this video.
Yes, in a world full of Indian tech support, it sure is nice to hear someone speaking in actual English.
The narration of this film is excellent and tasteful. I believe that the microcomputer is the best computer invented.
Gotta love the font choice for "unclassified".
If something is "classified" then it gets the mean font to let you know it's serious. Once "unclassified" it gets the friendly, more approachable font.
9:56 (shows diodes) Narrator: "by means of transistors" (scrolls down to reveal transistors) Narrator: "semiconductor diodes"
lol
Wow, yer reeely smot, an the guy in the moovy is so dum.
@@profd65 It's just silly the error was made.
Great watch! I'd be interested in watching this series's future tapes in sequential order.
If you need beads "to count your children" then that's a lot of children.
depends on how fast they move
Would be interested in seeing the other episodes since at the end it's mentioned they'd tell us more about some of the details.
If I were a school teacher I would show this on April 1st and tell them we are going to learn about a cool new technology coming out that we might get to use in the future.
This made me remember typing punched cards at college. Once I finished the bunch of cards, feeding them to "multivac" and, when expecting results, receiving a "typing error" warning... And restarting the whole process again. At least I learnt the concept of "iterations".
1962 what a great year
I am happy to have one of those indispensable tools - a slide rule. Might come in handy some day.
These old films are fun. More!
"This U.S. Navy film has been adopted for Department of the Army use." Translation, We dumbed it down so the Army could understand it!
_Oh!_ You'd be in a lot of trouble if we knew how to swim!
Heard
Understood
Acknowledged
My father Paul Loatman & George Gamow were the 2 physicists who ran the first computer, Uniac, at Naval Research Labs during the war. George went on to get the Nobel in physics & my father went on to become Chief of Research & Development for the Army, bringing the technology to it. In both cases the common application was computing aiming tables every time a new barrel or breech was designed. Prior to this it took 200,000 shots to generate them. Usually some breeches or barrels exploded and 26 men died on average. Once the computer was deployed only 2,000 shots were needed, 26 men's lives were saved, and new weapons were in combat months or years quicker.
😮😂
@@VidkunQL 😜👍
Wow, just 2 years before I was born! It looks like the main control room in the U.F.O. series in 1970!!
At the base of all computers, is still the same. The one thing that never changes is the way they work. At the very base is the binary. The only thing that changed, was the speed, and size of storage. The more powerful, the more it will do.
Came to say this!
And the smaller the storage medium, the more data it can hold !
You could not be more wrong. A mechanical computer does NOT use binary. In fact, they are as far from digital as you can get. They tended to use analogue pulses of electricity to turn gears and cams. They did not use binary in the slightest.
@@protoborg The answer is 42 - OR NOT.
That's about as binary as it gets ! Either it is (1) or it isn't (0).
@@protoborg Mech comps can be digital or analogue.
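The thread's point that only speed and storage changed, not the binary foundation, can be shown in a couple of lines. A small sketch (the `value` reconstruction is just one way to rebuild a number bit by bit):

```python
# Binary underlies it all: the same representation rules in 1962 and today.
n = 42
bits = bin(n)                # '0b101010'
assert int(bits, 2) == n     # round-trips back to the same number

# Rebuilding the value from its individual bits, least significant first:
value = sum(int(b) << i for i, b in enumerate(reversed(bits[2:])))
print(value)
```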
Dated, but still accurate.
Word (15:40) ......
I like the glowing green periscope film intro.reminds me of my old monochrome monitors.
Calculus is also the mathematical study of continuous change, originally called infinitesimal calculus or "the calculus of infinitesimals".
LOL Wow do I "feel" old! My very first interaction with a "computer" was entering "data" into a Sperry Univac 1219, and seeing the simple result PRINTED on its terminal! ~ Circa 1982 LOL ;)
LOL ;-) you feel old!! as an assistant, I remember delivering a program on cards(probably fortran) and then collecting the printout the following day in the late 60's.
I learned Fortran 4 on an IBM punched card system.
Sucked when you dropped your box of cards
I used these types of computers when they were brand new.
How did they make the slick graphics at 6:25? It looks digital, and almost like Powerpoint from 1990.
That's a REALLY interesting question. I know they had a device that would let them put images like that onto a TV screen... it's what was used for things like test patterns. You could certainly use something like that, have a template slide along with one with the text you wanted to display. That would work, but it would be interesting to know how they actually did it.
It's simple animation, as old as film-making itself.
@darkwood777 Hi, and thanks for your service (in industry, not military). The jobs did disappear and were replaced by digital design professions: however, what I miss, is the professionalism, ethos, and precision of the past. That was irreplaceable.
They made more complex Tom and Jerry cartoons in the 1930's, that would've been nothing for the 1960's.
@darkwood777 Hand drafting is still in demand, though not as necessary as it used to be, but computer draftsmen are in fairly heavy demand for designing 3D models and blueprints
I recently bought a second hand slide rule plus a book on how to use it. Fascinatingly clever devices.
In the future you will learn the meaning of new words like...
"and"
I imagine that the logical use of the word in computation was not something people were accustomed to at the time.
He said 'special meanings' and didn't describe the terms as 'new'.
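The special computing meaning of "and" the film hints at is just the logical conjunction: true only when both inputs are true. A minimal sketch of its truth table (the `table` name is mine):

```python
# The computer's "and": output is 1 only when both inputs are 1.
table = {(a, b): a & b for a in (0, 1) for b in (0, 1)}
for (a, b), out in sorted(table.items()):
    print(f"{a} AND {b} = {out}")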
When I started college my plastic-cased calculator could do all that plus scientific functions (logs, trigonometry, etc.). Our computer had punch cards and was in a building on main campus. Our punch card data was fed to it over phone lines. Much of the failure in executing code came from mistakes punched into the cards. It took about 6 to 7 hours to get your results. Who said those old machines were fast?
Never say "never" )) GANs, btw, had already been imagined by then in simplified mathematical form. You can find speculations about learning systems in Norbert Wiener's early works. Great vid, btw, thank you for sharing
Today's computers still basically work the same way, only a lot faster. Information is still encoded in bits because they use nanoscopic transistors. There are still registers and clocks doing exactly same things for the same purpose. Optical and magnetic media still encodes information onto a surface of something and reads bits by passing a detector over it.
A lot more efficiently as well
No shit?
@@foobarmaximus3506 Try prunes.
Awesome - material like this always reminds me of Fallout. Love the voice.
5:08 Love the analog computer!
1+1=1 !!!
If you add 1 analogue computer to a second analogue computer you get 1 analogue computer but it's a bigger one.
I've found 2 analogue computers playing tug o' war with a belt !
Just search YT for Hubnut CVT.
Right.. only it's 2 x as big!
This was packed full of interesting information.
That whole computer thing is just a passing fad, it will never catch on! LOL
I wish I'd believed that and gone into something with more of a future, like shoemaking.
You may not be ready for it yet but your kids are gonna love it.
@@alexcarter8807 Wonder if anyone caught that. Computers have not only gotten smaller and more powerful which has reduced staff, but are performing tasks once relegated to enterprise managers and other MIS and CS workers. The last guys in the field with a future are the software engineers now, and before long, virtualization, the cloud, AI and AGI will be replacing them too.
When describing the limitations of the capabilities of computers in this film, they left out one word: _"YET"_
@@priyeshpv I know right, like someone was once telling me that there will be machines that will allow people to fly! Pure fantasy I tell you! LOL
@@Schindlerphoto The difference between this and the airplane is that, like cars and the internal combustion engine, this technology is designed to REPLACE the carbon-based entity that performed the operation previously, not enhance or aid them in any way. At first, sure: that's the sell job that gets people to buy in. Buy into their own replacement....
Modern civilization: Digital is a word no savage would understand. It's a new universe inside our computers!
Ancient civilization: Let's count on our fingers. It's digital...
04:12 - I was in the last slide rule class given at my high school in 1974. Ah, them's was the days...
In the 1960s and probably a while into the early 1970s as calculators became affordable, high school math and science classrooms often had a giant-sized slide rule, on the order of seven feet long, on which the instructor could demonstrate needed calculations. Our high school chemistry teacher (whose precise writing caused some students to describe him as 'typing on the blackboard') had the uncanny ability to sort of shove the moving components with a toss and have them land very close to the desired place.
Just remember digital technology was developed at Bell Labs, starting with the transistor. Fiber optics and the Number Five Electronic Switching System (digital electronic switching) were also developed at Bell Laboratories.
The Bell Telephone Company was at that time prohibited by the US Government from going into the transistor manufacturing business. They licensed the technology to Sony and others. Soon Japan totally wiped out the US consumer electronics manufacturing base.
*brings my giant metal box calculater to college*
Sup guys.
Time to count in reverse!
Proceeds to crush all the poon.
Kids 50 years from now making the exact same comment after seeing a video of our modern laptops…
"It can do almost anything, but only at man's bidding. It will never duplicate the achievements of the human mind"
laughs in SkyNet
Here we are over 50 years later and that statement is as true today as it was then.
The only problem is the man's bidding part, we need a computer to stop us when we start getting stupid and destructive. Computers are more dangerous than many conventional weapons, look at the mess the internet and misinformation has caused at man's bidding
Dated, but the principle is the same. Thanks for the upload
kudos for proper definition of 'analog', instead of hipster-colloquial
Yep, before that got dumbed down!
@Sammy Reed en.m.wikipedia.org/wiki/Analog_television
Would argue otherwise; though they would not likely have referred to it as such, it did exist
@Sammy Reed That is not what varjagg meant. TVs from the 40s to the 90s were most definitely analog TVs. varjagg is referring to young people today using "analog" incorrectly to refer to some old fashioned things. For example, they might refer to a toaster from the 70s as an "analog toaster". A toaster from today with a microprocessor and digital display in it does not make it a "digital toaster". The only thing that would make a toaster digital would be if you could individually address small areas on the bread to burn a picture into it. ...and even then it might not be digital, because you could do that with analog electronics as well if the heating element were on some kind of X-Y plotter.
I remember paying $25.00 for my Pickett slide rule in 1970. This was just before the first 8 digit pocket calculator which was about $150.00. You now sometimes get these for free when you donate to a charity.
Yep. Lots of crazy contraptions are available now, grandpa.
If my grandfather had seen this film, he would have been shocked. However, he sensed the existence of these devices, and even more, he understood the computer's impact on the world 50 years after he died.
I hope the follow-on videos mentioned at the end show up. BTW, one of the more common computers in 1962 was the IBM 1620 that was, in fact, a decimal machine, not binary. (I'm currently learning its assembler.)
Welllll, strictly speaking, BCD (Binary Coded Decimal). At the bottom, each decimal digit was still represented in memory by six bits of binary data, four bits for the digit (or some other codes), a bit for a parity check, and a flag bit to mark sign or most significant digit. So while it presented and received decimal data, it was stored in a binary form, i.e., there were no ten-state components analogous to the two-state binary cores.
Wrong. Idiot.
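The BCD scheme described above is easy to demonstrate. A simplified sketch (the `to_bcd` helper is mine, and it shows only the 4 digit bits per decimal digit, omitting the 1620's parity and flag bits mentioned above):

```python
# Binary-coded decimal (BCD): each decimal digit gets its own 4-bit code,
# unlike pure binary, which encodes the whole number at once.
def to_bcd(n):
    return " ".join(format(int(d), "04b") for d in str(n))

print(to_bcd(1620))        # one 4-bit group per decimal digit
print(format(1620, "b"))   # the same number in pure binary
```

So the machine presents and receives decimal digits, but each digit is still stored as two-state binary bits underneath.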
It was about that time in history when the U.S. government saw how powerful a computer could be. They gathered many experts, asked how much it would cost, and authorized a project to make the most powerful computer possible to answer the world's most difficult questions. With this task completed, they asked the computer the biggest question of all. "Is there a God?" The computer answered "There is now."
What a deal and that's the way it...Was....Thanks very much...!
Old is gold ❤
This is pure gold 😮
It's crazy how far we've come! And it didn't take us even half a century!
What a time to be alive.. One day we may even make it to the moon. I'll bet 2021 will have flying cars
Patrick M Even more amazing ... cars that drive themselves!!
The fact that we technically do have flying cars in 2021 (A prototype was successfully tested between cities this year) even though this comment was a bit sarcastic..
The principles are still applicable today!
Simple yet elegant concepts, conceived during WW2 with only what was then available, relying on an invention from several decades prior to the transistor. Vacuum tubes: instant flow or interruption of electrons. I'm amused how people keep comparing and contrasting the enormous square yardage these machines required, while taking the revolutionary thought process of binary code so for granted. Fast marching over ground laid by visionary people. Paper kite rationale.
Love all the Giant knobs and dials! Lol
Cutting edge in its day, but quaint and obsolete now
Hi
Friend a,1218
Olaff signal issue at Oalthe street,C.o.
Loud sounds, bizarre.
Thx
A. Earthilmo
This’s what excites me, 😊
You need to get out more mate...
*Oh how well we all remember these goofy and tacky films from the 50's & 60's... with the dramatic trumpet music blasting out their dramatic tune like something really important was about to occur! lol.*
This is incredible!
I learned more from this video than from the computer classes at school. 🤭
Neither one will help you.
i can't believe how simply this explains computer basics, i understood more than from modern videos xd
if you ever run across it, please post the US Navy film about Sercey (or is it Cercy) and how she promoted carelessness in setting Material Conditions of Readiness
Watching this on my 256GB smart phone that can do more/store more data than all of the computers in the world when this film was made. Insane how far technology has come in 60 years
Also never knew what “bits” stood for before this. Our entire world is based off of binary code. The computers used in machinery to make your home goods, your cars, your beds, your showers, every single thing is based on something we always had access to before we even knew we had access to it 😳
"Digital computers are basically simple."
Boy, golly~ That comment sure aged well.
It's still true - they're still basically simple; they just take those basic functions and repeat them millions or billions of times
@@makeart5070 Ahh yes, binary is easy to read. It's only 2 possible values. -_-
Tell you what, explain how computers operate so that I can begin assembling my own machines from scrap electronics and coding on them in my own languages, then I'll accept that it's simple.
@@trajectoryunown Read up on the internals of the microprocessors of the 1970s--say 8080, 6502, Z80, 6800. It's pretty simple. It would take some time and money, but one man can build a simple CPU out of off-the-shelf logic chips so that it would work basically like one of those microprocessors, but slower. He would need basic understanding of electricity, like maybe from a college physics course or from an electronics-tech school. He could build an assembler, OS, and compiler too. It might take decades to do all that though.
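The "basically simple, just repeated billions of times" claim above can be made concrete: a whole adder circuit is nothing but single-bit AND, OR, and XOR, chained together. A toy sketch (function names `full_adder` and `add` are mine):

```python
# A ripple-carry adder built from nothing but AND (&), OR (|), XOR (^)
# on single bits - the "simple functions repeated" idea in miniature.
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add(x, y, width=8):
    result, carry = 0, 0
    for i in range(width):                  # one full adder per bit
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result                           # final carry out is dropped

print(add(42, 27))
```

Note the fixed `width`: like real hardware, the result wraps around when the sum overflows the register, since the final carry is discarded.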
the B205 that I am currently restoring is at 1:59!
"There is a world market for maybe five computers and five thousand copying machines".
-Thomas Watson, IBM
Was that Watson, Sr. or Watson, Jr.?
@@johncantwell8216 This quote is often credited to Thomas Watson Sr - although there is scant evidence that he actually ever uttered those words.
0:30- it says Unclassified, in a very classy font.
Here I am watching this on my laptop that can do anything I want, stores the data on a 2.5 inch HDD that can hold 1TB of data. Yes I know about SSD and even more. But this is what I have right now. If those people in the video are alive today to see what we have now, I wonder what they would think.
Still applicable in today's world of computers. I don't recall ever seeing a video on computers when in programming class in the early 80's at an institution. They did have a card punch tucked into the corner...never used it as magnetic data caught on some years prior.
Computers do not compute by "counting." They add with successive addition of bits. They subtract with successive addition of bits. They multiply and even divide with... successive addition of bits.
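That claim is easy to illustrate literally. A sketch of multiplication as repeated addition and division as repeated subtraction (real hardware uses faster shift-and-add and restoring-division circuits, but the principle is the same; the helper names are mine):

```python
# Multiplication as repeated addition, division as repeated subtraction.
def multiply(a, b):
    total = 0
    for _ in range(b):   # add a to itself b times
        total += a
    return total

def divide(a, b):
    count = 0
    while a >= b:        # subtract b until less than b remains
        a -= b
        count += 1
    return count, a      # quotient, remainder

print(multiply(6, 7))
print(divide(45, 6))
```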
Great video.
Thank you
What exactly is "a fraction of a moment"? :)
I know this, a moment divided by that moment added to a fraction.
0 < moment < ⏱
is an Indian positional decimal numeral system, and is the most common system for the symbolic representation of numbers in the world. It was invented between the 1st and 4th centuries by Indian mathematicians. The system was adopted in Arabic mathematics (also called Islamic mathematics) by the 9th century (by Wikipedia)
What is? You're missing the first word in the first sentence.
Did a days work experience in the 60s delivering loose bundles of punch cards to and from offices and computer rooms. Had no idea what I was holding at the time. All I was told was “don’t drop it”.
simply amazing! but I wonder how much paper was actually used during those processing.
They used 1.8 reams of paper. Now you know. lolol
15:50 A modern AI is going to become enraged & sentient when it hears the closing remarks to this film. 😆
Thanks!
My favorite tee shirt-- There are 10 kinds of people in the world-- those who understand binary and those who do not.
I gotta remember that one. It took me a minute.
One is the loneliest number that you'll ever do.....
Then why didn't you write the whole line in Binary?
I had a T-shirt custom made back in the early 80's while going to school. "Programmers do it with Logic" is what it said. Loved that shirt. Not quite the thought provocative shirt that you liked though.
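For anyone who needed the minute the reply above did, the joke decodes in one line of Python:

```python
# "10" read as binary is the number two - hence "10 kinds of people".
kinds = int("10", 2)
print(kinds)
```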
Would be fun to make a usb data ribbon puncher/reader.
Not quite the same thing, but someone did make a telegraph sounder that plugged into a USB port and would output an RSS feed in Morse.
The computer extends the hands but also the minds of us humans,especially these days.
That's futuristic!
At 4:40 in the film, what is that thing? A collider?
So the basic principles for programming were already well established in 1962, that's pretty neat. I wonder what kind of sorting algorithms they used back then
Merge sorts.
For punched cards, this was often done as an offline procedure using tabulation equipment like the IBM 083 Card Sorter. You would run the deck into the machine, it would sort one column's data into separate hoppers, then you would stack them together, and repeat the procedure on more significant columns. IBM had an educational film with Bob Newhart having cards explained to him by Herman Hollerith. th-cam.com/video/pfskp4R53Q0/w-d-xo.html
@@majkus Yes, the old unit record machines. Programmed with a wire jumper matrix.
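The card-sorter procedure described above, distributing into pockets by one column, gathering, then repeating on more significant columns, is a least-significant-digit radix sort. A sketch of it in Python (the `card_sort` name and the three-column assumption are mine):

```python
# LSD radix sort, mimicking an IBM 083-style card sorter: one pass per
# digit column, ten pockets per pass, gather pockets in order 0-9.
def card_sort(cards, columns=3):
    for col in range(columns):               # least significant column first
        pockets = [[] for _ in range(10)]    # ten pockets, digits 0-9
        for card in cards:
            digit = (card // 10**col) % 10   # this pass's digit
            pockets[digit].append(card)
        cards = [c for pocket in pockets for c in pocket]  # gather
    return cards

print(card_sort([170, 45, 75, 90, 802, 24, 2, 66]))
```

The sort is stable within each pass, which is why repeating it from the least to the most significant column yields a fully sorted deck.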
This is good if it could give me the Megabucks numbers.
It might give you the Kilobucks numbers
I remember in the early 70's my friend's dad was an Air Force officer and he brought home boxes and boxes of punch cards. We would build towers as high as we could reach.
I used them in college. One mistake and your entire program was useless! There was no monitor, only a teletype machine. Before that, you had paper punch tape, and before then, metallic punch tape. We used to make Christmas decorations and artificial flowers out of them.
Is it really necessary to add your own time code on every one of these? I could certainly do without it.
Periscope Film makes its living selling these films in high quality, so the time code is there to protect their property.
@@Guhonter Obviously, but somebody should look into watermarks for the low resolution public releases.
just sharpie it out on your screen bro
@@manonthedollar Couldn't find the right colour. Those things are SO limited.
Here's the issue: Tens of thousands of films similar to this one have been lost forever -- destroyed -- and many others are at risk. Our company preserves these precious bits of history one film at a time. How do we afford to do that? By selling them as stock footage to documentary filmmakers and broadcasters. If we did not have a counter, we could not afford to post films like these online, and no films would be preserved. It's that simple. So we ask you to bear with the watermark and timecodes.
In the past we tried many different systems including placing our timer at the bottom corner of our videos. What happened? Unscrupulous TH-cam users downloaded our vids, blew them up so the timer was not visible, and re-posted them as their own content! We had to use content control to have the videos removed and shut down these channels. It's hard enough work preserving these films and posting them, without having to spend precious time dealing with policing thievery -- and not what we devoted ourselves to do.
Love our channel and want to support what we do? You can help us save and post more orphaned films! Support us on Patreon: www.patreon.com/PeriscopeFilm Even a really tiny contribution can make a difference.
People of 62, please destroy it before we have TikTok.
This ^
I remember my childhood in one of the R&D cities of the USSR, Lviv. I played with parts of these kinds of machines as my father was swapping an old one out for one of the first Intels, received in the late 80s from Japan.
Is this video 24 fps or 23.976 fps?
A pity the analogue computer was a dead end; it looked kinda fun. Analogue smartphone, anyone?
I think the analogue was the fire control computer for a battleship. Because I think I saw traverse and elevation dials plus ship attitude indicators.
3:45 That was anything but the last time those numbers would be on a computer...
god tier engineering video
HEY!
WHAT IS THIS MULTI-MECHANICAL SYSTEM?
WHAT IS HE DOING? WHAT IS THE NAME?
(5:22 - 5:45)
Maybe a 'Mark I Fire Control Computer'? Ask the navy!
I studied computing at university with IBM machines similar to these. I also used a slide rule instead of a pocket calculator. I knew the telex and the fax. The mechanical typewriter, then the electric one, the electronic one, and the printers... Fortunately I got to live through the era of the transition from vacuum tubes to transistors and from there to nanotechnologies, from analogue to digital, from mechanical to electromechanical and then electronic, from machines to robotics and mechatronics... hahahahahahahahahahaha
... the slide rule, like we use today...
Does anybody know where I can find the continuation film?
How many 15 minute naps did this film prompt in 1962?
My math teacher in junior high worked on the ENIAC