I always thought that EBCDIC stood for "Erase Backup, Chew Disk, Ignite Cards".
HAHAHAHAHAH!@!!!
lol
You win the internet today, sir.
Who doesn't just love prof. Brailsford :)
Dimitar Zitosanski - IBM sales people maybe ?
Dimitar Zitosanski yeah, it really makes my day when I see that a video with him has been uploaded. He's a great storyteller! I wish I could meet him in person, but I don't know when I'll ever be able to go on a trip to the UK. Does anyone know if he ever comes to visit the US? Maybe to give talks or anything like that?
In the early seventies our IBM mainframe got a memory expansion of one megabyte, which doubled its memory. To celebrate that IBM offered a dinner to the whole crew of our computing center, probably around 60 people. Those were the days…
Wonderful story! It makes me think - as another commenter has pointed out - that my guess of $10,000 for 1 Meg is probably a massive understatement.
+ProfDaveB Wouldn't inflation have something to do with it? $10K in the 1970s is $53K in today's money, though that still doesn't sound like a dinner for 60 people in markup...
I fear not. The more I think about it, my imagined $10,000 per Meg was in actuality probably $110,000. I'm just waiting for someone from IBM to confirm this. That sort of pricing easily equates to a luxury dinner for 60 people.
No, I fear it wouldn't. The more I think about it the more I am convinced that the real cost was probably a mind-blowing $110,000.
Prof Brailsford's stories are just the absolute best. My goodness, he has forgotten more in a day than the rest of us have learnt till now. His stories are an absolute gift to us all and thankfully now contained here on the internet for all to enjoy! :-D
Brian Streufert I would love to meet him in person. Even just having lunch with him and chatting about history would be awesome.
I remember writing a service for a web interface, in 2000 or so, that connected to an IBM mainframe to fetch credit reports and scores over a protocol completely implemented in glorious EBCDIC.
I came for a bit but stayed for a byte instead
Word!
Double Word!
Only took a nibble of time, too..
WURRRRRD!
"Is it a bug or a feature?" For IBM those are synonyms.
cipher315 it's neither.
It's better.
It's billable. 😉
You mean for Microsoft ...
C allows "??" for curly braces. C++ allows ""
Digraphs and trigraphs are amazing
This is great for obfuscation ;)
As I recall, one of the Obfuscated C Code Contest entries won "best abuse of the rules" for using too many trigraphs.
Trigraphs have been removed in C++17.
I remember being on a project where I wrote code on the mainframe which then had to be formatted so we could load it into Sybase databases on the AIX boxes. This required some bit manipulation, so I used IBM Mainframe C with trigraphs.
In a way, IBM still does this, but more subtly. You purchase a half-million-dollar mainframe, but they deliver you a million-dollar mainframe. A few years down the line the mainframe starts to have some performance issues and you contact IBM and they say "No worries! We can double its performance with no downtime! Just type in this command!" And you do, and suddenly the extra half million dollars of hardware magically turns on and a bill arrives.
That was always the case, and was never a big secret. In a way, it is actually a good solution, since you only pay for the performance you need.
That's asinine! How come they aren't hacked yet?
Because if you own a half-million dollar mainframe, you don't go hacking on it if you want to keep your employment. I'm sure there's also some anti-tamper trickery going on where if you try to hack the function it will lock you out of something that requires a personal visit from IBM to fix.
I really love the way this professor tells stories about computer history. Heck, he could talk about ANYTHING and make it sound interesting. I wish I could meet him in person.
Nobody ever got fired for buying IBM...
I was just coming to say the same thing. :-)
And as its only other people's money you are burning ...
Fast forward to today and you'll nearly lose your job for simply considering them. Wasteful bunch.
matt b It depends on the age of the board.
4:30 About the curly braces problem: I think this is not that well known, but the C preprocessor recognizes digraphs and trigraphs that map to some of the fancier characters, in case they can't be typed or whatever. Instead of { you could have <% or ??<, and instead of } you could have %> or ??>, although these days trigraphs are ignored by default, at least by GCC.
Wow, I didn't know that. That's f*cking amazing. Seriously.
Daniel Dawson I remember Borland having trigraph conversion as a separate program outside the compiler ca. Borland C/C++ version 4.
True, but nobody uses them. In fact, C++17 has gotten rid of them.
Well, I certainly don't either, but... Digraphs or trigraphs? There might be a difference in how they're regarded. Also, C != C++. There are a few differences at the basic level (i.e. setting aside the obvious mountain of things C++ added) besides this stuff, so it's not too surprising that C still implements something while C++ has dropped it. Bear in mind, this was in response to EBCDIC lacking curly braces. Hopefully almost nobody uses EBCDIC these days either.
Wikipedia states '{' is 0xC0 and '}' is 0xD0.
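For anyone who has never seen the workaround in the flesh, here is a minimal C sketch (purely illustrative; the file name and flags are made up, and modern GCC ignores trigraphs unless you ask for them, e.g. with -std=c89 or -trigraphs):

```c
/* A curly-brace-free C program written with trigraphs, which are
 * substituted very early in translation (even inside string literals,
 * which is why ??/ below works as a backslash):
 *   ??< is {   ??> is }   ??( is [   ??) is ]   ??/ is a backslash
 * C95 also added the milder digraphs <% %> <: :> for the same problem.
 * Build with something like:  gcc -std=c89 -trigraphs trigraph_demo.c
 */
#include <stdio.h>

int main(void)
??<
    int a??(3??) = ??< 1, 2, 3 ??>;
    printf("first element: %d??/n", a??(0??));
    return 0;
??>
```

After trigraph replacement this is just an ordinary array declaration and printf, which is exactly why they were a workable (if ugly) escape hatch for character sets missing { } and [ ].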
I remember there was a CDC Cyber machine which had an extra jumper in it. If you removed the jumper, the machine turned into the next model up. Of course, CDC did all of the system maintenance, so the local technicians had a note on their calendars for when scheduled maintenance from CDC was due to put the jumper back in place so they didn't get charged for "buying" the next model up.
I loved the "EBCDICity" term!
I had problems with clients uploading files to my company's Linux FTP server which we would just forward to the mainframe for processing. We would blindly push files to mainframe datasets using FTP, and thankfully the FTP client we used to do this was capable of re-encoding the file from ASCII to EBCDIC. However, we'd get the occasional file that included a character with no equivalent or an Extended ASCII character. Not knowing how to convert the character, the FTP client would transmit the character as a raw byte (i.e., in Binary/Image mode), which in EBCDIC could be something the job or its underlying COBOL program(s) that would consume the dataset couldn't handle and we wouldn't even know about the problem until the job bombed out. Fun stuff, I tell you!
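To make that failure mode concrete, here is a rough C sketch of such an ASCII-to-EBCDIC step with only a handful of CP037-style entries filled in; the table, names and substitute character are illustrative only, and a real transfer would use a complete table or an existing converter rather than anything hand-rolled like this:

```c
/* Sketch of the failure mode: translate an ASCII buffer to EBCDIC
 * (CP037-style values), refusing to pass unmappable bytes through raw.
 * Only a handful of table entries are filled in here; a real job would
 * use a full 256-entry table or an existing converter such as iconv.
 */
#include <stdio.h>
#include <string.h>

static unsigned char a2e[256];   /* 0 = "no EBCDIC equivalent known" */

static void init_table(void)
{
    int i;
    a2e[' '] = 0x40;
    for (i = 0; i < 10; i++) a2e['0' + i] = 0xF0 + i;   /* digits 0-9 */
    for (i = 0; i <  9; i++) a2e['A' + i] = 0xC1 + i;   /* A-I        */
    for (i = 0; i <  9; i++) a2e['J' + i] = 0xD1 + i;   /* J-R        */
    for (i = 0; i <  8; i++) a2e['S' + i] = 0xE2 + i;   /* S-Z        */
    a2e['.'] = 0x4B;
    a2e[','] = 0x6B;
}

/* Returns the number of bytes that had no translation (0 = clean). */
static int ascii_to_ebcdic(const unsigned char *in, unsigned char *out,
                           size_t n)
{
    size_t i;
    int bad = 0;
    for (i = 0; i < n; i++) {
        if (a2e[in[i]] == 0) {
            fprintf(stderr, "byte 0x%02X at offset %lu has no mapping\n",
                    in[i], (unsigned long)i);
            out[i] = 0x6F;               /* substitute EBCDIC '?' */
            bad++;
        } else {
            out[i] = a2e[in[i]];
        }
    }
    return bad;
}

int main(void)
{
    /* A stray extended-ASCII byte (0xB5) lurking in an otherwise clean record. */
    const char *record = "ACCOUNT 42, BALANCE 3.50 \xB5";
    unsigned char out[64];

    init_table();
    int bad = ascii_to_ebcdic((const unsigned char *)record, out,
                              strlen(record));
    printf("%d byte(s) could not be translated\n", bad);
    return bad ? 1 : 0;
}
```

The point of the sketch is only the choice at the "no mapping" branch: flag or substitute the byte up front, rather than shipping it raw and letting the COBOL job discover it at 3 a.m.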
I like listening to prof. Brailsford, it's like listening to your grandpa telling stories about the war, except cool ;)
Same!
I'd love to see that IBM document mentioned. As I recall, that kind of stuff was - for years - a destroy-before-reading kind of secret.
I could literally listen for hours to this
EBCDIC is still used today in current versions of both IBM i (Power Systems) and z/OS (Z mainframes). I wouldn't call these machines simply "replacement hardware" for your 30-year-old software; that's a bit of an oversimplification.* I will admit that backward compatibility has always been at the core of IBM's mantra, and that these machines (along with their traditional operating systems) are every bit the descendants of their System/32 and System/360 ancestors. Today these systems handle ASCII or Unicode just like any other computer so long as the data is properly identified, but EBCDIC-ASCII conversion errors are not uncommon (and can be no end of frustration) when transferring text files off to other platforms.
Oh, and a few terms professor Brailsford might have been looking for:
IMS: Information Management System, IBM's hierarchical database for mainframes. Still available on z/OS today.
IPL: Initial Program Load, IBM's term for booting the system.
(Although, honestly we're talking far enough back that "regen" might actually have been a thing and I just don't know about it.)
* They're more like the 3rd-generation children of immigrant families who still speak their mother tongue at home.
IBM sounds like the original Clippy. "It looks like you are trying to transmit a document. Would you like me to mangle it for you?"
3:26 -- "Ebcdicity". Most definitely a word I've never heard before today. LOL!! That was great! :-)
I must write this in my little black book of Shit-I-have-garbage-letters Scrabble words, right next to the page of Polish curse words.
This guy is amazing. Post more videos of him!
Love listening to this guy!
Had a hard time following this story, but the parts I did catch were pretty amusing.
RAM issues reminds me of my laptop I had 10 years ago that I decided to upgrade from 1 gigabyte to 4.
That laptop could really seem slow and struggle at times, but what happens? 3 gigabytes more RAM and suddenly it's doing things in 2-3 seconds that it used to spend 40-50 trying to accomplish.
Never underestimate the importance of RAM I guess. XD
I once had a 486SX/25 laptop that had had a memory upgrade from 4MB to 8MB. Yeah. Megabytes. I got it second-hand with Windows 95 on it, and it was so slow I downgraded to MS-DOS 6.22 and Windows 3.11, not that Windows got much use either way. I mostly used DOS text editors and DJGPP to compile code, plus a bit of DOS gaming.
I'm a simple person: I see prof. Brailsford - I click :-)
In my undergrad days the first C compiler we had on the Amdahl 470 mainframe (freshly upgraded from an IBM 370/168) caused lots of character set issues. One of my profs was a fan of BCPL, which had similar issues on the mainframes but worked fine on a VAX.
The IBM database was IMS, and it was not relational but hierarchical. Cheers, Russ
Yes!! Thanks Russ!
I worked on a System 370 when I was in the U.S. Army in the 1970s, and I remember my commanding officer informing me that hierarchical databases were the wave of the future. Now, I work with Oracle databases that are, of course, relational. Cheers, Russ
There's DB2 and CICS also.
Well, CICS is a transaction monitor, not a database. IMS wasn't a database either, but a transaction monitor too; IMS DB was the hierarchical database (which preceded DB2 by many, many years and was, afaik, initially developed for the moon landing and later turned into a general-purpose product).
You go back a bit further than I do, Professor, but can still recall writing subroutines or functions, if you will, to translate from EBCDIC to ASCII and back again to allow data to go back and forth between the IBM mainframe world and everything else. Great fun! ;-)
I love hearing him tell these stories.
Is this the reason why PDF dictionaries have the syntax << ... >> instead of { ... }? I wouldn't be surprised. This explains a lot. I was always wondering why Adobe would duplicate a symbol when there are so many other symbols available. Why else would you prefer << ... >>?
Great stories 😁
In IBM-speak it's not "regen", it's "IPL", which is short for Initial Program Load, and I think the acronym for the system that was running too slowly was probably IMS, a hierarchical database system.
This brought back a lot of memories. I don't mind.
Also casual reference to "nobody ever got fired for choosing IBM" reference :D
Professor: "So the American government went to IBM to come up with an encryption standard, and they came up with"?
Student: "EBCDIC!"
:)
“EBCDIC-ity” - hilarious! This is a great video. Thank you.
Being a Family-Guy fan, I had slightly weird images in my head for "Meg-in-a-box".
I love IBM so much. Even their shortcomings are wonderful.
Are there extra bytes for this episode?
Yes, but they're all EBCDIC rather than UTF-8, so they can't be uploaded to YouTube :P
I thought IBM first got into ASCII when they launched the PC - I believe that pre-dates AIX by 3 or 4 years.
Yes indeed! Thanks for reminding me of this ....
The IBM 360 and, I'm pretty sure, 370 families could use ASCII internally. There was a bit in the PSW (the ASCII bit) that, if set to "1", would cause the machine to use ASCII. I spent years staring at the front panels of 360s and 370s, and I never, ever saw the ASCII lamp in the PSW display glowing.
I didn't know that! Thank you!
i just got a IBM ad before this video 😀
I work on mainframes, love downloading files and them not converting from EBCDIC :?
Cieron Powell That happened to me just yesterday! Someone at work sent me an XML file they thought was garbage.. As soon as I saw all the 0x40's, I knew where it came from.
So what is a 'Meg-In-A-Box'?
Kees Reuzelaar Sounds more like monthly rental than cash purchase price!
Talking to oldschoolers, it could've really been just two jumper wires that enabled the already present extra RAM. Just the same with the two wires that needed cut on some ancient S/360 mainframes to supercharge them (think of it as a multiplier lock on a modern CPU). In other words: unlock already present features.
2nd-world clone manufacturers found these out when copying the system.
"The moral of all this..." is that open source is the way
This reminds me that Stephen Fry covered IBM's business practices in the 5th episode of his podcast "Great Leap Years"...
Ah, just hearing the letters and numbers IBM 360 makes me sick. No, actually they taught 360 assembly in university in California in the 80's and I enjoyed it, thanks for the trip down memory lane
I would love to see a montage of people saying things akin to "It's a Meg in a box" that are only super important in the context of the period but later become the brunt of jokes.
Converting between EBCDIC and ISO8859-1 is something I do many dozen times per day. But it still surprises me how many people have no idea that different code pages exist, and just assume UTF-8, EBCDIC and ISO8859-1 are different names for one "universal" flat file format that works automatically across different platforms and applications.
Christian Magnus Lie
Because UTF-8/16 is practically THE universal code page. I mean, who on this planet seriously needs this proprietary thing called EBCDIC? Yes, everyone using mainframes for historical purposes, but who else? Having a common universal standard is something pretty useful and economic, as we have found out in the past.
Frank Schneider There is an EBCDIC equivalent of UTF-8, it's used by EBCDIC ports of perl. There's also a Chinese multibyte form of Unicode, which has compatibility with older Chinese DBCS systems.
I can't wait until we have UTF-8 on all our systems and platforms, but I expect someone working as a developer to at least know about the existence of other code pages.
The current schedule for us to migrate away from the IBM mainframe is 3-5 years. Management have said for the past 15 years that we will be off z/Series in 5-8 years, but this time I think the schedule should more or less hold. All new applications are being developed on a Docker-based platform with Kubernetes.
The fact that the men and women who manage our mainframe are also getting close to retirement is a huge factor too.
I actually like AIX... it's a bit like the Amiga, the Mac and others where one company controls the _whole_ stack from hw to sw and you get good integration and abilities built in so you can depend on them and they are not optional. AIX took some getting used to, but as time went on it felt a lot more forward thinking than its contemporary unixen.
You know what's funny? I was speaking with an IBM representative today, who said that pretty much everything on IBM's cloud platform is open-source, starting from industry-standard bases (like Kubernetes, Docker, OpenWhisk), with no platform lock-in. Today's IBM is the antithesis of yesterday's IBM.
Also, IBM is the only company in the world that grants people free access to a quantum computer (Microsoft and D-Wave are paid-only for the real thing, last time I checked): IBM Q.
Laurie O Wait...is there actually a REAL, functioning quantum computer now?? Did I miss something? I thought quantum computers were only experimental and they could only run like 1 or 2 very simple instructions.
We have quantum computers in the sense that we can do computation using quantum computers. Currently, we can only do some (actual) operations on (5) qubits, but not all operations imaginable, and can't do other things like save quantum state for later (ie variables, disk storage), and send (lots of) quantum information over large distances. You can check out IBM-Q to play around on the real thing for free
The database was IMS/DB. may it rest in peace.
If the little matter of I/O hardware is ignored, there is nothing in the IBM 360 instruction architecture that means it has to use EBCDIC. Indeed, there were non-EBCDIC operating systems which ran on later versions of that architecture. Amdahl had their version of Unix, called UTS, that ran on S/390. Later, IBM ported Linux onto the Z-Series.
So there really is nothing in the instruction set of S/360 which is sensitive to character sets (as opposed to the peripherals). It had binary and packed decimal as data types, but for the rest it was agnostic. All those SS instructions used to move text around didn't care what was in those bytes. It was purely an issue of peripherals.
Also, it's worth noting that EBCDIC wasn't confined to IBM. It was the native form used on peripherals on ICL's 2900 series mainframes running VME (albeit in an annoyingly slightly different encoding to IBM's). ICL had also inherited the S4 architecture from English Electric Computers, who licensed RCA's commercially disastrous Spectra mainframes, which were an attempt to copy IBM's S/360 instruction set whilst "improving" those bits associated with the privileged level. Not only that, but SDS, Burroughs and Unisys used EBCDIC on their machines.
Of course, the thing that locks people in is not so much the hardware as the software and the format stored data is held in. I don't know to what extent this was an issue at the time of S/360, but I bet there were a lot of companies that had lots of data on magnetic tape, and even derived from earlier forms. IBM started out as a manufacturer of tabulating machines; it had punched card formats in its soul, as did a lot of its customers. The very way character files were presented was often as punched card formats, even when they were purely held in electronic form.
Speaking as somebody who had to program IBM and S4 assembly code, EBCDIC was a nuisance when it came to character handling. The non-contiguous alpha character set was a real pain to vet (lazy programmers would treat it as contiguous on the basis that there were no valid EBCDIC codes in the gaps). More sophisticated ones might use the TRT instruction, maybe even using the arcane execute instruction.
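The "lazy programmer" trap reads roughly like this in C; the hex values are the usual CP037 ones, and the sketch simply counts how many non-letter code points a single contiguous A-to-Z range test lets through:

```c
/* The "lazy programmer" trap: treating EBCDIC upper case as one
 * contiguous range. In CP037 the letters occupy three runs
 * (A-I = 0xC1-0xC9, J-R = 0xD1-0xD9, S-Z = 0xE2-0xE9) with other
 * code points sitting in the gaps.
 */
#include <stdio.h>

/* Fine for ASCII, wrong for EBCDIC: one contiguous 'A'..'Z' range. */
static int naive_is_upper(unsigned char c)
{
    return c >= 0xC1 && c <= 0xE9;
}

/* A correct test has to honour the three runs. */
static int ebcdic_is_upper(unsigned char c)
{
    return (c >= 0xC1 && c <= 0xC9) ||   /* A-I */
           (c >= 0xD1 && c <= 0xD9) ||   /* J-R */
           (c >= 0xE2 && c <= 0xE9);     /* S-Z */
}

int main(void)
{
    int c, false_positives = 0;
    for (c = 0xC1; c <= 0xE9; c++)
        if (naive_is_upper((unsigned char)c) && !ebcdic_is_upper((unsigned char)c))
            false_positives++;
    /* The 15 code points in the gaps (0xCA-0xD0, 0xDA-0xE1) slip through. */
    printf("naive range test accepts %d non-letters\n", false_positives);
    return 0;
}
```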
Since I saw Professor Brailsford's 8-bit video, I've been waiting for EBCDIC to rear its head.
Also, I've never heard EBCDIC pronounced in speech before; I had always read it in my head as e-be-ke-dic.
A bit simplistic... We did good things as well :)
Ebcdicity is a wonderful word!
The past is a foreign country; they do things differently there. :)
4:29: One way to get around a lack of curly braces is by using the trigraph sequences ??< and ??>. See K&R A12.1.
I learned about BCD a while ago, and I don't understand why they used it at the time. From what I've heard of it, you're wasting 30% of your nibble compared to hexadecimal. Why would people use that in a time where every bit counted?
Converting from hexadecimal to decimal requires arithmetic processing. In the early days of computers, both CPUs and storage were expensive. It's a trade-off....
Gordon Richardson Ah, I see. So you're using more storage, but you're saving time on calculations. Thank you for explaining in two sentences what my professor couldn't explain in an entire lecture :)
Back then scientific (number crunching) computers used fixed word length pure binary storage and registers, since their problems required lots of calculations, while business and accounting computers used memory to memory calculations on BCD digits which were a subset of the text characters available, because business problems involved reading and writing files with many records and repeating the same short computing sequences on each record. So accounting jobs could use a much slower CPU, going digit by digit in memory, because BCD to binary for those few steps, then binary back to BCD and formatted text, didn’t save enough time in computing, and the CPU was mostly waiting on even the fastest reading and writing. But for scientific work, you would read in a smaller set of numbers, convert them to binary, and repeat many iterations of arithmetic, then convert the final results back to decimal and text, and sometimes the tape drives were waiting on the CPU.
And somehow, the computing world got started grouping bits by 3s and 6s and their multiples. The invention of hex as a way of representing bits in groups of 4 and 8 was a great example of thinking outside the (octal) box, as well as building machines that could do both kinds of arithmetic.
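To make "digit by digit" concrete: packed decimal (the S/360 flavour of BCD) stores two decimal digits per byte plus a sign nibble, so a hex dump of the field reads as the number itself, at the cost of some density. A purely illustrative C sketch, not the actual mainframe instruction behaviour:

```c
/* Packed decimal sketch: 1234 becomes the bytes 0x01 0x23 0x4C
 * (two digits per byte, trailing nibble 0xC = positive sign), so a
 * hex dump of the field reads as the decimal number. Compare binary,
 * where 1234 is 0x04D2 and you need arithmetic to see the digits.
 */
#include <stdio.h>

/* Pack a non-negative value into 'len' bytes of packed decimal. */
static void pack_decimal(unsigned long value, unsigned char *out, int len)
{
    int i;
    out[len - 1] = (unsigned char)(((value % 10) << 4) | 0x0C);
    value /= 10;
    for (i = len - 2; i >= 0; i--) {
        unsigned char lo = (unsigned char)(value % 10); value /= 10;
        unsigned char hi = (unsigned char)(value % 10); value /= 10;
        out[i] = (unsigned char)((hi << 4) | lo);
    }
}

int main(void)
{
    unsigned char buf[3];
    int i;
    pack_decimal(1234, buf, (int)sizeof buf);
    for (i = 0; i < (int)sizeof buf; i++)
        printf("%02X ", buf[i]);      /* prints: 01 23 4C */
    printf("\n");
    return 0;
}
```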
Also, the benefit of having numbers in a dataset (the mainframe equivalent of a file) or in program crash report be directly readable and understandable by humans cannot be overstated.
Edit: oops, brainfart.
Both BCD and binary numbers will give you rounding errors (but different ones) and for financial applications people prefer the errors you get with BCD. For example, 0.20 in decimal can't be represented exactly in binary (it is 0.001100110011...). Do you really want your bank not to be able to say "you have exactly 20 cents in your account"? For scientific applications, on the other hand, the extra density of binary is very desirable and the numbers found in nature are not more likely to be exactly represented in one base than in another.
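The 0.20 point is easy to demonstrate; a tiny C snippet, assuming IEEE 754 doubles:

```c
/* 0.20 has no exact binary representation, so the stored double is a
 * nearby value, and repeated addition drifts; scaled integers or
 * decimal (BCD) arithmetic avoid this for currency.
 */
#include <stdio.h>

int main(void)
{
    double balance = 0.20;
    printf("%.20f\n", balance);     /* 0.20000000000000001110 */

    double sum = 0.0;
    int i;
    for (i = 0; i < 10; i++)
        sum += 0.20;                /* ten 20-cent coins */
    printf("ten times 0.20 == 2.0? %s\n",
           sum == 2.0 ? "yes" : "no");   /* "no" with IEEE 754 doubles */
    return 0;
}
```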
I thought the thumbnail was a Bill Wurtz video
Replace IBM with Cisco and you have a modern day story! :)
Interesting... I never heard about this IBM wall. Sounds very similar to the modern-day Apple wall. Let's hope that one day the Apple wall falls like the IBM wall has.
goeiecool9999 60,000 people got fired in one day when the IBM mainframe department ended. Foxconn has already fired 60,000 people in one day. These are really things to hope for. Let them eat cake!
The IBM wall has only really partially fallen. The IBM Z series (used to be IBM 360/370/390) and IBM i OS on IBM Power hardware are both EBCDIC and both making money for IBM with new hardware models continuing to be released. You may laugh, but I've spent most all of the last 30+ years working on both the 'kernel' and in systems software on the IBM i/Power and its predecessors.
Isn't Linux on the Z Series using ASCII encoding? And couldn't Linux and the EBCDIC-encoding operating systems z/OS, z/VSE, z/VM, and z/TPF all run as guest operating systems on the same hardware? And does running Linux on the Z Series mean you cannot use block-mode 327x terminals ?
Thank you for offering your content in 4k resolution!
Jason Tobin lol what exactly do you get out of 4k on content that doesn't benefit from it?
Lifelike detail. It also enhances their already high-quality content.
A Meg in a box? Oh hot damn 🤣🐸
They sold that Meg because the original machine was underspecced (though overpriced). I remember a site which was doubling the speed and memory of their 4300, knowing that it still wouldn't meet their needs. They had not gone live on any systems yet - all live work was on their old ICL 2960.
ASCII: Because no other country will ever be good enough to want a computer.
We run an IBM OS/400 line at work (emulated). They really only support COBOL, CL, and RPG as far as system languages go. Yes, it uses EBCDIC; it's awful.
It's an environment that's been coasting along with blinders on since the early '80s.
They'll claim more languages and tools but everything is broken or half implemented outside COBOL and RPG.
This is missing much of the point. The problem with moving from EBCDIC to ASCII is not primarily to do with the hardware or peripherals, but with maintaining application compatibility. If somebody has written an application program which assumes EBCDIC then it will break when presented with ASCII data and, in many cases, it will break even if re-compiled for ASCII and presented with ASCII data. That's because there are often many assumptions within the programs, such as the relationship between, say, upper and lower case, or the contiguity of encodings, and a host of other things. IBM's customers would not have thanked them for breaking all their existing code and making their archived data formats unreadable without a conversion exercise.
As it happens, the IBM mainframes can (peripherals aside) happily run with ASCII. There were versions of UNIX that ran on IBM mainframes (and compatibles, like Amdahl and Hitachi machines), and those all used ASCII. It would be impossible to produce an EBCDIC version of UNIX, as it would break every program. Yes, there were peripheral issues in a few areas, such as card readers and directly attached terminals, but a bit of code conversion deals with that. ASCII vs EBCDIC is not primarily a hardware issue; it's one of software.
It's also worth noting that IBM were far from the only computing company that used EBCDIC. Pretty well every company that used punched card formats did the same, and that's because the format had its roots in maintaining the physical integrity of punched cards which were used in tabulating machines. That meant coming up with encoding systems that didn't put too many holes too close together on a punched card, and hence those weird discontinuities. That means back in the days when ASCII was being developed, the great majority of computing companies used a version of EBCDIC.
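One concrete instance of the hidden assumptions described above is bit-twiddled case conversion: in ASCII the upper-case letters have bit 0x20 clear, while in EBCDIC (CP037-style values below) they have bit 0x40 set, so code hard-wired to the ASCII trick keeps compiling but quietly stops working. A minimal C sketch, not taken from any particular product:

```c
/* A typical hidden encoding assumption: bit-twiddled case conversion.
 * ASCII:  'a' = 0x61, 'A' = 0x41  ->  clear bit 0x20 to upcase
 * EBCDIC: 'a' = 0x81, 'A' = 0xC1  ->  SET bit 0x40 to upcase
 * Code hard-wired to the ASCII rule recompiles fine on an EBCDIC
 * system and then silently produces the wrong characters.
 */
#include <ctype.h>
#include <stdio.h>

static unsigned char ascii_style_upper(unsigned char c)
{
    return (unsigned char)(c & ~0x20);   /* only correct for ASCII letters */
}

int main(void)
{
    unsigned char ebcdic_a = 0x81;       /* 'a' in EBCDIC (CP037) */

    /* The ASCII trick leaves 0x81 unchanged instead of giving 0xC1 ('A'). */
    printf("ASCII-style upcase of EBCDIC 'a': 0x%02X (wanted 0xC1)\n",
           ascii_style_upper(ebcdic_a));

    /* Portable code lets the C library worry about the execution charset. */
    printf("toupper('a') on this machine: 0x%02X\n",
           (unsigned int)toupper((unsigned char)'a'));
    return 0;
}
```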
If memory was still that expensive we'd have much better software
i m loving it!!
Is this why Ascii85 was created? This overlap between EBCDIC and proper ASCII?
I wonder if those 6 versions of EBCDIC can be divined from the mapping tables in IBM ICU.
If there were only 64 chars reliably, is this the time to bring up base64 encoding?
Nice BIOS you got there. Be a shame if someone ripped it off.
My English parser seems to be a bit broken at the moment ... I can't make out what Brailsford is saying at 1:20. I rewound it a few times and it still sounds like "but that's up a lot, we had six wrathions of love". Halp?
"But there were 6 versions of that" Apologies - more enlightenment once I've done the subtitles :-)
Bahaha, I prefer his interpretation. Love the video as always Prof!
1:21 "???... we had six versions of that for historical reasons ..."
Six versions for historic reasons
It always amuses me how people on the internet feel the need give the answer to a question several times.
xxd -E will give you a hexdump which assumes EBCDIC input.
IBM was the Apple of the 60's and 70's
"EBCDICity"
This year I had the questionable pleasure of working with an IBM mainframe that used EBCDIC. Most of the conversions to/from ASCII were already done, but it was annoying nonetheless. The letters of the alphabet are not even in one contiguous run; they're split up...
The alphabet was split into three sequences in EBCDIC encoding: A-I, J-R, S-Z.
I work with mainframes and we have to strip control characters/special characters from text to make it compatible. It's an absolute nightmare. That said, ASCII isn't much better.
Six versions of EBCDIC is no worse than ASCII with n different code pages like PCs used
I need to call my grand dad
On a side note, as of yesterday (2018-05-08) Microsoft's Notepad supports not only CRLF, but also bare CR and LF line endings. "Did they eventually get the message and come properly into the modern world? Yes they did, but it took time..."
I guess what Professor Brailsford is saying is that Microsoft just discovered UNIX.
I expected a demonstration.
typical IBM attitudes.
Why only one armrest?
I didn't notice, but my guess would be only the hand using the computer mouse needs an armrest.
I think the left armrest broke off and he never replaced the chair.
Hehe at least in theory you could interchange base64 between EBCDIC and ascii
It's the original "Office".....
Yep, IBM had a good thing going before microcomputers crashed their party.
The EBCDICITY
Raised on EBCDIC... 76 79 76
This is not rocket science - you simply create a converter - not complicated
too much ebcdicity in IBMs
@2:15 What like Inferior But Marketable, I've Been Misled, Itsa Bout Money?
COBOL makes me want to throw up
'MURICA
I prefer it the other way round: Me in Meg's box.
222 likes
2 dislikes
BaronVonTacocat
402 likes
2 dislikes
Probably from 2 IBMers.
"Ebcdicity"? LOL
When the US Government went to IBM for an encryption algorithm they received... EBCDIC.
9th.. 😂😫