The main problem with working in COBOL is the working environment. Banks are not known for a casual work environment. There seems to be an entire ecosystem of programmers in that sector that mostly don't interact with the rest of us. I bump into it once in a while, but it's like an entirely different programming world.
I am 73 years old. I have coded in RPG, RPGII, COBOL, Assembler, IMS, IDMS, DB2, ISAM, VSAM, CICS and JCL (both DOS & OS) on small, midsize and large mainframes. I enjoyed almost every minute of it. The hardest COBOL program I had to maintain was written by a German-speaking programmer. All of his data names were a minimum of 20 characters in German with MANY consonants. One or two of the letters towards the end of the data name would change to designate a different data name. It was a real trip as I didn't speak any German and these names meant nothing to me. In many cases, I redefined the record in working storage so I could use English words that made sense to me. Fun times. I am retired now and sometimes I miss the work but then I will lie down until that feeling goes away.
In the 60s, my mother learned COBOL and Fortran. She was immediately hired by First Citizens Bank in Raleigh, NC. She went from being a receptionist at a dentist office to the bank's IT specialist, since programmers were so rare in those days. She then learned to drive and bought a new 1965 Mustang.😊 I'm in my 70s and still have her Mustang. Thank you, Mommy. She was the Smartest Mommy ever.
In 1987, when I was applying for my first programming job, I went to the Chicago Public Library and checked out an article called "In Defense of COBOL", whose précis went something like "since the death of COBOL has been predicted for the last 22 years of its 24 years of existence, we may wish to investigate what properties have led to its continued existence." By 1989 the WSJ had a front page article about how a company pushed their mainframe out the window (stupid, stupid, stupid), and yet 37 years after reading that article z/OS and COBOL are very much around for the most IMPORTANT organizations.
1. COBOL programmers are available to employers who are willing to pay for them. 2. People can still learn COBOL. 3. The reasons ANY old code base survives are a) it works, why change it? and b) when you have millions of lines of legacy code, not only is it sometimes difficult to know what it does, but also WHICH of those lines of code actually ever execute. Dead code can be as much as 2/3 of a codebase. Additionally, not only does old code sometimes lack documentation, but also tests, so risk exists that if you change something HERE, you may break something THERE, without knowing it.
About 40 years ago, I read a short article in a computer publication. It was about a programmer who wrote a routine that modified itself during execution. In the comments he wrote: "When I first wrote this code, only myself and God knew how it worked. Now only God knows."
@@spadeespada9432 "Is it possible to copy the code and run simulations?" Probably, but that still doesn't really help anyone understand how the systems work, which is the actual problem. What the companies who still own COBOL systems really want is either to replace the ticking time bomb of spaghetti code, or to patch, extend, & modify it for changing business requirements. Both require being able to read & understand the code, NOT just knowing how the black box responds to various input states. Plus, the input states can get VERY gnarly.
Have been coding since 1969, all on IBM equipment. I programmed 407 accounting machines (plugboard wiring), then the 1401, 360, 370, 4300 series, 3090, S/390, and Series/1. Wrote for BPS, TOS, DOS (and its variants such as DOS/VS, VSE, VSE/SP, etc.), VM, and OS (and its variants MVS, MVS/XA, etc.). Been writing in assembly since 1970. Converted COBOL code from DOS to MVS. Even recreated COBOL code from core dumps because the original COBOL source was lost. I am sorry for people who never experienced hands-on work with a mainframe. It is truly an experience. I can do things in 16K of memory that NO other language can do. I wrote a COMPLETE accounting system (A/P, A/R, G/L, PAYROLL, INVOICING) in 32K (including the operating system). I even got COBOL to dynamically call another COBOL program, something IBM said was impossible. As of 2012 that interface I wrote in 1981 was still running! To me, there is the IBM mainframe, and then there are all the rest.
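(For the curious: a dynamic call just means CALLing through a data item instead of a literal, so the target program name is resolved at run time. A minimal sketch; the subprogram name POSTTXN and the parameter are invented for illustration:)

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CALLER.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-SUBPROGRAM  PIC X(8) VALUE "POSTTXN".
       01  WS-AMOUNT      PIC S9(7)V99 COMP-3 VALUE 123.45.
       PROCEDURE DIVISION.
      *    CALLing through a data item (not a literal) makes the
      *    target program name a run-time value.
           CALL WS-SUBPROGRAM USING WS-AMOUNT
               ON EXCEPTION DISPLAY "COULD NOT LOAD " WS-SUBPROGRAM
           END-CALL
           GOBACK.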
@@jaimeduncan6167 Exactly the same for me. I learned COBOL and Assembler on an IBM 3033 under OS/VS2 during my vocational training starting in 1981, and have had practically no contact with any other programming language since then.
I remember learning COBOL, SNOBOL, ALGOL, and GASP at the computing center on North Campus at Univ. of Michigan, but I never used them much. I did learn assemblers (big IBM 360, little Mostek 6502), Fortran, Basic, LostInStupidParentheses. We used punch cards we placed into the reader, pushed the button, and they spit out the other end. Some guys doing Inst. Social Research would come in with boxes and boxes of cards stacked on a hand truck and would take a half-hour feeding the machine. You checked the monitor to see when your job was finished and waited for the printout lady to call your job number. You never knew if the program worked until you unfolded your printout. BTW Nuclear plants are still running DEC PDP-11's
I am not a Cobol programmer, but the day my boss asked me to write a Cobol program, with my background in Assembly and C I managed to "learn" the necessary part of Cobol in two days. Anyone with basic programming skills can learn and program Cobol in a matter of days. The BIG problem is to MODIFY or FIX an existing program made by someone else. But this is true for every language.
How do you learn it? I've looked into it and it's a language for mainframe computers. So it's not really something you can trial and error teach yourself without the hardware. I can't just sit here on my PC and "learn" it in my free time.
@@falkenjeff Since you need a mainframe to run the Cobol code, it's not something you can do at home. I don't recall seeing anything that can run Cobol on a desktop. Not much reason to. You need to learn it in a classroom where they have access to a mainframe that can run it. The problem I remember having was that if there were errors in your code, you wouldn't find out until they ran the code, which didn't happen in a timely fashion. It could also be difficult to understand someone else's code if they were lazy and didn't put helpful comments in the code.
@@falkenjeff To learn any language, the best method is to have a real problem to solve: it is better than making "something" just to practice. After that, I generally read the sources of other similar programs, try to understand how they work, and take the parts I need. The manual of the language is used as a reference to "decrypt" the programs I am drawing from. Of course, I learned *some* Cobol in 1979 on a Sperry Univac mainframe, but there was Cobol for MSDOS too. I specifically remember RM-Cobol, widely used.
My aunt retired from her programming job in the 1980s. In 1998 she took a consulting gig for a small company to "get ready for Y2K", updating their 30 year old mainframe software. Word got out, and the small companies all started hiring her. Work a week, go on to the next one. As the deadline got tighter, the offers got richer, but time was the issue. She'd tell a potential customer "I think I need 3 days, but it'll take me 1.5 days to travel both ways and I only have a 4 day window in my schedule - unless you wait till after the new year". "I'll fly my private jet to the city you are in now, wait for you to finish, take you direct from their door to mine, and take you to your next job after, that'll save you 20 hours of airlines and airports". That became the new standard. She'd call the next company on the list when she was wrapping up, and they'd have transport at the door when she walked out. Limos, charters, private planes from Lears to Piper Cubs, even helicopters normally used for logging. She finished her last job on January 1st 2000 and retired again.
A great story. And it's a good reminder of why there wasn't a Y2K problem: because we coders eradicated it. Your aunt did her duty in that battle!
LOL. Yes, the great Y2K meltdown. Everyone in my I.T. department had to "WORK" all night on Dec 31st 1999 until 06:00 on Jan 1st 2000 because of the great Y2K threat. Basically we had a company paid-for party as we ate pizza and drank (non-alcoholic) soft drinks because our programmers had done their jobs over the previous six months and eliminated any date codes that might have messed things up. You know, now that I think about it, isn't that where the toilet paper hoarding started that was so predominant in Covid-19? WTF is it with hoarding toilet paper during a crisis (whether perceived or real)????
I worked at a law firm that used an ancient docketing database that was also used to fill in the blanks on thousands of different WordPerfect forms, such as letters, pleadings, etc. In 1999, it occurred to them that it could not possibly work after 12-31-1999. The original company worked on a Windows replacement but couldn't get it to work properly, and couldn't guarantee a fully tested product in time. So the boss asked me if I could replace the software with databases in Paradox (Corel's answer to Microsoft Access). I told them I was certain I could, as I knew how the old software worked and how the documents worked with it. I was able to finish at the end of October.

Because the firm wasn't allowed by their liability insurance to operate without the software, I had to devise a way to let the attorneys and paralegals use the old software until quitting time on Friday. And then I had to export all the data from dozens of data tables, massage it a lot, import it all into the Paradox database, and have it all ready for them to use on Monday morning. It was a difficult job.

The original software was written before DOS existed. The software didn't just have 6-digit dates, with only 2 for the year; it actually had entire dates encoded into just 2 characters. I had to reverse engineer that mess. Asperger's and a high IQ helped me. I figured out that the software would take a regular date and subtract from it the date the company was founded to get a 4-digit number of days. Then it took the first 2 digits, pretended they were hex and converted them to decimal. Then it found the ASCII symbol corresponding to that and put it in the program's database. It did the same to the other 2 digits. This gave dates that looked like a pair of random ASCII characters: hearts, smiley faces, etc.

The data was also in serial format, meaning there were no field or record markers. I had to use a DOS program to pull up each table, line it up in rows equaling the record length I had found in the manual, and resave the file. Then I pulled it into WordPerfect and ran some macros I wrote for each table to put in the field markers at the proper locations and then convert those stupid dates. This was so massive, I had to do it on a computer unattached to the network and not running any software but the WordPerfect. I did the final switch over the weekend, and everyone was happy.

BTW, I didn't even get a raise that year. So a couple of years later, I "retired." I was 50 years old and tired of being micromanaged.
Your aunt, like myself, was very honest!! Y2K WAS A MYTH!! CREATED TO SELL COMPUTERS AND CREATE CONSULTING JOBS. ALL IBM COMPUTERS COULD HANDLE DATES IN THE 2000'S, AND MOST PC'S. THAT IS WHY SHE COULD GO TO A COMPANY, TEST THE SOFTWARE AND REASSURE THE COMPANY THAT THEY WERE OK (ALL IN 1 TO 3 DAYS). DISHONEST CONSULTING COMPANIES, INCLUDING MAINFRAME MFGS, WOULD BLOW THOSE 1-TO-3-DAY JOBS INTO 3-YEAR JOBS. (THE FEDERAL GOVERNMENT SPENT HUNDREDS OF BILLIONS ON Y2K, WITH BOTH CONSULTANTS AND BUYING NEW COMPUTERS!! IT'S ONLY TAXPAYER MONEY.) I WENT AROUND TURNING DOWN Y2K JOBS FOR 2 YEARS, UNTIL THE NY POST DID AN ARTICLE ON ME SAYING I WAS THE "MOST HONEST Y2K MAN IN NYC." SO IN THE LAST 6 MONTHS OF 1999, I STARTED ACCEPTING Y2K JOBS AND DOING WHAT YOUR AUNT DID.
I had a few friends who were similarly making big bucks patching old COBOL code for Y2k. I could have jumped on that wagon myself, but doing web dev was a lot more fun and I figured in the long run it would be worth more to stay on that track. But as the day of reckoning approached I was sure something major was going to go wrong, and am still surprised nothing did. I had read that a sizable percentage of systems still hadn't been patched, so my wife and I avoided our customary holiday travel that year. Apparently the truly crucial things actually got fixed first - which seems like a miracle.
In 1998, a COBOL programmer, frazzled with the Y2K crunch decides to have himself frozen until the year 2500. He goes in for the procedure and it's successful. As he begins to wake up, he sees all these people standing around him looking at their devices and dressed oddly, and he says "Wow, is it 2500?" The lead scientist says, "I'm sorry, no it's not 2500. It's actually 2098 but it says here on your file that you know COBOL ..."
Most of the fixes will either still work or be unnecessary. Not sure the same is true for the similar event coming up soon for Linux 😊 By the way, it was fun @@coshy2748
1992: I worked with a guy who was the ONLY person in the company who could actually program an AS400 mainframe. In itself, "Meh"... until you learn that Europe's top 37 insurance companies used us, so if we went down, so did they. And it progressed: we got commercial banks and eventually the NHS and Social Services... all dependent on an old AS400, and ONE guy who knew how to work it. I personally was never a programmer; I was the guy in the floor or ceiling fitting cables and setting IP addresses. But it always terrified me just how weak the system was: if we lost that one person, SOOOOOO much would go with him. And he refused to train anyone else, so... Well, he died a millionaire. I guess that's all any of us want... My bank account? Well, I don't have one. It went that well... pfff......
An AS400 ‘mainframe’? That really is bigging up the AS400. AS400s were General Systems Division - small business computers and distributed systems. AS400s _dreamed_ of being part of the Data Processing Division!
I seriously, honestly, want to avoid a "word war". Sounds like you have serious experience, but may I ask: corporate, private or government? Not being rude. I've done all three, but... All I can tell you (and feel free to "nicely" attack me on this)... in the 1990s Virgin, AXA and GESA, and HMV to some extent, were using AS400 tech. Even then it was outdated. Seen those pictures of train stations crippled with Win2000 crash screens? Same thing.
@@damightyshabba439 : AS400 technology evolved, as did mainframe technology. If the companies refused or were reluctant to upgrade, that's on them. I live and work in the USA. The companies I've worked for had internal auditors. What they said went (which I disagreed with; the power went to their heads). Anyway, the companies I've worked for would not have tolerated this man-child's attitude. There would have been cross-training and backup personnel in place. Again, the company where this man-child worked allowed him to get out of hand - that's on them.
@@deepsleep7822 Agreed. Sometimes the company is too scared to bring in someone new, in case it upsets the "God". But it has to be done, from a business point of view.
Admiral Hopper was certainly heavily involved in the development of COBOL and lectured on it around the world. She related the following story to us at a Bell Labs symposium. She was giving a lecture in Tokyo and was doing the usual meet and greet afterwards. After a bit, she realized that she had no way to get back to her hotel. She was having difficulty communicating in English to the remaining people on what she needed, when she had an idea. She grabbed a marker and had them gather with her in front of a whiteboard. She then wrote out a COBOL program whose primary line was MOVE Grace FROM Conference TO Hotel xyz. They understood, and got her back to her hotel. She had an incredible mind. I feel fortunate that I got to meet her in person.
Back in the 80s, Grace visited the University of Alberta to give a talk. At the reception she mentioned that a number of the CS faculty had been involved in the early development of COBOL and wondered why they didn't make a bigger deal about that. One friend of mine quipped, "Perhaps they're ashamed of it!" Another friend of mine pulled the first friend aside and explained to him just who Grace was.
Thank you for mentioning GH *AND* properly addressing her by rank! Kudos. ...I was about to comment on the conspicuously low attribution (and the timeline being off).
COBOL is clear and straight-forward. The staying power of the code is mostly due to inertia. It epitomizes the maxim "if it ain't broke, don't fix it".
@@SahilP2648 Well, yes, you can do that, if you have a few hundred lines of code. But these COBOL behemoths that run banks and insurance companies and whatnot have _millions_ of lines of code.

Let's assume that you'd get a generative AI to convert all the code. Would you, as the bank's CIO, trust that brand-new, AI-generated code to replace the old & faithful _mostly_ working code which has been around for half a century, and risk dooming the bank to absolute collapse if nothing works? Also, who would test & evaluate that code? Other generative AIs? :) You see where the problem is: at some point, you'll have to trust AI providers with your entire business logic, and hope they come up with a "better" solution (in the sense of using brand-new, latest-generation programming languages with extra bells and whistles).

Someone on this very long comment thread pointed out the obvious: you could, for instance, split the code into its modules (all hundreds of thousands of them!) and go through each one separately: get a module, convert it to some other modern language using generative AI, thoroughly test the result, deploy it, then move on to the next module; wash, rinse, repeat, until every single line of code has been fully converted. But that's essentially "rewriting the whole code from scratch" without the _main_ advantage that comes from actually rewriting the code, which is to rethink some of the old things that probably aren't necessary anymore or that can be done more efficiently thanks to contemporary technology, methodologies, and innovations.

How long would _that_ take? How much would it _cost_? (Even assuming a "free" generative AI which would not only convert the code but also provide test suites at each step, for each module, also for free.) And if something goes wrong in that million-line code conversion, something which escaped even the best of the best generative AIs and error-checking AIs, who is going to be able to "fix" things? Generative AIs are not yet the magic wand that turns a multi-million-dollar, high-risk project into something that can be done next-to-free in a few hours...
My mother earned a degree in mathematics in the 1950s and worked as a programmer. Whenever I see pictures of those mainframes showing the women who made them work, I think of her. Thanks for sharing.
What you didn't mention was that managers didn't trust programmers to not do stuff like "round off to the nearest penny and put the remainder in my bank account" stuff, so they wanted to be able to actually *read* what programmers wrote.
Because it's not true. You can do that a lot more easily in COBOL. All it takes is a wrong PIC statement. The Washington DC government famously had a guy do just that. He hard-coded it to put the extra cents into his personal bank account. Like clockwork this happened week after week. Then he retired and left the code in. It ran for years like that. Then he forgot and closed his bank account. Then they started looking and found his code. It was all in COBOL. I think he got 10 years in jail and lost his government pension. All kinds of numbskull things used to happen with COBOL and mainframes, such as using a flat file or SAM file for a customer database instead of an actual database. I used to fix things like that. After a while nothing surprised me.
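(For readers who don't know COBOL, the "wrong PIC statement" trick works because a MOVE to a picture with fewer decimal places silently truncates; whatever doesn't fit just disappears, and a skimmer can pocket it. A minimal sketch, with invented names and values:)

       IDENTIFICATION DIVISION.
       PROGRAM-ID. TRUNCDEM.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-INTEREST  PIC S9(5)V9(4) COMP-3 VALUE 123.4567.
       01  WS-POSTED    PIC S9(5)V99   COMP-3.
       01  WS-SHOW      PIC ZZZZ9.99.
       PROCEDURE DIVISION.
      *    The MOVE silently drops the low-order decimals:
      *    WS-POSTED ends up as 123.45 and the .0067 vanishes.
           MOVE WS-INTEREST TO WS-POSTED
           MOVE WS-POSTED TO WS-SHOW
           DISPLAY "POSTED: " WS-SHOW
           GOBACK.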
Just because something is old, this doesn't mean it's bad or should be replaced. Newer is not always better. This applies in many areas, not just computer programming.
Generally true. However, if it's unmaintainable, that can become an issue, as regulations change and code that has to adhere to those updated regulations has to change along with it. If it ain't broke, don't fix it only applies until the context changes and its operation is now considered "broke", then the "unmaintainable" part comes into play.
@@jamesbond_007 OK, but you can’t hang that on the language. If a decision is taken to move to a new language, everything needs to be migrated as part of the transition, and if it’s not, the blame goes to the management that botched the move by not doing it wholly.
@@jamesbond_007 I worked at a nuclear plant in the mid 2000s that still used DOS for their working computers for repair orders, work orders, etc. They probably still use it for all I know, as I retired in 09. That was a real PITA for some of us to use that were not really trained in that language. Trying to navigate the system was very unwieldy as I was a mechanical superintendent up from the trades as a pipefitter. It might not be as bad if ALL the nuc plants used the same language, but they don't, so every time I went to a different job I never knew what I was going to have to try and adapt to, and as the outages often were so short that I might only be there for 2-3 months at a time, the continuous changes were just one more added stressor to an already highly charged environment.
@@harleyb.birdwhisperer You're absolutely right! There are a huge number of factors that go into these dusty deck programs, not the least of which is that good programming practices were not known, or even if some of them were, they weren't often followed, so the resulting code is difficult to understand. COBOL has a GOTO statement and it was not discouraged for much of the time code was being written in it (I *think* there are flow control constructs that obviate the need for so much usage of GOTOs, but I am not sure).
I used to work at a company that decided to do a significant reorganization. Part of it was getting rid of employees within 5 to 10 years of their retirement, offering up to 60% of their salary to not come work anymore. However, they were still allowed to get another job anywhere, part-time or full-time, without losing their early retirement money. You can kind of guess what's coming: a part of the older employees who left were Cobol programmers. All of a sudden the company realized they had lost the majority of their Cobol programmers, which became a massive problem. They had to re-hire those guys as external consultants. Not only were they in the position to earn more than they made before, but on top of that they still received that monthly retirement. A nice win-win for the old geezers, a bitter (and costly) pill to swallow for the company.
I used to work with British computer company ICL (before it was bought by Fujitsu). They had a joke about management getting rid of the wrong people and having to hire them back at consultancy rates. "Whose are those Jaguar XJ6s in the car park?" "Oh, they belong to the people made redundant last year."
*Grace Hopper* developed the first compiler A-0 in 1952 and the first human readable computer language FLOW-MATIC in 1955. She is also referred to as "Grandma COBOL". Grace Hopper is my all time programming hero ♥️
I watched one of her live lectures while I was serving in the Navy back in the '70s and '80s. Also, Adm. Hopper was one of the few people who would stand up to Adm. Rickover. She wouldn't take any of his shit. 😆 Very interesting person. The quickest way to get on her bad side was to say, "That's the way we've always done it."
She was fantastic, but the first compiler was not A-0. There is some debate over priority: whether Autocode was running before she published. In any case, outside of the USA, A-0 is generally not considered the first actual implementation of a compiler. This does not make her less important; after all, Lorentz found the transformations that bear his name before Einstein did, and I have zero evidence that Hopper had any idea of Autocode. In fact, we know she published first.
Having started my computer programming career learning COBOL and progressing through many other languages the language is just a detail. Problem solving logic is the difficult part to master. Once you can start thinking like a computer processes, the coding is just syntax. Some of the newer languages give you more masterful control over many processes and are more than enough reason to learn them and possibly shift to the newer environments.
An important point is that Cobol uses decimal fixed point arithmetic while all popular languages use binary floating point numbers. Financial people get really upset when the cents don't match exactly what they expect.
Not even quad (128-bit) binary floating point will give you the same results as decimal math. Note that in both cases the results are wrong, just wrong in different ways. The financial industry likes to be wrong in the same way as their old decimal calculators. A few years back the C standard added an option for decimal floating point, and several different processors are adding it as well.
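(To make the difference concrete, here is a minimal COBOL sketch, with invented names, using packed decimal (COMP-3): adding ten cents ten times stays exact, whereas a binary double summing 0.1 ten times ends up holding 0.9999999999999999:)

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CENTS.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *    Packed decimal: every cent value is represented exactly.
       01  WS-PRICE  PIC S9(7)V99 COMP-3 VALUE 0.10.
       01  WS-TOTAL  PIC S9(7)V99 COMP-3 VALUE ZERO.
       01  WS-SHOW   PIC Z(6)9.99.
       PROCEDURE DIVISION.
           PERFORM 10 TIMES
               ADD WS-PRICE TO WS-TOTAL
           END-PERFORM
      *    WS-TOTAL now holds exactly 1.00.
           MOVE WS-TOTAL TO WS-SHOW
           DISPLAY "TOTAL: " WS-SHOW
           GOBACK.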
Idk, in Russia we had no banks before 1991. We have the biggest fully online bank in the world, and it was founded only in 2006. So we are lucky not to have any COBOL code in banks. And somehow we don't have problems with incorrect counting of our cents. I don't think that is the real point.
@@paulinchannel3104 I mean Russians are smart. There's a way to remove the floating point errors by doing more operations per transaction, so they most likely must have done that.
I was a Cobol programmer once, not by choice though: a colleague resigned, and I had made the big mistake of mentioning on my CV that I knew it, and my boss remembered that. It is not really the language but IBM mainframes that make it live so long. IBM developed several families of mainframes starting with the S/360, and they are all compatible with each other: the z Series machines of today can, in emulation mode, boot the OS and run programs written in the 1960s. The US tax system is written in a combination of IBM mainframe assembler and Cobol, originally running on a mainframe from the 1960s. Today they are still doing your tax returns with the same code, on a modern IBM mainframe.
The same code on another, more powerful machine: the most efficient way to upgrade. In the industrial environment, the CPUs inside even the small controllers are invariably 'old-fashioned', chosen for their stability and well-understood minor bugs.
Just having the ability to code in COBOL got me a US work visa back in 1999. I was 20 and in the USA for a holiday when a friendly guy in a church and I got chatting. It turned out he worked for a company that needed more COBOL coders due to the number of financial systems they had to check for Y2K. It was fun work, I got to stay in the USA a lot longer, and I ended up with a really nice reference / work experience for my resume.
A COBOL programmer was inundated with requests to patch systems ahead of Y2K. After he finished all the requests one week before the new year, he decided to cryo-freeze himself for a short time to get away from all the Y2K commotion. Somehow Y2K glitched the machine's database and he was kept frozen past his due date. Years went by until some people thawed him out and woke him up. "Did it work? Did we survive Y2K?" were his first words. An important-looking guy moved closer and sheepishly answered, "Well, yes. In fact, it is now the year 9999, and according to your record, you know COBOL, right?"
My first full-time job (~1972) was as a COBOL programmer. I haven't used it since then, but I still respect its run-time efficiency, standardization, etc.
IBM mainframes are the silent workhorses in the economy. They sit there and run year after year. The z in z/OS truly means zero down time. All that COBOL code regardless of what any new gen says is truly an investment-much of it made decades ago. I work in IBM midrange (35 years now). Our systems run a billion dollar enterprise. The only worry we have is the lack of talent.
Did DMA allow call from cobol forgo params ? I think there was something like that with the DMA chip. Or maybe I needed tiny values so some uh bad flags allowed data to stay... meh probably some screen thing on later micro
@@ocdtechtalk Windows yes; Linux, meh, depends. Also OpenBSD, PCBSD, and a few more. Silicon Graphics IRIX 6.2 was my favourite OS. Ran many mining engineering departments on it.
@@ocdtechtalk The AS400 running the OS400 operating system, not one of the other available ones, was horrible for graphics and science, great for crunching all of the numbers that businesses with finances and inventory and production needed. The later models would phone IBM themselves that something needed replacing. We were an office hours shop. I would get a call from IBM service as I walked in the door in the morning. Hadn't even got my coffee yet. "Your AS/400 called in last night and said this one particular hard drive will probably fail soon. When can we come out?" Great systems, with great redundancy in design.
75, Unisys Cobol programmer for 30 years (with IBM before hand) and still working with it. I have learned C# and Python, which can do things COBOL cannot. But COBOL is easy and reliable. I got back into programming with the run-up to Y2K, working in two shops making the conversions. We made a lot of money making program conversions but _I expect to really clean up when _*_Y3K_*_ comes around._
@@meep.472 Mostly it's embedded systems that need hardware updates to support 64-bit time values. Like your router, for example. Otherwise, any modern-day PC is going to be fine (way before 2038).
@@keith77mn77 lmao no. Most devices use a standard called Unix time, which started in 1970 as an integer value increasing once every second. Traditionally this is a 32-bit value, which is set to overflow in 2038, so if any device uses Unix time and is not updated to 64-bit by that point, it's gonna think it's 1970 again, and that can create a lot of issues.
Cobol is still used for a reason: there is nothing to match it in efficiency and speed in many important fields. And Cobol does support graphical user interfaces... simply build a Cobol backend application with an API layer, call it from a front-end, receive a response, and present the data any way you desire :). When you log in to one of the larger banks and perform a transaction, for example, several real-time Cobol modules will be running/executed on a mainframe somewhere, and the data is then sent and presented to you via browser/app... I work as a Cobol/Mainframe developer...
Anyone who says "efficient" and "speed" in the same sentence with COBOL doesn't know anything about speed or efficiency. Sorry, that's the truth, it is neither fast nor efficient. It's what was available at the time when financial programmers didn't like FORTRAN. There's a reason why its nickname is CROWBAR.
@@chribm Well, in reality I see millions of transactions daily, passing through hundreds of Cobol modules on mainframes... It's extremely fast and efficient... Why do you think 70%+ of all the world's business transactions run via Cobol on mainframes? If it wasn't efficient it would never still be used... Obviously it's not the language alone; you also need the infrastructure. Do you really think the biggest banks in the world can't afford to recruit people competent enough to make intelligent decisions about tech?
COBOL is one of the first programming languages that I learned back in the late 70s. I used it to write inventory control programs on my company’s minicomputer. It was easy to learn, although not very compact. Your video reminded me of the fun times writing code on the special programming paper, typing it into the computer, then making the minicomputer crunch reams of data entered by our office staff. You also reminded me of the huge removable disks on which we used to save our data.
I am 78 and spent much (but not all) of my working career “back in the day” . (1968-2007) programming computers. My first language was FORTRAN, but 99% of my career was spent writing in 1401/1410 and 7080 Autocoder (maintained by “patching” actual object/machine code), ALC, COBOL (compiled and link-edited) , and JCL (along with GDG’s) to write countless modularized programs and systems. Some of my early (object/machine code) programs were “bootstrapped” from paper tape or punched cards to execute (there was no “operating system” then). How tedious and exacting everything was in the beginning. I’ve read and debugged many a core dump. Did I mention tedious yet? I loved the power I felt when I made that huge machine do exactly what I told it to. Now I happily tap on my iPad and use my PC marveling at how much easier it is to use a computer. “We’ve come a long way, baby!” Now instead of core dumps we get “the blue screen of death.” It was a great ride!
@@ronaldlee3537 Yes. And they needed to replace that functionality, like, yesterday. Their dependency will get even more desperate the longer they procrastinate. We started years before Y2K to get ready for it. But back then we had a deadline. "Cracks" started showing up in the mid-to-late 90s because dated reminders generated for future events were not "rolling forward" into the 2000s properly.
Mainframe... COBOL, C, Assembler: the past is the future. I'm 36 now. I started working in IT in 2011 as a manual QA tester, then learned to develop, first with test automation and then development with .NET Core C#, Java, and Ruby. An older man who was my boss when I started working for a contractor company in Monterrey, México for Citibank Banamex, where we were doing a migration and integration from an AS400 mainframe system to new tech, told me: "This language and tech is so expensive, you should learn it to earn a lot of money." I did not believe it until, a couple of years later, I saw the jobs with really high salaries.
Love your account of this. I started out with Autocoder on a 1401 in college followed by COBOL in the upper-level classes - and was hired by a major petroleum company to write COBOL for them. I moved on to other things in the IT area (data management, IT Auditing and DBA work) but it was a good, albeit stressful living. Burned out at 59 and took early retirement and haven't regretted it.
I’m 66 and a retired COBOL programmer. AI should be able to update it now except for COBOL spaghetti code with non-standard magnetic tape processing and hidden calls to special routines. Yes it’s still there unchanged and untouched for 50 years. It sits there waiting to trap some naive AI or person attempting to update it.
LLMs will never be able to write effective programs, for the simple reason that they are incapable of reasoning about the "code" (tokens, really) that they spit out. They are doing statistical inference on a corpus of code already written by human beings. Think about that for a moment. There is no dynamic reasoning in statistics. None. I am always amazed that anyone expects LLMs to do better. They are good for a very limited domain of things. But anything truly creative and constrained by logic and reason? No.
Long enough ago for me to have had 2 careers since leaving the bank data industry back in 1996, having worked in the COBOL, Tandem and JCL environment for 10 years. Spinning tapes and all; I left just as PCs were starting to interface as a front end. Long enough ago for me to do a 4-year degree in Graphic Design and work in that industry while also renovating houses, then become a builder full time for the last 15 years. Truthfully, I seriously considered taking it back up again over Covid lockdown but thought, jeez, so much time has passed....
Much less than half a career ago for me. I was doing COBOL refactoring and modernization only about 9 years ago. I made some nice change over a very short project. When it's not broken, you don't need to fix it, but you do need to maintain it to keep it making you money. It would take a team of 50 developers and 100 verification engineers four years to completely rewrite that COBOL code, and there is a non-zero chance they would screw it up. Paying someone to make minor updates in working, native COBOL to tap into data flows for new analytics is worth it many times over by comparison.
I work in local government. We still have plenty of systems that run on the mainframe, and those are typically COBOL. In fact, I know of jobs that are begging to be filled but they can't find anyone who will do COBOL programming
I used to be an IT professional til I retired. If it ain't broke, don't fix it ... that is the motto. So the huge amount of COBOL out there will survive until it MUST be changed. I had to program only a bit in COBOL, thank god! The majority of conversion happens when a platform no longer supports something. Like the DOS conversion at my company. When something is extinct, then change happens. But anything new.... that is when change happens.... so we see C++, Delphi, Python, C# etc... And when you do systems programming on a mainframe.... you typically HAVE to use Assembler to access supervisor routines. So you do what you must do. Nice video!!!! I enjoyed it a lot!!!
Two quotes: 1) "I know COBOL and I won't starve." C. Pitts, Control Data Corp., circa 1979. 2) "What runs on the mainframe? Civilization." Roberts, International Business Machines, circa 1987. Both statements are as applicable today as they were 40 to 50 years ago, from Shanghai to lower Manhattan and all points in between. Anyone who can code can pick up COBOL in about 1 hour (I did it on the Chicago NW line one afternoon). The institutions you listed don't even include the largest ones depending on COBOL. Why COBOL? It still works and doesn't even need recompilation. Book of record work usually requires continuous availability, and those grandmums and nainai from Chengdu to Mumbai, Joburg, São Paulo and Des Moines expect those ATM and credit cards to work 99.999% of the time, even after a disaster, and that means COBOL running on big iron, even in Waigaoqiao, Pudong.
No. Its the cost to move to new systems and the fact that so few people still know cobol and even less want to work with them. Its easier to contract a COBOL dev than hire a team of them to work with the backend and applications teams, because no one likes the weird, old, hairy COBOL devs, and there are like 3.
In 2000, Citibank reported that they had spent $500,000,000 on COBOL programming to fix Y2K problems. And they said if these weren't fixed, they would have "lost the bank".
@@m3talHalide-rt2fz If that's the case... then why aren't we on a new system that is better so we don't have to deal with the old, hairy COBOL 'grammers?
@@m3talHalide-rt2fz Fundamentally it's a "book of record" application domain, limited by the number of hours in a day. Competing platforms fail to scale effectively (they lack the physical hardware to cluster with a single clock and a shared queue), and they lack commercially available continuous availability and reasonable DR.

Book of record applications have to get through all their books in a day; every account has to be updated (there are some noncommercial exceptions). Big iron addresses this by scaling through shared queues, common clocks and shared storage. Employing that same scalability technology, financial institutions literally stretch the processing within a data center out across multiple physical data centers within a single geographical region, to eliminate data centers as a SPOF. Then they layer on top of that nearly continuous recovery against regional disasters through DASD replication technologies, with automated recovery of operations in well under an hour. Other platforms simply can't compete with cost-effective off-the-shelf solutions on the basis of Recovery Point Objective and Recovery Time Objective.

Lastly, you could try to port some of these 50-year-old applications to new platforms, but where are the long-term savings? The platform you port them to might be out of date by the time the conversion is complete; start over, this time we might get it right? You'd be porting our most important applications onto a treadmill. The application code written 50 years ago runs just fine, completing the tasks that modern customer-facing applications direct. The small financial institutions that maintained their own data centers long ago outsourced those operations, where they run as a partition in a client VM, where the COBOL code happily does its work day in and day out.
COBOL was the primary language of my college days. I spent a number of years working in it, even on desktop applications. Of course no discussion about COBOL is complete without mention of its mother, Admiral Grace Hopper. A true giant in history.
In college I studied Applied Computer Sciences. In my first year, back in 1997, we had a course on COBOL. During lectures and practical exercises, I didn't really understand it all. When the first exam approached, I started studying. All of a sudden I had an epiphany and I understood it all. Like in comic books, when a character has an idea a light bulb is drawn above their head 💡; it was such a moment. I still remember it to this day, that's how special it felt.
For me, it happened as a sophomore in high school, when I was reading the manual for the 6502 that was in our school's new Apple ][ computers. It was like a bell rang and it all clicked. That summer I disassembled and hand-annotated Apple DOS, and a few years later got a couple of engineering degrees, got a job in high tech and never looked back. I did go back and thank that high school math teacher who let me just run with it.
For me it was in 1968, when I stumbled upon the RJE room where, without even knowing what a computer was, I helped students fix their Fortran programs just by reading the error messages. Leaving the RJE, I went to the book store and bought the 100-page (50 sheets) Fortran manual (written by an IBM engineer who said everything exactly once). I took it home on Friday, read it over and over, at least 3 times from cover to cover, and was a Fortran expert on Monday. Truly an aha moment!
I graduated from A.S.U. in 1982. I was a programmer/analyst for 20 years. I used to know over a dozen programming languages (BASIC, COBOL, GPSS, SAS, DYL280, Focus, ISPF Dialog Manager, REXX, Visual Basic 3/4, Powerbuilder, Unitech, MUMPS, not to mention Syncsort, Merge, SQL, Excel), and knew my share of JCL (Job Control Language). COBOL was a great, easy to understand procedural language.
COBOL was “old” when I was in college. The biggest issue with cobol systems is how a set of programs will share files. So knowing the language is only the first step. Understanding the dependencies and interactions is the bulk of the problem.
Yeah, I was going to say, I've seen textbooks on it in public libraries. It doesn't really look that hard to learn. Like I could probably learn it in a few weeks.
@@xenopholis47 Some people I worked with spent over a year trying to reverse engineer 200 Cobol programs used for credit card settlements. There were so many programs reading and writing to the same files under different circumstances that they were never sure they got all the interactions.
@@michaellatta yes exactly - VSAM / seq datasets, and not just the COBOL but also JCL (& what disposition each job has the files open under), and anything else the JCL is doing to the data outside COBOL.. (and that's just batch :)
I am retired now, but started programming for a DOD facility in 1966. Over the years I’ve programmed in 37 different languages. I’ve never written a line of COBOL, however it is so easy to read that I’ve debugged a number of COBOL programs for others. I’ve also showed a number of programmers how to make their COBOL more efficient. The prime example being a new application which, when in test, demonstrated that the nightly batch process would take about 10 days to run. While others were in a state of panic, I showed them how to fix two issues. That night, the test demonstrated that the daily nightly process could be completed easily in less than three hours. COBOL - easy to code, easy to misuse.
Yeah. One big problem I ran into maintaining COBOL programs was when programmers didn’t know any better and wrote sequential searches with large amounts of data. Some programs I modified went from running for 4 or 5 hours down to minutes. Wish I’d gotten bonuses based on how much I saved departments on their internal billing. I could have retired 25 years earlier, hehehe.
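(The classic fix, for anyone curious: keep the table in key order and replace the sequential SEARCH with a binary SEARCH ALL. A minimal sketch with an invented three-entry rate table:)

       IDENTIFICATION DIVISION.
       PROGRAM-ID. LOOKUP1.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-WANTED  PIC X(3) VALUE "GBP".
       01  WS-SHOW    PIC ZZ9.9999.
       01  RATE-TABLE.
           05  RATE-ENTRY OCCURS 3 TIMES
               ASCENDING KEY IS RT-CODE
               INDEXED BY RT-IDX.
               10  RT-CODE  PIC X(3).
               10  RT-RATE  PIC S9(3)V9(4) COMP-3.
       PROCEDURE DIVISION.
      *    Load a small table, sorted ascending by RT-CODE.
           MOVE "EUR"  TO RT-CODE (1)
           MOVE 0.9215 TO RT-RATE (1)
           MOVE "GBP"  TO RT-CODE (2)
           MOVE 0.7850 TO RT-RATE (2)
           MOVE "JPY"  TO RT-CODE (3)
           MOVE 149.2  TO RT-RATE (3)
      *    SEARCH ALL does a binary search on the keyed table;
      *    plain SEARCH scans sequentially, entry by entry.
           SEARCH ALL RATE-ENTRY
               AT END DISPLAY "NOT FOUND"
               WHEN RT-CODE (RT-IDX) = WS-WANTED
                   MOVE RT-RATE (RT-IDX) TO WS-SHOW
                   DISPLAY WS-WANTED " RATE: " WS-SHOW
           END-SEARCH
           GOBACK.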
I started coding in COBOL in 1979, we coded using a pencil and 80 column paper forms which were sent to key punch to convert from pencil to cards with holes in them. I remember my first computer had 4K of RAM and no hard drive. Mass storage was a conventional audio cassette tape. It is amazing how far we have come.
@cliffhaczynski6121 I have the same memories but with PL/1 rather than COBOL. I recall that a significant part of my day was spent "Desk checking", over and over!
That sounds like the Timex-Sinclair I bought as my first computer. I had the extra memory pack, and I bought the whole thing at a department store in their jewelry department for about $80.
Learning how to program with paper and pencil made me a much better dev than many of my peers. You have to think through so much before you punch those cards so the number of logic bugs you introduce are significantly reduced because the process cost is so high. If you have to do that 4-5 times you learn REALLY quick to get things as right as possible on the 1st pass.
Proud COBOL'er and still here. The oldest code I wrote that is still running in production was delivered in 1983 with a Y2K update in 1999. Good stuff.
Normally any discussion of COBOL mentions Grace Hopper, one of the inventors of the language. There is anecdotal evidence she was also involved with the use of the term "bug" and "debugging". I learned COBOL in 1980 and used it for most of my career.
The term 'bug' was already used in engineering, even Edison used it. Her instance is the first _actual_ living bug that caused an error, and that's why she wrote that journal note. IIRC that journal page is still being preserved.
@@timradde4328 It was a moth, as I recall from what I've read. I wouldn't be surprised if the story has been distorted and different versions can be found in the literature.
It was a moth, though not in memory: the Harvard Mark II predated magnetic-core memory (the grid of wires with a donut-shaped magnet at each intersection came later). The machine was built from electromechanical relays, and the moth got trapped in one of them, keeping the contacts from closing properly and causing the fault.
@@DrunkenUFOPilot Actually, Hopper's meticulously-kept notebook has a page with the actual moth that documents the event. The notebook is in the Smithsonian, so no embellishment.
Actually, that’s also a bit of a myth. Rear Adm. Hopper did not invent COBOL. She was for a brief time on the CODASYL committee, but really not for very long. Some of the syntax of COBOL is based on Flowmatic which she did design.
One of the biggest benefits of using COBOL is that it does EXACT decimal arithmetic (i.e., not floating point double precision) , which is a huge advantage in financial systems. You can write highly structured code that is really easy to read - almost self documenting. But it's not at all suitable for web development, which is a huge disbenefit for most developers.
Floats should never be used for financial calculations. If necessary, should a language not provide fixed decimal arithmetic types, a whole number type can be utilised instead. Though a little extra work would be needed to display the correct values.
@@stewartkingsley Floats can be used, but not binary floats. Binary floats can't even represent 10 cents properly. IBM machines of the Power and Z series have had 128-bit decimal floating point hardware for more than 16 years!!! The precision is more than enough for any practical use.
One drawback to that is you need to make sure that the PIC allows for enough digits. If, for example, you are expecting a number that is 10,000 or more and you have a PIC 9(4) then your variable will roll over unexpectedly. I have accidentally created some infinite loops that way (and wasted a whole box of green bar paper...)
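(A minimal sketch of that rollover, with invented names, plus the ON SIZE ERROR clause that traps it; on most compilers an unguarded ADD just drops the high-order digit:)

       IDENTIFICATION DIVISION.
       PROGRAM-ID. ROLLDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-COUNT  PIC 9(4) VALUE 9999.
       PROCEDURE DIVISION.
      *    Unguarded: 9999 + 1 silently becomes 0000.
           ADD 1 TO WS-COUNT
           DISPLAY "COUNT: " WS-COUNT
      *    Guarded: the overflow is detected instead.
           MOVE 9999 TO WS-COUNT
           ADD 1 TO WS-COUNT
               ON SIZE ERROR DISPLAY "OVERFLOW TRAPPED"
           END-ADD
           GOBACK.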
The insurance company I used to work for relied on COBOL for the backend to its web based customer and employee facing portal. COBOL also handled all the batch processing of creating documents and renewals.
I'm 54 years young and commenced learning how to code in 1999 with Pascal (enjoyable), then COBOL (nearly broke me), then onto C (once I got my head around pointers, LOVED it)... I'm now a sysadmin, network engineer and ICT all-rounder... except I no longer code (less a bit of bash scripting) and I really enjoy my work because I'm not a coder. Excellent video also BTW!
I was an accountant at an insurance company in 1989 when one day a flyer landed in my inbox about an entry-level programmer training class. I went to the presentation, took a test, and passed an interview to land one of the spots. I spent the summer learning COBOL, VSAM, and JCL, and was a mainframe programmer for the next eight years. I left my COBOL days behind when I took a job at a bank and worked on applications supporting their online banking. But I remember my COBOL days fondly.
Loved VSAM, beat the heck out of straight sequential data organization.... I love more that it is totally irrelevant today, our current storage systems use much more complicated strategies but hide them under an abstraction layer that makes data access so, so easy!
Same, I just can't concentrate long enough to be truly effective at it. NetWare saved my bacon back then. I still run 3.12 in Vbox for shits and giggles.
Agreed. I learned COBOL in about 84/85 but was never a professional programmer. I've just picked it up again now and it's amazing how it all (mostly) comes back!
I enrolled in a University computer science program in 1982. One of the introductory courses was working with Cobol, which was programmed into a machine using punch cards. Fortunately, I found computer science, that is, not the programming, but the actual science behind it, to be so fascinating, that I did not let the horrors of Cobol and punch cards turn me off. I graduated in 1985, and to this day, I still think that computer science is one of the most interesting sciences that we have, it is so generalizable, that it can be applied to all the other sciences in various ways to improve them. After all, everything boils down to information and algorithms.
Learned to ride a unicycle as a teenager, and how to program COBOL in 1990 after riding my unicycle across campus to class. My transcript includes: PASCAL, COBOL, FORTRAN, & C++ as languages because having a programming class account got you mainframe access (VAX) when the user count was getting too high and kicking personal accounts off the system.
So happy the algorithm brought me here. Great content, and wow, I was blown away by how much I learned in the first video. I'm starting to think the A.I. overlords really understand me; I think I could be happy as a battery. Anywho, side bar: I think I'm hooked. Keep doing what you're doing. One Love.
Wellllllll not quite 99%... remember, the IRS was also using it (and that was true for most of the Western world). But sure, they would have owned the overwhelming majority of all computers.
Coulda fooled me, when I joined the Army in 90 it sure seemed like everything was still run on handwritten forms! I think there was a computer in the shop office.
"Also, in 1959 the department of defense probably owned 99% of all the computers in the world." Hardly! Not even close. Universities were the big owners. And by 1959, it was slowly starting to spread into business overall.
I spent 10 years on the computer side of the banking industry, working with ATM's and branch automation back in the 80's. I was a systems person, so I principally worked with Assembler, but pretty much every program that comprised those systems was written in Cobol, so we had to know what those programs were doing and how to dig through the dumps when things went bad. I spent a 40+ year career in IT and never wrote a Cobol program after I left college. It always felt too cumbersome and bureaucratic for my blood, but I always understood that because it was handling money, there was never any room for mistakes. A lot of my work back then was about fixing corrupted transaction data that slipped through the cracks.
"...how to dig through dumps when things went bad." This. This is the thing that is needed. Anyone can program Cobol. Far fewer can dig through dumps to backtrack to the line of Cobol code. Knowing how to program in Cobol can make a career. It can also be a career-killer as no one will promote you - they need you where you are, handling the Cobol job that no one else can easily fill. Will the pay for this irreplaceable person compensate for this state of affairs? Probably not. The trick is to recognize the situation, then have the courage to move on.
It was written in the days when people were trying to make computing as simple as possible. The idea was that since it was basically a constrained form of English, even managers would be able to write their own simple queries and so on. Then people brought out things like Easytrieve for managers' reports, and eventually they realised managers couldn't learn to program under any circumstances. I can remember programming schools where they would take in anyone who passed an aptitude test, teach them COBOL in three months and guarantee them a job. One of the great things about COBOL is the arbitrary precision of numbers, especially in decimal: you could accurately represent numbers with, say, 18 digits before the decimal point and 10 after, which made things like financial calculations work so much better than trying to lever them into a LONG or a FLOAT. Some of the new languages have retreated from the concept of easy, and some of the new features of C# and Java are probably only usable by people with degrees in software engineering.
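(A minimal sketch of those decimal pictures, with invented names and values; classic COBOL allows up to 18 digits in a numeric item, and later standards and compilers stretch that to 31:)

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PRECISE1.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *    The picture clause states exactly how many integer and
      *    fraction digits each item holds.
       01  WS-PRINCIPAL  PIC S9(13)V99   COMP-3 VALUE 1000000.00.
       01  WS-RATE       PIC S9(2)V9(10) COMP-3 VALUE 0.0475.
       01  WS-INTEREST   PIC S9(13)V99   COMP-3.
       01  WS-SHOW       PIC Z(12)9.99.
       PROCEDURE DIVISION.
      *    Exact decimal arithmetic, rounded once at assignment.
           COMPUTE WS-INTEREST ROUNDED = WS-PRINCIPAL * WS-RATE
           MOVE WS-INTEREST TO WS-SHOW
           DISPLAY "INTEREST: " WS-SHOW
           GOBACK.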
I don't see many developers who have even heard of software engineering 😂 They followed a multiple-choice learning path and got a certificate, nothing more.
Yes, it took languages like C# and Java to realize the importance of decimal numbers. Before then, I'd seen a host of languages, all of which offered only integers and floats. Though it seems weird: if one had to rewrite an existing Cobol module in Java, one might discover that Cobol does quite a bit behind the scenes.
That's very true, COBOL really "cracked" the decimal/number-manipulation issue though it felt extremely verbose in data definitions. My prog. "school" was 6 weeks only, no job guaranteed but I got one, & thankfully got out of COBOL ASAP. Into a different environment which was great for 20 years, then redundancy & my IT career ended. Application development had gone to India. If I'd stuck with COBOL I would have been OK financially, but perhaps driven crazy.
@@marcuswilliams3455 You said "Yes, it took languages like C# and Java to realize the importance of decimal numbers" - I beg to differ on your wording, they perhaps rediscovered the importance of decimal numbers. Otherwise, you're spot on.
I want to commend you on correctly explaining so much about COBOL. I started writing programs in COBOL in 1974. I also appreciate how upbeat you were at the end. I have several disagreements with some of your statements, but I am sure someone will point them out in previous comments. The one I would like to address, though, is your statement about documentation. You are assuming that 50 years ago, it was the same as today. It was not. Back then, we had Documentation Standards. A really smart company would not only allow time for the development team to document their system (at business, program, and data structure levels), but they would give us time to update the documentation with every change we made. And our IT management would enforce the standards. In my last job (I retired in 2017), no one I knew of was using documentation standards. I was working for a software development company in banking. I once asked a programmer for the documentation he used for developing a program module so I could help him fix a bug that we were having a hard time with. He had no idea what I was talking about!
In the 90s I worked for a major insurance company that was still using COBOL on mainframes for their backend stuff, actuarial tables, transactions, logging, etc. In the almost decade that I worked there they had 3 times where it was announced they were replacing it. Each time they tried to start that process, they gave up and just made a fancy new GUI that interfaced with the old systems. Now, 25+ years after I started I still have contacts there and it's all still chugging away on COBOL.
I have worked in the insurance industry for 20 years. There are still a lot of mainframe solutions in COBOL (or PL/I, look it up) hanging around. They work and are still extremely efficient.
I worked around old COBOL programmers in a very large financial company. I asked them why COBOL was still used when most people thought it was dead or had never heard of it. They told me the advantage was that it was very stable and hard to change. Because of this, accidents (fat fingers) didn't happen. The company advertised internally to train employees in COBOL and gave raises to those who took the training.
My uncle was a COBOL programmer who back in the day was one of the main guys programming the EPA computer system, and for the rest of his life he was the main guy maintaining the system. Even after he retired he kept getting calls to come and fix issues, but he passed away two years ago and I’m not sure who updates the systems now. Ironically, he was a libertarian and therefore not a fan of the EPA, but they were the main customer of his business throughout his career and he spent his life keeping their systems up and running.
I'm 62. COBOL programmer from 1983-86 in IBM 370 and 4331. Then moved to Sisinf 4GL, Sybase RDBMS and its 4GL, then moved to UNIFACE IDE that can access legacy storage format and allow modern UI, then Java Swing, and now Python. Still learning.
I became a COBOL programmer in 1975 thanks to IBM’s self-teaching manuals. Then I spent the next 20 years making a living at it. Difficult? No. Verbose? Yep. But its verbosity is one of its advantages. Well written structured COBOL should be self-documenting.
I'm an old COBOL programmer and I love it. You could write code that your supervisor could read using 88-levels. Was writing object oriented code in the 80's.
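For anyone who hasn't met 88-levels: they declare named conditions on a data item, which is a big part of why well-written COBOL can be read aloud to a supervisor. A minimal sketch (assuming GnuCOBOL, cobc -x -free; names and values are illustrative):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. LEVEL88-DEMO.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 WS-ACCOUNT-STATUS  PIC X VALUE "F".
       88 ACCOUNT-OPEN    VALUE "O".
       88 ACCOUNT-CLOSED  VALUE "C".
       88 ACCOUNT-FROZEN  VALUE "F".
    PROCEDURE DIVISION.
        *> The condition name reads like English in the IF.
        IF ACCOUNT-FROZEN
            DISPLAY "REFER TO COMPLIANCE"
        ELSE
            DISPLAY "NORMAL PROCESSING"
        END-IF
        STOP RUN.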
An interesting and very well presented topic. In the early 90s, on the daily commute to London, a regular fellow traveller was a COBOL programmer for a major corporation who, in spite of having reached retirement age, was asked to stay on and carry on working simply because no available replacements could be found in the job market. Thirty years on, and it looks as though things haven't changed much.
A lot of us can program COBOL, but never admit it. The reason it is still around is because it was straightforward and did exactly what it was supposed to. No more, no less. COBOL is in the client-server world too. PeopleSoft still uses it... BTW, my Dad learned COBOL from a certain Commander (later Admiral) Grace Hopper. He was also an assembler programmer... with the War Department, later DOD, then IBM, then GEICO.
This is not quite true - it was a common term by the time she discovered that "bug" - the joke was that the bug was caused by an actual bug. Not that the term bug was derived from this occurrence.
@@zoeherriot Thomas Edison also found a 'bug' -- a squashed insect in a telephone relay which prevented it from working properly. He wasn't the person who invented the first debugging hardware, however -- that would go to whomever invented the first insect screen. That said, we all remember Adm. Hopper's bug.
@@zoeherriot Quite correct, we know that it was a common term because of how she described it: "First actual case of bug being found." It was the first actual case of a computer bug (error) being caused by a literal bug (insect). Why would she write that unless errors were already called bugs?
I also heard the Turing machine, Colossus, crashed when a moth shorted out two valves - another anecdotal/legendary beginning for the term "bug". That was the 40s, before the USA had even built a programmable electronic computer.
@@wolf5370 I've heard that, not sure how apocryphal it is. Btw, probably want to avoid calling anything "the turing machine", as "Turing Machine" is a specific important concept in computer science.
I'm 86 and started programming in 1962 on an IBM 7070 for the State of New York, using the language called Autocoder. After a few years I was recruited by General Dynamics, Astronautics division in San Diego, where I worked on the Atlas-Centaur missile program; we had an IBM 7074, still using Autocoder. It wasn't until 1975, when I worked for Great American Insurance Co., that we made the transition to COBOL. Subsequently, I worked in ten other countries, and in the latter years as IT manager. COBOL was the backbone of my work until the mid-1980s, when several databases took over.
I'm only 83, but my first hands-on encounter with a computer was in 1965 with an IBM 7074 which had no mass storage and used half-inch magnetic tapes for batch input and outputs. A separate 1401 mini-computer had mag tape drives and used a high-speed card reader to create batch input tapes to feed the mainframe. The 1401 also processed mainframe output tapes to line printers and a card punch, which was the other half of the card reader. Incidentally, the 7074 also had a one-at-a-time card reader at its operator console which was a re-purposed manual card punch and was useful only for last-minute run-time data and program options. I was assigned to be a computer operator and it wasn't long till I found the computer manuals and learned how to program both computers (my favorite was the 1401 'cause it was so versatile and easy to program via cards.) Shortly thereafter, I was reassigned as a programmer and soon learned Assembler, Fortran and COBOL and wrote many programs in those and several other more exotic languages. Actually, COBOL was the easiest, but Fortran is my lingua franca and I can still dream in Fortran!
@@terrycureton2042 I've been told that because of the limitations of the hardware the programmers back then had to write elegant, efficient programs and that modern programmers write sloppy bloated programs. Did you see that happen over time? (Idk a single programming language.)
This takes me back! This was one of the first languages I had to learn! Since I never worked in a financial institution, the language faded away in my memory! COBOL is still used today and follows the old adage, "If it ain't broke, don't fix it!" I heard a few years ago, that COBOL Programmers could earn excellent wages for their knowledge of the language since, there are so few out there.
I think *"...had to learn..."* sums it up. No-one would do it other than for money/career. Whereas I *wanted* to check out BASIC (boo, hiss) because you could do fun stuff quickly & easily. 43 years ago, that is
COBOL is a compiled transactional record oriented language. Python is a general purpose interpretive language. Huge difference when it comes to accounting. It's alive because it thrives in business.
We're not all dead yet. Just because COBOL is old, and those of us who know it are perhaps older; We are still among you. As an aside, I wrote the algorithms that secure your banking pins on the smartcard in C. So those will last a bit longer.
C, not C++ just plain old C, is going to be around about forever. The effort to deploy a modernized universal embedded systems language, Ada, basically failed. There were many reasons, but basically its advantages just weren't advantageous enough. C remains the very best language for embedded critical systems.
This was great, thanks Dee. I'm a 63 year old man who recently retired from public service. A good part of my career was spent developing mainframe applications using procedural programming languages such as COBOL and NATURAL. As I had set out to retire, the organization I worked for was still in the process of trying to upgrade their applications to a more modern, object oriented platform and finally making some headway in doing so after many years of trying and going nowhere. My experience is, there is a tendency for managers to blame issues related to maintaining and upgrading the existing applications on the (perceived to be) antiquated mainframe platform, when in fact most of the time these issues arise from poorly documented systems and staff turnover among the subject matter experts. As a result, all too often the organization's newbies have to resort to taking a "black box" testing approach in an attempt to determine what the existing applications do and how they work. And all too often it's a thankless job.
@@alann.7976 The manager assumes you can magically convert the COBOL systems into any modern language just by looking at the code. Sadly this is very difficult because of missing documentation and many, many modifications!
I think a feature of COBOL that is often forgotten, is that it didn't use floating commas for calculations, avoiding 'rounding bugs' and strange (but very expensive) things like that.
In English, it's called "floating point". The trouble is exact calculations are very computationally expensive -- they take a lot of CPU power. Years ago, I heard that Bloomberg had special FPGA-based PCI cards to do 100-digit decimal calculations. FTSE just used the floating point hardware in normal CPUs, accepting that they would sometimes lose shares. I don't know what languages they use.
@@eekee6034 Floating point is MORE computationally expensive than fixed point. It's not even CLOSE. There's a reason that Floating Point Processors were an EXPENSIVE add-on for decades.
I once had to deal with a COBOL-based financial system that saved and calculated big numbers in floating point. When changing to another system, the internal floating-point storage format changed to the IEEE standard, and rounding differences appeared between the 2 systems. Some Fortran-coded manufacturing systems used floating-point numbers a lot as well.
My favorite memory of the 80s was modifying a PL/1 program and discovering a bug in the compiler. Whenever a specific constant was coded (e.g., NUM1 = 356.7289 - I don't remember the exact number), the compiler generated the wrong binary code, so that any computations would give the wrong answer. I reported it, but I got transferred to a new project, so I never followed up on the fix.
COBOL is really easy to understand. Nobody wants to learn it only because it is so damn boring. But I am sure with enough financial incentive many people will make an exception and learn it to make bank.
@@raybod1775 That applies to COBOL before 1968 and undisciplined programmers. In any case, it is an easy-to-use and really powerful tool when it comes to mass data.
I am retired now but dealt with COBOL for over 40 years along with other languages. If you have good logical thinking COBOL can be your best friend. Even now IBM COBOL can be object-oriented, if that is what you desire. I found structured COBOL programming more to my liking.
I have 55+ years of experience in programming, and even though I could be retired I still enjoy the work. I still do some COBOL and IBM Assembler, but my favorite mainframe was the RCA Spectra - so far ahead of IBM in many ways. I had the immense honour of meeting Admiral Hopper - what an amazing lady!
Great post! I am a retired software developer who began my career as a COBOL programmer for the Brazilian IRS in 1980. Throughout my professional years, I had the opportunity to learn a myriad of different programming languages, including Natural 2, PL/SQL, Visual Basic, JavaScript, PHP, and Python, among others. However, I hold fond memories of my time working with the mother (or, if you prefer, the old aunt) of all those modern languages. I used to say in my COBOL classes that you only need to learn seven basic commands to solve 99% of your programming needs; indeed, COBOL is a truly simple language. As the Brits often say: "The Queen is dead! God save the Queen!" Greetings from Brazil. :)
I read some other comments and it reminded me of how my mom had to find, unscramble, and correct errors and just bad code. She had to print it out and go through it line by line, delete the bad parts and rewrite them, then test run her updated routines and make sure they worked. It was hard, and she was the only one at the bank that could do it. Wow.😮
I graduated in 1969 with a degree in history. Yet during those college years I learned Fortran, SNOBOL, Penelope. That is how I made my college money. I ended up within Ma Bell, trained in BAL and COBOL in 1970. The business focus was capturing all the transactions in a business and filling those gorgeous databases. Understanding and codifying the flows and stores of a business was done with COBOL. It was the tool we had. The pyramids were built with big blocks by processes we are still trying to understand. We are still processing transactions and filling those wonderful databases. That language will be with us. Forever. 😇 My killer skill was JCL. Making a 155, a 165, a 195 all sing - what fun.
JCL -- uh boy. After that there was IMS. That's the straw that broke my camel's back. Then came CICS :) :) :) ... are we having fun yet? But then came SQL ! ! ! ! There WAS a G*d in heaven!
@@noneagoogleuser4443 Ah, JCL, home of the buggiest program ever written (per line of code): IEFBR14, the "no action" program where all the work is done in DD statements. The problem was that JCL expects a completion code which the original single-instruction program didn't set, so if the previous step had an error condition it propagated to the next step. To fix the problem, the return code was explicitly set to 0, which doubled the size of the program (from one instruction to two).
COBOL is the epitome of "If it ain't broken, don't fix it!" - in fact, its success runs way *WAY* deeper than is readily apparent. Not only is it perhaps the most scalable programming language ever conceived, but it's quite probably the most secure language to use for financial purposes. To begin with, being a task-specific language (as opposed to a general-purpose language), there simply aren't very many vulnerabilities which can be exploited. In this regard, it's like playing basketball with a concrete backboard - the ball will rebound, period! Also, because it was written for systems with kilobytes of RAM and single-core CPUs running in the 2-4 MHz range, it is both extremely powerful and exceptionally lightweight. And because it's also immensely scalable, it's nearly as efficient at managing trillions of financial records as most modern languages are at managing millions. Last but not least is the technology rift, which is actually a huge benefit for COBOL. Much the same way that the best anti-theft device for a car is a manual transmission (because 99% of car thieves can't drive a stick), COBOL's relative obscurity makes it virtually impenetrable against outside (or even inside) attacks. Add to this the fact that every system is almost fully bespoke and has been maintained for decades with minimal documentation, and the same "downsides" which make things difficult for people who are *supposed* to have access make things virtually impossible for people who aren't. When one considers all these factors - particularly how sensitive the data being handled is - it's difficult to argue that these systems should *ever* use anything other than COBOL.
@@DaveC_TN Yes, they're wrong. Security through obscurity isn't a proactive security measure that actually does something to prevent intrusion. it's just "Gee, I hope no one puts in the work to figure out the puzzle." But people who are interested will _always_ solve the puzzle eventually.
@@wasd____ it is a very good idea - the most secure place is the one no one knows about. In fact, if there were a way to prevent people from ever finding out, you could be safe forever.
@@wasd____ That doesn't make it a bad idea, it just means it's not completely safe - but very few things truly are. And the puzzle doesn't always get solved - for a splendid example, see the Navajo codetalkers in the Pacific theatre of WW2.
Dee! Props (and L&S) from a fellow ZAR 🙂 Love your work COBOL was the first language I studied officially, and the first language I used in the workplace
I'm not from IT but worked at a bank in product management. It's true that the codebase is mostly a black box for most of the employees, and they need to be extra careful with systems updates because any change could stop other processes that rely on it. There's no documentation, and the developers most familiar with particular systems are the ones who worked on them longest. It's even weirder that development is outsourced. I assume that the company holding all the knowledge about the systems can name their price. There are stories like the one developer who knew a system, left for another job, and was begged to come for a visit from time to time to help fix something. But on the other hand, a few years ago the accounting system was rebuilt from scratch in SAP. It took 5 years until it was completely switched over, and the first few months were absolute chaos. A few million in unreconciled entries were just forgotten about, probably because they just gave up trying to figure them out. Rewriting an old system might just be too complex, and companies will only do it, I think, if not doing so gets them into legal trouble.
From Cobol to SAP, Wow. Talking about from the frying pan into the fire. It takes about 5 years to write Hello world in SAP, so no surprise there. But don't go for Python or JavaScript for large critical systems. You can't create quality with testing, you can only improve existing quality with testing. Stick to proven languages with type safety and memory safety like C#, Java or Rust nowadays. (I know, SAP uses Java, that's not the point)
@@mennovanlavieren3885 Yes, I was puzzled that Dee compared COBOL (an abbreviation for "COmmon Business-Oriented Language") to Python, when Python is a scripting language, much slower than, say, C or C++ or whatever is the current equivalent. The first two are not only well-tested, but have lots of people who are well-versed in them. (BTW, JavaScript is also a scripting language, and it was written in 2 weeks.)
"There are stories like the one developer that knew a system left for another job, and was begged to come for a visit from time to time to help fix something." That'd be me with the Chinese Liaoning Province inter-bank ATM switch, based at the People's Bank of China in Shenyang! I used to love those trips in the 90s
I have a friend who is still a highly proficient (but semi-retired) COBOL programmer and, like me, now in her 60s. She still comes out to play when her former customers need COBOL changes, and makes a lot of money. When the Y2K panic came along she was in big demand. Most old COBOL-based software systems were never designed to go through a millennium change (00). She made a fortune upgrading COBOL-based systems for many big companies, banks, etc., and got rich very quickly. She was quite busy for a year or two!! It is not really any harder to learn than any other language.
To the readers: before 2000, lots of dates just had the year, month, and day in various formats (day, month, year, etc.), and the fixes did a trick like "if the year is > 90, put 19 in the century column".
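That trick is usually called "windowing". A sketch of what such a fix typically looked like (assuming GnuCOBOL, cobc -x -free; the pivot year and century values are illustrative - every shop picked its own per data set):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. Y2K-WINDOW.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 WS-YY        PIC 99 VALUE 68.   *> two-digit year from a legacy record
    01 WS-FULLYEAR  PIC 9(4).
    PROCEDURE DIVISION.
        *> Years above the pivot are taken as 19xx, the rest as 20xx.
        IF WS-YY > 50
            COMPUTE WS-FULLYEAR = 1900 + WS-YY
        ELSE
            COMPUTE WS-FULLYEAR = 2000 + WS-YY
        END-IF
        DISPLAY "FULL YEAR: " WS-FULLYEAR
        STOP RUN.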
I found this content unusually interesting because it is not really about coding but about the intricacies of how code influences or impacts our daily lives without people's knowledge. Shout out to Dee for that ingenious twist and creative approach to the coding world.
The problem is that the law changes, accounting requirements change, new payment systems are needed, new features are needed, on some level it's ALWAYS broken against current requirements.
@@dlbiggins COBOL code is regularly updated to match new regulation, the language itself has also had several new specs since the 50s, the latest one being from 2023.
I heard that long ago people were also able to make cars that still work today, construct buildings that still stand, and so on. It's not only about software.
Somewhere around 2000 was the peak of long-term reliability for Honda and Toyota. They came to know their cars were lasting TOO long, costing them repeat sales, and started deliberately making them to not last QUITE so long.
I was born in 1960 and started my IT career in 1984 as a Cobol programmer for a bank. Today I program with Python (but no longer for a bank). Can only confirm everything you said. The smartest thing I've heard about COBOL in recent years.
@@TheEVEInspiration a pity so few people agree with you. There is something mystical about Python. I have yet to understand what exactly makes it so popular. Perhaps that's the whole point, really: nobody asks and everybody just assume that there _is_ a reason for its popularity...
I got my degree in business computer programming. I studied 14 or 15 languages but specialized in COBOL. It served me well during my 29 years in IT. The other language/system I was proficient in was AIX UNIX; both very powerful. Thanks for the video.
@@codingwithdee Another one here. No need to worry, many of us can PERFORM SAVE-THE-WORLD when needed. It would have been more informative to translate trivial COBOL code doing exact decimal arithmetic into Python and compare. 🙂
Can someone explain this to me? The statements seem to be in the same order. Is there a difference in what "ADD 20 TO FIRST-NUMBER" followed by "DISPLAY FIRST-NUMBER" does in Cobol vs. the Python += and then print()?
@@The_1ntern3t The output in COBOL:
Here is the first number 8
Let's add 20 to that number. 28
Create a second variable 30
The result is: 58
The output in Python:
here is the first number.
Let's add 20 to that number. 28
create a second number. 30
the result is: 58
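To the question above: there is no real difference. ADD 20 TO FIRST-NUMBER updates the variable in place, just like Python's +=, and DISPLAY is the print(). The video's actual source isn't shown, but a minimal sketch that reproduces the COBOL output (assuming GnuCOBOL, cobc -x -free):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. ADD-DEMO.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 FIRST-NUMBER   PIC 99 VALUE 8.
    01 SECOND-NUMBER  PIC 99 VALUE 30.
    01 THE-RESULT     PIC 99.
    01 WS-SHOW        PIC Z9.   *> edited field, suppresses the leading zero
    PROCEDURE DIVISION.
        MOVE FIRST-NUMBER TO WS-SHOW
        DISPLAY "Here is the first number " WS-SHOW
        ADD 20 TO FIRST-NUMBER
        MOVE FIRST-NUMBER TO WS-SHOW
        DISPLAY "Let's add 20 to that number. " WS-SHOW
        MOVE SECOND-NUMBER TO WS-SHOW
        DISPLAY "Create a second variable " WS-SHOW
        ADD FIRST-NUMBER TO SECOND-NUMBER GIVING THE-RESULT
        MOVE THE-RESULT TO WS-SHOW
        DISPLAY "The result is: " WS-SHOW
        STOP RUN.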
Before learning COBOL I learned 1401 machine code, then Autocoder, Fortran, and finally COBOL. That was in 1966 as I recall, on an IBM 7040? It was nice reading the comments from others around the world who are also sharing these memories after so many years. My last project, in 1995-1998, was COBOL/CICS/DB2 at a company with approximately 7,000 users. Then in the time leading up to Y2K, we re-implemented into Unix/Oracle-based systems with X term emulators running on PCs. That transition would not have been possible without a 4GL to help migrate all the transactions and background processing into 21st-century technology. The bonus of course was giving the users user-friendly workstations after years of being tied to a 327x user interface. The conversion also cleaned up things like the interface with legacy financial systems, using Oracle Transparent Gateway. Yes, there is a lot of COBOL code running in the big systems to this day, but without the new technology our modern society would be unable to afford the conveniences of access and rapid high-volume processing that we take for granted every day.
Learning any programming language is easy IF you are taught properly, learning the concepts of programming and structured code. After that, all you need to know is the syntax. I have coded in BASIC, COBOL, RPGII & ASSEMBLER, I've used JCL, DCF, and looked into more, hand-coding HTML pages, and several others. There are SO many similarities that it is almost stupidly easy to pick up a new programming language. Any competent programmer should be able to pick up the basics in no time.
Writing any new language idiomatically and efficiently tends to take a good while longer though. As the old joke goes: "real programmers can write FORTRAN in any language."
Yes, this is true as long as the language shares the same general structure. Try coding a GPU, or some of the newer processors, without a smart translation program. Code ASM for an IBM multiple transmission processor? Or a device driver at Ring 0? The instructions can be learned, but the OS/hardware environment makes it harder to operate correctly.
Don't learn computer languages, learn computer _science._ If you learn a language, you're just a code monkey who knows one language. If you know computer science, you know _every_ language if you can read a syntax guide and some library documentation.
I'm teaching myself COBOL for the pure joy of it. My first programming language was FORTRAN ANSI 77. My second was MS DOS BASIC. It will be fun if I'm able to monetize my exploration of learning and eventually mastering COBOL. Fun departure from COBOL running on big iron, I'm learning to rock that code on a Raspberry Pi5 ARM based SBC.
COBOL is a solid, no frills programming language. It does its job efficiently and reliably. It handles vast volumes of data readily. Much of the "new" programming seems to be more about "whizz-bangs" rather than getting the job done. There's a reason MS and others are called "bloatware". COBOL, Fortran and the like are all about getting the job done, not making pretty pictures and sounds.
COBOL is not difficult to learn. It's just that it is very limited, so to achieve certain things you need to write a lot of code. COBOL is most suitable for record processing - read a record (like a credit card transaction) from one of the many inputs, do something with it (like increase the balance here and decrease it there). Once you know PERFORM UNTIL and record formats, you're 50% there :) (a sketch of that loop follows below).
Exactly, same goes for all "mainframe languages", they are just very feature poor, so you always need to "reinvent the wheel", there are very few libraries compared to modern languages, meaning everything becomes tedious to do.
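That read-a-record, update-a-total loop really is the skeleton of a typical batch program. A minimal sketch (assuming GnuCOBOL, cobc -x -free; the file name and record layout are made up for illustration):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. TXN-TOTAL.
    ENVIRONMENT DIVISION.
    INPUT-OUTPUT SECTION.
    FILE-CONTROL.
        SELECT TXN-FILE ASSIGN TO "txns.dat"
            ORGANIZATION IS LINE SEQUENTIAL.
    DATA DIVISION.
    FILE SECTION.
    FD TXN-FILE.
    01 TXN-RECORD.
       05 TXN-ACCOUNT  PIC X(8).
       05 TXN-AMOUNT   PIC 9(7)V99.
    WORKING-STORAGE SECTION.
    01 WS-EOF    PIC X VALUE "N".
    01 WS-TOTAL  PIC 9(9)V99 VALUE ZERO.
    01 WS-SHOW   PIC Z(8)9.99.
    PROCEDURE DIVISION.
        OPEN INPUT TXN-FILE
        *> The classic loop: read a record, update the running total.
        PERFORM UNTIL WS-EOF = "Y"
            READ TXN-FILE
                AT END MOVE "Y" TO WS-EOF
                NOT AT END ADD TXN-AMOUNT TO WS-TOTAL
            END-READ
        END-PERFORM
        CLOSE TXN-FILE
        MOVE WS-TOTAL TO WS-SHOW
        DISPLAY "TOTAL: " WS-SHOW
        STOP RUN.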
Congratulations! I got COBOL at uni because of Y2K (yes, I am that old). I hated it, but it was pretty easy. If you go into mainframes, the cool part is going to be the toolset.
@@deantiquisetnovis Where are all these jobs paying "fortunes"? When I researched a few COBOL jobs they were "competitive" but nothing out of the ordinary.
My first job, after leaving school in 1985, was working at the local council developing COBOL systems on ICL ME29 mainframes. This was when the UK Data Protection Act was introduced, and I wrote the council's database for storing all their DPA submissions from each department and made it searchable using keywords. I then went on to work at the local newspaper company, working on their newspaper distribution systems in COBOL on a Burroughs B900 mainframe. Since then, I moved mainly out of programming, but I have worked in various other languages including VB, PostScript, and various typesetting and scripting languages over the years. I imagine I could pick up a COBOL compiler and put out some COBOL code even now. Just like riding a bike.
That's great! However, it's important to consider that the amount of software written today is vastly greater than what was written during the COBOL era. If we follow that percentage and assume it to be true, we could conclude that the quality of COBOL code, in comparison, wasn't as good as today's software. It's interesting to see how software quality has evolved over time.
@@FranciscoCarlosCalderon Much of the software written today is bloatware, as it expands to fill the available storage. We built an antenna control system in 32k bytes that controlled 4 antenna servos and performed satellite orbital calculations, dead-reckoning navigation, and space stabilization of 2 shipboard satellite tracking antennas in real time.
I wrote some COBOL code over 50 years ago and it is still running today, without conversion. I know of a large company that tried converting to a so-called modern language; they spent 10 years and many thousands of dollars, and the new language always produced wrong answers when calculating dollars and cents in interest or sales tax calculations, because the new language didn't follow the approved financial methods of rounding.
I've had that same experience, and I've complained bitterly that new languages don't (properly) support Decimal arithmetic. I'm working with SwiftUI now, and "Decimal" arithmetic is not fully integrated, and can make the same rounding errors as binary arithmetic if one is not extremely careful because it has NO rounding functions unless you want to convert to string, then back to decimal OR call NSDecimalNumber(mantissa:exponent:isNegative:). It's absurd. I never had these problems with COBOL.
I remember 1999 when, as a reporter, I was assigned to interview retired programmers getting COBOL machines ready for 2000. That was 25 years ago. The scoop? A bunch of old chaps saving the economy because they could code in an antique language. I guess I should have learned COBOL then; I'd still be active today. Thank you for your piece.
About 40-60% of the banks and CUs in the U.S. run on IBM midrange computers (IBM i dating back to the AS/400, and Linux on Power dating back to the RS/6000). The IBM i machines primarily use a programming language known today as RPG (though they support other languages like C, Java, etc.). All of these have amazing, fascinating histories that would make a great video, because they are still so pervasive in 2024!
I worked on RPG for a while back in the late 70s. I did not really care much for it and it would be way down at the bottom of my list of favorite languages. I much prefer COBOL.
I can program Cobol, but never put it on my resume, because they might make me program Cobol
So how does one get experience working on mainframes and COBOL?
is it that bad??
I hear you! I did a lot of FORTRAN along the way, and managers made fun of it when I put that on my resume.
@@myhandlehasbeenmishandled I've been told that's after a full internship in Hellco ULC.
I am 80+ and highly interested in AI.
Old COBOL programmers never die, they just GEHEN ZU (German for GO TO).
z/OS, MVS, TSO-ISPF here, and a REXX and 50 lol.
RPG on the AS400, those were happy days for me!
Worked on it in 1980s and today little better advance but 45 years.. I was et right
If she had been smart, she would have gotten a tubal ligation.
WOW, totally inspirational. Hope her grandchildren are suitably impressed.
Dude. Terrific story.
I have a friend whose wife left him for a computer language. Oy...
100% real. cool story bro!
Is it possible to copy the code and run simulations?
It's also hard to replace, as it handles data better than a scalable solution.
@@spadeespada9432 "Is it possible to copy the code and run simulations?"
Probably, but that still doesn't really help anyone understand how the systems work, which is the actual problem.
What the companies who still own COBOL systems really want is to either replace the ticking time bomb of spaghetti code, or maybe to patch, extend, & modify it for changing business requirements. Both require being able to read & understand the code, NOT just knowing how the black box responds to various input states.
And plus, the input states can get VERY gnarly.
And then the one who wrote it dies... @#$@😖
I can still program COBOL. Stop making me feel old.
I’m the same… and I’m not even 50!
That makes two of us. Though I am old.
Me too
Sounds like you can earn a lot of money then.
Me too
I have been coding in COBOL for 42 years - still going. And I can code in IBM Assembler.
What is your educational background or training that got you that job? Are you like an engineer?
The amazing part of the COBOL is the number of years. If by IBM Assembler you mean mainframes, that is spectacular in its own right.
Nobody has mentioned FORTRAN, although that was more the realm of science folks ... and what about FORTH, who remembers that one?
Exactly the same for me. I learnt COBOL and Assembler Language on an IBM3033 under OS/VS 2 during my vocational training in 1981 and have had practically no contact with any other programming language since then.
I remember learning COBOL, SNOBOL, ALGOL, GASP, at the computing center on North Campus at Univ. of Michigan, but I never used it much. I did learn assemblers (Big IBM 360) (Little Mostek 6502) , Fortran, Basic, LostInStupidParentheses. We used punch cards we placed into the reader and pushed the button and they spit out the other end. Some guys doing Inst. Social Research would come in with boxes and boxes of cards stacked in a hand truck and would take a half-hour feeding the machine. You checked the monitor to see when your job was finished and waited for the printout lady to call your job number. You never knew if the program worked until you unfolded your printout. BTW Nuclear plants are still running DEC PDP-11's
I am not a COBOL programmer, but the day my boss asked me to write a COBOL program, with my background in Assembly and C I managed to "learn" the necessary part of COBOL in two days. Anyone with basic programming skill can learn and program COBOL in a matter of days. The BIG problem is to MODIFY or FIX an existing program made by someone else. But this is true for every language.
With the introduction of the Structured programming paradigm, reading most programs became easier. I don't know if that method is still being taught.
It's not so much the maintenance - it's the programming "tricks" or "hacks" that COBOL programmers used to get around limitations in the language.
How do you learn it? I've looked into it and it's a language for mainframe computers. So it's not really something you can trial and error teach yourself without the hardware. I can't just sit here on my PC and "learn" it in my free time.
@@falkenjeff Since you need a mainframe to run the Cobol code it's not something you can do at home. I don't recall seeing anything that can run cobol on a desktop. Not much reason to. You need to learn it in a classroom where they have access to a mainframe that can run it. the problem I remember having was if there were errors in your code you wouldn't find out until they ran the code, which didn't happen in a timely fashion. It could also be difficult to understand someone else's code if they were lazy and didn't put helpful comments in the code.
@@falkenjeff To learn any language, the best method is to have really some problem to solve: it is better then make "something" just to practice. Afetr that, I generally read the sources of other similar program, try to understand how they work, and take the parts I need. The manual of the language is used as a reference to "decrypt" the programs I am inspiring from. Of course, I learned *some* cobol in 1979 on a Sperry Univac mainframe but there was cobol also for MSDOS. I specifically remember RM-Cobol, widely used.
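Worth adding for anyone reading along today: you no longer need a mainframe to try it. GnuCOBOL is a free compiler that runs on an ordinary PC (it translates COBOL to C under the hood), so the trial-and-error learning described above is now possible at home. A first program, compiled with cobc -x -free hello.cob:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. HELLO.
    PROCEDURE DIVISION.
        DISPLAY "HELLO FROM COBOL"
        STOP RUN.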
My aunt retired from her programming job in the 1980s. In 1998 she took a consulting gig for a small company to "get ready for Y2K", updating their 30 year old mainframe software. Word got out, and the small companies all started hiring her. Work a week, go on to the next one. As the deadline got tighter, the offers got richer, but time was the issue. She'd tell a potential customer "I think I need 3 days, but it'll take me 1.5 days to travel both ways and I only have a 4 day window in my schedule - unless you wait till after the new year". "I'll fly my private jet to the city you are in now, wait for you to finish, take you direct from their door to mine, and take you to your next job after, that'll save you 20 hours of airlines and airports". That became the new standard. She'd call the next company on the list when she was wrapping up, and they'd have transport at the door when she walked out. Limos, charters, private planes from Lears to Piper Cubs, even helicopters normally used for logging.
She finished her last job on January 1st 2000 and retired again.
A great story. And that is a good reminder of why there wasn't a Y2K problem - because we coders eradicated it. Your aunt did her duty in that battle!
LOL. Yes, the great Y2K meltdown. Everyone in my I.T. department had to "WORK" all night on Dec 31st 1999 until 06:00 on Jan 1st 2000 because of the great Y2K threat. Basically we had a company paid-for party as we ate pizza and drank (non-alcoholic) soft drinks because our programmers had done their jobs over the previous six months and eliminated any date codes that might have messed things up. You know, now that I think about it, isn't that where the toilet paper hoarding started that was so predominant in Covid-19? WTF is it with hoarding toilet paper during a crisis (whether perceived or real)????
I worked at a law firm that used an ancient docketing database that was also used to fill in the blanks on thousands of different WordPerfect forms, such as letters, pleadings, etc. In 1999, it occurred to them that it could not possibly work after 12-31-1999. The original company worked on a Windows replacement, but couldn't get it to work properly. They couldn't guarantee a fully tested product in time.
So the boss asked me if I could replace the software with databases in Paradox (Corel's answer to Microsoft Access). I told them I was certain I could, as I knew how the old software worked and how the documents worked with it. I was able to finish at the end of October. Because the firm wasn't allowed by their liability insurance to operate without the software, I had to devise a way to let the attorneys and paralegals use the old software until quitting time on Friday. And then I had to export all the data from dozens of data tables, massage it a lot, import it all into the Paradox database, and have it all ready for them to use on Monday morning.
It was a difficult job. The original software was written before DOS existed. The software didn't just have 6-digit dates, with only 2 for the year; they actually had entire dates encoded into just 2 characters. I had to reverse engineer that mess. Asperger's and a high IQ helped me. I figured out that the software would take a regular date and subtract from it the date the company was founded, to get a 4-digit number of days. Then it took the first 2 digits, pretended they were hex, and converted them to decimal. Then it found the ASCII symbol corresponding to that and put it in the program's database. It did the same with the other 2 digits. (For example, 4321 days would split into "43" and "21"; read as hex those are 67 and 33, the ASCII codes for "C" and "!".) This gave dates that looked like a pair of random ASCII characters, hearts, smiley faces, etc.
The data was also in serial format, meaning there were no field or record markers. I had to use a DOS program to pull up each table, line it up in rows equaling the record length I had found in the manual, and resave the file. Then I pulled it into WordPerfect and ran some macros I wrote for each table to put in the field markers at the proper locations and then convert those stupid dates. This was so massive, I had to do it on a computer unattached to the network and not running any software but WordPerfect.
I did the final switch over the weekend, and everyone was happy. BTW, I didn't even get a raise that year. So a couple of years later, I "retired." I was 50 years old and tired of being micromanaged.
Your aunt, like myself, was very honest!! Y2K was a myth, created to sell computers and create consulting jobs. All IBM computers could handle dates in the 2000s, and most PCs too. That is why she could go to a company, test the software, and reassure them that they were OK (all in 1 to 3 days). Dishonest consulting companies, including mainframe manufacturers, would blow those 1-to-3-day jobs into 3-year jobs. (The federal government spent hundreds of billions on Y2K, on both consultants and buying new computers. It's only taxpayer money!) I went around turning down Y2K jobs for 2 years, until the NY Post did an article on me saying I was the "most honest Y2K man in NYC." So in the last 6 months of 1999, I started accepting Y2K jobs and doing what your aunt did.
I had a few friends who were similarly making big bucks patching old COBOL code for Y2k. I could have jumped on that wagon myself, but doing web dev was a lot more fun and I figured in the long run it would be worth more to stay on that track. But as the day of reckoning approached I was sure something major was going to go wrong, and am still surprised nothing did. I had read that a sizable percentage of systems still hadn't been patched, so my wife and I avoided our customary holiday travel that year. Apparently the truly crucial things actually got fixed first - which seems like a miracle.
In 1998, a COBOL programmer, frazzled with the Y2K crunch decides to have himself frozen until the year 2500. He goes in for the procedure and it's successful. As he begins to wake up, he sees all these people standing around him looking at their devices and dressed oddly, and he says "Wow, is it 2500?" The lead scientist says, "I'm sorry, no it's not 2500. It's actually 2098 but it says here on your file that you know COBOL ..."
😂😂😂😂
Yes, repetition is a feature of IT. Even Y2K fears - what will happen in 2100? COBOL may still exist in 2098 😅
Most of the fixes will either still work or be unnecessary.
Not sure the same is true for the similar event coming up soon for Linux 😊
By the way it was fun @@coshy2748
😂😂😂
@@coshy2748 2038, AKA the Y2K38 problem, is already being discussed :)
1992, I worked with a guy who was the ONLY person in the company that could actually program an AS400 mainframe. In itself - "Meh"... until you learn that Europe's top 37 insurance companies used us - so if we went down, so did they. And it progressed - we got commercial banks and eventually the NHS and Social Services... all dependent on an old AS400, and ONE guy who knew how to work it. I personally was never a programmer - I was the guy in the floor or ceiling fitting cables and setting IP addresses - but it always terrified me just how weak the system was: if we lost that one person, SOOOOOO much would go with him. And he refused to train anyone else, so... Well, he died a millionaire. I guess that's all any of us want... My bank account? Well, I don't have one. It went that well... pfff...
Was his name Wally by chance?
An AS400 ‘mainframe’? That really is bigging up the AS400. AS400s were General Systems Division - small business computers and distributed systems. AS400s _dreamed_ of being part of the Data Processing Division!
I seriously - honestly - want to avoid a "word war". Sounds like you have serious experience - but may I ask - corporate, private or government? Not being rude. I've done all three - but... All I can tell you (and feel free to "nicely" attack me on this)... in the 1990s Virgin, AXA and GESA - and HMV to some extent - were using AS400 tech. Even then it was outdated. Seen those pictures of train stations crippled with Win2000 crash screens? Same thing.
@@damightyshabba439 : AS400 technology evolved, as did mainframe technology. If the companies refused or were reluctant to upgrade, that's on them. I live and work in the USA. The companies I've worked for had internal auditors. What they said goes (which I disagreed with; the power went to their heads). Anyway, the companies I've worked for would not have tolerated this man-child's attitude. There would have been cross-training and backup personnel in place. Again, the company where this man-child worked allowed him to get out of hand - that's on them.
@@deepsleep7822 Agreed - sometimes the company is too scared to bring in someone new, in case it upsets the "God". But it has to be done, from a business point of view.
Admiral Hopper was certainly heavily involved in the development of COBOL and lectured on it around the world. She related the following story to us at a Bell Labs symposium. She was giving a lecture in Tokyo and was doing the usual meet and greet afterwards. After a bit, she realized that she had no way to get back to her hotel. She was having difficulty communicating in English to the remaining people on what she needed, when she had an idea. She grabbed a marker and had them gather with her in front of a whiteboard. She then wrote out a COBOL program whose primary line was MOVE Grace FROM Conference TO Hotel xyz. They understood, and got her back to her hotel.
She had an incredible mind. I feel fortunate that I got to meet her in person.
Good one😂
Grace is a legend.
Back in the 80s, Grace visited the University of Alberta to give a talk. At the reception she mentioned that a number of the CS faculty had been involved in the early development of COBOL and wondered why they didn't make a bigger deal about that. One friend of mine quipped, "Perhaps they're ashamed of it!" Another friend of mine pulled the first friend aside and explained to him just who Grace was.
Thank you for mentioning GH *AND* properly addressing her by rank! Kudos. I was about to comment about the conspicuously low attribution (and the off timeline).
Well, to communicate, one needs a common frame of reference.
COBOL is clear and straight-forward. The staying power of the code is mostly due to inertia. It epitomizes the maxim "if it ain't broke, don't fix it".
Isn't it slower than molasses though?
@@SahilP2648 that's why you run it on superfast mainframes 😀
Also: "if it _is_ broken, nobody knows how to fix it, so, it's better not to touch it" (the very definition of programmer's inertia).
@@GwynethLlewelyn or you can just decide to use a better language and convert all the code. Generative AI would be able to help a lot in this.
@@SahilP2648 well, yes, you can do that - if you have a few hundreds of lines of code. But these COBOL behemoths that run banks and insurance companies and whatnot have _millions_ of lines of code. Let's assume that you'd get a generative AI to convert all the code. Would you, as the bank's CIO, trust that brand-new, AI-generated code to replace the old & faithful _mostly_ working code which has been around for half a century - and risk dooming the bank to absolute collapse if nothing works?
Also, who would test & evaluate that code? Other generative AIs? :) You see where the problem is: at some point, you'll have to trust AI providers with your entire business logic, and hope they come up with a "better" solution (in the sense of using brand-new, latest-generation programming languages with extra bells and whistles)
Someone on this very long comment thread pointed out the obvious: you could, for instance, split the code into its modules (all hundreds of thousands of them!), and go through each one separately: get a module, convert it to some other modern language using generative AI, thoroughly test the result, deploy it, then move on to the next module - wash, rinse, repeat, until every single line of code has been fully converted. But that's essentially "rewriting the whole code from scratch" without the _main_ advantage that comes from actually rewriting the code, which is to rethink some of the old things that probably aren't necessary or that can be done more efficiently thanks to contemporary technology, methodologies, and innovations.
How long would _that_ take?
How much would it _cost_? (even assuming a "free" generative AI which would not only convert the code but also provide test suites at each step, for each module, also for free)
And if something goes wrong in that million-line-code-conversion... something which escaped even the best of the best generative AIs and error-checking AIs... who is going to be able to "fix" things?
Generative AIs are not yet the magic wand that turns a multi-million-dollar, high-risk project into something that can be done next-to-free and take a few hours...
My mother earned a degree in mathematics in the 1950s and worked as a programmer. Whenever I see pictures of those mainframes showing the women who made them work, I think of her. Thanks for sharing.
What you didn't mention was that managers didn't trust programmers to not do stuff like "round off to the nearest penny and put the remainder in my bank account" stuff, so they wanted to be able to actually *read* what programmers wrote.
Because it's not true. You can do that a lot more easily in COBOL. All it takes is a wrong PIC statement (see the sketch below). The Washington DC government famously had a guy do just that. He hard-coded it to put the extra cents into his personal bank account. Like clockwork, this happened week after week. Then he retired and left the code in. It ran for years like that. Then he forgot and closed his bank account. Then they started looking and found his code. It was all in COBOL. I think he got 10 years in jail and lost his government pension.
All kinds of numbskull things used to happen with COBOL and mainframes. Such as using a flat file or SAM file for a customer database instead of a database for a database. I used to fix things like that. After a while nothing surprised me.
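To make the "wrong PIC statement" point concrete: a MOVE into a field with fewer decimal places silently truncates - no warning, no rounding. A minimal sketch (assuming GnuCOBOL, cobc -x -free; the fields are illustrative, not the actual scheme from the DC story):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. TRUNCATE-DEMO.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 WS-INTEREST  PIC 9(3)V9(4) VALUE 123.4567.
    01 WS-POSTED    PIC 9(3)V99.
    01 WS-SHOW      PIC ZZ9.99.
    PROCEDURE DIVISION.
        *> The low-order 0.0067 simply disappears in this MOVE.
        MOVE WS-INTEREST TO WS-POSTED
        MOVE WS-POSTED TO WS-SHOW
        DISPLAY "POSTED: " WS-SHOW   *> prints POSTED: 123.45
        STOP RUN.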
Just because something is old, this doesn't mean it's bad or should be replaced. Newer is not always better. This applies in many areas, not just computer programming.
Generally true. However, if it's unmaintainable, that can become an issue: as regulations change, code that has to adhere to those updated regulations has to change along with them. "If it ain't broke, don't fix it" only applies until the context changes and its operation is now considered "broke"; then the "unmaintainable" part comes into play.
@@jamesbond_007OK, but you can’t hang that on the language. If a decision is taken to move to a new language, everything needs to be migrated as part of the transition, and if it’s not, the blame goes to the management that botched the move by not doing it wholly.
@@jamesbond_007 I worked at a nuclear plant in the mid 2000s that still used DOS for their working computers for repair orders, work orders, etc. They probably still use it for all I know, as I retired in 09. That was a real PITA for some of us to use that were not really trained in that language. Trying to navigate the system was very unwieldy as I was a mechanical superintendent up from the trades as a pipefitter. It might not be as bad if ALL the nuc plants used the same language, but they don't, so every time I went to a different job I never knew what I was going to have to try and adapt to, and as the outages often were so short that I might only be there for 2-3 months at a time, the continuous changes were just one more added stressor to an already highly charged environment.
@@harleyb.birdwhisperer You're absolutely right! There are a huge number of factors that go into these dusty deck programs, not the least of which is that good programming practices were not known, or even if some of them were, they weren't often followed, so the resulting code is difficult to understand. COBOL has a GOTO statement and it was not discouraged for much of the time code was being written in it (I *think* there are flow control constructs that obviate the need for so much usage of GOTOs, but I am not sure).
@@ken481959 Yuck! I don't envy you at all. That doesn't sound like a very fun work environment, esp if you haven't been trained for it.
I used to work at a company that decided to do a significant reorganization. Part of it was getting rid of employees within 5 to 10 years of their retirement, offering up to 60% of their salary to not come to work anymore. However, they were still allowed to get another job anywhere, part-time or full-time, without losing their early retirement money. You can kind of guess what's coming: a number of the older employees who left were COBOL programmers. All of a sudden the company realized they had lost the majority of their COBOL programmers, which became a massive problem. They had to re-hire those guys as external consultants. Not only were they in a position to earn more than they made before, but on top of that they still received that monthly retirement. A nice win-win for the old geezers, a bitter (and costly) pill to swallow for the company.
That's happened to me a few times, not on computers, one company that was 110 years old went bankrupt and everyone lost their retirement
Oh... when the accountants get to make decisions... They know the cost of everything and the value of nothing. Seen that one in the UK Civil Service.
I used to work with British computer company ICL (before it was bought by Fujitsu).
They had a joke about management getting rid of the wrong people and having to hire them back at consultancy rates.
"Whose are those Jaguar XJ6s in the car park?" "Oh, they belong to the people made redundant last year."
Sounds like a great story. An old Russian proverb says "The greedy pay double".
Common corporate idiots at work. They don't know their own employees' functions.
*Grace Hopper* developed the first compiler A-0 in 1952 and the first human readable computer language FLOW-MATIC in 1955. She is also referred to as "Grandma COBOL". Grace Hopper is my all time programming hero ♥️
Also known as "Amazing Grace".
Ah, the amazing grace
I watched one of her live lectures while I was serving in the Navy back in the '70s and '80s. Also, Adm. Hopper was one of the few people who would stand up to Adm. Rickover. She wouldn't take any of his shit. 😆 Very interesting person. The quickest way to get on her bad side was to say, "That's the way we've always done it."
That person (Hopper) is my hero as well, and I don't much like her being depicted as... Mary...
She was fantastic, but the first compiler was not A-0. There is a debate over priority: whether her presentation came first, or whether Autocode was running before she published. In any case, outside of the USA, A-0 is not considered the first actual implementation of a compiler. This does not make her less important; after all, Lorentz found the transformations that bear his name before Einstein did. And I have zero evidence that Hopper had any idea of Autocode; in fact, we know she published first.
Having started my programming career learning COBOL and progressed through many other languages, I can say the language is just a detail.
Problem-solving logic is the difficult part to master. Once you start thinking the way a computer processes, the coding is just syntax.
Some of the newer languages give you more masterful control over many processes and are more than enough reason to learn them and possibly shift to the newer environments.
Total agreement! COBOL is easy; problem solving is the hard part!
An important point is that COBOL uses decimal fixed-point arithmetic while most popular languages use binary floating-point numbers. Financial people get really upset when the cents don't match exactly what they expect.
The joys of binary coded decimal on the SNAP (program and memory) dumps as well.
Double doesn't have the floating-point precision issue when dealing with math functions afaik
Not even quad (128-bit) binary floating point will give you the same results as decimal math. Note that in both cases the results are wrong, just wrong in different ways. The financial industry likes to be wrong in the same way as their old decimal calculators. A few years back the C standard added an option for decimal floating point, and several processors are adding it as well.
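To make the difference concrete, here is a minimal COBOL sketch (all names invented; it should compile under a fixed-format compiler such as GnuCOBOL): ten additions of 0.10 land exactly on 1.00 in packed decimal, whereas the same loop on an IEEE-754 binary double famously ends at 0.9999999999999999.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. DECDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * packed decimal: two exact fraction digits, no binary float
       01  DEC-SUM   PIC S9(7)V99 COMP-3 VALUE ZERO.
       PROCEDURE DIVISION.
      * ten additions of 0.10; done on an IEEE-754 binary double,
      * this loop famously ends just short of 1.0
           PERFORM 10 TIMES
               ADD 0.10 TO DEC-SUM
           END-PERFORM
           IF DEC-SUM = 1.00
               DISPLAY "exactly 1.00, to the cent"
           END-IF
           STOP RUN.

That last cent is essentially the whole argument for decimal arithmetic in money code.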
Idk, in Russia we had no banks before 1991. We have the biggest fully online bank in the world, and it was founded only in 2006. So we are lucky not to have any COBOL code in our banks. And somehow we don't have problems with incorrect counting of our cents.
I don't think that is the real point.
@@paulinchannel3104 I mean Russians are smart. There's a way to remove the floating point errors by doing more operations per transaction, so they most likely must have done that.
I was a Cobol programmer once - not by choice though but a colleague resigned and made the big mistake of mentioning that I knew it on my CV and my boss remembered that.
It is not really the language but IBM mainframes that make it live so long. IBM developed several families of mainframes, starting with the S/360, and they are all compatible with each other.
Machines from the custom CPUs of the 1960s to today's z series can, in compatibility mode, still boot the OS and run programs written in the 1960s.
The US tax system is written in a combination of IBM mainframe assembler and COBOL, originally built for a 1960s mainframe.
Today they are still doing your tax returns with the same code, running on a modern IBM mainframe.
The same code, on another, more powerful machine: the most efficient way to upgrade. In industrial environments, the CPUs inside even the small controllers are deliberately "old-fashioned", for the sake of stability and well-understood minor bugs.
I mean if it works, it works. Not everything has to be rewritten in rust
@@moonasha Rust, some great ideas, but makes me want to gouge out my own eyes.
@@johnridout6540 👍👍Same!
As people say, "If it's not broken, don't fix it" or something like that (there are minor variations out there)
Just having the ability to code in COBOL got me a US work visa back in 1999. I was 20 and in the USA for a holiday when a friendly guy in a church and I got chatting. It turned out he worked for a company that needed more COBOL coders due to the number of financial systems they had to check for Y2K. It was fun work, I got to stay in the USA a lot longer, and I ended up with a really nice reference and work experience for my resume.
A COBOL programmer was so inundated with requests to patch systems ahead of Y2K that, after he finished all the requests one week before the new year, he decided to cryo-freeze himself for a short time to get away from all the Y2K commotion.
Somehow the Y2K bug hit the machine's database and he was kept frozen past his due date. Years went by until some people thawed him out and woke him up.
"Did it work? Did we survive Y2K?" were his first words. One important-looking guy moved closer and sheepishly answered, "Well, yes. In fact, it's now the year 9999, and according to our records, you know COBOL, right?"
COBOL stands for COmmon Business Oriented Language. BAL is IBM's Basic Assembly Language, the hardware-level language. Anyone who thinks COBOL and BAL are related is suspect.
@@georgekashmar3983 suspicious? Probably doesn't know either ...
My first full-time job (~1972) was as a COBOL programmer. I haven't used it since then, but I still respect its run-time efficiency, standardization, etc.
IBM mainframes are the silent workhorses in the economy. They sit there and run year after year.
The z in z/OS truly means zero downtime.
All that COBOL code, regardless of what any new gen says, is truly an investment, much of it made decades ago.
I work in IBM midrange (35 years now). Our systems run a billion dollar enterprise. The only worry we have is the lack of talent.
Did DMA allow call from cobol forgo params ? I think there was something like that with the DMA chip. Or maybe I needed tiny values so some uh bad flags allowed data to stay... meh probably some screen thing on later micro
I love the AS400. It is an amazing machine. Puts Windows and Linux to shame.
@@ocdtechtalk Windows yes; Linux, meh, depends. Also OpenBSD, PCBSD, and a few more. Silicon Graphics IRIX 6.2 was my favourite OS. Ran many mining engineering departments on it.
@@ocdtechtalk The AS400 running the OS400 operating system, not one of the other available ones, was horrible for graphics and science, great for crunching all of the numbers that businesses with finances and inventory and production needed. The later models would phone IBM themselves that something needed replacing. We were an office hours shop. I would get a call from IBM service as I walked in the door in the morning. Hadn't even got my coffee yet. "Your AS/400 called in last night and said this one particular hard drive will probably fail soon. When can we come out?" Great systems, with great redundancy in design.
@@hunahpuyamamoto3964 3090 series!!! king of kings in early 90's
75, Unisys COBOL programmer for 30 years (with IBM beforehand) and still working with it. I have learned C# and Python, which can do things COBOL cannot. But COBOL is easy and reliable.
I got back into programming with the run-up to Y2K, working in two shops making the conversions. We made a lot of money making program conversions, but *I expect to really clean up when Y3K comes around.*
and really rake in the cash when we approach Y10K
y2k38 is an actual thing that will happen, better get started
@@meep.472 mostly embedded systems will need hardware or firmware updates to support 64-bit time values. Like your router, for example. Otherwise, any modern-day PC is going to be fine (way before 2038).
You mean 2048? 2^11? How could this possibly be an issue at this point?
@@keith77mn77 lmao no. Devices use a standard called Unix time, which started in 1970 as an integer value increasing once every second. Traditionally this is a signed 32-bit value, which is set to overflow in 2038, so any device that still uses 32-bit Unix time and is not updated to 64-bit by that point is going to wrap around into the past, and that can create a lot of issues.
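For anyone checking the arithmetic, here is a back-of-envelope sketch (in COBOL, since we are here; names invented): a signed 32-bit counter tops out at 2,147,483,647 seconds, which is about 68 years from 1970, i.e. January 2038.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. EPOCH38.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  MAX-INT32    PIC S9(10) VALUE 2147483647.
      * average Gregorian year in seconds (365.2425 days)
       01  SECS-YEAR    PIC S9(9)  VALUE 31556952.
       01  YEARS-OUT    PIC ZZ9.99.
       PROCEDURE DIVISION.
           COMPUTE YEARS-OUT = MAX-INT32 / SECS-YEAR
           DISPLAY "32-bit Unix time lasts about " YEARS-OUT
                   " years from 1970"
           STOP RUN.

1970 plus roughly 68.05 years puts the wraparound on 19 January 2038.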
COBOL is still used for a reason: there is nothing to match it in efficiency and speed in many important fields. And COBOL does support graphical user interfaces... simply build a COBOL backend application with an API layer, call it from a front-end, receive a response, and present the data any way you desire :). When you log in to one of the larger banks and perform a transaction, for example, several real-time COBOL modules will be executed on a mainframe somewhere, and the data is then sent and presented to you via browser/app... I work as a COBOL/mainframe developer...
LOL what?
@@glee21012 laughing about things you don’t understand is not the flex you think it is.
Yeah! It's amazing how COBOL co-exists with modern languages and architectures; it does the dirty work well 😅
Anyone who says "efficient" and "speed" in the same sentence with COBOL doesn't know anything about speed or efficiency. Sorry, that's the truth, it is neither fast nor efficient. It's what was available at the time when financial programmers didn't like FORTRAN. There's a reason why its nickname is CROWBAR.
@@chribm Well, in reality I see millions of transactions daily passing through hundreds of COBOL modules on mainframes... It's extremely fast and efficient...
Why do you think 70%+ of all the world's business transactions run via COBOL on mainframes? If it wasn't efficient, it would never still be in use...
Obviously it's not the language alone; you also need the infrastructure.
Do you not think the biggest banks in the world can afford to recruit people competent enough to make intelligent decisions about tech?
COBOL is one of the first programming languages that I learned back in the late 70s. I used it to write inventory control programs on my company’s minicomputer. It was easy to learn, although not very compact. Your video reminded me of the fun times writing code on the special programming paper, typing it into the computer, then making the minicomputer crunch reams of data entered by our office staff. You also reminded me of the huge removable disks on which we used to save our data.
I am 78 and spent much (but not all) of my working career “back in the day” . (1968-2007) programming computers. My first language was FORTRAN, but 99% of my career was spent writing in 1401/1410 and 7080 Autocoder (maintained by “patching” actual object/machine code), ALC, COBOL (compiled and link-edited) , and JCL (along with GDG’s) to write countless modularized programs and systems. Some of my early (object/machine code) programs were “bootstrapped” from paper tape or punched cards to execute (there was no “operating system” then). How tedious and exacting everything was in the beginning. I’ve read and debugged many a core dump. Did I mention tedious yet? I loved the power I felt when I made that huge machine do exactly what I told it to. Now I happily tap on my iPad and use my PC marveling at how much easier it is to use a computer. “We’ve come a long way, baby!” Now instead of core dumps we get “the blue screen of death.” It was a great ride!
ALC=assembly language coding. Happy IKFCBL00 everyone!
@@ronaldlee3537 yes. And they need to replace that functionality, like, yesterday. Their dependency will only get more desperate the longer they procrastinate. We started years before Y2K to get ready for it, but back then we had a deadline. "Cracks" started showing up in the mid-to-late 90s because dated reminders generated for future events were not "rolling forward" into the 2000s properly.
Mainframe... COBOL, C, Assembler: the past is the future.
I'm 36 now. I started working in IT in 2011 as a manual QA tester, then learned automation testing, and then development: .NET Core/C#, Java, Ruby. When I started working for a contractor company in Monterrey, México for Citibank Banamex, we were doing a migration and integration from an AS400 mainframe system to new tech. An old man who was my boss told me, "This language and this tech are so expensive, you should learn them to earn a lot of money." I didn't believe it, until a couple of years later I saw the jobs with really high salaries.
Love your account of this. I started out with Autocoder on a 1401 in college followed by COBOL in the upper-level classes - and was hired by a major petroleum company to write COBOL for them. I moved on to other things in the IT area (data management, IT Auditing and DBA work) but it was a good, albeit stressful living. Burned out at 59 and took early retirement and haven't regretted it.
And that ride is nowhere near over.
I’m 66 and a retired COBOL programmer. AI should be able to update it now except for COBOL spaghetti code with non-standard magnetic tape processing and hidden calls to special routines. Yes it’s still there unchanged and untouched for 50 years. It sits there waiting to trap some naive AI or person attempting to update it.
AI, you said? Don't give us nasty ideas here, such as telling ChatGPT or some other AI chat bot to make a simple COBOL "Hello world!" program! 😂
AI couldn't even help me write a simple component test in JavaScript; I doubt it can rewrite entire software systems.
it's still hard for AI to create a program of a memory game without spitting errors every 2 lines
I'm not a programmer, but I imagine such an undertaking would be like opening a compressed can of worms...
LLMs will never be able to write effective programs for the simple reason that they are incapable of reasoning about the "code" -- tokens, really -- that they spit out. They are doing statistical inference on a corpus of code already written by human beings.
Think about that for a moment. There is no dynamic reasoning in statistics. None.
I am always amazed that anyone expects LLMs to do better. They are good for a very limited domain of things. But anything truly creative and constrained by logic and reason? No.
There are still quite a few of us COBOL programmers around. It wasn’t that long ago.
Yes... it was.
Long enough ago for me to have had 2 careers since leaving the bank data industry back in 1996, having worked in the COBOL, Tandem and JCL environment for 10 years. Spinning tapes and all; I left just as PCs were starting to interface as a front end. Long enough ago for me to do a 4-year degree in Graphic Design, work in that industry while also renovating houses, then become a builder full time for the last 15 years. To be truthful, I seriously considered taking it back up again over Covid lockdown but thought, jeez, so much time has passed...
Of course it wasn't that long ago. For the earliest of us COBOL programmers it was only, say, 3/4 of a lifetime ago. 😮
Much less than half a career ago for me. I was doing COBOL refactoring and modernization only about 9 years ago. I made some nice change over a very short project. When it is not broke, you don’t need to fix it, but you do need to maintain it to keep it making you money.
It would take four years with a team of 50 developers and 100 verification engineers to completely rewrite that COBOL code, and there is a non-zero chance they would screw it up. Paying someone to make minor updates in working, native COBOL to tap into data flows for new analytics is worth it many times over by comparison.
I work in local government. We still have plenty of systems that run on the mainframe, and those are typically COBOL. In fact, I know of jobs that are begging to be filled but they can't find anyone who will do COBOL programming
I was an IT professional until I retired. If it ain't broke, don't fix it... that is the motto. So the huge amount of COBOL out there will survive until it MUST be changed. I had to program only a bit in COBOL, thank god! The majority of conversion happens when a platform no longer supports something, like the DOS conversion at my company. When something goes extinct, then change happens. But anything new is where change really lives... and so we see C++, Delphi, Python, C#, etc.
And when you do systems programming on a mainframe.... you typically HAVE to use Assembler to access supervisor routines. So you do what you must do.
Nice video!!!! I enjoyed it a lot!!!
Two quotes: 1) "I know COBOL and I won't starve." C. Pitts, Control Data Corp., circa 1979. 2) "What runs on the mainframe? Civilization." Roberts, International Business Machines, circa 1987. Both statements are as applicable today as they were 40 to 50 years ago, from Shanghai to lower Manhattan and all points in between. Anyone who can code can pick up COBOL in about an hour (I did it on the Chicago NW line one afternoon). The institutions you listed aren't even the largest institutions depending on COBOL. Why COBOL? It still works and doesn't even need recompilation. Book-of-record applications usually require continuous availability, and those grandmums and nainai from Chengdu to Mumbai, Joburg, São Paulo and Des Moines expect those ATM and credit cards to work 99.999% of the time, even after a disaster, and that means COBOL running on big iron, even in Wai Guo Qiao, Pudong.
No. It's the cost of moving to new systems, and the fact that so few people still know COBOL and even fewer want to work with it. It's easier to contract a COBOL dev than to hire a team of them to work with the backend and applications teams, because no one likes the weird, old, hairy COBOL devs, and there are like 3.
Sir, I don't speak Spanish.
In 2000, Citibank reported that they had spent $500,000,000 on COBOL programming to fix Y2K problems. And they said if these weren't fixed, they would have "lost the bank".
@@m3talHalide-rt2fz If that's the case... then why aren't we on a new system that is better so we don't have to deal with the old, hairy COBOL 'grammers?
@@m3talHalide-rt2fz Fundamentally it's a "book of record" application world, limited by the number of hours in a day. Competing platforms fail to scale effectively (they lack the physical hardware to cluster with a single clock and a shared queue), and they lack commercially available continuous availability and reasonable DR. Book-of-record applications have to get through all their books in a day; every account has to be updated (there are some noncommercial exceptions). Big iron addresses this by scaling through shared queues, common clocks and shared storage.

Employing that same scalability technology, financial institutions literally stretch the processing out across multiple physical data centers within a single geographical region, eliminating the data center as a single point of failure. On top of that they layer nearly continuous recovery against regional disasters through DASD replication technologies, with automated recovery of operations in well under an hour. Other platforms simply can't compete with cost-effective off-the-shelf solutions on the basis of Recovery Point Objective and Recovery Time Objective.

Lastly, you could try to port some of these 50-year-old applications to new platforms, but where are the long-term savings? The platform you port to might be out of date by the time the conversion is complete; start over, and maybe this time we get it right? You'd be porting our most important applications onto a treadmill. The application code written 50 years ago runs just fine, completing the tasks that modern customer-facing applications direct. The small financial institutions that maintained their own data centers long ago outsourced those operations, where they run as a partition or a client VM, and the COBOL code happily does its work day in and day out.
COBOL was the primary language of my college days. I spent a number of years working in it, even on desktop applications. Of course no discussion about COBOL is complete without mention of its mother, Admiral Grace Hopper. A true giant in history.
In college I studied Applied Computer Sciences. In my first year, back in 1997, we had a course on COBOL. During lectures and practical exercises, I didn't really understand it all. When the first exam approached I started studying, and all of a sudden I had an epiphany and understood everything. Like in comic books, when a character has an idea and a light bulb is drawn above their head 💡, it was such a moment. I still remember it to this day; that's how special it felt.
Sci-Fi readers call that moment "GROKing". Look it up.
My "Ah ha!" moment was when I understood the method called "Structured Programming". Made things easy as pie.
For me, it happened as a sophomore in high school, where I was reading the manual for the 6502 in our school's new Apple ][ computers. It was like a bell rang and it all clicked. That summer I disassembled and hand-annotated Apple DOS, and a few years later I got a couple of engineering degrees, got a job in high tech and never looked back.
I did go back and thank the high school math teacher who let me just run with it.
For me it was in 1968, when I stumbled upon the RJE room where, without even knowing what a computer was, I helped students fix their Fortran programs just by reading the error messages. Leaving the RJE, I went to the book store and bought the 100-page (50 sheets) Fortran manual (written by an IBM engineer who said everything exactly once). I took it home on Friday, read it over and over, at least 3 times from cover to cover, and was a Fortran expert by Monday. Truly an Aha moment!
I graduated from A.S.U. in 1982. I was a programmer/analyst for 20 years. I used to know over a dozen programming languages (BASIC, COBOL, GPSS, SAS, DYL280, Focus, ISPF Dialog Manager, REXX, Visual Basic 3/4, Powerbuilder, Unitech, MUMPS, not to mention Syncsort, Merge, SQL, Excel), and knew my share of JCL (Job Control Language). COBOL was a great, easy to understand procedural language.
COBOL was "old" when I was in college. The biggest issue with COBOL systems is how a set of programs will share files. So knowing the language is only the first step; understanding the dependencies and interactions is the bulk of the problem (a rudimentary sketch follows below).
Could you please elaborate through a rudimentary example?
Yeah, I was going to say, I've seen textbooks on it in public libraries. It doesn't really look that hard to learn. Like I could probably learn it in a few weeks.
@@xenopholis47 some people I worked with spent over a year trying to reverse engineer 200 COBOL programs used for credit card settlements. There were so many reading from and writing to the same files under different circumstances that they were never sure they got all the interactions.
@@michaellatta yes exactly - VSAM / seq datasets, and not just the COBOL but also the JCL (and what disposition each job has the files open under), and anything else the JCL is doing to the data outside COBOL... (and that's just batch :)
@@johnlacey155 there are different operating systems, hence different JCL
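Since a rudimentary example was requested above, here is a minimal sketch (program, field, and dataset names all invented) of the hidden coupling: nothing in either COBOL source mentions the other program, so the dependency is invisible until you read all the JCL.

      * PROGRAM A (extract; names invented): appends settlements
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT SETTLE-FILE ASSIGN TO SETTLEDS
               ORGANIZATION IS SEQUENTIAL.
      * ... and in its PROCEDURE DIVISION:
      *      OPEN EXTEND SETTLE-FILE
      *      WRITE SETTLE-REC FROM WS-NEW-SETTLEMENT
      *      CLOSE SETTLE-FILE
      *
      * PROGRAM B, compiled separately, maybe years later, carries
      * an identical SELECT and does OPEN INPUT / READ against the
      * very same dataset. Neither source file mentions the other;
      * the only thing joining them is the DD name SETTLEDS in each
      * job's JCL, e.g.
      *      //SETTLEDS DD DSN=PROD.SETTLE.DAILY,DISP=SHR

Multiply that by 200 programs and every DISP= setting in every job, and you get the year-long reverse-engineering effort described above.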
I am retired now, but I started programming for a DOD facility in 1966. Over the years I've programmed in 37 different languages. I've never written a line of COBOL, however it is so easy to read that I've debugged a number of COBOL programs for others. I've also shown a number of programmers how to make their COBOL more efficient. The prime example was a new application which, in test, demonstrated that the nightly batch process would take about 10 days to run. While others were in a state of panic, I showed them how to fix two issues. That night, the test demonstrated that the nightly process could easily be completed in less than three hours. COBOL - easy to code, easy to misuse.
"Easy to code easy to misuse"? Surely that's the same for most software languages?
Yeah. One big problem I ran into maintaining COBOL programs was when programmers didn’t know any better and wrote sequential searches with large amounts of data. Some programs I modified went from running for 4 or 5 hours down to minutes. Wish I’d gotten bonuses based on how much I saved departments on their internal billing. I could have retired 25 years earlier, hehehe.
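For the curious, the classic fix for those runaway sequential searches is COBOL's own binary-search verb, SEARCH ALL. A minimal, self-contained sketch (all names and values invented): keep the table sorted on its key and SEARCH ALL does a binary search instead of walking every entry.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. BINSRCH.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * three 12-byte entries: 8-digit account id + 4-digit rate
       01  RATE-TABLE-DATA.
           05  FILLER PIC X(12) VALUE "000000010100".
           05  FILLER PIC X(12) VALUE "000000050250".
           05  FILLER PIC X(12) VALUE "000000090375".
       01  RATE-TABLE REDEFINES RATE-TABLE-DATA.
           05  RATE-ENTRY OCCURS 3 TIMES
                   ASCENDING KEY IS RT-ACCT-ID
                   INDEXED BY RT-IDX.
               10  RT-ACCT-ID  PIC 9(8).
               10  RT-RATE     PIC 9V999.
       01  WS-WANTED  PIC 9(8) VALUE 5.
       PROCEDURE DIVISION.
      *    binary search: the table must be sorted on its key
           SEARCH ALL RATE-ENTRY
               AT END DISPLAY "account not in table"
               WHEN RT-ACCT-ID (RT-IDX) = WS-WANTED
                   DISPLAY "rate: " RT-RATE (RT-IDX)
           END-SEARCH
           STOP RUN.

On a big table that is roughly log2(N) comparisons instead of up to N, which is the kind of change that turns a 5-hour run into minutes.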
I started coding in COBOL in 1979. We coded using a pencil on 80-column paper forms, which were sent to keypunch to convert from pencil to cards with holes in them. I remember my first computer had 4K of RAM and no hard drive; mass storage was a conventional audio cassette tape.
It is amazing how far we have come.
@cliffhaczynski6121 I have the same memories but with PL/1 rather than COBOL. I recall that a significant part of my day was spent "Desk checking", over and over!
That sounds like the Timex-Sinclair I bought as my first computer. I had the extra memory pack, and I bought the whole thing at a department store in their jewelry department for about $80.
Learning how to program with paper and pencil made me a much better dev than many of my peers. You have to think through so much before you punch those cards so the number of logic bugs you introduce are significantly reduced because the process cost is so high. If you have to do that 4-5 times you learn REALLY quick to get things as right as possible on the 1st pass.
I was in the first semester at my school that didn't have to do a punch card project.
The original IBM PC model 5150 from 1981, not the XT 5160, had an RCA audio cassette connector to store programs or files on audio cassette tapes.
0:28 And, they were repairable too 😂
Proud COBOL'er and still here. The oldest code I wrote that is still running in production was delivered in 1983 with a Y2K update in 1999. Good stuff.
Normally any discussion of COBOL mentions Grace Hopper, one of the inventors of the language. There is anecdotal evidence she was also involved with the use of the term "bug" and "debugging". I learned COBOL in 1980 and used it for most of my career.
The term 'bug' was already used in engineering, even Edison used it. Her instance is the first _actual_ living bug that caused an error, and that's why she wrote that journal note. IIRC that journal page is still being preserved.
@@timradde4328 It was a moth, as I recall from what I've read. I wouldn't be surprised if the story has been distorted and different versions can be found in the literature.
It was a moth. The machine in question, the Harvard Mark II, was built from electromechanical relays; the moth got caught inside one of them, jamming the contacts and causing a fault.
@@DrunkenUFOPilot Actually, Hopper's meticulously-kept notebook has a page with the actual moth that documents the event. The notebook is in the Smithsonian, so no embellishment.
Actually, that’s also a bit of a myth. Rear Adm. Hopper did not invent COBOL. She was for a brief time on the CODASYL committee, but really not for very long. Some of the syntax of COBOL is based on Flowmatic which she did design.
One of the biggest benefits of using COBOL is that it does EXACT decimal arithmetic (i.e., not double-precision floating point), which is a huge advantage in financial systems. You can write highly structured code that is really easy to read, almost self-documenting. But it's not at all suitable for web development, which is a huge drawback for most developers.
Floats should never be used for financial calculations. If necessary, should a language not provide fixed decimal arithmetic types, a whole number type can be utilised instead. Though a little extra work would be needed to display the correct values.
@@stewartkingsley Floats can be used, but not binary floats. Binary floats can't even represent 10 cents properly. IBM Machines of the Power and Z series have had 128 bit decimal floating point hardware for more than 16 years!!!. The precision is more than enough for any practical use.
One drawback to that is you need to make sure that the PIC allows for enough digits. If, for example, you are expecting a number that is 10,000 or more and you have a PIC 9(4) then your variable will roll over unexpectedly. I have accidentally created some infinite loops that way (and wasted a whole box of green bar paper...)
@@myofficegoes65 Yeah I had days like that in my early career 🤣🤣
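A minimal sketch of that rollover (names invented; should run under GnuCOBOL): mainstream compilers silently drop high-order digits to fit the PICTURE unless you ask them not to.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PICROLL.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-COUNT  PIC 9(4) VALUE 9999.
       PROCEDURE DIVISION.
           ADD 1 TO WS-COUNT
      *    the high-order digit is gone: 9999 + 1 -> 0000, so a
      *    loop testing WS-COUNT > 9999 never exits (hello, whole
      *    box of green bar paper)
           DISPLAY "after ADD: " WS-COUNT
      *    the defensive form makes the overflow visible instead
           MOVE 9999 TO WS-COUNT
           ADD 1 TO WS-COUNT
               ON SIZE ERROR DISPLAY "PIC 9(4) overflowed"
           END-ADD
           STOP RUN.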
The insurance company I used to work for relied on COBOL for the backend to its web based customer and employee facing portal. COBOL also handled all the batch processing of creating documents and renewals.
I'm 54 years young and commenced learning how to code in 1999 with Pascal (enjoyable), then COBOL (nearly broke me), then onto C (once I got my head around pointers, LOVED it)... I'm now a sysadmin, network engineer and ICT all-rounder... except I no longer code (less a bit of bash scripting) and I really enjoy my work because I'm not a coder. Excellent video also BTW!
I was an accountant at an insurance company in 1989 when one day a flyer landed in my inbox about an entry-level programmer training class. I went to the presentation, took a test, and passed an interview to land one of the spots. I spent the summer learning COBOL, VSAM, and JCL, and was a mainframe programmer for the next eight years. I left my COBOL days behind when I took a job at a bank and worked on applications supporting their online banking. But I remember my COBOL days fondly.
Loved VSAM; it beat the heck out of straight sequential data organization... What I love even more is that it is totally irrelevant today: our current storage systems use much more complicated strategies but hide them under an abstraction layer that makes data access so, so easy!
I learned COBOL in 1981. Like riding a bicycle, never forget it.
Same, I just can't concentrate long enough to be truly effective at it. NetWare saved my bacon back then. I still run 3.12 in Vbox for shits and giggles.
Agreed. I learned COBOL in about 84/85 but was never a professional programmer. I've just picked it up again now and it's amazing how it all (mostly) comes back!
I enrolled in a University computer science program in 1982. One of the introductory courses was working with Cobol, which was programmed into a machine using punch cards. Fortunately, I found computer science, that is, not the programming, but the actual science behind it, to be so fascinating, that I did not let the horrors of Cobol and punch cards turn me off. I graduated in 1985, and to this day, I still think that computer science is one of the most interesting sciences that we have, it is so generalizable, that it can be applied to all the other sciences in various ways to improve them. After all, everything boils down to information and algorithms.
Unlike riding a bicycle, people try to forget it. /wink
Learned to ride a unicycle as a teenager, and how to program COBOL in 1990 after riding my unicycle across campus to class. My transcript includes: PASCAL, COBOL, FORTRAN, & C++ as languages because having a programming class account got you mainframe access (VAX) when the user count was getting too high and kicking personal accounts off the system.
Long live COBOL and COBOL programmers. Make these greedy corporations pay for your skills. Don’t sell yourself short.
So happy the algorithm brought me here. Great content; wow, I was blown away by how much I learned in the first video. I'm starting to think the A.I. overlords really understand me; I think I could be happy as a battery. Anywho, side bar: I think I'm hooked. Keep doing what you're doing. One Love.
Also, in 1959 the department of defense probably owned 99% of all the computers in the world. So if they didn’t do it, nobody would.
Wellllllll not quite 99%... remember, the IRS was also using it (and that was true for most of the Western world). But sure, they would have owned the overwhelming majority of all computers.
Maybe 99% of computers in the USA - the UK government and banks had plenty too, even Universities had them by then.
Also, the DoD has huge administrative functions, between payroll, facility maintenance, etc. The lifeblood of the DoD is money.
Coulda fooled me, when I joined the Army in 90 it sure seemed like everything was still run on handwritten forms! I think there was a computer in the shop office.
"Also, in 1959 the department of defense probably owned 99% of all the computers in the world."
Hardly! Not even close.
Universities were the big owners. And by 1959, it was slowly starting to spread into business overall.
I spent 10 years on the computer side of the banking industry, working with ATM's and branch automation back in the 80's. I was a systems person, so I principally worked with Assembler, but pretty much every program that comprised those systems was written in Cobol, so we had to know what those programs were doing and how to dig through the dumps when things went bad. I spent a 40+ year career in IT and never wrote a Cobol program after I left college. It always felt too cumbersome and bureaucratic for my blood, but I always understood that because it was handling money, there was never any room for mistakes. A lot of my work back then was about fixing corrupted transaction data that slipped through the cracks.
We definitely couldnt make mistakes. My entire career was fixing the mistakes that definitely never happened.
"...how to dig through dumps when things went bad." This. This is the thing that is needed. Anyone can program Cobol. Far fewer can dig through dumps to backtrack to the line of Cobol code.
Knowing how to program in Cobol can make a career. It can also be a career-killer as no one will promote you - they need you where you are, handling the Cobol job that no one else can easily fill. Will the pay for this irreplaceable person compensate for this state of affairs? Probably not. The trick is to recognize the situation, then have the courage to move on.
It was written in the days when people were trying to make computing as simple as possible. The idea was that since it was basically a constrained form of English, even managers would be able to write their own simple queries and so on. Then people brought out things like Easytrieve for manager's reports and eventually they realised managers couldn't learn to program under any circumstances.
I can remember programming schools where they would take in anyone who passed an aptitude test and teach them COBOL in three months and guarantee them a job.
One of the great things about COBOL is the arbitrary precision of its numbers, especially in decimal: you can accurately represent numbers with, say, 18 digits before the decimal point and 10 after, which makes financial calculations work so much better than trying to lever them into a LONG or a FLOAT.
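In PICTURE terms, such a field looks like the line below (name invented). One hedge: classic COBOL caps numeric items at 18 digits total, so 18 integer digits plus 10 decimals needs a newer compiler, e.g. Enterprise COBOL with ARITH(EXTEND), which raises the cap to 31.

      * exact fixed point: 18 integer digits, 10 decimal places,
      * stored as packed decimal; no binary float anywhere
      * (requires extended-precision arithmetic, e.g. ARITH(EXTEND))
       01  WS-BIG-AMOUNT  PIC S9(18)V9(10) COMP-3.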
Some of the new languages have retreated from the concept of easy and some of the new features of C# and Java are probably only usable by people with degrees in software engineering.
Probably not even by them (points at self)
I don't see many developers who have even heard of software engineering 😂 They followed a multiple-choice learning path and got a certificate, nothing more.
Yes, it took languages like C# and Java a long time to realize the importance of decimal numbers. Before then, I saw a host of languages offering only integers and floats. It seems weird, but if one had to rewrite an existing COBOL module in Java, one might discover that COBOL does quite a bit behind the scenes.
That's very true, COBOL really "cracked" the decimal/number-manipulation issue though it felt extremely verbose in data definitions. My prog. "school" was 6 weeks only, no job guaranteed but I got one, & thankfully got out of COBOL ASAP. Into a different environment which was great for 20 years, then redundancy & my IT career ended. Application development had gone to India. If I'd stuck with COBOL I would have been OK financially, but perhaps driven crazy.
@@marcuswilliams3455 You said "Yes, it took languages like C# and Java to realize the importance of decimal numbers" - I beg to differ on your wording, they perhaps rediscovered the importance of decimal numbers. Otherwise, you're spot on.
I want to commend you on correctly explaining so much about COBOL. I started writing programs in COBOL in 1974. I also appreciate how upbeat you were at the end. I have several disagreements with some of your statements, but I am sure someone will have pointed them out in previous comments. The one I would like to address, though, is your statement about documentation. You are assuming that 50 years ago it was the same as today. It was not. Back then, we had Documentation Standards. A really smart company would not only allow time for the development team to document their system (at the business, program, and data structure levels), but would give us time to update the documentation with every change we made. And our IT management would enforce the standards. In my last job (I retired in 2017), no one I knew of was using documentation standards. I was working for a software development company in banking. I once asked a programmer for the documentation he used for developing a program module so I could help him fix a bug that we were having a hard time with. He had no idea what I was talking about!
In the 90s I worked for a major insurance company that was still using COBOL on mainframes for their backend stuff, actuarial tables, transactions, logging, etc. In the almost decade that I worked there they had 3 times where it was announced they were replacing it. Each time they tried to start that process, they gave up and just made a fancy new GUI that interfaced with the old systems. Now, 25+ years after I started I still have contacts there and it's all still chugging away on COBOL.
Don't know if it is true today, but the US Ag department was still using Cobol programs in 2010. This was clearly "if it ain't broke don't fix it"
I have worked in the insurance industry for 20 years. There are still a lot of mainframe solutions with COBOL (or PL/I, Programming Language One; look it up) hanging around. They work and are still extremely efficient.
I worked around old COBOL programmers in a very large financial company. I asked them why COBOL was still used when most people thought it was dead or never heard of it. They told me the advantage was that it was very stable and hard to change. Because of this accidents (fat fingers) didn't happen. The company advertised internally to train employees in COBOL and gave raises to those who took the training.
My uncle was a COBOL programmer who back in the day was one of the main guys programming the EPA computer system, and for the rest of his life he was the main guy maintaining the system. Even after he retired he kept getting calls to come and fix issues, but he passed away two years ago and I’m not sure who updates the systems now.
Ironically, he was a libertarian and therefore not a fan of the EPA, but they were the main customer of his business throughout his career and he spent his life keeping their systems up and running.
Wow my dad used to program COBOL but he sort of just fell off the tech wagon and now just works a minimum wage job 🤷🏻♂️
I'm 62. COBOL programmer from 1983-86 in IBM 370 and 4331. Then moved to Sisinf 4GL, Sybase RDBMS and its 4GL, then moved to UNIFACE IDE that can access legacy storage format and allow modern UI, then Java Swing, and now Python. Still learning.
I became a COBOL programmer in 1975 thanks to IBM’s self-teaching manuals. Then I spent the next 20 years making a living at it. Difficult? No. Verbose? Yep. But its verbosity is one of its advantages. Well written structured COBOL should be self-documenting.
I'm an old COBOL programmer and I love it. You could write code that your supervisor could read using 88-levels. Was writing object oriented code in the 80's.
I prefer writing code my supervisor can't read.
Hurrah for COBOL!! I was a COBOL programmer for years 😀
me too, but now I am a python programmer 🙂
An interesting and very well presented topic. In the early 90s, on the daily commute to London, a regular fellow traveller was a COBOL programmer for a major corporation who, in spite of having reached retirement age, was asked to stay on and carry on working simply because no replacements could be found in the job market. Thirty years on, and it looks as though things haven't changed much.
A lot of us can program in COBOL, but never admit it. The reason it is still around is that it was straightforward and did exactly what it was supposed to. No more, no less.
COBOL is in the client server world too. Peoplesoft still uses it...
BTW, my Dad learned COBOL from a certain Commander (later Admiral) Grace Hopper. He was also an assembler programmer...with the War Department, later DOD, then IBM then GEICO.
Admiral Grace Hopper was a woman.
Grandma COBOL is a legend. And her first documentation of a computer bug.
This is not quite true - it was a common term by the time she discovered that "bug" - the joke was that the bug was caused by an actual bug. Not that the term bug was derived from this occurrence.
@@zoeherriot Thomas Edison also found a 'bug' -- a squashed insect in a telephone relay which prevented it from working properly. He wasn't the person who invented the first debugging hardware, however -- that would go to whomever invented the first insect screen. That said, we all remember Adm. Hopper's bug.
@@zoeherriot Quite correct; we know that it was a common term because of how she described it: "First actual case of bug being found."
It was the first actual case of a computer bug (error) being caused by a literal bug (insect).
Why would she write that unless errors were already called bugs?
Also heard the Turing machine, Colossus, crashed when a moth shorted out two valves; another anecdotal/legendary origin for the term "bug". That was the 40s, before the USA had even built a programmable electronic computer.
@@wolf5370 I've heard that, not sure how apocryphal it is.
Btw, probably want to avoid calling anything "the turing machine", as "Turing Machine" is a specific important concept in computer science.
I'm 86 and started programming in 1962 on an IBM 7070 for the State of New York using the language called Autocoder. After a few years I was recruited by General Dynamics, Astronautics division in San Diego where I worked on the Atlas-Centaur missile program where we had an IBM 7074; still using Autocoder. It wasn't until 1975 that I worked for Great American Insurance Co. where we made the transition to COBOL. Subsequently, I worked in ten other countries, and in the latter years as IT manager. COBOL was the backbone of my work until the mid 1980's when several databases took over.
Sounds like the "good old days" for sure!
I'm 84 and spent decades coding in COBOL, starting around 1963. There were certain problems that made me wish I was still writing in Autocoder.
I'm 36 and I'm learning Cobol.
I'm only 83, but my first hands-on encounter with a computer was in 1965 with an IBM 7074 which had no mass storage and used half-inch magnetic tapes for batch input and outputs. A separate 1401 mini-computer had mag tape drives and used a high-speed card reader to create batch input tapes to feed the mainframe. The 1401 also processed mainframe output tapes to line printers and a card punch, which was the other half of the card reader. Incidentally, the 7074 also had a one-at-a-time card reader at its operator console which was a re-purposed manual card punch and was useful only for last-minute run-time data and program options.
I was assigned to be a computer operator and it wasn't long till I found the computer manuals and learned how to program both computers (my favorite was the 1401 'cause it was so versatile and easy to program via cards.) Shortly thereafter, I was reassigned as a programmer and soon learned Assembler, Fortran and COBOL and wrote many programs in those and several other more exotic languages. Actually, COBOL was the easiest, but Fortran is my lingua franca and I can still dream in Fortran!
@@terrycureton2042 I've been told that because of the limitations of the hardware the programmers back then had to write elegant, efficient programs and that modern programmers write sloppy bloated programs. Did you see that happen over time? (Idk a single programming language.)
Great video. I am surprised you didn't mention Grace Hopper, the female coinventor of COBOL
This takes me back! This was one of the first languages I had to learn! Since I never worked in a financial institution, the language faded away in my memory. COBOL is still used today and follows the old adage, "If it ain't broke, don't fix it!" I heard a few years ago that COBOL programmers could earn excellent wages for their knowledge of the language, since there are so few out there.
I think *"...had to learn..."* sums it up. No-one would do it other than for money/career. Whereas I *wanted* to check out BASIC (boo, hiss) because you could do fun stuff quickly & easily. 43 years ago, that is
COBOL is a compiled transactional record oriented language. Python is a general purpose interpretive language. Huge difference when it comes to accounting. It's alive because it thrives in business.
We're not all dead yet. Just because COBOL is old, and those of us who know it are perhaps older; We are still among you. As an aside, I wrote the algorithms that secure your banking pins on the smartcard in C. So those will last a bit longer.
C, not C++, just plain old C, is going to be around just about forever. The effort to deploy a modernized universal embedded systems language, Ada, basically failed. There were many reasons, but basically its advantages just weren't advantageous enough. C remains the very best language for embedded critical systems.
Thank youuuuuuu!
This was great, thanks Dee.
I'm a 63 year old man who recently retired from public service. A good part of my career was spent developing mainframe applications using procedural programming languages such as COBOL and NATURAL. As I had set out to retire, the organization I worked for was still in the process of trying to upgrade their applications to a more modern, object oriented platform and finally making some headway in doing so after many years of trying and going nowhere. My experience is, there is a tendency for managers to blame issues related to maintaining and upgrading the existing applications on the (perceived to be) antiquated mainframe platform, when in fact most of the time these issues arise from poorly documented systems and staff turnover among the subject matter experts. As a result, all too often the organization's newbies have to resort to taking a "black box" testing approach in an attempt to determine what the existing applications do and how they work. And all too often it's a thankless job.
@@alann.7976 The manager assumes you can magically convert the COBOL systems into any modern language just by looking at the code. Sadly this is very difficult because of missing documentation and many, many modifications!
I think a feature of COBOL that is often forgotten is that it didn't use floating commas for calculations, avoiding 'rounding bugs' and strange (but very expensive) things like that.
No. Misunderstanding.
In English, it's called "floating point". The trouble is exact calculations are very computationally expensive -- they take a lot of CPU power. Years ago, I heard that Bloomberg had special FPGA-based PCI cards to do 100-digit decimal calculations. FTSE just used the floating point hardware in normal CPUs, accepting that they would sometimes lose shares. I don't know what languages they use.
@@eekee6034 Floating point is MORE computationally expensive than fixed point.
It's not even CLOSE.
There's a reason that Floating Point Processors were an EXPENSIVE add-on for decades.
I had a bad experience with floating point and a financial database. I was new and clueless back then. 😅
I once had to deal with a COBOL based financial system that saved and calculated big numbers in floating point. When changing to another system, the internal floating point storage format changed to IEEE standard, and rounding differences appeared between the 2 systems.
Some Fortran-coded manufacturing systems used floating-point numbers a lot as well.
My favorite memory of the 80s was modifying a PL/1 program and discovering a bug in the compiler. Whenever a specific constant was coded (i.e., NUM1 := 356.7289) I don’t remember the exact number, the compiler generated the wrong binary code, so that any computations would give the wrong answer. I reported it, but I got transferred to a new project, so I never followed up on the fix.
COBOL is really easy to understand. Nobody wants to learn it only because it is so damn boring. But I am sure with enough financial incentive many people will make an exception and learn it to make bank.
Standard COBOL is easy; old COBOL from the 1970s and earlier can be a dystopian nightmare.
@@raybod1775 not really.
It's boring because it is only for... business. Not exciting for most young programmers who prefer the latest hype and I don't blame them.
@@raybod1775 so the problem is not COBOL but the spaghetti legacy code. PHP, Java or Python can be a monstrosity in the wrong hands...
@@raybod1775 This applies to COBOL before 1968 and undisciplined programmers. In any case, it is an easy-to-use and really powerful tool for working with mass data.
I am retired now but dealt with COBOL for over 40 years along with other languages. If you have good logical thinking COBOL can be your best friend. Even now IBM COBOL can be object-oriented, if that is what you desire. I found structured COBOL programming more to my liking.
I have 55+ years of experience in programming, and even though I could be retired I still enjoy the work. I still do some COBOL and IBM Assembler, but my favorite mainframe was the RCA Spectra - so far ahead of IBM in many ways. I had the immense honour of meeting Admiral Hopper - what an amazing lady!
High five WHITEY!!!
As a COBOL programmer, I approve this message.
Great post! I am a retired software developer who began my career as a COBOL programmer for the Brazilian IRS in 1980. Throughout my professional years, I had the opportunity to learn a myriad of different programming languages, including Natural 2, PL/SQL, Visual Basic, JavaScript, PHP, and Python, among others. However, I hold fond memories of my time working with the mother (or, if you prefer, the old aunt) of all those modern languages. I used to say in my COBOL classes that you only need to learn seven basic commands to solve 99% of your programming needs (a guess at such a set is sketched below); indeed, COBOL is a truly simple language.
As the Brits often say: "The Queen is dead! God save the Queen!"
Greetings from Brazil. :)
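The comment above doesn't name the seven, so the following is only a plausible guess, as a minimal runnable sketch (all names and values invented): MOVE, COMPUTE, IF, DISPLAY and PERFORM appear in the code, with READ and WRITE, which need a file, mentioned only in the comments.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. SEVEN.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-AMOUNT  PIC S9(5)V99 VALUE 123.45.
       01  WS-TOTAL   PIC S9(7)V99 VALUE ZERO.
       01  WS-LIMIT   PIC S9(7)V99 VALUE 100.00.
       PROCEDURE DIVISION.
       MAIN-PARA.
      *    verbs 1-5: MOVE, COMPUTE, IF, DISPLAY, PERFORM
      *    (verbs 6 and 7, READ and WRITE, need an FD, so they
      *    are only named here)
           MOVE WS-AMOUNT TO WS-TOTAL
           COMPUTE WS-TOTAL = WS-TOTAL + WS-AMOUNT
           IF WS-TOTAL > WS-LIMIT
               DISPLAY "limit exceeded"
           END-IF
           PERFORM SHOW-TOTAL
           STOP RUN.
       SHOW-TOTAL.
           DISPLAY "total: " WS-TOTAL.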
I read some other comments and it reminded me of how my mom had to find, unscramble, and correct errors and just bad code. She had to print it out and go through it line by line, delete the bad parts and rewrite them, then test run her updated routines and make sure they worked. It was hard, and she was the only one at the bank that could do it. Wow.😮
I graduated in 1969 with a degree in history. Yet during those college years I learned Fortran, SNOBOL, Penelope. That is how I made my college money. I ended up within Ma Bell, trained in BAL and COBOL in 1970. The business focus was capturing all the transactions in a business and filling those gorgeous databases.
Understanding and codifying the flows and stores of a business was done with COBOL. It was the tool we had. The pyramids were built with big blocks by processes we are still trying to understand.
We are still processing transactions and filling those wonderful databases. That language will be with us. Forever. 😇
My killer skill was JCL. Making a 155, a 165, a 195 all sing - what fun.
JCL -- uh boy.
After that there was IMS. That's the straw that broke my camel's back.
Then came CICS :) :) :)
... are we having fun yet?
But then came SQL ! ! ! ! There WAS a G*d in heaven!
@@noneagoogleuser4443 Ah, JCL, home of the buggiest program ever written (per line of code): IEFBR14, the "no action" program where all the work is done in DD statements. The problem was that JCL expects a completion code, which the original single-instruction program didn't set, so if the previous step had an error condition it propagated to the next step. The fix was to set the return code to 0 first, which doubled the size of the program (from one instruction to two).
COBOL is the epitome of "If it ain't broke, don't fix it!" - in fact, its success runs way *WAY* deeper than is readily apparent. Not only is it perhaps the most scalable programming language ever conceived, but it's quite probably the most secure language to use for financial purposes.
To begin with, being a task-specific language (as opposed to a general-purpose language), there simply aren't very many vulnerabilities which can be exploited. In this regard, it's like playing basketball with a concrete backboard - the ball will rebound, period!
Also, because it was written for systems with kilobytes of RAM and single-core CPUs running in the 2-4 MHz range, it is both extremely powerful and exceptionally lightweight. And because it's also immensely scalable, it's nearly as efficient at managing trillions of financial records as most modern languages are at managing millions.
Last but not least is the Technology Rift, which is actually a huge benefit for COBOL. Much the same way that the best anti-theft device for a car is a manual transmission (because 99% of car thieves can't drive a stick), COBOL's relative obscurity makes it virtually impenetrable against outside (or even inside) attacks. Add to this the fact that every system is almost-fully bespoke and has been maintained for decades with minimal documentation, the same 'downsides' which make things difficult for people who are *supposed* to have access make things virtually impossible for people who aren't supposed to have access.
When one considers all these factors - particularly in regards to how sensitive the data that's being handled - it's difficult to argue that these systems should *ever* use anything other than COBOL.
...Did you just unironically argue that security through obscurity / obstructionism-to-access is a _good idea?_
@@wasd____ Are they wrong?? Personally, I don't think they are...
@@DaveC_TN Yes, they're wrong. Security through obscurity isn't a proactive security measure that actually does something to prevent intrusion. it's just "Gee, I hope no one puts in the work to figure out the puzzle." But people who are interested will _always_ solve the puzzle eventually.
@@wasd____ it is a very good idea - the most secure place is the one no one knows about. In fact, if there were a way to prevent people from ever finding out, you could be safe forever.
@@wasd____ That doesn't make it a bad idea, it just means it's not completely safe - but very few things truly are.
And the puzzle doesn't always get solved - for a splendid example, see the Navajo codetalkers in the Pacific theatre of WW2.
Dee! Props (and L&S) from a fellow ZAR 🙂
Love your work
COBOL was the first language I studied officially, and the first language I used in the workplace
I'm not from IT but worked at a bank with product management. It's true that the codebase is mostly a black box for most of the employees, and they need to be extra careful with systems updates because any change could stop other processes that relied on it. There's no documentation, and developers most familiar with particular systems are the ones who worked on it longer. It's even weirder that development is outsourced. I assume that the company holding all the knowledge about the systems can name their price. There are stories like the one developer that knew a system left for another job, and was begged to come for a visit from time to time to help fix something.
But on the other hand, a few years ago the accounting system was rebuilt from scratch in SAP. It took 5 years until it was completely switched and the first few months were absolute chaos. A few millions in unreconciled entries were just forgotten about, probably because they just gave up trying to figure them out.
Rewriting an old system might just be too complex, and companies will only do it, I think, if not doing so gets them in legal trouble.
From COBOL to SAP, wow. Talk about out of the frying pan into the fire. It takes about 5 years to write Hello World in SAP, so no surprise there.
But don't go for Python or JavaScript for large critical systems. You can't create quality with testing, you can only improve existing quality with testing. Stick to proven languages with type safety and memory safety like C#, Java or Rust nowadays. (I know, SAP uses Java, that's not the point)
@@mennovanlavieren3885 SAP looks complicated to develop for. But they nailed the corporate pitch (governance, compliance yada yada).
@@mennovanlavieren3885 Yes, I was puzzled that Dee compared COBOL (COmmon Business-Oriented Language) to Python, when Python is a scripting language, much slower than, say, C or C++ or whatever the current equivalent is. The first two are not only well-tested but have lots of people who are well-versed in them. (BTW, JavaScript is also a scripting language, famously prototyped in about 10 days.)
SAP SUCKS
"There are stories like the one developer that knew a system left for another job, and was begged to come for a visit from time to time to help fix something."
That'd be me with the Chinese Liaoning Province inter-bank ATM switch, based at the People's Bank of China in Shenyang! I used to love those trips in the 90s
I will move COBOL from "obsolete skills" section to "strategic skills" on my resume!
I have a friend who is still a highly proficient (but semi-retired) COBOL programmer and, like me, now in her 60s. She still comes out to play when her former customers need COBOL changes, and makes a lot of money. When the Y2K panic came along in 2000 she was in big demand: most old COBOL-based software systems were never designed to go through a millennium change (00). She made a fortune upgrading COBOL-based systems for many big companies, banks etc, and got rich very quickly. She was quite busy for a year or two!! It is not really any harder to learn than any other language.
To the readers: before 2000, lots of systems stored dates with only two-digit years, in various formats (year, month, day; day, month, year; etc.), and used a windowing trick like: if the year is > 90, put 19 in the century column, otherwise 20.
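For anyone who never saw that trick spelled out, here's a minimal sketch of the windowing logic (GnuCOBOL-style; the field names and the pivot of 90 are illustrative - real shops used all sorts of pivot values, which is part of what made Y2K remediation such detective work):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. Y2KWIN.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> two-digit year as stored in an old record
       01  WS-YY    PIC 99   VALUE 95.
      *> four-digit year after windowing
       01  WS-CCYY  PIC 9(4).
       PROCEDURE DIVISION.
      *> pivot at 90: 91-99 become 19xx, 00-90 become 20xx
           IF WS-YY > 90
               COMPUTE WS-CCYY = 1900 + WS-YY
           ELSE
               COMPUTE WS-CCYY = 2000 + WS-YY
           END-IF
           DISPLAY "Windowed year: " WS-CCYY
           STOP RUN.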
I found this content unusually interesting because it is not really about coding but about the intricacies of how code influences or impacts our daily lives without people's knowledge. Shout out to Dee for that ingenious twist and creative approach to the coding world.
"If it is not broken don't fix" is the rule.
just replace it
@@tms2566 🤣
@@tms2566 There's WAY too much of it to replace all in one go.
The problem is that the law changes, accounting requirements change, new payment systems are needed, new features are needed, on some level it's ALWAYS broken against current requirements.
@@dlbiggins COBOL code is regularly updated to match new regulation, and the language itself has had several new standards since the original COBOL-60, the latest being COBOL 2023.
I heard that long ago people were able to build cars that still run today, construct buildings that still stand, and so on. It's not only about software.
Perhaps "the world has moved on".
Somewhere around 2000 was the peak of long-term reliability for Honda and Toyota.
They came to realize their cars were lasting TOO long, costing them repeat sales, and started deliberately making them not last QUITE so long.
I was born in 1960 and started my IT career in 1984 as a Cobol programmer for a bank. Today I program with Python (but no longer for a bank). Can only confirm everything you said. The smartest thing I've heard about COBOL in recent years.
You really got a knack for choosing your languages, haha.
I won't touch Python even if my career depends on it, it's so bad... it's 10 steps back.
@@TheEVEInspiration A pity so few people agree with you. There is something mystical about Python. I have yet to understand what exactly makes it so popular. Perhaps that's the whole point, really: nobody asks, and everybody just assumes that there _is_ a reason for its popularity...
I got my degree in business computer programming. I studied 14 or 15 languages but specialized in COBOL. It served me well during my 29 years in IT. The other language/system I was proficient in was AIX UNIX; both very powerful. Thanks for the video.
At 3:19, your Python does not match the COBOL. You don't print the first_number before adding to it.
🤣
Didn’t take long to find the COBOL programmer 🫡🫡🫡
@@codingwithdee Another one here. No need to worry, many of us can PERFORM SAVE-THE-WORLD when needed.
It would have been more informative to translate and compare trivial Cobol code doing exact decimal arithmetic into Python. 🙂
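Seconded - even a toy example makes the point. A sketch (GnuCOBOL-style; the names are invented): WS-PRICE below is fixed-point decimal, so adding 0.10 a hundred times yields exactly 10.00, whereas a Python float sum of the same thing comes out as something like 9.99999999999998.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. DECDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *> V99 means two implied decimal digits - exact, not binary float
       01  WS-PRICE     PIC 9(5)V99  VALUE 0.10.
       01  WS-TOTAL     PIC 9(7)V99  VALUE ZERO.
      *> edited picture so the DISPLAY shows a decimal point
       01  WS-TOTAL-ED  PIC Z(6)9.99.
       01  WS-I         PIC 9(4)     VALUE ZERO.
       PROCEDURE DIVISION.
           PERFORM VARYING WS-I FROM 1 BY 1 UNTIL WS-I > 100
               ADD WS-PRICE TO WS-TOTAL
           END-PERFORM
           MOVE WS-TOTAL TO WS-TOTAL-ED
           DISPLAY "Total: " WS-TOTAL-ED
           STOP RUN.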
Can someone explain this to me? The statements seem to be in the same order. Is there a difference in what "ADD 20 TO FIRST-NUMBER" followed by "DISPLAY FIRST-NUMBER" does in Cobol vs. the Python += and then print()?
@@The_1ntern3t The output in COBOL:
  Here is the first Number
  8
  Let's add 20 to that number.
  28
  Create a second variable
  30
  The result is:
  58
The output in Python:
  here is the first number.
  Let's add 20 to that number.
  28
  create a second number.
  30
  the result is: 58
The difference: the COBOL program DISPLAYs FIRST-NUMBER (the 8) before adding 20 to it, while the Python version only prints the value after the addition, so the initial 8 never appears.
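To make that concrete, here's a rough reconstruction of the COBOL side (not the video's actual source - the names and PIC clauses are guesses). The line that matters is the DISPLAY of FIRST-NUMBER before the ADD:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. ADDDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  FIRST-NUMBER   PIC 99  VALUE 8.
       01  SECOND-NUMBER  PIC 99  VALUE 30.
       01  WS-RESULT      PIC 99.
       PROCEDURE DIVISION.
           DISPLAY "Here is the first Number"
      *> this is the DISPLAY the Python version skips
           DISPLAY FIRST-NUMBER
           DISPLAY "Let's add 20 to that number."
           ADD 20 TO FIRST-NUMBER
           DISPLAY FIRST-NUMBER
           DISPLAY "Create a second variable"
           DISPLAY SECOND-NUMBER
           DISPLAY "The result is:"
           ADD FIRST-NUMBER TO SECOND-NUMBER GIVING WS-RESULT
           DISPLAY WS-RESULT
           STOP RUN.

(Strictly, a plain PIC 99 field DISPLAYs with a leading zero, "08", so the real program presumably moved values into edited fields before displaying.)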
Before learning COBOL I learned 1401 machine code, then Autocoder, Fortran, and finally COBOL. That was in 1966 as I recall, on an IBM 7040? It was nice reading the comments from others around the world who are sharing these memories after so many years. My last project, in 1995-1998, was COBOL/CICS/DB2 at a company with approximately 7,000 users. Then, in the time leading up to Y2K, we re-implemented into Unix/Oracle-based systems with X terminal emulators running on PCs. That transition would not have been possible without a 4GL to help migrate all the transactions and background processing into 21st-century technology. The bonus, of course, was giving the users friendly workstations after years of being tied to a 327x user interface. The conversion also cleaned up things like the interface with legacy financial systems using Oracle Transparent Gateway. Yes, there is a lot of COBOL code running in the big systems to this day, but without the new technology our modern society would be unable to afford the conveniences of access and rapid high-volume processing that we take for granted every day.
Learning any programming language is easy IF you are taught properly, learning the concepts of programming and structured code first. After that, all you need to know is the syntax. I have coded in BASIC, COBOL, RPG II & Assembler; I've used JCL, DCF, hand-coded HTML pages, and looked into several others. There are SO many similarities that it is almost stupidly easy to pick up a new programming language. Any competent programmer should be able to pick up the basics in no time.
Writing any new language idiomatically and efficiently tends to take a good while longer though. As the old joke goes: "real programmers can write FORTRAN in any language."
Yes, this is true as long as the language shares the same general structure. Try coding a GPU or some of the newer processors without a smart translation layer, writing ASM for an IBM multiple transmission processor, or a device driver at Ring 0. The instructions can be learned, but the OS/hardware environment makes it harder to operate correctly.
Don't learn computer languages, learn computer _science._
If you learn a language, you're just a code monkey who knows one language.
If you know computer science, you know _every_ language if you can read a syntax guide and some library documentation.
I'm teaching myself COBOL for the pure joy of it. My first programming language was FORTRAN ANSI 77. My second was MS-DOS BASIC. It will be fun if I'm able to monetize my exploration of learning and eventually mastering COBOL. As a fun departure from COBOL running on big iron, I'm learning to rock that code on a Raspberry Pi 5 ARM-based SBC.
COBOL is a solid, no frills programming language. It does its job efficiently and reliably. It handles vast volumes of data readily. Much of the "new" programming seems to be more about "whizz-bangs" rather than getting the job done. There's a reason MS and others are called "bloatware". COBOL, Fortran and the like are all about getting the job done, not making pretty pictures and sounds.
COBOL is not difficult to learn. It's just that it is very limited, so to achieve certain things you need to write a lot of code. COBOL is most suitable for record processing - read a record (like a credit card transaction) from one of the many inputs, do something with it (like increase the balance here and decrease it there). Once you know PERFORM UNTIL and how to format records, you're 50% there :)
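For anyone curious, that read-a-record, process-it loop looks roughly like this (a sketch - the file name and record layout are invented):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. TRANLOOP.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT TRANS-FILE ASSIGN TO "trans.dat"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  TRANS-FILE.
       01  TRANS-REC.
           05  TR-ACCOUNT  PIC X(8).
           05  TR-AMOUNT   PIC 9(7)V99.
       WORKING-STORAGE SECTION.
       01  WS-EOF          PIC X VALUE "N".
           88  END-OF-FILE       VALUE "Y".
       01  WS-TOTAL        PIC 9(9)V99 VALUE ZERO.
       01  WS-TOTAL-ED     PIC Z(8)9.99.
       PROCEDURE DIVISION.
           OPEN INPUT TRANS-FILE
      *> priming read, then loop until the AT END condition fires
           READ TRANS-FILE AT END SET END-OF-FILE TO TRUE END-READ
           PERFORM UNTIL END-OF-FILE
               ADD TR-AMOUNT TO WS-TOTAL
               READ TRANS-FILE AT END SET END-OF-FILE TO TRUE END-READ
           END-PERFORM
           CLOSE TRANS-FILE
           MOVE WS-TOTAL TO WS-TOTAL-ED
           DISPLAY "Total: " WS-TOTAL-ED
           STOP RUN.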
Control breaks was one of the most enduring lessons I got from COBOL. Such a simple and useful concept that if poorly implemented can create havoc.
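For readers who never met the term: a control break means reading records sorted on a key (an account number, say) and emitting a subtotal every time the key changes. Building on the read-loop sketch above (same invented layout, plus WS-PREV-ACCT PIC X(8) and WS-SUBTOTAL PIC 9(9)V99 in WORKING-STORAGE), the loop becomes roughly:

           MOVE TR-ACCOUNT TO WS-PREV-ACCT
           PERFORM UNTIL END-OF-FILE
               IF TR-ACCOUNT NOT = WS-PREV-ACCT
      *> key changed: emit the subtotal, reset for the new key
                   DISPLAY WS-PREV-ACCT " subtotal: " WS-SUBTOTAL
                   MOVE TR-ACCOUNT TO WS-PREV-ACCT
                   MOVE ZERO TO WS-SUBTOTAL
               END-IF
               ADD TR-AMOUNT TO WS-SUBTOTAL
               READ TRANS-FILE AT END SET END-OF-FILE TO TRUE END-READ
           END-PERFORM
      *> the classic havoc: forgetting this final break, or
      *> feeding in input that isn't actually sorted on the key
           DISPLAY WS-PREV-ACCT " subtotal: " WS-SUBTOTAL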
Fun fact: COBOL-D did not have the perform clause. It was all goto's
@@rty1955 :-)
Exactly, same goes for all "mainframe languages", they are just very feature poor, so you always need to "reinvent the wheel", there are very few libraries compared to modern languages, meaning everything becomes tedious to do.
@@youtubebob123 Except Fortran. There are math libraries out there where they write a C wrapper instead of porting the lib itself.
I just passed my COBOL uni exam today
Congratulations! I got COBOL at uni because of Y2K (yes, I am that old). I hated it, but it was pretty easy. If you go into mainframes, the cool part is going to be the toolset.
Gratz
You will definitely not run out of opportunities to work. And you can make a fortune doing so!
Congratulations! I am so glad to hear that!
@@deantiquisetnovis Where are all these jobs paying "fortunes"? When I researched a few COBOL jobs they were "competitive" but nothing out of the ordinary.
My first job, after leaving school in 1985, was working at the local council developing COBOL systems on ICL ME29 mainframes. This was when the UK Data Protection Act was introduced, and I wrote the council's database for storing all their DPA submissions from each department and made it searchable using keywords. I then went on to work at the local newspaper company on their newspaper distribution systems, in COBOL on a Burroughs B900 mainframe. Since then I have moved mostly out of programming, but I have worked with various other languages over the years, including VB, PostScript, and various typesetting and scripting languages. I imagine I could pick up a COBOL compiler and put out some COBOL code even now. Just like riding a bike.
The quality of this code is likely significantly better than 99% of today's software.
That's great! However, it's important to consider that the amount of software written today is vastly greater than what was written during the COBOL era. Following that percentage and assuming it to be true, we could conclude that the quality of COBOL code, in comparison, wasn't as good as today's software. It's interesting to see how software quality has evolved over time.
yes you really need cobol for your bios
@@FranciscoCarlosCalderon Software quality is at an all-time low.
It was more likely your code was clean and disciplined, that's for sure.
@@FranciscoCarlosCalderon Much of the software written today is bloatware, as it expands to fill the available storage. We built an antenna control system in 32K bytes that controlled 4 antenna servos and performed space stabilization, satellite orbital calculations, and dead-reckoning navigation for 2 shipboard satellite-tracking antennas in real time.
I wrote some COBOL code over 50 years ago and it is still running today, without conversion. I know of a large company that tried converting to a so-called modern language; they spent 10 years and many thousands of dollars, and the new language always produced wrong answers when calculating dollars and cents for interest or sales-tax calculations, because it didn't follow the approved financial rounding methods.
I've had that same experience, and I've complained bitterly that new languages don't (properly) support decimal arithmetic. I'm working with SwiftUI now, and Decimal arithmetic is not fully integrated; it can make the same rounding errors as binary arithmetic if one is not extremely careful, because it has NO rounding functions unless you want to convert to a string and then back to decimal, OR call NSDecimalNumber(mantissa:exponent:isNegative:). It's absurd. I never had these problems with COBOL.
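For contrast, COBOL hangs rounding directly off its arithmetic verbs. A tiny sketch (GnuCOBOL-style; the amounts and rate are invented - 1234.56 * 0.0525 = 64.8144, which ROUNDED squeezes into the two-decimal target as 64.81):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. ROUNDDEM.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-PRINCIPAL  PIC 9(7)V99 VALUE 1234.56.
       01  WS-RATE       PIC 9V9(4)  VALUE 0.0525.
       01  WS-INTEREST   PIC 9(7)V99.
       01  WS-INT-ED     PIC Z(6)9.99.
       PROCEDURE DIVISION.
      *> ROUNDED rounds the intermediate result into the
      *> target PICTURE; the default is the familiar half-up
           COMPUTE WS-INTEREST ROUNDED = WS-PRINCIPAL * WS-RATE
           MOVE WS-INTEREST TO WS-INT-ED
           DISPLAY "Interest: " WS-INT-ED
           STOP RUN.

Newer standards also let you pick the mode explicitly (ROUNDED MODE IS NEAREST-EVEN for banker's rounding, if memory of the spec serves), which is exactly the kind of control the finance auditors care about.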
I graduated in 2014, this is *exactly* why COBOL was still required.
I remember 1999 when, as a reporter, I was assigned to interview retired programmers getting COBOL machines ready for 2000. That was 25 years ago. The scoop? A bunch of old chaps saving the economy because they could code in an antique language. I guess I should have learned COBOL then; I'd still be active today. Thank you for your piece.
About 40-60% of the banks and CUs in the U.S. run on IBM midrange computers (IBM i, dating back to the AS/400, and Linux on Power, dating back to the RS/6000). The IBM i machines primarily use a programming language known today as RPG (though they support other languages like C, Java, etc.). All of these have fascinating histories that would make a great video, because they are still so pervasive in 2024!
I worked on RPG for a while back in the late 70s. I did not really care much for it and it would be way down at the bottom of my list of favorite languages. I much prefer COBOL.