Yes, it was the first business PC, but it did little good even with an IRMA emulator board. Very little could be done on a PC; all the data, computing and utilities were still on the mainframe.
You might explore Low-Precision Training Mechanism (LPTM) down to 1-bit inference, with glass chips by ASML. I was told this would be picowatt-scale, so it would be sustainable from the inference point of view, significantly lowering power consumption.
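(A minimal sketch of what 1-bit inference generally means, since the "LPTM" specifics above aren't spelled out: weights binarized to {-1, +1} with one scale factor, so the matrix multiply collapses to additions and subtractions. Function names and sizes below are made up for illustration.)

```python
import numpy as np

def binarize_weights(w: np.ndarray):
    """Quantize a float weight matrix to 1-bit values {-1, +1} plus one scale.

    The per-tensor scale keeps output magnitudes roughly comparable to the
    full-precision layer (BinaryConnect / XNOR-Net style trick).
    """
    scale = np.mean(np.abs(w))            # single scale factor for the tensor
    w_bin = np.where(w >= 0, 1.0, -1.0)   # 1-bit weights
    return w_bin, scale

def linear_1bit(x, w_bin, scale, bias):
    """Inference with binarized weights: only adds/subtracts plus one multiply."""
    return (x @ w_bin.T) * scale + bias

# toy usage: a 4-input, 3-output layer
rng = np.random.default_rng(0)
w = rng.normal(size=(3, 4)).astype(np.float32)
b = np.zeros(3, dtype=np.float32)
x = rng.normal(size=(2, 4)).astype(np.float32)

w_bin, s = binarize_weights(w)
print(linear_1bit(x, w_bin, s, b))
```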
@@lilblackduc7312 that was 1988. Sinclair and the BBC Micro in the UK, the Microbee in Australia and many others were pumping out computers from the beginning of the 1980s. Apple was selling from the late 1970s, if I remember right...
The end of the video reminded me of a recurring thought. Quantum research feels like we are tapping into how human neurons work, while classical computing was designed for, and excels at, correcting errors. So the way IBM is approaching quantum error correction with classical computing seems ideal.
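(As a purely illustrative aside, not IBM's actual decoder: the simplest case of "classical computing correcting errors for a quantum code" is a bit-flip repetition code, where the classical decoding step is just a majority vote.)

```python
import numpy as np

def encode_repetition(bit: int, n: int = 3) -> np.ndarray:
    """Encode one logical bit into n physical bits (bit-flip repetition code)."""
    return np.full(n, bit, dtype=int)

def decode_majority(physical: np.ndarray) -> int:
    """Classical decoding step: a majority vote recovers the logical bit
    as long as fewer than half of the physical bits were flipped."""
    return int(np.sum(physical) > len(physical) // 2)

# toy usage: encode 1, flip one bit (an 'error'), decode classically
codeword = encode_repetition(1)
codeword[0] ^= 1                      # single bit-flip error
print(decode_majority(codeword))      # -> 1, the error is corrected
```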
Fantastic channel. Could you make a video on the history of Silicon Valley companies like Intel, HP, Sun, Compaq, Dell, Google, Amazon, etc., and software companies like Microsoft? And another video on the IBM Watson supercomputer, and the history of AI and programming languages?
The first personal computer was not launched by IBM. Before IBM there were Apple and a bunch of other companies. IBM watched how the PC era started for at least a few years before deciding it was time to step into the market with its own product.
In the 90's IBM had some big layoffs because of OS/2 and declining revenue in PCs and hardware. Many people thought the mainframe was going to be dead. But with large investments in new hardware and software built around the power of the mainframe, IBM is now a large player in cloud and in software supporting the consulting business. It took 15 years of changes, and now IBM is starting to grow again. They have a huge niche with large companies that deal with large amounts of data: banks, insurance, and everything in the financial market. They have some of the best hardware engineers in the world. The sad part is that with AI becoming so huge we have seen IBM, Google, Intel, Facebook and others lay off around 40,000 to 80,000 people. Eventually it will hit other sectors too.
Many brilliant people working at IBM. IBM has a long track record of supporting their software platforms. IBM wrote the book on large-scale business systems, transaction processing and database technology. Most banks and governments are heavily invested in IBM technology and the relationship is solid. Large businesses, banks and governments have been successfully using IBM for 50 years. That leaves all the other businesses that needed IT solutions and often didn't have the capital to invest in IBM tech. IBM management seems to have suffered from the same disease as many large companies: they seem to lose their focus and direction with an excess of middle management intent on golden handshakes.
IBM's business is to create fenced solutions for their banking and military clients. This type of chip is not really an innovation, but it is something that will upgrade the solutions they can deliver at very expensive prices, so it will keep the company going for a few more years.
They are classic players in both hardware and software. I always have my eyes and ears open when they publish something. They do make it harder to choose whom to run with nowadays. 😅
Hey guys at IBM. I am very impressed that you have so much invested in tangible products, rather than trading capital like the douche bags who are destroying the economy (might want to keep a close eye on PwC). One thing to consider: as far as AI tech goes, it does not make sense to build multi-core chips with core counts below 100. NVIDIA is dominating the market because they have found a way to put more than 10,000 cores on a single chip. Yes, you heard that right: NVIDIA makes chips with core counts well north of 10,000. So while CPUs are important, GPUs have become extremely important. GPU software such as shaders is way out in front in computing speed because it makes use of these massive numbers of cores. The AI software guys found a way to adapt shader technology into neural network software.
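(To make the "thousands of cores" point concrete, here is a tiny NumPy stand-in, with made-up sizes, for the kind of work a GPU kernel does: a fully connected layer is one big matrix multiply, and every output element is an independent dot product that a separate shader-style core can compute in parallel.)

```python
import numpy as np

# A fully connected layer is one large matrix multiply. Every output element
# is an independent dot product, which is why thousands of GPU cores
# (originally built for shader math) can each handle one in parallel.
batch, in_dim, out_dim = 64, 1024, 4096
x = np.random.rand(batch, in_dim).astype(np.float32)
w = np.random.rand(out_dim, in_dim).astype(np.float32)

y = x @ w.T        # 64 * 4096 = 262,144 independent dot products
print(y.shape)     # (64, 4096)
```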
It was a nice briefing about IBM's microchip. Would you please explain IBM's 133-qubit chip in detail? I would like to learn more about microchips, qubit algorithms and photon particles. I used to control a robot manually from the control panel, because the robot didn't work automatically; I learned it in one hour with zero experience in technology and computers. Is it possible to learn to build a powerful IBM microchip without a high school diploma? Because I love quantum mechanical engineering. Thank you.
At 1:24, they were all running Microsoft DOS; when the clones were really being made, Windows wasn't available yet. Windows eventually dominated PCs, but to say they were all running Windows is putting the cart before the horse.
I've probably seen a dozen of IBM's ghost buildings throughout New York state and the northeast. It's probably the oldest surviving visionary of the computing industry, born by merging several manufacturers of the most advanced mechanical and analog business calculation, data recording and data processing technologies available in the early 20th century. After WWII they became pioneers in transistor technology, and since it had already displaced the old analog and mechanical computing technologies in practical use by governments and businesses, the leadership under Watson Jr. (son of the original founder) made the use of transistors mandatory for all the equipment they manufactured and built some of the largest microelectronics and semiconductor research facilities throughout NY. Indeed, IBM's R&D spending and its partnership with New York state universities made Albany the world leader in nanotechnology and semiconductor research, and by partnering with advanced chip makers like Samsung, IBM is boosting its manufacturing capacity to give the world the next generation of microchip technologies years or decades before any other competitor. And due to their intimate relationship with the Pentagon, the military and its multi-billion-dollar futuristic war machines are often the first real-world applications of the technology, before it reaches the mass-production capacity needed for civilian and commercial variants.
Pro tip: Don't talk about a stock's performance without comparing it to key indexes (e.g. S&P) and sector performance. There's no context to just saying up 40%.
Consulting being the least profitable makes sense. You can charge good money for consulting, but to do that you need good consultants, who demand good pay or they'll go elsewhere or become independent. Much of consulting revenue presumably goes straight to payroll. Still, good luck building a software empire without consultants to help sell and implement it. Just because it's not the most profitable doesn't mean it's not extremely important to the business model as a whole.
Why all the comments about the "first mini-computer"? Who cares? Oh, yeah, I have a PC! 😏 This is about the evolution of mainframe hardware hosting cloud applications rather than using server farms, and it simply makes more sense as networks become more redundant and bandwidth more capable of scaling with traffic. Anyone who's had to sit eyes-on-glass to maintain an Azure or AWS production platform will appreciate that IBM's "batch" runs with a fraction of the resources. In fact IBM runs Amazon's card services, ACAPS and TSYS, using good ol' COBOL, CICS and IMS; otherwise you couldn't get a credit check in a millisecond or post billions of transactions by 5 AM! BREATHE DEEP! The world is bigger than your desktop or consumer-based server farms.
When I hear the name of IBM my immediate response is to conjur up an image of the typeset word "THINK" in classic roman text, framed behind glass and hung over one of the workbenches where for more than 50 years my father serviced IBM office machines in a bustling downtown Chicago raising a family through the post war 50s, the cold war 60s, the sufferin' 70s, and right up until the end of the word processor as the new century and a new world of office digitalization was being born...the end of mechanical machines that had the tolerance of swiss watches sitting on every office worker's desk across the world. I still dream of the last Selectric II I caressed and with which I typed a long letter that had a few typos here and there but at no time was I ever likely to see a blue screen telling me my lovely letter was lost and irretrievable. It's good to know that when that eventual day comes I'll see IBM there in the cloud; waiting for me to finally catch on.
I have fond memories of IBM for the simple fact of playing Haunted House Mansion 😂 My first introduction to video games, back when there was only one color, green, lol
The historical introduction is quite distorted. IBM didn't make computers popular, Apple II + VisiCalc did, although it did design a personal computer in the mid 70s that had no impact. PCs were never the main source of IBM revenue, mainframes and associated software, peripherals and service were.
Without IBM we would not have what we have today. What champions they have been for the industry. I worked with mainframes for thirty years, on DOS/VSE, VM, MVS and z/OS, all backward compatible. In those thirty years we had one two-day downtime, because of a busted physical disk pack. I was always mindful of the sheer grunt needed to create their systems. 99.9% uptime says a lot.
IBM was nowhere near the first to make a desktop personal computer. Personally affordable desktop computers, known as PCs, started in the early 1970's. Altair, IMSAI, Cromemco, Apple, Commodore, Atari, Tandy/Radio Shack and many many others were delivering PCs long before IBM even started on theirs.
Word is that the next generation of computers will be developed in a hexagonal format, from the cutting of the silicon wafers to the processors in their respective sockets on the motherboards...
IBM was great in the 70's with mainframes. Since that time the company went downhill due to mismanagement and lost its connection to new technologies. They lost much of their valuable research and technology, and finally they are left doing consulting business and the leftover mainframe work. All due to mismanagement, with management staff clueless about the technology challenges, until the leverage of the company's know-how was finally lost. The stock value declined continuously after 2012 and has only just now reached the 2012 level again.
I still don't know why IBM's AI chip should be a breakthrough, as there are several companies producing such chips. Their business grows in software and consulting (however, the growth could also be explained by decline in other sectors. I am not saying it was wrong, but it always makes me a bit suspicious when relative numbers are presented as if they were absolute values; a revenue increase of $470M actually sounds more impressive than 7%, imho :D ). I understand IBM is using their new chips to power their own AI services, which was not pointed out in the video. But what is the unique benefit of using IBM AI cloud services over AWS, Azure or Google?
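(Just to illustrate the relative-versus-absolute point with the commenter's own figures, which are taken at face value here rather than from IBM's filings:)

```python
# Converting between relative and absolute growth, using the numbers above
# purely as an illustration.
growth_abs = 470e6      # reported increase in dollars
growth_rel = 0.07       # reported increase as a fraction

implied_base = growth_abs / growth_rel
print(f"implied prior revenue ≈ ${implied_base / 1e9:.1f}B")   # ≈ $6.7B
```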
@@dgrxd That's also what came to my mind (just without the details and numbers :D). I assume those chips are very well designed for specific purposes in IBM's cloud services. For sure IBM will also use the 'real' AI chips, and these chips very likely will reduce operating costs. Even then, designing a new chip just for their own datacenters while similar chips are already available makes me really wonder why IBM put resources into this chip.
These chips are designed to add AI operations to high throughput high transaction load scenarios which are operating on the order of something like 300K - 1M transactions per second. Those systems already existed before AI and were running at that scale. And now their customers want to be able to run inference on the same systems without a performance hit. That is not possible with commodity hardware. These are not supposed to be bleeding edge AI chips. These are supposed to run distilled and quantized AI models from 2-3 generations ago in scenarios where it was previously technically infeasible to use them at all.
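(A minimal sketch of the workload described above, with an entirely made-up model and thresholds: a small int8-style quantized scorer run inline on each transaction, rather than shipping the work to a separate accelerator, which is roughly the scenario an on-chip AI unit like Telum's targets.)

```python
import numpy as np

# Hypothetical int8-quantized fraud-scoring model: weights stored as int8 plus
# a float scale, so inference stays cheap enough to run inline per transaction.
rng = np.random.default_rng(42)
W_INT8 = rng.integers(-127, 128, size=16, dtype=np.int8)
SCALE = 0.02

def score_transaction(features: np.ndarray) -> float:
    """Dequantize-on-the-fly dot product plus a sigmoid; returns a score in (0, 1)."""
    logit = float(np.dot(features, W_INT8.astype(np.float32) * SCALE))
    return float(1.0 / (1.0 + np.exp(-logit)))

def process_transaction(txn: dict) -> str:
    """Score one transaction inline and decide whether to flag it."""
    feats = np.asarray(txn["features"], dtype=np.float32)
    return "flag_for_review" if score_transaction(feats) > 0.9 else "approve"

print(process_transaction({"features": rng.random(16)}))
```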
@@timseguine2 They are not specifically hardwired to the models, but implement common operators such as those in PyTorch and TensorFlow, and so have some degree of future-proofing.
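(A rough illustration of that "common operator set" idea; the operator table and graph format below are invented for the sketch, not IBM's or any framework's actual spec. The point is that a chip exposing generic tensor ops can execute graphs for models that didn't exist when it was designed.)

```python
import numpy as np

def _softmax(a):
    e = np.exp(a - a.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy "operator set": a fixed table of generic tensor ops.
OPS = {
    "matmul": lambda a, b: a @ b,
    "add":    lambda a, b: a + b,
    "relu":   lambda a: np.maximum(a, 0.0),
    "softmax": _softmax,
}

def run_graph(graph, inputs):
    """Execute a list of (op_name, input_keys, output_key) steps against OPS."""
    env = dict(inputs)
    for op, in_keys, out_key in graph:
        env[out_key] = OPS[op](*[env[k] for k in in_keys])
    return env

# A tiny two-layer network expressed only in the generic operators.
g = [("matmul", ("x", "w1"), "h"),
     ("relu",   ("h",),      "h"),
     ("matmul", ("h", "w2"), "y"),
     ("softmax", ("y",),     "y")]
out = run_graph(g, {"x": np.random.rand(1, 8),
                    "w1": np.random.rand(8, 8),
                    "w2": np.random.rand(8, 4)})
print(out["y"].shape)   # (1, 4)
```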
@ 10:38 I'm confused, but paraphrased: "roughly 70% of the world's transactions are run through the IBM hybrid cloud". Are you saying 70% of consumer transactions are run through this cloud?
Your description of the role IBM played in the emergence of PCs is inaccurate. They launched their PC division in response to the inroads Compaq Computers made into their corporate customer base. Their mainframe division was the most profitable portion of the business then and it always looked for ways to “kill” their internal rivals. You are correct that they never found a way to keep up with the competition on any other type of hardware which is why they eventually sold off their storage, printer, and PC businesses. Finally, you should look into the details of the software business. I’ll bet that a significant portion of it is recurring license fees from some very, very old platforms that still run the deep transactional core components of their financial services and healthcare clients.
Straight Arrow News and Ground News seem to have the same goal. What differences are there between them? Is it just me, or is Straight Arrow News only available as an app for Apple's operating systems?
This video is sponsored by Straight Arrow News. Check out SAN using my link: www.san.com/anastasi
A snitch chip, Anastasi?
Thank you for this concise and very informative video.
*When a super AI merges with a super quantum computer system, humanity will definitely be summoning a god mode level of computing that will surely change everything forever!*
I kept getting a blank page using this link. Perhaps the link has expired! 🤔
IBM owns PwC? Damn, I had no idea :D Learning to code as a tax consultant seems like a very good idea now :D
I know and like a number of people working at IBM. So much deep research and pure science going on there, it's the sort of company you want to keep afloat, even if market forces aren't always working in its favor.
Bravo 🎉
Agree. Especially all their silicon technology stuff that they are still running.
It was interesting to learn why from this video.
Guys, consider this: they have a service business model; they earn mostly from consulting and maintaining the systems they provide. They don't want you to have full access to their systems, or they make them hard to navigate, so they can continue to send their people to train your people to use their systems. IDK how supportive I should be of this business model.
Consider the following: The pathway to hell is paved with good intentions.
So who is making their chips?
I've followed IBM for many decades. They are like the salt in the technological ocean. They go up, they go down, but their influence is undeniable.
Open the pod bay doors HAL
@@Williamb612 I'm afraid I can't do that Dave.
Just what "influence" in Generative AI are you referring to, exactly?
What have they done with these AI chips and the Granite models, aside from produce components that will go into their products (zMainframe, Power11 and IBM Cloud)?
They have maintained relevance in an industry that eats innovation for breakfast. Remember Netscape, Iomega, Rambus, US Robotics, and all the other "cutting edge" companies that forced the door of innovation open a tiny crack only to have it slam shut? We still talk about IBM as a mainstream player after a century of participation.
"LIKE SALT IN THE TECHNOLOGICAL OCEAN" THIS LOOKS LIKE SOMETHING CHATGPT WOULD WRITE. MAYBE MY RESPONSE IS ALSO CHATGPT'S
Excellent video. I started as a computer operator in 1974 using IBM 370 145 mainframes. Always have a soft spot for IBM.
I worked with IBM hardware and software for my whole career. I worked for them twice, once as a student and secondly as a consultant near the end of my 40 year career on the mainframe platform. It is nice to hear about recent developments there.
How do you feel about IBM getting in trouble for firing at least one white guy to meet racial quotas to increase diversity?
I had to laugh when your video showed the first IBM PC. I bought that PC the day they released it for sale in 1981 or 1982. It had an Intel 8080 processor, 512k of memory, dual floppy drives (hard drives weren’t a thing yet) and a monochrome monitor. It cost me $3,200 back then. I eventually used it to do COBOL coding with Realia COBOL. It saved me a fortune because buying time on mainframe computers back then was unaffordable for an independent programmer. I think it took me 6 months to make my investment back. I actually upgraded the PC with an AST Sixpack + which gave it a real-time clock (the stock PC date and time had to be manually entered each time it was powered up) and increased the memory to an insane 640k of memory. I seem to remember some additional ports on that card. It is an absolute joke in today’s technology, but it was worth its weight in gold in the early 80’s. Thanks for the fond memories. 😊😊😊😊
did it have a proprietary OS back then or was it DOS? Did you eventually move on to Apple back then??
@@j-555 It was DOS 1.? (Not sure if they had a point release yet). I worked for a major aerospace company at the time and I was the first person to show the value. I wound up forming the Office Automation group and began deploying the stand-alone IBM XT computers throughout the office areas (connectivity back to the mainframe was via 1200 baud modems). Each one was equipped with an Epson dot matrix printer with a box of green bar fan fold paper to generate hard copy. I’m not even sure if Apple was a thing yet. It did eventually come on the scene, but mainly in the areas that produced documentation / publications and I never got involved with them. My only exposure to Apple products was when I retired (after 43 years). That was 6 years ago when I bought an iPhone and iPad. I haven’t looked back since.
You are never going to need more than 512k of memory.
@@retiredbitjuggler3471 God being able to get these insights is my favorite part of the internet. If it wasn't for the internet and the presence of people like yourself, all those memories would be lost like "tears in the rain." (Blade Runner.) Speaking of which, you must've felt like you were living in the future back then.
@@jamesalec1321 Well, yes I did think I lived in the future - in fact, I can recall at the time my manager saying “This technology will shrink down to the size of a phone in the next 20 years”. Now to put that in proper perspective, desk phones were basically a 6x6x6 inch cube (massive by today’s standards). I thought it would take much longer. The first iPhone was released in 2007, was massively more powerful and capable, and it fit in the palm of your hand. At my age, the future is accelerating at a blinding pace. I have an old Jeep that has served me well for many years. Well, I took delivery of a Cybertruck 2 months ago and it is an X-wing Starfighter (Star Wars trilogy) in comparison. This is a technology leap that was probably unimaginable back in the 80’s. Heck, I’m having a difficult time imagining it now - a truck that looks like a spacecraft, is extremely fast, very silent, and it drives itself. It is mind-numbing, like a conversation I had with my wife’s grandfather decades ago. He witnessed the introduction of so many things over his lifetime that were considered miraculous - internal combustion cars, planes, man landing on the moon in 1969 (my own father fabricated parts for the Lunar Lander for Grumman and they will be littered across its surface forever) - and as crazy as that all seemed to him, what I have witnessed is technology advancing at an ever-increasing pace. It makes me wonder what my 6 grandchildren will witness in their lifetimes.
Anastasi, you're amazing. I studied chip design in high school through the Tektronix Explorer Scouts, and we even laid out our circuits and had them run on an empty part of a wafer. It was pretty cool for a bunch of high school kids. Sadly, I didn't go to college until I was in my mid-thirties: three associate degrees, not in tech, but foreign language and business. And after being a one-man IT shop for twenty-some years, I had to give it up. But I'm looking to get back into programming (there are a couple of products that I helped design hardware for after high school, as well as wrote the software for). One dominates the retail glass shop management industry, and the other was hardware and software that was an early player in digital call generation and user response processing. Sadly, I didn't have stock in either company.
IBM didn't develop the first PC. They only developed the first IBM PC (which is a tautology). They weren't even close. Even with the most restrictive of definitions, the first PC was released well over 4 years prior to the release of the IBM PC. That was an extremely long time in that market. *Many* companies had released PCs before IBM.
IBM entered the PC market reluctantly and didn’t care too much for PCs afterward.
IBM developed a personal computer in the mid 70s. It was powerful but too expensive, and IBM marketing didn't know how to sell it. Even if it was the first (big if, I really don't know), it didn't make computers popular. The honor belongs to the Apple II running the first killer app: VisiCalc, the first spreadsheet.
@@Howiefm28496 IBM did not enter the market reluctantly. They entered radically--with great conviction that they had to get to market as quick as possible. Their biggest problem was that they waited so long--and that is why, when they finally woke up to it, they had to move so fast, so as not to lose the opportunity to dominate the market for PCs in business.
@@bearcb You're referring to the IBM 5100 from 1975. It was a personal computer (even if it wasn't called that), but it didn't use a microprocessor (its CPU took up an entire PCB), and the base model cost $9000 (1975 $). There were other personal computers prior to that, and multiple of those were microprocessor-based PCs, even--the latest of which was the Altair 8800, which was released in Jan 1975 for $621 assembled. And that was the model that took the US by storm. The TRS-80 was *far* more popular than the Apple II, even for a while after VisiCalc was released in 1979. The TRS-80 was arguably the first PC to make PCs popular.
@RetroDawn yes about the IBM one, too expensive like I said. I knew about the Altair and IMSAI, but was not sure which year they were launched. The TRS-80 was popular in the computer-savvy community, but soon the Apple II blew it out of the water.
I've often wondered where you get the ideas for the topics that you come up with, Ms Anastasia. You really cover such a great range of topics. Thank you for the time and effort you bring to the industry.
On December 11th, 1972, Q1 Corporation sold the first Q1 microcomputer, based on the Intel 8008 microprocessor.
It had the first QWERTY keyboard, printer and floppy. It was IBM that sold one of the first such computers back in 1957, the 610, with a transistor and tube based processor and a price tag of $55K.
Thanks for this video. I'm retired. I no longer work in this field as an engineer. Your videos are very interesting and your explanations are clear and supported by your remarkable intelligence.
No.
Many clones of the IBM PC were manufactured & sold of course, but they were NOT running Windows.
They were running DOS.
Windows (3.1.1) didn't appear until years later.
@@AdamsOlympia Apple fanbois like to pretend Steve Jobs invented the GUI. He didn't. Xerox PARC did. The idea that had Apple not stolen it, MS wouldn't have either, is an absurd rewriting of history.
IBM, a hardware company, lost marketing and commercial relevance when they gave up the ThinkPad. When they come back with relevant hardware they can bounce back. Hope they do.
She is definitely an expert XD
Windows was a clone of OS/2. If IBM had open-sourced it, nobody would be running Windows.
@@AdamsOlympia Windows 1 was a total failure. It didn't drive anyone's sales. Windows didn't start to make an impact until version 3.1 in the early 90s.
From Bharat: I am 67 years old and I did my electrical engineering. I follow your channel most of the time. All the information you provide keeps my brain healthy, as it jogs it for some time. Our age group is one of the lucky ones, as we have seen technology evolve and become available to the common population. I have used slide rules, the first commercially available scientific calculator (an RPN type too), card-reader computers which occupied huge rooms, then desk-sized floppy-based computers [flippy disks too] (8 inch / 5.25 inch / 3.5 inch), then PCs, laptops, pagers, the first cellphones [now called feature phones, I guess], then smartphones and finally AI solutions. The only thing I guess our generation may not use is the quantum PC!! If we can use that too, we will literally have quantum-jumped from slide rules to quantum computing!! Regards
Thanks for singing my song.😊
My Dad used the punch card computer at IISc when it arrived.
Ha, but did you learn arithmetic in grade 2/3 writing on slates, with a stylus? Was with you on the rest of our journey!
@@DataCrusade1999 good racism he achieved more than you ever can
This breakdown of IBM’s chip innovation is well done! The focus on accelerating AI workloads for mainframes is timely, looking forward to seeing how it plays into larger trends like edge computing.
Each chip excels in specific AI computations, making them complementary rather than mutually exclusive.
Niche Specialization
IBM Telum: CPU-AI integration, security, and enterprise.
IBM SyNAPSE: Neuromorphic computing, low-power AI.
Google TPU: Optimized for Google's AI services and cloud.
NVIDIA GPU: General-purpose computing, broad adoption.
FPGAs: Reconfigurable hardware for optimized performance.
No Obsolescence
None will make others obsolete; instead:
Data centers: Combine CPUs, GPUs, TPUs, and FPGAs.
Edge devices: Leverage specialized chips for efficiency.
Research: Explores diverse architectures.
Modular Approach
AI systems integrate various chips, each handling specific tasks:
Inference
Training
Processing
Memory management
Heterogeneous Computing
The future of AI computing involves combining diverse hardware to optimize performance, power efficiency, and cost.
We highlighted the importance of understanding each chip's strengths and weaknesses, rather than proclaiming one as universally "better."
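(A minimal sketch of the modular, heterogeneous idea summarized in the list above: route each class of AI task to whichever backend suits it. The backend names are placeholders, not a real scheduling API.)

```python
# Placeholder backend names, purely illustrative.
BACKENDS = {
    "training":        "gpu_cluster",       # large-batch gradient work
    "batch_inference": "tpu_pod",           # high-throughput serving
    "edge_inference":  "npu",               # low-power, on-device
    "transactional":   "on_chip_ai_accel",  # inference inline with transactions
}

def dispatch(task_type: str) -> str:
    """Pick a backend for a task, falling back to the CPU if nothing matches."""
    return BACKENDS.get(task_type, "cpu")

for task in ("training", "transactional", "unknown_task"):
    print(task, "->", dispatch(task))
```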
great stocks to buy
" Neuromorphic computing" is not a word or phrase.... You don't need AI for your list of offerings or categories. Me thinks you got bit by the AI superbug !!!! Diverse Hardware will not cut it.
I get almost all of my news and info from YouTube. I think YouTube producers need to understand that unique news and analysis is always desired.
U-toob is notorious for its censorship.
You should read serious news channels.
I imagine younger viewers looking at the footage of people using those big floppy discs feel like I do watching films from 1910 with everyone travelling around by horse and cart!
Nothing like slotting in a floppy disc after growing up watching chokablock.
I remember when the machines in the first clips were in use; I operated some of them. IBM's streak to stardom began when WW2 broke out: IBM went to the government and asked what they could do for the war effort, and every military unit had an IBM. Then, when the first commercial mainframe was glitchy, they sent out the engineers that designed it to fix it, establishing their reputation for standing behind their product: reliability. It didn't help their reputation when they made a deal with Apple and then reneged.
I think there's a lot of growth potential in Linux and containerization, which run on mainframe or LinuxONE hardware with Red Hat, so you can get real-time AI and a consolidated private cloud stack that doesn't have the same security flaws as distributed hardware.
Very spot on... this is the obvious advantage of the companies that IBM targeted for acquisition. IBM is engineering-focused, with the outcome that follows.
The video impressed me. IBM is vast, yet not my investment focus. Still, intrigued, I subscribed and explored your course. As an investor, I seek to grasp the hardware behind the transformer model, aiming for smarter decisions.
I meant to add, that your videos are great, and fun to watch. Thank you for the time you give to making them -- they really are excellent!
Excellent presentation. It's like a digest delivery for the background section in a Harvard Business School case.
I'm glad IBM is doing well, because at least historically, they were customer supportive; their mainframes were beautifully built, even in the parts not visible to the customer; they insisted on acting and looking professional, which may have been a bit too much at times, but it could bring out your best, and inspire it in others.
IBM's stock goes up:
me: "Are they making a new thinkpad?"
Lol, in my opinion ThinkPads were the only really good products they produced. And they gave them to Lenovo...
Do not lie to us, you do not know how to think.
@@eglintonflats I work in the industry to solve complex problems, and their often-praised Watson was nothing but a burden in solving real-world problems, hindering me from being innovative and making use of my ability to think. So please don't tell me... the company has lost its soul. Other companies such as Microsoft are way more innovative and customer centric. Working together with a guy from Microsoft was so much more helpful... But that's just my experience.
@@datenanalyseTheir keyboards were top notch too.
And Microsoft is not really innovative and in any case not consumer centric. They ignore their customers all the time.. not only private, but also small and larger businesses. Some products might work out, but most don't and get deprecated sooner or later. Not a company to do business with, really.
@@datenanalyseMicrosoft and “customer centric” isn’t how I’d describe them
No one should ever confuse patents with innovation. Patents are part of the legal arm of the company, and engineers are highly incentivized (in some companies like IBM) to write patents on literally anything regardless of its real value. I had the same pressures put on me back when I worked. Of course any real new tech will have as many patents written on it as feasibly possible as well. But a patent often represents no real innovation at all.
Thx for the report, pretty informative and good to know how IBM turned from almost dead to highly profitable and growing. One tiny piece of advice: this vid would feel shorter if you sped up your speech; for example I went with 1.25x speed as the pace was kind of slow 😉 Shorter vids r better than long ones 👍
Never a fan of Apple, I built PC compatibles starting in the mid 80's. I purchased a significant number of additional IBM shares in 2017 and watched it drop not long after. I didn't plan on selling, so I still have a lot of IBM, Intel as well.
Moral of story: If you're gonna play the market, you have to be willing to ride the financial/emotional roller coaster. Glad IBM is doing so well these days. They've always been innovators, and they are never going belly up, so it's a safe stock, one of the "too big to fail" companies.
I wouldn’t “play the market”. I think you’re correct about not letting market volatility shake you, but that doesn’t mean all stocks will inevitably rise, not even blue chips. Buying smaller blocks over time can insulate you from some volatility, and if you’re lucky and paying attention you might get in on the dips.
You don’t sound all that different than me, I just don’t like the term “playing the market”.
Imagine that you sold right before that peak... And bought right at the dip.
@@MarcosElMalo2 Having a diversified portfolio safety net also helps ride out the storms. The old maxim "Don't put all your eggs in one basket." is sound advice for long-term security as well.
I remember at the University of Toronto where the IBM 360/370 mainframe took up an entire floor of a building , a computer which had less computing power than your average smartphone today .
Their hybrid cloud solutions are just the offerings they provide to companies who are too embedded within IBM's infrastructure to leave their platform. Their consulting business is the service contracts they offer to those same customers who can't leave their legacy systems.
Neural network architecture is extremely compatible with implementation in silicon.
It consists of a large number of fairly simple “neurons”.
The main problem is interconnection. You want to connect every neuron in one row to every neuron in the next.
The other problem is the weight of the connection.
You want to load in weight values for a particular task, or adjust weights incrementally during learning.
But NNs are so useful, I am confident these problems will be overcome.
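(A minimal NumPy sketch of the structure described above, with arbitrary sizes: the all-to-all interconnect between one row of neurons and the next is just an out-by-in weight matrix, and "loading weight values for a particular task" amounts to overwriting that matrix.)

```python
import numpy as np

class DenseLayer:
    """One 'row' of neurons: every input neuron connects to every output neuron,
    so the interconnect is represented by an (out x in) weight matrix."""

    def __init__(self, n_in: int, n_out: int):
        self.w = np.zeros((n_out, n_in), dtype=np.float32)
        self.b = np.zeros(n_out, dtype=np.float32)

    def load_weights(self, w: np.ndarray, b: np.ndarray):
        """Load pre-trained weights for a particular task."""
        self.w, self.b = w.astype(np.float32), b.astype(np.float32)

    def forward(self, x: np.ndarray) -> np.ndarray:
        return np.maximum(x @ self.w.T + self.b, 0.0)   # ReLU activation

layer = DenseLayer(4, 3)
layer.load_weights(np.random.rand(3, 4), np.zeros(3))
print(layer.forward(np.random.rand(2, 4)).shape)        # (2, 3)
```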
I see a future in quantum computing and photonics. It's going to improve exponentially, as did the transistor. I think the trick will be finding the perfect substrate. Excellent video. Thanks👍
How about more than 2 algorithms that outperform classical?
Quantum computing is bunk. Sabine Hossenfelder has a series on quantum computer nonsense.
@truthsocialmedia It is not a joke. I work in the field.
With photonics, you can run parallel processes. At our current point, we are able to use coherent light to run these computations. Even though the wavelength is much larger than that of classical electrons, the power consumption is reduced. It is not currently for everything, but it excels in certain areas. With current chip designs, we have run into many issues: physics is limiting our ability to build more complex reticles. The only solution at the moment is to rethink the way we use computational power. There is more to this field than most people understand. There are many ways to create qubits. Once a process is well defined, it's up to mechatronic engineering.
@@Sven_Dongle please elaborate 👍
@truthsocialmedia Even though I work in the field, I still respect your input, comment, and opinion. I actually have seen the video, and I am subscribed to her. 👍
I believe IBM entered the cloud space with the acquisition of SoftLayer around 2014, well before they acquired Red Hat.
I worked as an intern at IBM. Wish them the best. A great company then...
Cloud computing is not a new concept. Early on, folks connected dumb terminals to remote mainframes and servers for compilation, applications, and data. The advent of stand-alone PCs came about because of tech improvements and limited connection speeds (and WinTel monopolies). Innovations in high-speed connectivity brought the cloud back. Cloud 2.0 may be a better term.
IBM wasn't behind Microsoft and Amazon in cloud offerings. IBM was just avoiding getting into another commodity market like their PC experience. IBM was working with hybrid cloud solutions at the same time as Microsoft and Amazon.
The market just hadn't caught up to the relative value of a hybrid cloud with its data ownership versus commodity cloud.
Unlike Amazon and Microsoft, IBM actually steps in and offers consulting to design and provide end to end solutions as opposed to hosting and licensing.
Also, IBM didn't need an acquisition for consulting. IBM has been offering consulting services for nearly 70 years.
Right, pay $200-500-plus per man-hour for management consulting and overpriced technology solutions. Not forgetting they are still the Wall Street darling for consulting and services; if you want Wall Street support and stock ratings, you go along to get along. Their last great idea was the SaaS (Software as a Service) business model, which failed. Still, they have tons of assets in technology, people, and patents... they could still reconstitute, or they could just continue to milk those assets as they fade away into white noise. Cloud computing is a piece of cake for providers with deep, deep pockets; not surprising they are cashing in as best they can.
@abebrock5559 IBM manages property management systems for hotels, point-of-sale systems for auto stores, and architects supercomputers for the Department of Energy, while still hosting the majority of banking.
Cloud computing isn't the piece of cake you're making it out to be. Hybrid cloud even less so.
That, and this isn't IBM's first evolution. They started off with time clocks and typewriters.
@@wisenber Well stated. I acknowledge they are successful in cloud computing and are currently making a pretty penny doing it. IBM is not the leader in it, but they are successful. The question for me: does IBM have the commitment and the vision to prevail and innovate in it, or in the other technology ventures they are investing in? IBM could be the next SpaceX of the cloud and AI industry, or they might pull out altogether if the competition gains momentum. They have a long history of leaving markets that stay profitable for many, many years after IBM is gone. I wish IBM all the best, and wish they would bring back the pension plan, reduce executive salaries, and return manufacturing and R&D to the US.
@abebrock5559 I think you're missing what IBM actually does. Their hybrid cloud isn't what Amazon or Microsoft is doing, and neither Microsoft nor Amazon offer the consulting and architecture to have an actual solution.
And no, IBM doesn't have a long history of leaving markets. They remain in markets that predate both Amazon and Microsoft.
@@wisenber PC manufacturing, PC client operating systems, fiber optic adapters, DASD and storage solutions, communication controllers, laptops, collaborative computing, PC office suites, mid-range business solutions, digital phone systems, computer management solutions. To name a few off the top of my head, all still viable markets.
Sorry, but you had one detail wrong: IBM did not make the first PC, although they were the first to slap "PC" on the front cover. Prior to the IBM PC there were plenty of companies making PCs; Heathkit even made them (the H8 and H89 were both out before the IBM PC), not to mention Apple, Radio Shack, and many more. There was also an emerging common OS, CP/M, starting to get everyone on the same page. That said, IBM legitimized the PC as a "real" computer in the eyes of corporate America. I worked for a big bank prior to the PC launch, and at that time you could not get funding for a PC-based project to save your life. IBM released the PC, corporate executives could not sign up fast enough, and it was raining money for PC projects after that; that was the real turning point for PCs. We used to have a saying in corporate IT: you never get fired for going with IBM.
Yes, the first business PC, but it did little good even with an IRMA emulator board. Very little could be done on a PC. The fact is, all the data, computing, and utilities were still on the mainframe.
Thank you for covering one of our country's best assets!
You might explore Low-Precision Training Mechanism (LPTM) down to 1-bit inference, with glass chips by ASML. I was told this will be in the picowatt range, so it would be sustainable from the inference point of view, significantly lowering power consumption.
That research is promising. 😊
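For anyone wondering what "1-bit inference" means in practice, here is a rough NumPy sketch of binarized weights (sign values plus a per-output scale), in the spirit of BitNet-style low-precision schemes. It is a generic illustration of the idea mentioned two comments up, an assumption about the kind of technique meant, not LPTM or any ASML/IBM design.

```python
# Rough illustration of 1-bit (binarized) weights for inference: each weight is
# stored as a sign plus one scale per output, so the matmul needs only
# additions/subtractions. Generic sketch, not any specific product.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(64, 32))        # full-precision weights (e.g. from training)
x = rng.normal(size=64)

# Quantize: sign values plus a per-column scale (mean absolute value).
scale = np.abs(W).mean(axis=0)       # one scale per output unit
W_bin = np.sign(W)                   # entries in {-1, 0, +1}; stored as 1 bit in hardware

y_full = x @ W                       # reference full-precision output
y_1bit = (x @ W_bin) * scale         # 1-bit inference approximation

print(np.corrcoef(y_full, y_1bit)[0, 1])   # typically correlates well despite 1-bit weights
```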
Which models are their chips optimised for?
I assume inference for financial or business-related workloads, given the nature of IBM cloud solutions.
Sorry, small correction: the UK's Ferranti Mark 1 was the first commercial computer.
For me personally, the I.B.M. AS/400 was the first commercial computer. (Puppy Love ;-)
Mom worked at IBM. I remember going in there and seeing the punch-card-reading computer in Nashville. "IBM Means Service" 😊
She said the first "mainframe" computer.
@@lilblackduc7312 That was 1988. Sinclair and the BBC Micro in the UK, the Microbee in Australia, and many others were pumping out computers from the beginning of the 1980s. Apple was selling from the late 1970s, if I remember right...
Does the UK government pay for the army of commenters that comment this on American videos?
Finally. It's about time AI got a bit more focus for what it could be. Thanks!
Very professional. Very good. Thank you!
Ohh... your voice has been upgraded... enhanced... boosted up very well. Congrats!
A deinterlace filter removes the "comb" effect when splicing in classic TV broadcast footage.
Bought Cadence after viewing. Thanks for all the money!
The end of the video reminded me of a recurring thought. Quantum research feels like we are tapping into how human neurons work. Classical computing was designed for, and excels at, correcting the errors within quantum computing, i.e. something closer to human thinking. So IBM's approach of handling quantum error correction with classical computing seems ideal.
Fantastic channel! Can you make a video on the history of Silicon Valley companies like Intel, HP, Sun, Compaq, Dell, Google, Amazon, etc., and software companies like Microsoft? And another video on supercomputers and IBM Watson, and on the history of AI and programming languages?
The first personal computer was not launched by IBM. Before IBM there were Apple and a bunch of other companies. IBM watched how the PC era started for at least a few years before deciding it was time to step into the market with its own product.
In the '90s IBM had some big layoffs because of OS/2 and declining revenue in PCs and hardware. Many people thought the mainframe was going to be dead. But with large investments in new hardware and software built around the power of the mainframe, IBM is now a large player in cloud and in software supporting the consulting business. It took 15 years of changes, and now IBM is starting to grow again. They have a huge niche with large companies that deal with large amounts of data: banks, insurance, and everyone in the financial market. They have some of the best hardware engineers in the world. The sad part is that with AI becoming so huge, we have seen IBM, Google, Intel, Facebook, and others lay off around 40,000 to 80,000 people. Eventually it will hit other sectors too.
OOOooooooohhhhhhh, Anastasi!!!!!!!
My apologies. Just couldn't help myself. 😁✌🖖
Fantastic video :) I didn't know IBM was going that big into quantum computers.
IBM has always been a core technology company.
They always bring out interesting, genuinely new tech every decade or so.
Many brilliant people work at IBM, and IBM has a long track record of supporting its software platforms.
IBM wrote the book on large-scale business systems, transactional processing, and database technology.
Most banks and governments are heavily invested in IBM technology, and the relationship is solid.
Large businesses, banks, and governments have been successfully using IBM for 50 years. That leaves all the other businesses that needed IT solutions and often didn't have the capital to invest in IBM tech.
IBM management seems to have suffered from the same disease as many large companies: they seem to lose their focus and direction with an excess of middle management intent on golden handshakes.
IBM's business is to create fenced solutions for its banking and military clients. This type of chip is not really an innovation, but it will upgrade the solutions they can deliver at very expensive prices, so it will keep the company going for a few more years.
Loved the video! Thank you very much
IBM didn't make the first personal computer.
Their chip facilities are in Fishkill, NY (not Albany) and Essex Junction, VT.
They are classic players in both hardware and software. I always have my eyes and ears open when they publish something. They do make it harder to choose whom to run with nowadays. 😅
Hey guys at IBM. I am very impressed that you have so much invested in tangible products, rather than trading capital like the douchebags who are destroying the economy (might want to keep a close eye on PwC). One thing to consider, as far as the AI tech goes: it does not make sense to build multi-core computer chips with core counts below 100. NVIDIA is dominating the market because they have found a way to put more than 10,000 cores on a single chip. Yes, you heard that right: NVIDIA makes chips with core counts well north of 10,000. So while CPUs are important, GPUs have become extremely important. GPU software, such as shaders, is way out in front in terms of computing speed because it makes use of these massive numbers of cores. The AI software folks found a way to adapt that shader technology into neural network software.
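As a rough illustration of that shader-to-neural-net point (my own sketch, not NVIDIA's or IBM's actual kernels): a shader runs one small program per pixel, and a GPU neural-network kernel runs essentially the same pattern, one small dot product per output neuron, which is why a layer collapses into a single large matrix multiply that thousands of cores can process in parallel.

```python
# Sketch: the "one small function per output element" pattern that shaders use,
# applied to a neural-net layer. On a GPU, each iteration of this loop would be
# one thread/core; frameworks express the whole loop as a single matmul.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1024)            # layer input
W = rng.normal(size=(1024, 4096))    # 4096 output neurons

def neuron_kernel(j):
    """Per-output 'kernel', analogous to a per-pixel shader program."""
    return max(0.0, float(x @ W[:, j]))

y_loop = np.array([neuron_kernel(j) for j in range(W.shape[1])])  # conceptually parallel
y_gemm = np.maximum(0.0, x @ W)                                   # same result as one matmul

print(np.allclose(y_loop, y_gemm))   # True: the shader-style loop and the matmul agree
```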
It was a nice briefing about IBM's microchip.
Would you explain IBM's 133-qubit microchip in detail, please?
I would like to learn more about microchips, qubit algorithms, and photons. I used to operate a robot manually from its control panel, because the robot didn't work automatically. I learned it in one hour with zero experience in technology and computers.
Is it possible to learn to build a powerful IBM microchip without a high school diploma? Because I love quantum mechanical engineering.
Thank you.
It's the quantum computing that blows my mind, and why I'm buying IBM stock.
At 1:24: they were all running Microsoft DOS; when the clones were really being made, Windows wasn't available yet. Windows eventually dominated PCs, but to say they were all running Windows is putting the cart before the horse.
I'm a huge fan. Thank you for these videos!
Quantum computing sounds like crazy future technology; no idea how they figured that one out. Amazing.
I've probably seen a dozen of IBM's ghost buildings throughout New York state and the northeast. It's probably the oldest surviving visionary of the computing industry, born by merging multiple manufacturers of some of the most advanced mechanical and analog business calculation, data recording, and data processing technologies available in the early 20th century. After WWII they became pioneers in transistor technology, and since it had already come to dominate the old analog and mechanical computing technologies used in practice by governments and businesses, the leadership under Watson Jr. (son of the original founder) made the use of transistor technology mandatory for all the equipment they were manufacturing and built some of the largest microelectronics and semiconductor research facilities throughout NY. Indeed, IBM's R&D spending and partnerships with NY state universities made Albany the world leader in nanotechnology and semiconductor research, and by partnering with advanced chip makers like Samsung, IBM is boosting its manufacturing capacity to give the world next-generation microchip technologies years or decades before any other competitor. And due to their intimate relationship with the Pentagon, the military and its multi-billion-dollar futuristic war machines are often the first real-world applications of the technology, before it reaches the mass-production capacity needed to introduce civilian and commercial variants.
I was there. Most of them didn't run Windows. MS-DOS came first, but there were a lot of better OSes out there in the 1980s.
Pro tip: Don't talk about a stock's performance without comparing it to key indexes (e.g. S&P) and sector performance. There's no context to just saying up 40%.
Excellent comment, context is key !
Consulting being the least profitable makes sense. You can charge good money for consulting, but to do that you need good consultants, who demand good pay or they'll go elsewhere or become independent. Much of consulting revenue presumably goes straight to payroll. Still, good luck building a software empire without consultants to help sell and implement it. Just because it's not the most profitable doesn't mean it's not extremely important to the business model as a whole.
Why all the comments about the "first mini-computer"? Who cares??? Oh yeah, I have a PC! 😏
This is about the evolution of the hardware: the mainframe hosting cloud applications rather than using server farms… it simply makes more sense as networks become more redundant and bandwidth more capable of scaling traffic… anyone who's had to sit eyes-on-glass to maintain an Azure or AWS production platform will appreciate that IBM's "batch" runs with a fraction of the resources… in fact IBM runs Amazon's card services, ACAPS and TSYS, using good ol' COBOL, CICS, and IMS, or else you couldn't get a credit check in a millisecond or post billions of transactions by 5 AM!
BREATHE DEEP! The world is bigger than your desktop or consumer-based server farms.
Beauty and brains. A heady combination.
*_ThanX So Much._*
_Thanks from Poland._
When I hear the name IBM, my immediate response is to conjure up an image of the typeset word "THINK" in classic roman text, framed behind glass and hung over one of the workbenches where, for more than 50 years, my father serviced IBM office machines in bustling downtown Chicago, raising a family through the post-war 50s, the cold-war 60s, the sufferin' 70s, and right up until the end of the word processor, as the new century and a new world of office digitalization was being born... the end of mechanical machines, with the tolerances of Swiss watches, sitting on every office worker's desk across the world. I still dream of the last Selectric II I caressed, on which I typed a long letter that had a few typos here and there, but at no time was I ever likely to see a blue screen telling me my lovely letter was lost and irretrievable. It's good to know that when that eventual day comes, I'll see IBM there in the cloud, waiting for me to finally catch on.
Very informative video. Thanks!
I have fond memories of IBM for the simple fact of playing Haunted House Mansion 😂 My first introduction to video games, back when there was only one color, green, lol.
The historical introduction is quite distorted. IBM didn't make computers popular, Apple II + VisiCalc did, although it did design a personal computer in the mid 70s that had no impact. PCs were never the main source of IBM revenue, mainframes and associated software, peripherals and service were.
Without IBM we would not have what we have today. What champions they have been for the industry. I worked with mainframes for thirty years, on DOS/VSE, VM, MVS, and z/OS, all backward compatible. In those thirty years we had one two-day downtime, because of a busted physical disk pack. I was always mindful of the sheer grunt work needed to create their systems. 99.9% uptime says a lot.
IBM was nowhere near the first to make a desktop personal computer. Personally affordable desktop computers, known as PCs, started in the early 1970's. Altair, IMSAI, Cromemco, Apple, Commodore, Atari, Tandy/Radio Shack and many many others were delivering PCs long before IBM even started on theirs.
I guess she meant the first "IBM compatible" PC? 🤔
They say the next generation of computers will be developed in a hexagonal format, from the cutting of the silicon wafers to the processors in their respective sockets on the motherboards...
No. Reliably dicing apart the wafer requires criss-cross cuts. It's like cutting sheet glass into tiles.
IBM was destroyed in the 90s by bad management.
IBM was great in the '70s with mainframes. Since that time the company went downhill due to mismanagement and lost its connection to new technologies. They lost all their valuable research and technologies, and finally they are left doing consulting business and the leftover mainframe work. All due to mismanagement, with management staff clueless about the technology challenges, and the company's know-how ultimately lost instead of leveraged. The stock value continuously decreased after 2012 and has only just now reached the 2012 level again.
They did not introduce the hard drive. It's my understanding companies had them in the early 1960s before them.
I need some information about quantum computing at IBM. What are the chances of improving the error rate of this technology?
Quantum pc wow sounds amazing ❤
Maybe, this will make IBM relevant again 🧐.
Now I am curious about those bad decisions you have made 😎
Watson😂 That disaster was forgotten quickly.
International Business Machines Corporation. I never knew what IBM stood for.
After their huge blunder regarding the capitalization of software, IBM has been two steps behind ever since.
If it's not better than NVIDIA, then it doesn't matter.
Strange, I could barely watch and listen to the video because of how slowly you were talking, but with the video at 1.5x speed it's perfect.
IBM is a very good stock. It has been going up since 2023, depending on how you look at it.
I still don't know why IBM's AI chip should be a breakthrough, as there are several companies producing such chips.
Their business grows in software and consulting (although the growth could also be explained by decreases in other sectors; I am not saying it was wrong, but it always makes me a bit suspicious when relative numbers are presented as if they were absolute values. A revenue increase of 470M actually sounds more impressive than 7%, imho :D).
I assume IBM is using their new chips to facilitate their own AI services?! That was not pointed out in the video.
But what is the unique benefit of using IBM AI cloud services over AWS, Azure, or Google?
@@dgrxd That's what also came into my mind (just without the details and numbers :D).
I assume those chips are very well designed for specific purposes for IBM's cloud services. For sure IBM will also use the 'real' AI chips.
Those chips very likely will reduce operating costs.
Even then, designing a new chip just for their own datacenters while there are already similar chips ready and available?! It makes me really wonder why IBM put resources into this chip.
These chips are designed to add AI operations to high-throughput, high-transaction-load scenarios operating on the order of 300K-1M transactions per second. Those systems already existed before AI and were running at that scale. And now their customers want to be able to run inference on the same systems without a performance hit.
That is not possible with commodity hardware.
These are not supposed to be bleeding edge AI chips. These are supposed to run distilled and quantized AI models from 2-3 generations ago in scenarios where it was previously technically infeasible to use them at all.
@@timseguine2 They are not hardwired to specific models but implement common operators such as those in PyTorch and TensorFlow, and so have some degree of future-proofing.
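For a sense of what running a quantized model through such common operators looks like, here is a generic int8 sketch of a single linear op in NumPy. It is purely illustrative, assuming ordinary symmetric int8 quantization, and says nothing about how IBM's accelerator actually works.

```python
# Generic sketch of int8-quantized inference for one linear operator, the kind
# of common framework op (a Linear / matmul) referenced above. Illustration only.
import numpy as np

rng = np.random.default_rng(3)
W = rng.normal(size=(128, 64)).astype(np.float32)
x = rng.normal(size=128).astype(np.float32)

def quantize_sym(a):
    """Symmetric int8 quantization: int8 values plus one float scale."""
    scale = np.abs(a).max() / 127.0
    return np.round(a / scale).astype(np.int8), scale

W_q, w_scale = quantize_sym(W)
x_q, x_scale = quantize_sym(x)

# Integer matmul (accumulate in int32, as int8 hardware units typically do),
# then dequantize back to float with the product of the two scales.
acc = x_q.astype(np.int32) @ W_q.astype(np.int32)
y_int8 = acc.astype(np.float32) * (w_scale * x_scale)

y_fp32 = x @ W
print(np.max(np.abs(y_int8 - y_fp32)))   # small quantization error vs. full precision
```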
It always comes from wonderful Suresh's Patelution.
I like your videos.
Dammit, looks like I'm gonna have to buy a couple of these stocks too.
@10:38 I'm confused. Paraphrasing: "roughly 70% of the world's transactions are run through the IBM hybrid cloud." Are you saying 70% of consumer transactions run through this cloud?
Your description of the role IBM played in the emergence of PCs is inaccurate. They launched their PC division in response to the inroads Compaq Computers made into their corporate customer base. Their mainframe division was the most profitable portion of the business then and it always looked for ways to “kill” their internal rivals. You are correct that they never found a way to keep up with the competition on any other type of hardware which is why they eventually sold off their storage, printer, and PC businesses. Finally, you should look into the details of the software business. I’ll bet that a significant portion of it is recurring license fees from some very, very old platforms that still run the deep transactional core components of their financial services and healthcare clients.
Thanks!
Straight Arrow News and Ground News seem to have the same goal. What differences are there between them?
Is it just me, or is Straight Arrow News only available as an app for Apple's operating systems?
I predict IBM's AI chip will be a distant 10th behind all the other AI hardware.