*When a super AI merges with a super quantum computer system, humanity will definitely be summoning a god mode level of computing that will surely change everything forever!*
I know and like a number of people working at IBM. So much deep research and pure science going on there, it's the sort of company you want to keep afloat, even if market forces aren't always working in its favor.
Guys, consider this: they have a service business model; they earn mostly from consulting and maintaining the systems they supply. They don't want you to have full access to their systems, or they make them hard to navigate, so they can continue to send their people to train your people to use their systems. IDK how supportive I should be of this business model.
Anastasi, you're amazing. I studied chip design in high school through the Tektronix explorer scouts, and we even laid out our circuits and had them run on an empty part of a wafer. It was pretty cool for a bunch of high school kids. Sadly, I didn't go to college until I was in my mid thirties; three associate degrees, not in tech, but foreign language and business. And after being a one-man IT shop for twenty-some years, I had to give it up. But I'm looking to get back into programming (there are a couple of products that I helped design hardware for after high school, as well as wrote the software for them). One dominates the retail glass shop management industry, and the other was hardware and software that was an early player in digital call generation and user response processing. Sadly, I didn't have stock in either company.
No. Many clones of the IBM PC were manufactured and sold, of course, but they were NOT running Windows. They were running DOS. Windows (3.11) didn't appear until years later.
I had to laugh when your video showed the first IBM PC. I bought that PC the day they released it for sale in 1981 or 1982. It had an Intel 8088 processor, 512k of memory, dual floppy drives (hard drives weren’t a thing yet) and a monochrome monitor. It cost me $3,200 back then. I eventually used it to do COBOL coding with Realia COBOL. It saved me a fortune because buying time on mainframe computers back then was unaffordable for an independent programmer. I think it took me 6 months to make my investment back. I actually upgraded the PC with an AST SixPack+, which gave it a real-time clock (the stock PC date and time had to be manually entered each time it was powered up) and increased the memory to an insane 640k. I seem to remember some additional ports on that card. It is an absolute joke next to today’s technology, but it was worth its weight in gold in the early 80’s. Thanks for the fond memories. 😊😊😊😊
From Bharat: I am 67 years old and I did my electrical engineering. I follow your channel most of the time. All the information you provide keeps my brain healthy, as it jogs it for some time. Our age group is one of the lucky ones, as we have seen technology evolve and become available to the common population. I have used slide rules, the first commercially available scientific calculator (RPN type too), card-reader computers which occupied huge rooms, then floppy-disk [flippy too] {8 inch / 5.25 inch / 3.5 inch} computers, then PCs, laptops, pagers, the first cellphones [now called feature phones, I guess], then smartphones and finally AI solutions. The only thing I guess our generation may not use is the quantum PC!! If we can use that too, we will literally have quantum-jumped from slide rules to quantum computing!! Regards
I've often wondered where you get the ideas for the topics that you come up with, Ms Anastasia. You really cover such a great range of topics. Thank you for the time and effort you bring to the industry.
Thanks for this video. I'm retired. I no longer work in this field as an engineer. Your videos are very interesting and your explanations are clear and supported by your remarkable intelligence.
I imagine younger viewers watching the footage of people using those big floppy discs feel like I do watching films from 1910, with everyone travelling around by horse and cart!
Neural network architecture is extremely compatible with implementation in silicon. It consists of a large number of fairly simple “neurons”. The main problem is interconnection. You want to connect every neuron in one row to every neuron in the next. The other problem is the weight of the connection. You want to load in weight values for a particular task, or adjust weights incrementally during learning. But NNs are so useful, I am confident these problems will be overcome.
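The interconnect and weight-loading problems described above are, in software terms, just a dense matrix–vector product; a minimal NumPy sketch (hypothetical layer sizes, not any particular chip) of loading weights and propagating one row of neurons to the next:

```python
import numpy as np

# Hypothetical layer sizes: every neuron in one row connects to every
# neuron in the next, so the "wiring" is an n_out x n_in weight matrix.
n_in, n_out = 4, 3

rng = np.random.default_rng(0)
weights = rng.standard_normal((n_out, n_in))  # loaded for a particular task
x = rng.standard_normal(n_in)                 # activations from the previous row

# One fully connected step: n_out * n_in multiply-accumulates.
y = np.maximum(weights @ x, 0.0)              # simple ReLU "neurons"

# Incremental learning = small additive adjustments to the weights.
weights += 0.01 * np.outer(y, x)
print(y.shape)  # (3,)
```

In hardware the same structure becomes a crossbar: one physical wire per matrix entry, which is exactly why the interconnect count grows as n_out × n_in.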
On December 11th, 1972, Q1 Corporation sold the first Q1 microcomputer, based on the Intel 8008 microprocessor. It had the first QWERTY keyboard, printer and floppy. But it was IBM that sold one of the first personal-scale computers back in 1957: the IBM 610, with a transistor-and-tube processor and a $55K price tag.
I remember when the machines in the first clips were in use; I operated some of them. IBM's streak to stardom came when WW2 broke out: IBM went to the government and asked what it could do for the war effort, and soon every military branch had an IBM. Then, when the first commercial mainframe was glitchy, they sent out the engineers who designed it to fix it, establishing their reputation for standing behind their product: reliability. It didn't help their reputation when they made a deal with Apple and then reneged.
This breakdown of IBM’s chip innovation is well done! The focus on accelerating AI workloads for mainframes is timely, looking forward to seeing how it plays into larger trends like edge computing.
@@lilblackduc7312 that was 1988. Sinclair and the BBC Micro in the UK, the Microbee in Australia and many others were pumping out computers from the beginning of the 1980s. Apple was selling from the late 1970s, if I remember right...
Never a fan of Apple, I built PC compatibles starting in the mid-'80s. I purchased a significant number of additional IBM shares in 2017 and watched it drop not long after. I didn't plan on selling, so I still have a lot of IBM, and Intel as well. Moral of the story: if you're gonna play the market, you have to be willing to ride the financial/emotional roller coaster. Glad IBM is doing so well these days. They've always been innovators, and they are never going belly up, so it's a safe stock, one of the "too big to fail" companies.
I wouldn’t “play the market”. I think you’re correct about not letting market volatility shake you, but that doesn’t mean all stocks will inevitably rise, not even blue chips. Buying smaller blocks over time can insulate you from some volatility, and if you’re lucky and paying attention you might get in on the dips. You don’t sound all that different than me, I just don’t like the term “playing the market”.
@@MarcosElMalo2 Having a diversified portfolio safety net also helps ride out the storms. The old maxim "Don't put all your eggs in one basket." is sound advice for long-term security as well.
I think there’s a lot of growth potential in Linux and containerization, which run on mainframe or LinuxONE hardware with Red Hat, so you can get real-time AI and a consolidated private cloud stack that doesn’t have the same security flaws as distributed hardware.
It was a nice briefing about IBM's microchip. Would you explain IBM's 133-qubit chip in detail, please? I'd like to learn more about microchips, qubit algorithms and photon particles. I used to operate a robot manually from a control panel, because the robot didn't work automatically. I learned it in one hour with zero experience in technology and computers. Is it possible to learn to build a powerful IBM microchip without a high school diploma? Because I love quantum mechanical engineering. Thank you.
I'm glad IBM is doing well, because at least historically, they were customer supportive; their mainframes were beautifully built, even in the parts not visible to the customer; they insisted on acting and looking professional, which may have been a bit too much at times, but it could bring out your best, and inspire it in others.
They are classic players in both hardware and software. I always have my eyes and ears open when they publish something. They do make it harder to choose whom to run with nowadays. 😅
Hey guys at IBM. I am very impressed that you have so much invested in tangible products, rather than trading capital like the douche bags who are destroying the economy (might want to keep a close eye on PwC). One thing to consider: as far as AI tech goes, it does not make sense to build multi-core chips with core counts below 100. NVIDIA is dominating the market because they have found a way to put more than 10,000 cores on a single chip. Yes, you heard that right: NVIDIA makes chips with well north of 10,000 cores. So while CPUs are important, GPUs have become extremely important. GPU software such as shaders is way out in front in computing speed, making use of these massive core counts. The AI software guys found a way to adapt shader technology into neural network software.
Thx for the report, pretty informative and good to know how IBM turned from almost dead to highly profitable and growing. One tiny piece of advice: this vid would be shorter if you sped up your speech; for example, I went with 1.25 speed as it was kind of a slow pace 😉 Shorter vids r better than long ones 👍
The end of the video reminded me of a recurring thought: quantum research feels like we are tapping into how human neurons work. Classical computing was designed for, and excels at, correcting errors within quantum computing, i.e. human-style thinking. So IBM's approach to quantum error correction with classical computing seems ideal.
You might explore a Low-Precision Training Mechanism (LPTM) down to 1-bit inference, with glass chips by ASML. I was told this would run at picowatts, so it would be sustainable from the inference point of view, significantly lowering power consumption.
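For reference on what "1-bit inference" buys you: with weights and activations binarized to ±1, a dot product collapses to XNOR plus popcount, which is why the power numbers get so small. A minimal NumPy sketch (generic technique, hypothetical vector size, nothing to do with any specific ASML or IBM product):

```python
import numpy as np

rng = np.random.default_rng(2)
w = np.sign(rng.standard_normal(256))   # binarized weights in {-1, +1}
x = np.sign(rng.standard_normal(256))   # binarized activations in {-1, +1}

# Full-precision dot product...
ref = float(w @ x)

# ...equals XNOR-popcount on the bit form: agreements minus disagreements.
wb = w > 0
xb = x > 0
matches = np.count_nonzero(wb == xb)    # XNOR counts bits that agree
dot = 2 * matches - len(w)
print(dot == ref)  # True
```

The multiply-accumulate hardware disappears entirely; only bitwise logic and a counter remain, which is the source of the claimed power savings.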
No, with research and development engineering, the patents are a platform broken down into its smallest parts: every subsystem of a platform, every new step, and any plausible use case it could serve. Maybe a few are for competitive reasons, but that's against their competition, other global corporations doing the same. Most of the patents will be legacy systems they already owned or developed. Think about how long they've been around. Patent inertia lol. IBM and the like aren't out there stopping some random small player from making progress. Get real.
It’s a fair point @marcusk7855, it’s like the exploration for minerals and acquiring tenure. The big players can afford to hold and support lots of tenure, sometimes just to prevent other companies getting a foothold. Work the system, as always. I do wonder if they even know what they have? Managers move on, and so do technical people; stuff gets forgotten and businesses change focus, too.
Their hybrid cloud solutions are just the offerings they provide to companies who are too embedded within IBM's infrastructure to leave their platform. Their consulting business is the service contracts they offer to those same customers who can't leave their legacy systems.
Cloud computing is not a new concept. Early on, folks connected dumb terminals to remote mainframes and servers for compilation, applications, and data. The advent of stand-alone PCs came about because of tech improvements and limited connection speeds (and WinTel monopolies). Innovations in high-speed connectivity brought the cloud back. Cloud 2.0 may be a better term.
I see a future in quantum computing and photonics. It's going to improve exponentially, as did the transistor. I think the trick will be finding the perfect substrate. Excellent video. Thanks👍
@truthsocialmedia It is not a joke; I work in the field. With photonics, you can run parallel processes. At our current point, we are able to use coherent light to run these computations. Even though the wavelength is much larger than that of classical electrons, the power consumption is reduced. It is not currently for everything, but it excels in certain areas. With current chip designs we have run into many issues; physics is limiting our ability to build more complex reticles. The only solution at the moment is to rethink the way we use computational power. There is more to this field than most people understand. There are many ways to create qubits. Once a process is well defined, it's up to mechatronic engineering.
@truthsocialmedia Even though I work in the field, I still respect your input, comment, and opinion. I actually have seen the video, and I am subscribed to her. 👍
Straight Arrow News and Ground News seem to have the same goal. What differences are there between them? Is it just me, or is Straight Arrow News only available as an app for Apple's operating systems?
Consulting being the least profitable makes sense. You can charge good money for consulting, but to do that you need good consultants, who demand good pay or they'll go elsewhere or become independent. Much of consulting revenue presumably goes straight to payroll. Still, good luck building a software empire without consultants to help sell and implement it. Just because it's not the most profitable doesn't mean it's not extremely important to the business model as a whole.
My understanding is IBM outsourced the development and manufacturing of their first PC. And that is how they were able to beat Digital Equipment Corporation to market.
IBM was nowhere near the first to make a desktop personal computer. Personally affordable desktop computers, known as PCs, started in the early 1970's. Altair, IMSAI, Cromemco, Apple, Commodore, Atari, Tandy/Radio Shack and many many others were delivering PCs long before IBM even started on theirs.
Question: Is quantum computing a form of analog computing (reading varying states instead of digital 1/0)? That is how I like to think of it, but I may be incorrect.
What if IBM tried a test field or garden as a "forever state," where everything restores at the end of the day, a falling leaf goes back on its branch at day's end, and the weather stays soft with no extremes, no matter where on Earth? Based on the principle of a hard drive of time, space, sound and light fields interacting: all change during the day is information, like input and output feeding a global readjustment, and at the end of the day it restores back to default.
Would an FPGA architecture which could be programmatically configured to model application specific neural nets, and then process the program data independent of the CPU/GPUs be a better path? - given the smaller nm methods and other cost savings which historically have made FPGAs pricey or limited in nodes
IBM has promised a lot for many years. At one point their accounts persuaded Warren Buffett to invest, but after several years of poor performance he sold in 2017-2018 and later invested in Apple. His IBM stake was around $10B, so it was a sizeable investment. Has IBM changed, or is it too widely focused, operating in areas that lack the growth of Apple or Nvidia? Now the market seems to feel that IBM has changed and will grow rapidly. Is the market right, or will IBM disappoint again? In my humble opinion there seem to be other options with more growth potential. Thank you for sharing!
Thank you... Imagine folks who started their computing skills, along with IBM, prior to most of these modern-day so-called "Tech Billionaires"... they were unable to decipher the passage... it's not a mystery why...
When I hear the name of IBM my immediate response is to conjure up an image of the typeset word "THINK" in classic roman text, framed behind glass and hung over one of the workbenches where for more than 50 years my father serviced IBM office machines in a bustling downtown Chicago, raising a family through the post-war 50s, the cold war 60s, the sufferin' 70s, and right up until the end of the word processor, as the new century and a new world of office digitalization was being born...the end of mechanical machines that had the tolerances of Swiss watches sitting on every office worker's desk across the world. I still dream of the last Selectric II I caressed and with which I typed a long letter that had a few typos here and there, but at no time was I ever likely to see a blue screen telling me my lovely letter was lost and irretrievable. It's good to know that when that eventual day comes I'll see IBM there in the cloud, waiting for me to finally catch on.
I still don't know why IBM's AI chip should count as a breakthrough, as there are several companies producing such chips. Their business grows in SW and consulting (however, the growth could also be explained by decreases in other sectors. I am not saying it was wrong, but it always makes me a bit suspicious when relative numbers are presented as if they were absolute values; a revenue increase of $470M actually sounds more impressive than 7%, imho :D ). I understand if IBM is using its new chips to facilitate its AI services, which was not pointed out in the video. But what is the unique benefit of using IBM's AI cloud services over AWS, Azure, Google?
@@dgrxd That's what came to my mind as well (just without the details and numbers :D). I assume those chips are designed for very specific purposes in IBM's cloud services. For sure IBM will also use the 'real' AI chips, and these chips will very likely reduce operating costs. Even so, designing a new chip just for their own datacenters while similar chips are already available?! Makes me really wonder why IBM put resources into this chip.
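As a sanity check on the relative-vs-absolute point above: if a 7% increase corresponds to roughly $470M, the implied base revenue follows directly (figures taken from the comment for illustration, not from IBM's filings):

```python
# Back out the base revenue from the comment's own figures (illustrative only).
increase_abs = 470e6   # ~$470M increase, as stated in the comment
increase_rel = 0.07    # 7% growth

base = increase_abs / increase_rel
print(f"implied base revenue: ${base / 1e9:.1f}B")  # ~$6.7B
```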
These chips are designed to add AI operations to high throughput high transaction load scenarios which are operating on the order of something like 300K - 1M transactions per second. Those systems already existed before AI and were running at that scale. And now their customers want to be able to run inference on the same systems without a performance hit. That is not possible with commodity hardware. These are not supposed to be bleeding edge AI chips. These are supposed to run distilled and quantized AI models from 2-3 generations ago in scenarios where it was previously technically infeasible to use them at all.
@@timseguine2 They are not specifically hardwired to the models but implement common operators such as those in pytorch and tensorflow and so have some degree of futurity.
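"Distilled and quantized" models mostly reduce to integer matrix math, which is what makes inference feasible on transaction hardware. A minimal sketch of symmetric per-tensor int8 quantization (a generic technique for illustration, not IBM's actual Telum/Spyre implementation):

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w is approximated by scale * q."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(1)
w = rng.standard_normal((64, 64)).astype(np.float32)
q, scale = quantize_int8(w)

# Dequantize and check the worst-case error: at most half a quantization step.
err = np.abs(w - q.astype(np.float32) * scale).max()
print(err <= 0.5 * scale + 1e-6)  # True
```

The matmuls then run on int8 accumulators and the scale factors are folded back in afterwards, which is the same pattern the PyTorch/TensorFlow operators mentioned above expose.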
Your description of the role IBM played in the emergence of PCs is inaccurate. They launched their PC division in response to the inroads Compaq Computers made into their corporate customer base. Their mainframe division was the most profitable portion of the business then and it always looked for ways to “kill” their internal rivals. You are correct that they never found a way to keep up with the competition on any other type of hardware which is why they eventually sold off their storage, printer, and PC businesses. Finally, you should look into the details of the software business. I’ll bet that a significant portion of it is recurring license fees from some very, very old platforms that still run the deep transactional core components of their financial services and healthcare clients.
45B transistors is in the range of NVIDIA's GA103(?). Of course the GA103 is not a general CISC processor but an RTX GPU, though the next gen is coming soon. Also it seems to me that the stack of Telum and Spyre chips will struggle on the bus-bandwidth side, where Cerebras excels.
Back in the day IBM computers did the calculations for NASA calculating re-entry and landing position. 64KB of memory was a big deal back in those times.....😁
Quantum processing will probably land more in the realm of robotics. How we use this technology will dictate where computing's going, no longer sitting at a keyboard but interacting with robotics.
Word is that the next generation of computers will be developed in a hexagonal format, from the cutting of the silicon wafers to the processors in their respective sockets on the motherboards...
Well, we need new chips to do AI to get away from GPUs, because they are too expensive. The question is, will IBM produce AI chips for the consumer market?
Sorry, but you had one detail wrong: IBM did not make the first PC, although they were the first to slap "PC" on the front cover. Prior to the IBM PC there were plenty of companies making PCs; Heathkit even made them (the H8 and H89 were both out before the IBM PC), not to mention Apple, Radio Shack, and many more. There was also an emerging common OS, CP/M, starting to get everyone on the same page. That said, IBM legitimized the PC as a "real" computer in the eyes of corporate America. I worked for a big bank prior to the PC launch, and at that time you could not get funding for a PC-based project to save your life. IBM released the PC, corporate executives could not sign up fast enough, and it was raining money for PC projects after that; that was the real turning point for PCs. We used to have a saying in corporate IT: you never get fired for going with IBM.
Without IBM we would not have what we have today. What champions they have been for the industry. I worked with mainframes for thirty years, on DOS/VSE, VM, MVS and z/OS, all backward compatible. In those thirty years we had one two-day downtime, because of a busted physical disk pack. I was always mindful of the sheer grunt needed to create their systems. 99.9% uptime says a lot.
Great, big up! I want to make a lithography machine so I could use it for the autonomous AI robots I have built, to build a big city as soon as possible. Can you help me make autonomous robotic AI to build a city?
I saw some summaries of what's actually been done by quantum computers so far, and unlike what so many of the stories have said, they really have not delivered. No one, and I don't just mean IBM, has had success getting a quantum computer to provide an accurate answer, in any way, shape or form, that beats a regular conventional computer when both start from scratch on a non-predefined problem and answer. I'm not sure quantum computing is going to pan out.
7:43 It looks likely that a more "real" (rather than simulated) neural network is built into the chip to run AI, and it's just a matter of time before Skynet becomes reality. 14:14 Your data might be a little out of date; I've heard from another YT channel that "according to the new plan, they will stay at 1000 qubits at least for the next few years (till 2028) and instead focus on error correction."
Linux is a foundation start. The reality of secure computing will include modifications of the kernel, limits and check/balance operations with streamlined code for the cases in which it is applied. Many machines are running variations of *nix, but they are not equal in flexibility or security. *nix is a great starting point, but the best of the best will be tailored directly to the hardware strengths algorithmically and vice-versa. "Virtual Organic Connection" or "Bit based usage/workload compression" At this point a new term must be used.
As I said: IBM has forgotten more about building computers than all the others combined have ever known. They have lasted for 140 years and will last at least the next 140. 😁
Correct me if I'm wrong, but Quantum computing won't explode in society until there is a near or fully ambient temperature/pressure superconductor discovered or invented. I infer that a materials-oriented AI will be required for that. So, (1) AI to get us an ambient t/p SC (2) it becomes trivial to scale quantum bits (3) we all get quantum wristwatches. Yes/no?
No, quantum computing is more like cold fusion: it's always 20 years in the future. AI isn't a panacea either; it's basically an automated pattern matcher, more like an accelerator, but it's still the humans making decisions. Even if they make QC work, it probably will never be portable and will always be a remote device running in a data center; getting it to run at ambient temperature seems much harder than getting it to scale to enough qubits to be useful. And cooling doesn't miniaturize very well; you are going to need to discover new physics for that, and you can't just put a deadline on the discovery of new physics: it could happen in the next decade or the next century. I think we have already reached the limit of miniaturization for mobile devices; networking will be riding the innovation curve for the next decade. Mobile computing is ironically going to become more and more like mainframes. I would invest in IBM. Mobile phones aren't going to run the computation themselves; they're going to be dumb terminals. The fact that we can't miniaturize cooling has already made mobile devices hit a wall of power dissipation: your phone can't use more than 20W. If you could solve just that problem, you could use a desktop chip with 500W of power dissipation in a mobile, and that would already be a 10X to 30X increase in performance without even needing QC. I would bet my coins on optical computing instead; optronics seems like it would allow a paradigm shift away from CMOS. I won't bet anything on QC for the foreseeable future. QC is useful for exchanging cryptographic keys; that's the only actual commercial use I've ever seen for it.
There are no quantum algorithms, except for a prime-factoring algorithm (Shor's) and a database-search algorithm (Grover's), that perform faster than classical computing; that's the biggest limitation so far. What they are good at is simulating quantum systems like magnetic spin models, not much use to your average spreadsheet user.
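For scale, the database-search speedup mentioned above (Grover's algorithm) needs on the order of √N oracle queries instead of N; the gap is easy to tabulate without simulating anything quantum:

```python
import math

# Classical unstructured search: up to N queries in the worst case.
# Grover's algorithm: roughly (pi/4) * sqrt(N) oracle queries.
for n_bits in (10, 20, 30):
    N = 2 ** n_bits
    classical = N                                   # worst-case classical queries
    grover = math.ceil(math.pi / 4 * math.sqrt(N))  # Grover query count
    print(f"N=2^{n_bits}: classical {classical:>13,}  grover {grover:>8,}")
```

A quadratic speedup is real but modest, which is part of why the practical-advantage question in the comment remains open.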
This video is sponsored by Straight Arrow News. Check out SAN using my link: www.san.com/anastasi
A snitch chip, Anastasi?
Thank you for this concise and very informative video.
Bravo 🎉
Yap!
Agree. Especially all their silicon technology stuff that they are still running.
It was interesting to learn from this video, why.
Consider the following: The pathway to hell is paved with good intentions.
I've followed IBM for many decades. They are like the salt in the technological ocean. They go up, they go down, but their influence is undeniable.
Open the pod bay doors HAL
I meant to add, that your videos are great, and fun to watch. Thank you for the time you give to making them -- they really are excellent!
Thanks for singing my song.😊
Bhakt spotted
My Dad used the punch card computer at IISc when it arrived.
I get almost all of my news and info from YouTube. I think YouTube producers need to understand that unique news and analysis is always desired.
YouTube is notorious for its censorship.
You should read serious news channels.
Sorry, small correction: the UK's Ferranti Mark 1 was the first commercial computer.
For me personally, the I.B.M. AS/400 was the first commercial computer. (Puppy Love ;-)
Mom worked at IBM. I remember going in there and seeing the punch-card-reading computer in Nashville. "IBM Means Service" 😊
She said it was the first "mainframe" computer.
@@lilblackduc7312 That was 1988. Sinclair and the BBC Micro in the UK, the Microbee in Australia, and many others were pumping out computers from the beginning of the 1980s. Apple was selling from the late 1970s, if I remember right...
Does the UK government pay for the army of commenters that comment this on American videos?
Never a fan of Apple, I built PC compatibles starting in the mid 80's. I purchased a significant number of additional IBM shares in 2017 and watched it drop not long after. I didn't plan on selling, so I still have a lot of IBM, Intel as well.
Moral of the story: if you're gonna play the market, you have to be willing to ride the financial/emotional roller coaster. Glad IBM is doing so well these days. They've always been innovators, and they are never going belly up, so it's a safe stock, one of the "too big to fail" companies.
I wouldn’t “play the market”. I think you’re correct about not letting market volatility shake you, but that doesn’t mean all stocks will inevitably rise, not even blue chips. Buying smaller blocks over time can insulate you from some volatility, and if you’re lucky and paying attention you might get in on the dips.
You don’t sound all that different from me; I just don’t like the term “playing the market”.
Imagine that you sold right before that peak... And bought right at the dip.
@@MarcosElMalo2 Having a diversified portfolio safety net also helps ride out the storms. The old maxim "Don't put all your eggs in one basket." is sound advice for long-term security as well.
I think there’s a lot of growth potential in Linux and containerization, which run on mainframe or LinuxONE hardware with Red Hat, so you can get real-time AI and a consolidated private cloud stack that doesn’t have the same security flaws as distributed hardware.
I believe IBM entered the cloud space with the acquisition of SoftLayer around 2014, well before they acquired Red Hat.
It was a nice briefing about IBM's microchip.
Would you explain IBM's 133-qubit chip in detail, please?
I'd like to learn more about microchips, qubit algorithms, and photon particles. I used to operate a robot manually from a control panel, because the robot didn't work automatically. I learned it in one hour with zero experience in technology and computers.
Is it possible to learn to build a powerful IBM microchip without a high school diploma? Because I love quantum mechanical engineering.
Thank you.
I'm glad IBM is doing well because, at least historically, they were customer-supportive; their mainframes were beautifully built, even in the parts not visible to the customer. They insisted on acting and looking professional, which may have been a bit too much at times, but it could bring out your best and inspire it in others.
IBM's stock goes up:
me: "Are they making a new thinkpad?"
Very professional. Very good. Thank you!
Deinterlace filter removes "comb" effects when splicing in classic TV broadcast footage.
They are classic players in both hardware and software. I always have my eyes and ears open when they publish something. They do make it harder to choose whom to run with nowadays. 😅
Finally. It's about time AI gets a bit more focus for what it could be. Thanks!
Hey guys at IBM. I am very impressed that you have so much invested in tangible products rather than trading capital, like the douchebags who are destroying the economy (might want to keep a close eye on PwC). One thing to consider: as far as AI tech goes, it does not make sense to build multi-core chips with fewer than 100 cores. NVIDIA is dominating the market because they have found a way to put more than 10,000 cores on a single chip. Yes, you heard that right: NVIDIA makes chips with core counts well north of 10,000. So while CPUs are important, GPUs have become extremely important. GPU software such as shaders is way out in front in computing speed, making use of this massive number of cores. The AI software guys found a way to adapt shader technology into neural network software.
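The shader-to-neural-network adaptation described above comes down to the fact that both workloads are independent multiply-accumulates. A toy NumPy sketch (all numbers made up) of why each GPU core can own one output on its own:

```python
import numpy as np

# A shader pass and a neural-network layer share the same pattern:
# many independent multiply-accumulates. A GPU hands each output
# element (pixel or activation) to one of its thousands of cores;
# here the parallel structure is just shown serially.
activations = np.arange(6.0)        # toy input vector
weights = np.full((4, 6), 0.5)      # toy layer: 4 outputs x 6 inputs

# Each row's dot product is independent -> one core per output in hardware.
outputs = np.array([weights[i] @ activations for i in range(4)])
assert np.allclose(outputs, weights @ activations)  # same as one big matmul
print(outputs)  # [7.5 7.5 7.5 7.5]
```

Because no output depends on any other, the work splits cleanly across however many cores the chip offers, which is exactly what makes those 10,000+ core parts useful for AI.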
Thanks for the report, pretty informative and good to know how IBM turned from nearly dead to highly profitable and growing. One tiny piece of advice: this video would be shorter if you sped up your speech. For example, I went with 1.25x speed, as the pace was kind of slow 😉 Shorter videos are better than long ones 👍
The end of the video reminded me of a recurring thought: quantum research feels like we are tapping into how human neurons work. Classical computing was designed for, and excels at, correcting errors within quantum computing, i.e., human thinking. So IBM's approach to quantum error correction with classical computing seems ideal.
I was there. Most of them didn't run Windows. MS-DOS came first, but there were a lot of better OSes out there in the 1980s.
Which models are their chips optimised for?
I assume inference for financial or business-related models, given the nature of IBM's cloud solutions.
You might explore the Low-Precision Training Mechanism (LPTM) down to 1-bit inference, with glass chips by ASML. I was told this will be picowatt-scale, so it would be sustainable from the inference point of view, significantly lowering power consumption.
It's sad that they are lodging 5 patients a day, because they would mostly be blocking patients to stop other people from making progress.
No, with research and development engineering, the parents are a platform broken down into its smallest parts: every subsystem of a platform, every new step, and any plausible use case it could be used for. Maybe a few are for competitive reasons, but those are going to be against... their competition, other global corporations doing the same. Most of the patents will be legacy systems they already owned or developed. Think about how long they've been around. Patent inertia lol. IBM and the like aren't out there stopping some random small player from making progress. Get real.
It’s a fair point @marcusk7855, it’s like the exploration for minerals and acquiring tenure. The big players can afford to hold and support lots of tenure, sometimes just to prevent other companies from getting a foothold. Work the system, as always. I do wonder if they even know what they have? Managers move on, and so do technical people; stuff gets forgotten and businesses change focus too.
Welcome to the Deep State 😮
I think you mean "patent".
@@joythoughtBoth of those comments butchered the word patents. Nobody proofreads anymore.
Their hybrid cloud solutions are just the offerings they provide to companies who are too embedded within IBM's infrastructure to leave their platform. Their consulting business is the service contracts they offer to those same customers who can't leave their legacy systems.
Cloud computing is not a new concept. Early on, folks connected dumb terminals to remote mainframes and servers for compilation, applications, and data. The advent of stand-alone PCs came about because of tech improvements and limited connection speeds (and WinTel monopolies). Innovations in high-speed connectivity brought the cloud back. Cloud 2.0 may be a better term.
I see a future in quantum computing and photonics. It's going to improve exponentially, as did the transistor. I think the trick will be finding the perfect substrate. Excellent video. Thanks👍
How about more than 2 algorithms that outperform classical?
Quantum computing is bunk. Sabine Hossenfelder has a series on quantum computer nonsense.
@truthsocialmedia It is not a joke. I work in the field.
With photonics, you can run parallel processes. At our current point, we are able to use coherent light to run these computations. Even though the wavelength is much larger than that of classical electrons, the power consumption is reduced. It is not currently for everything, but it excels in certain areas. With current chip designs, we have run into many issues; physics is limiting our ability to build more complex reticles. The only solution at the moment is to rethink the way we use computational power. There is more to this field than most people understand. There are many ways to create qubits. Once a process is well defined, it's up to mechatronic engineering.
@@Sven_Dongle please elaborate 👍
@truthsocialmedia Even though I work in the field, I still respect your input, comment, and opinion. I actually have seen the video, and I am subscribed to her. 👍
*_ThanX So Much._*
_Thanks from Poland._
quantum computing sounds like crazy future technology, no idea how they figured that one out, amazing
IBM was destroyed in the 90s by bad management.
It always comes from wonderful Suresh's Patelution.
Straight Arrow News and Ground News seem to have the same goal. What differences are there between them?
Is it just me, or is Straight Arrow News only available as an app for Apple's operating systems?
thanks.
I need some information about quantum computing at IBM. What are the chances of improving the error rate of that technology?
I'm a huge fan. Thank you for these videos!
Consulting being the least profitable makes sense. You can charge good money for consulting, but to do that you need good consultants, who demand good pay or they'll go elsewhere or become independent. Much of consulting revenue presumably goes straight to payroll. Still, good luck building a software empire without consultants to help sell and implement it. Just because it's not the most profitable doesn't mean it's not extremely important to the business model as a whole.
I don't know if IBM will be developing in this new format.
I like your videos
Dammit, looks like I'm gonna have to buy a couple of these stocks too.
They did not introduce the hard drive. It's my understanding companies had them in the early 1960s before them.
My understanding is IBM outsourced the development and manufacturing of their first PC. And that is how they were able to beat Digital Equipment Corporation to market.
IBM is a very good stock. It has been going up since 2023, depending on how you look at it.
IBM didn't make the first personal computer.
How much of IBM's services overlap with PLTR?
IBM was nowhere near the first to make a desktop personal computer. Personally affordable desktop computers, known as PCs, started in the early 1970's. Altair, IMSAI, Cromemco, Apple, Commodore, Atari, Tandy/Radio Shack and many many others were delivering PCs long before IBM even started on theirs.
I guess she meant the first "IBM compatible" PC? 🤔
Question: Is quantum computing a form of analog computing (reading varying states instead of digital 1/0)? That is how l like to think of it, but I may be incorrect.
Fantastic video :) I didn't know IBM was going that big into quantum computers
I predict IBM ai chip will be a distant 10th behind all of the other ai hardware.
Now I am curious about those bad decisions you have made 😎
Thanks!
Strange, I could barely watch and listen to the video, because of how slowly you were talking, but with video at 1.5x speed it's perfect
What if IBM tried a test field or garden as a "forever state," where everything is restored at the end of the day, like a fallen leaf going back onto its branch, and the weather is always mild, no extremes, no matter where on Earth. Based on the principle of a memory hard drive of time, space, sound, field, and light interacting... all change during the day is information, like input and output, fed into a global readjustment, and at the end of the day everything restores to default.
Would an FPGA architecture that could be programmatically configured to model application-specific neural nets, and then process the program data independently of the CPUs/GPUs, be a better path? Especially given the smaller-nm methods and other cost savings; historically, FPGAs have been pricey or limited in nodes.
Seems to me IBM fell behind Arm in designing chips. And for commercial supercomputers, China may be ahead of IBM.
IBM has promised a lot for many years. At one point their accounts persuaded Warren Buffett to invest, but after several years of poor performance he sold in 2017 and 2018 and later invested in Apple. His IBM stake was around $10B, so it was a sizeable investment. Has IBM changed, or is it too widely focused and operating in areas that do not have the growth of Apple or Nvidia? Now the market seems to feel that IBM has changed and will grow rapidly. Is the market right, or will IBM disappoint again? In my humble opinion, there seem to be other options with more growth potential. Thank you for sharing!
And or if not maybe exciting indeed.
I thought that IBM was a consulting company; didn't know it was a software company 😮
Thank you... Imagine folks who started their computing skills along with IBM, prior to most of these modern-day so-called "tech billionaires"... they were unable to decipher the passage... it's not a mystery why...
IBM's stock is up due to mass layoffs.
When I hear the name IBM, my immediate response is to conjure up an image of the typeset word "THINK" in classic roman text, framed behind glass and hung over one of the workbenches where, for more than 50 years, my father serviced IBM office machines in bustling downtown Chicago, raising a family through the post-war 50s, the cold-war 60s, the sufferin' 70s, and right up until the end of the word processor, as the new century and a new world of office digitalization was being born... the end of mechanical machines with the tolerances of Swiss watches sitting on every office worker's desk across the world. I still dream of the last Selectric II I caressed, with which I typed a long letter that had a few typos here and there; but at no time was I ever likely to see a blue screen telling me my lovely letter was lost and irretrievable. It's good to know that when that eventual day comes, I'll see IBM there in the cloud, waiting for me to finally catch on.
Why do I feel so blue when I hear news about IBM?
IBM released Watson a decade ago. Why did it take OpenAI to kick start the “AI renaissance?”
How do these new chips compare to Nvidia's?
Yes, that quantum computer is why the stock's going up.
I still don't know why IBM's AI chip should be a breakthrough as there are several companies producing such chips.
Their business grows in SW and consulting (however, the growth could also be explained by decreases in other sectors. I'm not saying it was wrong, but it always makes me a bit suspicious when relative numbers are presented as if they were absolute values. A revenue increase of 470M actually sounds more impressive than 7%, imho :D ).
I wonder if IBM is using their new chips to facilitate their own AI services, which was not pointed out in the video.
But what is the unique benefit of using IBM's AI cloud services over AWS, Azure, or Google?
@@dgrxd That also came to my mind (just without the details and numbers :D).
I assume those chips are very well designed for specific purposes for IBM's cloud services. For sure IBM will also use the 'real' AI chips.
Those chips very likely will reduce operating costs.
Even then, why design a new chip just for their own datacenters while similar chips are already available?! Makes me really wonder why IBM put resources into this chip.
These chips are designed to add AI operations to high throughput high transaction load scenarios which are operating on the order of something like 300K - 1M transactions per second. Those systems already existed before AI and were running at that scale. And now their customers want to be able to run inference on the same systems without a performance hit.
That is not possible with commodity hardware.
These are not supposed to be bleeding edge AI chips. These are supposed to run distilled and quantized AI models from 2-3 generations ago in scenarios where it was previously technically infeasible to use them at all.
@@timseguine2 They are not hardwired to specific models but implement common operators, such as those in PyTorch and TensorFlow, and so have some degree of future-proofing.
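As a rough illustration of the quantized-model point in this thread, here is a toy NumPy sketch of symmetric int8 quantization (the scheme, names, and numbers are illustrative, not IBM's actual design): weights are stored as small integers plus a scale, and the multiply-accumulate becomes integer math that an ASIC can hardwire.

```python
import numpy as np

def quantize(w, bits=8):
    """Map float weights to signed integers plus a single scale factor."""
    qmax = 2 ** (bits - 1) - 1               # 127 for int8
    scale = np.abs(w).max() / qmax
    return np.round(w / scale).astype(np.int8), scale

# Toy layer weights and input (values are made up)
w = np.array([[0.9, -0.4], [0.1, 0.6]])
x = np.array([1.0, 2.0])

q, scale = quantize(w)
y_quant = (q.astype(np.int32) @ x) * scale   # integer matmul, then rescale
y_exact = w @ x

# For this toy data the quantized result tracks the float result closely.
print(np.abs(y_quant - y_exact).max() < scale)  # True
```

The int8 storage cuts memory traffic by 4x versus float32, which is what makes running such models feasible alongside a heavy transaction load.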
Your description of the role IBM played in the emergence of PCs is inaccurate. They launched their PC division in response to the inroads Compaq Computers made into their corporate customer base. Their mainframe division was the most profitable portion of the business then and it always looked for ways to “kill” their internal rivals. You are correct that they never found a way to keep up with the competition on any other type of hardware which is why they eventually sold off their storage, printer, and PC businesses. Finally, you should look into the details of the software business. I’ll bet that a significant portion of it is recurring license fees from some very, very old platforms that still run the deep transactional core components of their financial services and healthcare clients.
45B transistors is like NV's GA103 (?)
Of course the NV GA103 is not a general CISC processor but an RTX GPU; though the next gen is coming soon.
Also, it seems to me that the stack of Telum and Spyre chips will struggle on the bus bandwidth side, where Cerebras excels.
when is the new Thread Ripper CPU coming?
When you say the algorithm runs in hardware, not software: is the hardware fixed, or can it be reconfigured like an FPGA?
NPUs are ASICs that perform inference using predefined operations on quantized models, with functions found in PyTorch and TensorFlow.
These advanced IBM chips are kind of equal to the A100 technology from Nvidia
IBM has not shown any breakthrough technology for the last 40 years. This one isn't one either.
No, they showed Watson AI, but they were unable to utilize it.
Back in the day, IBM computers did the calculations for NASA, computing re-entry and landing positions. 64KB of memory was a big deal back in those times..... 😁
HYPE HYPE HYPE
Quantum processing will probably land more in the realm of robotics. How we use this technology will dictate where computing's going, no longer sitting at a keyboard but interacting with robotics.
I've heard that the next generation of computers will be developed in a hexagonal format, from the cutting of the silicon wafers to the processors in their respective sockets on the motherboards...
No. Reliably dicing apart the wafer requires criss-cross cuts. It's like cutting sheet glass into tiles.
Well, we need new chips to do AI to get away from GPUs, because they are too expensive. The question is, will IBM produce AI chips for the consumer market?
When I was a kid, I wanted to be a chip designer...
Sorry, but you had one detail wrong: IBM did not make the first PC, although they were the first to slap "PC" on the front cover. Prior to the IBM PC there were plenty of companies making PCs; in fact, Heathkit even made them (the H8 and H89 were both out before the IBM PC), not to mention Apple, Radio Shack, and many more. There was also an emerging common OS, CP/M, starting to get everyone on the same page. That said, IBM legitimized the PC as a "real" computer in the eyes of corporate America. I worked for a big bank prior to the PC launch, and at that time you could not get funding for a PC-based project to save your life. IBM released the PC, corporate executives could not sign up fast enough, and it was raining money for PC projects after that; that was the real turning point for PCs. We used to have a saying in corporate IT: you never get fired for going with IBM.
Without IBM we would not have what we have today. What champions they have been for the industry. I worked with mainframes for thirty years, on DOS/VSE, VM, MVS, and z/OS, all backward compatible. In those thirty years we had one two-day downtime, because of a busted physical disk pack. I was always mindful of the sheer grunt needed to create their systems. A 99.9% uptime says a lot.
International Business Machines Corporation never knew what IBM stands for.
hey nobel?
I.B.M. should have been nicer to Cyrix or bought them out. 😅
Great, big up! I want to make a lithography machine so I could use it for the autonomous AI robots I have built, to build a big city as soon as possible. What can you do to help me make autonomous robotic AI to build a city?
IBM talks a lot of BS. I heard IBM's quantum computer was operational 4 years ago. A quantum computer has to be better than Nvidia chips.
no way "70% pf the entire world's transactions" are run the through the IBM hybrid cloud. What is the source for the number you are quoting?
I saw some summaries of what's actually been done by quantum computers now, and unlike what so many of the stories have said, no one (and I don't mean just IBM) has had success in getting a quantum computer to provide an accurate answer, in any way, shape, or form, that was equivalent to beating a regular conventional computer, both starting from scratch with a non-predefined problem and answer.
I'm not sure quantum computing is going to pan out.
Only the earliest investors can profit from quantum. Everyone else is a loser.
7:43 It looks likely that a more "real" rather than simulated neural network is built into the chip to run AI, and it's just a matter of time before Skynet becomes reality.
14:14 Your data might be a little bit obsolete, as I've heard from another YT channel that 'According to the new plan, they will stay at 1000 qubits at least for the next few years (till 2028) and instead focus on error correction.'
IBM uses Linux; they are headed in a good direction.
Linux is not at the forefront of research anymore. It's a good direction for the status quo, not the future of operating systems.
Why do you say that? Is it for reasons other than “OS ideology”?
@@monad_tcp "Bringing the forefront of research"? I don't even know what that means. Linux is growing, innovating, and running a lot of the world.
Linux is a foundation start. The reality of secure computing will include modifications of the kernel, limits and check/balance operations with streamlined code for the cases in which it is applied. Many machines are running variations of *nix, but they are not equal in flexibility or security. *nix is a great starting point, but the best of the best will be tailored directly to the hardware strengths algorithmically and vice-versa. "Virtual Organic Connection" or "Bit based usage/workload compression" At this point a new term must be used.
As I said: IBM has forgotten more about building computers than all the others together have ever known. They have lasted for 140 years and will last for at least the next 140 years. 😁
Correct me if I'm wrong, but Quantum computing won't explode in society until there is a near or fully ambient temperature/pressure superconductor discovered or invented. I infer that a materials-oriented AI will be required for that. So, (1) AI to get us an ambient t/p SC (2) it becomes trivial to scale quantum bits (3) we all get quantum wristwatches. Yes/no?
No, quantum computing is more like cold fusion: it's always 20 years in the future. AI isn't a panacea either; it's basically an automated pattern matcher, more like an accelerator, but it's still humans making the decisions.
Even if they make QC work, it will probably never be portable and will always be a remote device running in a datacenter. Getting it to run at ambient temperature seems much harder than getting it to scale to enough qubits to be useful. And cooling doesn't miniaturize very well; you are going to need to discover new physics for that, and you can't just put a deadline on the discovery of new physics. It could happen in the next decade or the next century.
I think we have already reached the limit of miniaturization for mobile devices; networking will be riding the innovation curve for the next decade. Mobile computing is, ironically, going to become more and more like mainframes. I would invest in IBM.
Mobile phones aren't going to run the computation on themselves, they're going to be dumb terminals.
The fact that we can't miniaturize cooling has already made mobile devices hit a wall of power dissipation; your phone can't use more than 20W of power. If you could solve just that problem, you could put a desktop chip with 500W of power dissipation in a mobile device, and that would already be a 10X to 30X increase in performance without even needing QC.
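That back-of-envelope claim is easy to check in a couple of lines; the 20 W and 500 W figures are the commenter's assumptions, and linear power-to-performance scaling is a crude upper bound:

```python
# Commenter's assumptions: a phone dissipates ~20 W, a desktop chip ~500 W.
phone_watts = 20
desktop_watts = 500

# If performance scaled linearly with the power budget (a rough upper
# bound that ignores efficiency differences between the two chips),
# the headroom would be:
speedup = desktop_watts / phone_watts
print(speedup)  # 25.0, inside the quoted "10X to 30X" range
```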
I would bet my coins on optical computing instead; optronics seems like it would allow a paradigm shift away from CMOS. I won't bet anything on QC for the foreseeable future.
QC is useful for exchanging cryptographic keys; that's the only actual commercial use I've ever seen for it.
There are no quantum algorithms except for a prime number sieve and a database search algorithm that perform faster than classical computing; that's the biggest limitation so far. What they are good at is simulating quantum systems like magnetic spin models, which is not much use to your average spreadsheet user.