IBM’s Microchip Breakthrough: Why It’s a Big Deal

  • Published Oct 24, 2024

Comments • 272

  • @AnastasiInTech
    @AnastasiInTech  4 days ago +16

    This video is sponsored by Straight Arrow News. Check out SAN using my link: www.san.com/anastasi

    • @directorchris2
      @directorchris2 3 days ago

      A snitch-chip, Anastasi?

    • @GiNodrog
      @GiNodrog 3 days ago +1

      Thank you for this concise and very informative video.

    • @Constant_Distant_Instant
      @Constant_Distant_Instant 11 hours ago

      *When a super AI merges with a super quantum computer system, humanity will definitely be summoning a god mode level of computing that will surely change everything forever!*

  • @davidcerutti8795
    @davidcerutti8795 4 days ago +125

    I know and like a number of people working at IBM. So much deep research and pure science going on there, it's the sort of company you want to keep afloat, even if market forces aren't always working in its favor.

    • @ikust007
      @ikust007 4 days ago +4

      Bravo 🎉

    • @crownlands7246
      @crownlands7246 4 days ago +3

      Yap!

    • @dchdch8290
      @dchdch8290 3 days ago +6

      Agreed. Especially all the silicon technology work they are still running.
      It was interesting to learn from this video why that is.

    • @leoliu5472
      @leoliu5472 3 days ago +3

      Guys, consider this: they have a service business model; they earn mostly from consulting on and maintaining the systems they supply. They don't want you to have full access to their systems, or they make them hard to navigate, so they can keep sending their people to train your people on those systems. IDK how supportive I should be of this business model.

    • @realdomdom
      @realdomdom 3 days ago +4

      Consider the following: The pathway to hell is paved with good intentions.

  • @Indrid__Cold
    @Indrid__Cold 3 days ago +29

    I've followed IBM for many decades. They are like the salt in the technological ocean. They go up, they go down, but their influence is undeniable.

    • @Williamb612
      @Williamb612 4 hours ago

      Open the pod bay doors HAL

  • @jasonelliott1346
    @jasonelliott1346 2 days ago +8

    I meant to add that your videos are great and fun to watch. Thank you for the time you give to making them -- they really are excellent!

  • @jasonelliott1346
    @jasonelliott1346 2 days ago +5

    Anastasi, you're amazing. I studied chip design in high school through the Tektronix Explorer Scouts, and we even laid out our circuits and had them run on an empty part of a wafer. It was pretty cool for a bunch of high school kids. Sadly, I didn't go to college until I was in my mid-thirties: three associate's degrees, not in tech but in foreign language and business. And after being a one-man IT shop for twenty-some years, I had to give it up. But I'm looking to get back into programming (there are a couple of products I helped design hardware for after high school, and wrote the software for). One dominates the retail glass-shop management industry, and the other was an early player in digital call generation and user-response processing. Sadly, I didn't have stock in either company.

  • @professor-viewsalot
    @professor-viewsalot 2 days ago +10

    No.
    Many clones of the IBM PC were manufactured & sold, of course, but they were NOT running Windows.
    They were running DOS.
    Windows (3.11) didn't appear until years later.

  • @retiredbitjuggler3471
    @retiredbitjuggler3471 50 minutes ago +1

    I had to laugh when your video showed the first IBM PC. I bought that PC the day they released it for sale in 1981 or 1982. It had an Intel 8088 processor, 512K of memory, dual floppy drives (hard drives weren't a thing yet) and a monochrome monitor. It cost me $3,200 back then. I eventually used it to do COBOL coding with Realia COBOL. It saved me a fortune, because buying time on mainframe computers back then was unaffordable for an independent programmer. I think it took me 6 months to make my investment back. I actually upgraded the PC with an AST SixPack+, which gave it a real-time clock (the stock PC's date and time had to be entered manually each time it was powered up) and increased the memory to an insane 640K. I seem to remember some additional ports on that card. It is an absolute joke next to today's technology, but it was worth its weight in gold in the early 80s. Thanks for the fond memories. 😊😊😊😊

  • @RupanagudiRaviShankar
    @RupanagudiRaviShankar 3 days ago +30

    From Bharat: I am 67 years old and I did my electrical engineering. I follow your channel most of the time. All the information you provide keeps my brain healthy, as it jogs it for some time. Our age group is one of the lucky ones, as we have seen technology evolve and become available to the common population. I have used slide rules, the first commercially available scientific calculator (RPN type too), card-reader computers which occupied huge rooms, then floppy-disk computers [flippy disks too] {8 inch / 5.25 inch / 3.5 inch}, then PCs, laptops, pagers, the first cellphones [now called feature phones, I guess], then smartphones, and finally AI solutions. The only thing I guess our generation may not use is the quantum PC!! If we can use that too, we will literally have quantum-jumped from slide rules to quantum computing!! Regards

    • @Otto-d2t
      @Otto-d2t 3 days ago +1

      Thanks for singing my song.😊

    • @DataCrusade1999
      @DataCrusade1999 2 days ago

      Bhakt spotted

    • @prashanthb6521
      @prashanthb6521 1 day ago

      My Dad used the punch card computer at IISc when it arrived.

  • @greggapowell67
    @greggapowell67 3 days ago +6

    I've often wondered where you get the ideas for the topics you come up with, Ms Anastasia. You really cover such a great range of topics. Thank you for the time and effort you bring to the industry.

  • @kjelm
    @kjelm 5 hours ago

    Thanks for this video. I'm retired. I no longer work in this field as an engineer. Your videos are very interesting and your explanations are clear and supported by your remarkable intelligence.

  • @leematthews6812
    @leematthews6812 3 days ago +12

    I imagine younger viewers watching that footage of people using those big floppy discs feel like I do watching films from 1910, with everyone travelling around by horse and cart!

  • @harrybarrow6222
    @harrybarrow6222 2 days ago +1

    Neural network architecture is extremely well suited to implementation in silicon.
    It consists of a large number of fairly simple “neurons”.
    The main problem is interconnection: you want to connect every neuron in one layer to every neuron in the next.
    The other problem is the weights of the connections.
    You want to load in weight values for a particular task, or adjust weights incrementally during learning.
    But NNs are so useful, I am confident these problems will be overcome.
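
    As a rough sketch of the scaling issue this comment describes (a generic NumPy illustration, not anything from the video or a specific chip): a fully connected layer needs one weight per input-output pair, so interconnect and weight storage grow as n_out * n_in.

    ```python
    # Minimal sketch (illustration only): one fully connected layer.
    # Every input neuron connects to every output neuron, so the layer
    # needs n_out * n_in weights -- the interconnect/weight problem above.
    import numpy as np

    def dense_layer(x, W, b):
        """Simple 'neurons': weighted sum followed by a ReLU nonlinearity."""
        return np.maximum(0.0, W @ x + b)

    n_in, n_out = 1024, 1024
    rng = np.random.default_rng(0)
    W = rng.standard_normal((n_out, n_in)) * 0.01  # one weight per connection
    b = np.zeros(n_out)
    x = rng.standard_normal(n_in)

    y = dense_layer(x, W, b)
    print(f"weights for one {n_in}x{n_out} layer: {W.size:,}")  # 1,048,576
    ```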

  • @vilijanac
    @vilijanac 3 days ago +4

    On December 11th, 1972, Q1 Corporation sold the first Q1 microcomputer, based on the Intel 8008 microprocessor.
    It had the first QWERTY keyboard, printer and floppy. And it was IBM that sold one of the first personal-scale computers in 1957: the 610, with a transistor- and tube-based processor; the price tag was $55K.

  • @everettputerbaugh3996
    @everettputerbaugh3996 3 days ago +1

    I remember when the machines in the first clips were in use; I operated some of them. IBM's streak to stardom came when WW2 broke out: IBM went to the government and asked what it could do for the war effort. Every military unit had an IBM. Then, when the first commercial mainframes were glitchy, they sent out the engineers who designed them to fix them, establishing their reputation for standing behind their product: reliability. It didn't help their reputation when they made a deal with Apple and then reneged.

  • @AdvantestInc
    @AdvantestInc 4 days ago +10

    This breakdown of IBM’s chip innovation is well done! The focus on accelerating AI workloads for mainframes is timely, looking forward to seeing how it plays into larger trends like edge computing.

  • @ilkoderez601
    @ilkoderez601 4 days ago +5

    I get almost all of my news and info from YouTube. I think YouTube producers need to understand that unique news and analysis is always desired.

    • @lilblackduc7312
      @lilblackduc7312 3 days ago

      U-toob is notorious for its censorship.

    • @yorkan213swd6
      @yorkan213swd6 2 days ago

      You should read serious news channels.

  • @mrmicklord
    @mrmicklord 4 days ago +15

    Sorry, small correction: the UK's Ferranti Mark 1 was the first commercial computer.

    • @lilblackduc7312
      @lilblackduc7312 3 days ago +1

      For me personally, the I.B.M. AS/400 was the first commercial computer. (Puppy Love ;-)

    • @tsmart9478
      @tsmart9478 3 days ago +2

      Mom worked at IBM. I remember going in there and seeing the punch-card-reading computer in Nashville. "IBM Means Service" 😊

    • @USA-CIA-NED_ProxyDeathSquadOps
      @USA-CIA-NED_ProxyDeathSquadOps 3 days ago

      She said first "mainframe" computer.

    • @joythought
      @joythought 3 days ago

      @@lilblackduc7312 That was 1988. Sinclair and the BBC Micro in the UK, the Microbee in Australia and many others were pumping out computers from the beginning of the 1980s. Apple was selling from the late 1970s if I remember right...

    • @user-yr1uq1qe6y
      @user-yr1uq1qe6y 1 day ago

      Does the UK government pay for the army of commenters that comment this on American videos?

  • @Justin_Arut
    @Justin_Arut 4 days ago +14

    Never a fan of Apple, I built PC compatibles starting in the mid 80's. I purchased a significant number of additional IBM shares in 2017 and watched it drop not long after. I didn't plan on selling, so I still have a lot of IBM, Intel as well.
    Moral of story: If you're gonna play the market, you have to be willing to ride the financial/emotional roller coaster. Glad IBM is doing so well these days. They've always been innovators, and they are never going belly up, so it's a safe stock, one of the "too big to fail" companies.

    • @MarcosElMalo2
      @MarcosElMalo2 4 days ago

      I wouldn’t “play the market”. I think you’re correct about not letting market volatility shake you, but that doesn’t mean all stocks will inevitably rise, not even blue chips. Buying smaller blocks over time can insulate you from some volatility, and if you’re lucky and paying attention you might get in on the dips.
      You don’t sound all that different than me, I just don’t like the term “playing the market”.

    • @kayakMike1000
      @kayakMike1000 3 days ago +1

      Imagine that you sold right before that peak... And bought right at the dip.

    • @Justin_Arut
      @Justin_Arut 3 days ago

      @@MarcosElMalo2 Having a diversified portfolio safety net also helps ride out the storms. The old maxim "Don't put all your eggs in one basket." is sound advice for long-term security as well.

  • @theEric180
    @theEric180 3 days ago +3

    I think there's a lot of growth potential in Linux and containerization, which run on mainframe or LinuxONE hardware with Red Hat, so you can get real-time AI and a consolidated private cloud stack that doesn't have the same security flaws as distributed hardware.

  • @rishikeshkhedkar3768
    @rishikeshkhedkar3768 2 hours ago +1

    I believe IBM entered the cloud space with the acquisition of SoftLayer around 2014, well before they acquired Red Hat.

  • @FARDEEN.MUSTAFA
    @FARDEEN.MUSTAFA 22 hours ago +1

    It was a nice briefing about IBM's microchip.
    Would you explain IBM's 133-qubit chip in detail, please?
    I'd like to learn more about microchips, qubit algorithms and photon particles. I used to operate a robot manually from a control panel, because the robot didn't work automatically. I learned it in one hour with zero experience in technology and computers.
    Is it possible to learn to build a powerful IBM microchip without a high school diploma? Because I love quantum mechanical engineering.
    Thank you.

  • @R777-RLM
    @R777-RLM 1 day ago

    I'm glad IBM is doing well because, at least historically, they were customer-supportive; their mainframes were beautifully built, even in the parts not visible to the customer; and they insisted on acting and looking professional, which may have been a bit much at times, but it could bring out your best, and inspire it in others.

  • @swisstraeng
    @swisstraeng 3 days ago +2

    IBM's stock goes up.
    Me: "Are they making a new ThinkPad?"

  • @squirrelarmor
    @squirrelarmor 10 hours ago +1

    Very professional. Very good. Thank you!

  • @DennisMadiastreamer
    @DennisMadiastreamer 3 days ago +1

    Deinterlace filter removes "comb" effects when splicing in classic TV broadcast footage.

  • @Augustine-x5i
    @Augustine-x5i 4 days ago +1

    They are classic players in both hardware and software. I always have my eyes and ears open when they publish something. They do make it harder to choose whom to run with "nowadays." 😅

  • @hanskraut2018
    @hanskraut2018 4 days ago +5

    Finally. It's about time there is slightly more focus on what AI could be. Thanks!

  • @auroraRealms
    @auroraRealms 1 hour ago

    Hey guys at IBM. I am very impressed that you have so much invested in tangible products, rather than trading capital like the douchebags who are destroying the economy (might want to keep a close eye on PwC). One thing to consider: as far as AI tech goes, it does not make sense to build multi-core computer chips with core counts below 100. NVIDIA is dominating the market because they have found a way to put more than 10,000 cores on a single chip. Yes, you heard that right: NVIDIA makes chips with core counts well north of 10,000. So while CPUs are important, GPUs have become extremely important. GPU software such as shaders is way out in front in computing speed, making use of these massive core counts. The AI software guys found a way to adapt shader technology into neural network software.

  • @El.Duder-ino
    @El.Duder-ino 3 days ago +1

    Thx for the report, pretty informative and good to know how IBM turned from almost dead into highly profitable and growing. One tiny piece of advice: this vid would be shorter if u would speed up your speech; for example, I went with 1.25 speed as it was kind of a slow pace😉 Shorter vids r better than long ones👍

  • @nvygw171
    @nvygw171 3 days ago +1

    The end of the video reminded me of a recurring thought: quantum research feels like we are tapping into how human neurons work. Classical computing was designed to, and excels at, correcting errors in quantum computing, i.e. human-like thinking. So IBM's approach to quantum error correction with classical computing seems ideal.

  • @danieldegroff188
    @danieldegroff188 23 hours ago

    I was there. Most of them didn't run Windows. MS-DOS came first, but there were a lot of better OSes out there in the 1980s.

  • @dchdch8290
    @dchdch8290 3 days ago +3

    Which models are their chips optimised for?
    I assume inference for financial or business-related workloads, given the nature of IBM's cloud solutions.

  • @realconsulting9745
    @realconsulting9745 3 days ago +1

    You might explore low-precision training mechanisms (LPTM) down to 1-bit inference, with glass chips by ASML. I was told this will run at picowatt levels, so it would be sustainable from the inference point of view, significantly lowering power consumption.
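
    As a rough sketch of what "1-bit inference" means (a generic binarization scheme assumed for illustration; nothing here is specific to LPTM, ASML or IBM): weights collapse to their signs plus one scale factor, so multiplies become adds and subtracts.

    ```python
    # Generic 1-bit (binarized) weight scheme, for illustration only:
    # W is approximated by sign(W) scaled by the mean absolute value per row.
    import numpy as np

    def binarize(W):
        """Return weights in {-1, +1} plus a per-row scale factor."""
        scale = np.abs(W).mean(axis=1, keepdims=True)
        return np.sign(W), scale

    rng = np.random.default_rng(1)
    W = rng.standard_normal((4, 8))
    x = rng.standard_normal(8)

    Wb, scale = binarize(W)
    y_full = W @ x             # full-precision matmul
    y_1bit = (scale * Wb) @ x  # 1-bit weights: only adds/subtracts plus one scale
    print(np.round(y_full, 2))
    print(np.round(y_1bit, 2))
    ```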

  • @marcusk7855
    @marcusk7855 3 days ago +7

    It's sad that they are lodging 5 patients a day, because those would mostly be blocking patients to stop other people from making progress.

    • @rickevans7941
      @rickevans7941 3 days ago +1

      No, in research and development engineering the parents are a platform broken down into its smallest parts: every subsystem of a platform, every new step, and any plausible use case it could serve. Maybe a few are for competitive reasons, but that's going to be against... their competition: other global corporations doing the same. Most of the patents will be legacy systems they already owned or developed. Think about how long they've been around. Patent inertia lol. IBM and the like aren't out there stopping some random small player from making progress. Get real.

    • @alexcastas8405
      @alexcastas8405 3 days ago

      It's a fair point @marcusk7855; it's like the exploration for minerals and acquiring tenure. The big players can afford to hold and support lots of tenure, sometimes just to prevent other companies getting a foothold. Work the system, as always. I do wonder if they even know what they have? Managers move on, and so do technical people; stuff gets forgotten, and businesses change focus too.

    • @TERRANcmb
      @TERRANcmb 3 days ago

      Welcome to the Deep State 😮

    • @joythought
      @joythought 3 days ago +3

      I think you mean "patent".

    • @CommodoreGreg
      @CommodoreGreg 6 hours ago

      @@joythought Both of those comments butchered the word patents. Nobody proofreads anymore.

  • @viperx7262
    @viperx7262 3 days ago

    Their hybrid cloud solutions are just the offerings they provide to companies too embedded within IBM's infrastructure to leave their platform. Their consulting business is the service contracts they offer to those same customers who can't leave their legacy systems.

  • @djbazzone5698
    @djbazzone5698 2 days ago +1

    Cloud computing is not a new concept. Early on, folks connected dumb terminals to remote mainframes and servers for compilation, applications, and data. The advent of stand-alone PCs came about because of tech improvements and limited connection speeds (and WinTel monopolies). Innovations in high-speed connectivity brought the cloud back. Cloud 2.0 may be a better term.

  • @friskydingo5370
    @friskydingo5370 4 days ago +8

    I see a future in quantum computing and photonics. It's going to improve exponentially, as did the transistor. I think the trick will be finding the perfect substrate. Excellent video. Thanks👍

    • @Sven_Dongle
      @Sven_Dongle 3 days ago +3

      How about more than 2 algorithms that outperform classical?

    • @truthsocialmedia
      @truthsocialmedia 3 days ago +2

      Quantum computing is bunk. Sabine Hossenfelder has a series on quantum computing nonsense.

    • @friskydingo5370
      @friskydingo5370 3 days ago +1

      @truthsocialmedia It is not a joke. I work in the field.
      With photonics, you can run parallel processes. At our current point, we are able to use coherent light to run these computations. Even though the wavelength is much larger than classical electrons, the power consumption is reduced. It is not currently for everything, but it excels in certain areas. With current chip designs, we have run into many issues: physics is limiting our ability to build more complex reticles. The only solution at the moment is to rethink the way we use computational power. There is more to this field than most people understand. There are many ways to create qubits. Once a process is well defined, it's up to mechatronic engineering.

    • @friskydingo5370
      @friskydingo5370 3 days ago +1

      @@Sven_Dongle please elaborate 👍

    • @friskydingo5370
      @friskydingo5370 3 days ago +1

      @truthsocialmedia Even though I work in the field, I still respect your input, comment, and opinion. I actually have seen the video, and I am subscribed to her.👍

  • @Banerled
    @Banerled 3 days ago +1

    *_ThanX So Much._*
    _Thanks from Poland._

  • @yoyo-jc5qg
    @yoyo-jc5qg 3 days ago

    quantum computing sounds like crazy future technology, no idea how they figured that one out, amazing

  • @acasualviewer5861
    @acasualviewer5861 2 days ago +1

    IBM was destroyed in the 90s by bad management.

  • @sureshchandrapatel3925
    @sureshchandrapatel3925 3 days ago

    It always comes from wonderful Suresh's Patelution.

  • @VulcanOnWheels
    @VulcanOnWheels 5 hours ago

    Straight Arrow News and Ground News seem to have the same goal. What differences are there between them?
    Is it just me, or is Straight Arrow News only available as an app for Apple's operating systems?

  • @friendlycommentwolf
    @friendlycommentwolf 2 days ago

    Thanks.

  • @brianfruman123
    @brianfruman123 3 days ago

    Need some information about quantum computing at IBM. What are the chances of improving the error rate of such technology?

  • @Bcowzz
    @Bcowzz 3 days ago +2

    I'm a huge fan. Thank you for these videos!

  • @Steamrick
    @Steamrick 4 days ago +1

    Consulting being the least profitable makes sense. You can charge good money for consulting, but to do that you need good consultants, who demand good pay or they'll go elsewhere or become independent. Much of consulting revenue presumably goes straight to payroll. Still, good luck building a software empire without consultants to help sell and implement it. Just because it's not the most profitable doesn't mean it's not extremely important to the business model as a whole.

  • @despluguecom8966
    @despluguecom8966 4 days ago +2

    I don't know if IBM will be developing in this new format.

  • @gokutheterror
    @gokutheterror 1 hour ago +1

    I like your videos.

  • @franciscomagalhaes7457
    @franciscomagalhaes7457 3 days ago

    Dammit, looks like I'm gonna have to buy a couple of these stocks too.

  • @alvarofernandez5118
    @alvarofernandez5118 1 hour ago

    They did not introduce the hard drive. It's my understanding companies had them in the early 1960s before them.

  • @garycard1826
    @garycard1826 3 days ago +1

    My understanding is IBM outsourced the development and manufacturing of their first PC. And that is how they were able to beat Digital Equipment Corporation to market.

  • @martinbreslow1401
    @martinbreslow1401 3 days ago

    IBM is a very good stock. It has been going up since 2023, depending on how you look at it.

  • @galencollins9057
    @galencollins9057 4 days ago +4

    IBM didn't make the first personal computer.

  • @ifeanyiibeanu7765
    @ifeanyiibeanu7765 3 days ago

    How much of IBM's services overlap with PLTR?

  • @tconiam
    @tconiam 1 day ago +1

    IBM was nowhere near the first to make a desktop personal computer. Personally affordable desktop computers, known as PCs, started in the early 1970s. Altair, IMSAI, Cromemco, Apple, Commodore, Atari, Tandy/Radio Shack and many, many others were delivering PCs long before IBM even started on theirs.

    • @gaggleweed
      @gaggleweed 12 hours ago

      I guess she meant the first "IBM compatible" PC? 🤔

  • @djbazzone5698
    @djbazzone5698 2 days ago

    Question: Is quantum computing a form of analog computing (reading varying states instead of digital 1/0)? That is how I like to think of it, but I may be incorrect.

  • @mr.iot-tech278
    @mr.iot-tech278 4 days ago +8

    Fantastic video :) I didn't know IBM was going that big in quantum computers.

  • @woolfel
    @woolfel 4 days ago +5

    I predict IBM's AI chip will be a distant 10th behind all of the other AI hardware.

  • @ypey1
    @ypey1 3 days ago

    Now I am curious about those bad decisions you have made😎

  • @DavidHughes-hv7rl
    @DavidHughes-hv7rl 3 days ago +1

    Thanks!

  • @G0lliath
    @G0lliath 3 days ago

    Strange, I could barely watch and listen to the video because of how slowly you were talking, but with the video at 1.5x speed it's perfect.

  • @TriPham-j3b
    @TriPham-j3b 1 hour ago

    What if IBM tried a test field or garden as a "forever state", where everything restores at the end of the day, like a falling leaf going back onto its branch, and the weather is always mild, no extremes, no matter where on earth. Based on the principle of a hard drive's memory of time, space, and interacting sound and light fields: all change during the day is information, input and output feeding a global readjustment, and at the end of the day everything restores back to default.

  • @solosailorsv8065
    @solosailorsv8065 1 day ago

    Would an FPGA architecture that could be programmatically configured to model application-specific neural nets, and then process the program data independently of the CPUs/GPUs, be a better path? Given smaller-nm processes and other cost savings; historically FPGAs have been pricey or limited in nodes.

  • @malekmoqaddam5806
    @malekmoqaddam5806 3 days ago

    Seems to me IBM fell behind Arm in designing chips. And for commercial supercomputers, China may be ahead of IBM.

  • @springwoodcottage4248
    @springwoodcottage4248 4 days ago +2

    IBM has promised a lot for many years. At one point their accounts persuaded Warren Buffett to invest, but after several years of poor performance he sold in 2017 and 2018 and later invested in Apple. His IBM stake was around $10B, so it was a sizeable investment. Has IBM changed, or is it too widely focused and operating in areas that do not have the growth of Apple or Nvidia? Now the market seems to feel that IBM has changed and will grow rapidly. Is the market right, or will IBM disappoint again? In my humble opinion there seem to be other options with more potential growth. Thank you for sharing!

  • @AX2SEG
    @AX2SEG 3 days ago

    And or if not maybe exciting indeed.

  • @carloslemare6060
    @carloslemare6060 3 days ago

    I thought that IBM was a consulting company; I didn't know it was a software company 😮

  • @hubstrangers3450
    @hubstrangers3450 3 days ago

    Thank you......imagine folks who started their computing skills along with IBM.....prior to most of these modern-day so-called "Tech Billionaires"...they were unable to decipher the passage....it's not a mystery why......

  • @Dxeus
    @Dxeus 3 days ago +1

    IBM stocks are up due to mass layoffs.

  • @416dl
    @416dl 3 days ago

    When I hear the name IBM, my immediate response is to conjure up an image of the typeset word "THINK" in classic roman text, framed behind glass and hung over one of the workbenches where, for more than 50 years, my father serviced IBM office machines in a bustling downtown Chicago, raising a family through the post-war 50s, the cold-war 60s, the sufferin' 70s, and right up until the end of the word processor, as the new century and a new world of office digitalization was being born... the end of mechanical machines, with the tolerances of Swiss watches, sitting on every office worker's desk across the world. I still dream of the last Selectric II I caressed, and with which I typed a long letter that had a few typos here and there, but at no time was I ever likely to see a blue screen telling me my lovely letter was lost and irretrievable. It's good to know that when that eventual day comes, I'll see IBM there in the cloud, waiting for me to finally catch on.

  • @dahlia695
    @dahlia695 2 days ago +1

    Why do I feel so blue when I hear news about IBM?

  • @EterpayKugml
    @EterpayKugml 2 days ago

    IBM released Watson a decade ago. Why did it take OpenAI to kick-start the "AI renaissance"?

  • @DavidP92105
    @DavidP92105 3 hours ago

    How do these new chips compare to Nvidia's?

  • @Julian-t4c
    @Julian-t4c 3 days ago

    Yes, that quantum computer is why the stock is going up.

  • @naalsocomment9449
    @naalsocomment9449 4 days ago +2

    I still don't know why IBM's AI chip should be a breakthrough, as there are several companies producing such chips.
    Their business grows in SW and consulting (however, the growth could also be explained by decline in other sectors. I am not saying it was wrong, but it always makes me a bit suspicious when relative numbers are presented as if they were absolute values. A revenue increase of 470M actually sounds more impressive than 7%, imho :D).
    I understand IBM is using their new chips to facilitate their AI services?! Which was not pointed out in the video.
    But what is the unique benefit of using IBM AI cloud services over AWS, Azure, or Google?

    • @naalsocomment9449
      @naalsocomment9449 4 days ago +1

      @@dgrxd That's what also came to my mind (just without the details and numbers :D).
      I assume those chips are designed for very specific purposes in IBM's cloud services. For sure IBM will also use the "real" AI chips.
      Those chips will very likely reduce operating costs.
      Even then, designing a new chip just for their own datacenters while similar chips are already available?! Makes me really wonder why IBM put resources into this chip.

    • @timseguine2
      @timseguine2 4 days ago

      These chips are designed to add AI operations to high-throughput, high-transaction-load scenarios operating on the order of 300K-1M transactions per second. Those systems existed before AI and were already running at that scale. Now their customers want to be able to run inference on the same systems without a performance hit.
      That is not possible with commodity hardware.
      These are not supposed to be bleeding-edge AI chips. They are supposed to run distilled and quantized AI models from 2-3 generations ago in scenarios where it was previously technically infeasible to use them at all.
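
      To make "quantized" concrete, here is a generic int8 weight-quantization sketch (a textbook scheme assumed for illustration, not IBM's actual implementation): weights are stored as 8-bit integers plus one scale factor, trading a little accuracy for much smaller, faster inference.

      ```python
      # Generic int8 quantization sketch (illustration, not IBM's design):
      # store weights as int8 plus a single scale, rescale during the matmul.
      import numpy as np

      def quantize_int8(W):
          """Symmetric per-tensor quantization: W ~= scale * q, q in [-127, 127]."""
          scale = np.abs(W).max() / 127.0
          q = np.clip(np.round(W / scale), -127, 127).astype(np.int8)
          return q, scale

      rng = np.random.default_rng(42)
      W = rng.standard_normal((256, 256)).astype(np.float32)
      x = rng.standard_normal(256).astype(np.float32)

      q, scale = quantize_int8(W)
      y_fp32 = W @ x                               # full-precision result
      y_int8 = scale * (q.astype(np.float32) @ x)  # int8 weights, one rescale
      print("max abs error:", float(np.abs(y_fp32 - y_int8).max()))  # small
      ```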

    • @Sven_Dongle
      @Sven_Dongle 3 days ago

      @@timseguine2 They are not specifically hardwired to the models, but implement common operators such as those in PyTorch and TensorFlow, and so have some degree of future-proofing.

  • @elisinyak1166
    @elisinyak1166 3 days ago

    Your description of the role IBM played in the emergence of PCs is inaccurate. They launched their PC division in response to the inroads Compaq Computers made into their corporate customer base. Their mainframe division was the most profitable portion of the business then, and it always looked for ways to "kill" its internal rivals. You are correct that they never found a way to keep up with the competition on any other type of hardware, which is why they eventually sold off their storage, printer, and PC businesses. Finally, you should look into the details of the software business. I'll bet that a significant portion of it is recurring license fees from some very, very old platforms that still run the deep transactional core components of their financial services and healthcare clients.

  • @extremumone
    @extremumone 1 day ago

    45B transistors is like NV's GA103 (?).
    Of course the GA103 is not a general CISC processor but an RTX GPU; though the next gen is coming soon.
    Also, it seems to me that the stack of Telum and Spyre chips will struggle on the bus-bandwidth side, where Cerebras excels.

  • @connorharris1900
    @connorharris1900 3 days ago

    When is the new Threadripper CPU coming?

  • @rustyfox81
    @rustyfox81 4 days ago

    When you say the algorithm runs in hardware, not software: is the hardware fixed, or can it be reconfigured like an FPGA?

    • @Sven_Dongle
      @Sven_Dongle 3 days ago

      NPUs are ASICs that perform inference using predefined operations on quantized models, implementing functions found in PyTorch and TensorFlow.

  • @MattRodriguez-h7j
    @MattRodriguez-h7j 3 days ago

    These advanced IBM chips are roughly equal to Nvidia's A100 technology.

  • @dexterek011
    @dexterek011 4 days ago +9

    IBM has not shown any breakthrough technology in the last 40 years. This isn't one either.

    • @yorkan213swd6
      @yorkan213swd6 2 days ago +2

      No, they showed Watson AI, but they were unable to utilize it.

  • @henryford2736
    @henryford2736 3 days ago

    Back in the day, IBM computers did the calculations for NASA, computing re-entry and landing positions. 64KB of memory was a big deal in those times.....😁

  • @virtualworldsbyloff
    @virtualworldsbyloff 2 days ago +1

    HYPE HYPE HYPE

  • @bay9876
    @bay9876 4 days ago

    Quantum processing will probably land more in the realm of robotics. How we use this technology will dictate where computing is going: no longer sitting at a keyboard, but interacting with robotics.

  • @despluguecom8966
    @despluguecom8966 4 days ago

    Word is that the next generation of computers will be developed in a hexagonal format, from the cutting of the silicon wafers to the processors in their respective sockets on the motherboards...

    • @imconsequetau5275
      @imconsequetau5275 1 day ago

      No. Reliably dicing apart the wafer requires criss-cross cuts. It's like cutting sheet glass into tiles.

  • @HectorDiabolucus
    @HectorDiabolucus 3 days ago

    Well, we need new chips for AI to get away from GPUs, because they are too expensive. The question is: will IBM produce AI chips for the consumer market?

  • @mvbs64
    @mvbs64 4 days ago

    When I was a child I wanted to be a chip designer....

  • @MrWildbill
    @MrWildbill 13 hours ago

    Sorry, but you had one detail wrong: IBM did not make the first PC, although they were the first to slap "PC" on the front cover. Prior to the IBM PC there were plenty of companies making PCs; in fact Heathkit even made them, and the H8 and H89 were both out before the IBM PC, not to mention Apple, Radio Shack, and many more. There was also an emerging common OS, CP/M, starting to get everyone on the same page. That said, IBM legitimized the PC as a "real" computer in the eyes of corporate America. I worked for a big bank prior to the PC launch, and at that time you could not get funding for a PC-based project to save your life. IBM released the PC, corporate executives could not sign up fast enough, and it was raining money for PC projects after that; that was the real turning point for PCs. We used to have a saying in corporate IT: you never get fired for going with IBM.

  • @rudycramer225
    @rudycramer225 2 days ago

    Without IBM we would not have what we have today. What champions they have been for the industry. I worked with mainframes for thirty years, on DOS/VSE, VM, MVS and z/OS, all backward compatible. In those thirty years we had one two-day downtime, because of a busted physical disk pack. I was always mindful of the sheer grunt work needed to create their systems. 99.9% uptime says a lot.

  • @dlfabrications
    @dlfabrications 3 days ago

    International Business Machines Corporation never knew what IBM stands for.

  • @Privacityuser
    @Privacityuser 2 days ago

    hey nobel?

  • @racagnac920
    @racagnac920 4 days ago +2

    I.B.M. should have been nicer to Cyrix or bought them out. 😅

  • @BilichaGhebremuse
    @BilichaGhebremuse 3 days ago

    Great, big up. I want to make a lithography machine so I could use it for the autonomous AI robots I have built, to build a big city as soon as possible. What can you do to help me make autonomous robotic AI to build a city?

  • @taeyoungsin
    @taeyoungsin 3 days ago +2

    IBM talks BS a lot. I heard IBM's quantum computer was operational 4 years ago. A quantum computer has to be better than Nvidia chips.

  • @tristanwegner
    @tristanwegner 23 hours ago

    No way "70% of the entire world's transactions" are run through the IBM hybrid cloud. What is the source for the number you are quoting?

  • @yougeo
    @yougeo 3 days ago

    I saw some summaries of what's actually been done by quantum computers now, and unlike what so many of the stories have said, no one (not just IBM) has had success in getting a quantum computer to provide an accurate answer, in any way, shape or form, that beat a regular conventional computer starting from scratch on a non-predefined problem and answer.
    I'm not sure quantum computing is going to pan out.

    • @imconsequetau5275
      @imconsequetau5275 1 day ago

      Only the earliest investors can profit from quantum. Everyone else is a loser.

  • @aupotter2584
    @aupotter2584 3 days ago

    7:43 It looks likely that a more "real" rather than simulated neural network is built into the chip to run AI, and it's just a matter of time before Skynet becomes reality.
    14:14 Your data might be a little bit obsolete, as I've heard from another YT channel that "according to the new plan, they will stay at 1,000 qubits at least for the next few years (till 2028) and instead focus on error correction."

  • @fabricio4794
    @fabricio4794 4 days ago +8

    IBM uses Linux; they are headed in a good direction.

    • @monad_tcp
      @monad_tcp 4 days ago +2

      Linux is no longer at the forefront of research. It's a good direction for the status quo, not for the future of operating systems.

    • @MarcosElMalo2
      @MarcosElMalo2 4 days ago +1

      Why do you say that? Is it for reasons other than “OS ideology”?

    • @billfarley9015
      @billfarley9015 3 days ago

      @@monad_tcp "Bringing the forefront of research"? I don't even know what that means. Linux is growing and innovating and running a lot of the world.

    • @jaredandcande11
      @jaredandcande11 3 days ago

      Linux is a foundation to start from. The reality of secure computing will include modifications of the kernel, limits, and check/balance operations, with streamlined code for the cases where it is applied. Many machines are running variations of *nix, but they are not equal in flexibility or security. *nix is a great starting point, but the best of the best will be tailored directly to the hardware's strengths algorithmically, and vice versa. "Virtual organic connection" or "bit-based usage/workload compression": at this point a new term must be used.

  • @crazyedo9979
    @crazyedo9979 3 days ago

    As I said: IBM has forgotten more about building computers than everyone else put together has ever known. They have lasted 140 years and will last at least another 140.😁

  • @markldevine
    @markldevine 4 days ago +2

    Correct me if I'm wrong, but quantum computing won't explode in society until a near- or fully-ambient temperature/pressure superconductor is discovered or invented. I infer that a materials-oriented AI will be required for that. So: (1) AI gets us an ambient T/P superconductor, (2) it becomes trivial to scale quantum bits, (3) we all get quantum wristwatches. Yes/no?

    • @monad_tcp
      @monad_tcp 4 days ago +3

      No. Quantum computing is more like cold fusion: it's always 20 years in the future. AI isn't a panacea either; it's basically an automated pattern matcher, more like an accelerator, but it's still the humans making decisions.
      Even if they make QC work, it probably will never be portable and will always be a remote device running in a data center. Getting it to run at ambient temperature seems much harder than getting it to scale to enough quantum bits to be useful. And cooling doesn't miniaturize very well; you are going to need to discover new physics for that, and you can't just put a deadline on discovering new physics. It could happen in the next decade or the next century.
      I think we have already reached peak miniaturization for mobile devices; networking will be riding the innovation curve for the next decade. Mobile computing is ironically going to become more and more like mainframes. I would invest in IBM.
      Mobile phones aren't going to run the computation themselves; they're going to be dumb terminals.
      The fact that we can't miniaturize cooling has already made mobile devices hit a wall of power dissipation: your phone can't use more than 20W of power. If you could solve just that problem, you could use a desktop chip with 500W of power dissipation in a mobile, and that would already be a 10X to 30X increase in performance without even needing QC.
      I would bet my coins on optical computing instead; optronics seems like it would allow a paradigm shift away from CMOS. I won't bet anything on QC for the foreseeable future.
      QC is useful for exchanging cryptographic keys; that's the only actual commercial use I have ever seen for it.

    • @Sven_Dongle
      @Sven_Dongle 3 days ago +1

      There are no quantum algorithms, except for a prime-factoring algorithm and a database search algorithm, that perform faster than classical computing; that's the biggest limitation so far. What they are good at is simulating quantum systems like magnetic spin models, which is not much use to your average spreadsheet user.