Are Computers Still Getting Faster?

  • Published Jan 20, 2025

Comments • 5K

  • @LGR
    @LGR 9 years ago +1203

    Thanks for asking me to be a part of this! I really enjoy how a seemingly simple question has no one answer; it's fascinating stuff.

    • @obsoletegeek
      @obsoletegeek 9 years ago +24

      +Lazy Game Reviews I think we can all agree that modern computing hardware is fairly bland!

    • @8BitKeys
      @8BitKeys 9 years ago +21

      +Lazy Game Reviews Actually, thank YOU for being part of it.

    • @patiencedvrx
      @patiencedvrx 9 years ago

      +The Obsolete Geek I'd somewhat disagree, but it might just be because I bought my first graphics card upgrade in five years back in September, so that was really exciting for me on a personal level :3

    • @-taz-
      @-taz- 9 years ago +10

      +Lazy Game Reviews There might be even deeper underlying reasons for the observations you cited while wearing the MS-DOS shirt I always wanted.
      1. Physics. Processors couldn't get faster or smaller due to bumping up against quantum effects, so they started getting more cores. But the software technology, operating systems, compilers and languages, not to mention the programming methodologies in programmers' minds, couldn't easily adapt. It's only starting to catch up now, and is only partially useful at that.
      2. Oligarchy. Why are tablets and Chromebooks becoming more prevalent, and cool? Because in the late '90s, big intelligence took back the consumer computer industry, which had largely escaped their control. More has to be moved into the cloud, and away from private ownership, so that end users can be both monitored and manipulated. Here's a quotation from Jaron Lanier, who coincidentally was involved with the Power Glove:
      "What we call 'advertising' in Silicon Valley is something totally different: It's micromanagement of the portion of the limited options immediately available to you when you're using an information system, for the purpose of calculated behavior modification." -Jaron Lanier

    • @B-Roll_Gaming
      @B-Roll_Gaming 9 years ago +4

      Your mustache is jarring and generally quite troubling. I love you.

  • @davidmullenix3757
    @davidmullenix3757 4 years ago +642

    "Some CPUs have as many as 8 cores."
    Looks over at the AMD Threadripper with 64 cores and 128 threads...
    Man, this video was only 4 years ago!

    • @DrDoom-yf2qj
      @DrDoom-yf2qj 4 years ago +35

      Sure, but the average is still 4 cores.

    • @ryanmoore8814
      @ryanmoore8814 4 years ago +47

      I'd argue that 6-core CPUs are more popular now, what with Ryzen 2nd and 3rd gen. I could be wrong though.

    • @stigrabbid589
      @stigrabbid589 4 years ago +13

      @@ryanmoore8814 And 6-core Intel Core i5 and i7 CPUs (my ASUS ROG Scar II has a 6-core i7)

    • @Moonknife
      @Moonknife 4 years ago +4

      That's Moore's law for you!

    • @derrickcummings3990
      @derrickcummings3990 4 years ago +1

      The bus was already partly quad-pumped, i.e. four transfers per clock (an example would be a Core 2 Duo with FSB1333, which is a 333 MHz bus clock).
      And if you have a buffer you can fill, or repeating commands that can be reused, the CPU can also skip parts of the work, like it does with the TLB:
      if the TLB still has the data, nothing has to be reloaded and only the new commands have to be executed.
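      To unpack the FSB1333 example, here is the back-of-the-envelope arithmetic (a rough sketch assuming the usual 64-bit-wide front-side bus, not a spec sheet):

        # Rough peak bandwidth of a quad-pumped front-side bus (FSB1333).
        base_clock_mhz = 333.33     # actual bus clock
        transfers_per_clock = 4     # "quad-pumped": 4 transfers per cycle
        bus_width_bytes = 8         # 64-bit data path

        effective_mt_per_s = base_clock_mhz * transfers_per_clock  # ~1333 MT/s
        peak_mb_per_s = effective_mt_per_s * bus_width_bytes       # ~10666 MB/s
        print(f"~{effective_mt_per_s:.0f} MT/s, ~{peak_mb_per_s / 1000:.1f} GB/s peak")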

  • @beedslolkuntus2070
    @beedslolkuntus2070 4 years ago +376

    Kind of amazing how we keep "doubling" every resource of a computer. He said 8 cores was the max? Today we have 64 cores in one socket!

    • @PunakiviAddikti
      @PunakiviAddikti 4 years ago +27

      64 cores? That's gonna generate a fuckton of heat though; a gaming PC with 64 cores would become a stove!

    • @GhostyOcean
      @GhostyOcean 4 years ago +55

      @@PunakiviAddikti Gaming doesn't fully utilize all the cores on your PC, unless you're using them in a virtual machine and they're all gaming.

    • @PunakiviAddikti
      @PunakiviAddikti 4 years ago +6

      @@GhostyOcean Idk what kind of gaming PC you have, but my PC always uses all cores to do various background tasks and assigns a few cores for the game itself.

    • @JamieVatarga
      @JamieVatarga 4 years ago +26

      @@PunakiviAddikti Still, most games are optimized to run on only 4 cores.

    • @PunakiviAddikti
      @PunakiviAddikti 4 years ago +6

      @@JamieVatarga That might be the case, but consider this: if your PC only has 4 cores and the game needs 4 cores to run smoothly, ignoring the GPU and RAM, then your operating system doesn't have any resources left to do background tasks. Just because you have nothing open currently and your CPU should be idle, doesn't mean your PC isn't doing a lot of things behind the scenes. Especially when you're playing a game. Your operating system will use one or two cores to do other tasks if needed, but that takes resources away from the game. Minimum of 4 cores doesn't mean you can get away with only 4 cores.
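      An easy way to watch this happen is to sample per-core load while a game is running (a small sketch assuming the third-party psutil package, installed with "pip install psutil"):

        import psutil  # third-party: pip install psutil

        # Print per-core utilization once a second; even mid-game, the OS
        # keeps scattering background work across every core.
        for _ in range(10):
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)
            print(" ".join(f"{p:5.1f}" for p in per_core))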

  • @SollowP
    @SollowP 3 years ago +103

    "As many as 8 cores"
    AMD: "Gotta pump those numbers up, those are rookie numbers."

    • @johnsledge3942
      @johnsledge3942 3 years ago +3

      The next AMD EPYC chip could have 128 cores; AMD just keeps going!

    • @SkyenNovaA
      @SkyenNovaA 3 years ago +1

      @@johnsledge3942 I'm really starting to like AMD; they're much better than they were a few years back.

    • @everythingtube172
      @everythingtube172 1 year ago

      @@SkyenNovaA Yeah, as soon as Zen happened.

  • @LazerLord10
    @LazerLord10 9 years ago +103

    You should do a video about how efficient computers are. It seems like even though the performance "numbers" have increased quite a lot, a lot of the usage seems to be pretty heavy on the hardware, even when doing things very similar to old hardware. I mean, why does my internet browser use up 500MB of RAM? It can't need all those resources, can it? How did I browse the internet on a system with 256MB of RAM?
    (I know this is a drastic oversimplification, and I'm aware that newer programs do more, but it just seems like with all this excess hardware power, things have become a lot less efficient.)
    Also, I wrote this before the end of the video.

    • @raafmaat
      @raafmaat 9 years ago +17

      +LazerLord10 The higher RAM usage is only because new internet browsers are not just an internet browser anymore like 15 years ago... they can now play smooth ultra HD video and stuff... but if you want, you can revert to a browser from 1998 (use the compatibility for Windows 98 option) and just have it use like 10MB of RAM ;) sure it won't be able to play YouTube videos and stuff but yeah, that seems to be the thing you want :P

    • @fablungo
      @fablungo 9 years ago +50

      As someone who has been doing a Computer Science degree: the difference in efficiency between today and 10 or more years ago is that back then optimisation was really important. Nowadays maintainability outweighs it, so even when using 2x more resources won't gain you 2x performance, it is deemed worth it for any gain. Another issue is complacency when developing. We have basically been taught that for most cases optimising efficiency is a low priority, and when it comes to a developer's time in the commercial world, efficiency doesn't compare to the commercial gain of more features: the attitude is "PCs have the power, might as well use it", whereas before, the efficiency of the application was the difference between it running on home hardware or not, and commercially you weren't going to be able to sell the software if it wasn't efficient. Adding to all this is the power of the internet and the ability to update applications, which drives two things: more maintainable code (usually less efficient) and less time testing and optimising before release (it can be patched after release).
      That's my thoughts on this anyway.

    • @LazerLord10
      @LazerLord10 9 years ago +1

      Fabrizio Lungo
      Thanks for your response!

    • @simplelifediy1772
      @simplelifediy1772 9 years ago +4

      +raafmaat Also, you can add to HD video the sheer number of images and streaming advertisements...
      Remember static webpages that had 50-84k images and a MIDI song playing in the background? And JavaScript was used for mouse trails... lol

    • @raafmaat
      @raafmaat 9 years ago

      SimpleLife DIY
      cached data does not add up to RAM usage.... that kinda stuff is all just saved on the HDD ;)

  • @nko3210
    @nko3210 1 year ago +17

    Fun to look back after ~7 years to see this perspective. It would be a great candidate for a follow-up episode, maybe on this video's 10th anniversary, which is coming up soonish.

    • @morganrussman
      @morganrussman 1 year ago +4

      Maybe if we bring it up enough in his newly released videos, he'll release an update video on it. I know people have repeatedly asked David about certain items in his videos (take the kitty rocket tower, for instance) and he eventually did make a single video explaining where certain things are.

    • @KofolaDealer
      @KofolaDealer 1 year ago +2

      Core 2 Duos are still usable

  • @vwestlife
    @vwestlife 9 years ago +316

    The combination of the Great Recession, the failure of Windows Vista, and the netbook fad forced software to continue to support older operating systems and underpowered CPUs much longer than they normally would've. Also CPU designs like Pentium 4 and PowerPC G5 reached the thermal limits of what a normal desktop PC could handle, which forced new CPUs to focus on being more efficient rather than just increasing the clock speed.

    • @vicr123
      @vicr123 9 years ago +10

      You're here? :O

    • @KaiserTom
      @KaiserTom 9 years ago +10

      +vwestlife It's funny, because Intel was actually coming up with entirely new mobo standards like BTX in order to better handle the amount of heat generated by things like the Pentium 4, and whatever they thought might come in the future that would generate even more heat. We went from 10-watt CPUs to 100 watts in a matter of 10 years, from 1995 to 2005. People were convinced we weren't going to stop, at least not as hard as we did when things like the Core series started coming out, promising more performance for roughly the same power usage.

    • @bryndal36
      @bryndal36 9 years ago +3

      +vwestlife Yes, anyone remember the "Presshot" series?

    • @saturnotaku
      @saturnotaku 9 years ago +4

      +Bryndal Dwyre I had one of those. Worst computer I ever owned.

    • @bryndal36
      @bryndal36 9 years ago +3

      Thankfully I owned an AMD 64 4200+ at the time. Never had any issues with it, and it ran like a dream alongside my 6600 GT.

  • @uthoshantm
    @uthoshantm 5 years ago +668

    Throw an SSD in a 10-year-old PC, add some RAM, and you are good to go.

    • @dragunovbushcraft152
      @dragunovbushcraft152 5 years ago +42

      I have a nine-year-old T500 ThinkPad with 8GB of RAM and a 256GB SSD. It would make someone a GREAT "daily driver". It has Radeon graphics and still plays some pretty decent games. :)

    • @philstuf
      @philstuf 5 years ago +19

      Have a circa-2008 Dell Precision M6300. Threw an SSD in it and it runs like a current-gen PC, with minimal latency. I won't be playing Half-Life 3 anytime soon, but it is a stout system compared to current commodity i3s.

    • @arwahsapi
      @arwahsapi 5 years ago +19

      Using an SSD in a 10-year-old laptop is cool. But DDR2 RAM is scarce and expensive nowadays, and replacement batteries are mostly obsolete.

    • @dragunovbushcraft152
      @dragunovbushcraft152 5 years ago +1

      @@arwahsapi My T500 is DDR3

    • @uthoshantm
      @uthoshantm 5 years ago +12

      @@arwahsapi You have a point. I installed Linux Mint and it works well with 2GB.

  • @Michirin9801
    @Michirin9801 9 years ago +250

    Oh hey! It's LGR! Now that's a welcome cameo...

    • @The8BitGuy
      @The8BitGuy 9 years ago +21

      +Ruko Michiharu Indeed. I was glad to have him in the episode.

    • @8BitKeys
      @8BitKeys 9 years ago +1

      +Computer Whiz 1999 I just watch his channel, although admittedly he has so many videos I have not seen them all.

    • @Michirin9801
      @Michirin9801 9 years ago +6

      ***** I watch his channel; I really like his reviews and his sense of humour. I especially like how in-depth he goes with some of the more obscure stuff he covers.

    • @EgoShredder
      @EgoShredder 9 years ago

      +Ruko Michiharu Agreed, his channel is very good. I had to disagree with his comment about things like Facebook being low in CPU usage, etc. I find it slow and sluggish even on a fast, well-maintained computer on an ultra-fast web connection. On an older computer from ten years ago or less, it can be a pain in the backside.

  • @theinternetstolemysoulbuti2740
    @theinternetstolemysoulbuti2740 8 years ago +60

    Silicon processors are at their limit right now, since transistors have become small enough for electrons to skip across them when manufacturers try to make them smaller, effectively ruining the binary code that we rely on. But once graphene becomes the new medium for circuitry, we will see a resurgence of progress. The graphene CPUs being thought up can handle higher temperatures with less energy wasted.

    • @tobiaszstanford
      @tobiaszstanford 8 years ago +10

      Yes, that's true. But graphene transistors are in development, and if that stuff gets into a CPU, you'll have 20GHz+ as standard.

    • @theinternetstolemysoulbuti2740
      @theinternetstolemysoulbuti2740 8 years ago +5

      It's mainly because graphene's ability to be so thin/electrically efficient is its biggest advantage/disadvantage. It's very easy to mess up a core during its "growth", which would render the processor useless. But IBM has recently made a graphene processor that is capable of sending basic code 10000x faster (I may be wrong, CBA to look it up) than conventional processors.

    • @theinternetstolemysoulbuti2740
      @theinternetstolemysoulbuti2740 8 years ago

      In a nutshell, graphene is so difficult to produce (in high purity at high quantity) that 90s technology would prove ineffective. Silicon is much easier to make circuitry from than graphene, and it's good that we're discovering it now (instead of shelving it as a 'failed' idea).

    • @Hellcommander245
      @Hellcommander245 8 years ago +5

      It makes you wonder how large a computer processor would be if it used vacuum tubes instead of transistors.

    • @CzornyLisek
      @CzornyLisek 8 years ago +3

      Emmm, we are already able to make one-atom transistors. The first one was created in 2004.
      Right now there are numerous one-atom/few-atom transistors, and I don't think any of them use graphene. They used silicene (a form of silicon), alumina (aluminium oxide) and silver.
      About vacuum tubes: they can also be made really, really tiny using the same processes as today's transistors. The smallest vacuum-tube transistor works at 460GHz, and I'm not sure about the specific diameter.
      Keep in mind that the "speed" of a transistor =/= the "clock speed" of the whole CPU/GPU/etc.

  • @QuantumBraced
    @QuantumBraced 8 years ago +494

    The 1990s were insane. You'd buy a PC and 3 years later it would be almost completely obsolete. You couldn't install Windows 98 on a machine from 93-94. Hardware's worth would halve every year. It was like hyperinflation. Nowadays, a 6-7 year old PC is totally fine, if not fast. Even for gaming, get it a budget $150 graphics card and it'll be alive and kicking with new games. I had a 120GB HDD in 2003, and in 2016 that's the starting capacity of new MacBook Pros.

    • @dycedargselderbrother5353
      @dycedargselderbrother5353 8 years ago +31

      If we were to collapse upgrades into discrete computers, I had something like 5 PCs in the 1990s. 3 years is generous. While "total obsolescence" is accurate, realistically you wouldn't be able to run next year's software without some sort of upgrade, often in areas no one thinks about anymore, like CD-ROM drives, sound cards, and incompatible cases and power supplies.

    • @Zack-dk3pt
      @Zack-dk3pt 8 years ago +5

      Yep, got an '09 computer; the only new thing is a CPU fan. Dual-core APU, 4 GB RAM. It can still play most games on low quality settings, some on high, but for the most part it's still a fully functioning computer. It would be perfect for a non-gamer.

    • @stereorail
      @stereorail 8 years ago +6

      In the year 2001 I installed Windows 95 on a 1992 IBM PS/2 386SX system. The most I had to fiddle with was setting the HDD access mode to 16-bit, since the SX had a 16-bit bus. It took minutes to boot but then worked OK (though slower than 3.1) and even went online. Sold it for 40 bux :-) (in 1994 it was on clearance sale at Costco for some hundreds)

    • @maxis2k
      @maxis2k 8 years ago +21

      It definitely was insane. I still remember getting my first real computer in 1992. Within 3 years, I found that it could barely run Windows 95 and I was playing games like SimCopter at 5 FPS. I had to struggle on with that all the way until 2002, when I finally got a new computer as a graduation present. And marveled at the 'amazing' 733 MHz processor. Which I found out was already obsolete for that time.
      My last computer, a Core 2 Quad Q6600, on the other hand lasted me 7 years and can still function just fine for anything except games and video rendering. Like the video pointed out, I think processing power has greatly surpassed the software requirements.

    • @Advection357
      @Advection357 8 years ago +15

      The first system I built, in late 1995, cost over $4000... for a Pentium 100MHz with 16MB of EDO RAM ($800 for the RAM alone) and a Matrox MGA Millennium (the first consumer 3D accelerator ever made... another $500 for that). Needless to say... it was obsolete in 2 years when the Pentium II came out. I don't even remember what I did with it... I think I sold it... been a while hah

  • @mbunds
    @mbunds 6 years ago +13

    Here’s a trivial (and quite possibly obvious) observation:
    I’ve noticed a trend over the years where concentration in the microprocessor industry shifts between substrate improvements to get more FLOPS, and efforts to reduce power consumption of the substrate components, individually, and by applying power saving logic to the system etched into the chip as a whole.
    These efforts are critical to, but separate from, the obvious desire to improve battery life for mobile or terminal devices, and they necessarily occur alongside the evolution of ever faster circuits. Still, a subtle shift in industry focus can be observed over time: more development goes into reducing power consumption before the density of the devices on a substrate can be increased without facing problems with heat dissipation. Once the power budget for the system has been proven to fall within specs, concentration subtly shifts back toward increasing clock speeds and packing more circuitry into the substrate.
    Thanks for an excellent presentation!

  • @DarthScape
    @DarthScape 5 years ago +318

    "As many as 8"
    And we now have 64 cores in a single socket.

    • @uroborous01
      @uroborous01 5 years ago +27

      DarthScape 64 64-bit cores in a single socket. And we still have trouble emulating certain N64 games on them.
      We're going really fast, but where are we going exactly?

    • @uroborous01
      @uroborous01 5 years ago +12

      bootleg linux Well, I was trying to make a cheeky emulation nerd joke, but I guess I kinda failed.
      You do make a valid point tho.

    • @CommonApathy
      @CommonApathy 5 years ago +5

      And you'll only ever use like 3 of them except for very, very specific tasks.

    • @ILikeStyx
      @ILikeStyx 5 years ago +2

      Well... it's 2020, and the 64-core (128-thread) AMD Threadripper has yet to be released ;)

    • @SavageDarknessGames
      @SavageDarknessGames 5 years ago +2

      The issue isn't how many cores are on a single socketed chip; the issue is programmers taking advantage of multi-core, multi-threaded programming.
      Unfortunately, most games and commonly used applications have no use for more than one core.
      The only place you'll find multi-core-aware programs is in things like gene splicing and physics computations.

  • @Agorante
    @Agorante 6 years ago +161

    All this makes me feel old. You start personal computing at 1975. I started a little before that. I had an Altair, but my first really useful computer was my Commodore PET. My boss and I both got interested in PCs around 1975. He went with a Radio Shack TRS-80. It had 4K of RAM. I wanted more power so I got a PET with 8K. But before he received it, it expanded to 16K. I was jealous.
    I don't remember how much RAM the Altair had. I think it was about 512 bytes. Of course it didn't even have a boot ROM. In order to boot it you had to set the switches on the front panel in a specific order and watch all the little lights. I only did that a couple of times. Once you booted it you could load a Microsoft BASIC tape. It took a long time to even get a prompt indicating that you had BASIC.
    So when I got the PET and turned it on and it showed the OK prompt immediately, I was thrilled. It was all so easy. Of course that meant I had to go to the public library and get a book on BASIC.
    Sounds primitive now, but I worked in the government and nobody anywhere except Carl with his TRS-80 had a computer. I was looked on as some kind of wizard. It made my career. I wrote statistics programs. The county paid for the PET. And I had a kind of statistical job. There were almost no pre-written applications in those days. I wrote a word processor which allowed me to publish a monthly agency status report. By modern standards it was a terrible word processor, but I was the only person who had a word processor. That's how I got a couple of promotions into management. So when the Reagan cuts came I wasn't cut. I did the cutting.
    I wrote a sort of VisiCalc program too. That's how I got into managing the budget. The relevant metric is not how much power your computer has, it's how much you have versus the guy in the next cubicle.

    • @drewbadwolf5182
      @drewbadwolf5182 5 years ago +12

      Means you're more knowledgeable than most of us

    • @jimhewes7507
      @jimhewes7507 5 years ago +7

      I started almost as far back, on the VIC-20 which had 4K of free RAM. (I don't count a little Fortran in college before that). I started with BASIC but very quickly realized there were some programs running faster than what was possible with BASIC and so I discovered machine language. After that it was almost all machine language for me with TinyMon. I discovered how to insert a game cartridge so that I could see the code in the cartridge and so I would disassemble the whole code on paper to see how it worked. When I finally got a C64 a couple years later, 64K was like boundless space.
      But I was never as productive as you to write your own word processor and spreadsheet. Wow!

    • @uroborous01
      @uroborous01 5 years ago +2

      Damn... those were the days...

    • @TheSodog8
      @TheSodog8 5 years ago +4

      Ah, the early days. I got a promotion to CNC programmer because I knew how to type at a C:\ prompt.

    • @khemikora
      @khemikora 5 years ago +4

      My first computer was my beloved C64 with a whopping 64 kilobytes of RAM. After that I moved to the Amiga 1200 with 2 megabytes of RAM! 'Twas quite a leap! My next machine was a 300MHz AMD PC with, I think, 64MB of RAM. My current computer now has 16 gigabytes of RAM. Isn't computing progress great!

  • @charleshayesvaughn
    @charleshayesvaughn 5 years ago +89

    And now you'll be at a million in probably less than a month. Congrats dude.

  • @johnh6524
    @johnh6524 5 years ago +86

    Back in the day the expression was "Whatever Intel giveth, Microsoft taketh away". I wonder if the rise of open source has had an effect on this?

    • @tiporari
      @tiporari 5 years ago +8

      Nope. Open source, Unix/Linux, all very cool. Impact on 99% of consumers? Negligible. The performance demands of a fully featured open-source-only machine are right up there with Windows. All the bells and whistles cost the same in terms of compute resources whether MSFT developed them or some open source coder did. In many cases Linux performance is worse overall because of poorly optimized drivers, lack of hardware acceleration, and less maturity in open source alternatives to mainstream tech like Java, for example.

    • @herrfriberger5
      @herrfriberger5 5 years ago +3

      I sure "hate" MS, but few of their products were very inefficient in terms of clock cycles or CPU usage, at least compared to Adobe and many others (although something like that saying was true regarding disk space during the 80s and 90s).
      Windows 3.1 and up started *far* too slowly, for no other reason than bad design and optimization, but this had more to do with lots of (slow) HD accesses than with CPU usage. Many (not all!) GNU/Linux or open source projects are equally sloppy designs, unfortunately.

    • @deusexaethera
      @deusexaethera 5 years ago +2

      It absolutely has had an effect -- a negative effect. Open-source code is generally written like shit and is even more inefficient than Microsoft products, when adjusted for the same level of feature-richness and hardware compatibility. Just look at the horrible clusterfuck that we call "the internet", which is mostly based on open-source code. The best code is written by a single developer who has a long time to refine it -- basically what I do for a living.

    • @martijngeel9224
      @martijngeel9224 5 years ago +7

      @@deusexaethera Ah, you are a programmer who gets paid by the line, and the more lines of nonsense you produce, the more you get paid. Linux is faster than Windows. But I guess you never heard of Linux. Linux is rising, certainly because Microsoft is stopping with Windows. I mean Windows 7; Windows 10 is crap: more clicks, more time, look busy and do less.
      The longer it takes for your code to execute, the less you should get paid. And you thought that it was a work of art; shame on you. Look at the demo makers on the C64: that is a work of art, well programmed by smart people. No, I am not a demo maker, but I sometimes program in C64 assembly and look at their code.
      Have a nice day.

    • @deusexaethera
      @deusexaethera 5 years ago +9

      @@martijngeel9224: Nobody pays programmers per line of code anymore, dumbass. I get paid a salary. And of course I've heard of Linux. I've used it occasionally since the late 1990s, and my opinion remains the same: Linux is the best OS to use when I want an OS that requires me to fix every single feature because none of them work correctly.

  • @lobaxx
    @lobaxx 7 years ago +583

    They are all wrong.
    Clock speeds have stagnated not because of changing consumer habits, but because of the heat wall. Simply put, heat grows much faster than linearly as clock speed rises, which makes substantially faster processing cores impractical. This forced processor makers to go for increased parallelism (more cores at the same clock speed instead of one core that is faster).
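    To make the heat wall concrete: the standard first-order model for dynamic CMOS power (a generic textbook relation, with switching activity \alpha, capacitance C, supply voltage V and clock frequency f), together with the fact that V must rise with f for the chip to stay stable, gives

      P_{\text{dyn}} \approx \alpha\, C\, V^2 f, \qquad V \propto f \;\Rightarrow\; P_{\text{dyn}} \propto f^3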
    Now, in theory, increased parallelism increases processing power, but it isn't as simple in practice.
    In the olden days, a program would get faster with newer hardware because clock speeds increased - programmers didn't have to change anything in a program to see performance improvements. However, since 2005-ish the change has been not in clock speeds but in the number of processing units. Since the clock speed has not changed, a program runs just as slowly on new hardware as it did on older hardware. In order to get the theoretical increase in performance, programs need to do various things at the same time.
    So here is the first hurdle - programmers now need to re-write their code to gain performance where before the performance just happened by itself. Programmers are lazy and management is cheap, so this rarely happens. Also, programming parallel code is hard. Like, really really hard, with bugs appearing everywhere that are almost impossible to reproduce.
    But assuming deep pockets and motivated staff, you still cannot expect performance increase to match the theoretical hardware output by parallelizing the code. Very, very few problems are completely concurrent (mostly, those lie in the domain of graphics, which is why graphics cards with hundreds or even thousands of processing units exist), and thus Amdahl's law kicks in.
    Let's look at an example program: it does some task A, and uses the result to do B, and both take the same amount of time on a single-core processor. No matter how many thousands of cores we have, we must calculate A before we can calculate B, and although B is a type of problem that can be parallelized, A isn't. This means that no matter how many cores we throw at the program, A will always run at the same speed. So even if we have a fictional supercomputer with an infinite number of cores, it will only run the program twice as fast compared to an old, single-core processor. Infinite processors, but only twice as fast! So it's no wonder that new programs run just fine on old hardware.
    And that is as condensed an introduction to modern computer science and distributed programming as I could muster.
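    The A-then-B example is just Amdahl's law; a few lines of Python make the ceiling obvious (p is the fraction of the program that parallelizes):

      # Amdahl's law: best speedup on n cores when a fraction p parallelizes.
      def amdahl_speedup(p, n):
          return 1.0 / ((1.0 - p) + p / n)

      # The A-then-B program above: half serial, half parallel (p = 0.5).
      for n in (1, 2, 8, 64, 10**9):
          print(f"{n:>10} cores -> {amdahl_speedup(0.5, n):.3f}x")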

    • @steve1978ger
      @steve1978ger 7 years ago +30

      Yup, this is only in part demand driven (if at all). As well explained in the video, the computer industry has never had a problem creating demand for faster machines. IC miniaturization has come to physical limits, in heat dissipation, in manufacturing processes, in getting quantum effects (not the good ones) while approaching the atomic level; and at the same time there are tasks that can not be parallelized to multi core CPUs very well.

    • @theelectronwrangler6416
      @theelectronwrangler6416 6 years ago +16

      I was going to add to this but then saw there was more talking about parallel computing and its own hurdles. This about completely covers the why, though there are some other technological advances that may still help improve clock speed, or at least the speed/bandwidth of inter-core communication. I'm fond of carbon nanotubes, but mostly just because they're fun to say :)

    • @marlona5205
      @marlona5205 6 years ago +15

      I am with you on this one. We have a thermal limitation; until we find better affordable materials, CPU speed won't increase. Every device we currently own has a life span of less than 6 years, depending on usage. Heavy-usage devices such as cellphones only have a life span of less than 2 years, and it's all due to the materials being used, where the solder joining the ICs/CPU either embrittles or melts, creating shorts and connection issues.

    • @kraazed
      @kraazed 6 years ago +25

      I agree with what you say, in that the heat wall, thermal transfer and package complexity are the reasons clock speeds haven't increased beyond 4GHz.
      But I also agree with the people in this video: it's because consumers drive the vendors, and everyone wants lightweight portable devices with great power efficiency. This is why die sizes have been shrinking, and why the only real gains in the last 5 years have been in watts per FLOP.
      If die sizes were bigger and power caps were not limited to 140/200W for the CPU sockets, we would easily be seeing stock computers running at 5GHz or greater today, which would certainly yield higher performance.
      I have a water-cooled 4930K @ 4.6GHz; this chip is now 5 years old. It pulls about 230W through the CPU socket at full tilt, in a socket designed for about 130W. My biggest limitation is heat and feeding the processor electricity. A more modern 6-core at 4.6GHz would use about 2/3 the power of my 6-core and deliver 10% more performance.
      With all that being said, processors and motherboards these days are not designed for higher clocks. Given the right engineering and architecture, processors could easily see 6GHz, but the heat output and power consumption would be significant.

    • @woodwind314
      @woodwind314 6 years ago +17

      Good reply. The heat wall, though, is more a symptom than a cause. Of the early-2000s mainstream processors, the Intel Prescott was probably the worst in terms of heat production, but it could still be ramped up over 4GHz clock speed - if you put in the right cooling. Now while this was feasible in big desktop machines, it was (and is) not feasible in smaller devices. Hence a need for a drastic lowering of power consumption (== heat output) of the processors. Of course, considerations of battery usage in mobile devices also played a big role in that paradigm shift in CPU design.
      Another reason why we don't see any speed increase in CPUs (especially if you look at the individual core) is that memory is lagging orders of magnitude behind CPUs in speed, while memory demands continue to increase. So to actually get faster computers, the road to go is to get RAM faster, and to write software that is optimized to decrease memory usage and, more importantly, memory access. And that is actually easier done than parallelization.
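      The memory point is easy to demonstrate: the same arithmetic runs at very different speeds depending on whether the access pattern plays nicely with the cache (a small sketch assuming NumPy is installed):

        import time
        import numpy as np

        a = np.random.rand(4096, 4096)  # ~134 MB, row-major in memory

        t0 = time.perf_counter()
        rows = sum(a[i, :].sum() for i in range(4096))  # sequential, cache-friendly
        t1 = time.perf_counter()
        cols = sum(a[:, j].sum() for j in range(4096))  # strided, cache-hostile
        t2 = time.perf_counter()
        print(f"rows: {t1 - t0:.2f}s  cols: {t2 - t1:.2f}s")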

  • @MEGATestberichte
    @MEGATestberichte 7 years ago +14

    This channel is just awesome. I am an aging tech nerd myself. Love it.

  • @RonJohn63
    @RonJohn63 5 years ago +9

    2:04 The company I work for just sent me a 4GB DIMM to add to their laptop, bringing it up to 8GB before the upgrade to Win 10. Yes, for the past two years, I've been running W7Pro with 4GB.
    OTOH, my own desktop machine is a "PC of Theseus". The mobo is about 5 years old, with AMD FX-6100, but 240GB SSD and 32GB RAM. A 10TB spindle is for data. I have zero need to upgrade the CPU or mobo any time soon, and have finally reached "32GB ought to be enough for anyone".

  • @dragunovbushcraft152
    @dragunovbushcraft152 6 years ago +2

    I've been repairing and working on computers for 43 years. I've watched them "grow up". I have had WELL over 1000 different computers (maybe twice that!). I still use a Lenovo T500 with Radeon graphics for lots of things. I also have several Skylake and Kaby Lake systems. Other than gaming and a select few pieces of graphics software, I find my T500 to be more than up to the task for modern computing. You are partially right about the software issue; however, there is one more important factor:
    Upgradability.
    My T500 came with Vista, 2GB of RAM, a 160GB spin drive, a 1680x1050 display, and ATI Radeon 3650m graphics, in 2009. It was a very expensive machine when new. It now has Win7 Pro, 8GB of RAM, a 256GB SanDisk SSD, a T9600 Core 2 Duo processor, and a 500GB WD "Black" HDD in the optical drive bay. I can do some gaming on it, I can run CAD software, I can edit video on it, I can run Office 2013 on it. It runs most of these as well as my Kaby Lake systems. It boots nearly as fast as my Kaby Lake systems running Win10. If I installed Win 8.1 with Classic Shell on it, it would be even faster. The display is more than adequate for most modern applications. This computer was upgraded with all used parts I got mostly for FREE. This computer will still be usable for at least another 7-10 years. Maybe longer, depending what I use it for. I use it as a demo model, as well as for daily use. People are AMAZED at my T500, as it keeps up with, and even surpasses, the speed and usability of their newer machines.
    I've sold untold numbers of R400/500 and T400/500s that I rebuild, and I give a very HEFTY warranty with them. All have the Radeon graphics in them. Other than a blown RAM chip or two, and a crashed mechanical drive or two, the ONLY thing brought back on warranty is for general upkeep. One customer has had one of my T400 laptops so long that I had to re-paste his CPU (covered under limited lifetime warranty) and do general HDD cleaning (part of warranty). I have NEVER had a "failed" motherboard or display come back. Machines built from 2009 to 2016 or so are robust, and very upgradable.
    No "obsolete" here!

  • @VideoNOLA
    @VideoNOLA 6 years ago +106

    If you ran the same old MS-DOS programs we had in the late 1980s through mid-1990s on today's PC (if you could magically do so in the first place), they would run so blindingly fast as to be useless. Back then, programming was streamlined and efficient out of necessity, as CPU cycles and memory came at an extreme premium (remember running HIMEM and EMM386 just to shoe-horn a few more TSRs into RAM during bootup?).
    Sadly, today's bloatware is the opposite. Huge, sloppy, un-optimized, resource-hungry, and so high-level and fault-tolerant that it's as if IBM were back paying developers by the line of code. I'm constantly amazed that ANYTHING manages to slow down a 4-core, 8GB, 64-bit desktop computer when asked to do JUST ONE THING (i.e. open the Chrome browser). It's so sad.
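    For anyone too young to have suffered it, a typical DOS 5/6 memory setup looked something like this (reconstructed from memory, so treat it as an illustration rather than a canonical config):

      REM CONFIG.SYS - load the memory managers, then push DOS and drivers high
      DEVICE=C:\DOS\HIMEM.SYS
      DEVICE=C:\DOS\EMM386.EXE RAM
      DOS=HIGH,UMB
      DEVICEHIGH=C:\DOS\ANSI.SYS
      REM AUTOEXEC.BAT - load TSRs into upper memory with LOADHIGH
      LH C:\DOS\DOSKEY.COM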

    • @Lambda_Ovine
      @Lambda_Ovine 5 years ago +18

      Let's not forget that, back then, developers fine tuned their software to run in specific architectures and platforms with considerable life spans. Today, with all the diversity and configurations of hardware out there, you simply don't have that luxury anymore.

    • @frikkiethirion8053
      @frikkiethirion8053 5 years ago +8

      Use DOSBox to run your vintage/abandonware on new machines.

    • @arthur_p_dent
      @arthur_p_dent 5 years ago +24

      Back in those days, software developers had 2 precious resources to look after: they would optimize a program either for minimal memory use (both RAM and hard disk space) or for maximal speed.
      Nowadays, both memory and speed are available in abundance, so a third resource comes into play: human effort. From a developer's viewpoint, it typically doesn't make sense to spend thousands of extra hours developing and coding in order to minimize RAM usage or maximize speed. Hence programming languages like Java, which are 1000 times slower than GWBASIC while needing 1000 times more RAM, but allow applications to be developed in record time (compared to machine language, anyway).

    • @SecretSauce2738
      @SecretSauce2738 5 years ago +6

      @@Lambda_Ovine That's what kernels and drivers are for. I can open Chrome on a Windows machine with a completely customized build, and any inefficiencies in the kernel or drivers are simply not enough to be a bottleneck to the speed of the system. Try running a minimal open source web browser on Linux with that same hardware, and then compare its speed to the original Chrome on Windows. If they can do the same thing, the difference in speed is certainly not a lack of optimization for a specific hardware setup; it's the bloat in the software.

    • @RetroDawn
      @RetroDawn 5 years ago +4

      @@arthur_p_dent Actually, Java is *significantly* faster than GWBASIC, or any interpreted BASIC. Java is actually on par with C++ in speed, and is even faster on many occasions. And Java definitely doesn't need 1000 times more RAM. I've been a professional software developer for over 25 years, and taught myself BASIC and PILOT back in '83 and '84, respectively, when I was 10.

  • @fheyo
    @fheyo 8 years ago +16

    The fundamentals of PC usage ten years ago were 1) web browsing, 2) viewing quality photos, 3) document creation, 4) watching a movie, 5) listening to music, and all of that was perfectly doable. There is no room for improvement there. Improvement is needed only for 3D rendering, video editing, specific work needs, and a better gaming experience.

    • @Raptor3388
      @Raptor3388 8 years ago +4

      Yes, the most common sites have become so bloated they will bring older computers to their knees, even dual-core computers. YouTube, mostly because of Flash, is very demanding, and even Facebook now. I've noticed it's become noticeably harder to use on a lower-end 2009 laptop (Pentium T4500 and 4GB DDR3 in my case), which used to do everything flawlessly.

    • @OMA2k
      @OMA2k 5 years ago

      @@picketf Webpages using PHP are irrelevant to the power your computer needs to display them, because PHP runs on the server and your computer only gets a plain HTML representation generated on the server from the PHP (or any other server language). Scripts and HD video do put strain on your computer, though.

    • @OMA2k
      @OMA2k 5 years ago

      @@Raptor3388 : Flash has an undeserved bad reputation. You say YouTube is very demanding "mostly because of Flash" when in reality YouTube had already stopped using Flash quite some time before you wrote this comment 3 years ago. In fact, HTML5 animation is more CPU-demanding than Flash. Those pesky animated banners that are everywhere on websites didn't die when everyone stopped using Flash, but now they tax the CPU a lot more than when they were actually Flash animations. I'm sick of hearing my laptop fans blowing at full speed whenever I'm on a website with several animated "HTML5 canvas" banners. That didn't happen with Flash. I hate that my laptop has to consume more battery just to play some stupid unwanted ads. Those kinds of ads should be banned, and only static image banners allowed. I'm not more likely to click on them just because of some silly animation. But I digress... For some reason, people like to blame everything on Flash even when it's not really used, probably because of some second-hand opinion from some "guru".

  • @klaxoncow
    @klaxoncow 5 years ago +58

    There's another obvious thing that I'm surprised was missed.
    When you showed the 2015 model and the 2005 model, they're both PCs. But when you showed the 1990 model versus the 1980 model, it's a Commodore PET versus a NEC Ultralite.
    Early computing had zero regard for interoperability and compatibility. Every single machine was its own island.
    Even in regards to its own brand, such as with the PET, VIC-20, C64 and Amiga. All from Commodore but, with regard to interoperability with each other, there was essentially none and they might as well have been manufactured by different companies. It made no difference.
    But, with the PC, IBM accidentally created a standard which others could follow. And very much a crucial point in selling your "PC compatible" was that, yes, it was compatible. Manufacturers were aiming to make their machines compatible and interoperable with other PCs and PC software.
    Microsoft is known to go to quite some lengths - under the hood - to preserve backwards compatibility. To the point that various versions of Windows are known to detect certain pieces of the most popular software and actually change settings and recreate bugs from previous versions to keep them alive and well on a new OS. It has a big list of "exceptions" for various software and then changes its behaviour to keep those bits of software alive.
    Then there's the user visible stuff like being able to select compatibility profiles on applications. Please run this app as if this were still Windows XP, please.
    And, over those decades, we've increasingly had more interoperability standards. From well-known file types - like MP3, PNG, etc. - to Internet protocols to, like, industry bodies getting together to define the Vulkan API.
    In 1980, no-one gave a crap about interoperability. It was essentially non-existent. By 1990, it was understood to be a good thing and was actively strived for. By 2000, it was essential - the web simply can't work without it. By the late 2010s, even Microsoft is conceding that it needs interoperability with Linux and is including the "Windows Subsystem for Linux" and then starts work on its own Linux distribution. It also conceded - a few days ago - to using the open-source Chromium for its Edge browser (there's little to gain from the immense effort of developing its own HTML engine, as you can't lock people in like that anymore and, anyway, Chrome is kicking their arse and they'd never make up the difference and they know it).
    And there's been a slow but steady embrace of open source, as it's understood that, actually, interoperability is king. Proprietary closed source implementations cut you off and erect "walled gardens" around you - this really isn't useful. At first it might seem fine. But as you upgrade machines, change vendors, change software, etc. then it has become increasingly clear to many businesses that, whatever closed source offers, it'll sting you badly in the arse in 10 years' time, when you can't escape their "walled garden" that keeps you paying.
    In 1990, there was zero interest in making any of your software from your 1980 Commodore PET function. And if you were attempting "backward compatibility", then your 1990 laptop would have to deal with a hundred systems. Those were the bad old days. Let them go.
    But, in 2020, there is reason to still be concerned with XP compatibility, with 32-bit machines, and generally aiming to be "backwards compatible".
    And there's a long legacy of this now. For example, let me use my psychic powers. Choose any Windows EXE on your system. I predict that the first two bytes are "MZ". Check that in a hex editor. Further along - at the offset stored at byte 0x3C of the file, usually only a few hundred bytes in - you'll then find the two bytes "PE".
    How am I making these predictions?
    Because, you see, MS-DOS detects whether a file is an executable by the initial two "MZ" bytes - the initials of one of Microsoft's engineers, Mark Zbikowski, if you're wondering - and, for "backwards compatibility", Windows executables still start with "MZ" and have an MS-DOS "stub" program at the front. It's literally a working MS-DOS executable that is usually programmed to print "This program requires Microsoft Windows" and exit.
    Because, yes, that prompt is NOT coming from the OS. MS-DOS is running the EXE file. Because the "MZ stub" program at the beginning is a legitimate MS-DOS executable.
    And then, within the MS-DOS executable headers, is a pointer to the "New Executable". This was originally an old style Windows executable, identified by "NE" for "New Executable", but later 32-bit versions of Windows moved to the "PE" (or "Portable Executable") format. There's also a 64-bit PE these days too.
    But, anyway, technical details aside, the point is that every Windows program (and DLL) on your system is preceded with an MS-DOS executable, on the off-chance that you attempt to run it in DOS, for it to print "This program requires Microsoft Windows" and quit.
    This hopefully gives you a measure of what I mean by how much effort and expense and legacy is in the modern system to be "backwards compatible" with all that preceded it.
    In future, when you're downloading "Half-Life: Alyx" to play the most modern VR on your latest and greatest Windows 10 system, the first two bytes of the executable will be "MZ". Because there's a fully working MS-DOS "stub" program bolted onto the front of them all, to maintain compatibility with MS-DOS.
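    Don't take my word for it; a quick Python sketch will check any EXE or DLL on disk (the DOS header stores the pointer to the PE headers at byte 0x3C):

      import struct
      import sys

      # Verify the "MZ" stub and follow its pointer to the "PE\0\0" signature.
      with open(sys.argv[1], "rb") as f:
          data = f.read()

      print(data[:2])                                       # b'MZ'
      (pe_offset,) = struct.unpack_from("<I", data, 0x3C)
      print(hex(pe_offset), data[pe_offset:pe_offset + 4])  # b'PE\x00\x00'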
    Here's another thing to consider as well. There are natural limits to things.
    On the 8-bit machines, you had monochrome (1 bit per pixel) and maybe 4 colours (2 bits per pixel). Perhaps your system was capable of 16 colours (4 bits per pixel). And then VGA brought us 256 colours (8 bits per pixel). These days, we all pretty much universally use "TrueColour" or 24 bits per pixel.
    In certain fields, like scientific imaging, you might push beyond that and HDR TVs get all floating point about it. But there's the idea of a natural limit.
    The human eye just can't see more than 24 bits per pixel. So there's no point, for pure display purposes, in storing more than that for an image. Your eye simply can't see it.
    (In fact, the range of human eyesight is actually closer to 16 bits per pixel. But our colour sensitivity is not uniform across the range - we can see more in the yellow / green range - so 16 bits is not quite good enough to be completely imperceptible to the human eye. But it's right on the edge. 24 bits per pixels - a byte for each of RGB - exceeds that, is more than good enough and is simple to program and deal with.)
    While video qualities rise - 576p to 720p to 1080p to 4K to 8K, and 30fps to 60fps to 144fps - the number of letters in the English alphabet remains 26. Now that Unicode covers every conceivable alphabet (and much more), text files don't naturally get bigger.
    (And I'm talking about 4K / 8K and 120fps / 144fps - well, there's diminishing returns on these things. More resolution and higher frame rate does make things slightly better, but only slightly. It's four times more data to do 8K than 4K - which itself is four times more data than 1080p - but is 8K really 16 times better than 1080p to look at? When you compare 320 x 200 resolution to 640 x 480 then the increase is stark and obvious. From standard definition's 576p to HD of 720p and 1080p, everyone could see the clear advantage (particularly as it coincided with screens becoming flat panels rather than bulky CRTs) so everyone upgraded. But selling 4K and 8K has not been so easy, because the leaps get less and less significant.)
    There's the "law" that things expand to fit the available space. More RAM, more disk space and programs and data just get bigger to fit. Well, that's largely true - until you start hitting these natural limits. There's no point going beyond 24 bits per pixel. So images stay at that and all the extra RAM and disk space starts meaning "more pictures", not "pictures getting bigger to fit the available RAM / disk space" as they once did.
    And once machines are good enough to, you know, display a 24 bit per pixel image or decode an MP4 stream, then all exponential hardware improvements mean is that the CPU / GPU is less taxed and capable of doing more things simultaneously.
    There are some things that will, no doubt, keep on getting bigger and bigger to match hardware improvements. But there are lots of things that have now reached the natural limits. Older hardware can reach those natural limits, so improvements are not about being capable of doing things anymore, in quite the same way, but about being able to do it easily, with CPU to spare for another 8 tasks at the same time.
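    To put numbers on those natural limits, here is the raw, uncompressed data per frame at the 24 bits (3 bytes) per pixel discussed above:

      # Uncompressed frame sizes at 24 bits (3 bytes) per pixel.
      resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}
      for name, (w, h) in resolutions.items():
          mb = w * h * 3 / 1e6
          print(f"{name}: {mb:6.1f} MB/frame, {mb * 60 / 1000:5.2f} GB/s at 60 fps")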

    • @pointlesspublishing5351
      @pointlesspublishing5351 4 years ago +1

      Excellent answer. Very in-depth. Coincidentally, it fits with my gaming and job experience with computers. My second screen is an old 19" from 2005... and my main screen offers Full HD at 24", so that a book page is a "real page" on screen (enough for me as a professional author). WHY should I get more? It does not make SENSE. To see my text written in UHD? And in gaming... I have the feeling the console thing (basically all consoles have the same games available, excluding some exclusives) also stopped the gearing-up-for-games process. Since the Xbox 360, I remember being able to buy the same game for PS3, 360 AND PC... and while of course I COULD get better performance on a potent computer, I got the impression in the 2000s that I just NEED a computer which can MATCH console performance for ACCEPTABLE gaming results. Everything else seemed to be a waste of money, unless you're really into it. Crysis... ahem.

    • @stevejobs6693
      @stevejobs6693 4 years ago +1

      While the human interface device, screen or speaker will certainly hit a limit based on what our senses can perceive, the state of the art (imagine a 64-bit color space or 16K resolution) will continue to grow, for two reasons: 1. Increased software capabilities (things you can do with the additional data); think of how your smartphone uses multiple cameras to input spatial data for security or utility (i.e. portrait mode), or, for example, infrared-sensor augmentation of data through software. And 2. Machine learning / deep learning (AI) will eventually surpass humans in computational throughput. Neural networks are already using/creating non-human-interpretable data and intermediates to solve problems, so we'll have to keep growing to make these systems more robust/efficient.

    • @Porygonal64
      @Porygonal64 3 years ago +1

      tldr

    • @Xyspade
      @Xyspade 2 years ago

      I know it's been 2 years but wow, you put an immense amount of effort into that article, and it did not go unnoticed. I read the whole thing. And I too am surprised that this was missed in the video because I think this is the most accurate answer. Accept my one upthumb because you deserve a lot more.

  • @DaMaster1983
    @DaMaster1983 5 years ago +2

    I know you will find it weird.. but what makes your channel so successful is your voice.. and how pleasant you are, so that it's kinda relaxing to watch.. plus of course I'm from the 80's so it brings good ole memories.. and we also learn to fix stuff.. thanks to your wonderful, well-edited videos..

  • @10p6
    @10p6 9 years ago +14

    I think the main point comes down to these 4 issues.
    1. Manufacturers have reached the maximum clock speed possible without expensive cooling systems. Therefore, to get more speed, more CPU cores are being added. Once again though, these are limited by the clock speed of the bus.
    2. Related to number 1: RAM clock speeds have stagnated too, making it harder to effectively push the bandwidth of multiple cores through the bus. Front-side bus speeds have seen limited growth.
    3. A majority of people have moved over to laptops. This creates two main issues: limited space for CPU cooling, but also battery capacity has not increased at the same rate as CPU power did. So manufacturers do not put the fastest CPUs in laptops, as otherwise the computer would only last a few minutes on battery.
    4. In the older days the main CPU handled the bulk of the system's graphics. These days virtually everything has been offloaded to the GPU, meaning smaller CPUs are required for general operation.
    Right now I am writing this on a 6-year-old Toshiba M780 Tablet (a real tablet PC) with 8GB RAM, dual-core i7, RAID, touch screen and so on. So why upgrade, as it does 99% of what I need to do, and even now the bulk of laptops are no faster except in 3D, unless you go to large, expensive and bulky laptops? Next to this tablet / laptop, though, is my HP workstation with 36 cores, 256GB RAM and quad-SLI Quadros. On the laptop side speed has not really increased; on the desktop / workstation side it has drastically increased, but so has the cost.

    • @volkswagenginetta
      @volkswagenginetta 9 years ago

      +10p6 You mean 18 cores with hyperthreading, the Xeon E5-2699?

    • @10p6
      @10p6 9 years ago +3

      +volkswagenginetta No, I mean twin 18-core Xeons: 72 hyperthreads.

    • @volkswagenginetta
      @volkswagenginetta 9 years ago

      10p6 Oh right, double socket. My bad.

    • @clyax113
      @clyax113 8 years ago

      +10p6 Is there a similar model with a little higher performance than the one you mentioned originally that could last me about 8-10 years?

    • @ToriRocksAmos
      @ToriRocksAmos 8 years ago

      +10p6 Just want to add one comment to your point 1:
      Adding more CPU cores instead of increasing clock speed (or performance per clock) doesn't scale nearly as well as increasing clock speed.
      For example: if you increase the clock speed of your 3GHz quad-core by 10%, it'll perform almost 10% better in most tasks, resulting in close to 100% scaling.
      If you add 2 more cores @ 3GHz to your quad-core, you probably won't see much of an improvement - unless you are doing a lot of multi-tasking, or you are using prosumer multicore-optimized software.
      (I know that I reduced the complexity of the matter in my statement; hopefully it still makes sense.)
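      A toy calculation of those two upgrade paths (assuming, for the sake of argument, that only 30% of a typical desktop workload parallelizes):

        # Compare a 10% clock bump with going from 4 to 6 cores, Amdahl-style.
        def core_scaling(p, old_cores, new_cores):
            return ((1 - p) + p / old_cores) / ((1 - p) + p / new_cores)

        p = 0.30                     # assumed parallel fraction of the workload
        print("clock +10%: 1.100x")  # a clock bump speeds up the whole program
        print(f"4 -> 6 cores: {core_scaling(p, 4, 6):.3f}x")  # ~1.033x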

  • @Lexyvil
    @Lexyvil 9 years ago +31

    Also that last song sounds like it's coming from the classic Lemmings game~

    • @The8BitGuy
      @The8BitGuy 9 years ago +13

      +Lexyvil It is... I made a whole episode about the Lemmings music.

    • @bryndal36
      @bryndal36 9 ปีที่แล้ว +3

      +The 8-Bit Guy I was going to say the same thing about the Lemmings music. It brings back great memories of playing Lemmings on a friend's 386 so many years ago. :)

    • @Siwena
      @Siwena 9 ปีที่แล้ว +1

      +Lexyvil Hey old timers :). Glad to see that comment has been covered :)

    • @user2C47
      @user2C47 9 ปีที่แล้ว +1

      From a YM3812

  • @Vriess123
    @Vriess123 8 ปีที่แล้ว +16

    CPUs do seem to be reaching diminishing returns. A Sandy Bridge processor from 5-6 years ago is actually very close to the brand-new Skylakes, and honestly there isn't much reason to even upgrade if you are overclocking it.

    • @Vriess123
      @Vriess123 8 ปีที่แล้ว

      Ivy was only a little faster than Sandy. Sandy is really the start of the CPUs that are still very close to the newest ones.

    • @Patchuchan
      @Patchuchan 8 ปีที่แล้ว

      True, a Sandy Bridge is still a good CPU today.
      I can see where he's coming from, as compared to the past the rate of advancement has slowed.
      Compare an Apple II+ that came out in 1979 to an Amiga 1000 that came out in 1985.
      Both were state-of-the-art machines for their day, but the latter is vastly more capable.

    • @howlingwolven
      @howlingwolven 8 ปีที่แล้ว +1

      The big thing today seems to be with graphics. GPUs nowadays use a LOT more power and do a LOT more than they used to. Ever more pixels means ever more cores to drive them at an acceptable framerate, which has also increased greatly.

    • @Vriess123
      @Vriess123 8 ปีที่แล้ว

      Yeah, while CPU speed has gotten pretty stale, graphics cards are still getting faster and faster. I wonder when they will start hitting a wall with that as well though.

    • @bit2shift
      @bit2shift 8 ปีที่แล้ว +2

      Power consumption seems to be going down with each process node.
      The trend going forward is to add more cores and larger SIMD registers.
      GPUs still have to go above the 2GHz barrier, and with even more stream processors.

  • @PolyClubGaming
    @PolyClubGaming 5 ปีที่แล้ว +26

    2:00 2019 - My computer has 24GB RAM, it supports much more but I don't really need more than that lol

    • @billybobjoe198
      @billybobjoe198 5 ปีที่แล้ว +5

      I built my computer in 2011, 5 years before this video came out and have had 24gb of sweet sweet triple channel ram taking up all 6 of my slots since then.

    • @ElijahCiali
      @ElijahCiali 4 ปีที่แล้ว

      I've got 32GB and am upgrading to 256 soon.

    • @Tobi_DarkKnight
      @Tobi_DarkKnight 3 ปีที่แล้ว

      I got 32gb of ram and it's still not enough.

  • @alcidiow
    @alcidiow 9 ปีที่แล้ว +66

    Since AMD can barely compete with Intel for now, Intel barely has to try to improve its CPUs, which means we see less improvement overall in the market.
    Basically, why improve performance by a long shot when you don't need to?
    I hope Zen makes an impact when it comes out.

    • @manictiger
      @manictiger 9 ปีที่แล้ว +13

      +alcidiow
      Intel will be effectively out of new products once they reach 7nm chips.
      Once that's out, IBM will have their 4nm chips out, and qubit hybrid chips and other exotic solutions will start appearing on the market.
      Intel will effectively be finished if they can't come up with a rival to all that.
      So they need to buy time.
      They are deliberately delaying the release of 10nm. They could have made a 10nm plant in 2014 or 2015, but that would be like speeding up the construction of their own gallows.

    • @Lefsler
      @Lefsler 9 ปีที่แล้ว +2

      +alcidiow They create new techniques to improve performance: new DMA, branch prediction, cache hits and others.

    • @nameless-user
      @nameless-user 9 ปีที่แล้ว

      +manictiger Wait, 4nm is a thing? No wonder computer performance is levelling out. I don't know what the smallest possible silicon process size is, but we must be getting close.

    • @manictiger
      @manictiger 9 ปีที่แล้ว +1

      *****
      IBM announced that they made a working 7nm carbon transistor. For some reason my brain thought it was 4nm.
      The theoretical limit for transistors is about 0.1nm, which is based on the concept of a phosphorus-atom transistor.
      I guess Moore's Law isn't quite finished, yet.

    • @nameless-user
      @nameless-user 9 ปีที่แล้ว

      manictiger How did I not hear about this? XD

  • @Durakken
    @Durakken 9 ปีที่แล้ว +248

    I don't know why you would use clock speed, which isn't a good measure of comparison at all due to some of the things you mentioned. You should instead use FLOPS, which can be found roughly by looking at a CPU's FLOPS/cycle - which in older CPUs was 2 and now gets up to 8, but most CPUs are the standard 4, if I remember right. You take that number and multiply it by the clock speed, multiplied by the number of cores.
    Any CPU before the 2000s is going to look like 2*C*1, whereas the average CPU today is going to be 4*C*2, with the average top end being 8*C*8+, and still higher end having 10 to 12 cores - and that's if you're not crazy and doing a multi-CPU server build, where you can have I think like 3 CPUs, and those types of builds have around 48 cores at max...
    But the answer to your question is fairly simple really... All that improvement in raw power is there, but there is a second benefit here that is being overlooked, which leads us to the right answer.
    Let's suppose I want to write a program and don't want to deal with the headache of multi-threading and core selection, and I want to make sure that there are no problems whatever the configuration settings are on the computer... Well, that means I'm going to take into account mostly how fast the program can run on 1 core... So what is the difference in single-core performance between the average computer in 2005 and in 2015? Between none and double. Double sounds like a lot, but double used to be a 1-year difference in most situations that have to do with PCs. So mostly, applications you're going to run are going to have the same level of impact in terms of processing power as an application from 2006 run on a 2005 machine. It'll be slower and take a larger percentage of the resources, but it should run more or less the same, because applications aren't designed to take up all the resources available... especially OSs, which try to keep from taking many resources if possible, because all OSs other than Apple's want to be able to run on as many systems as possible and, you know, actually be used to run other programs.
    So your idea is close, but it is missing the component of threading, which means that the resources needed to run most applications have been, more or less, static since multi-core processors became the norm. The ideas of power users and the failure of Windows, etc. that were presented are good ideas, but ultimately they don't matter when you take what I said into account, because with this fact alone you get the results we have every time.
    Considering that your test was done on a MacBook, the suggestions presented might play a bigger role due to them being much more first-party development, but the fact is that most application developers don't want to deal with multi-threading stuff unless they have to, and most application developers probably couldn't even if they wanted to. Apple obviously wanted the App Store to be somewhat friendly to a greater number of developers, since it has been a major selling feature since around 2005, which would result in the same thing happening with all their apps - maybe even more so when you take into account the portability of accounts between devices. Making sure everything works on the poorest-quality device ensures across-the-board quality.
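
    A rough C sketch of the peak-FLOPS arithmetic described above (the per-cycle, clock and core figures are illustrative assumptions, not measurements of any specific chip):

        #include <stdio.h>

        /* Peak FLOPS estimate: flops_per_cycle * clock_GHz * cores (in GFLOPS). */
        static double peak_gflops(double flops_per_cycle, double ghz, int cores) {
            return flops_per_cycle * ghz * cores;
        }

        int main(void) {
            /* a late-90s single core vs. a typical mid-2010s quad core */
            printf("old: %5.1f GFLOPS\n", peak_gflops(2.0, 0.5, 1)); /* 2/cycle @ 500MHz */
            printf("new: %5.1f GFLOPS\n", peak_gflops(4.0, 3.0, 4)); /* 4/cycle @ 3GHz  */
            return 0;
        }

    By this measure the raw power really is there (1 vs. 48 GFLOPS in this toy comparison), even though single-core gains have slowed.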

    • @The8BitGuy
      @The8BitGuy  9 ปีที่แล้ว +43

      +Durakken One problem I ran into was trying to compare very old CPUs from the 1980's to modern CPUs. I found it difficult to find comparison data for the old CPUs in the same format as the data for modern ones.

    • @Durakken
      @Durakken 9 ปีที่แล้ว +4

      The 8-Bit Guy I didn't mean to come off as an expert or something, if I did. I very much suspect you know a lot more about the subject than I do, but the answer seems to me to be what I said, based on what I know and the reasoning I gave.
      I'm by no means a hardware guy, so I don't know much about that info other than what I've randomly come across, and I don't really follow any of that stuff, so I don't know what format you're really looking for. But I would imagine that the very old CPUs are hard to find info about for various reasons. You might be able to get info from people who post on the Computerphile channel, though I didn't have any problem finding comparisons for MIPS, which are a good comparison.

    • @asirf.3634
      @asirf.3634 9 ปีที่แล้ว +2

      +Durakken It's weird that my 1.1GHz MacBook is way faster than my old 2.5GHz Mac, but this explains it.

    • @Durakken
      @Durakken 9 ปีที่แล้ว +7

      Asir F. Well, yeah... consider the following...
      You run the OS and a game. Let's say the OS takes 2 billion instructions per second to run and the game takes 2 billion instructions per second to run.
      Your old 2.5GHz Mac is likely running 5 billion instructions per second, 4 billion of which are taken up running the OS and the game... moreover, the OS's instructions are run first, then the game's, so there is 200 to 300 milliseconds between each time the system is running the game's instructions or the OS's instructions, depending on how it uses that 1 billion unused instructions per second. Generally it just goes on to the next instruction set, but it is possible it doesn't.
      The 1.1GHz Mac is likely running at least a dual core and can handle more instructions per clock, so one core is running 4.4 billion instructions per second and the other is also running 4.4 billion per second. So you're like "oh, more instructions per second, that makes sense", but it's more than that... The programs are also running simultaneously, so your OS is always running on one core and the game on the other core, which means there is no wait between when the system is handling those instructions... Even if you say the cores each only handle 2 instructions per clock, that's 2.2 billion instructions per second per core, which is just enough to run either... and even though the total is 4.4 billion instructions, 600 million fewer than your old system, because the programs are essentially running on different cores and at the same time, the performance is better.
      Obviously there are some shortcomings to this, like if your game uses 2.3 billion instructions per second... in which case 100 million of those instructions would then be processed on the core that the OS is running on, and it requires slightly more instructions to govern which instructions are processed where and whether or not the game has to wait for them. In that case it may be better to have your old Mac.
      This is a really simplified explanation; mostly this is done automatically, if at all... a lot of times developers don't bother setting things up this way, so what you end up getting is everything stacked on a single core until it's filled up, and then the next, etc... but it depends on the developers and the person who configures the computer.
      Also, this doesn't take into consideration caches and other such things that can and often do bottleneck the system.

    • @asirf.3634
      @asirf.3634 9 ปีที่แล้ว +2

      Durakken my goodness, are you taking a master's degree in computing? But thank you for explaining that to me; I had to read it a few times to fully understand. So the 1.1GHz is faster because one core is for the OS and the other core is for the game or any other program, but if it's the 2.5GHz, both would run on the same processor and thus it will be slower.

  • @JoeZasada
    @JoeZasada 8 ปีที่แล้ว +10

    Nice bookshelf of classic Star Trek games. 25th Anniversary, Judgment Rites... those were awesome!

  • @alihelmy
    @alihelmy 6 ปีที่แล้ว +1

    Mate, your videos are some of the most awesome videos to watch. I absolutely love hearing about how these awesome machines from my childhood worked!

  • @rtv190
    @rtv190 8 ปีที่แล้ว +112

    6 months later he is at almost 300K subs now

    • @pablorojo7989
      @pablorojo7989 8 ปีที่แล้ว +2

      1 week later and he's at 302k :D

    • @danicic87
      @danicic87 8 ปีที่แล้ว +1

      2 hours later 303.5 k subs :P

    • @metaldrums1015
      @metaldrums1015 8 ปีที่แล้ว +1

      7 hours later, 305.3k

    • @bobalobalie
      @bobalobalie 8 ปีที่แล้ว

      Subscribers mean little to nothing. There are plenty of people with millions of subscribers, yet they only get 200k views per video. Views are what determine how much YouTube pays for monetization of ads.

    • @metaldrums1015
      @metaldrums1015 8 ปีที่แล้ว +2

      Yeah, but they could have fewer views because they didn't have recurring viewers, which would be the subscribers. Also, minutes watched count toward the pay as well.

  • @Itsmekimmyjo
    @Itsmekimmyjo 6 ปีที่แล้ว +5

    I think one of my favorite things about your channel, is that you always seek opinions of others instead of trying to appear like you know it all. Genuinely honest and intriguing videos ⭐️ ⭐️⭐️⭐️⭐️
    Oh.. and also your fabulous shirts!

  • @matlilly8795
    @matlilly8795 6 ปีที่แล้ว +3

    That collaboration was incredible. It's so nice to see the nerd community working together.

  • @BubblegumCrash332
    @BubblegumCrash332 5 ปีที่แล้ว +3

    The 90s had the best jump. From 91 to 99 it was like a different world.

  • @someoneorother3638
    @someoneorother3638 8 ปีที่แล้ว +195

    If you're a casual computer user, there is absolutely NOTHING you're doing on your computer that you couldn't have done on a computer 10 years ago. Checking email, using social media, using a word processor... the computers of 10 years ago could do all that stuff and they could do it well.
    There's no reason that 10 year old technology SHOULDN'T be completely adequate for casual computer users.

    • @phreak1118
      @phreak1118 8 ปีที่แล้ว +41

      Unless you casually play any game made in the last 5 years.

    • @phreak1118
      @phreak1118 8 ปีที่แล้ว +40

      Also, I have a laptop from 10 years ago and it cannot play video at 1080p.

    • @someoneorother3638
      @someoneorother3638 8 ปีที่แล้ว +11

      It depends on the game. FPS's and such that require high graphics, sure. But most games don't require that level of graphics and can be played perfectly fine on a 10 year old computer. My desktop is about 10 years old and I can play most games on it fine still. Good thing I'm not into FPS's.

    • @CzornyLisek
      @CzornyLisek 8 ปีที่แล้ว +2

      Some of the newest and fastest video, music, Wi-Fi, etc. standards are handled in hardware. So today even weak PCs/laptops can, on spec, do 4K, super-high-speed network stuff, and secure encryption; they have specialised parts to do the work so that the main cores (CPUs) don't do it, while an old PC/laptop will struggle to do the same.
      Because of that specialised hardware we can put in a slower CPU but build an overall more efficient laptop for "normal" tasks.
      Also remember that even an expensive laptop, unless it's a super-high-level gaming laptop that costs a fortune, is already like 3-5 years behind a mid-level PC when released.
      Using a browser (or something like Photoshop and other graphics/video software), well... there is no end to RAM usage; even 128GB could be too little.
      For me the max was like 12GB used + 12GB reserved by the browser (Chrome) alone, while CPU usage was almost none.

    • @MaaveMaave
      @MaaveMaave 8 ปีที่แล้ว +18

      It's possible, although software bloat makes it difficult. There's so much CPU-intensive CSS, JS, flash, etc on modern webpages that it chugs on old hardware.

  • @Stigsnake5
    @Stigsnake5 9 ปีที่แล้ว +14

    07:05 I wouldn't say Windows 8 was a failure in performance; the GUI was just terrible for PCs and non-touch-screen laptops, plus maybe some other subjective design choices, but otherwise it was a performance increase over previous versions.

    • @R4dm1n
      @R4dm1n 9 ปีที่แล้ว +4

      And that's why I use Windows 7.

    • @R4dm1n
      @R4dm1n 9 ปีที่แล้ว

      ***** I'm upgrading to Windows 10 anyways.

    • @neeneko
      @neeneko 9 ปีที่แล้ว

      +Blaze I think the idea was not that Windows 8 was a failure in performance, but that its marketing failure resulted in a larger-than-expected number of machines remaining on Windows XP and 7.

    • @videotape2959
      @videotape2959 9 ปีที่แล้ว

      +SilverGenerations Studios Be careful with that. In my opinion Windows 10 is even worse than 8.

    • @R4dm1n
      @R4dm1n 9 ปีที่แล้ว

      VideoTape Not really.
      Privacy is a big problem in Windows 10, but the other features are very good additions - well, except that there are only two options in Windows Update.

  • @ChrisMusty
    @ChrisMusty 5 ปีที่แล้ว +88

    I am from the future: in 2019, AMD has released a 64-core processor!

    • @mikem9536
      @mikem9536 5 ปีที่แล้ว +5

      Yeah, Intel just slowed down.

    • @a64738
      @a64738 5 ปีที่แล้ว +13

      Intel has had a 64-core CPU for many years; the only problem is that it cost about $15,000...

    • @ChrisMusty
      @ChrisMusty 5 ปีที่แล้ว +8

      @@a64738 are you sure about that? 64 cores or 64 bits?

    • @Graeme_Lastname
      @Graeme_Lastname 5 ปีที่แล้ว +3

      @@ChrisMusty All my bits are made of cores, good as quantum. ;)

    • @ieast007
      @ieast007 5 ปีที่แล้ว +3

      I'm also from the future and AMD is out of business since they sold out and transferred their Ryzen CPU technology to the Chinese state owned company Tianjin.

  • @biggshasty
    @biggshasty 5 ปีที่แล้ว +2

    I know this is an old video, but it's still kinda relevant. I get asked this question all the time. Thanks for uploading.

  • @Storm_.
    @Storm_. 8 ปีที่แล้ว +4

    One thing none of the experts mentioned is GPU power. In the last 10 years GPUs have outstripped CPUs in terms of exponential computing power, and these days you have things like CUDA that offload processing the CPU would normally do onto the graphics chip. So really what we have these days is a CPU and then a multimedia beast of a GPU supporting everything. Even on Macs you'll notice their kernels after Snow Leopard started utilizing the GPU to speed up OS tasks. So in effect, gaming has been the main driving force for development in computer hardware.

    • @SerBallister
      @SerBallister 8 ปีที่แล้ว

      +Storm Gaming Media Yup, it cannot be overstated how much of a shift GPUs have brought in terms of processing power.

  • @Jere616
    @Jere616 8 ปีที่แล้ว +24

    At 5:50, who else was more interested in reading all those titles Clint Basinger had instead of listening to what he was saying?

    • @kneekoo
      @kneekoo 8 ปีที่แล้ว

      I just paused the video so I can read them without the background noise. :))

    • @ddnava96
      @ddnava96 8 ปีที่แล้ว +2

      I was interested in playing Zoo Tycoon again :(

    • @jamiemarchant
      @jamiemarchant 8 ปีที่แล้ว +1

      I subscribe to his channel and there is often something neat in the background. This time I saw the cheesy FMV game Star Trek Borg.

    • @frankstrawnation
      @frankstrawnation 7 ปีที่แล้ว

      Caesar II for the win!

    • @sunilkumar-id5nm
      @sunilkumar-id5nm 7 ปีที่แล้ว

      Jere616 lol I actually did that...

  • @humanperson8363
    @humanperson8363 7 ปีที่แล้ว +23

    Can you do a follow up video on this topic?

  • @TK199999
    @TK199999 6 ปีที่แล้ว +2

    When I was a lad, in the long-ago late 1990's, I was in Explorers. Now, the one I was in was run by engineers, who taught us things like how to solder and read circuit diagrams, but one of them was much older and had been retired for many years. He had started working as a computer engineer back in the late 60's, so around 1999 we students asked him about Moore's Law and whether it was stopping. He explained to us, from his perspective, that no it wasn't, but that what had happened in computers over the previous 40 years and what Moore was predicting were not necessarily the same thing. He said that in terms of computing power, the real difference was being made in efficiency. He explained that since the late 1960's, computing - from programming to microprocessors - had been very brute force: more watts, more transistors, more microprocessors, more lines of code. Which made things faster, but not necessarily better or more useful.
    Basically, you added more stuff to get the same results a little faster, but at a massive cost increase. The example he gave was CPU heat, which anyone who remembers those heady days when CPUs like the Pentium 4 Prescott were breathing fire and terrorizing small villages knows was a very hot and very power-hungry chip. He argued that we knew there was a better way, and we were starting to see that we can't brute-force our way to better performance - that efficiency in hardware and software design was where the real jumps in computing were heading. Which I think has happened over the last 20 years: more efficient multicore chips running at less power and less heat, and more efficient software, doing about the same amount of work at about the same speed as something like a single-core 10GHz chip running software made the old way, which would require the power of the atom to run, with just as much heat.
    So the reason older computers can still do a lot of what newer ones can is the software side of greater efficiency (I believe it's called IPC today). Not to say those older chips were not capable; once we started down this road of efficiency over brute force, the change was as massive as going from non-microprocessor systems to x86 CPUs. We can now use every last ounce of power of those older systems and don't need 5GHz to do a lot of the work. Don't get me wrong, 5GHz makes things faster and more capable, but in terms of efficiency, the need for it keeps shrinking. So the future I see is more cores, less heat, less power, better software, with clock speed increases an afterthought - they will probably climb steadily upward, if not rocket like in days past, because they don't need to.

  • @The_Metroid
    @The_Metroid 5 ปีที่แล้ว +26

    I'm using a "craptop" from 2003. In 2019. Windows 10 has never been so slow.

    • @umbertohaterhd722
      @umbertohaterhd722 5 ปีที่แล้ว +1

      Yay Pentium M bro.

    • @tl1882
      @tl1882 5 ปีที่แล้ว +1

      I'm using an i7 and I can't run Windows 10

    • @tl1882
      @tl1882 5 ปีที่แล้ว

      @Michael Francis where I live Windows 10 is ~$200

    • @earlpottinger671
      @earlpottinger671 4 ปีที่แล้ว

      Get an SSD. First, I have Windows 7 on my desktop, and I find it way better than Windows 10. However, I have a laptop with Windows 10, and it was driving me crazy how slowly it responded, even with the fast hardware it had. Worse, I installed Haiku on a USB flash drive, and it still ran faster than Windows 10 on the hard drive.
      I replaced the hard drive with an SSD and the difference is amazing. I think Windows 10 tries to read too many small files and has a poor cache system, so it is very slow on a hard drive; on an SSD the computer works the way I expect it to.
      Next step, Haiku on an SSD for this computer.

    • @The_Metroid
      @The_Metroid 4 ปีที่แล้ว +1

      @erik masterchef I'm saving up. I don't have enough at the moment.

  • @immortalsofar5314
    @immortalsofar5314 5 ปีที่แล้ว +7

    Coding has also changed. I remember changing JSR, RTS to JMP on the C64 to save a byte. What I remember most about the '90s was the frustration of trying to use new software that was written for the *next* generation of computers, and developers not caring about efficiency because "it's running under Windows anyway."
    The one thing they've struggled to increase exponentially is productivity. They tried 3GLs, OOP and numerous other strategies, but despite some gains, hardware has outstripped it, so the *cost* of using all of the system's resources has increased. It's no longer a case of "can we fit this bell or whistle in there"; it's down to whether or not it's worth it (and, frankly, does anyone want it?)

    • @CristalianaIvor
      @CristalianaIvor 5 ปีที่แล้ว +2

      I think that's pretty sad. If you think about the many tricks devs did to fit something in or make it run faster - it feels like that's all forgotten today. Everything is about making stuff look fancy in 4K graphics, but nobody cares that it runs like shit because nobody ever cared to optimize anything -.-
      The shit some game devs serve us could be programmed better by an enthusiastic five-year-old. Hell, even a monkey hitting keys randomly.

    • @silentbloodyslayer98
      @silentbloodyslayer98 5 ปีที่แล้ว +1

      @@CristalianaIvor CPUs aren't getting faster like they used to, so optimization will have a big comeback real soon

    • @CristalianaIvor
      @CristalianaIvor 5 ปีที่แล้ว +1

      @@silentbloodyslayer98 hopefully, yes

    • @annagulaev
      @annagulaev 3 ปีที่แล้ว

      Until the 1990's, coding things as simply as possible was a sign of competence and cleverness. Since then programmers have switched focus to impressing their peers with complexity. So much of modern programming practice is counterproductive, geared toward demonstrating competence through compliance with ego-focused rather than productivity-focused methodology. In the 90's, OOP switched from being a TOOL to being a RULE. Use it, always, for every problem, and be able to recite the precise definitions of its terms, or you're a moron. And from there, complexity as a virtue has only gotten worse. The expected tool set now includes what are going to turn out to be fly-by-night technologies that will hamper maintenance coders until the code is replaced by the next set of ego tools. Coding has become something that only code farms can do, because learning the tools has to be amortized over multiple clients. It's becoming a thing that small companies can't do anymore, because programmers have made it way more complex than it needs to be. Out of ego, resume building and salary protection.

    • @immortalsofar5314
      @immortalsofar5314 3 ปีที่แล้ว

      @@annagulaev In my experience, the new tech tended to be pushed by management to simplify (sic) what they don't understand. Complexity isn't the sign of good code but elegance is. I once wrote what I thought was an idiot-proof system, someone else made a change and broke it which puzzled me so I asked a colleague to see if he could fix the problem without having my knowledge. He went in, found the right module and could see at a glance not only what had been changed but what should have been changed. Idiot- but not moron-proof. Complex solutions need rewriting. Simple solutions to solve complex problems - they're what make you say "Wow!"

  • @KneelB4Bacon
    @KneelB4Bacon 8 ปีที่แล้ว +22

    I think the thing that amazes me most right now is how much external storage has improved. In just the last few years, the cost of flash drives has dropped dramatically while the increase in capacity has been just as impressive.

    • @japzone
      @japzone 8 ปีที่แล้ว +6

      I can fit 512GB of storage on my fingernail now. It just boggles my mind.

    • @WildBikerBill
      @WildBikerBill 6 ปีที่แล้ว

      This exponential increase in capacity/decline in cost has been seen before in both hard drive capacity and memory (DRAM).

    • @m8x425
      @m8x425 6 ปีที่แล้ว

      The makers of the flash memory ICs shot themselves in the foot by limiting production of the chips, which drove up the cost. Now the Chinese are getting in on the gravy, which is causing a flood of memory ICs.
      The same thing is happening with RAM. Samsung is trying to limit this by making limited quantities of high-speed RAM, but that probably won't help them.

  • @KirbyZhang
    @KirbyZhang ปีที่แล้ว +2

    You can still single-task most programs with 1GB of memory. The extra memory just isn't needed but gets gobbled up by multitasking, which you can kind of give up on and still feel the computer is "usable". Also, single-core performance has barely doubled.
    In the 1990's there was the operating system revolution into multitasking and realtime multimedia; the applications were really pushing the CPU, whereas today there's no killer app that pushes the CPU.

  • @sdphotography4733
    @sdphotography4733 8 ปีที่แล้ว +31

    I am oft reminded of a 'scientist' (circa 1970) that claimed that the computers on the Star Ship Enterprise were impossible because it was impossible to fit enough vacuum tubes on such a ship to power the computer. To the point, we can't imagine what is down the road nor how fast computers can really get.

    • @jhoughjr1
      @jhoughjr1 6 ปีที่แล้ว +2

      SD Photography funny, since by then vacuum tubes had been replaced by transistors and ICs.
      He obviously didn't know his Trek, because their computers didn't use vacuum tubes.

    • @oldtwinsna8347
      @oldtwinsna8347 6 ปีที่แล้ว +1

      @@jhoughjr1 I thought everyone knew they used duotronic circuitry.

    • @jhoughjr1
      @jhoughjr1 6 ปีที่แล้ว

      @@oldtwinsna8347 lol, there is a term I ain't heard in ages. My instructor's instructor told him BJTs were just a fad and not really useful.

  • @nplowman1
    @nplowman1 6 ปีที่แล้ว +6

    I think the perception that things have slowed down is also partly due to the fact that the off-the-shelf configurations we see on consumer laptops have been pretty stagnant for a while. For example, I bought an entry-level Dell laptop for $500 in 2009 that had 4GB of RAM and a 2GHz dual-core processor. If you check a Best Buy ad today, that's basically the same configuration you'll see on the cheaper entry-level laptops. They've probably gotten cheaper, thinner, and improved battery life since 2009, but otherwise not a lot has changed at the entry level.

    • @Chriserino
      @Chriserino 6 ปีที่แล้ว

      I think you just described all current chrome books.

    • @EmeraldEyesEsoteric
      @EmeraldEyesEsoteric 6 ปีที่แล้ว +1

      Actually, new laptops completely suck, because they usually don't have any actual disk space. They are trying to promote this "cloud" crap, which means you can't access your stuff without an internet connection - and for all you know, the government can. They are trying to phase out desktops, but you shouldn't buy a desktop from a store anyway. Design it from the ground up instead; otherwise it'll be more expensive because it'll come with Windows 10 instead of 7 and a bunch of other stuff you don't really want. I would never buy that. How dare they sell us laptops with less than 100 gigs of disk space!

    • @theravenmonarch9441
      @theravenmonarch9441 6 ปีที่แล้ว

      @@EmeraldEyesEsoteric there are gaming laptops that ONLY come with a 256 or 512GB SSD... GAMING and half a TB and LESS. So it's not because of cloud, it's stupidity.

    • @dragunovbushcraft152
      @dragunovbushcraft152 6 ปีที่แล้ว

      @@EmeraldEyesEsoteric How dare you BUY them. If you go with a good Lenovo T440p with nVidia graphics, you have a VERY powerful machine that is a halfway decent gamer, and it will cost you between $160-$400 refurbished. Stop buying their crap, and they'll stop making it.

  • @Geardos1
    @Geardos1 9 ปีที่แล้ว +19

    If you look at certain industries the difference is even more profound.
    Compare what you needed to edit video in 1995, 2005, and 2015.
    1995: specialized expensive workstations and hardware for SD video
    2005: Reasonably powerful computer for SD video
    2015: nearly any new computer can edit HD video

    • @R4dm1n
      @R4dm1n 9 ปีที่แล้ว

      Good.
      Sony Vegas is still going after 2015 on my Sony VAIO.

    • @Mandragara
      @Mandragara 9 ปีที่แล้ว +5

      +Geardos And now people demand 4K video for their 1080p displays

    • @Geardos1
      @Geardos1 9 ปีที่แล้ว

      1080 is fine for viewing if you have a display smaller than 60 inches

    • @mathieunouquet1928
      @mathieunouquet1928 9 ปีที่แล้ว

      +Geardos only because the video is handled in hardware, not in software. Note that the NLE part is one thing, but encoding is a whole different one.

    • @Geardos1
      @Geardos1 9 ปีที่แล้ว

      Encoding got way faster too.. Encoding SD video used to be painful.. Now I can encode HD video relatively quickly.

  • @medleysa
    @medleysa ปีที่แล้ว +2

    It’s not so much that computers have slowed down in progress; it’s more that user needs/demands have largely halted since the mid-2000’s.

  • @PearComputingDevices
    @PearComputingDevices 5 ปีที่แล้ว +8

    For years, back in the late 90's and after, I was a developer for a company called Be, Inc. for their OS called BeOS. One of the problems with BeOS at the time - and it is still largely true of this OS - was the fact that there were no major developers. Today this isn't as big of a deal as it seemed back then, and to make matters worse, Be, Inc. had a tough time attracting developers. At some point in early 1998, during a major meeting with management and shareholders, I gave my impression of the future, including the idea that we wouldn't need software like Office *gasp*. I proposed that set-top boxes could be the future and BeOS could be the underlying software that ran on them - a bit like WebTV but much less limited, since WebTV didn't offer much outside of a browser. This prompted Be, Inc. to research the idea of a very compact version of BeOS to fit in ROMs. It never went far, I admit. But I think we were indeed too far ahead. Ironically, a decade later.....

    • @jonwest776
      @jonwest776 ปีที่แล้ว

      I had that OS. But no software.

    • @PearComputingDevices
      @PearComputingDevices ปีที่แล้ว

      @jon west Well, there was tons of software, just few commercial packages - something I didn't mind. I had the Chromebook vision long before Google, though. Because if you could do it all online, what does it matter? Nobody buys a Chromebook today expecting additional software, let alone commercial. But the great news all along: no bloat. BeOS had very little bloat, unlike Windows then and now.

    • @jonwest776
      @jonwest776 ปีที่แล้ว +1

      @@PearComputingDevices The no bloat is why I bought it. I loved the whole concept of it. Unfortunately I used commercial software, and nothing I used migrated to it. I guess it shows what a monopoly does to any competition.

  • @faded.0913
    @faded.0913 9 ปีที่แล้ว +37

    In 2015 we don't have 8GB of RAM, we have 128, and we have over 4,000 MHz. I found those charts to be irrelevant.

    • @The8BitGuy
      @The8BitGuy  9 ปีที่แล้ว +107

      +The Computer Tech Guy The charts were showing "average" hardware, not top-of-the-line.

    • @brandonjensen63
      @brandonjensen63 9 ปีที่แล้ว +9

      +The 8-Bit Guy I still think that the average would be somewhere around 3,000 MHz

    • @beat461
      @beat461 9 ปีที่แล้ว +18

      +WarlordLeo 1 more like 2.5GHz. Don't forget a lot of casual computer users, who are the majority, use i3/i5 dual cores clocked at 1.6-2.5GHz

    • @WR3ND
      @WR3ND 9 ปีที่แล้ว +2

      What's a "computer?" The average is pretty low, but then I still occasionally use my pre-Titanium TI-89 calculator which is... approaching being 20 years old and uses a processes that was used in some of the very first personal desktop computers ever made.
      Alternatively, my main computer is a 64bit 6 core/12 thread i7-3930K @ 4.2GHz, with 64GB quad channel DDR3 RAM @ 1600MHz, 2 Titan Black cards in SLI, and a TB SSD as the main system drive... and is a few years old now, not including the video cards and even those are getting a little long in the tooth. Still, there really is no reason for me to bother upgrading yet, because there isn't anything to upgrade to that is significantly more powerful. Maybe in a year or two the video cards might be worth upgrading.
      Maybe the average has been increasing a bit, but the high-end is stagnating probably for a few reasons. Most people don't have significantly more demanding applications and the popularity of mobile devices which are still trying to outdo each other every year, but can't really compete with current desktop options in terms of raw computational power and data storage.
      In short, computers overall haven't really improved all that much recently, but the proliferation of the low-end and power efficient mobile devices and their improvements have brought up the average.
      In terms of computing power, I primarily use my main computer to crunch for scientific and humanitarian research and it can work on as much as I want to throw at it, so it's an outlier.

    • @brandonjensen63
      @brandonjensen63 9 ปีที่แล้ว +4

      +WR3ND +Johnnie Bassdrum +The 8-Bit Guy but you guys don't get it: Moore's law is based on the "new" processors, and his video was in 2015, which means it would be about all of the Skylake and new A10 processors, which are mostly all above the 3,000MHz line...

  • @JetScreamer_YT
    @JetScreamer_YT 8 ปีที่แล้ว +5

    Average people don't need an abundance of power for everyday tasks. In fact, most of my friends' computing needs don't go past a phone or a tablet.
    Many people are going to laptops. It seems the manufacturers' goal is to get performance to be better while needing less power to cool; lower voltages make it easier to cool the machines. No more laps being burnt. This is good for the batteries too.
    I think it was the guy who founded DIGITAL who said we would all work through terminals, with central servers. It came true.
    Then the market quickly changed. We needed to power each machine with its own software.
    However, the cloud changes everything, as mentioned. We are now on terminals again. Chromebooks are proof. Steam boxes and our own PCs stream high-end games on average hardware. It appears that Digital Corp. was right after all.
    I'm only 46, but I have been hands-on with computers for 40 of those years. My dad was one of many pioneers who got us to where we are today. I'm grateful I got to sit in the sidecar and witness all this a little bit closer, because it was his field. I was born at the perfect time to remember our simple existence, and how we got to this place. At no other time have we progressed and evolved so much in just 50 years' time.

  • @ChristianHegele
    @ChristianHegele 4 ปีที่แล้ว +2

    I think another aspect at play is how the entire industry has really settled down in the last 20 years around the x86 architecture (and the x86-64 extension which remains fully compatible with it). In past ages there were so many competing processor architectures from many competing firms, and advances in CPU power often came at the expense of backwards or cross-platform compatibility. Commercial software ecosystems just disappeared as newer hardware appeared, which was fundamentally incompatible with the code written for earlier machines.

  • @moptim
    @moptim 5 ปีที่แล้ว +3

    Seeing the plaque and a subscriber count of 931k... figured I'd hit that button too, to see what the 1M plaque would look like!

  • @UXXV
    @UXXV 9 ปีที่แล้ว +88

    I have to disagree. Moore's law stated that power would double every 18 months (or be half the cost). This was true up until a few years ago, BUT your charts seem way off. The PC rig I bought for $2,000 in 2012 (that's now 4 years old) has a 4TB drive in it plus 16GB of RAM. The CPU is a 3770K i7 clocked at 3.5GHz on 4 cores. Skip forward to now, and going by Moore's law a new PC for the same money should have 64GB of RAM or more, and hard drive storage sizes should have increased to 16TB, but they have flatlined at 4, and only now are 8's coming out, with weird helium-filled units or cold-storage options from Seagate which are slow to read from and write to. CPU power has also stalled, with the upgrades to the i7 line, for instance, only offering a 5-10% increase at best year on year, even with die shrinks and optimisations. Software hasn't really caught up with multithreading and multiple cores either, with most current software maybe using 2 cores at best.
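
    For reference, the projection being made here is just capacity × 2^(months/18); a quick C sketch, with the 2012 starting figures taken from the comment above:

        #include <stdio.h>
        #include <math.h>

        /* Project capacity under "doubles every 18 months" (Moore's-law style). */
        int main(void) {
            double doublings = 48.0 / 18.0;  /* 2012 -> 2016, ~2.67 doublings */
            printf("RAM: 16 GB -> %.0f GB\n", 16.0 * pow(2.0, doublings));
            printf("HDD:  4 TB -> %.0f TB\n",  4.0 * pow(2.0, doublings));
            return 0;
        }

    That projects roughly 100GB of RAM and 25TB of storage, even more than the 64GB/16TB figures above, so the gap versus what actually shipped is real either way.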

    • @metalj
      @metalj 9 ปีที่แล้ว +13

      +UXXV I could have sworn this was the case too. Glad I am not the only one.

    • @neeneko
      @neeneko 9 ปีที่แล้ว

      +UXXV I think demand for larger hard drives has also been flagging. Unless you are doing something with video, it is getting harder and harder to actually fill up those big drives, which means less market pressure for developing larger ones.

    • @UXXV
      @UXXV 9 ปีที่แล้ว +3

      Good comments, guys. I edit video, and moving from HD to 4K means 4TB drives won't be cutting it for much longer :(

    • @metalj
      @metalj 9 ปีที่แล้ว +1

      UXXV Just out of curiosity, how long does 1TB last approximately in 4k video?

    • @rich1051414
      @rich1051414 9 ปีที่แล้ว

      +metalj 4K at 25fps would take roughly 2 hours to fill a 1TB hard drive - 128 minutes to be exact. That is not a lot of raw video.
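
      The exact figure depends heavily on bit depth, chroma subsampling and codec; here is a back-of-the-envelope C sketch, where the ~1.04 Gb/s data rate is an assumed intermediate-codec figure chosen to match the 128-minute claim (truly uncompressed 4K would be several times higher):

          #include <stdio.h>

          /* Minutes of footage that fit on 1 TB at a given data rate. */
          int main(void) {
              double tb_bytes = 1e12;
              double rate_bytes_per_s = 1.04e9 / 8.0;  /* ~130 MB/s, an assumed rate */
              printf("~%.0f minutes per TB\n", tb_bytes / rate_bytes_per_s / 60.0);
              return 0;
          }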

  • @sciencoking
    @sciencoking 8 ปีที่แล้ว +6

    Why didn't you use logarithmic charts? Linear charts are terrible at visualizing the exponential growth you keep pointing out.

  • @BamaChad-W4CHD
    @BamaChad-W4CHD 5 ปีที่แล้ว

    Now over 1 million subs! Well deserved I must say. Love your content. Thanks for doing what you do!

  • @woooweee
    @woooweee 9 ปีที่แล้ว +18

    The baseline for running a word processor is just stagnant at this point.
    Browsers can still consume all your RAM if you go tab-crazy, though.
    An SSD can make most old PCs as fast as a decent netbook these days.

    • @srwapo
      @srwapo 9 ปีที่แล้ว +3

      That's what I'm thinking, too. There's not a lot more performance that will be demanded by a word processor in the future. I just don't see how you can innovate that application.
      Gaming, video editing, CAD, and other applications will still see performance benefits from more powerful systems, but more and more people will be content with what they have.
      I wonder if the impending shift to 4k video will drive sales in the near term.

    • @Knight_Kin
      @Knight_Kin 9 ปีที่แล้ว

      +wooo weee The relative need of the older software hit its development wall while the processing hardware continues to march on. As you say, there's only 'so much' you can do with word processing; 10 years from now maybe a new version of the software might need 1% more power than today, but it would still be dramatically 'less' taxing as a percentage of overall usage.

    • @WalnutSpice
      @WalnutSpice 9 ปีที่แล้ว +3

      +wooo weee I can agree with the SSD thing. I put one in my early 2009 MacBook and it felt like an entirely new computer.

    • @demuzn3649
      @demuzn3649 9 ปีที่แล้ว +1

      Then there are some websites that max out RAM and/or CPU usage due to sub-par programming, ;-;

    • @timramich
      @timramich 9 ปีที่แล้ว +1

      +wooo weee Chrome sucks up all of one's RAM... because it's a flawed application with a memory leak... Now I've gone back to Firefox, have again disabled my page file, and have yet to receive any errors about running out of memory.

  • @danieldougan269
    @danieldougan269 7 ปีที่แล้ว +25

    The reduction in clock speeds also has to do with an interest in maximizing battery life and minimizing power consumption.

    • @monday6740
      @monday6740 5 ปีที่แล้ว +8

      I don't think so; clock speeds are important for marketing purposes too. Expected battery life is never met anyway, so they like to not mention that too much... Batteries are 19th-century technology and haven't really advanced since then.

    • @arwahsapi
      @arwahsapi 5 ปีที่แล้ว +1

      To stay in the market, Intel's strategy is to sell less powerful CPUs like the Celeron lineup nowadays

    • @johncooper9448
      @johncooper9448 5 ปีที่แล้ว +3

      @@monday6740 that's bullshit, batteries have changed a ton since the 1800s. Lithium batteries basically weren't commercialized until Sony did it around 1991, and there's been a ton of advancement in lithium batteries since then, in cost, technology, battery chemistry, and energy storage. Modern batteries last years longer than 1990 batteries and hold way more charge in a way smaller space. Just compare early-90s laptop batteries to modern cellphone batteries.

    • @iQQator
      @iQQator 5 ปีที่แล้ว

      Reducing clock speed improves the CPU's thermal picture.

  • @TKnightcrawler
    @TKnightcrawler 8 ปีที่แล้ว +17

    LOL at the Lemmings music. ;-)

    • @TheFoodnipple
      @TheFoodnipple 8 ปีที่แล้ว +2

      I knew that sounded familiar!

    • @firstcoastdude
      @firstcoastdude 8 ปีที่แล้ว +2

      Loved that game

    • @thesnare100
      @thesnare100 8 ปีที่แล้ว +3

      I remember it was one of the "tricky" (titled) levels that I couldn't finish on the SNES version, how far did you get?

    • @TKnightcrawler
      @TKnightcrawler 8 ปีที่แล้ว +2

      thesnare100 I was on the PC, so I had a mouse to use. I got about half-way through Mayhem, I think.

  • @JavascriptJack
    @JavascriptJack 6 ปีที่แล้ว

    I really loved this video because you have shared commentary from others, and... it made me smile to see you all working together.

  • @deusexaethera
    @deusexaethera 5 ปีที่แล้ว +4

    Yeah, measuring processing power via clock speed ignores a ton of ancillary information, because even though clock speeds have leveled-off, single-core performance continues to increase thanks to ever-more complex circuits in each processor core that handle common computing tasks orders of magnitude faster than software possibly could.

  • @thegeekonskates1768
    @thegeekonskates1768 5 ปีที่แล้ว +3

    Okay I know this is a very old post, but I wanted to comment on the software side of this. I don't know enough about the hardware side of things like clock speeds, cores and all that - I understand the concepts, but as a coder I see it from a totally different perspective: most programmers are not all that concerned with memory management anymore. That's what the garbage collector or interpreter are for. I build web apps at work and memory use is not even a thought until an app needs to scale. But at home, I've been learning to code for retro systems like the Vic-20, C64 and others, using a C compiler. And when you're building for systems where every byte counts, you find you do things a lot differently. You can store 8 boolean variables in a single byte, so instead of checking if some int == 1 I'm checking if some unsigned char & SOME_CONSTANT (lol). But you can't do stuff like that in high-level languages, so as the one guy said we now have web browsers consuming gigs of RAM; the JavaScript interpreter works overtime, casting variables to different formats, converting arbitrary data objects into a format it can use, and the list goes on. I don't think that's the ONLY reason, but I do think it's a reason. If we paid a little more attention to how we use our resources, slow computers would not be a thing. But that's just one geek's opinion I guess. :D
    Anyway, thanks for the great video. Even though it was from ages ago I think this discussion will continue to be relevant for years to come.
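
    For anyone curious, the bit-packing trick described above looks roughly like this in C (the flag names are made up for illustration):

        #include <stdio.h>

        /* Eight boolean flags packed into a single byte, one bit each. */
        #define FLAG_SOUND_ON (1u << 0)
        #define FLAG_JOYSTICK (1u << 1)
        #define FLAG_PAUSED   (1u << 2)
        /* ... up to (1u << 7), for eight flags per byte */

        int main(void) {
            unsigned char flags = 0;
            flags |= FLAG_SOUND_ON;                /* set a flag   */
            flags &= (unsigned char)~FLAG_PAUSED;  /* clear a flag */
            if (flags & FLAG_SOUND_ON)             /* test a flag  */
                printf("sound is on\n");
            return 0;
        }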

    • @CristalianaIvor
      @CristalianaIvor 5 ปีที่แล้ว

      Just think of something like WordPress, which is just so sluggish. It's sad that programmers today don't give a shit about optimizing stuff.

  • @Ofecks
    @Ofecks 6 ปีที่แล้ว +9

    7:43 I instantly recognized everything rob has in this shot, including his t-shirt. Maybe it's time to get off the internet and go outside.

  • @lornova79
    @lornova79 5 ปีที่แล้ว +2

    And now David reached 1 million subscribers! He certainly deserves it!

  • @MaximumEfficiency
    @MaximumEfficiency 8 ปีที่แล้ว +56

    Microshit and Adobe keep bloating software; for example, Adobe Reader needs 380MB (!!) of space while Sumatra needs 7MB!

    • @gideonkloosterman
      @gideonkloosterman 6 ปีที่แล้ว +6

      Wow I'm using sumatra from now on!

    • @DJ_Force
      @DJ_Force 6 ปีที่แล้ว +9

      It's not Microsoft. Most open-source libraries are also HUGE, and you typically need several to create anything valuable.
      Basically, software developers today don't write code so much as integrate existing code with a small amount of custom logic. This process is much faster than coding was 15 years ago, but results in MUCH larger executables.

    • @787brx8
      @787brx8 6 ปีที่แล้ว +5

      It's for all the NSA apps... they take up a lot of memory. Lol

    • @jonross377
      @jonross377 6 ปีที่แล้ว +4

      @@Saucy-ws6jc So if I were to pay you $7 an hour to do the EXACT same job that I pay the guy next to you $380 an hour for, you would be cool with that, right? I mean, that's nothing, right?

    • @Aeroxima
      @Aeroxima 6 ปีที่แล้ว +12

      @@Saucy-ws6jc If 380 is small, what is 7? How is it even possible for a pdf reading program to be that drastically different in size? Should notepad be hundreds of mb, just because "that's nothing, so it's fine"?

  • @bonkmaykr
    @bonkmaykr 5 ปีที่แล้ว +20

    Heheheheehehehhh... if only he knew.....
    *_minecraft ray tracing_*

  • @ridinggambit5017
    @ridinggambit5017 5 ปีที่แล้ว +3

    I have to question where you got your figures from; in 2005 we had already hit 3.8GHz with the Pentium 4 processors.... and current Intel processors have an official base speed of 4GHz and a boost speed of 5GHz. Overclockers, of course, are finding that these CPUs actually run just fine with a base speed of 5GHz (or close to it).

  • @Phryj
    @Phryj 4 ปีที่แล้ว +1

    This would be a good topic to revisit. One thing to consider: it's getting much more difficult to further shrink down components. At the same time, not only are we seeing more and more cores, but better use of the die space in each core, allowing for faster and more efficient operation, more advanced extended instruction circuitry, and multiple execution pathways in each core. Modern CPUs can get a lot more done per clock cycle, so although we're not seeing the raw speed increases we once were, CPUs are still getting better.

    • @morganrussman
      @morganrussman 2 ปีที่แล้ว +1

      I definitely agree that this would be a good topic to revisit, considering that almost 7 years later computer specifications seem to be hitting that upward part of the curve again.

  • @leeverink32
    @leeverink32 8 ปีที่แล้ว +12

    Our CNC machines at work still use Windows XP, even in 2016!

    • @snbeast9545
      @snbeast9545 6 ปีที่แล้ว +4

      Dedicated industrial/business machines (ATMs, CNC machines, cash registers, etc.) don't need OS upgrades, unless they're connected to the Internet and the OS they're running is no longer receiving security updates. In that case, upgrade ASAP!

    • @jonross377
      @jonross377 6 ปีที่แล้ว

      The reason is that they would have to buy a new version of their software, whichever it is that they use. I am a machinist also, and we just upgraded our software and computers; it was in the hundreds of thousands for the software upgrades... Most companies are reluctant to do something like that.

    • @ragnarb8331
      @ragnarb8331 6 ปีที่แล้ว

      Bro, our big machine lathes are using Windows 2000 😂

    • @common_c3nts
      @common_c3nts 6 ปีที่แล้ว

      Our CNC machines still run DOS and use floppy disks. LOL

    • @magnusevald
      @magnusevald 5 ปีที่แล้ว

      I work at a lab where we still use a Macintosh from 1984 to do a breath analysis, because the software does not exist on anything else :D

  • @kovanovsky2233
    @kovanovsky2233 6 ปีที่แล้ว +6

    When you said that you asked other computer experts, I thought you were gonna ask computer science professors, Computerphile-style lmao

    • @Just_Gaming4Fun89
      @Just_Gaming4Fun89 4 ปีที่แล้ว

      Rob C he probably should have said "computer enthusiast" rather than "computer expert".

  • @Forlorn79
    @Forlorn79 8 ปีที่แล้ว +5

    Older computers used to have a CPU that handled everything in a single thread. Now PCs have multiple threads and GPUs to handle graphics, so the main difference is time. A PC with better components will be able to do more in less time, while a computer with older components will be able to do fewer tasks and will take longer. Gamers often upgrade the GPU first, because that is the new limitation for high-end graphics, while other users might require more CPU power for handling many threads and data crunching. For instance, an older PC can watch YouTube, but maybe not at the highest resolution, and maybe not while doing other things.

  • @quenguin2178
    @quenguin2178 5 ปีที่แล้ว +1

    TBH, we started getting affordable solid-state drives around 2010-ish too, and they speed up application loading and boot times. Another thing we covered in college was that around 2004-2006 multicore CPUs started coming in, so you had multiple CPU cores to spread the load of the application (in theory). We also started to see an increase in IPC (instructions per clock), so CPUs can do more work per MHz clock cycle than they could before.

  • @PixelOutlaw
    @PixelOutlaw 8 ปีที่แล้ว +15

    What is really killing computers these days is the weight of the web browser. They are more a virtualized JavaScript-and-plugin circus than an HTML browser. It is shameful how resource-hungry software has become. I think a lot of this comes from layer upon layer of abstraction in software libraries, each library needing 2 or 3 more libraries to function. The browser is like the .pdf of the program world: it is expected to do everything and support everything, to a fault. You hit the nail on the head; it is software getting sloppier and sloppier. That said, I quite like writing things in Lisp and Python rather than C++ and ASM.

    • @coopergates9680
      @coopergates9680 7 ปีที่แล้ว +1

      Lol, what do you think of the OOP craze in the last few years? I write in Java but I don't miss OOP much when hopping back to C. I have found Java does run very fast for a non-native language.

    • @fgvcosmic6752
      @fgvcosmic6752 7 ปีที่แล้ว

      What's Lisp and ASM?

  • @obsoletegeek
    @obsoletegeek 9 ปีที่แล้ว +25

    Computer hardware is far less interesting these days. I miss the "Megahertz Mania" years.

    • @-taz-
      @-taz- 9 ปีที่แล้ว +1

      +Computer Whiz 1999 Once we got to GHz, progress in clock speed got very slow. Instead, we kept making transistors smaller and adding more CPU cores, until now we've run out of space in 2 dimensions! If we make transistors any smaller, they break, because the electrons start hitting quantum effects and jumping around all over the place. We could start adding redundant logic for error correction, but then the processors would get even larger, so it makes more sense to just stop shrinking at this point. Next, processors will start getting stacked up in 3 dimensions.

    • @-taz-
      @-taz- 9 ปีที่แล้ว +1

      ***** We already know what they will be. Powerful computers and the electricity to drive them will be removed from the people. Computers in the cloud (aka siren servers) will be owned entirely by rich oligarchs so they can keep tabs on us. Computers will lose hard drives and be implemented as only banks of CPUs and RAM. One of the companies very close to central intelligence is HP. You can examine their "Machine" technology for Things to Come.

    • @-taz-
      @-taz- 9 ปีที่แล้ว +1

      ***** Yeah that's what solar power is all about, too. It restricts energy a person can use to the amount of 2-D real estate and finite resources (silver for solar panels) they can afford. Like everything, it makes us *feel* more free. The big server farms will have huge solar collectors or just keep using coal and oil in reality. That drives Facebook, Google, Apple, Amazon, the NSA. We can't compete against them because we have to pay them for everything. It's "kick away the ladder" economics.

    • @-taz-
      @-taz- 9 ปีที่แล้ว +1

      ***** That's true. With everything centralized, like with Google, it can scan all our emails and TH-cam comments to learn from us. There is NO security for the individual because they own the data. They can even nudge our behavior by showing us just the right material. With distributed systems (Cisco routers, Ubuntu, Windows), there are plenty of back doors so data can be collected, but it's not by default. With Android, you can be watched in every app, and they know every button press.

    • @-taz-
      @-taz- 9 ปีที่แล้ว +1

      ***** I know... I wish. It's going to be like the Borg from Star Trek, though, which they also write, by the way. The same groups that developed the modern PC also overlapped with the same elites, futurists, and theosophists, including Gene Roddenberry. I've been thinking for a few years and can't even imagine how to escape.

  • @jovetj
    @jovetj 5 ปีที่แล้ว +22

    This video is outdated. Time for a 5 year update! Seems to me that Moore's Law has, indeed, ended.

    • @TheFallingFlamingo
      @TheFallingFlamingo 5 ปีที่แล้ว +3

      People have been saying that since the early 2000s; computer engineers keep ignoring them and making progress. The only people who think Moore's Law is dead are those who profit off a slower stream of updates. Unfortunately for them, there's always going to be potential profit in making things smaller in tech; if they don't make the move, eventually someone else will. *Cough* Intel and Nvidia *Cough*
      Realistically, there's a lot of untapped potential in our chip designs; the trouble is a lack of competition driving innovation. Intel is working on stackable architecture, which in practice would eliminate the limited space we run into working strictly on a single plane, but who knows what decade they'll deem worthy of a release. Someone could discover another material that better carries strong currents at the nanometer scale, allowing for even smaller transistors. We could even find a better alternative to the transistor, or advance an alternative that already exists, like the spin transistor.

    • @1pcfred
      @1pcfred 5 ปีที่แล้ว +4

      It only looks like Moore's Law has ended when you consider only the consumer market. Moore's Law was never about the consumer market, though; it is about the highest-end devices, and there it is still going strong.

    • @thetrashmann8140
      @thetrashmann8140 4 ปีที่แล้ว

      AMD might have kick-started Moore's Law back up again, but only time will tell. AMD managed to get a 7 nm process node working while Intel is still on its 14 and 10 nm nodes (14 for low end, 10 for high end), and because of that AMD is able to fit a lot more cores on the die than Intel, with less power consumption and heat as well. In the graphics card space, AMD is catching up to Nvidia, but Nvidia is still in the lead, even if only barely in some cases.

    • @Koffiato
      @Koffiato 4 ปีที่แล้ว

      Except computing power kept multiplying.

  • @CkVega
    @CkVega 2 ปีที่แล้ว +1

    It's fascinating to look back nearly 7 years and see the conclusions holding true with a minor shift of the scale.

    • @Uranium-jj7le
      @Uranium-jj7le 2 ปีที่แล้ว +1

      This!
      Honestly, unless you're a gamer constantly chasing the latest games at the highest graphics settings, or a video editor, 8 GB (maybe 16 GB) of RAM and something like a 7th-gen-or-newer i3 is all you need.
      Hell, I'd argue that if you just want a window onto the web, a 50-100 dollar Android phone will do you nicely.

  • @ironcito1101
    @ironcito1101 8 ปีที่แล้ว +30

    You gotta use log charts to plot this kind of thing.
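
    For anyone who wants to see why, here is a minimal sketch; it assumes matplotlib is installed, and the transistor counts are rounded public figures for a few well-known Intel chips:

        import matplotlib.pyplot as plt

        # Approximate, rounded transistor counts: 4004, 8086, 386, Pentium,
        # Pentium 4, Core i7 (Nehalem), Broadwell-era Core.
        years  = [1971, 1978, 1985, 1993, 2000, 2008, 2015]
        counts = [2.3e3, 2.9e4, 2.75e5, 3.1e6, 4.2e7, 7.3e8, 1.9e9]

        fig, (lin, lg) = plt.subplots(1, 2, figsize=(10, 4))
        lin.plot(years, counts, marker="o")
        lin.set_title("linear scale: hockey stick")
        lg.semilogy(years, counts, marker="o")
        lg.set_title("log scale: straight-ish line = exponential")
        plt.show()

    On the linear axes everything before ~2000 is flattened into the floor; on the log axes the roughly straight line is the exponential trend itself.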

    • @0ThrowawayAccount0
      @0ThrowawayAccount0 6 ปีที่แล้ว +2

      Diego C. Finally... Someone else gets it

    • @nplowman1
      @nplowman1 6 ปีที่แล้ว +3

      Yes. I was thinking the same thing. :D

    • @General12th
      @General12th 6 ปีที่แล้ว +9

      You don't HAVE to. And really, linear charts are better at showing the unbelievable growth.

  • @picklerick98
    @picklerick98 6 ปีที่แล้ว +6

    can you do this from 2015 - 2019?

  • @Klblaz
    @Klblaz 8 ปีที่แล้ว +124

    I think computer prices are slowly creeping up because of the Intel/Nvidia monopoly.

    • @jackwest3282
      @jackwest3282 8 ปีที่แล้ว +18

      +John Gray Actually, prices go up because silicon wafers are now demanded in quantities (SSDs, memory, mainboards and add-on cards, and game consoles all use them) far greater than manufacturers can supply. So, according to the laws of economics, prices rise when there is little supply but high demand, etc. Simple economics, bro.

    • @soylentgreenb
      @soylentgreenb 8 ปีที่แล้ว +13

      +Jack West That's entirely backwards. Silicon isn't a limited resource, and there is no silicon-wafer mafia or monopoly. Absent any supply shocks, more demand doesn't drive up the price; it just means there will be more, larger, and more efficient factories, and more competition (economies of scale).
      The fabs and masks cost much more at smaller process nodes. Immersion lithography, multi-patterning, etc.: it adds up. Moore's law will not end because 3 nm devices cannot be constructed, but because the transistors cost more, perform worse, and generate more heat per area. That may happen long before we reach 3 nm. ITRS no longer goes below 10 nm on its roadmap.
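
      The "transistors cost more" point falls out of standard die-yield arithmetic. A sketch with entirely hypothetical numbers; the exp(-defects * area) formula is the classic Poisson yield approximation:

          import math

          def cost_per_good_die(wafer_cost, dies_per_wafer, defect_density, die_area_cm2):
              # Poisson yield model: the fraction of good dies falls off with die area.
              yield_frac = math.exp(-defect_density * die_area_cm2)
              return wafer_cost / (dies_per_wafer * yield_frac)

          # Hypothetical: the newer node doubles dies per wafer, but the wafer costs
          # 3x as much and early defect density is higher.
          old = cost_per_good_die(3000, 600, defect_density=0.1, die_area_cm2=1.0)
          new = cost_per_good_die(9000, 1200, defect_density=0.3, die_area_cm2=1.0)
          print(f"old node: ${old:.2f}/good die, new node: ${new:.2f}/good die")

      Twice the dies per wafer, and the cost per good die still nearly doubles; that is the economic wall, independent of the physics.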

    • @jackwest3282
      @jackwest3282 8 ปีที่แล้ว +3

      I'm not talking about Moore's law; the other guy was talking about a monopoly by Nvidia and Intel, which is ridiculous because many companies produce silicon wafers. What I was saying is that there are many parts of production that make it impossible to produce enough product to meet demand in other areas. There is also the cost of retooling factories for the smaller-nanometer wafers used in more advanced video cards, mainboards, SSDs, and so on.
      By the way, silicon is not an unlimited resource; it doesn't grow naturally in usable form. Yeah, there is a ton of sand and other minerals that contain it, but the refining process costs a lot of money, time, and effort, which the companies have to recoup by charging higher prices. If they could grow silicon or other materials in a lab, they would, as it would be far cheaper than getting it from nature. That is why they are looking into industrial-grade diamonds, which can be grown and laser-cut into small wafers that conduct better than silicon wafers currently do.
      So yes, there are controlling factors to all of this, but it all goes back to basic economics: costs, production, and the final turnout of the finished product in some parts of the market are down quite a bit. Think about all the smartphones constantly being made; they need memory, CPUs, mainboards, graphics chips, etc., all made with silicon. So while it's not a very limited resource, demand is so high, and gets higher every year, that it's hard for industrial supply to keep up. Samsung is one of the few with production efficient enough to make everything itself for all its products, so they're both cheaper in price and higher in quality. Look at the price comparison between phones: iPhones are insanely expensive because Apple has to piece out their production, hence the extremely high cost, even though they're produced in a country that is among the major experts in wafer-production efficiency. So you have an iPhone that costs 700 to 1000 dollars compared to a similarly specced Samsung phone at 400 to 700 dollars. Same thing for LCD TVs and PC screens: Samsung produces them all itself, from the memory chips to the panels, and charges a mid-to-high price because of the quality and availability of its product. Sony makes its screens partly in its own factories in Japan and gets the majority of its parts from other sources, hence an inferior-quality product at a high market price. Shall I go on, or do you get the point? It's still supply and demand when you break it all down.

    • @jackwest3282
      @jackwest3282 8 ปีที่แล้ว +2

      Oh, and also: yes, more demand and little supply DO drive up the price, genius. Go back and read your Economics 101 book; you might learn something.

    • @jackwest3282
      @jackwest3282 8 ปีที่แล้ว +2

      Also, I did not say anything about Moore's law until now, and I only mentioned it in reply to your comment. Cheers.

  • @KokkiePiet
    @KokkiePiet 5 ปีที่แล้ว +1

    Over a million subscribers now, congratulations!

  • @fuzzyfoyz
    @fuzzyfoyz 5 ปีที่แล้ว +3

    Would love to see an update of this, given the present shift to the cloud and a possible future shift to blockchain.
    I would say the shift from desktop apps to SaaS is where hardware speed requirements are becoming a thing of the past, at least as far as hardware for Joe Public is concerned.

  • @Max_Jacoby
    @Max_Jacoby 7 ปีที่แล้ว +18

    I have a 10-year-old computer. It has a Core 2 Duo E8400, which was considered high end back then. It's still doing OK: Photoshop, Visual Studio, Full HD playback, and video editing up to DVD quality all work great. The experience only becomes unbearable with 4K playback and Full HD+ video editing. Unless you're a gamer or a professional video editor, I don't see a reason to upgrade a computer younger than 10 years.

    • @micx2056
      @micx2056 6 ปีที่แล้ว +2

      Me too. The only upgrade was an SSD.

    • @chiclone-tests71
      @chiclone-tests71 6 ปีที่แล้ว +4

      Yes, an SSD can speed up a computer significantly more than any CPU or OS upgrade can.
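
      Rough arithmetic on why; the latencies are ballpark figures, and loading is modeled, unrealistically, as nothing but serial small random reads:

          # Ballpark random-access latency: spinning disk ~10 ms, SATA SSD ~0.1 ms.
          small_reads = 10_000  # hypothetical random reads during boot/loading
          hdd_seconds = small_reads * 10e-3
          ssd_seconds = small_reads * 0.1e-3
          print(f"pure seek time: HDD ~{hdd_seconds:.0f} s vs SSD ~{ssd_seconds:.0f} s")

      Sequential throughput barely differs by comparison; it is the ~100x random-access gap that makes an SSD feel like a new machine.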

    • @Gazzoosethe1
      @Gazzoosethe1 6 ปีที่แล้ว

      I think my PC can handle everything for 20 years or so: a GTX 970 Hall of Fame edition, 16 GB of DDR3 RAM, and a 5th-gen quad-core i5 at 3.4 GHz.

    • @jonross377
      @jonross377 6 ปีที่แล้ว +1

      Well, according to one of the posts above, your computer shit itself at least 4 years ago... He says they last 6 years at most. Like ALL of them...

    • @CristalianaIvor
      @CristalianaIvor 5 ปีที่แล้ว

      @@micx2056 Yeah, that's the only thing I did to my 10-year-old laptop. It sped it up massively.

  • @MsHUGSaLOT
    @MsHUGSaLOT 7 ปีที่แล้ว +14

    5:12 As the years go on and we get more RAM, more storage, and faster processors, software has become less efficient, less effective, and more sloppily written. Nowadays it takes a HUGE TEAM of people to work on one piece of software, and it's still buggy AF even after years of development.

    • @TanteEmmaaa
      @TanteEmmaaa 5 ปีที่แล้ว +5

      You clearly are not a developer.

    • @gorillaau
      @gorillaau 5 ปีที่แล้ว

      No, software has become richer and more detailed. A web browser is capable of more than static HTML. The Mosaic browser gets interesting on modern web sites.

  • @bikeaddict
    @bikeaddict 6 ปีที่แล้ว

    Bahaha. As I am watching this, you are talking about 200k subs, and you're at 812k. My 9-year-old loves you and is becoming passionate about the tech. Thanks, dude.

  • @snowwhite7677
    @snowwhite7677 7 ปีที่แล้ว +30

    If you look at the average smartphone, the apps on it and what it can run are what the average user uses, on average... Bill Gates was right when he said everyone would have their own computer in the new millennium; people just don't realize it is the smartphone, of which everyone has one version or another. This is especially true in the Third World: people may not have running water or electricity in their homes, but they have that smartphone!

    • @EmeraldEyesEsoteric
      @EmeraldEyesEsoteric 6 ปีที่แล้ว +1

      I have a desktop PC and a free dumb phone... It does texts, calls, and that's it. No internet access of any kind. It's tiny, mobile, and doesn't produce anywhere near as much wireless radiation as the brain-tumor generation gets from their smartphones. Notice that Bill Gates did not allow his children to play with smartphones and tablets, because he knows about the radiation. Fun fact: somewhere in every phone there is an FCC warning to be found, which states that putting the phone next to your head is way beyond the FCC safety limit, so you are supposed to use earbuds or speakerphone. If you put a smartphone or laptop in your lap, studies have proven, you will temporarily kill off most of your sperm.

    • @newgameld2512
      @newgameld2512 6 ปีที่แล้ว +11

      Sir, please remove that tinfoil hat.

    • @punker4Real
      @punker4Real 6 ปีที่แล้ว

      it's not a windows smart phone

    • @dragunovbushcraft152
      @dragunovbushcraft152 6 ปีที่แล้ว +4

      Phones SUCK as computers. Poor, tiny displays, no local storage to speak of, easy to break, and some are not even repairable. I have a good-quality smartphone for my business. It isn't even in the same GALAXY as ANY of my laptops.

    • @BaronVonQuiply
      @BaronVonQuiply 6 ปีที่แล้ว +6

      @@EmeraldEyesEsoteric Hi there.
      en.wikipedia.org/wiki/Ionizing_radiation
      en.wikipedia.org/wiki/Non-ionizing_radiation
      en.wikipedia.org/wiki/Radio_wave

  • @flaviomprado
    @flaviomprado 7 ปีที่แล้ว +15

    Well, my two bits here:
    1 - Since the 90s there has been a HUGE improvement in the quality of compilers, both in programming-language specifications and, more importantly, in compiler optimizations. So newer software can potentially be more efficient on older hardware.
    2 - Lots of demanding tasks are being redirected to GPUs today, even in the OS itself. It's not uncommon to upgrade only the graphics card and get even demanding games running OK.
    3 - The amount of RAM you can put in a consumer-grade computer today is ridiculously high. It's very common for new software to demand more RAM, and you can put a HUGE amount in your computer if you don't already have it. Also, little to no swap. :)
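
    Point 1 is about compilers, but the same effect (identical task, faster because the code underneath is better optimized) is easy to feel from Python. This sketch assumes NumPy is installed, and the ratio will vary by machine:

        import timeit
        import numpy as np

        data = list(range(1_000_000))
        arr = np.array(data)

        # Same computation twice: interpreted loop vs. optimized compiled kernel.
        naive = timeit.timeit(lambda: sum(x * x for x in data), number=10)
        fast  = timeit.timeit(lambda: np.dot(arr, arr),         number=10)
        print(f"optimized compiled path: ~{naive / fast:.0f}x faster, same hardware")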

  • @seanryanmetalhead
    @seanryanmetalhead 7 ปีที่แล้ว +17

    One point I noticed was missed is the lack of competition in the CPU market. It's been about a decade since AMD had something that could stand toe to toe with any of Intel's offerings. It used to be that AMD would offer similar or sometimes better performance at a lower price point, but with the Jaguar and FX lines they fell short. Now, with Ryzen, we should see some big improvements from both AMD and Intel in pricing and performance. Keep in mind that the PC gaming market is huge right now and growing. If you look at the hardware sales figures, high-end graphics cards, CPUs, and motherboards have been selling well. Low-end PCs have been falling off the sales charts and fading away over the past few years due to smartphones; non-gamers and non-power-users have moved on to tablets and phones.

    • @theravenmonarch9441
      @theravenmonarch9441 6 ปีที่แล้ว +1

      Ryzen did make Intel wake up and do something! 8th-gen Intel CPUs have more cores than their predecessors: the 8th-gen i3 has 4 cores, the i5 has 6 cores, and the i7 has 6 cores and 12 threads.

    • @dragunovbushcraft152
      @dragunovbushcraft152 6 ปีที่แล้ว +3

      I beg to differ. I make a living refurbishing and reselling older laptops, and business is very good.

    • @b469b
      @b469b 5 ปีที่แล้ว

      Amd killing it now tho eh lol

    • @CristalianaIvor
      @CristalianaIvor 5 ปีที่แล้ว

      So true. The only thing that bugs me about the new Ryzen is that they built in a bug that prevents me from booting Linux on my new laptop.

  • @boraxmacconachie7082
    @boraxmacconachie7082 2 ปีที่แล้ว +1

    I'd be really interested to see this topic revisited now, in 2022, to see if things have slowed down a lot more in the last five years.
    I recently had to buy a new machine (a PC) for graphics, and I was surprised to find that the specs were almost identical to a similar computer (a Mac) that I bought in 2015, also for graphics. While shopping, I also noticed that new consumer-grade computers still use more or less the same amount of RAM and CPU as an old iMac I've had since 2013, which was nothing special at the time.

  • @Slytzel
    @Slytzel 8 ปีที่แล้ว +48

    Could you please make a video on why new game consoles used to be more powerful than PCs but aren't anymore?

    • @gnulen
      @gnulen 8 ปีที่แล้ว +10

      It's pretty obvious, actually. You have always been able to buy PCs more powerful than game consoles; the cost of those PCs was just higher, since the components cost more. It's the same now: a high-end PC still costs 5-10 times more than a console, but now you can also get super cheap PCs that are better than consoles at the same price, since a console generation has a 10-year life span.

    • @Slytzel
      @Slytzel 8 ปีที่แล้ว +5

      ZeGypsy Hm, that is not exactly how it was. At the time of the Nintendo 64, no PC a normal person could buy could keep up with it. When the PS3 came out, it was still as good as, if not better than, high-end PCs. Now the newest generation is out and was already obsolete when it launched.

    • @gnulen
      @gnulen 8 ปีที่แล้ว +9

      +Slytzel The PS3 wasn't as good as high-end PCs when it came out. Components are components; they don't get magically cheaper because they are in a console.

    • @Slytzel
      @Slytzel 8 ปีที่แล้ว +2

      ZeGypsy They do, because consoles are often sold below manufacturing cost and the companies make the money back off the games. Also, the parts were customized in older consoles and in Nintendo consoles.

    • @gnulen
      @gnulen 8 ปีที่แล้ว +3

      +Slytzel If you look at power vs. price, I think you will find it has been pretty consistent with what I said.

  • @codebeat4192
    @codebeat4192 8 ปีที่แล้ว +6

    The feel of speed also depends on the quality of the software. Today, the use of libraries, frameworks, and modeling tools makes software easier for humans to read and use, but it is mostly not a very efficient way to tell the computer what to do. That's also a reason software grows in size, or is slower compared to earlier versions of the same functionality: laziness/inefficiency.
    In the early days, you had to program with more accuracy and efficiency because of limitations, for example RAM, or program for speed to keep things as fast as possible. The abundance of resources today can result in badly performing software, because programmers don't feel the need to maximize optimization/efficiency like in the early days. Many programmers rely on the optimizations of the compiler, but the compiler is not able to fix every inefficiency.
    So the question is: what is the effect of GHz when the binary quality of the software is not efficient or optimal? When you need more RAM and GHz to do the same thing, there is effectively no win. What is the effect of a speedy computer versus the quality of the software?
    When you run, for example, Windows XP or an earlier OS on a computer of today (when possible), the speed is amazing. You experience the benefit of buying a faster computer. When I upgraded my computer with new software (Windows 7) in the past, the computer's overall "feel of speed" slowed down.
    If you need to upgrade the hardware whenever you upgrade the software just to keep the same "feel of speed", there is no real benefit from the increased processor speed. In fact, the quality of the software erodes the benefit of upgrading.
    So the question is: what is the real speed increase when comparing different software configurations (that do the same thing)? When the software requires more speed, speed is actually being wasted. What is the rate of return?
    Interesting thought, right?
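
    That "rate of return" question fits in three lines of arithmetic; both factors below are hypothetical:

        # Felt speedup = raw hardware speedup / software bloat factor.
        hw_speedup = 4.0  # hypothetical: new machine is 4x faster on paper
        bloat      = 3.0  # hypothetical: new software does ~3x the work for the same task
        print(f"felt speedup: {hw_speedup / bloat:.2f}x")  # 1.33x, not 4x

    Whenever the bloat factor grows as fast as the hardware, the felt speedup stays pinned near 1x, which matches the Windows XP vs. Windows 7 experience described above.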

    • @elfmanml
      @elfmanml 8 ปีที่แล้ว +4

      My thoughts exactly.
      I'm an old-school user and (sort of) programmer. I was "taught" (or forced, maybe) to optimize 4 MB systems to run at their limits; even today I'm trying to squeeze my applications into less than 8 MB of memory usage.
      Of course, when playing new games, there is for sure a difference between a 10-year-old computer and the newest one. Editing video, or reformatting and drawing images at 256-512 MB of memory usage, also benefits from big memory.
      And nowadays the internet, with lots of Flash and other things, would probably "kill" an old computer :)
      But still, maybe I have just grown more stupid and am not able to properly configure Windows, but why does my work desktop with Windows 7 (six cores, 3.5 GHz, 8 GB RAM; eight cores, 16 GB RAM) feel effectively slower (lags, high memory usage, slow start-up even with hibernation) than my home computer?
      If I wanted internet at home, I'd probably buy some smart-TV "module" (an HDMI computer). Otherwise, I'm doing slightly the opposite of surfing the web on an old computer.
      Yeah, a 10-year-old computer would be fine. But I'm using Windows 98 on a 350 MHz Pentium II with 256 MB RAM :) Daily. A lucky DVD watcher, games player, painter, and programmer. When I start compressing video to XviD, I simply go outside for a walk :)
      High-performance desktops and laptops are, of course, useful. I can't imagine, for example, using our slow CNC design software or managing gigabytes of e-mail on my grandpa PC. But a sort of laziness in programming is definitely here (remember Quake II on a 3dfx graphics card? 10 years later there were games, among many others, that looked exactly the same but "wanted" not 16 MB of RAM, but 1 GB!).
      But maybe I'm just old and see it the wrong way... (assembler and BASIC + DOS must have left some "scars" on the soul).

    • @elfmanml
      @elfmanml 8 ปีที่แล้ว

      Thank you for the reply.
      Yes, defrag and cleaning the registry is basically all I can do (with the minimum allowed services and, of course, no animations or themes). Modern Windows is too complex for me to fully understand; on older systems you changed something in an INI file and that was it.
      Most of the time it works fine, but it lags exactly when you need to do something very fast, as usual :) (Sorry if I confused anybody; by lags I mean e.g. I start OpenOffice, and instead of 1 second it appears after 4-8 seconds.) And the web browser is the most noticeable memory consumer, so it sometimes just needs a restart.
      I hate the term "reinstall Windows" :) Haven't done it since 2005 (I know, then what am I complaining about?).
      I use Linux (Ubuntu) at home sometimes. Of course, it is slower than W98 on that configuration.
      We tried to use Linux about 5 years ago at work, but we found there are many important apps we need that don't work properly under it, even with Wine etc. (The biggest problem is that I'm the most competent person here, and I'm pretty dumb, relatively.)
      Agreed that what is slow is the software. But the idea is: computers are like 1000x or more faster than 20 years ago, but to the average user's eyes, overall performance is not.
      I know these games, but I've never played them. I was all into action games back then (Doom or Need for Speed), and now I hardly play games at all. My favorites, Warcraft I/II, Settlers 2, SimCity 1, and Transport Tycoon, are probably the closest to your selection.
      My first graphics accelerator was an ATI All-in-Wonder TV 8 MB, which replaced the old Trident 512 kB VGA in my Pentium I system (which was basically just a 386/DX with 4 MB and a 121 MB HDD with a replaced motherboard), but my friend had a 3dfx Banshee 16 MB, which was a lot better.
      And THAT was a speed-up you could not only feel but also see!
      Modern games and their engines are unknown territory for me. I'm just a simple person who could understand, at most, thread programming under Free Pascal and optimizations in assembler, which is pretty much useless today (I'm learning C++ as a hobby now, but still at the single-app level).
      And when I see two "identical" things, one of which is more "expensive", I get the other one.
      Yes, I know, I like 8-Bit Guy videos about old computers, i.e. about old graphics/music and so on. After all, as a child I started with didactic-style computers and NES video games, and I'm interested in how those simple things worked so well back then.
      Windows 98 is de facto my daily driver, so I don't look for interesting videos about it that much (except when somebody is trying to run 98/XP on 386 and lower CPUs; some of the other "retro" videos on YT are too "new" for me (it's like visiting a car museum where the oldest piece is from 1990 while you just arrived in a Model T :D )).
      My programming "career" was 8-bit BASIC, then QBASIC, then Pascal, then batch (not that good at it), then 8086 assembler, then Free Pascal, then 8051 assembler, and now C... not much, I guess.
      I soldered a PCB once. It works, but looks horrible :)
      Being born in 2001 is not that bad if you intend to, say, work as an IT specialist. For sure it is better than the old guard which has been crawling the world since 1983 and is unable (or does not want) to learn the modern world :)
      Well, I'll probably not have any holidays until retirement except for Christmas, but I too am making a 3D game (slowly), so I hope it will be finished before I'm dead (I'm very lazy, you know). For me, it is a way to amuse myself with low-level programming, even if the result won't be any good.
      Your comment is very fine. I like people who write tl;dr stuff, as I do :) For me, English is also not my first language, so I hope I made some sense too...

    • @elfmanml
      @elfmanml 8 ปีที่แล้ว

      *****
      (offtopic) Yes, I'm still 33.5 years "young", but even if our republic forces us to work till 68, once I've managed all my obligations (mortgage, etc.), I won't be forced to work anymore (estimated 6-7 years from now). Who needs wealth?
      The second statement is true (I'm technically old: I don't want to use Facebook, and I don't even have a tablet or smartphone...).
      I think two of our work computers have an SSD as the primary disk. They boot nearly as fast as my W98 :) (if the boot doesn't get bothered by an external USB hub, which it sometimes does). But then the wow effect is over and you're back to the boring "slow" speed. (I know, if I ran an app from a W10 PC on my home PC, if it even ran, it would be slow as hell squared, but this just confirms the 8-Bit Guy's idea, which is: modern software + modern PC = the drastic speed-up you'd expect.)
      I can't say I don't know anything about the newest technologies (though for sure I couldn't pass a school exam about them); I just haven't found a reason to buy, understand, and use them (for years). Above all, I've transferred my interest from IT/HW/gaming to other areas (gardening, wood/brick construction, drawing, learning Japanese, etc.), just to keep my interest in something I don't already know. I have used a tablet (configuring it for the company), and even if it looks interesting and is quite fast (= usable), there is no need to buy one personally (maybe I'm just an odd fish, a dinosaur :) ).
      720p is enough for me (even though I have an HD TV at home, but still no Blu-ray). I plan to buy VR (i.e. an Oculus Rift), but only when it gets very cheap.
      If I finish that game, I will for sure let everyone know :) (it would be that BIG a miracle). But as far as technology goes, Unreal Engine 3 would use that game for toilet paper at best. It is software-rendered (Quake 1 is the closest example).
      This kind of brings me "on topic" again, since I could use some modern free engine to get it working fast on today's computers, and it would probably look great, but then there would be no fun for me in optimizing code (I don't want to have to buy a new computer just to play it :) ).
      I considered learning Java when I was looking for a new job, but then found one in the CNC area, so I didn't learn it.
      I still think I'm dumb, but not as much as it sounds :) It is always better to be in the middle. The people who are "not as good as you" give you a good head start on future senility (you can afford to lose some brain cells, since you are using many of them, because you are not THAT dumb), and the people who are smarter than you always give you an opportunity to improve yourself.
      You can be (if you are not da Vinci) the best at one "job" or interest, or average-to-good at many together. It seems I chose the second option. Besides, calling yourself a dummy makes a lot of people happy, so why not :)
      I think we are getting more off-topic with this.

    • @codebeat4192
      @codebeat4192 8 ปีที่แล้ว

      Or install an SSD. Windows sucks at disk access. I wrote a virtual disk driver, and it is insane to see how many times Windows accesses drives, files, or folders.

    • @zwz.zdenek
      @zwz.zdenek 6 ปีที่แล้ว

      Not entirely true. Newer versions of Windows have optimizations: they prefetch things cleverly, fewer UI tasks are modal so more parallelism can occur, and so on. Not to mention upgrades to memory management, which even older programs can benefit from. So even though the system contains more services and is, in general, more demanding, you just can't say that an older system is always faster.

  • @mikepelletier1399
    @mikepelletier1399 5 ปีที่แล้ว +2

    Whoa, way past time to do an update to this!!

  • @EkanVitki
    @EkanVitki 5 ปีที่แล้ว

    One of the important factors is that we're bottoming out on the physical possibilities, due to the scale of the components versus the molecules they're made of... When features get so thin (now 7 nm, moving towards 5 nm in 2020), higher frequencies bleed interference into neighbouring parts of the circuit, so you need to keep the clock speed down to limit this, and that's not to mention the extra heat from so much going on in a tiny area, with so little material now to carry it away. At the moment, the cheapest/easiest way around the problem is to increase parallelisation (as we are) by multiplying the number of cores. The next big jump will most likely come after we have mastered producing CPUs out of other materials whose characteristics let us miniaturize even more (maybe graphene, maybe something else).
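
    A minimal sketch of that parallelisation workaround, using only the Python standard library (the speedup you see depends on your core count):

        import time
        from multiprocessing import Pool

        def burn(n):
            # CPU-bound busywork; a single core can't go faster, but many can share it.
            total = 0
            for i in range(n):
                total += i * i
            return total

        if __name__ == "__main__":
            jobs = [5_000_000] * 8

            t0 = time.perf_counter()
            for n in jobs:
                burn(n)
            serial = time.perf_counter() - t0

            t0 = time.perf_counter()
            with Pool() as pool:
                pool.map(burn, jobs)
            parallel = time.perf_counter() - t0

            print(f"serial: {serial:.1f} s, parallel: {parallel:.1f} s")

    The catch, as the video and several comments note, is that only workloads that split into independent chunks like this benefit; a lot of everyday software does not.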

  • @jmarkashe660
    @jmarkashe660 5 ปีที่แล้ว +3

    I personally greatly dislike the term "cloud", because I have been in the I.T. business long enough to recognize that it is just another computer in a datacenter somewhere. I started out programming mainframes and getting people connected to them over vast distances. People don't realize that they are working in that same environment with on-premises servers: they do their work and save things to a server that may be in a different room, building, or even city than they are. So this notion of cloud-based computing is not new. How it is used is a bit new, based on software development.
    Here in 2019 I have a Windows XP laptop that, I have found, is still receiving patches from Microsoft. However, most of the applications I run on it have a foot in the Internet realm of software and operational constructs. On my Windows 10 desktop I can open File Explorer and type www.microsoft.com, and it will recognize that it needs to open a browser and go to that URL. Conversely, I can type \\\\ into a browser and it will open the file manager.
    Now, I do have a SharePoint server on my network and that makes it very easy for other OS’s to get to files on my network.
    If it’s not web-enabled, they will build applications to recognize and use web protocols, thus reducing the load on memory and CPU.

  • @Octillerysnacker
    @Octillerysnacker 9 ปีที่แล้ว +217

    If it's a computer made by Apple, no, it's not getting faster.

    • @mhill88ify
      @mhill88ify 9 ปีที่แล้ว +25

      +Octillerysnacker Thank you! Spot on! Apple only makes them as fast as they need to be to get retards to buy their shit....and then they claim they are "major innovators". What a joke...I wish my GF wasn't an Apple user, it really makes her look bad.

    • @mhill88ify
      @mhill88ify 9 ปีที่แล้ว

      +THE VERY BEST OF TH-cam It is, technically, but with their planned obsolescence and lying to consumers, it's not as fast as it could be.

    • @GreenAppelPie
      @GreenAppelPie 9 ปีที่แล้ว +16

      +mhill88ify Yeah, my wife bought a top-of-the-line Apple desktop 3 years ago and thinks she needs a new one. I'm like, why? Their latest is just the slightest bit more powerful, and she's willing to spend $2k+ on it. Ridiculous, just ridiculous.

    • @mhill88ify
      @mhill88ify 9 ปีที่แล้ว +6

      GreenAppelPie
      I agree, I doubt she does anything now that requires an upgrade anyway....and then when you look at Apple's profit margins it all makes sense....they rip you off and make you feel cool while earning BILLIONS each quarter.

    • @Octillerysnacker
      @Octillerysnacker 9 ปีที่แล้ว +2

      mhill88ify I know, right? It's just so sad that Apple is so rich, because they only do marketing while putting zero innovation into their products. Even with many competitors, they still make HUGE amounts of money without doing anything.

  • @olivialambert4124
    @olivialambert4124 6 ปีที่แล้ว +3

    The reason is software. The interesting part is why. Well, if you think about what home computers are being used for, it's to do a task and display it in an intuitive way that an average person can manage: banking, word processing, and the like. Around the XP era we got the processing power to allow intuitive display of most tasks. A little beyond that, around Vista, we had all we really needed: any more would make things look nicer, but it wouldn't make them dramatically easier. That unlocked different types of computers able to complete the same tasks: netbooks, tablets, then phones. A combination of little gain from more computing power, and the need to serve low-computing-power users, left the software lagging way behind the hardware. That said, I would argue software has still been keeping pace; a 10-year-old computer is still borderline unusable. And as with enterprise servers and more specialist home uses (gaming, Photoshop), we still require up-to-date computers there. But ultimately it's quite obviously a matter of what capability the software needs to complete its task, and for 95% of users you gain very little by making the software more intensive.

    • @virtualtools_3021
      @virtualtools_3021 3 ปีที่แล้ว

      Now, in 2021, a 10-year-old computer can be running a 6-core, 12-thread Westmere Xeon with a 3 GB GTX 580. Borderline unusable? Nope, not anymore.

    • @olivialambert4124
      @olivialambert4124 3 ปีที่แล้ว

      @@virtualtools_3021 I've got a number of computers from 10 years ago. They are borderline unusable. Older gaming machines with 8-core processors, 8 GB RAM, etc. struggle to use the internet in a reasonable way and fail at multitasking, and they certainly can't keep up with anything more than that. That's a monster computer attempting the most basic tasks. Our standard non-gaming machines from a decade ago are far worse when attempting the tasks for which they were designed. It gets far more problematic if you've updated Windows and browsers to the modern era to properly achieve those tasks: either they can't watch YouTube because the software is outdated, or they can't watch YouTube because the new software runs too slowly.
      Most of us have decade-old computers stored away, so we can all test the claim. Nobody has a Xeon with a 580, though: you don't pair a Xeon, whose only benefit outside server use is stability, with a GTX 580 gaming card. And either way, I wouldn't call a supercomputer "usable" if it managed to run YouTube but could no longer manage its actual intended tasks.

  • @bend1119
    @bend1119 5 ปีที่แล้ว

    Congrats on just passing 1 million subs. Awesome content!