Meaningless Tech Specs
- Published Nov 7, 2024
Here are some tech specs that used to matter, but you can safely ignore them now.
Leave a reply with your requests for future episodes, or tweet them here: / jmart604
I remember back in the day with low-quality integrated sound (and some cheap add-on boards) you could tell when you were about to get a cell phone call. It would buzz a certain way for several seconds before the phone even rang. Any sort of interference was picked up; sometimes you could even get random walkie-talkies, other phones, or radio stations coming through your PC speakers. It would freak you out occasionally when you were sitting in silence and suddenly heard voices.
Damn
Dähhdüdüdüdüdüd, I remember it from my father's PC just moments before he got a call
I remember that being very very noticeable, but I still get this if I put my phone near my headphone cable.
I wonder if another reason for that not existing is lower latencies? I.e. even if you had hardware that would buzz in a certain way, would the length of time it buzzed be as long today? I'd expect no, but don't actually know how long it takes between when a modern phone receives the first signal indicating a phone call, and when it actually rings/is ready for the user to pick up. I'd hope for it to be something on the order of sub-1-second, seems easily possible with modern computers and network connections, but don't really know how cell calls work today.
Even my modern hi-fi setup makes the buzzing sound if I put my phone really close to it.
"Megapixels" ..there is such a thing as high quality megapixels and low quality megapixels ..its a number that does NOT mean image quality
Yep, fully agree, a 108MP phone camera doesn't outperform a 32MP "traditional" camera. You can even see this between phones: despite iPhones typically having only a 12MP camera, they still take some of the best photos of any phone, while many other, newer phones have something like a 16, 24, 32, 48, 64, or 108MP camera.
You mean sensor quality...
@@Das_Unterstrich I definitely need more than 12mp for distance stuff
@@Das_Unterstrich yeah, because that 48MP, 64MP, 108MP, etc. is just a gimmick. It basically combines 4 sensor pixels into 1 output pixel. This technique is called pixel binning. So let's say your 48MP (8000x6000) camera phone actually only captures about 12MP (4000x3000) of real detail. So the real hardware capability of your phone camera is only 12MP.
The same applies to the others too, like the 64MP and 108MP phone cameras. Just divide by 4 and that's your real hardware megapixel count.
Yeah, it may look sharper, but it still doesn't capture the real detail, because that depends on the camera sensor, not the image resolution.
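For anyone curious, here's a rough sketch of what 2x2 pixel binning works out to resolution-wise (just an average over each 2x2 block for illustration; real sensors bin at the raw Bayer level, so treat this purely as the arithmetic):

```python
import numpy as np

def bin_2x2(sensor):
    """Average each 2x2 block of photosites into one output pixel."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hypothetical "48MP" sensor: 8000 x 6000 photosites
sensor = np.random.rand(6000, 8000)
binned = bin_2x2(sensor)
print(binned.shape)       # (3000, 4000)
print(binned.size / 1e6)  # ~12.0 "real" megapixels after binning
```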
This is actually the same with DPI in mice... native sensor resolutions these days are around 36x36 pixels, which is 1,296 pixels, so ideally the optimal DPI should be around that. But these days they get branded as 25k DPI resolution, which I think is kinda misleading...
I still have an old Sound Blaster Audigy laying around. Something to note is that back in the day you could get better FPS from a sound card with hardware-accelerated sound, because PCs were much slower and therefore sound processing took up a much larger chunk of your CPU's processing power. That Sound Blaster Audigy got me enough extra FPS that I could bump up from 1024x768 to 1280x1024 in Unreal Tournament 2003.
I still like using a sound card. Admittedly not nearly as much of a difference as it used to be.
I remember when listening to a single 128kb/s MP3 through Winamp would use 45% of my system resources
@@rickyyoung Winamp...
Thanks for making me feel old :D
@@1AKAvg it really whips the llama's ass!
@@Shibathedog I love sound cards. :)
This was me jumping 10+ years ahead in tech when I did my latest upgrade a couple of years ago, and having to re-educate myself on hardware. "Wha', CPUs just boost themselves now? BIOS has colors and mouse support? What is this newfangled wizardry!"
Bios having a gui was the biggest mindfuck the first time I had to boot into it on my newer pc lol
BIOS mouse still bugs me.
@@autumnrain7626 Indeed, that was a change for me. All my PCs before my current one had a text-mode BIOS.
And I was building my current PC myself, so no operating system out of the box. The UEFI was the only thing running on it at first.
I found out the other day that my BIOS has Wi-Fi drivers to pull updates over the internet. This isn't even dangerous anymore, since worst case you can just push a button to re-flash the BIOS. Insane.
No more prying out a bios chip and replacing it.
The whole i3/i5/i7 is one of the most commonly misunderstood (and exploited) specs. It's useless without at least knowing the generation, as a modern i3 is miles ahead of a very old i7.
And a mobile low-power i7 has lower clocks and fewer cores than a high-power mobile i5, etc.
Well yeah, but if your i3 is just a few generations old and is a dual core, chances are it won't run Windows properly. Like, it will get extremely slow.
@@Gurj101 my i5 is several generations old and it's a dual core, runs win10 like a charm.
Any Core processor since at least 2012 runs Windows no problem; if you have issues it's most likely because you don't have an SSD or you have bloatware.
Btw, the i3-1220P has the exact same number of cores as several mobile i7s.
@@Gurj101 My i3-2130 and its integrated gpu are still handling Windows 10 and most web browsing absolutely flawless, given an SSD and 8 gigs of ram. I see only minor stutters in some heavy web-apps.
Don't underestimate Windows, I have installed Win10 on a Core 2 Duo E6400 with HDD and 3gb of DDR2 400mhz ram, yes it's slower than we are used to nowadays and since there are no compatible drivers for 2009-ish integrated graphics, I can't enable some graphical features, but it works! I guess if I tried Win7 it could have worked even better, but I need this PC just for the Internet access and sadly Windows 7 is just not for that today.
@@Anankin12 That really caught me when I bought a laptop some years ago, I didn't have a lot of experience with mobile processors, so I was disappointed when I found out that my i5-7200U had only two cores with HT. Like, at this point it's just an i3, what's the purpose of calling it i5 with an U
Would love to hear about some more dead specs!
This was really interesting! I recall, way back when, before building my first PC in 2005, reading and listening in on conversations all about the FSB... My teacher also tried to explain front side bus to me, but clearly this was not something I ever needed to think about, as I never knew what it did or heard of it again after computer studies...
CRT monitor refresh rates used to be important, and it wasn't because of gaming. CRT monitors that ran below 72Hz would have a visible flicker because the phosphors in CRTs dim very quickly, and that flicker would lead to headaches and eyestrain. Each time you increased a CRT monitor's resolution setting, the maximum possible refresh rate would go down. Many monitors could run as high as 1600x1200, but few could run that resolution at 72Hz or higher, effectively making it unusable for long periods. When people started using LCD monitors, refresh rates weren't that important because LCD pixels take far longer to start dimming, so long, in fact, that it makes them ghost, even today all these years later.
@@OussaMeb Floating-point co-processor ;)
@@Mr.Morden Thank you for your informative comment.
I personally sacrificed the higher refresh rate in favor of 1280x1024 @ 60Hz in games.
But I'd rather use just 1024x768 at 85Hz on the desktop, because of how perceptible the flicker is in desktop apps (mostly due to the white/light backgrounds).
@@eTiMaGo CPU Architecture 😁
And here i thought we would be seeing modern specs which are actually useless
Same
@@AbiRizky here's a useless one: response time on monitors. It's actually extremely important, but the manufacturer specification is a complete lie and impossible to compare with other monitors
One of my first jobs was tech support for Dell. I learned so much during our training, including FSB and, at the time, the newly released Pentium 4.
meanwhile the specs that actually matter to me are never on the site, I usually have to hope that I can find out in the user manual, but even that is not always guaranteed (even though I have been lucky in the past and it still had what I was looking for)
Devices with Bluetooth are notorious for this. There are loads of different things Bluetooth does and there's no way to know the sub version level of support for each of those features.
Exactly - the one that gets me most is that nowhere advertises if memory is dual rank, or similarly what the topology of motherboard memory slots is. Like I don't want to spend hours digging through every manual, not just spec sheet, of every motherboard I'm considering, only for when I'm deciding on memory modules to discover the only ones that make sense switched between single and dual rank in the middle of production, without changing the spec sheet or technical info, forcing either buying massive or tiny modules to make either rank more likely.
At least secondary timings are beginning to make their way into product titles
My pet peeve here is GNSS in mobile phones or even dedicated (consumer) navigation hardware. Just tell me which chip is being used.
The one metric I would like to see in more spec sheets for monitors is pixel density. The higher the density, the sharper the image. Instead, they just imply that the smaller screen at the same resolution will have the higher pixel density.
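If a spec sheet leaves pixel density off, it's easy enough to work out yourself from the resolution and diagonal; a quick sketch (the panel sizes below are just example numbers, not anything from the video):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Same 1440p resolution, two different panel sizes
print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI on a 27" panel
print(round(ppi(2560, 1440, 32), 1))  # ~91.8 PPI on a 32" panel, noticeably softer
```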
"For those of you with long memories" lol, thank you very much 😁
God that kind of talk makes me feel old lol
The days my PC games were on multiple 5.25" floppy discs which included a book for a manual
@@WahlVids or the days I had to program my games from magazines on spectrum 48k.
Well, it's not like CPU clockspeeds mean much anymore (except when you compare within one generation of the same CPU line).
It would be fun comparing a Pentium 4 3.8 GHz against a something modern of the same speed, probably a core from a Xeon or Epyc to get the clock down that low.
Yes but this channel still mentions clock speed as if it means anything whenever a new processor comes out so don't expect to see it in this video.
I remember going crazy nuts when I got my first upgrade from CGA (4 colors) to EGA (16 colors) graphics...it looked SO COOL, as I dreamed of VGA and FORGET SUPER VGA that was way out of price range.
I recall my first proper PC I purchased, I couldn't afford both an EGA adapter card and an EGA capable monitor, so bought a CGA adapter and EGA monitor, so I could eventually update just the adapter down the track. At the time I also wasn't quite sure I would ever fill up the 20MB hard drive I put in it. 😮
LOL, You RICH kids with your CGA! My first PC had MDA! (No addressable pixels at ALL! 😲) Just text. Helped develop "ASCII art" skills! (Itself derived from "typewriter art".)
ah the old 256 colors :D
I remember when i accidentally replaced the driver for the Voodoo 3dfx to something different and suffered 16colors :D
How we take for granted current pc with just a 1080p monitor when SVGA was all I need to play Red Alert
The frequency response headphone manufacturers claim. Like "10hz - 50khz". It means absolutely nothing.
It would kinda mean something if it wasn't usually a bald-faced lie
0:51 Oh man, that brings back memories.
The fact I used to buy sound cards shows how old I am.
You used to need all those ISA/PCI slots. Graphics card, sound card, network card, TV card. Full ATX was a must.
If your needs differ even slightly from the "norm", you'll still need to populate all the slots; it's just your everyday average gaming PCs that only need a slot for the GPU. Want 10Gbit Ethernet to take advantage of your NAS RAID array? You'll need a PCIe network card, an expen$ive one, too. Hell, even a single HDD can saturate 1 Gbit Ethernet with its 150 MB/s or more reads, never mind if you're going to set up SSD arrays or cache. Want multichannel audio, with possibly ADAT inputs/outputs? PCIe sound cards are still a thing, though USB interfaces are more common these days, especially in the less-than-1000€ price range. Hell, I had to drop in a cheap PCIe sound card just to get coax S/PDIF output for the front LR channels with analog outs for the rest of it. Video capture cards are also still a thing for streaming setups. As are SATA/SAS HBAs for NAS boxes (normal mobos have like, 6 SATA ports, 8 at best, which is far from enough for datahoarders/self-hosters), and PCIe-to-NVMe adapters for that sweet SSD cache.
I really thought CPU clock speed was going to be one of them. It definitely should have been prioritised over graphics card colours, which, true, hasn't been applicable in (almost) the last 20 years or so.
Clock speed is still relevant when comparing cpus within the same lineup though
Today the only important spec seems to be *hashrate* 😭🤣
FSB sounds like a certain russian government agency
I remember that number of colors dropdown! And how exciting it was--and how you knew you were on a fairly new computer--when at the bottom of the list was selectable "millions and millions."
I'm old enough to remember being jealous of my neighbor's 16 color graphics (EGA) because my PC could only do 4 colors (CGA)
⚠️ *AGP 8X*
⚠️ *SDRAM*
⚠️ *Shared Memory*
⚠️ *HT Technology*
⚠️ *L1 cache*
⚠️ *GPU Pipelines*
⚠️ *Pixel shaders*
⚠️ *Vertex shaders*
⚠️ *Compute Shaders version*
⚠️ *OpenGL version*
⚠️ *Hardware T&L support*
⚠️ *Max Resolution*
⚠️ *16-bit Audio*
⚠️ *48kHz Sample Rate Audio*
⚠️ *EnergyStar Conformity*
Energy star is no joke for Dell
eh we still use SDRAM, it is still in the name, DDR5 SDRAM, also HT on modern CPUs
@@vadnegru your comment is a pure joke 🤣🤣🤣
@bruh I was one of the 1st ppl to land on the comment section when I started writing it. By the time I finished I found thousands of comments above, so it was time for a dirty trick, ctrl+vvvvvv 🤣🤣🤣
@@travelthetropics6190 yes of course, but since SDRAM and Hyper-Threading are the norm nowadays, nobody talks about them anymore, just like CPU architecture.
Fun fact: 8-bit colour used 3 bits for red, 3 for green and 2 for blue.
Otherwise, a "palette" was used :D
Good old times.
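Here's a tiny sketch of that 3-3-2 packing for anyone who wants to see the bit math (just an illustration of the encoding, not any specific hardware's palette):

```python
def pack_rgb332(r, g, b):
    """Pack 8-bit-per-channel RGB into a single byte: RRRGGGBB."""
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def unpack_rgb332(value):
    """Expand back to rough 8-bit channels (the low bits are gone for good)."""
    r = (value >> 5) & 0b111
    g = (value >> 2) & 0b111
    b = value & 0b11
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

packed = pack_rgb332(200, 120, 40)
print(packed, unpack_rgb332(packed))  # one byte per pixel, only 256 possible colours
```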
When people say they have a 4.3GHz CPU and no actual useful information.
Or they say it is an i5, then leave you to guess how many years old it is.
The 2nd one is my favorite. I see people asking questions: "Will my i5 work with this motherboard?". And then when you ask them if they can tell you what i5 they want to use, they have no idea what you mean. Fun stuff. Reminds of me of the fun school IT support I have to deal with. "This isn't working" and they give absolutely no indication of what "this" is.
In my day, we had 16 colors and we liked it! And we could only use two colors in a color cell! And we made plenty fancy graphics with it at that! Where my VIC II chip homies at?
thanks for changing the background. this is much more forgiving regarding youtube compression artifacts
Meanwhile AMD:
No, guys! It is not FSB, it is called Infinity Fabric
Love all the cutaways and stuff to James being in 8 bit color for a sample of what the difference is
Most overrated spec is the laptop thiccness these days.
Alienware be pumping out longer and wider lappies and be bragging bout how thin they are. Yeah, what about the other two dimensions, mate?
I have an Asus GL702VM and while it's slim finding bags I want that it fits in is a pain in the ass...So true dude
you get bigger screen
I wouldn't say that's useless in the same way that FSB is useless
My old chonkyboi is like 5 cm tall, but will still fit in most bags because the other 2 dimensions are standard.
Meanwhile, a friend of mine can't fit his slim laptop anywhere because of what OP says.
I'd keep that laptop 'till 25 if it didn't have abysmal unusable battery life.
@@bigrunts9768 Screen size on laptops is separate from thickness. Length and width determine how big the screen can be
Man, this video made me feel old...
So back then the motherboard could have been a serious bottleneck because the FSB couldn't keep up with the CPU? Damn
This gen the only thing motherboards are bottlenecking is our wallet.
@@shawndiaz7528 well the A520 doesn't allow automatic overclocking on Ryzen CPUs, although they're both factory made and factory supported to do so.
So yeah, some bottleneck still exists.
@@Anankin12 automatic overclocking is a feature and not including it does not equate to a bottleneck. I was also thinking of Alder Lake and AM5 when I said that
Don't forget the nice Slot-1 Celerons and before that the Slot-1 P-II, running at 300 MHz. That's 4.5 times 66 MHz FSB.
Imagine what would happen if your cheap 66 MHz FSB CPU could run at 100 MHz FSB on a nice BX mainboard with 100 MHz SDRAM? Then you'd have saved yourself a lot (!!) of money compared to the 450 MHz Pentium-II, which was extremely expensive at the time.
I remember at the computer store I worked, we once had a batch of "450 MHz P-II's" which if you pushed away the plastic part, you could see some green wires in them. This was discovered after we had a customer complaining about his (expensive!) system being unstable.
I never got a full answer from the boss about what was the official reply from the reseller other than "it is under investigation", but all 100 MHz bus P-IIs had to be returned to the main office immediately and PCs that were already built had to be opened and the customers being called their system delivery was delayed for a few days or they could use a slower CPU until the ones they ordered were available again.
Luckily these CPUs, except for one, never ended up with customers.
Those CPUs went for about 1000 Dutch guilders (about 450 euro, without inflation) or roughly 4x the price of the P-II 300.
Asus was also known for clocking the FSB on their P2B series boards at 103 MHz by default, so their boards would show higher scores in benchmarks.
On some PCI cards this caused some issues, and especially in the early years (before SPD chips on SDRAM modules) it could become critical with el-cheapo RAM modules.
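For anyone who never dealt with FSB-era CPUs, the core clock was just FSB × multiplier, which is why that 100 MHz BX-board trick mattered so much; a quick sketch of the arithmetic (the Celeron 300A is the classic example):

```python
def core_clock_mhz(fsb_mhz, multiplier):
    """CPU core clock = front side bus speed x (usually locked) multiplier."""
    return fsb_mhz * multiplier

# Celeron 300A: 4.5x multiplier, sold to run on a 66 MHz FSB
print(core_clock_mhz(66, 4.5))   # ~300 MHz at stock
print(core_clock_mhz(100, 4.5))  # 450 MHz on a 100 MHz BX board, P-II 450 speed for cheap
```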
775 quads vibe
Another useless GPU spec: FLOPS. On consumer/gaming cards, the number of gigaflops or teraflops a card has is basically useless, but even as recently as the 2000-series Nvidia cards they were talking about it. Scientific/compute-focused cards have way better stats for measuring computational power.
Wrong. It gives a general idea when comparing within the same GPU architecture. You can loosely compare across different GPU architectures, but you have to keep in mind that newer GPU architectures are generally more efficient per FLOP
Very wrong. FLOPS, or FLOating point Operations Per Second, tells you how much calculation a GPU is able to do, both in scientific workloads and in games.
The only issue with FLOPS is that, because of how GPUs work, a lot of those operations are wasted.
The reason is that a GPU, as opposed to a CPU, does the exact same calculation on a large set of data. For example, if a GPU core has a width of 48, it will calculate 48 pixels at the same time, but if there is an "if" statement in the pipeline, it will compute both the true and false sides for each pixel and dump the results from the side that turned out to be invalid.
FLOPS are relevant, but they exclude memory bandwidth, which is a huge part of performance in games.
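For what it's worth, the headline TFLOPS figure is usually just theoretical peak throughput, something like this back-of-the-envelope calc (the shader count and boost clock below are placeholder numbers, not any specific card):

```python
def peak_tflops(shader_cores, boost_clock_ghz, ops_per_clock=2):
    """Theoretical peak FP32 throughput: cores x clock x ops per clock (an FMA counts as 2)."""
    return shader_cores * boost_clock_ghz * ops_per_clock / 1000

# Placeholder numbers for illustration only
print(round(peak_tflops(4000, 1.8), 1))  # ~14.4 TFLOPS on paper
# Real game performance also hinges on memory bandwidth, caches, occupancy, drivers...
```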
There isn't and never has been just one spec that can predict performance in games, even back in the day when it was triangles and fill rates. Might as well post 3Dmark scores, bud.
Would love a video going into detail of ram specs :)
They made one 6 years ago. th-cam.com/video/h-TWQ0rS-SI/w-d-xo.html
They actually have several videos on various RAM-related topics
I think there already is one from long back or something, at least something on what DDR means and what error-correcting memory is.
All timings explained? Would be nice
@@vadnegru They have a video on CAS latency that paints a broad enough picture to cover most things regarding timings
@@shawndiaz7528 I saw that, and for OC it was good vid, but still, there are more to tell.
I still want a video about that audiophile-grade network switch. I mean, 2 Gb/s half duplex instead of 1 Gb/s full duplex
*imagine the shutup and take my money meme here please*
For many graphics cards/cables/TVs/monitors, HDR/high resolution/high framerate is still a "pick 2 out of 3" at best, so color depth is a spec that has actually come back to being relevant since HDR.
I wish TH-cam videos would be 256 colors. This way I could save so much bandwidth.
Maybe I could even get 4k on DSL.
I don't think I've worried about FSB since like core2duo days.... So back when I was still in highschool.... Damn I'm old....
Everyone could tell by your odd use of ellipses.
Video idea: Threads and handles explained
Am i the only one who's not quite clear with these terms?
"When compared to the latest laptop chip, the Apple M-thing gives X% the performance at only a X-th the power consumption."
Which laptop chip, apple?
Uhhhh
@@igameidoresearchtoo6511 to be fair, they've gotten somewhat better by including the laptop model and CPU model number in the footnotes on the slides. The M2 presentation compared against a 14-inch MSI Evo laptop with a Core i5-1260P, for example.
@@tdwdank5750 Yes, but comparing your most expensive new product to something so low end is insane. Then again, most Apple customers aren't aware of what specs or CPU numbers mean anyway, so they don't care.
@@igameidoresearchtoo6511 The M2 isn't high end really, just like the M1 was below the M1 Pro, Max and Ultra. (to be fair: it's a massive chip which would be high-end if Intel or AMD made it, but that's the luxury Apple has in making everything themselves and putting a price on it).
Similarly, I'd say that i5 isn't low end, either. In the 15-28 W performance segment there's only the i7's above it, with some lower end i5, the i3's and Pentium/Celeron budget chips below it. So I'd call it midrange.
@@tdwdank5750 That is true, but Apple claims the M1 and M2 are high end, despite them being low end (yes, the M1 can be considered low end due to how slow it is), so this is why I said they should be going up against high-end CPUs if Apple wants to safely claim they are high end.
The i5 is midrange, but in a real, non-biased benchmark the M1 would be miles behind it, not the other way around.
16 colors ? LOOGZHOORY !
Back in my day we only had black and white, or green screen/yellow screen if you put a mask over your monitor. Four color graphics was a godsend.
Tell kids that today, they won't believe you... !
"And that might be fine, if you're using your computer to create modern art!". This one got me laughing so hard! 😂
Signal to noise ratio still matters for those of us who like to use digital EQ. If you're doing it right to avoid clipping, you make sure that your entire EQ curve is at or below 0 dB, and you boost the volume using analog gain later in the signal chain. But all those frequency cuts eat into your SnR, and if the noise floor was close to audible to begin with, you might hear it when you compensate with analog gain.
It's a niche requirement, but still, SnR does matter to some folks.
Rule of thumb: set the host to 100% and cut it down as late in the chain as possible.
This makes me wonder why OSes don't handle audio in 32 bit floating-point like DAWs do. That way any additional processing of the audio is essentially "clipping-proof" provided the last element in the chain (before DAC) has the audio below 0dBFS.
@@MichaelCoombes776 yes, I never saw a Realtek with the option to go over 24 bits
@@MichaelCoombes776 The increased bit depth doesn't necessarily give you more headroom for EQ - you can have 32-bit audio processing with a poorly-implemented DAC that still gives you a relatively low SnR.
What the increased bit depth does help with is making sure that any small rounding errors that happen during audio processing are going to be inaudible. 16-bit audio gives you a dynamic range of about 96 dB - enough to cover the range that humans can hear, but the least significant bit is pretty close to being audible. With repeated processing steps, some unwanted artifacts could become noticeable. With 32 bits, you've got a dynamic range slightly over 192 dB. That gives you way more room to manipulate the audio without worrying about small rounding errors accumulating into the audible range.
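The roughly 6 dB per bit behind those figures is just 20·log10(2) ≈ 6.02 dB for each extra bit; a quick sketch of that math:

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of an N-bit integer PCM signal, in dB."""
    return 20 * math.log10(2 ** bits)

for bits in (16, 24, 32):
    print(bits, "bit ->", round(dynamic_range_db(bits), 1), "dB")
# 16 bit -> ~96.3 dB, 24 bit -> ~144.5 dB, 32 bit -> ~192.7 dB
# (32-bit *float*, as DAWs use it, has even more usable headroom than this integer figure.)
```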
specs are just used for clout or bragging rights nowadays. People will be fighting over AMD and Intel latest CPU, when the most cpu intensive thing they'll ever do is playing minecraft...
FSB has been replaced by Infinity Fabric speed and Intel's equivalent, which still matters for RAM choice if you want to optimize.
I work for a packaging/shipping optimization software company, and we still use bitmap / 256-color logos... hell, we even use old 2005 DLLs and other stuff that Microsoft just doesn't include in Windows anymore. Sometimes lower-bandwidth stuff like a low color count is helpful when numbers and data matter more to a company's needs.
Meanwhile I'm on a dedicated audio card and still have issues with noise leaking in from my 3090 at high frame rates...
Really? How
Check for any stray screws. I was having a similar issue and I discovered that at some point I accidentally dropped a motherboard screw into the IO plate shield thing
Consider an external dac/amp
@@Ava-wu4qp and preferably with optical signal flow (toslink).
@@victortitov1740 toslink on a normal motherboard is actually quite bad. Motherboards decode and then recode the audio to digital, adding more noise. It really only makes sense if you need to run an audio signal more than 20m or so.
Assuming the OP isn't operating a large theater, they'd be better off just using a USB DAC and cheap shielded unbalanced audio cables. If they want to get really fancy, use a USB cable with a magnetic ring on it or balanced audio equipment but it's unlikely they'll notice a difference.
This was a nice trip down memory lane so thanks for that. I remember the FSB and North Bridge. I also remember the first time I went over 256 color and man was I blown away.
I'm so glad somebody else mentioned the color limitations in modern art lol.
This should be made into a series that talks not only about useless specs for pc components but for consumer electronics in general
I'd love a video about RAM channels and RAM configurations that will still give multi-channel performance.
I have to hand it to all the LTT channels. Amazing how AMD, not ATI, was mentioned, but plenty of Intel.
ATI, before AMD, had video cards with thousands of colors when others were stuck at 16 and 256.
The CPU-integrated memory controller was first on AMD, but of course, not mentioned.
This video is very impartial, not everything has to mention the CPU/GPU wars of yore and who was first at what, especially nowadays as those statistics are wholly irrelevant in common discourse.
An external sound card with virtual 7.1 surround sound is pretty great.
The most meaningless spec these days is response time on monitors, or at least the numbers manufacturers put on the spec sheets. Of course real response times do matter, but the specs never tell the real numbers.
In general it's just a quick way to tell VA apart from the others.
True, the response times that manufacturers claim are almost all bull💩, unless it's OLED maybe
Will always love my Sound Blaster Audigy 2 ZS Platinum with the front 5.25" bay controller... that sound card could do anything. I still have it in my retro Win98/XP/W7 machine along with my ATI All-In-Wonder 9800 Pro... both cards have remotes and a crazy number of ins and outputs...
oh my word, I love that somehow you picked the same speakers that I have (seen at 2:43). BTW they sound good for how old they are.
Some of those numbers are no longer relevant because they have changed form. For example, on the AMD CPUs, the FSB equivalent is the infinity fabric clock speed, which is directly related to the memory speed. Similarly, there's the ring-bus speed on Intel CPUs which again relates to the memory speed. Although, I guess the BIOS takes care of the multipliers there for you when configuring the memory speed. So it's not as much no longer relevant, but hidden away from the user.
Bit depth is also important, but no longer is it 8-bits, 16-bits, or 24-bits, but has transformed into HDR support. 10-bits per channel, or 11-bits per channel, etc.
I can't speak for sound, but I really wish those numbers would be more prominent, since I have encountered several motherboards with low-level noise issues (and not bargain bin, unless motherboard prices have gone so high that $100+ counts as "bargain").
Also, when talking about integrating the memory controller into the CPU, why use a 10 year old CPU as your example....
These days if you are interested in sound quality a USB DAC or audio interface (if you want to record stuff) is much better than an internal sound card anyway. Plus better drivers (ASIO) but again that's probably only relevant when using digital audio workstations and such.
@@MichaelCoombes776 I'm not really interested in sound quality, but I dislike my speakers clicking every time I scroll a PDF document. I hadn't considered a USB DAC though, that may be a good solution. Still, it shouldn't have been an issue in the first place considering the cost of the motherboard and that it's not a lower end chipset. Thanks for the suggestion!
I use remote desktop practically religiously now, as so many websites are designed for desktops even these days... And yes, desktop mode is a thing, but some websites base it on screen size or even check whether you're on a desktop or mobile device.
I have never needed more than 256 colours to read the site.
My first PC was... monochrome. So "back in the old days" is a bit different for me. Also, sound used to be all through the PC speaker, the built-in one.
Ironic that most systems nowadays come with no speaker built in, just a header on the motherboard to install one.
Another thing about audio: external sound cards, DACs, and amps can be found at really good prices nowadays. A Fosi Q4 or VE Megatron should be great for most consumer headphones
Just hearing the term Northbridge makes me miss the days of Nvidia mainboard chipsets.
I had that in a lab at uni. An AMD Athlon CPU on an Nvidia chipset blew my mind
After WWDC yesterday, I read the title as Meaningless Tech Speeches. Would be cool if ya'd cover that too.
Did you send out to a time machine to dig up FSB? Has it been relevant in the past decade?
Never heard of it tbh.
Russian Security Service?
I hate the noise through the audio. I have a mid-range motherboard from Gigabyte and I could hear hissing when I moved my mouse quickly. Thankfully, my Razer headphones came with a USB dongle (USB soundcard) and the audio quality was even better than the motherboard one. First, I thought the noise was because of my poor power supply. I never thought the noise issue was that common, I just thought my motherboard was bad.
didn't even know these were things to look for 20 years ago. literally have never looked at these specs
Another meaningless spec is the frequency range on headphones. "10Hz - 40kHz" does not equal good audio quality. The only spec that is worth taking a look at is an uncompensated frequency response graph.
That audio noise is not too old. I could still hear it on my Thinkpad R61 and W530 which was annoying as f.
The fact that Tflops didn't make it into this video is a massive oversight.
0:23 LMAO THE FORTNITE BUS SFX IM CRYING
I remember when I had to use a PC with only 256 colors because the drivers for the graphics in Windows 95 weren't installed by the previous owners of my computer at the time and I didn't have access to the internet to download the driver.
I lived on the top floor of a highrise dorm my freshman year of college and my onboard sound card picked up the local NPR station.
Dynamic contrast ratio for a TV or monitor, it just means it can dim the backlight for dark scenes. Static contrast ratio is infinitely more important
ok it took 2 decades but i finally know why the sound sucked on our home pc...
Instead of FSB now there is infinity Fabric on AMD
similar but not same
nice throwback vid
appreciate the 256-color treatment they put on James
meaningless spec: the accidental cut in your beard
The joke about modern art is just brilliant
Especially considering that it's not really a joke
I got an addition, RGB specification, 16 million colours
SNR actually needs to be looked at. Anything above 100dB is good. Below that will be just OK for most people, but if you want that extra punch, buy above 100dB SNR
That last joke made me laugh. Thank you xD
Worth mentioning: if you're going for a new sound system for your PC, look towards an optical solution. I wish I did, if only because of the loud popping sound and static background noise over time... and the very jarring distortion that happens if you move the 3.5mm cable even a little lol.
Not everyone has many cables or wires running to their desktop/desk, so it's not as relevant for everyone, but it's something to be mindful of.
All I saw was the 300MHz CPU with a 66MHz FSB--and immediately thought back to when I overclocked my Celeron 300A--an extra 150MHz for moving a jumper.
External sound cards aren't relevant anymore if you don't mind the popping and crackling ;)
Congrats if you have good integrated audio.
"The looser cruiser" i laugh so hard
How about smart circuit breakers? Circuit breaker technology has come a long way. TH-cam channel ElectricianU would be a good resource to ask about it!
Old school overclocking required surgery on your motherboard. Now you click a setting.
I still have a sound card! My Fury X causes my inbuilt audio to be utterly unacceptable 🤭
Hey, modern video cards should be able to display more than 24-bit color. Most should support at least 30-bit (3×10-bit) in order to support Ultra HD.
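The jump in the number of representable colours per bit depth is easy to eyeball; a quick sketch of the counts:

```python
def total_colors(bits_per_channel):
    """Total displayable colours for an RGB signal at a given bit depth per channel."""
    return (2 ** bits_per_channel) ** 3

for bpc in (8, 10, 12):
    print(f"{bpc}-bit per channel ({bpc * 3}-bit total): {total_colors(bpc):,} colours")
# 8 -> ~16.7 million, 10 -> ~1.07 billion, 12 -> ~68.7 billion
```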
My first GUI computer was a 150MHz Cyrix. I would download pictures on the internet and think, wow, these look terrible! Then I found the color option and was blown away.
oh, man 😂 the Nazare's math meme at the beginning! brazilian awesomeness 🙌
5:00 When you realize you don't need much color to make "Modern Art", feels like something's wrong here. LUL
HDR is more colors. So we're still looking for that!
Y’all should do an episode on why virtual machines were made
1. TFT
Used to be nothing but filler terminology in LCD screens' tech specs at a time when non-TFT LCD screens had long since gone the way of the dodo.
I've had few people ask me what TFT meant relating to LCD TVs or monitors. I'd say it meant nothing, just a marketing buzzword, because at the time, that was mostly the case. Luckily it's mostly omitted today, but it has kind of been replaced by...
2. LED
Relating to LCD displays, unless accompanied by another word or letter (Edge, Direct, Quantum dot/Q), it doesn't really say that much, and I've known people who were misled by it into thinking the display in question was not actually LCD when it was specified as "LED".
All it says today is that the LCD screen is LED backlit, which is like, as opposed to what, CCFLs? When was the last time anyone still produced these? The early stone age?
It's quite funny how I used to obsess about the number of cores on a smartphone SoC, now I just base it on the SoC brand and model.
Arm cores are kinda funny, 4 strong cores can easily outperform 8 weak ones, even on the same architecture
@@richardwelsh7901 Intel has used that kind of architecture already.
@@NiffirgkcaJ plot twist: its 2030, we are all using intel RISC-V phones
Topic Request: please cover how 12th generation Intel processors do not disclose their base clock speed. And offer suggestions on how to find the base clock speed, since it's no longer in ARK.
Your 16-color photo translation doesn't use the right 16 colors -- they were always the same 16 colors: black, red, green, yellow, blue, magenta, cyan, gray, and bright versions of the same. "Dark yellow" was often called "brown" and "bright black" would have been called "dark gray".
You say we use 24-bit today, but even 20 years ago 32-bit was available. I have a Windows 98 retro PC with a GeForce 3 and it has 16, 256, 16-bit, or 32-bit as options. I do remember some cards being 24-bit back in the day, but I thought 32 took over ages ago. One spec that no one needed to care about for some time was VRAM. It became important again, but I remember when cards with 1GB of VRAM and cards with only 256MB performed the same; VRAM wasn't even a thing you worried about for many years.
Plz do a video about USBs and USB-Cs and the different versions of them
Ah, old tech... those were good days. This brings back old memories, especially the sound noise 😄