As an authorised Killer NIC integrator, I've dealt a LOT with it in the past. The benefit of the Killer network card is that it performs BETTER than a built-in card or add-on card. When paired with the right video card, it's something a gamer worried about cheap integrated-NIC problems would prefer as an upgrade to their connection, and thus their play experience.
Back when this was released, Killer network cards would have performed exactly the same over the internet, which was likely 20Mb/s at best, as any other NIC. It is just some marketing bollocks that people with too much spare money and too little talent would fall for. I had a built-in Killer network card on my last mobo (similar age) and the software and drivers were a pain. I now have a 2.5Gb NIC, which is pointless as well, but at least there is no stupid software with it.
Going on sale in 2009 means a fair number of buyers were upgrading systems built on motherboards made from maybe 2005 to 2009. The older end of that range still had 100Mb networking, so the 1Gb LAN was definitely a way to differentiate this card.
First, gigabit Ethernet was not as common as you make it out to be in 2009, especially for mid-range consumers. And second, most people had computers with only one PCIe slot, and the rest of their peripherals were PCI in the mid-range space.
GbE adapters were pretty common around 2007. I remember buying my Socket AM2 motherboard around that time, and I did not aim at high-end mobos. The point of this Killer adapter is that it is a gaming adapter.
I'm not sure about that; my old Socket 754 AGP motherboard had gigabit Ethernet, and it was from literally five or six years before this card came out. In that time, it would have been increasingly hard to buy a motherboard with just "Fast" Ethernet (i.e. 100Mbps), and although a number of legacy systems certainly floated around (as they do), I doubt that many of their owners contemplated putting a brand-new GPU in one.
Yeah, in 2009 it was hard to buy a brand-new board without gigabit Ethernet; I don't think there were many. In 2009 Intel was about to release those new Core series processors (s1156), and I remember that even the very first s775 mobos (~2005, 915G, or so) usually came with gigabit already.
From what I recall, the marketing with Killer NICs was that it did everything in hardware, to free up your CPU. It was supposed to be ultra low latency. I also recall features like being able to prioritize game traffic and download things in the background, as in the card itself could handle torrent downloads, or something.
Would that explain why so many complain about stability? Perhaps the drivers are hooking deep into the network stack to accelerate various network layers?
Yeah, that's about the explanation. When you design an Ethernet card, you can choose how much you want to do in hardware and how much you want to do in software. At that time, most Ethernet cards only did the bare minimum in hardware (hardware is expensive, software is cheap) and pretty much everything else in the driver, which meant that if your system was under heavy CPU load, your network latency was higher than normal and fluctuated a lot; neither of which is good for gaming. So the idea was to bundle an Ethernet adapter that would do as much in hardware as possible and, on top of that, even be able to buffer network traffic and then prioritize certain kinds of traffic over others (which normal Ethernet chips don't do, even today), again all in hardware.
@@soundspark Every Ethernet driver allows you to change the MAC address to a value of your choice (the Ethernet standard even requires this capability, and half of the entire MAC address space is reserved for manual MAC address assignment), and every card also supports a promiscuous mode, where you can see all packets that physically arrive at your Ethernet port, whether addressed to your MAC address or not. But to receive data, the data stream must still be filtered to your active MAC address: all packets arrive at the driver level, but only those with your MAC address go into the IP stack of the OS.

These features were often implemented in software only. The moment you changed the MAC address or enabled promiscuous mode, the hardware MAC filtering on the card was simply disabled and everything happened in the driver. Higher-quality Ethernet chips also supported changing the hardware MAC filtering logic, because they had programmable logic on the chip instead of fixed logic. However, a very cheap card in early 2000 may have had no hardware filtering at all and always worked as if it was in promiscuous mode, which is not forbidden by any standard or driver rule and makes the chip even cheaper to produce.

And then there's the CRC32 checksum. This needs to be calculated for outgoing packets and verified for incoming packets. Again, better cards did this entirely in hardware in the chip; cheap chips just left it to the driver, so they expected outgoing packets to arrive with a pre-calculated checksum, and they forwarded incoming packets as they were. The driver had to check the checksum and drop the packet if it was bad (which put a load on the bus, as even bad packets were sent over the bus first, only to be dropped by the driver, whereas a card with hardware support would have dropped them directly on the card).

Of course, today's Ethernet chips can do everything in hardware and always have programmable logic.
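To make the software-side work concrete, here's a toy Python sketch of the two jobs described above that a cheap chip left to the driver: FCS (CRC-32) verification and MAC filtering. The function and variable names are mine, purely illustrative; the CRC residue trick relies on Ethernet's FCS being the same CRC-32 that `zlib.crc32` computes, appended least-significant byte first.

```python
import struct
import zlib

# CRC-32 residue: crc32(frame + its own FCS) always equals this constant
MAGIC_RESIDUE = 0x2144DF1C

def fcs(frame: bytes) -> bytes:
    # Ethernet FCS: CRC-32 over the frame, sent least significant byte first
    return struct.pack("<I", zlib.crc32(frame) & 0xFFFFFFFF)

def software_rx(frame_with_fcs: bytes, my_mac: bytes, promiscuous: bool = False):
    # 1. The checksum check the cheap hardware didn't do -- note the bad
    #    frame has already crossed the bus before we can drop it here.
    if zlib.crc32(frame_with_fcs) & 0xFFFFFFFF != MAGIC_RESIDUE:
        return None
    frame = frame_with_fcs[:-4]
    dst = frame[0:6]
    # 2. Driver-level MAC filtering: accept our unicast, broadcast,
    #    or everything when promiscuous mode is on.
    if promiscuous or dst == my_mac or dst == b"\xff" * 6:
        return frame
    return None

# Build a minimal frame: dst MAC, src MAC, EtherType, payload, FCS.
# 02:... is a locally administered (manually assigned) address.
my_mac = bytes.fromhex("02005e001122")
body = my_mac + bytes.fromhex("02005e003344") + b"\x08\x00" + b"hello"
frame = body + fcs(body)

assert software_rx(frame, my_mac) == body                      # ours: accepted
assert software_rx(frame, bytes(6)) is None                    # not ours: filtered
assert software_rx(frame, bytes(6), promiscuous=True) == body  # promiscuous: accepted
```

A hardware-filtering card would do both steps in silicon and never bother the bus or CPU with frames that fail either check.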
This is basically standard today, even for an on-board chip.
Yep, Killer NICs were really popular before multi-core processors were a big thing. Having your CPU give time to your NIC instead of the game "slows things down", so have a chip that handles the NIC instead. Most server NICs operate the same way, as a server CPU can't handle multiple 10+Gb NICs. The Killer NICs are just server-grade NICs for your desktop with a gaming skin.
The Killer NIC moves TCP/IP processing off the CPU and onto itself; it has its own TCP/IP stack, firewall, etc. It helped take load off the CPU while gaming online.
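For context, most modern NICs offload parts of this work too, and on Linux you can inspect which parts. This is just a diagnostic sketch; `eth0` is an example interface name, and the exact feature list depends on your driver:

```shell
# List the checksum/segmentation offload features the kernel reports
# for a NIC (requires ethtool; substitute your interface for eth0).
ethtool -k eth0 | grep -E 'checksum|segmentation|scatter'
```

Full TCP offload engines (a whole TCP stack on the card, as the Killer NIC advertised) remain rare on consumer hardware; mainline Linux never fully embraced them.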
Yeah I remember back when my friends and I were getting these certain NICs despite having Ethernet built into our motherboards just for that dedicated TCP/IP processing and how much it improved our pings.
At the time, the HD5770 was an absolute value monster, coming at something like 95% performance of the HD4870 at 2/3 of the price (MSRP), bringing respectable 1080p performance to well under the $200 mark.
My 5770 died a year ago, was a sad day even though I had passed it onto my little brother a few years back. Could play plenty of modern games like Skyrim or League of Legends with ease, especially @ 720p. Have never found a better valued card since.
My theory on this card: Some guy at VisionTek: we need a new graphics card with a special feature! Drunk technician: Let's mix a graphics card with an Ethernet card. Other guy: That's brilliant!
Personally, I prefer membrane keyboards; it's much easier to press the keys. I hate my friend's mechanical keyboard; when I'm visiting him, my fingers hurt from it.
I am actually giving a PC to someone I know. His garbage Acer prebuilt with an FM1 A6 and a mechanical drive crashed, so I'm giving him an old upgraded Dell I have: Core 2 Quad Q6700, 6GB DDR3 RAM, 120GB Kingston SSD + 750GB mechanical drive... and a good old Radeon 5770. Those old Radeons are really quite decent.
@@DawidDoesTechStuff Extremely random. Never knew those existed, and I have been a tech since the 486 days. Congrats on your YouTube success! And thanks for making such fun videos.
This card has the Killer Ethernet chipset, which was supposed to bring lower ping and optimized throughput for games. Lots of enthusiasts bought the separate Killer NIC even though they already had gigabit Ethernet on board. This card is obviously a package deal, helpful for those with only a few PCIe slots.
I first saw these long ago and remember it being kind of a gimmick. From what little I remember, the GPU itself has to devote compute cycles to the network functions, making the card actually perform worse graphically than it would be without the network chip. What it was good for was hosting multiplayer servers without the need for the CPU to have to expend cycles on TCP requests. A good option back in the day for LAN parties if your main server had one of these.
Nope, it's just another PCIe device on the bus, attached via a PLX bridge chip. The GPU has no clue the network controller exists. But the network controller tends to get worse latency than your mainboard-integrated one, and during heavy GPU PCIe transfers or network transfers, the traffic crossing the PLX chip can sort of get in its own way.
I have recently played a lot on Geforce Now and I had no issue with frames or quality or anything. The quality does periodically drop when the internet is chugging and its noticeably softer for a few minutes but the frames never drop under 30/60 or whatever I set it to. I never had those horrible drops to 10-15 fps like you did. In fact, I played through the entirety of Just Cause 4 and did not have a single significant freeze or fps drop! So I have absolutely no clue what could have happened in your testing, apart from the "network overload" that you mentioned. (Or just your bad internet Dawid lol)
I would actually consider this quite useful, especially if it came out today with a 10G NIC, but on, say, Tesla or Quadro cards. Throwing a bunch of them in a server would be super useful for VM stuff; PCIe passthrough with a NIC has its uses as well.
Simple: when this card was released, onboard LAN wasn't good. What better to market with your GPU to gamers than a good network chip for their LAN parties? Makes sense to me. It also saves space in the build, so you don't take up two PCI slots... still an interesting product and a decent video!
@@nated4wgy You missed the part where I said it was fine, which implies 'decent'. How does this card coming out later help your point? Unless you're saying that built in NIC used to be good, but then became worse around 2011? I also built systems up and through that time, all had serviceable motherboard LAN which I used.
@@IMelkor42 Fine doesn't imply decent at all. Fine implies the bare minimum. "serviceable" Again. If you think that implies GOOD, you need to buy a dictionary. What? You are literally putting words in my mouth now. I'm done you toxic arse.
The marketing for this card was that if you had the internet connected to the graphics card, your ping and FPS would be much better in game, since all the processing was done on the same board. Pretty smart marketing for people who don't know about PCs, because they'd think good connection + graphics card = good FPS in online games. That equation does work; the problem is it doesn't make a real difference. Also, a lot of people didn't have gigabit Ethernet, as it was very expensive.
@@betadan Yeah, and it also did a lot of network processing, thereby offloading that work from the CPU. Usually not that important, but when running torrents in the background it could make a massive difference back then.
I had the misfortune of having a motherboard at one point that had that Killer chip on it. I actually ended up having to completely disable that network interface once I realized that the driver had a memory leak. Driver memory leaks are especially bad because it takes a reboot to get that memory back and they're tougher to track down because that memory isn't attributed to a process.
I believe this was intended for gamers using older PC's that only had 100Mb/s ethernet on board, or flakey and frankly awful Realtek network adapters from the time period. Not every gamer had a brand spanking new Core i7 960 and X58 motherboard at the time, many were still using early Core2 Duos or a crappy prebuilt that needed a GPU upgrade, and for those users this would have been enticing.
I agree, I believe it is this + general marketing/testing the waters type stuff. Another use-case would be if you were running crossfire, wanted another NIC, and had exhausted your slots.
Such gamers would have been better served by sinking $280 (Yes, the standalone Killer NIC cost that much) into a better graphics card, RAM upgrade, or CPU.
The Killer NPU (Network Processing Unit) bypassed the Windows network stack. It ran Linux (hence the RAM modules) in order to process network packets. This meant that you could actually control the way traffic passed through the NIC: you could prioritize your games while torrents downloaded in the background, mitigating any latency incurred from torrenting while gaming. Think of it as more of a "router" than a NIC. That's why the Killer NIC exists.
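The firmware itself was closed, but the prioritization behaviour described above is essentially a strict-priority egress queue. A toy Python sketch of the idea (class and tier names are mine, not Killer's):

```python
import heapq
from itertools import count

# Traffic tiers roughly like the QoS the Killer software exposed;
# lower number = drained first.
PRIORITY = {"game": 0, "voice": 1, "web": 2, "torrent": 3}

class TxScheduler:
    """Strict-priority egress queue: game traffic always drains first."""

    def __init__(self):
        self._heap = []
        self._seq = count()  # preserves FIFO order within a tier

    def enqueue(self, traffic_class: str, packet: bytes) -> None:
        heapq.heappush(self._heap, (PRIORITY[traffic_class], next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = TxScheduler()
q.enqueue("torrent", b"chunk-1")
q.enqueue("game", b"player-update")
q.enqueue("torrent", b"chunk-2")
assert q.dequeue() == b"player-update"  # the game packet jumps the torrent queue
assert q.dequeue() == b"chunk-1"        # then torrents drain in FIFO order
assert q.dequeue() == b"chunk-2"
```

A real scheduler would also rate-limit and avoid starving the low tiers, but the latency win for game packets comes from exactly this reordering.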
I can see that adding an Ethernet port could make sense on some small-form-factor motherboards, on budget boards lacking Ethernet, or if the onboard solution fails. There may also be some video-over-Ethernet system.
Nope. It was used on remote workstations where security was important, hospital scan images for example. That way you showed the image without the receiver being able to download vital/personal info. It was used a lot with CAD, for example. You even had cards that ONLY had an Ethernet connector, like the AMD FirePro R5000.
I had 2 5770s in crossfire back in the day. The crossfire scaling for those cards was stupidly good for some reason, so adding a second card netted 75% higher performance.
This was from a period when one or two expansion slots on motherboards were legacy PCI slots. If you had a Micro-ATX board back then, you didn't have much room for expansion. Usually below the main x16 slot was an x1 slot that would be blocked by a graphics card and the remaining two slots would be a PCI slot (effectively useless) and a secondary PCIe slot. So if you wanted to replace either the garbage onboard sound or network adapter with a discrete solution from a better brand then you would be hard pressed to come up with an ideal solution. Also let's not forget that SLI and Crossfire were a thing back then.
The reason this thing exists: LAN parties. Instead of needing to connect a bunch of network cables to your router, you can use the additional network port to share your connection with the bro next to you!
I bought one and still have it in a box of old video cards. I specifically bought it to upgrade an old Dell Inspiron that had an onboard 10/100 Ethernet port and a proprietary old video card that pulled memory off system memory. This card made sense as it was an upgrade to BOTH my video and Ethernet, and it saved an expansion slot: the card still took 2 slots and made one of the PCI slots unusable, and the 1 ISA slot was of no use for new boards at the time. With this card I had updated 10/100/1000 Ethernet and a faster video card with dedicated memory, since the Dell card shared memory off the motherboard. This card was for situations like this, as a viable upgrade to old systems. It worked great until the motherboard died and I scavenged it for parts.

As far as drivers, I upgraded from Win 98 to Win XP Pro, and the drivers worked well at the time. I am one who squeezes the last drop of functionality out of every machine I have; most of my old machines that are totally obsolete have been converted to NAS machines. I have been working with computers and electronics since I was 7 years old, and being a retired data systems and IS/IT tech, I have seen a lot that most will never see. :)
I used an old keyboard without a Win key until I spilled beer on it; after that, the Win key is the first thing I remove on a keyboard. Although my SteelSeries keyboard has no Win key on the left side, there is a SteelSeries function key instead.
I had one of their K55 keyboards and for a membrane it was actually really good. I used it for years until I finally got a mechanical keyboard on sale, ended up giving the K55 to a friend.
There's also the ability to set up EtherChannel bonding, if your switch supports it, so you can have a 2GbE connection. That's something that would still cost a fair amount (£25) today. Not to mention that you can have one network for gaming, with an outside connection to the Internet, and an internal connection for your intranet/private servers.

Finally, Killer have a lot of marketing behind them, and this was probably more of a collaboration that was decided in the boardroom, rather than at the design level, to trial future compatibility between the two companies. Most of the time when you see something "weird" on a "consumer" product, it's either because it wasn't _meant_ to be a consumer product and just ended up that way upon release, or there was a potential corporate takeover/merger on the cards and they wanted to see if their engineers could work together, or whether the merger/acquisition would make sense (being folded into one product line) and provide a noticeable boost to their bottom lines.

Sometimes when you develop something, you don't really know how it will be used in the market. EVE is a classic example of a video game that had this very issue; there's a keynote speech from ~2014 that talks about it. When EVE released, they didn't anticipate that players would work together in quite the way they did, and as a result, the pacing of the in-game economy was completely out of whack compared to where they thought it would be a couple of months after launch. It's a classic case of: here's some Meccano, make something. Then staring in wonder as someone creates a full-sized theme park out of it.
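For reference, here's roughly what that bonding setup looks like on a modern Linux box with iproute2. This is a sketch under assumptions: the interface names and the IP address are examples, and the switch must have a matching LACP port-channel configured on those two ports.

```shell
# Aggregate the onboard NIC (eth0) and the card's NIC (eth1) into one
# 2Gb logical link using 802.3ad (LACP). Run as root.
ip link add bond0 type bond mode 802.3ad
ip link set eth0 down && ip link set eth0 master bond0
ip link set eth1 down && ip link set eth1 master bond0
ip link set bond0 up
ip addr add 192.168.1.10/24 dev bond0
```

Note that a single TCP flow still rides one physical link; the aggregate helps when multiple flows (or multiple machines) hit the bond at once.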
This is why I like your channel so much. When you say you're gonna show a pointless GPU, you show a pointless GPU. Not some amazing RTX 4090 Ti CureYourDepression Edition that's "pointless" because it has 16GB of GDDR6X instead of 32GB.
Killer Ethernet was marketed for online gaming. It gave gaming traffic priority over regular Ethernet traffic or something. That is probably why it was included.
I could see a NIC being useful on a GPU as an output, provided that it had an output hooked to it. Way back in the day, getting a display signal over longer distances was a pain, so what most people did was encode the HDMI signal and run it over Ethernet, as it was cheaper and worked over longer distances than HDMI cables ever could. Having a built-in port for it on the graphics card could let you shed one box in the setup... However, I doubt that was its purpose.
In the UK, that membrane keyboard is on Amazon for £70-75. That's getting into decent mechanical keyboard territory (Logitech G413, Razer BlackWidow Lite, or almost any Redragon or Havit), and you can get a Corsair K60 for around the same price.
I had that card! I replaced it with a 7770 Ghost, which was not really that much better (mostly a rebadge). I really wish AMD were better about long-term driver support. It doesn't matter for most people of course, who upgrade constantly, but for making old hardware usable it really is nice to see. Cards like the X1950XT, HD2900, HD4890 etc. have abominable support even though their raw performance would be more than adequate for lots of indie and eSports titles. I can use an old GeForce 8800GT, 9600GT etc. from a similar or even earlier era with far better results on a newer OS than contemporary AMD/ATI.

As far as the network card was concerned, it was a good idea with a terrible implementation and terrible support. Your typical cheap onboard NIC of the era was a 'slacker' IC that put the brunt of the network processing load onto the CPU and OS through its driver. At the time, even using old 3Com 10/100 cards offered better results, as the NIC had all of its own processing hardware on the card rather than being a performance parasite (which can also introduce lag if the system gets loaded and the CPU cycles aren't free enough to maintain consistent network performance). The best results even to this day are offered by strong non-slacker NICs like the Intel Pro/1000 cards with onboard cache and fast local processors (they even have heatsinks!).

If you want a good one, look on the usual sites for the dual-port gigabit cards by Intel; you should find them for $20-$30 if you hunt, and they are ultra-smooth non-slacker stuff. They even support bonding the ports for 2Gbit if you have a compatible router and/or switch for the other devices in your network (my NAS has this enabled and supported, giving me more than double the peak performance of typical onboard gigabit connections). Of course 2.5GbE etc. is even better if you have the supporting infrastructure, but it's dirt cheap to build out dual Gbit in comparison.
These graphics cards were mostly going to be used for public displays (you know, like the ones that show adverts or announcements in public places like trains, parks, malls, etc.) and digital signage. The GPU part is mostly responsible for display output, while the network part enables it to get a stream of data to display without involving any human operator on site. I remember Matrox used to sell a box that was essentially an S3 Savage GPU with a VIA network card.
Was it made to have an HDMI-to-Ethernet adapter? I remember back when HDMI was new, you needed a PCI Ethernet card to actually do that, because you needed full-speed gigabit Ethernet to send the signal. That was more of a novelty though; HDMI over Ethernet was made for places like hospital waiting rooms and airports. In the end, those places didn't actually use Ethernet to run the one DVD player they could afford from a locked room to the TVs in the waiting rooms, because DVD players got cheap enough that they could buy one for every TV hung on the wall.
I had a similar idea just a few days ago. Imagine having a graphics card with a half-height expansion slot integrated, that works with a pci-e bifurcation. A bit like the pci to pci-e adapters look but the pcb extends to house a full gpu. i dunno...
Hey! About the geforce now thing: I've been forced with an old HD4650 or something like that for the past few months because my r9 280 broke, and... Yeah, that does suck. It's not because of geforce now, but because the video card was so weak that it couldn't handle geforce now (you still need acceptable video decoding capabilities, especially with higher framerates and resolutions). Now I managed to get my hands on an r9 270x, which handles that just fine. Nice video!
I remember getting one of those for my old workshop PC back in the day, when grinder dust killed the network and monitor ports on it, and at the time I was unable to convince my then-wife that the 2-year-old system we used upstairs was outdated...
i love this content man. also, recently just upgraded from a K55 RGB (had it for about four months) to a K55 PRO (it unfortunately broke in a rage mode) to a Logitech G513. i gotta say that the move from the larger k55 to the smaller g513 was difficult, but, the g513 smokes the k55. no comp. this thing sounds great, feels great and i got it for 70 bucks from work because it had a damaged exterior box. everything was intact though, no damage. i love it, plus the palm rest thats padded feels amazing.
"a particularly rowdy absinthe binge" this is the kind of thing i tell my friends so they will check out this channel, Dawid, you sound like good peeps haha
I remember when those graphics cards came out; it actually made sense to have the LAN card on there. It was around the same time people were really getting into high-speed broadband that was fast enough to out-speed a regular full-duplex 10/100BASE-T Ethernet card, and motherboards only had those on them, so gigabit cards started making more sense. It was not unlike how sound cards also used to be what you plugged your joystick into, or hard drive cards that could also do floppy drives and serial.
First, I have this card. Second, it works with or without the shroud, no temp difference. Third, Killer placed the NIC on the card to jump on the "gaming" bandwagon. I believe the idea was that if it was integrated onto the GPU, it would use fewer resources and not take up the extra slot it usually would. I also believe marketing thought it was a snazzy idea considering it was attached to the GPU itself: minimal extra cost, two birds with one stone. Please understand, Killer was trying to position itself as THE NIC for gaming and wanted to make sure you thought nothing else could compare. Fourth, GeForce Now works on bandwidth. If for whatever reason their servers are loaded, and/or the points in between are congested, your bandwidth is throttled and the game stutters. It has been a day-one issue with ALL streaming services, and why cloud gaming is still, well, stuck in the clouds.
maybe it came at an era in which 100Mbit NICs were still a thing and people didn't really want to change their motherboards just for the sake of a new network card
But I love hardware with unusual features like that: graphics cards with LAN ports, Gateway 2000 AnyKey keyboards from 1992-1993 with diagonal arrow keys, modern wireless keyboards with phone holders, four serial ports on a desktop, a Blu-ray rewriter with HD DVD, DVD, CD, LightScribe, and LabelFlash support (also known as the Super Duper Optical Drive).
I have that same keyboard and honestly love it. Plus, when you break it in, the keys give a little rewarding clack/squeak I can't help but adore. Also, if you're a streamer, those macro and media keys are a godsend. They let you make changes without putting Windows over SLOBS or having to buy a third screen.
I am actually not surprised by the results; a 5770 was a formidable GPU back in the day. I got a GTX 580 in 2011, a top-of-the-line GPU of the same generation but from rivals Nvidia, and it lasted longer than expected. And if I don't completely misremember, the 5770 wasn't that far behind in performance. The GPU power was actually quite good; it even came close to the upcoming consoles in 2013, but when they launched they had 8GB of shared memory, so 1.5GB just wasn't enough. Games ran fine, but you had to nuke the texture settings, so everything became a blur. I replaced it with a GTX 1070 in 2016 with some help from a friend.
In my experience, cloud gaming as a whole is redundant as it is in its current state. It either fails to perform well, and in cases where it does, it is priced high enough that the customer would be smarter to buy their own rig. Plus, what's the point of cloud gaming if you can't play most of the games you own anyway?
So about GeForce Now... I've been using it daily, since my current PC is rocking an FX-6200 with an HD 7570. At $5/month, I'll have spent the equivalent of a 3060 (at MSRP) in five and a half years. One thing I've noticed about GeForce Now is that by default it auto-chooses your server location, based on network performance. But sometimes you can get a better connection (or at least a more consistent one) by manually picking a server. There's even a tool on the settings page to test bandwidth, packet loss, and latency with different servers. Most of the time, my best server is US East 2. But some days it suffers from congestion, and I get better results from their US Northeast Server.
It's a gaming NIC. It has some hardware or drivers that "improve online performance". I remember them as being PCI or PCIe cards. Never knew there was one paired with a graphics card. Maybe it's for motherboards that only have one PCIe slot.
In 2009/2010, you had to buy a mid to high-end motherboard to get 1Gb Ethernet. Many lower end motherboards still had 100Mb onboard. Rewind a few years, and even mid-range motherboards were fairly sparse that had 1Gb onboard Ethernet. Now, factor in that most onboard Ethernet was some Realtek or Broadcom crap that didn't perform nearly as well as Killer NICs did at the time. Couple that with mATX being a very popular form factor, and you can see why there was at least a niche market for something like this back then. A HD 5770 wasn't even coming close to using all 16 lanes of PCIe 2.0, and 1Gb Ethernet wouldn't use much of it at all (1 or 2 lanes at most) it really did make sense for a few people. I almost bought that card for a new build in 2010, but it was cheaper to get a better motherboard and a regular 5770. However, if I already had the motherboard, maybe a year or two old, and it had a crappy onboard NIC, it would have made sense at the time.
I bought an HD 5770 when it came out back in early 2010. It wasn't anything special, but it was good for medium-to-light gaming at the time. It could easily do 40fps in titles like Jedi Academy, Far Cry 2 & 3, StarCraft II, Diablo, etc.
I do commend you, Dawid, for trying to use such a monstrosity of a card. I cannot imagine WHY someone thought this kind of mad hybrid would ever make sense!
I used to use an AMD HD 6770 card, which I have heard is basically just a HD 5770 that they rebranded. Worked great until the plastic fan shroud connectors broke, which messed with the cooling flow and it started to overheat under load. No ethernet port on my card, though.
Not pointless! Some people wanted to start gaming on systems that didn't have integrated NICs. Just like in the 90s, when Sound Blasters had game controller ports. Buy 1 card instead of 2!
It was useful for ITX systems back in the day. Many mainboards had rather bad onboard NICs, and in rare cases they died without any warning. I think it's a funny and possibly useful feature; nowadays such a combo would be interesting with 2.5 or 10Gbit LAN, since there is plenty of unused PCIe 4.0 bandwidth.
So if I recall, the reason for the integration of the two cards was that it "saved space" on motherboards that didn't have a built-in network option and didn't have enough PCI slots for both a network card and a graphics card (or the slots were too close together to be used at the same time).
Fun fact: if you want Ethernet on your GPU for some odd reason and have a 20-series Nvidia card or a 6000-series AMD card with a VirtualLink port, you can use an adapter, because the VirtualLink port works as a normal USB-C port.
Oh it’s dapz hey man
hi
Ayo dapz. I didn't expect you here
More graphics cards need to have usb-c ports
Would that still work with DMA? Since the Ethernet stack exists in memory and the ethernet controller can access memory directly over the PCIe bus, wouldn't that bypass the CPU altogether?
Oh, you said GPU. But still, isn't the USB-C port on its own PCIe lanes? Surely it doesn't need processing on the GPU? Or did I misinfer your meaning?
"I nearly played so much Battlefield 4 that Anna almost left me"
"My Anna left me"
*sigh*
I wanted more of that story.
Same here
@@ApofKol he's such a tease with that one isn't he?
same :´(
Did your Anna leave you? I'm sorry :(
Casio watches were pretty slick (especially if they had a calculator).
I bought one when they first came out, they were awesome
yeah, i had one back in the 80s and i was so sad when i cracked the glass on it in the subway in paris
I had the Musical one that had a crazy amount of sounds for the alarm, it played Happy Birthday every hour on your Birthday, Jingle Bells every hour Christmas day & so on
@@shaneeslick thats the one i had too, it played 12 melodies i think and had a calculator. those were the times:}
Found my old Casio watch from over 20 years ago... still had perfect time. It's the telephone book one. XD
"...if you accidentally press the Windows key in the middle of a gun fight...in a game..."
Thanks for specifying this for your Chicago audience.
Bahahaha
@@AnnaDoes Guessing you wrote that?
If you know, you know. 😂
*_Oof_*
I was wondering at which point he said this in the video, then I realized... the advertisement :D
Hi! I know the reason why the Killer chipset was there. It's because at the end of the 2000s (early 2010s) many routers had problems with priority management, and these Killer cards were somewhat capable of bypassing that. For example, if on the same network one person was doing torrents, the one with the Killer card would actually get priority, so he would feel less slowdown. Yes, P2P was a thing back then!
Dear, you are far removed from the 3rd-world experience
(Connect tor to reach torrage)
best p2p to pvp card ever
P2P is still alive and well if you know where to look. Just a lot less accessible to the average normie than it was back in the mid to late 2000s.
Nah, AMD themselves officially stated it was for PCoIP - a way of setting up multiple monitor workstations by splitting/sharing GPUs between workstations and the server. It basically let them assign your workstation to the hardware pool to be shared among clients, which would allow super-fast multi-monitor responses on a fast network connection, which was necessary for graphics to keep up. This was particularly useful for stock market day-traders.
@@joshuavoss4354 please do share
As an authorised Killer NIC integrator, I've dealt a LOT with it in the past. The benefit of the Killer network card is that it performs BETTER than a built-in card or add-on card when paired with the right video card (something a gamer concerned with cheap integrated-NIC problems would prefer as an upgrade to their connection, and thus their play experience).
Back when this was released, Killer network cards would have performed exactly the same over the internet, which was at best 20 Mb/s, as any other NIC. It is just some marketing bollocks that people with too much spare money and too little talent would fall for.
I had a built-in Killer network card on my last mobo (similar age) and the software and drivers were a pain.
I now have a 2.5 Gb NIC, which is pointless as well, but at least there is no stupid software with it.
Going on sale in 2009 means a fair number of buyers were upgrading systems based on motherboards made from maybe 2005 to 2009. The older end of that range was still 100mb networking so the 1gb lan was definitely a way to differentiate this card.
1) Gigabit ethernet was not as common as you make it out to be in 2009, especially for mid-range consumers. And 2) most people in the midrange space had computers with only one PCIe slot, and the rest of their peripherals were PCI.
or even AGP, or whatever that slot was called :D
'cause look, I have here an AM3-socket MSI mobo that has PCIe slots, and down on the mobo is a PCI slot :D
GbE adaptors were pretty common around 2007. I remember buying my Socket AM2 motherboard around that time, and I did not aim at high-end mobos. The point of this Killer adaptor is that it is a gaming adaptor.
I'm not sure about that, my old Socket 754 AGP motherboard had gigabit ethernet. It was from literally five or six years before this card came out. In that time, it would have been increasingly hard to buy a motherboard with just "Fast" Ethernet (i.e. 100mbps) and although a number of legacy systems certainly floated around (as they do), I doubt that many of their owners contemplated putting a brand new GPU in one.
Yeah, in 2009 it was hard to buy a brand-new board without gigabit ethernet; I don't think there were many. In 2009 Intel was about to release those new Core series processors (s1156), and I remember that even the very first s775 mobos (~2005, GM915 or so) usually came with gigabit already.
From what I recall, the marketing with Killer NICs was that it did everything in hardware, to free up your CPU. It was supposed to be ultra low latency. I also recall features like being able to prioritize game traffic and download things in the background, as in the card itself could handle torrent downloads, or something.
Would that explain why so many complain about stability? Perhaps the drivers are hooking deep into the network stack to accelerate various network layers?
Yeah, that's about the explanation. When you design an Ethernet card, you can choose how much you want to do in hardware and how much in software. At that time, most Ethernet cards did only the bare minimum in hardware (hardware is expensive, software is cheap) and pretty much everything else in the driver. That meant if your system was under heavy CPU load, your network latency was higher than normal and fluctuated a lot, neither of which is good for gaming. So the idea was to bundle an Ethernet adapter that would do as much in hardware as possible and, on top of that, even be able to buffer network traffic and prioritize certain kinds of traffic over others (which even today normal Ethernet chips don't do), again all in hardware.
@@xcoder1122 Shouldn't the Ethernet card at a minimum handle the MAC layer and the packets addressed to and from the card?
@@soundspark Every Ethernet driver allows you to change the MAC address to a value of your choice (the Ethernet standard even requires this capability, and half of the entire MAC address space is reserved for manual MAC address assignment), and every card also supports a promiscuous mode, where you can see all packets that physically arrive at your Ethernet port, whether addressed to your MAC address or not, but to receive data, the data stream must still be filtered to your active MAC address (all packets arrive at the driver level, but only those with your MAC address go into the IP stack of the OS).
These features were often implemented in software only. The moment you changed the MAC address or enabled promiscuous mode, the hardware MAC filtering on the card was simply disabled and everything happened in the driver. But high quality Ethernet chips also supported changing the hardware MAC filtering logic, because they had programmable logic on the chip instead of fixed logic. However, a very cheap card in early 2000 may have had no hardware filtering at all and always worked as if the card was in promiscuous mode, which is not forbidden by any standard or driver rule and makes the chip even cheaper to produce.
And then there's the CRC32 checksum. This needs to be calculated for outgoing packets and verified for incoming packets. Again, better cards did this entirely in hardware in the chip, cheap chips just did it in the driver, so they expected to get outgoing packets with a pre-calculated checksum for sending, and they forwarded incoming packets as they were, and the driver had to check the checksum and drop the packet if it was bad (but this put a load on the bus, as even bad packets were sent over the bus first, only to be dropped by the driver, whereas a card with hardware support would have dropped them directly on the card).
Of course, today's Ethernet chips can do everything in hardware and always have programmable logic. This is basically standard today, even for an on-board chip.
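For anyone curious about the checksum step described above: Ethernet's FCS is a standard CRC-32 (the same polynomial zlib uses), so a software-only driver effectively does something like this Python sketch for every frame (the frame contents and the little-endian byte placement here are illustrative, not a wire-accurate encoder):

```python
import zlib

def append_fcs(frame: bytes) -> bytes:
    """Compute the CRC-32 over the frame and append it as a 4-byte FCS,
    as a software-only driver would have to do for every outgoing frame."""
    fcs = zlib.crc32(frame) & 0xFFFFFFFF
    return frame + fcs.to_bytes(4, "little")

def fcs_ok(frame_with_fcs: bytes) -> bool:
    """Verify an incoming frame: recompute the CRC-32 over the payload
    and compare it against the trailing 4-byte FCS."""
    frame, fcs = frame_with_fcs[:-4], frame_with_fcs[-4:]
    return zlib.crc32(frame) & 0xFFFFFFFF == int.from_bytes(fcs, "little")

frame = bytes(14) + b"payload"                    # dummy header + payload
wire = append_fcs(frame)
print(fcs_ok(wire))                               # True
print(fcs_ok(wire[:-1] + bytes([wire[-1] ^ 1])))  # False (corrupted frame)
```

A NIC with hardware offload does exactly this check in silicon, so a bad frame gets dropped on the card instead of crossing the PCIe bus first only to be thrown away by the driver.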
Yep, Killer NICs were really popular before multi-core processors were a big thing. Having your CPU give time to your NIC instead of the game "slows things down", so have a chip that handles the NIC.
Most server NICs operate the same way, as a server CPU can't handle multiple 10+ Gb NICs. The Killer NICs are just server-grade NICs for your desktop with a gaming skin.
I'm actually so surprised by the graphics card's performance in games. I thought it would end up being poo poo garbage.
Ditto! I was quite impressed
I know! Much better than I thought.
@@DawidDoesTechStuff Dawid your moist pc video was awesome dude.lmfao
@@nick-cd5tg Thanks! I'm glad you enjoyed it. 😁
80 fps in csgo at low settings is garbage
GPUs back then look like the knockoffs of today. Looks like a children's version
The knockoffs are still trying to catch up to current year designs. Ten years in the future they'll make convincing RTX knockoffs.
I wonder how many knockoffs are using surplus shrouds from old GPU.
as opposed to now where they are all plain black blocks
You should see the first generation of GPU's that had fan shrouds. They are so tiny and adorable, like babby's first squirrel cage.
The Killer NIC moves TCP/IP processing off the CPU and onto itself; it has its own TCP/IP stack, firewall, etc. It helped take load off the CPU while gaming online.
Yeah I remember back when my friends and I were getting these certain NICs despite having Ethernet built into our motherboards just for that dedicated TCP/IP processing and how much it improved our pings.
At the time, the HD5770 was an absolute value monster, coming at something like 95% performance of the HD4870 at 2/3 of the price (MSRP), bringing respectable 1080p performance to well under the $200 mark.
looks like equivalent of GTX 550 Ti
My 5770 died a year ago, was a sad day even though I had passed it onto my little brother a few years back. Could play plenty of modern games like Skyrim or League of Legends with ease, especially @ 720p. Have never found a better valued card since.
It was rivalling the gtx 460.
Having a NIC makes PERFECT SENSE if there were software that could output RTMP streams directly encoded via the GPU.
Having a Graphics card with built in Ethernet makes sense, if you have motherboard with limited number of expansion slots.
Or if it could output HDBaseT or perform RDMA transfers or something.
The video card was probably marketed as a drop in upgrade for older systems, which probably didn’t have gig Ethernet.
Ok so help me understand this then…who had gig speed internet 10 years ago?
@@Derek2k I've had gig speed for like 5 years, and I don't live in a very populated city. Not every country's internet is poop like the USA's
@@Derek2k On the local network?
Tons of people. I've got a Core 2 Duo playing media server with onboard, dual gigabit LAN
@@Derek2k More over local lan... though that was more a 2000s thing. :P
@@Derek2k gigabit LAN is a nicety for transferring files to and fro, especially if you have network storage.
Thumbs up for producing a sponsor spot that was a mini review segment. Feature list and audio samples -- excellent work!
My theory on this card:
Some guy at VisionTek: we need a new graphics card with a special feature!
Drunk technician: Let's mix a graphics card with an Ethernet card.
Other guy: That's brilliant!
Haha!! Yeah, that sounds about right.
i love the sponsor is a membrane keyboard
he lowkey made me wanna buy it lol
Been using the standard k55 for years now, massively prefer it over a mechanical personally. Feels really great and its relatively quiet
Some membrane keyboards can feel good!
Personally i prefer membrane keyboards its much easier to press the keys
I hate my friend's mechanical keyboard; when I'm visiting him my fingers hertz from it
@@dan2800 Does your fingers megahertz or gigahertz?
"A keyboard that sounds like this"
If only people were allowed to work this passive aggression into Raycon ads...
Circa 2015 Dawid really did play A LOT of battlefield 4...
impossible, not him!
Maybe he needs a new challenge, like playing a Battlefield game on a Casio pocket watch too 😅
Well, we're happy you're still with him :D
@@petaaa5419 me too. Turns out he was worth keeping 🤣
I am actually giving a PC to someone I know.
His garbage Acer prebuilt with an FM1 A6 and a mechanical drive crashed, so...
Giving him an old upgraded Dell I have: Core 2 Q6700, 6GB DDR3 RAM, 120GB Kingston SSD + 750 gig mechanical drive... and a good old Radeon 5770.
Those old Radeons are really quite decent.
Oh for sure. The Graphics card bit makes perfect sense, I just think the ethernet card is random. 👍
@@DawidDoesTechStuff Extremely random. I never knew those existed, and I have been a tech since the 486 days. Congrats on your YouTube success! And thanks for making such fun videos.
The moment you realize that decal on the Graphics card is from Warhammer 40K Dawn of War II Chaos rising... Burn in Holy fire
This card has the Killer ethernet chipset which was supposed to bring lower ping and optimized throughput for games. Lots of enthusiasts bought the separate Killer NIC even though they already had Gig ethernet on board. This card is obviously a package deal, helpful for those with only a few PCIe slots.
i never got the killer nic but yes that was what it was for. i just never wanted to spend the money on it
I first saw these long ago and remember it being kind of a gimmick. From what little I remember, the GPU itself has to devote compute cycles to the network functions, making the card actually perform worse graphically than it would be without the network chip. What it was good for was hosting multiplayer servers without the need for the CPU to have to expend cycles on TCP requests. A good option back in the day for LAN parties if your main server had one of these.
Nope, it's just another PCIe device on the bus, attached via a PLX bridge chip. The GPU has no clue the network controller exists. But the network controller tends to get worse latency than your mainboard's integrated one, and during heavy GPU PCIe transfers or network transfers, the two kinds of traffic across the PLX chip can sort of get in each other's way.
@@cybercat1531 Welp, sounds like it was even more useless than I thought it was. The more you know XD
I have recently played a lot on Geforce Now and I had no issue with frames or quality or anything. The quality does periodically drop when the internet is chugging and its noticeably softer for a few minutes but the frames never drop under 30/60 or whatever I set it to. I never had those horrible drops to 10-15 fps like you did. In fact, I played through the entirety of Just Cause 4 and did not have a single significant freeze or fps drop! So I have absolutely no clue what could have happened in your testing, apart from the "network overload" that you mentioned. (Or just your bad internet Dawid lol)
Yes, I've also played a lot of GeForce Now recently, and I've not experienced as much lag as you have.
Maybe it was a video decoding bottleneck, was it even hardware decoding with that card?
I would actually consider this quite useful, especially if it came out today with a 10G NIC but in, say, Tesla or Quadro cards. Throwing a bunch of them in a server would be super useful for VM stuff. PCIe passthrough with a NIC has its uses as well.
You didn't think it through to the end..
Used they're $200-400, and servers at this point have quad 10G fiber or copper stuff.
Simple: when this card was released, onboard LAN wasn't good.
What better to market with your GPU to gamers than a good network chip for their LAN parties?
Makes sense to me. Saves space in the build so you don't take up two PCI slots....
still an interesting product and decent video!
Nah. The budget system I built in high school back in 2002 had onboard LAN and it was fine.
@@IMelkor42 You missed the "decent" part. Also this card was released way later. It's a killer network chip
@@nated4wgy You missed the part where I said it was fine, which implies 'decent'.
How does this card coming out later help your point? Unless you're saying that built in NIC used to be good, but then became worse around 2011? I also built systems up and through that time, all had serviceable motherboard LAN which I used.
@@IMelkor42 Fine doesn't imply decent at all. Fine implies the bare minimum.
"serviceable" Again. If you think that implies GOOD, you need to buy a dictionary.
What? You are literally putting words in my mouth now. I'm done you toxic arse.
at the time you couldn't really stream 1080p across wifi to a monitor in another building, this is an 'internet of things' graphics card
The marketing for this card was that if you had the internet connected through the graphics card, your ping and FPS would be much better in-game, since all the processing was done on the same board. Pretty smart marketing for people who don't know about PCs, because they would think good connection + graphics card = good FPS in online games. That equation does work; the problem is it doesn't make a real difference. Also, a lot of people didn't have gigabit ethernet, which was very expensive.
The Killer NIC software also allows for QOS. This would allow you to set a higher network priority on your games.
@@betadan yeah and it also did a lot of network processing and thereby offloading the load from the cpu. Usually not that important, but when running torrents in the background it can make a massive difference back then
Video starts at 1:34
"A keyboard that sounds like THIS"
starts typing and it sounds like mushy garbage
I had the misfortune of having a motherboard at one point that had that Killer chip on it. I actually ended up having to completely disable that network interface once I realized that the driver had a memory leak. Driver memory leaks are especially bad because it takes a reboot to get that memory back and they're tougher to track down because that memory isn't attributed to a process.
I believe this was intended for gamers using older PC's that only had 100Mb/s ethernet on board, or flakey and frankly awful Realtek network adapters from the time period.
Not every gamer had a brand spanking new Core i7 960 and X58 motherboard at the time, many were still using early Core2 Duos or a crappy prebuilt that needed a GPU upgrade, and for those users this would have been enticing.
I agree, I believe it is this + general marketing/testing the waters type stuff.
Another use-case would be if you were running crossfire, wanted another NIC, and had exhausted your slots.
Such gamers would have been better served by sinking $280 (Yes, the standalone Killer NIC cost that much) into a better graphics card, RAM upgrade, or CPU.
My only question is why there appears to be a Chaos Space Marine on the card. Maybe the card is a heretic?
I looked it up because i swore those were space marines and yup dawn of war 2 promotion on the box
@@wsketchy oh that makes sense!
“Now, if your GPU crashes, your internet can go with it!!”
The Killer NPU (Network Processing Unit) bypassed the Windows network stack. It ran Linux (hence the RAM modules) in order to process network packets. This meant that you could actually control the way traffic passed through the NIC: you could prioritize your games while allowing torrents to download while you gamed, which mitigated any latency incurred from downloading torrents while gaming. Think of it as more of a "router" than a NIC. That's why the Killer NIC exists.
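The "prioritize games over torrents" behavior described here is essentially priority queuing. As a toy model only (not the actual Killer firmware; the traffic classes and packet names are made up), it could look like:

```python
import heapq
from itertools import count

# Lower number = higher priority; made-up traffic classes for illustration.
PRIORITY = {"game": 0, "voip": 1, "web": 2, "torrent": 3}

class PacketScheduler:
    """Toy priority scheduler: always transmit the highest-priority
    queued packet next; FIFO order within the same traffic class."""

    def __init__(self):
        self._queue = []
        self._seq = count()  # tiebreaker preserving arrival order

    def enqueue(self, traffic_class: str, packet: str) -> None:
        heapq.heappush(self._queue, (PRIORITY[traffic_class], next(self._seq), packet))

    def dequeue(self) -> str:
        return heapq.heappop(self._queue)[2]

sched = PacketScheduler()
sched.enqueue("torrent", "chunk-17")
sched.enqueue("game", "player-update")
sched.enqueue("torrent", "chunk-18")
print(sched.dequeue())  # player-update (game traffic jumps the queue)
print(sched.dequeue())  # chunk-17
```

The point of doing this on the card rather than in Windows was that the prioritization still held even when the CPU was pegged by the game itself.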
I wonder what he's typing when he's sponsoring those keyboards. Hmm...
Im almost curious enough to frame x frame that to spell it out. Almost.
I'm not sure.. but I'm getting a vibe that Dawid was really surprised...
favorite comment
I can see how adding an Ethernet port could make sense on some small-form-factor motherboards, on budget boards lacking Ethernet, or when the onboard solution fails. There may also be some video-over-Ethernet system.
My best guess is that it was originally part of a pre-built system and they had stock leftover and repackaged it as a stand alone GPU.
Nope. It was used on remote workstations where security was important, hospital scan images for example. That way you showed the image without the receiver being able to download vital/personal info. It was used a lot with CAD, for example.
You even had cards that ONLY had an ethernet connector, like the AMD FirePro R5000.
That corsair keyboard looks and sounds like a glorified office keyboard with a ton of dirt underneath the keys
"It runs surprisingly well!" Dawid channelling CDPR executives.
Bro tried to impress us with the sound of a membrane keyboard 😅
Dawid doing greatness always
I swear nobody talks about how nice Dawid’s camera is! Lovely, shows how much he cares.
any gpu with sticker art is interesting lol
I had 2 5770s in crossfire back in the day. The crossfire scaling for those cards was stupidly good for some reason, so adding a second card netted 75% higher performance.
No Dawid, I won’t buy a membrane keyboard 😂😐
yeah imagine paying that kind of price for a e-waste cyberpower pc-like keyboard with a corsair logo
They are good if you have a lot of pet hair floating around; it only takes one hair to get key chatter on mechanical keys.
Corsair: sponsors Dawid
Dawid: dedicated media keys with rocker buttons, so *I don’t really like that*
When a GPU from 2009 runs better than your current GPU
you running on a gt 710? :)
@@notcheems2783 Nah, I'm running on a Quadro K1000M right now on my laptop. I'm getting the parts to build a pc with a GTX 1060 6GB.
Okay, stop showing off about having a GPU.
@@JustSomeVideos0 It's not showing off, it's called "describing"
@@theitbit8458 it's also called a joke about the current lack of gpu units for sale :)
This was from a period when one or two expansion slots on motherboards were legacy PCI slots.
If you had a Micro-ATX board back then, you didn't have much room for expansion. Usually below the main x16 slot was an x1 slot that would be blocked by a graphics card and the remaining two slots would be a PCI slot (effectively useless) and a secondary PCIe slot.
So if you wanted to replace either the garbage onboard sound or network adapter with a discrete solution from a better brand then you would be hard pressed to come up with an ideal solution. Also let's not forget that SLI and Crossfire were a thing back then.
The reason this thing exists: LAN parties. Instead of needing to connect a bunch of network cables to your router, you can use the additional network port to share your connection with the bro next to you!
You are much better off getting a gige card, they are pretty cheap, and only need a pciex1 slot.
@@arahman56 But it was the same 11 years ago?
Sad to say that's only a single generation older than my two graphics cards in my machine
"Dota runs on casio watch" me playing it on 710m😂
I bought one and still have it in a box of old video cards. I specifically bought it to upgrade an old Dell Inspiron that had an onboard 10/100 Ethernet port and a proprietary old video card that pulled memory off system memory. This card made sense, as it upgraded both my video and my ethernet while saving an expansion slot: the card still took 2 slots and made 1 of the PCI slots unusable, and the 1 ISA slot was of no use for new boards at the time. With this card I had updated 10/100/1000 Ethernet and a faster video card with dedicated memory, as the Dell card shared memory off the motherboard. This card was for situations like this, as a viable upgrade to old systems. It worked great until the motherboard died and I scavenged it for parts. As far as drivers, I upgraded from Win 98 to Win XP Pro and the drivers worked well at the time. I am one who squeezes the last drop of functionality out of every machine I have. Most of my old machines that are totally obsolete have been converted to NAS machines. I have been working with computers and electronics since I was 7 years old, and being a retired data systems and IS/IT tech, I have seen a lot that most will never see. :)
Wow I still have a desktop with a "Killer" ethernet chipset in it. The port glows red ominously
All the games I've lost to the Windows button on the keyboard... so, so many
On my keyboard you can disable the windows key with a button, I literally have that exact keyboard.
I used an old keyboard without a win key until I spilled beer on it; after that, the win key is the first thing I remove on a keyboard. Although my SteelSeries keyboard has no win key on the left side, there is a SteelSeries function key instead.
"It is a membrane keyboard..."
Aaaand I'm out.
I had one of their K55 keyboards and for a membrane it was actually really good. I used it for years until I finally got a mechanical keyboard on sale, ended up giving the K55 to a friend.
There's also the ability to setup etherchannel bonding, if your switch supports that, so you can have a 2GbE connection. Something that will still cost a fair amount (£25) today.
Not to mention that you can have one network for gaming, with an outside connection to the Internet, and an internal connection for your intranet/private servers.
Finally, Killer have a lot of marketing behind them, and this was probably more of collaboration that was decided in the board room, rather than at the design level, to trial future compatibility between 2 companies.
Most of the time when you see something "weird" on a "consumer" product, it's either because it wasn't _meant_ to be a consumer product, and just ended up that way upon release, or there was a potential corporate takeover/merger on the cards, and they wanted to see if their engineers could work together, or if the merger/acquisition would make sense (being folded in to one product line), and provide a noticeable boost to their bottom lines.
Sometimes when you develop something, you don't really know how it will be used in the market. EVE is a classic example of a video game that had this very issue. There's a keynote speech from ~2014 that talks about this. When EVE released, they didn't anticipate that players would work together in quite the way they did, and as a result, the pacing of the in-game economy was completely out of whack compared to where they thought it would be a couple of months after launch.
It's a classic case of, here's some mechano. Make something. Then staring in wonder as someone creates a full sized theme park out of it.
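On the etherchannel bonding point a few comments up: on a modern Linux box, that kind of link aggregation can be set up roughly like this (a sketch, not a tested config; eth0/eth1 are placeholder interface names, it needs root, and the switch must support LACP/802.3ad):

```shell
# Aggregate two gigabit NICs into one logical ~2 Gb/s link.
ip link add bond0 type bond mode 802.3ad
ip link set eth0 down
ip link set eth0 master bond0
ip link set eth1 down
ip link set eth1 master bond0
ip link set bond0 up
# Note: a single TCP flow still tops out at 1 Gb/s; the gain shows up
# with multiple simultaneous flows (e.g. several LAN clients at once).
```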
That card gets literally 4x more fps in CSGO than my laptop.....
This is why I like your channel so much. When you say you're gonna show a pointless GPU, you show a pointless GPU. Not some amazing RTX 4090 Ti CureYourDepression Edtion that's "pointless" because it has 16GB of GDDR6X instead of 32GB.
Killer ethernet was marketed for online gaming. It gave gaming traffic priority over regular ethernet traffic, or something.
That is probably why it was included.
I could see a NIC being useful on a GPU as an output, provided that it had an output hooked to it.
Way back in the day, getting a display signal over longer distances was a pain, so what most people did was encode the HDMI signal and run it over Ethernet, as it was cheaper and worked over longer distances than HDMI cables ever could.
Having a built in port for it on the graphics card could make you shed one box in the setup... However I doubt that was its purpose though.
In the UK, that membrane keyboard is on Amazon for £70-75.
That's getting into decent mechanical keyboard territory (Logitech G413, Razer Black Widow Lite or almost any Red Dragon of Havit), and you can get a Corsair K60 for around the same price.
No mechanical feel? I love my aluminium mechanical simulated keyboard
I had that card! I replaced it with a 7770 Ghost which was not really that much better (mostly a rebadge).
I really wish AMD were better about long-term driver support. It doesn't matter for most people, of course, who upgrade constantly, but for making old hardware usable it really is nice to see. Cards like the X1950XT, HD2900, HD4890, etc. have abominable support even though their raw performance would be more than adequate for lots of indie and esports titles. I can use an old GeForce 8800GT, 9600GT, etc. from a similar or even earlier era with far better results on newer OSes than contemporary AMD/ATI cards.
As far as the network card was concerned, it was a good idea but a terrible implementation with terrible support. Your typical cheap onboard NIC of the era was a 'slacker' IC that put the brunt of the network processing load onto the CPU and OS through its driver. At the time, even using old 3Com 10/100 cards offered better results, as the NIC had all of its own processing hardware on the card rather than being a performance parasite (which can also introduce lag if the system starts getting loaded and CPU cycles are not free enough to maintain consistent network performance).
The best results even to this day are offered by strong non-slacker NICs like the Intel Pro/1000 cards with onboard cache and fast local processors (they even have heatsinks!). If you want a good one, look on the usual sites for the dual port gigabit cards by Intel, you should find them for $20-$30 if you hunt, and they are ultra smooth non slacker stuff. They even support duplexing of the ports for 2Gbit if you have a compatible router and/or switch for the other devices in your system (my NAS has this enabled and supported, giving me more than double the peak performance of typical onboard gigabit connections). Of course 2.5Gbe etc is even better if you have the supporting infrastructure, but it's dirt cheap to build out dual Gbit in comparison.
These graphics cards were mostly going to be used for public displays (you know, like the ones that show adverts or announcements in public places like trains, parks, malls, etc.) and digital signage. The GPU part is mostly responsible for display output, while the network part enables it to receive a stream of data to display without involving any human operator on site.
I remember Matrox used to sell a box that was essentially a Savage S3 GPU with a VIA network card.
Was it made to have an HDMI-to-Ethernet adapter? I remember back when HDMI was new, you needed a PCI Ethernet card to actually do that, because you needed full-speed gigabit Ethernet to send the signal. That was more of a novelty though; HDMI over Ethernet was made for places like hospital waiting rooms and airports. Those places didn't actually end up using Ethernet to run the one DVD player they could afford from a locked room to the TVs in the waiting rooms; DVD players got cheap enough that they could buy one for every TV hung on the wall.
I had a similar idea just a few days ago. Imagine having a graphics card with a half-height expansion slot integrated that works with PCIe bifurcation. A bit like how the PCI-to-PCIe adapters look, but the PCB extends to house a full GPU. I dunno...
Hey! About the geforce now thing: I've been forced with an old HD4650 or something like that for the past few months because my r9 280 broke, and... Yeah, that does suck. It's not because of geforce now, but because the video card was so weak that it couldn't handle geforce now (you still need acceptable video decoding capabilities, especially with higher framerates and resolutions). Now I managed to get my hands on an r9 270x, which handles that just fine. Nice video!
"Dota runs surprisingly well"
After seconds later:
"It's not really surprising"
Was that ad spot a joke? "A keyboard that sounds like THIS: *sounds of spaghetti being stirred*"
I have a 5770 on a secondary PC that works for YT and Netflix on my TV :D
Perfect Bang for Buck!
This card was aimed at SCADA systems where the PLCs were supposed to be controlled remotely via a network, and which required 8k VR interfaces...
I remember getting one of those for my old workshop PC back in the day, when grinder dust killed the network and monitor ports on it and I was unable to convince my then wife that the 2-year-old system we used upstairs was outdated...
The thumbnail is either a young Ryan Reynolds or a discount Ryan Reynolds.
Van Wilder
I love this content, man. Also, I recently upgraded from a K55 RGB (had it for about four months) to a K55 PRO (it unfortunately broke in a rage mode) to a Logitech G513. I gotta say the move from the larger K55 to the smaller G513 was difficult, but the G513 smokes the K55, no comp. This thing sounds great, feels great, and I got it for 70 bucks from work because it had a damaged exterior box. Everything was intact though, no damage. I love it, plus the padded palm rest feels amazing.
"a particularly rowdy absinthe binge" this is the kind of thing i tell my friends so they will check out this channel, Dawid, you sound like good peeps haha
I like how Ryan Reynolds is on the thumbnail; reminds me of watching that movie years ago. 😂 I love the way Dawid says "abomination".
I remember when those graphics cards came out. It actually made sense to have the LAN card on there: it was around the same time people were really getting into high-speed broadband fast enough to out-speed a regular full-duplex 10/100BASE-T Ethernet port, which was all most motherboards had, so gigabit cards started making more sense.
It was not unlike how sound cards also used to be what you plugged your joystick into, or hard drive cards that could also handle floppy drives and serial ports.
First, I have this card. Second, it works with or without the shroud, no temp difference. Third, Killer placed the NIC on the card to jump on the "gaming" bandwagon. I believe the idea was that if it was integrated into the GPU, it would use fewer resources and not take up the extra slot it usually would. I also believe marketing thought it was a snazzy idea, considering it was attached to the GPU itself: minimal extra cost, two birds with one stone. Please understand, Killer was trying to position itself as THE NIC for gaming and wanted to make sure you thought nothing else could compare. Fourth, GeForce Now runs on bandwidth. If for whatever reason their servers are loaded, or the points in between are congested, your bandwidth is throttled and the game stutters. It has been a day-one issue with ALL streaming services, and it's why cloud gaming is still, well, stuck in the clouds.
Maybe it came in an era when 100Mbit NICs were still a thing and people didn't really want to change their motherboards just for the sake of a new network card.
"Product that was brainstormed during a particularly rowdy absinthe binge" Your comparisons and descriptions are always god-tier, Dawid.
Dawid's ad: *you have become the very thing you swore to destroy*
But I love hardware with unusual features like that: graphics cards with LAN ports, Gateway 2000 AnyKey keyboards from 1992-1993 with diagonal arrow keys, modern wireless keyboards with a phone holder, four serial ports on a desktop, a Blu-ray rewriter that also handled HD DVD, DVD, CD, LightScribe, and LabelFlash (also known as the Super Duper Optical Drive).
I have that same keyboard and honestly love it. Plus, when you break it in, the keys give a little rewarding clack/squeak I can't help but adore. Also, if you're a streamer, those macro and media keys are a godsend: they let you make changes without putting Windows over SLOBS or having to buy a third screen.
Your videos are the highlight of my day 🙏🏼 Always super funny and informative, much love from Minnesota 👋🏼
I am actually not surprised about the results; the 5770 was a formidable GPU back in the day. I got a GTX 580 in 2011, a top-of-the-line GPU of the same generation but from rival Nvidia, and it lasted longer than expected. And if I don't completely misremember, the 5770 wasn't that far behind in performance.
The GPU power was actually quite good; it even came close to the upcoming consoles in 2013. But when those launched with 8GB of shared memory, 1.5GB just wasn't enough.
So games ran fine, but you had to nuke the texture settings, so everything became a blur. I replaced it with a GTX 1070 in 2016 with some help from a friend.
In my experience, cloud gaming as a whole is redundant in its current state. Either it fails to perform well, or, in the cases where it does, it's priced high enough that the customer would be smarter to buy their own rig. Plus, what's the point of cloud gaming if you can't play most of the games you own anyway?
My question is: Why does it have a Chaos Space Marine on it and how were they never sued by Games Workshop? That's the real mystery
I mean, it wasn't uncommon for GPUs to have depictions of video game characters on them back then...
Might have been connected to DoW2:CR
Thanks for advertising that E-Waste keyboard!
So about GeForce Now...
I've been using it daily, since my current PC is rocking an FX-6200 with an HD 7570. At $5/month, I'll have spent the equivalent of a 3060 (at MSRP) in five and a half years.
One thing I've noticed about GeForce Now is that by default it auto-chooses your server location, based on network performance. But sometimes you can get a better connection (or at least a more consistent one) by manually picking a server. There's even a tool on the settings page to test bandwidth, packet loss, and latency with different servers.
Most of the time, my best server is US East 2. But some days it suffers from congestion, and I get better results from their US Northeast Server.
“Check out the link in the description below! For a keyboard that sounds like this!” *unexceptional membrane keyboard noises*
It's a gaming NIC. It has some hardware or drivers that "improve online performance". I remember them as being PCI or PCIe cards. Never knew there was one paired with a graphics card. Maybe it's for motherboards that only have one PCIe slot.
In 2009/2010, you had to buy a mid- to high-end motherboard to get 1Gb Ethernet; many lower-end motherboards still had 100Mb onboard. Rewind a few years, and even mid-range motherboards with 1Gb onboard Ethernet were fairly sparse.
Now, factor in that most onboard Ethernet was some Realtek or Broadcom crap that didn't perform nearly as well as Killer NICs did at the time. Couple that with mATX being a very popular form factor, and you can see why there was at least a niche market for something like this back then.
An HD 5770 wasn't even coming close to using all 16 lanes of PCIe 2.0, and 1Gb Ethernet wouldn't use much of it at all (one or two lanes at most), so it really did make sense for a few people.
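That bandwidth claim is easy to sanity-check with back-of-the-envelope math, using the nominal line rates from the PCIe 2.0 and gigabit Ethernet specs (a rough sketch ignoring protocol overhead beyond 8b/10b encoding):

```python
# PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding (80% efficiency),
# giving 500 MB/s of usable bandwidth per lane.
pcie2_lane_mb_s = 5_000_000_000 * 0.8 / 8 / 1_000_000   # 500.0 MB/s per lane
pcie2_x16_mb_s = pcie2_lane_mb_s * 16                   # 8000.0 MB/s for x16

# Gigabit Ethernet is 1 Gbit/s on the wire.
gige_mb_s = 1_000_000_000 / 8 / 1_000_000               # 125.0 MB/s

# A gigabit NIC only needs a fraction of a single PCIe 2.0 lane.
fraction_of_one_lane = gige_mb_s / pcie2_lane_mb_s      # 0.25
```

So the NIC's worst case is a quarter of one lane: sharing the x16 slot with the GPU costs essentially nothing.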
I almost bought that card for a new build in 2010, but it was cheaper to get a better motherboard and a regular 5770. However, if I already had the motherboard, maybe a year or two old, and it had a crappy onboard NIC, it would have made sense at the time.
I bought an HD 5770 when it came out back in early 2010. It wasn't anything special, but it was good for medium-to-light gaming at the time. It could easily do 40fps in titles like Jedi Academy, Far Cry 2 & 3, StarCraft II, Diablo, etc.
I do commend you, Dawid, for trying to use such a monstrosity of a card. I cannot imagine WHY someone thought this kind of mad hybrid would ever make sense!
I used to use an AMD HD 6770, which I've heard is basically just a rebranded HD 5770. It worked great until the plastic fan shroud connectors broke, which messed with the cooling airflow, and it started to overheat under load. No Ethernet port on my card, though.
Not pointless! Some people wanted to start gaming on systems that didn't have integrated NICs. Just like in the 90s, when Sound Blasters had game controller ports. Buy one card instead of two!
It was useful for ITX systems back in the day. Many mainboards had rather bad onboard NICs, and in rare cases they died without any warning. I think it's a funny and possibly useful feature; nowadays such a combo would be interesting with 2.5 or 10Gbit LAN, since there's plenty of unused PCIe 4.0 bandwidth.
So, if I recall correctly, the reason for integrating the two cards was that it "saved space" on motherboards that didn't have built-in networking and didn't have enough PCI slots for both a network card and a graphics card (or the slots were too close together to use at the same time).