WE GOT INTEL'S PROTOTYPE GRAPHICS CARD!!
- Published Feb 7, 2025
- Check out the Massdrop x Sennheiser PC37X Gaming Headset for $119.99 USD on Massdrop at dro.ps/linus-pc37x
Buy CORSAIR's Dark Core SE Wireless Mouse on Amazon at geni.us/f4ViD
We got our hands on a PROTOTYPE video card from Intel. But does it actually do anything?...
Buy Intel CPUs on Amazon: geni.us/809gd
Discuss on the forum: linustechtips....
Our Affiliates, Referral Programs, and Sponsors: linustechtips....
Linus Tech Tips merchandise at www.designbyhum...
Linus Tech Tips posters at crowdmade.com/l...
Our Test Benches on Amazon: www.amazon.com...
Our production gear: geni.us/cvOS
Get LTX 2018 tickets at www.ltxexpo.com/
Twitter - / linustech
Facebook - / linustech
Instagram - / linustech
Twitch - / linustech
Intro Screen Music Credit:
Title: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.c...
Artist Link: / laszlomusic
Outro Screen Music Credit: Approaching Nirvana - Sugar High / approachingnirvana
Sound effects provided by www.freesfx.co....
Finds a super ultra rare one of a kind GPU unlike any other, swings it around and shakes it violently in excitement
Because he's not afraid 🙂
Lmao rite?
It worked when shipped.
for real, my esd lesson in school was scary and idk htf anything he touches works
HappyAgony i honestly was thinking same thing and also it scared me so much
R (red) : amd
G (green) : nvidia
B (blue) : Intel
wtf mind explode
❤️♻️🚾
The f heart recycle toilet
.... Ragnbi? The fuck does that mean?
Switch Xbox PlayStation
I actually felt bad for him, He looked like a kid hoping to find an xbox down christmas tree and find a sweater instead XD
Ikr
More like a kid who opened the box for an Xbox under the Christmas tree and it was a sweater and some books that somehow weighed as much as an Xbox when packed in the box.
@Wat what Jimmy Coe said was perfectly on topic
what a horrible thing right, when you have no real problems -- no need to feel bad for people with real problems either
@Wat its a comment section, his story was fine
you seem like the kinda guy to have jokes go over your head, you salty wanker
This really aged well
Yes
ikr
i would like but its at 69
Lol
They still prototypes tho
This card needs a special BIOS to boot because it's encrypted (forwarded from an Intel employee)
Aw man I was so excited
@Xander McKay no, usually the bios is on the motherboard
@@sealsoftware4772 well now it's on a daughter board 😎 look who's grown up
@@internetdumbass
are you ok?
@@pedroxyo no lol, but an independent motherboard plugged into another motherboard can be called a daughterboard, esp since this graphics card is basically an x86 computer all on its own en.m.wikipedia.org/wiki/Expansion_card
I worked for Intel. It was miserable. Every 6 months you had to do a presentation on yourself, basically giving yourself a review. You were pitted against your coworkers and graded on a bell curve. Even if everyone in your department was awesome, one of them was going to score last and get disciplined
Rank-and-Yank sucks
Thats why u work for amd instead lol
....
......
..yeah sorry i just couldnt help it
Probably the reason why intel still manages to keep its head up.
That's why they are so blue.
@ELDERPALADIN98 But you do yearly at least at costco, walmart, and HEB...
Therapist: Intel graphics card isn't real, it can't hurt you
Intel graphics card:
*2020 has entered the chat*
@tshrpl lol
@tshrpl lol
I laughed at this too hard
Intel HD : Am i a joke to you?
I have the drivers if you still want them. We received this card for test purposes at our Crytek headquarters in Germany.
So you are one of those guys who made the world's worst game engine, that requires a 9999$ PC, and after that you make all believe that producing a shit-code is actually cool, yeah?
Alex Kart worst game engine? Not really. The game engine which changed the whole game industry? Yes. At the time, Crysis was groundbreaking and still is groundbreaking in terms of lighting and global illumination, the materials and characters and so on. Please don't talk about us like that. Games like Battlefield or The Witcher 3 wouldn't look like that if Crysis 1 didn't show them how to do it.
@@izmirdikili8069
Nothing wrong with the engine.
Performance problem; wasn't it tessellation being applied to mostly non-visible elements, the water layer etc.? Maybe that was Crysis 3 only?
That game btw still looks amazing nearly 7y later.
@@izmirdikili8069 I think, you guys actually did the one, very bad, thing: you showed to all programmers in the whole world that bad and slow code can be released, if only you can convince consumers to buy powerful PCs.
Alex Kart Crysis 1 was meant to be a benchmark in the first place. It was never intended to make people buy new, more expensive hardware. Crysis 1 was only showcasing what is possible and what can be made. If you look back at Crysis 2 and 3, those games were much better optimized, because no one would have known that Crysis 1 would blow up and get so big. I hope you can understand what I'm trying to tell you. My English might not be the best.
I can't wait for Nvidia to issue out CPUs next
About that...
👍
they already have its called the tegra
I'm being ironic btw
Justin Y. I Love you
"This may be the ONLY engineering sample of a graphics card from Intel!" [flails card violently multiple times]
What a beautiful looking card.
I'd put the card in my system if it did absolutely nothing. It looks amazing
I screamed internally at him flailing it around like that, what with his reputation for... dropping things XD
yeah, I hate the weird edgy look of the 10 series GPUS. this looks really clean. I like it
@@fvkz That's what the kids want. Something shiny and new that they've never seen before, like socialism.
@@lazzer408 that isn't what the kids want but good generalization based off the vocal minority lol
Me listening to Linus's explanations: *"I like your funny words, magic man."*
oi like ya funny words magic men
Unoriginal
Clone High reference
Tech illiterate
lmao
Me: very carefully carries a 570
Linus: *casually swinging a rare gpu in the air without care
RX570 users unite
@@BungSpoot more like disband lmao i hate mine and its drivers
*cries in Intel HD Graphics 4000*
*cries in gt 710*
@@z_3phy_r haha rtx 3050 go brrrr
I do have 2 separate driver packages. My father is going nuts right now; he is in his office looking for them.
you gotta tag linus when you find em
BUMP THIS SHIT EVERYBODY LIKE
Lmao wtf
You find them?
email Linus!
I wonder if that GPU has an integrated CPU.
it IS a CPU; it's hundreds of CPU cores to compute anything it wants. It's basically Intel's version of IBM's Cell processor
/s
This is so wholesome
UnknownTimelord every GPU has some CPU cores (1-4, sometimes even 8) for temperature/fan-speed management, dynamic scheduling and optimization, translating OpenGL requests to executable code, and many other things.
Nvidia used to use POWER and ARM and now will use RISC-V; it also emulates some kind of Ageia "core" when running older PhysX games.
PowerVR GPUs used to have 4 ARM M4 cores (now maybe M7), a custom (maybe in-house) kind of stripped-down DSP core (not a complete core, it has shared resources distributed across other "cores", and its resources are also shared), and some FPGA-like configurable logic elements (programmed with their own version of Verilog).
All this PowerVR stuff I know because I had a chat with a PowerVR engineer at a licensing-terms meeting,
and the Nvidia stuff from my college roommate.
Someone didnt watch the video...
Lol this is funny. I used to work in the embedded graphics debug lab next to the performance group testing these from 2008 to 2009. By the way, the reason you can't boot it is that there are changes that need to be configured in the video BIOS before you can boot the system with that card. Then there are specific drivers you need to make it work... I think I have Intel's binary modification program to edit the vBIOS and merge it with the system BIOS. Have to dig through my Google Drive
In case Linus didn't see your post, please do send him anything you can find. We all want to see this card's performance (I was bleak when it didn't work).
Hope u get in touch with linus
You're our last hope.
DO IT
Go go
Nvidia, amd, and intel:
“There ain’t room enough in this pc for the three of us...”
**The Good, The Bad, and The Ugly theme starts playing**
But there is enough room in bigger motherboards.
Mexican standoff
AMD CPU
Nvidia GPU
Intel M.2 SSD
Boom
@@Bradshawmen 2030 Intel GPU Nvidia cpu and ssd
I was surprised that Linus didn't drop the card waving it around that much.
Who knows? Maybe he did; that could explain why it doesn't work.
I was impatiently waiting for one of the only models of this part to get dropped and destroyed.
It comes from ebay, so has already been dropped 1000 times before he got it
that waving was scary
ELiT3CH I guarantee you if he dropped it on camera the clip would be in here. He enjoys his reputation for mishandling important technology.
Should've at least connected a normal GPU and this blue something in a second slot to check what that thing shows up as in Device Manager.
Barteg why go through all the trouble to make a hype video and get paid for it?
Not gonna lie, intel's card looks like walmart made a graphics card.
Lol
It's not that bad; it has a really basic design and I liked it
Not really different from Nvidia's Founders Edition samples. Do you mean the colour? Isn't the Chevrolet C8 the same colour?
Its from intel what u expect xD
and you think having overbearing rgb vomit looks better?
gamers dont exactly have good style either
4 Years later and we got intel ARC.
I'm running an a770
@@tallrocko7010 Is it good?
@@kanosur69 It's.... cheap.
@@kanosur69 Drivers a bit less ironed out than AMD's at the time this was written. But it's cheap, and doesn't give you the middle finger on productivity workloads.
@@MariusThePaladin and good.
This needs to be looked at again. There's something here; I hate to see good ideas go to waste
DaComputerNerd Lmao no, Intel is fine. AMD is pulling ahead in the consumer market but when it comes to enterprise and server markets intel has a much larger foothold and will for many years to come. And after all, most of the money is in the server and enterprise markets
@@panduino2156 lmao ever heard of ryzen threadripper? Amd is crushing Intel in every department and with the new 4000 series of Ryzen, Intel will become pretty obsolete
@@goroz4322 Threadripper is a consumer product and has a lot of limitations vs EPYC :/
Linus did a video specifically on that.
for now, intel is #1 in the enterprise.
The new good idea is APUs, integration is the future. Ryzen Vega and Intel HD went from struggling in 720p video playback to being able to run GTA V at 50 FPS on low, and both companies (as well as ARM) are investing heavily in integration
They did, they lent this card to YuuKi-AnS for the AMD Vega 16 engineering card in the new video, and somehow YuuKi made the card work. You can find some notes here (in Chinese): www.bilibili.com/read/cv6152485
I dare you to Call Intel and try to RMA it
prob stolen
Jaggsta you're missing the point
300th like
Protip: You need an Intel manufactured motherboard to boot the card
...And probably Intel's beta BIOS to allow a device like that to be interfaced with..
I think the "Above 4G Decoding" option (though also somewhat rare) should be enough.
TheTruthNJ09 bro tip: use magnets for faster acquiring.
Could also default to some debugging mode if it doesn't start properly so maybe switch to on board graphics see if the computer sees it and what it sees it as. If it doesn't say it's a graphics card it will go to on board graphics.
Should probably test different pci-e settings in bios and a different fsb block n shit. Compatibility is going to be limited on a product like this.
This is the saddest video of LTT; you can see Linus slowly realize that what he is excited for will not come to pass.
first of April already?
Nope, this is all real.
Ben V Nope it's a real project
cannot believe it now.
But I think it could break Nvidia's monopoly, and that's good for the prices.
i hope he will get the drivers
el piloto He was joking...
You need to keep this super safe for the future. it might be worth a fortune.
instantly drops it.
More like getting sued by Intel for technically stolen goods. Going through trash still counts as stealing.
It actually is; even in the trash it's still the company's property until it's properly destroyed. My father's friend was working at DELL and over the years he took home a lot of hardware from the trash containers (monitors, keyboards, mice, fans, other hardware, no HDDs of course) and was giving that stuff for free to his friends. After many years he finally got caught and sadly was fired for taking the stuff from the trash.
Mr. Fox Yes the company had the right to fire him but they couldn't accuse him of stealing nor bring any criminal charges in such an instance.
companies are stingy, worked at a candy shop and people would read the bulk candy signs wrong constantly and fill bags with candy and decide not to pay for it. long story short if it was too mixed up in the bag they would throw the entire bag away, I asked, me: "Are we allowed to take it home?", boss: " no that's stealing." me: "What about if you put it in the trash can and I pulled it out?", boss: "No that's still stealing". Complete dumb ass laws we have, ended up wasting an unbelievable amount of candy because IT'S MINE YOU CAN'T HAVE IT
Soon you will be able to put into one system a GPU from AMD, intel and Nvidia.
That day, you will have the first RGB with no lights.
dosduros that killed me 🤣
At the time this was supposed to be released, so was Lucid Hydra, so we could scale all 3... but nope, that and Bulldozer went tits up and the entire dream was over.
Nice
um..
друг Because they aren't iGnorant iDiots and reactionary foes of progress. Mental dwarves like yourself though, never get to realize competition is vital to even have technological evolution (or we would get the Intel Core scenario, where CPU's barely improved at all in 5 years, and iDiots still kept rewarding Intel for parasiting on iGnorant customers).
My high school computer science teacher worked at intel for several decades and wrote some of the drivers for this card, and I asked him about potentially sending them over to you guys because he has one of the last prototypes made by the Larrabee team, but he said that in the event of him doing that, it'd get him into an extremely tricky situation with Intel over agreements and previous contracts, so as much as I'd love to see it work here, I don't think we'll see drivers for this hit the open net :(
Adding on:
Because most of this card's fundamental advantage is used in the Xeon coprocessor cards that they use in their enterprise environment, the drivers aren't that dissimilar with the exception of one being configured to allow output and the other not.
Put this on second PCIE slot along with any working series Nvidia/AMD GPU, see if it gets detected by windows.
Blue Marble yhea
Yeah, it was pretty sloppy to end the video like that
Jepp!! DO IT @linus!!
ye he's so lazy
lol
This is one of the most interesting and informative videos you've done in a while. I would definitely appreciate some follow up content on this if you're able to get it running as well as some decent drivers. Plus any kind of similar content you can create.
sickbailey21 i heard linus was gay?
no u
thought the same, really enjoyed how he just chatted while setting up the pc too, brilliant.
Billy Gilliam
why would that even be relevant? It's also pretty much a non-starter he's got a wife and 2 or 3 kids lol.
A third competitor in the gpu market would be amazing, too bad it's probably not going to happen anytime soon
There's rumors Intel is working on a gaming GPU.
allegedly intel has been working on it for a while and the card may be coming sooner than expected.
Bring back Matrox...
Intel got that AMD GPU guy recruited. So we'll see a Intel GPU anytime soon.
Piotr Krolikowski yeah, but remember this "new intel gaming gpu" has been ping-ponging in and out of existence, so don't get your hopes up too high for this current one
should make a follow up video since the drivers are available :o
Nah
@@jdmnissan yah
@@a5hamad fine..
What drivers are you talking about
What drivers wear
Try booting it with a second "boot" gpu. Maybe it's similar to using PC cards in a Mac Pro where the system wont output anything without the right driver (which makes sense based on the description of the way this card works), or maybe the onboard ROM isn't gonna work with anything modern.
what are the reasons the rom wouldnt work with modern hardware?
Given it is an Intel product, my guess is the card's BIOS/OPROM would check the host system for the GenuineIntel flag as well as check its CPU. If its CPU is not on a "white-list" (aka do not boot if one of these known CPUs is NOT detected) then, well, I just typed it in the parentheses.
You can guarantee a Microsoft Basic Display Adapter driver also will not work.
Linus needs to grab a period-correct system to this GPU, and by period-correct I mean older.
Asciiwarrior Compatibility.
+Asciiwarrior It could be a UEFI bug, most computers still ran BIOS back then.
Djhg2000 That could be very true too, I commonly have UEFI/BIOS compatibility problems between 2 of my devices running separate installs
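For anyone wondering what the "GenuineIntel flag plus CPU white-list" theory above would look like in practice, here is a minimal sketch. The parsing mimics a `/proc/cpuinfo`-style dump, and the whitelist entries are purely illustrative guesses; this is not Intel's actual option-ROM logic.

```python
# Hypothetical sketch of an option-ROM-style host check: parse a
# /proc/cpuinfo-like text block, confirm the vendor string is
# "GenuineIntel", and require the CPU model to appear on a
# known-good whitelist. Whitelist entries are made up for illustration.

KNOWN_GOOD_MODELS = {
    "Intel(R) Core(TM)2 Duo CPU E8400",   # era-appropriate guess
    "Intel(R) Core(TM)2 Quad CPU Q6600",  # era-appropriate guess
}

def parse_cpuinfo(text: str) -> dict:
    """Extract key: value pairs from one cpuinfo processor block."""
    info = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            info[key.strip()] = value.strip()
    return info

def host_allowed(cpuinfo_text: str) -> bool:
    """True only for a GenuineIntel host with a whitelisted CPU model."""
    info = parse_cpuinfo(cpuinfo_text)
    return (info.get("vendor_id") == "GenuineIntel"
            and info.get("model name") in KNOWN_GOOD_MODELS)

sample = """\
vendor_id\t: GenuineIntel
model name\t: Intel(R) Core(TM)2 Quad CPU Q6600
"""
print(host_allowed(sample))  # True
```

If a check like this lives in the card's OPROM, an otherwise healthy board with the "wrong" CPU would simply refuse to hand off to the card, which would match the no-POST behaviour in the video.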
0:16 *Mission Failed... We'll get em next time*
Nice 😃
Yeah i also saw that xDDD
He missed
huh?
LOL LMAO
Hmm... I would try to replicate the exact config from the time. Being 2006-2008, a system running on DDR2, an Intel core 2 duo and windows XP Enterprise would probably be what you want
Brenden Pragasam maybe try to get tf2 on it
noot .1944 maybe try TO MAKE IT WORK IN THE FIRST PLACE
I saw one of these things at a garage sale for $20 like four years ago, I wish I had known what it was at the time.
i wonder how much it would actually sell for once you account for the rarity
Back in 2009 Intel was actually offering a good amount of money to port our game over to Larrabee. I didn't work on the port long, but loved the whole GPGPU drive and was a little heartbroken when they canned the project because the performance wasn't good enough.
How was the performance compared to discrete GPUs from Nvidia/ATI at the time?
On paper a 32 core version could be as fast as a high end 2008 graphics card, but the biggest problem was the architecture, as a lot of specialised graphics techniques were adapted to something called shaders which are programs for the processors used in graphics cards to run. With GPGPU you could pretty much treat it as a slow CPU - general processor, so most of the advanced techniques that were optimised for GPUs didn't work so well on CPUs - I'm talking about running at 1-5% of the speed here. The biggest promise with GPGPU however is for something called Raytracing which is a more photo realistic way of drawing the scene and would take graphics to the next level, however it was challenging as the higher core chips were barely powerful enough to get raytracing to half the quality/resolution of then current gen rasterized games and you'd have to consider supporting the lower core version of the chips too.
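The raytracing promise described above comes down to the fact that ray intersection is plain arithmetic, needing no fixed-function graphics hardware, so it maps naturally onto general-purpose x86 cores. A toy ray-sphere intersection (illustrative only, in no way actual Larrabee code):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest ray-sphere intersection,
    or None if the ray misses. `direction` must be normalized."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic "a" is 1 for a unit direction
    if disc < 0:
        return None         # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

# A ray looking down -z hits a unit sphere centered 5 units away.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

A raytracer is essentially this test repeated per pixel per object, which is why a many-core general-purpose chip was an attractive target, and also why the per-core performance mattered so much.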
Vtudio woah. thanks for sharing. What game was it?
All that being said, do you think there was promise in this thing potentially usurping traditional cards for gaming purposes? I mean, being able to use the whole card on certain aspects when parts of the rendering aren’t in play (like Linus’ shaders and antialiasing examples) or only applying graphical effects sparingly to parts of a scene that need it the most... that has to count for something in terms of the efficiency of the card, right? From the sounds of it, it also seems like much of the potential performance increases in this card are through software optimization more than the hardware.
What if this product were developed to completion with modern day technology? Do you think now that it’s been nearly 10 years, this type of card can handle photorealistic rendering better than traditional cards?
Also, just curious, what games did you develop, always looking for new ones to check out.
This is so disappointing
Don't be sad, at least the fan works.
katasiapa and the LED
katasiapa yeah
Frame it and keep it on the wall 😂😂
Get out... NOW!
You're crazy, dude
red vs green vs blue...
Now thats what I call RGB.
The Avengers
-_- sigh
Rick Frost that makes no sense buddy.
UnknownPlayer well, the order doesn't matter much to me because RGB includes blue, red and green: blue for Intel, red for AMD, green for Nvidia. And fixed the comment for you.
Pokemon Red, Blue and Green
8:55 Linus just described DLSS months before its announcement
Daniel Benites wow, u r right
Ummm.. no?
those 2 have absolutely nothing to do with each other, neither in the underlying theory nor the actual use.
@@izmirdikili8069 Are you Crytek Ceo
no? dlss is upscaling the whole image in real time using AI to make it look good
Step 1 : Buy vega 64
Step 2 : Paint it blue/black
Step 3 : Flash custom bios
Step 4 : Sell on ebay for 10000$
Step 5 : ???
Step 6 : Profit
Cekpi7 buy Nvidia fx5800 ultra paint profit :-pp
it's a vega FE
That's how I paid for college
Cekpi7 hmm....
why waste that money on a vega 64 if you gonna scam anyway?
Still a fascinating story. That 'competing for budget' story is ridiculous, and reminds me of how badly corporate cultures can fall victim to idiotic ideologies. Also reminds me I once had an executive producer at a creative firm who kept a copy of GM's management guide in her office.
I work at a company with a couple tens of thousands of employees that is subdivided into business units. Believe me, it is far more stupid than you think sometimes. I've lost count of how many internal competitors my own team has. They all want their share to look good and increase personal growth, and in the end it just costs more and is less efficient than sharing knowledge.
"executive producer at a creative firm who kept a copy of GM's management guide in her office"
Classic thanks.
IBM had a policy that divisions couldn't compete. So the laptop/portable division could not make a product that would compete with desktop, desktop could not make a product that competed with server, etc. It stifled innovative products that allowed Compaq to take up sufficient market share to become huge. Sometimes competition is good.
Similar story to Nokia inventing the iphone before Apple did. A very similar product at least but internally Nokia would not allow the release of this wonder phone as they were already making money on their bog standard product line. Also reminds me of the Robocop story line :D
Not really that ridiculous, two teams are chosen to create graphics technologies, the company only has the budget for one. Both teams are given the resources to explore their concept and report back. Obviously competition is going to arise but that's how you formulate good ideas when resources are limited, incentivize people to make the best product.
"so what you gpu bro?"
"intel?"
"poor"
@Opecuted you mean 16k 360hz :)
@@happy-xv8gn nonononono, you mean 64k 4096hz
@@GrootsChannel are you sure i thought iits 100k 8000 hz???
@@happy-xv8gn nah mate last time I checked it was easily 256.000k 512.000hz
@@GrootsChannel oh sorry. i thought its 712.000k 1024.000hz
"Intel's first and only dedicated GPU"
tfw Linus doesn't remember the i740 - the board that launched the AGP standard.
He's a child. He wasn't sentient then.
Have you tried using iGPU/onboard graphics, and then just showing us device manager output?
-Clippy
Seth Estrada What you said.
At some point my pc wouldn't boot unless it was connected to internal graphics then the driver install seemed to have fixed it
Seth Estrada or use the hotswap mobo if it wont boot with it
Seth Estrada my displayport output wouldn't work untill I installed drivers for my gpu using hdmi
lspci -vt
Hes on an X299 bench. There's no iGPU. But I get what you mean. He should have kept the Titan Xp in to see it Windows even detects the damned thing.
Nvidia and AMD are watching this video, and they're going to raid his house tonight
iMuhammad Siam Nvidia bought Linus's company already, so if anyone is raiding his house, it's Nvidia
Lol raid 0;)
@@aimanamrih8169 what are you talking about? Everybody knows Linus is Nvidias and Intels official promoter.
@@ZenMuff1n Considering he's been bad talking the RTX's
@@Icybubba s/
2:12 that "to date" came in clutch in the present day, now that Intel's GPU's are about to finally drop to the market.
Email intel for the drivers lol
Nick b if he does this they will come after him for using unreleased software
It's hardware and they can't, dumb ass.
I suspect they can. I heard somewhere about engineering sample CPUs: if you get one and show it on video or on any social media, you can be contacted and asked to return it to the manufacturer. These are super rare cases, so yeah
Um, I'm pretty sure they wouldn't have released this video if there was any chance of them getting sued.
indeed, my comment was just a bit of thinking and typing
He should turn on "Above 4G Decoding" in the BIOS (which only a few motherboards support), or it will not work. (Read up on the Knights Corner Xeon Phi to learn this.)
Andreas Andersen So, if he did that, would it post and show anything?
Danish army
TheLT it's definitely more probable at least.
Andreas Andersen now that you mention it, I remember it helped me boot an old 32-bit PCI S3 Trio card in a modern system...
Wow nice job
Someone get this man his drivers!
and tell him who to throw money at to get the rights to this card
Watching this after intel is dropping their own GPU for real is surreal
intel are fucking cunts when it comes to hardware.
You kept me here 13 minutes waiting for an example of it working, you seemed super excited and I was enthralled... but then it didn't come. I'm almost as sad as you ):
Cody Small sharing is caring
That might be something she never said.
I got some blue balls here.
who else was only here for the benchmarks, and cried at the end
A triopoly of red, green, and blue graphics cards would've been interesting. Whether that would lead to insane amounts of competition and innovation, who knows?
*and CPU
TBH I don't really like this era where we only have 2 CPU and 2 GPU manufacturers competing in the general PC market; as a consumer I don't think this is healthy
If only team green and blue would start to compete, that would be amazing lol
though from a businessman's perspective this sounds pretty dumb
This would take RGB to an whole new level!
Yeah, you would think Intel would've gotten into GPUs to cash in on the whole mining thing. I bet they were kicking themselves for missing out on that.
is this some kind of maymay?
I would rather have 3 more RAM manufacturers at this point, preferably not in Korea.
A GPU with an integrated CPU
*Nvidia starts making CPUs*
*Western Digital starts making RAM*
*Logitech starts making motherboards*
OneDimensional World flipped on its head
and then a giant merger and activates skynet
I'd be interested in an Nvidia CPU.... xD A PC CPU, just to be clear. It sounds like a fascinating possibility to try out, particularly considering how much I tend to like their stuff. A Logitech motherboard sounds weird, but I'm certainly curious how they'd do, haha.
Don’t forget those Acer power supplies.
WD have made a CPU in the past IIRC.
Try an older Intel desktop board; as I recall, some of these old engineering samples won't boot on third-party boards. Best bet would be a 945 or a 965, which was a pretty standard test bench in 2008-2009
945G is shit
like my life, but i'm booting up everyday anyways
This would be the best bet. And don't use it as the primary display if it still doesn't work; get an old Radeon or something and use that to at least see if it's being recognized at all. Regardless it's probably not gonna matter that much unless someone could find drivers.. which is next to impossible..
Main reason I suggest 945G is because the chipset is extremely well documented and hardware support is really wide, even if the performance is absolute shit
Jay Pattyson best card would be an antiquated PCI video card because then you can have the Intel card on the first slot and you can set all the settings manually, plus all the bios codes on the 945 are well documented so you can get a clear answer as far as the Intel card is concerned
So does this mean Linus didn't have the *intel* to boot this on?
Get out.
Nice
It’s probably easier to buy one of these things than it is to buy a 3080
Boot it on integrated with the card and see if it's recognized in Device Manager. Or even better, boot it in Linux with integrated graphics and see if lspci does anything.
Siddharth Singh I honestly didn’t know why they didn’t try that.
Liam: exactly, it was my first thought when I realised it didn't actually work. I mean, most PCIe devices provide at least some basic info about themselves. And with enough work, a driver could totally be written if it's just x86 architecture.
Mika: i guess that is okay too, but only if they tell us beforehand whether it worked or not. I hate clickbait.
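On the "just see if the OS enumerates it" suggestion: on Linux you don't even need a driver for that, since the kernel exposes every PCI function it finds under sysfs. A minimal sketch, assuming the standard `/sys/bus/pci/devices` layout (Linux-only, equivalent to a bare-bones `lspci -n`):

```python
import os

def list_pci_devices(root="/sys/bus/pci/devices"):
    """Return [(address, vendor_id, device_id), ...] for every PCI
    function the kernel enumerated."""
    devices = []
    for addr in sorted(os.listdir(root)):
        entry = os.path.join(root, addr)
        def read(name):
            with open(os.path.join(entry, name)) as f:
                return f.read().strip()
        devices.append((addr, read("vendor"), read("device")))
    return devices

if __name__ == "__main__" and os.path.isdir("/sys/bus/pci/devices"):
    for addr, vendor, device in list_pci_devices():
        # Intel's PCI vendor ID is 0x8086; even if the card can't act
        # as the primary display, it may still enumerate here.
        print(addr, vendor, device)
```

If the card shows up with a vendor ID of 0x8086 but no usable class, that alone would answer whether it enumerates at all, which the video never establishes.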
that was the smoothest sponsor transition I have ever seen at the beginning!
I thought it was a Corsair GPU
Why all these dislikes and people saying nothing happens?
Actually I found this video interesting because I always like listening to hardware stories; also, it's evident that Linus put passion into making this video (as in any other video where he struggles with hardware). And OK, maybe he could also have fixed whatever problem the GPU has before posting, but it's already 15 min just for the story; did you really want a 30-min video?
I constantly watch 1-hour-long videos; I wouldn't mind a 30-min video.
It's the clickbait title.
The longer the better. But I agree, I enjoyed watching/listen to this video.
matteo barsanti this is bating at its finest. Nothing happened. Just a lot of wasted time listening to big mouth Linus. Stfu Linus
MrNosugarcoating why are you subscribed....
Coming back to this video since now the Intel Arc GPUs are on their way :)
Protogen go beep boop
Just plug in a second card, run it as the primary, and then tinker with it once you've gotten into your OS.
Bios won't hand it off anyway.
Linus, why not just put a working GPU in parallel with that (and connect the screen to it), boot Windows or Linux, and see if it gets detected in any way
riku2015 I said very similar thing in a previous comment
the best idea that i've ever heard....
true
if at least something shows up
Next video
i read that they send video to the igpus frame buffer through some pcie lanes
The history lesson was quite interesting...makes you wonder what else some of these huge corporations have left on the cutting room floor simply because it didn't fit in with their plans at the time.
This reminds me of the story of Martin L Gore, of Depeche Mode fame, who had a serious case of writer's block and for the first time in the group's history actually had folks join him in his quest to write music. What they found out was that Martin was such a genius that most all of what he thought was useless garbage was in fact some of the best music that these folks had ever seen. He was literally throwing away stuff other bands would kill for..... Moral of the story? Sometimes it's best to just get outside opinions to see if what you thought was useless could actually help you succeed.
Imagine where Intel would be now if they actually started doing this back then...even the mighty Nvidia would be toppled. Hey ya never know.....
Michael Livote one man's trash is another man's treasure
Michael Livote, Nah. Intel would’ve pissed off customers somehow and lost sales to the point where it’d get shelved even post release.
Because that's what happened with their CPUs, right? And that's what happens with Nvidia's GPUs, right? If the best sales were determined solely by how happy customers were, AMD would have crushed Intel and Nvidia ages ago.
Michael Livote right
I mean, Ryzen was left in R&D for like 10 years in the "we'll look into this later" pile.
Listening to Linus talk about prototype tech like this is like me being a 15th century peasant listening to a time traveler try and explain gunpowder
dude. even though the card didn't boot. I've learned SO much about intel.
honestly dude. same
Thanks for ruining the video
that's why you watch before you read the comments
Riches Media, clearly the comment wasn't directed at you. But how would my comment be a bigger disappointment than watching the disappointing moment on video? No one likes to see a PC fail to pass POST.
Why don't you use integrated graphics to boot into windows and then show us that gpu in device manager?
post fails
Amiga500 maybe he doesn’t have drivers for it.
Guru Meditation Error
Amiga500 because the graphics card is dead and this is the only way he could have made a video and money about that
hot swap mb
Maybe boot it up on an Intel board, some server board. You never know if they have some extra microcode to activate this on certain chipsets. And thinking of the era, Socket 771? If you can't find any, just take a 775 and google "Socket 775 to 771 mod" if you aren't familiar with it.
Codeplayer makes sense. Could just require the older chipsets.
DoubleBubble28 you realise it was a prototype and barely the true definition we have of a GPU, right?
@DoubleBubble28 you're missing the point. This isn't really a graphics card. Graphics rendering just happened to be a good test load and was kinda a secondary goal (hence why it was never publicly released as a graphics card). This is also an engineering sample so, it's very likely it won't boot without a special board.
@DoubleBubble28 It's an engineering sample, mate, not your average consumer GPU. Even some Intel ES CPUs wouldn't work on any consumer mobo.
DoubleBubble28 this is not going to be released
This really aged well.
I would love to see this card work. I know you've got the sauce to make it work, Linus. Waiting to see you plug it in.
I would be so glad to see intel in the dedicated gpu market!!
It will suck. Believe me.
@Linus Tech Tips - Linus, did you try booting up the system with another graphics card for video output while the Intel one is plugged into another PCIe slot, and checking in Device Manager / BIOS / GPU-Z / AIDA64 or whatever other software you know or use to see if the Intel card even gets detected at that point? If it doesn't, then you might consider that the motherboard isn't compatible with it (BIOS and all that good stuff). Looking forward to this project, this is actually very interesting.
Cranky Pants I said a very similar thing
WiKiT WoNKa - Oh, there are people that think the same? Weird.
Cranky Pants He wasn't like "you stole my comment!", he just had the hype/happiness of sharing the same idea with another commenter.
Excuse my sarcasm. I blame my zodiac for that. I didn't intend to sound mean or aggressive.
Cranky Pants lol
R - AMD, G - Nvidia, B - Intel.
Radeon, GeForce, B..
but AMD starts with A, Nvidia with N and Intel with I
so it's ANI
MPcrazyscience R stands for Red, and G and B stand for Green and Blue. The theme colors of the three different manufacturers.
MPcrazyscience AMD's colour is red, Nvidia is green, Intel is blue. Those are their theme colours.
Naig Carillo So it was really meant to be.
R-adeon, G-eForce, B-adluck ?
To think this has led to the Arc B580, the best midrange graphics card
Post code D6 usually means no console output detected. If this card is software-driven, it may not even have an onboard BIOS. You could try using it as a secondary card to see if the system even sees it as a video card.
I wonder how well guarded the dumpsters are for places like Intel, man I would love to dumpster dive there
NSMV I'll join ya
They will be guarded now.
same :(
he said they smash them to bits first basically
My cousin told me he met some guy that worked on something like that. I don't know exactly what it was, but it was related to dumping Intel chips. Yeah, they were told to destroy them, but he saved one a day, obviously hiding it and all that, and that's how I remember having the best second-gen i7 back in the day for just 5 bucks.
Dude you are so good at stretching the timeline of the video
Over stretching
I think he really lied about "first time starting it up", because if he hadn't known it didn't start up, he would've just innocently tried to run it at the start of the video. But since he already knew it doesn't work, he rambled for about 12 minutes, then pretended to be surprised it doesn't work at the end of the video.
yeah thats why i avoid his shit and consider it a waste of time... of course he does cover a lot of unique stuff so i occasionally browse some
He doesn't put 6-7 ads on a 10 minute video though :P
watching this now when intel are actually releasing a gpu is quite cool!
watching it after there's been rumors that they're cancelling their GPUs is even more cool
Aged like milk
7:06 i like how he can just pull a random rgb keyboard & mouse out of nowhere
"This isn't working, lemme go grab a dvi monitor."
Capitalism always punishes the poor.
@@marmitaa8619
It's always worked in my favor.
@@marmitaa8619 and socialism makes everyone poor except the extremely rich.
3:48 Linus stop swinging that GPU about! 😥 I already got scared when I saw you holding it at the start of the video haha
Yeah, Linus being known as a clumsy fuck, and he's just swinging a rare hardware sample around. I was screaming at the TV tbh.
It's bad enough that he NEVER EVER ONCE uses a grounded wrist strap in ANY of his videos. The amount of ESD needed to damage electronics is WAY below the minimum amount your body can feel. It also isn't like you feel a spark and then the card/whatever is dead. You do damage to it and do more and more damage to it over time. These tend to make it work less reliably over time.
This is why he drops things, really.
He handles everything as if its a meaningless value-less item.
Every part he holds, he treats it as if it were a pen or a screwdriver.
Imagine if this was first shown in a theatre. The collective audience reactions would be something to behold.
th-cam.com/video/gsL6wNP_oJo/w-d-xo.html
To save you the time.
It doesn't work.
mcopyright this comment should be pinned!
After all these years, Intel really made discrete graphics cards. First the Iris Xe Max for laptops in 2020 (?), now the Alchemist Arc.
lol intel arc a770 this is 2022
update this is 2023
So I did a little research and I can tell you that this GPU works only on specific motherboards. Also CPU could be a problem if there is no compatibility (this GPU is not a regular one). In an old article it's stated "video card will not include all the features of a PC-compatible motherboard, so PC operating systems and applications will not run without modifications" and also if you could manage everything it is stated "even if compatibility is achieved, to run efficiently software must be rewritten to use Larrabee's vector units, and not all software can put them to good use".
If you would like to try to run this GPU, then I suggest you find a motherboard from a public demonstration of the Larrabee architecture, such as the one that took place at the Intel Developer Forum in San Francisco on September 22, 2009, or any other public demonstration with Larrabee units, that is, if someone sells one.
Cheers! :-)
Linus: it won't post.
Also linus: did you try dropping it?
The Intel PCI card is a computer of its own as you have explained. As such, you can't assume *that* computer has booted (since there is no program loaded). You need to have an actual graphics adapter on the real mainboard (possibly onboard) first, launch an OS, install the PCI card driver, then make the PCI card run a program (much like a virtual machine), at which point you can start thinking of using the extra DVI port that comes on the Larrabee device.
I was thinking this too - this thing is a COMPUTER, it just looks like a graphics card. Linus needs to load software onto it to make it do something (i.e. behave like a graphics card) jus like you do with a raspberry pi or an arduino. The only thing that makes this similar to a graphics card is its form factor.
Yeah
Did you get the drivers? Make a part 2, please, continue!
edit: Like so Linus continues this!!!
Josué Hernandez yes please!
i also want to see part 2
Would love to see this!
Josué Hernandez is it going to happen?
@@JonasReichert1992 I m sure if we keep on this they will!
Quick note, Intel had a dedicated graphics card way back in the late 90's, just around the time that the 3DFX cards were around. I believe it was the I740 and it was competing with some of the other vendors like Diamond, Matrox and 3DFX to an extent as it used Intel's new AGP technology. I think there was a follow up, like an I920 or something like that but they weren't around long.
I was going to mention that too. I had one of those cards. Performance was pretty good, but it was no 3dfx and was left in the dust. I think I used it for 6 months. Back in those days, new parts every 3 to 9 months was more the norm.
As of today, my main keyboard is from 1995. My computer is about 2-5 years old and still does everything I need.
I also had an i740 that came with my first PC back in y2k
*INTEL WANTS TO KNOW UR LOCATION*
Their own office
Then they say "please rate our product 1-5 stars"
Canada
lol
Vancouver
I ... I think we just witnessed Linus full on nerding out xD
JamesCraft bruhhh.... Anyone can nerd out reading off a teleprompter.
Jesus Ortiz he wrote it himself
JamesCraft hell yeah Id be nerding out aswell
Am I the only one just really happy to see him so excited?
And now I watched the ending. I am very sad now.
2022 here: They finally made a real gaming GPU and it is all right, but needs improving.
IT HAS IMPROVED
They increased the compatibility for dx9
I got an “Engineering Sample” GeForce 6600 from Circuit City. Had artifacts and when I checked all the pipelines were unlocked (defective one was supposed to be locked). I also have an uncut wafer of Celeron CPU dies from an Intel educational kit.
Wow that is really cool! And oh man! Circuit City! Last time I was there, 512MB's on a Memory Stick was $150!!
Too many symbols, too few images.
WertzOne
Not sure how to share pics in TH-cam comments if I even still have them. Heck, the comments usually don’t even show up if you have a link in them (they get flagged as spam).
This was back when there was some crazy good deal on them after rebates... like, $70 after $150 of rebates. I recall using the eVGA Step-Up program to get a 7900GS out of one of them, which should narrow the time frame a bit. I had to take the HSF off and clean the thermal compound to see "Engineering Sample" on the core, but it was there. IIRC, they were all properly sealed, but my friend did get a fraudulently resealed GeForce2 MX inside a 6600GT box from a different Circuit City location a couple years earlier, when he wanted a card to play Final Fantasy XI, so I've definitely experienced that before.
The Intel Education Kit with the disc of CPU dies was a Goodwill find. :)
I have a Quadro 3000G engineering sample, using a hand-soldered FPGA-based daughterboard. It was supposedly meant for an IBM lab in Germany, but got stolen in transit. At least that's what an Nvidia rep told me. It was really funny: I bought the card off eBay, didn't know it wasn't even released yet (let alone stolen), couldn't find a driver on Nvidia's website, so I sent a mail to Nvidia support. Not even an hour later, some very nice guy from Nvidia Germany called and immediately asked: "Who are you and how did you get that card!?" Still surprised they let me keep it...
wsippel Very cool. I recently found $22,500 worth of heroin inside two NES game cartridges that I believe were recently stolen in transit (like, smuggled by mail for a dark web crypto currency transaction). Turned it over to the cops right away. If anyone thinks I’m joking, I have plenty of proof (WertzOne would be proud!). Heck, it’s been in the news all over. :)
This is a thing: the card supposedly will not function as a GPU unless the drivers are loaded. Windows will only detect it as a co-processor until the drivers initialize it, and it seems the drivers are now only stored in an Intel archive that regular employees/support don't have access to.
Yea, before even clicking this video I was just thinking "Without drivers it's practically worthless as a video card, and I doubt they have drivers..."
Exactly! Because it wasn't built to be used as a dedicated graphics card in the first place. Of course it looks like one, but it only acted as a co-processor in an outdated computer. That's all there is to it, really. You'd need to engineer a Windows compatible software environment to make it work in household computers. Intel certainly didn't throw a plug and play device in the bin and forgot about it.
Still I've hit the Like Button to make Linus feel better. ;)
"Intel's FIRST GPU!!!"
Stop pretending it wasn't the Intel 740 that ushered in the age of AGP graphic cards.
I was about to comment this, so I searched for AGP in this comments section. Yes indeed, there have been Intel graphics cards made to promote the AGP standard and I used to have that one. The Intel 740 card.
Me too... I think @Linus might be too young to remember... that was back in '98/'99 !!
The i740 had interesting speed improvements in the XWindows environments (ie. mostly 2D)
i had it too in 2000 ))
My first Videocard was an Nvidia Riva TNT2, it came out just one year after the Intel 740! in 1999!
I played Quake 3 Arena and Ever Quest with it :D
Don't blame him, he wasn't born at that time.
Who’s watching this after watching the video on Arc?
oh yeah
uh which video?
me after buying intel arc b580
Will there be a follow up when you get this working?
I can't imagine Linus wouldn't make a follow up if he gets it working. He's in this to make money and the more videos he gets from something like this the more money he's able to make. Ultimately being able to produce more content for this would only give us more content to watch. It's a win win for everyone.
He doesn't reply to comments.
Roger J I doubt he does this FOR the money. The guy wears socks and sandals.
But...
Can it run Crysis?
Special EDy Don’t forget Minecraft and Euro Truck Simulator 2
Can't even run
Lol hero brine lol . I bet you don’t even play crysis
it's time to STOP!
(^equally outdated meme^)
It can't even boot.
Filming and editing in this video were pretty great. Can tell you're reading the teleprompter though haha. I can appreciate the professional vibe though.
its linus what do you expect
tf is a teleprompter
Intel HD graphics: *Am I a joke to you?*
yes... yes u are.
HD Graphics trashhhh
How much do these suck? *yes*
@@allhopeislost9238 cod 4: am i a joke to you
Intel HD Graphics and Iris are Integrated GPUs so they used RAM
This one is a Dedicated GPU which has its own RAM
Any update on this card?!