Welcome to Team Purple. We're glad to have you.
CardinalJake lol
It's a chill team for chill peeps.
CardinalJake Purple for the win bab
Been part of team purple for the last 8 years, with Intel CPU and AMD graphics.
Is this a counter to team Green?
Intel and AMD together??? I must be living in the real future now.
Kevin sal this universe is collapsing in itself.
Kapil Deshmukh it’s actually expanding
AMD owns Radeon Technologies Group, so your argument is BS. Sometimes people are just right, stop arguing for the sake of arguing.
Kevin sal if this were nvidia they’d include a 1080 then raise the msrp to $2K, knowing the gtx 1100 series is coming in a few months
Bonghætte it is still amd so why are you arguing?
How nice of Intel to pre-drop and scratch it for Linus
Don't bother guys, he doesn't drop it
armando1is1great What a shame
Thanks, now I don't have to watch the video.
still cracked
HE PLAYED US LIKE A DAMN FIDDLE!
Awww :(
The skull logo is red and blue. That's some nice detailing.
Best pokemon catching device ever
Funny how they just stuck the sticker on wonky like “there! It’s done! Happy now?”
Also, the top was bowed out above where the displays were plugged in. lol
Intel quality... if it's not made out of toothpaste, it's gotta be something else
@@Maccaroney Top was bowed out because Linus took it apart for that close-up shot of the internals shown earlier in the video. He's pretty careless when it comes to handling PC components so he probably just forced it closed and called it a day. It's just a review sample after all.
Intel & AMD bring out Hades Canyon? So it's official then, hell froze over!
Maybe that's why they named it "Hades".
Super excited to see how laptops would do with these!
PausePlay MacBook Pro 2018 with this would be possible.
This, plus Optane and dual-spindle hard drives... Future laptops are looking amazing, both low and high budget models
I could see MacBook Pros using these. Plus traditional gaming laptops, like the Dell Inspiron Gaming, or ITX prebuilds like the Alienware Alpha. This chip will revolutionize the market.
shut your face! Definitely excited to see ITX builds (maybe even smaller?) with these
That's amazing to see how we made powerful machines that small
AMD is every where, even in Intel products.
AMD.. all... around... you...
owait..
in my ass too
They have been for years. x86_64 was designed by AMD (that's why you'll find a lot of system files with "AMD64" in the name), so AMD's been in Intel products since then.
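A quick way to see that naming for yourself (a minimal Python sketch; the exact string is OS-dependent, e.g. 64-bit Windows typically reports "AMD64" while Linux reports "x86_64"):

```python
# Minimal sketch: print the machine architecture string the OS reports.
# On 64-bit Windows this is usually "AMD64"; on Linux it is usually "x86_64".
# Both names trace back to AMD's original 64-bit extension of x86.
import platform

print(platform.machine())       # e.g. "AMD64" on Windows, "x86_64" on Linux
print(platform.architecture())  # e.g. ("64bit", "WindowsPE")
```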
It is: Radeon-ating Pentiums!!
AMD Inside
and the ces subbox spam begins
ikr
U right
All the fucking TunnelBear ads...
And I love it! CES!!!
TunnelBear, the VPN you want to use if you've got more money than sense and want to buy the worst VPN possible. Look in the video description if this is you.
The future is already here and I'm still stuck with a core 2 duo
George Beard I feel you bro
Laughs in Ryzen 7 and GTX 1070 SLI.
Me too
I actually have a core 2 duo laptop running Mint. I know how bad it is. Thankfully I have my main rig to fall back on when the frame rate drops below 60 fps in Chrome
UnspecificMonotremeGaming same here! usin a core 2 duo e7500 with 4 gigs of ram and amd hd 5770
Intel: "let's take an Xbox and squeeze it."
rofl you win
and add some more ports
Tim Cook: NO MORE "Other" PORTS!!!
Lmfao
but make it 2.5x more expensive :-)
He didn't drop it?
Ankit Sachdeva oh no, that's not the real Linus, he always drops stuff
he dropped it off camera
Edited that part out probably
Isn’t it obvious this video is fake?
Ofc he dropped it, just look at the cracked Vega die!
How long will that voice last this time! ;)
Unfortunately longer than it should. OH, the pain.
Why is he shouting...
have you been to any convention? it's kinda loud
サツッピミク it usually quiets down, once linus leaves the show floor
サツッピミク have you listened to the video? You can hear his voice echoing...
Loving the ad placements. Natural transitions. :)
oh what an amazing time to be alive. When a CPU could possibly beat your CPU + dedicated graphics, it's time to upgrade.
Tristan Arellano Yeah let's buy that shiny new processor with a hardware bug that's over 10 years old. Lmao
Man you must have a really really low end system if that is beating your cpu+dedicated gpu.
Never thought i would be excited for onboard graphics
still mobile stuff, don't expect it to last or perform as advertised
Linus said in the video that the GPU outperforms a GTX 1060. I have a Ryzen 3 1200 and a GTX 1050 Ti. This Intel CPU is quad-core, hyper-threaded and overclockable to 5 GHz. If that is all true, it would outperform my system, which isn't the best, but isn't bad by any means.
Better than my desktop fml.
like its pretty powerful
this is a terrible idea. keep in mind you have to pay 200+ afterwards if you want to even get the piece of shit running Windows.
it's only a 4-core and an RX 460 or RX 550, and not to mention it doesn't come with RAM, OS nor storage. For 1k what a fail intel LUL
I mean you don't HAVE to pay 200$+....
hens u clueless
I don't know if it has been said yet, but I'm pretty sure that the thing next to the power button is actually an IR blaster/receiver, so you can use remote controls and make an HTPC setup
A PC with a Vega for the cost of a Vega, yes please!
Pretty awesome indeed
GeoDelGonzo oh really?
took me a second to realize the robot skull had red eyes for AMD's 'team red' meme and blue circuit outline for Intel's 'team blue' meme...
Also, I am pretty glad this came out. I was really curious about the lack of designs coming out for NUCs, but the vulnerability killed my idea for Skull Canyon... since I was thinking about Qubes OS on it, as a security + mobility in one package.
GeoDelGonzo mobile gpus are not as good as full sized desktop
Oswald Feurst red, green and blue were company colors before the memes m8
Drop it.
OH COME ON!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
*Evil Linus* Do it, they're expecting it anyways.
They have given him a non working model😂
when I'm Linus.
i love the energy you deliver showcasing this product! You really can see how much this thing blew your mind!
gj
i love how he transitions into the sponsors every video
It never stops
Better love it cause it will never end
5 million subscribers! Congratulations Linus
If this can happen we probably can achieve world peace
Microsoft should make a surface with this
Titan That's the wisest comment i have seen today, i tip my hat to you sir. 😎
It's already on its way
Don't ask me how I know
m'comment
Adamania dude i hated the surface so far but if what you say is true i may buy one
Good CPU Performance and Good Integrated Graphics combined in one!
Not integrated but discrete, and a ~GTX 1060 Max-Q is faster than what most people have lol. This is the ultimate SFF build
meltdown and spectre included with no charge :D
"good price, too" said no one.
It's a PCIe 3.0 x8 bus, leaving it with the same bandwidth as a 2.0 x16 bus. Thanks to HBCC it might actually do better once the 1060 reaches its memory limit. But other than that... most likely no.
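For anyone who wants to check that bandwidth claim, a minimal sketch of the per-lane arithmetic (using the published transfer rates and line-code overheads: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, PCIe 3.0 at 8 GT/s with 128b/130b encoding):

```python
# Minimal sketch: compare usable bandwidth of a PCIe 3.0 x8 link vs a PCIe 2.0 x16 link.
def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s, ignoring packet/protocol overhead."""
    # (transfer rate in GT/s, line-code efficiency) per PCIe generation
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_per_s, efficiency = specs[gen]
    # each transfer carries one bit per lane; divide by 8 to convert bits to bytes
    return gt_per_s * efficiency * lanes / 8

print(f"PCIe 3.0 x8 : {pcie_bandwidth_gb_s(3, 8):.2f} GB/s")   # ~7.88 GB/s
print(f"PCIe 2.0 x16: {pcie_bandwidth_gb_s(2, 16):.2f} GB/s")  # ~8.00 GB/s
```

So a 3.0 x8 link really does land within a couple of percent of a 2.0 x16 link.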
AMD CPU with Nvidia GPU would be even better
Apparently lots of people failed to distinguish between the GTX 1060 and GTX 1060 Max Q.
Lots of people? I haven’t seen a comment on it yet.
This just shows how hard it is to work with Nvidia.
not the case at all. but this shows how much of a threat Nvidia is to Intel that they're willing to use an AMD GPU in their product.
Renzc ode nvidia and Intel don’t compete in the consumer market.
probably they don't, but Nvidia is becoming a serious threat to Intel in the professional market. where does the money Nvidia spends on R&D come from? the majority comes from their consumer gaming GPUs. to halt Nvidia's advancement they need to disrupt Nvidia's cash cow first.
Intel had to pay NVIDIA 1.1 billion for a license.
Nvidia ceo says that soon gpus will replace cpus.
The CES spam has begun.
SQUARESPACE, TUNNEL BEAR. Remember Phantom Glass?
i only watch ltt for these events and Jay if he comes out with any vids, the others are meh
Love u LinusTech. Your voice really shows so much more enthusiasm than any other tech reviewer.
For God's sake, fix the sticker on the front of the unit!
Intel is cutting all costs including new sticker logo designs.
Glad I'm not the only one who's irritated by the tilted sticker that's sticking out 😂
The last one had a removable skull thing, assuming this will too
EDIT: wait that definitely looks unremovable
i thought it looked cool
George S
I don't think he's talking about the design, rather that it's stuck on crooked.
They all come with Spectre and Meltdown, No extra charge!
I will be buying this to upgrade to Spectre and Meltdown, I heard they were fantastic features!
You don't really have a choice since intel, amd and even arm CPUs were affected lol
Dr. Zentron
AMD's CPUs are currently the safest. Not affected by Meltdown, and Spectre only works under Linux with a non-default BIOS setting.
So implying that AMD's CPUs are as affected as Intel's CPUs is incorrect.
In theory Intel could fix that for this. But dunno, maybe a bit late for extensive hardware revision...
@Dr.Zentron Of course you have a choice. Just don't buy any hardware until fixed CPUs are available.
that transition to the advertisement was amazing
It's like growing oranges on an apple tree... It's like running Android on an Apple phone... It's like eating an apple with a fork... IT'S NOT POSSIBLE... idk why I referenced apples so much...
You can eat an apple with a fork, you will get some weird looks but it's possible.
You can run android on an apple phone
also android on apple
Also, grafting branches from different species of trees onto each other isn't a new innovation; it's been done for many years.
Look for 'fruit tree grafting'
maybe you are secretly Tybalt...
If this thing can actually perform as well as a 1060, it will be the ultimate portable VR machine
jaygon12 it's superior to a 1050 but not better than a 1060; the 1060 Max-Q is a thinner but worse 1060
I think my Skull Canyon can sense my sudden desire to see other people.
We didnt even get skull stickers with skull canyon ;-;
I was just about to get a Skull Canyon and then I heard about the Hades Canyon NUC and decided to wait... but I would have been in the same boat as you if I had decided to get the Skull Canyon.
I mean I guess if you can't beat em, join em.
Dylan M I don't think that really goes for both sides
if you can't beat them because they break their agreements with you, tie you up with legal fees for years, steal your designs, and then pay companies up to one billion yearly to only use Intel, then join 'em.
Iirc, what AMD did was split their CPU and GPU divisions or something, that's why this is possible.
Don't quote me on that, though.
AMD beat Intel with Ryzen, and can compete with Nvidia on graphics...
Intel can't compete with Nvidia on mobile graphics nor with Ryzen, so they join to fight a common enemy and Nvidia's monopoly, nothing else.
Their technologies are still separate.
better the devil you know (AMD), than the devil you don't (nVidia)...
"This thing is loaded for bear. Speaking of bear ..." ?
there's only one og bear, my hero...pedo bear
*T U N N E L B E A R O F F E R S...*
Lazy-ass transition.
Bear in London slang means a lot, so maybe a clever reference?
Nice Linus! Great to see we're back to the same old awesome content
Intel and AMD working together is like watching your cousins make out. Before, you thought it would never happen; it was stupid to think it would. But now that it's happening, you realise it's better than you ever thought.
it's nice to see amd working with intel. I hope they won't disappear tho
Wait
Roll tide?
Advanced Intel Devices
mmm. AIDs... lol
And the CES videos begin
I’m glad your enthusiasm matched mine!
Congrats on 5 million subscribers
Well deserved #LinusTechTips
Me &MyNightcores jiren the gay
That's really cool. I guess now we won't have to choose. The color would be a weird mix of both red and blue
Naman Goyal
I think that's called purple
In America we call that color purple
aidan c. It's a new color, i will call it redlue
No, Bled.
This thing is perfect for a vr backpack!
Antoine Richermoz this is not powerful enough for a decent VR experience.
ThePotatoJuicer, it has 2 Thunderbolt ports
it has the power of a GTX 1060
VR backpacks... I never understood them.
"oh woah so cool look at me I'm playing VR"
*trips on something
"guys do you think i broke the pc"
sanchy panchy the point of a vr backpack is no cables so you don't trip.
With how exciting it was I'm surprised Linus didn't drop it.
It's tiny enough for Linus to hold firmly with his hands
it warms my heart to see two rival companies partnering on something
I believe it was a design buyout for that specific GPU.
isn't it always nice to see big companies come together to become a bigger monopoly
WarmSoftKitty you mean like this little company named Microsoft that bought a shit ton of the AMD Epyc server CPUs?
WarmSoftKitty you said AMD is a joke while it clearly isn't. Ryzen and Epyc are even the same architecture, you are just an Intel fanboy
WarmSoftKitty they don't exist to make sure that Intel isn't a monopoly, they exist to make CPUs and GPUs. as for whether or not they're a joke is personal opinion, but their CPUs aren't bad by any means. they practically invented multi-core processors (i think, honestly idk.) it still warms my heart tho
edit: okay it was IBM but AMD was still ahead of Intel in making one of the first dual-core processors
$799 or $999 without storage and RAM... Damn it Intel, you were that " " close to a perfect product!
There is a deal on Newegg where you can get it with a 120 gb SSD for $699.99
Just ordered mine for $860 on Newegg, 2x 8 GB ram and 250 GB ssd altogether. Just gotta look for discounts
@@SZF123456 How is the performance?
@@RohanAdvaniRaju It's been pretty solid. When I have a minute to play some games the Crash Bandicoot and Spyro remakes look fantastic. RE2 hits 60 fps no problem as well. It's also doubled as my Plex and Pi-Hole server using the extra NIC onboard. I know it's kind of a niche thing but if you know how to get the most out of it I know it'll last me a long time, even when games start surpassing the tech with streaming/server capabilities alone.
@@SZF123456 Thank you for your prompt reply, I however would buy something portable as I am going to shuttle between work and home. This is an Uber cool fast performance 💻👌 but I'll wait.
The power of a 1060???
Holy balls
Anyone remember when phantom glass used to be the sponsor many events ago?
I love how after all these years, the money, popularity, LMG, etc...Linus still geeks out so hard, just like back in the NCIX days. Go Linus!!!!
r/AyyMD needs to have a word with shintel
Last I checked, someone there said that it was worth it, because the Vega would be preventing the Shintel from going nuclear.
Finally, CES videos
GG for 5 million subs!
It's like Apple and Google making a smartphone together...
That would be awesome....
Google only started making smartphones in 2016 with the pixel.
Before that they only made android software.
They have been "making" smart phones since Jan 2010. The first Nexus is 8years old now. They are doing the exact same thing with the Pixel except they dont let the manufacturer keep their name/logo on it. The Pixel XL 2 is made by LG and the Pixel 2 is made by HTC.
Designed by Google made by Apple or vice versa
Or having a Google Pixel built by Samsung, which is pretty much a wet dream.
iOS with Android capabilities... all I want in iOS is to be able to put my songs and videos on it xD
Cool, I've been waiting for this for like 3 months
This thing would be amazing for a VR backpack! Gotta do a 2.0 of a DIY one when you can get your hands on one guys! All you need is a battery and you'd be golden!
I think Intel must be butmad at Nvidia for beating Xeon Phi into the ground, and must team up with Radeon to get back at them.
Nvidia probably demanded a lot more money to team up, so they went with AMD
I would guess they demanded an X86 cross license lol
May the Science be with You honestly I don't think Nvidia demanded more money; it's more that Intel needs to disrupt Nvidia's cash flow to stop Nvidia's advancement. The thing about Nvidia is that their gaming segment is very successful and has been able to keep funding their professional solutions. If Nvidia relied solely on revenue from Tesla and Quadro to develop chips like Volta, they might not be as successful as they are now. Nvidia doesn't need to invest too much on the gaming front, but they get massive revenue from their gaming business. To disrupt Nvidia you need to disrupt their gaming business first.
Renzc ode // I'm thinking nVidia also has a lucrative market in the mining department. Say what you will about AMD cards being better at it in a lot of cases, if it can mine, miners will buy them.
you don't earn money with miners. They sell their cards quickly again, flooding the market
so it begins
Thanks for the Synergy link! I was just thinking of buying a copy :)
but can it run crysis?
TRon why not?
it's a meme
yes with cinematic experience
Oh God, stop with the outdated jokes! It was funny 4 years ago.
TRon but can it run 1'000'000 exploding tnt in minecraft?
Can you make a video in the future using it for VR in a cordless setup linus?
Thanks, could be beneficial if it is that powerful and small. Could be strapped to the belt.
Interesting video. It all sounds really promising. I'll love to see a review as soon as Linus can get his hands on one.
Team work team work everybody do your share...
Intel: will it get me out of the trouble for the CPU glitch?
Amd: *SIGH* do i have to work with intel? :p
Nvidia: f*** this s*** i'm out
AMD: will it give me back the respect I deserve after being hammered by Nvidia in 2017?
Nvidia: well time to make new gpu... 1 week later ... hey we got a new gpu a gtx 1090it
im just here for that rgb
NUC...More like *NUT*
*Smashes button*
*NUCC*
Linus should never have started doing cocaine :S
lol thank fuck! ive been scrolling through the comments for aggges looking for cocaine comments and I've only seen yours.
Lol why?, Explain it to me!
misk one why?
why what? lol he's hyped up as fuck, looks like he's on some mad coke...whats not to understand?
What....best collaboration! Razer jump on this!!
never heard of NUCs before but upgrade time is in 2018. for $999.99 this might be an option, and by adding RAM, SSD, etc. you still get that rewarding feeling of "building your own rig". sorta :D
Meanwhile the Sandybridge era Dell Inspiron I’m watching this on gets 25 frames on Minecraft...
MTNG noob
well, my Dell Precision M6600 has a Sandy Bridge i7 too, but it runs CSGO at 230+ fps lol.
DUDE it's a dell!
The Sandybridge Lenovo X220 my mate has can only run Hearthstone at 1080p on minimum settings without significant framerate issues. Those were great CPUs, not so much with the integrated graphics...
MTNG my sandybridge hp envy has a gt 630 it can run battlefield 4 at 50 fps low settings
How much coffee has Linus had? I'm sure it's loud there, but I can hear him echoing for half the video lol
James Fuston Probably loaded up on G-fuel before going!
He probably got a COFFEE LAKE'S WORTH HAHAHAHA I'm done
James Fuston A lake of coffee.
I finally got to meet you in person at CES! I was working at the Inada massage chair booth! Dream come true haha
Glad to see AMD branching out in the gaming space. Their processor and graphics tech has so much potential beyond just desktop usage.
The GameCube, Wii and Wii U all used ATI/AMD graphics chipsets.
The ps4 and xbox one as well...
AMD dominates the console gpu market, basically a monopoly if not for the switch, and it's a glorified mobile phone. That came because of ATi, who were known for making gpus for consoles before AMD bought them. The problem is, there is very little profit made per console, since consoles primarily make their money on game licensing, not console sales. It is why nvidia always loses the fight, they care about profits first and foremost, which is incompatible with console producers, who are trying to cram as much power as possible into a box for as cheap as possible.
The reason nintendo even went with nvidia is because the switch was on Arm. Nvidia was the only valid solution on arm.
Would be cool to see a PC GPU that uses the tile rendering tech that their console GPUs use. AMD's console GPUs share RAM with the CPU and have a chunk of ultra-fast on-die RAM which they use to render the scene out in tiles, collating the rendered tiles back to the slower system RAM for display on the screen. Imagine if you didn't have to worry about how much memory your video card has because it uses the same memory as your CPU, and instead of a card you just installed the GPU in a second CPU-like socket (which would be cheaper because it's just a chip, or alternately a more powerful GPU for the same price as a full card), and you just put 32 gigabytes of memory in your motherboard (you'd probably need quad-channel memory for a shared-memory system like this, though). Multi-GPU setups would probably also work better, because with the GPU rendering the scene in tiles by design it would be easier to split up the load between multiple GPUs by allocating alternate tiles to each GPU.
EDIT: I just remembered that Rendition had this idea in the early 2000's, they called it Socket-X but unfortunately nothing ever came of it.
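The alternate-tile idea above is easy to picture with a toy sketch (purely illustrative Python, no real graphics API involved): chop the frame into tiles and hand them out round-robin so each GPU gets roughly half the screen.

```python
# Toy sketch of alternate-tile multi-GPU load splitting (no real graphics API used).
from itertools import cycle

WIDTH, HEIGHT, TILE = 1920, 1080, 64

def tiles(width: int, height: int, tile: int):
    """Yield (x, y, w, h) rectangles that cover the whole frame."""
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            yield x, y, min(tile, width - x), min(tile, height - y)

# Round-robin assignment: GPU 0 gets tile 0, GPU 1 gets tile 1, and so on.
assignment = {0: [], 1: []}
for gpu, rect in zip(cycle(assignment), tiles(WIDTH, HEIGHT, TILE)):
    assignment[gpu].append(rect)

for gpu, rects in assignment.items():
    print(f"GPU {gpu}: {len(rects)} tiles")  # each GPU ends up with about half the tiles
```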
OMG - if people only knew what they were talking about in posting chit... "branching out in the gaming space"? lmao
But is it vulnerable to Meltdown and Spectre?
Mystic Bardock LSSGSS depends on the build version of Windows
sure it is
Yes. Although the Linux and Windows teams have released patches to work around them, the CPUs themselves are built with the vulnerability inside, and it's going to be a couple of years before manufacturers release chips with the vulnerability corrected in hardware.
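If you're on Linux and curious what your own machine reports, a minimal sketch (the sysfs directory below exists on kernel 4.15 and newer; older kernels simply don't have it):

```python
# Minimal sketch: read the Meltdown/Spectre status the Linux kernel exposes via sysfs.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

if vuln_dir.is_dir():
    for entry in sorted(vuln_dir.iterdir()):
        # e.g. "meltdown: Mitigation: PTI" or "spectre_v2: Vulnerable"
        print(f"{entry.name}: {entry.read_text().strip()}")
else:
    print("No vulnerabilities directory; kernel older than 4.15 or not Linux.")
```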
yep
That's an added feature.
This reminds me of when Sega started releasing games on Nintendo consoles. You'd never imagine something like that happening back when they were at each other's throats during the 90's.
This is madness!!! What is going on!?! It's like that time when Hitler and Stalin decided to split Poland.
PrincepsComitatus Lel
We all know how the story continues tho. Good analogy.
looks as tasty as a tide pod
Jake Doll Kill this meme please
Congrats on 5M subs!!!!
wtf optane? it's not even needed when you've got M.2 SSDs
Nicholas Lau it's Intel stuff, it's up to Intel what they want to do, dude.
standalone optane is faster.
M.2 doesn't always mean NVMe speeds. One of those ports could just be regular SATA3
Your profile pic fits the comment
So will intel be putting sata m.2 ports on their flagship nucs?
One Step Closer To The 2 Dawgs Being A Couple.
Looks like a decent mini-PC that can be set up to push a bunch of monitors for vestibules and company lobbies. Especially if you are pushing something 3D and not just slides.
You know how much shit they could accomplish if they work together.
I shit 3 - 5 times a day sometimes 6 or 7. If they hired me we would get a ton of shit done.
FatherBootyHands I take laxatives I shit liquid
shitting liquid is not shit, it's bum piss
Now perhaps Intel and AMD fanboys will not fight or... will they?
I'm from the future, they are
From what we know now?
Intel is lagging behind, and Nvidia is being stupid.
AMD are making Ryzen 7 3700X 12-core/24-thread CPUs that can turbo to 5.0 GHz for consumers (4.2 GHz base), while the server and corporate market gets the Ryzen 9 3800X 16-core/32-thread at 3.9 GHz base and 4.7 GHz turbo...
Well, idk why I say overclock. AMD CPUs in the X series (3800X, 3700X, etc.) are unlocked chips, and AMD's tech lets the cores run free rather than needing to be turboed, as the X series is the second version of their locked units.
So the R7 3700 runs at 4.6 GHz while overclocked, not turbo. Turbo is designed to give free-floating CPU power, with 3.8 GHz standard. The 3700X has its max turbo speed set to 5.0 GHz because you are not overclocking the chip to that level; the CPU will hit that level when the power is needed, rather than damaging your CPU by sitting at 5.0 GHz all the time. So the max safe zone is 4.6 GHz on the standard 3700.
Only 105 W for the 3700X as well.
Fun facts.
If you want to use that term? Then sure.
Looks like an AMAZING consumer, low-priced VR CPU.
TL;DR? AMD CPUs overclock themselves when CPU power is needed. They will float up to their max clock speed and back down after the process in question is finished.
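The float-up-and-drop-back behaviour described in that TL;DR can be pictured with a toy sketch (not AMD's actual boost algorithm, just an illustration of opportunistic boosting): the clock ramps toward the max boost only while there's work to do, then falls back to base.

```python
# Toy sketch of opportunistic boost (illustration only, not AMD's real algorithm).
BASE_CLOCK_GHZ = 3.8   # example base clock
MAX_BOOST_GHZ = 5.0    # example maximum boost clock
STEP_GHZ = 0.2         # how fast the clock ramps per tick under load

def clock_over_time(load_timeline):
    """Given a per-tick busy/idle flag, return the simulated clock for each tick."""
    clock, history = BASE_CLOCK_GHZ, []
    for busy in load_timeline:
        if busy:
            clock = min(MAX_BOOST_GHZ, clock + STEP_GHZ)  # ramp up while loaded
        else:
            clock = BASE_CLOCK_GHZ                        # drop straight back when idle
        history.append(round(clock, 1))
    return history

print(clock_over_time([True, True, True, True, False, False]))
# -> [4.0, 4.2, 4.4, 4.6, 3.8, 3.8]
```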
That price is just insane! You could get a laptop with similar performance and RAM, storage, screen, keyboard and a battery included for that price!
Link said laptop.
*10 proofs that your life is a lie*
BezNicku synergy is life
"Proof" isn't a countable noun dude
Unless you're talking mathematical proofs, then you can.
Why would you put optane in this thing if there's no space for HDDs anyway
so you could sell it to idiots - EA style
Type c thunderbolt storage docks. Also the ethernet connections.
I don't know why anyone using one of these, rather than a more suitable tower, would go for one of those things though.
haven't seen Linus so excited in a while XD!
"It's comparable to a GTX 1060 Max-Q" Fuuuuck, I just bought a laptop with a 1060 in it...
Max MaxQ is actually like 30% weaker so don't worry too much...
Accelerator But the Laptop is thick af and heavy
Accelerator
15-20%
There is a big difference.
Don't worry, the 1060 will serve you just right. First-gen products always have some issues
Laptop is still more portable anyway. This thing is small but you'd have to carry around peripherals and a display.
I just came watching. Ty linus
yassi anam show
Good thing to know: yellow USB ports are power ports. They will charge your devices and such even if the PC is turned off, as long as the power cord is connected
yeah the nuc is cool and all but what about Synergy *2*
It happened because AMD's CPU and GPU divisions kind of split; they talked about it on the WAN Show, and that's what lets this happen.
No, it's the semi-custom program. Intel could still have done that with the full cooperation of AMD before the split of AMD and Radeon.
If I remember correctly, it's exactly how the Xbox has worked since the 360.
Intel Cpu with amd iGPU.
Alexandre Loens The Xbox One X's Scorpio Engine SoC is completely AMD, both the CPU and GPU. The CPU is not quite Ryzen, but the GPU is basically a semi-Vega.
And the 360 had an IBM CPU and an ATI GPU (released a year before ATI's acquisition by AMD).
The XBox One X is (altered) Polaris, not Vega.
The CPU is Jaguar, though also a bit altered.
Can't believe I'm saying this, but I'm thinking of getting one when it launches; it just hits the right buttons with all the I/O support and integrated hardware.
Linus just had a portgasm
congrats on 5 mil
Thanks for showing the cooling solution
0:42
christ chan my mind after reading intel and amd working together
Linus just came all over it..
Tryden D'Souza it did look like Linus was going to orgasm over it...
Bought Synergy 2 - it's awesome - so much better than the first iteration. Very nice config now. Well thought out.
intel team up with amd gpu and nvidia team up with amd cpu
Andrew Perry I think no one wants the Nvidia chip except end users.
That would be hilarious 😂
TehFalcon101 plot twist: nvidia tegra
TehFalcon101 and 64bit owo
TehFalcon101 k