"You can't noclip into videos because it's not possible" *Repeatedly shows footage of actually doing it* There may have been some deliberately misleading phrasing there, but the whole thing barely feels like a lie.
The program he's using, "Gaussian Splatting", allows the user to inspect a single frame of a video file by "rendering" it as if it were a three-dimensional environment. However, it isn't magic. Data not fully visible in the frame being inspected will either be misrepresented, missing, or in the case of larger amounts of invisible areas, blurred out violently. It's why he can look around the corner of the room almost flawlessly, but turning around or moving a lot to look at what isn't in the video anymore results in an abstract painting.
@@Scoped211. Noclipping into mp4s and not games 2. This video has near no explanation of whats happening, though the creator of the video normally makes educational vids
Basically an algorithm tries to create a 3D scene from the still frames of the video. Like a crude 3D scan. This data is then explorable in a 3D environment.
@@borstenpinsel it's actually a pretty advanced 3D scan, I was reading about it for school last year. if you take your own photos from multiple angles or find a high-quality video to work with, you can get almost perfectly photorealistic renders in real-time from most angles of the subject. it's cool stuff
@@meILM yeah that sounds right. i was given a recommendation to drop this video at noon today (i was late lol) so i suspect some people received a wall of youtube-sponsored videos in their subscription feed today
The weird thing is that many of us can model these places in our heads, even from an extremely low quality recording. In fact, the quality may not even matter up to a certain point. This means that this concept is actually possible, just not very possible *yet*.
I think our brain is really good at making us feel like it knows the structure, but it doesn't because it's not needed, try rotating a more complex 3d model around, for example of human head, you can't predict the position of features super precisely until you see them (for example draw a dot where it should end up). But it still feels normal and good
@@madghostek3026Counterpoint: the consistency of shape and perspective in dreams, which is not spoiled by lucid dreaming. Things morph, but you can move around no problem.
Theoretically, pulling data from a low resolution video should be possible because of the motion and extracting the data from variation in the pixel brightness
Inside a JPEG artefact world someone is taking a photo of someone else “make sure to get my good side”, the good side being the very specific angle they aren’t a disjointed blur spread across a large distance
absolutely... AWESOME!! i find gaussian splatting and other new technologies awesome, and i really enjoyed how this concept of using it on real videos was executed! props to you, GST. you just earned yourself a sub :D
I'm sorry I was expecting you to go in-depth on like what mp4 codec is and see what the files store inside behind the scenes and visualize it. I was not expecting you to GO IN FIRST FUCKING PERSON and just BEGIN FLYING AROUND. HOLY SHIT LOL
This is what I call trickster tech. It was never designed to truly generate interactivity in video clips, it was only designed to emulate it from the outset, literally emphasizing and digitizing the "fake it until you make it" phrase. A gimmick; nothing more.
So I might be off with this thought, but is this substituting the idea that a video is images over time and changing it to images over area. Stitching together the frames like how a panoramic shot works. So moving objects/people would cause blurring and other artifacts along with any issues caused by not having a stable frame of reference.
You are a total artist, you always inspired me to make videos, to upload my music, but in the end I never do more than nothing... Anyways, love your content since Early Sega Genesis Mixes
I'ma be 100% honest I have no clue how I got on this video I set my phone down on Instagram then all of a sudden this was playing but I'm too invested to leave so take my sub
Great to see people appreciating this. I've been doing this with AGIsoft and Reality Capture for years and LumaAI's Gaussian splatting model does a great job on those cases where the data is too compressed to provide consistent data for photogrammetry.
I once had a dream where i was in a park area next to a place i used to work (in real life). I suddenly fell though the floor, and was sudden carted sideways like i had landed on a fast moving conveyor belt, being able to look up and observe the land i used to occupy. I woke up suddenly after a couple of moments, as the effect left me feeling like i couldn't breath. I wish my experience of noclipping was nearly as colourful as this.
So what I'm getting from what I have seen with my eyes, you play a video, and then that video gets turned into a sort of 3D model that you can look around?
@@GSTChannelVEVOmaybe with a high-resolution video the first time round it might give some interesting results? almost sounds to me like 4D but i'm not a Physics Man (yet) so i'm not sure... do you have a link to the program that you could put in the description?
The other day I found a video I shot with a Nokia 6300 on an old HDD. It's 176 x 144px and 60kbps (yep). We've come a LONG way. Do you know RAW (DNG) image format? You can do wonders with a phone that supports it (Galaxy S line of phones, for instance). Just load that file in Lightroom or Adobe Camera Raw and you can literally make invisible things visible. Also, it's free of post-processing that butchers regular JPGs.
I stumbled across that format when looking for a way to store images with a more-than-8-bit color depth per channel, but haven't actually used it in a camera. seems magical
@@GSTChannelVEVO It is. You can take a picture of a musician on stage in harsh blue light, remove the blue tint and make his face look like a face again. You can take a picture of a dark alley at night and make it look almost decent. There's no noise removal or edge contrast enhancement unless you add it yourself. Everything's inherently sharper. Just make sure that the phone or camera shoots in this format because not all of them do.
4:20 This imagery had me thinking. The theory that we're all living in a computer program has existed for a while; the idea that we're either a simulation, or creation by some higher being, played out like a world sized life simulator. Now, I have another thought. What if we ARE in fact simulated, but let's look at it from a less human-centrist perspective. What if we're not the reason for the simulation, but a byproduct? At the timestamp shown, all I see are galaxies, and stardust, and what looks like a whole damn universe making up that image... What if Earth, us, our solar system, and everything else is just some computer placing random blobs around to try and give of the most specifically correct color to a specific pixel? Our solar system could be the difference between one unite of blue luminescence once whatever this supposed algorithm is computing compresses everything into the image it's attempting to display, by layering these 3 dimensional objects it's roughly throwing around at various depths. And worse of all, this could be an early attempt at a generation of this image! We might be thrown out as a useless blob next cycle! Maybe earth threw the color off more than another iteration and we're dropped! I don't BELIEVE this, it's just interesting to think about.
This is incredible! As expected, the github is entirely incomprehensible and there's no installer or anything, because why would you want anyone to actually be able to use the thing? It's like a car company saying here's your car! Just assemble all the pieces without any instructions!
TBH even though I know techniques like gaussian splatting is present, I never imagined that it can technically be used to noclip through videos. Thats pretty impressive. Would be really interested in more videos like these.
we all know this is all just some nice Gaussian Splatting! but the idea of using this technique to generate a 3d perspective from other peoples videos is just so cool and I think I thought of this idea but never really done it before also I think it should be called freecam instead of noclipping but idk
when i was a kid i always wanted to do somekind of noclip on windows 7, like i always wondered what is behind of the windows that our computers display
This completely messed with my head. It’s like that enhance scene in Blade Runner but in real life. I’m not sure if I should be scared or impressed how soon such a thing has become a reality. Edit: I should clarify what I mean by it being scary. It’s more so the fact that archived footage has been used to be turned into a 3D space. It gives it this uncanny feeling that I cannot describe. I am well aware this technology is nothing new. In fact, Lidar has been around for a long ass time. But that still doesn’t stop me from being impressed on what it’s capable of.
This is the opposite of an educational video: one where you keep trying to figure out what the fuck the creator means.
I've done so many well-researched and educational videos. I thought it was time for a change. Time to undo all of that. To uneducate the world.
"You can't noclip into videos because it's not possible"
*Repeatedly shows footage of actually doing it*
There may have been some deliberately misleading phrasing there, but the whole thing barely feels like a lie.
A deducational video?
@@GSTChannelVEVO keeping mystery alive
@@SianaGearz video that makes you forget what you were taught, wow!
For anyone wondering this technique is called Gaussian Splatting
lol, busted.
i've been infatuated with what you can do with gaussian splatting. and even just COLMAP structure-from-motion
Thanks! I've heard of that but had no idea this is what it was in practice.
What tools do you use to do this?
@@therealquade graphdeco-inria/gaussian-splatting! nothing fancy beyond careful framing.
@@GSTChannelVEVO Hoo I have friends working at INRIA 😲
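For anyone who wants to actually try what's described in this thread, here's a minimal sketch of the video-to-splat workflow in Python. It assumes ffmpeg and COLMAP are installed and that graphdeco-inria/gaussian-splatting has been cloned with its environment set up; the convert.py / train.py script names and the -s flag are recalled from that repo's README, so verify them there, and the file and folder names are placeholders.

```python
import subprocess
from pathlib import Path

VIDEO = "clip.mp4"                     # placeholder: any parallax-rich video
SCENE = Path("scene")                  # working directory for this one scene
REPO  = Path("gaussian-splatting")     # clone of graphdeco-inria/gaussian-splatting

# 1. Dump frames. A couple per second is usually plenty, and fewer frames
#    keeps COLMAP's feature-matching step manageable.
(SCENE / "input").mkdir(parents=True, exist_ok=True)
subprocess.run(
    ["ffmpeg", "-i", VIDEO, "-qscale:v", "2", "-vf", "fps=2",
     str(SCENE / "input" / "%04d.jpg")],
    check=True,
)

# 2. Structure-from-motion. The repo's convert.py wraps COLMAP and produces
#    the camera poses + sparse point cloud that training starts from.
subprocess.run(["python", str(REPO / "convert.py"), "-s", str(SCENE)], check=True)

# 3. Optimize the Gaussians against the registered frames.
subprocess.run(["python", str(REPO / "train.py"), "-s", str(SCENE)], check=True)

# The trained model can then be loaded in the repo's viewer (or any splat
# viewer) and flown around freely -- that's the "noclip" in the video.
```

The result lives or dies on step 2: if COLMAP can't register the frames (too little parallax, too much motion blur), nothing downstream will save it.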
i have no idea what is happening. this must be what it feels like for old people to open a PDF
lmao
The program he's using, "Gaussian Splatting", allows the user to inspect a single frame of a video file by "rendering" it as if it were a three-dimensional environment. However, it isn't magic. Data that isn't fully visible in the frame being inspected will be misrepresented, missing, or, where large areas were never captured, smeared into a violent blur. It's why he can look around the corner of the room almost flawlessly, but turning around or moving far enough to look at what isn't in the video anymore results in an abstract painting.
@@arthurbarbosa8204 great explanation, tysm!
This video feels like it's from a parallel universe for 2 separate reasons.
Which are? I'm curious now.
@@Scoped21 1. Noclipping into mp4s and not games
2. This video has almost no explanation of what's happening, though the creator of the video normally makes educational vids
bro said "noclipping into MP4 files" and then actually did it
Brand new sentence
The irony of people WITH YouTube Premium sitting through a two-minute ad about YouTube Premium hearing you say "you will never see ads"😂
And I never do.
🤔
@@JB52520 wowe u passed out from 2:00 to 3:20
sponsorblock?
yeah that occurred to me, lol
I tried to clearly mark where to skip to, and failing that, sponsorblock works wonders ;)
@@GSTChannelVEVO it's user submitted and it works rn
This explains surprisingly little despite the fact that almost every spoken line seems to be framed as if it were an explanation, kinda impressive.
Basically an algorithm tries to create a 3D scene from the still frames of the video. Like a crude 3D scan. This data is then explorable in a 3D environment.
@@borstenpinsel it's actually a pretty advanced 3D scan, I was reading about it for school last year. if you take your own photos from multiple angles or find a high-quality video to work with, you can get almost perfectly photorealistic renders in real-time from most angles of the subject. it's cool stuff
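To make the "crude 3D scan" idea above a bit more concrete, here's a toy sketch of what the reconstructed data is and how a view gets rendered. This is not the real 3DGS renderer (which uses anisotropic covariances, spherical-harmonic colors and a tiled GPU rasterizer); every name and number below is made up purely for illustration. The scene is just a pile of colored, semi-transparent blobs in 3D, and a rendered view projects them through a camera and alpha-blends them front to back.

```python
import numpy as np

rng = np.random.default_rng(0)
N, W, H, f = 200, 160, 120, 120.0        # number of blobs, image size, focal length

centers = rng.uniform([-1, -1, 2], [1, 1, 6], size=(N, 3))  # xyz in front of the camera
sigmas  = rng.uniform(0.02, 0.10, size=N)                   # isotropic world-space size
colors  = rng.uniform(0, 1, size=(N, 3))
alphas  = rng.uniform(0.3, 0.9, size=N)

image = np.zeros((H, W, 3))
transmittance = np.ones((H, W))          # how much light still passes through each pixel
ys, xs = np.mgrid[0:H, 0:W]

for i in np.argsort(centers[:, 2]):      # composite the nearest blobs first
    x, y, z = centers[i]
    u, v = W / 2 + f * x / z, H / 2 + f * y / z   # pinhole projection of the blob center
    s = f * sigmas[i] / z                          # projected radius in pixels
    footprint = np.exp(-((xs - u) ** 2 + (ys - v) ** 2) / (2 * s ** 2))
    a = np.clip(alphas[i] * footprint, 0, 0.999)
    image += (transmittance * a)[..., None] * colors[i]
    transmittance *= 1 - a

# `image` is one view; move the camera and repeat, and you can "walk around"
# inside data that was recovered from a flat video.
```

Training is just the reverse: nudge the blobs' positions, sizes, colors and opacities until views rendered this way match the actual video frames.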
This video is Palworld and I'm Nintendo standing at the edge of the line with my legal team.
Jk
ruh roh :D
They're always just there a few thousand units away from everything else, waiting to be teleported in when someone hits the Shigeru trigger
“This video is Palworld and I’m Nintendo edging the line with my legal team”
So you are standing at the edge of a *_boundary?_*
@@GSTChannelVEVO spirits?
Gaussian splatting reminds me of those art displays using random trash and detritus to create an image when viewed from the right angle.
a lot of neon feathers and pipe cleaners!
oh man, calling this noclipping and implying that you could find stuff out of bounds was really getting my goat.
seems this is just a gag though.
hehe, yeah. I might have a strange idea about what constitutes a "joke". sorry about your goat. :P
unfortunately there's only shitty unprotected ram out of bounds, so you couldn't get your goat even if you could noclip.
@@generallyunimportant ranching simulator game but the animals are rogue AIs, and if you die in jpeg you die irl swordartonline style
I love the premise and deadpan delivery, this is a comedically genius application for the technology lmao
Huh, never thought YouTube would sponsor a video.
They seem to be sponsoring a whole bunch of medium-sized YouTubers all of a sudden.
@@meILM yeah that sounds right. i was given a recommendation to drop this video at noon today (i was late lol) so i suspect some people received a wall of youtube-sponsored videos in their subscription feed today
It's a better approach to shifting people to premium than what they've _been_ doing...
I've only ever seen 2 YouTubers get sponsored by YouTube: Hunter R. and him.
I got recommended this video instead of sponsored
The math going on to make this work is so far beyond my comprehension it might as well be magic.
This is probably what it would feel like for a 4th-dimensional being to visit our 3-dimensional universe
Gaussian Splatting renders are just braindances from Cyberpunk but right now
exactly what i was thinking
Not to be mixed up with the rapper Gaussian, who is also a mathematician who does gaussian spitting
I was literally about to make a comment about how this looks a lot like braindances, but it seems like you beat me to it.
I thought this was gonna be a shitpost at first, but it turned out to be a super cool look into gaussian splatting. S-tier Boundary Break parody.
heh, welcome back to boundary bre- how in the hell
The weird thing is that many of us can model these places in our heads, even from an extremely low quality recording. In fact, the quality may not even matter up to a certain point. This means that this concept is actually possible, just not very possible *yet*.
I think our brain is really good at making us feel like it knows the structure, but it doesn't, because it doesn't need to. Try rotating a more complex 3d model around, for example a human head: you can't predict the position of features super precisely until you see them (for example, draw a dot where a feature should end up). But it still feels normal and good
But that would mean advancing AI and that could lead to dangerous consequences.
@@madghostek3026 Counterpoint: the consistency of shape and perspective in dreams, which is not spoiled by lucid dreaming. Things morph, but you can move around no problem.
@@TheBcoolGuy How reliable is your source of information? Yeah, not at all.
@@Quitobito we're talking about theoretical things wtf did you expect
title: "noclipping into mp4s"
me: oh like, de-compiling mp4s to figure out how they work
CTS: *starts noclipping through mp4s*
Theoretically, pulling extra detail out of a low-resolution video should be possible because of the motion, by extracting data from the variation in pixel brightness across frames
I've heard of this but I've never actually worked with it 🤔
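A crude illustration of the idea a couple of comments up, assuming OpenCV is installed: small camera motion means each frame samples the scene at slightly different sub-pixel positions, so aligning frames and stacking them recovers noise headroom (and a little detail) that no single frame has. This is plain temporal stacking rather than real multi-frame super-resolution; the file name is a placeholder, and the sign convention of the phase-correlation shift is worth double-checking on your OpenCV build.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("lowres_clip.mp4")      # placeholder file name
ok, first = cap.read()
ref = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY).astype(np.float32)
h, w = ref.shape

acc, count = first.astype(np.float64), 1
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    (dx, dy), _ = cv2.phaseCorrelate(ref, gray)   # global shift relative to the first frame
    M = np.float32([[1, 0, -dx], [0, 1, -dy]])    # translate the frame back onto the reference
    acc += cv2.warpAffine(frame, M, (w, h))
    count += 1
cap.release()

cv2.imwrite("stacked.png", (acc / count).astype(np.uint8))  # averaged, less noisy frame
```

It falls apart as soon as the motion isn't a simple global shift, which is roughly where the splatting approach (which models actual 3D geometry) takes over.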
The sponsor popup looks so alien, I've never seen a YouTube video sponsored by YouTube. Wild.
God it's like a glitched-out braindance, great stuff
It's like watching those super long fractal videos that just keep diving deeper into something
0:54 *Entire world collapses as the still image turns into a video and my tiny babbie brain can't comprehend this magic*
Inside a JPEG artefact world someone is taking a photo of someone else “make sure to get my good side”, the good side being the very specific angle they aren’t a disjointed blur spread across a large distance
would be cool to do this with movies, like that intro city shot in blade runner
it's technically possible, but the results are quite finicky. at a glance, I'm not sure it'd look very good in that particular case.
absolutely... AWESOME!! i find gaussian splatting and other new technologies awesome, and i really enjoyed how this concept of using it on real videos was executed! props to you, GST. you just earned yourself a sub :D
this genuinely feels like magic wtf
This is like braindances on Cyberpunk
I'm sorry I was expecting you to go in-depth on like what mp4 codec is and see what the files store inside behind the scenes and visualize it. I was not expecting you to GO IN FIRST FUCKING PERSON and just BEGIN FLYING AROUND. HOLY SHIT LOL
It seems obvious in hindsight, but I had no idea you were a wizard.
this was one of my favourite gameplay elements in cyberpunk 2077
There's new research coming out like WildGaussians, SpotlessSplats, etc., which are more robust to people in the scenes
ough that's cool. i haven't really been keeping up but i love to see it
YouTube's first industry plant?!
4:08 breakcore album art
Now I'm thinking about how to noclip an mp3
it feels like this video isn’t real, as if to understand what is happening you need context that never existed in this world
This is what I call trickster tech. It was never designed to truly generate interactivity in video clips; it was only designed to emulate it from the outset, literally emphasizing and digitizing the "fake it until you make it" phrase.
A gimmick; nothing more.
What kind of trickery is this
we got cyberpunk braindancing before gta 6
Reminds me of the "enhance" scene in Blade Runner where Deckard is peering around corners in a photograph.
That one SCP article about the anomalous DVD of "The Sopranos"
EXACTLYYYYYYYYY
@@lenbonbsides device theory
Fantastic concept for one of these - most people have no idea this is possible
Cheers ^^
Like a Hermetic treatise, this video has much to say in the depths of its poetry which isn't immediately obvious on the surface of its prose.
it's the first time I've seen somebody actually get sponsored by YouTube... interesting
This is like Cyberpunk all over again! 😭
So I might be off with this thought, but is this taking the idea that a video is images over time and turning it into images over area? Stitching the frames together like a panoramic shot does. So moving objects/people would cause blurring and other artifacts, along with any issues from not having a stable frame of reference.
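Taking the panorama analogy above literally is a quick experiment, assuming OpenCV is installed (the file name and frame step are placeholders): sample frames from the clip and hand them to the stitcher, and moving people show up as exactly the ghosting described.

```python
import cv2

cap = cv2.VideoCapture("clip.mp4")
frames, i = [], 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if i % 15 == 0:                    # roughly every half second of a 30 fps clip
        frames.append(frame)
    i += 1
cap.release()

stitcher = cv2.Stitcher_create()
status, panorama = stitcher.stitch(frames)     # "images over area" instead of over time
if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)
else:
    print("stitching failed:", status)         # typical when frames overlap too little
```

The splatting approach goes one step further than a panorama: it also recovers depth, which is why parallax becomes something you can fly through instead of something that smears.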
4:04 POV: Posy's close ups
posy fan alert
POSY MENTIONEDDDD 🔥🔥🔥🔥
You are a total artist, you always inspired me to make videos, to upload my music, but in the end I never end up doing anything... Anyways, love your content since Early Sega Genesis Mixes
I'ma be 100% honest, I have no clue how I got on this video. I set my phone down on Instagram, then all of a sudden this was playing, but I'm too invested to leave, so take my sub
welcome! this is pretty unlike any of my other videos, except in chill vibe. so... beware? :P
Bro noclipped into your phone
@@GSTChannelVEVO I'll be sure to check those other vids out then!
How does one get sponsored by YouTube ITSELF
This reminds me a lot of cyberpunk 2077's braindance editing environment
Using Gaussian Splats to explore into videos as a “no clip” is brilliant. 🤯
Great to see people appreciating this. I've been doing this with Agisoft and Reality Capture for years, and LumaAI's Gaussian splatting model does a great job on those cases where the footage is too compressed to provide consistent data for photogrammetry.
haha for whatever reason i thought you were gonna literally noclip the camera into the video as if it were a 2D plane. Great stuff
VR + this = u are inside of a mp4
cyberpunk 2077 braindances are exactly like this lmfao
I once had a dream where i was in a park area next to a place i used to work (in real life).
I suddenly fell through the floor and was carted sideways like i had landed on a fast-moving conveyor belt, able to look up and observe the land i used to occupy.
I woke up suddenly after a couple of moments, as the effect left me feeling like i couldn't breathe.
I wish my experience of noclipping was nearly as colourful as this.
really took crawling through the screen literally
So what I'm getting from what I have seen with my eyes is: you play a video, and then that video gets turned into a sort of 3D model that you can look around in?
im not high enough for this
Didn't know what to expect from the title, but I got a beautiful and interesting video.
this is what combing through people's memories will be like
**edit:** this sponsorship has expired, just in time for the honey revelations yippee! but thanks for checking it out anyway
what would happen if you ran this video through the no-clipper??? like especially at 4:13 where it's already unrecognisable from the original image
@@saltedmutton7269 oh god that's a cool idea. i think it'd just dissolve into soup but it'd probably be fun
@@GSTChannelVEVO maybe with a high-resolution video the first time round it might give some interesting results? almost sounds to me like 4D but i'm not a Physics Man (yet) so i'm not sure... do you have a link to the program that you could put in the description?
@@saltedmutton7269 good idea. I've added a link to the description
I got the popup! Ty
The other day I found a video I shot with a Nokia 6300 on an old HDD. It's 176 x 144px and 60kbps (yep). We've come a LONG way.
Do you know the RAW (DNG) image format? You can do wonders with a phone that supports it (the Galaxy S line of phones, for instance). Just load that file in Lightroom or Adobe Camera Raw and you can literally make invisible things visible. Also, it's free of the post-processing that butchers regular JPGs.
I stumbled across that format when looking for a way to store images with a more-than-8-bit color depth per channel, but haven't actually used it in a camera. seems magical
@@GSTChannelVEVO It is. You can take a picture of a musician on stage in harsh blue light, remove the blue tint and make his face look like a face again.
You can take a picture of a dark alley at night and make it look almost decent. There's no noise removal or edge contrast enhancement unless you add it yourself. Everything's inherently sharper.
Just make sure that the phone or camera shoots in this format because not all of them do.
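For anyone who wants to try the DNG workflow from this thread without Lightroom, here's a hedged sketch using the rawpy and imageio Python libraries. The file name and parameter choices are just one plausible starting point, not a recommendation from the original commenter.

```python
import rawpy
import imageio

# Open the raw sensor data; nothing has been white-balanced, sharpened or
# tone-mapped yet, which is exactly why there's so much latitude to recover.
with rawpy.imread("dark_alley.dng") as raw:
    rgb = raw.postprocess(
        use_camera_wb=True,    # keep the white balance the camera recorded
        no_auto_bright=True,   # don't let automatic brightening clip highlights
        output_bps=16,         # keep more than 8 bits per channel of headroom
    )

# A 16-bit PNG preserves the shadow detail you'd then lift in an editor.
imageio.imwrite("dark_alley.png", rgb)
```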
This is incredible holy
4:04 loooooove how the pixels almost shimmer
This is what it's like to be a BD editor, isn't it?
woah thats absolutely sick!
Now I REALLY want some kind of game where you can scroll through and explore someone else's memories in this sort of style
oh man such a dope idea
This is actually mad impressive. Wow.
I love how this looks, this is such a neat use for gaussean splatting.
Okay, this is a pretty cool use for NeRFs
4:20 This imagery had me thinking.
The theory that we're all living in a computer program has existed for a while; the idea that we're either a simulation, or a creation by some higher being, played out like a world-sized life simulator. Now, I have another thought. What if we ARE in fact simulated, but let's look at it from a less human-centric perspective. What if we're not the reason for the simulation, but a byproduct? At the timestamp shown, all I see are galaxies, and stardust, and what looks like a whole damn universe making up that image... What if Earth, us, our solar system, and everything else is just some computer placing random blobs around to try and give off the most specifically correct color for a specific pixel? Our solar system could be the difference of one unit of blue luminescence once whatever this supposed algorithm is computing compresses everything into the image it's attempting to display, by layering these 3-dimensional objects it's roughly throwing around at various depths. And worst of all, this could be an early attempt at a generation of this image! We might be thrown out as a useless blob next cycle! Maybe Earth threw the color off more than another iteration and we're dropped! I don't BELIEVE this, it's just interesting to think about.
Pardon the ramble, but I've been writing* a sci-fi story with a similar premise!!!
_*slacking off on_
@@quantumblauthor7300 That sounds super cool! I hope your writing process goes well!
@@panathentic 💀
put that crack pipe do- actually pick it back up i'm interested
This is incredible!
As expected, the GitHub repo is entirely incomprehensible and there's no installer or anything, because why would you want anyone to actually be able to use the thing? It's like a car company saying, here's your car! Just assemble all the pieces without any instructions!
Linux users when anyone other than an unemployed teenager wants to learn how to use Linux
@@JacobKinsley it was made for a paper.... but yeah skill issue /joke
TBH even though I knew techniques like gaussian splatting existed, I never imagined they could technically be used to noclip through videos. That's pretty impressive. Would be really interested in more videos like these.
would've preferred you added an explanation of the technique at the end, once you were done with the bit.
Gaussian splatting is so cool, I wish there was more stuff that used it
It's actually crazy that we live in a time where noclipping videos is possible. Soon it will be reality itself
the internet never fails to amaze me
Some of these shots would make amazing paintings
this is amazing
This feels like something out of cyberpunk
This video is crazy but even the YouTube sponsor?? What? 😭
it's like a beta version of braindance from cyberpunk
gaussian splatting is like the RTX of lidar scanning
how the world feels when you get extremely sick
scrumptious
i had a "2 months free" thing, i thought it was a new annoying thing
Impressive tech!
I learned about paid promotions unexpectedly
we all know this is all just some nice Gaussian Splatting! but the idea of using this technique to generate a 3d perspective from other people's videos is just so cool, and I think I'd thought of this idea but never actually done it before
also I think it should be called freecam instead of noclipping but idk
Gaussian Splatting is so fun to mess with
this is straight up an SCP entry. but it's an episode of The Sopranos instead of literally any mp4
My brain is melting
this video scares me. deeply.
open source Gaussian splatting generator? NIIIIIICEEE
when i was a kid i always wanted to do some kind of noclip on windows 7, like i always wondered what was behind the windows that our computers display
Cyberpunk brain dance
This completely messed with my head. It's like that enhance scene in Blade Runner but in real life. I'm not sure if I should be scared or impressed by how soon such a thing has become a reality.
Edit: I should clarify what I mean by it being scary. It's more so the fact that archived footage can be turned into a 3D space. It gives it this uncanny feeling that I cannot describe. I am well aware this technology is nothing new. In fact, Lidar has been around for a long-ass time. But that still doesn't stop me from being impressed by what it's capable of.
It doesn't show any data that you don't already have, so it's not truly magic or scary, but it's still really cool imo!