Very good explanation, congratulations. The animations make the whole topic more understandable and clear. Greetings from Argentina
Let's go, damn it!
That iconic voice from Bryan Yu. You're a legend sir!
even with the random surfer, what stops me from creating 1000s of websites to affect the rank of my website. Maybe it lessens it a bit but I don't think I understand if the random surfer helps in that matter that much. If the "random surfer" was real users going through websites and we somehow kept track of that, then I would understand more
The issue is that nothing links to those random pages you created, so the surfer landing on them in the first place is quite unlikely. In reality the web has billions of pages, so the chance of a random teleport hitting one of yours is practically zero.
You can. But not with all the websites on one server or IP address, because then it's counterproductive and you'll be deranked instead.
@@weckar but still, he can make a loop
@@gabenugget114 doesn't matter and in fact reduces the visibility of your main page
@@weckar what if you linked to all of the random ones from your main one
Best explanation so far.
Thank you so much for this perfectly explained video - I had been trying to understand this, but your work, with the animations and simple explanations, has helped me so much.
This video is really good. Thanks for making this!
Thank you for explaining it in such an easy and understandable manner.
I find your channel unique and informative
Great video, man. Simple, short, and clear explanation.
Clear, concise, amazing!
Thank you for your clear explanation and animations
Wtf how is this channel so freakin good. I expected at least 2 million followers. I'm deeply shocked!!!!!!
🙄
You are a hero, great explanation
Very well explained!
Very nicely you have explained this.
Thanks for this video!
Would love to see a visual factoring content prioritization. How prioritizing subsidiaries and paid advertising would change the outcome of the results
This makes it a lot simpler to understand than what my teacher explained to us in class. Instead of using pigeons he used Raphus cucullatus
Absolutely amazing video! Subscribed.
The best explanation ever
Very nice and easy to understand explanation
Simple and clear explanation, :) ! By the way how did you create that animation? Can you share that?
Very clear. Much love!
Thanks for a great explanation
Nice video, but how does the random surfer model help solve the issue of creating lots of websites to change the rank of your own website (a problem you yourself brought up)?
I think the idea is: those pages (the ones created to link to the target website) would not have many pages that link to them, so the random surfer wouldn't be sent to them much, meaning there wouldn't be many more visits to the target page.
@@Nathan-cz8uk Bingo
@@Nathan-cz8uk But how do you then introduce a completely new page that has nothing linking to it and get it searchable? The random surfer only has a 1/X chance of landing on that page, where X is every single page in the space the surfer can land on. Could you then introduce a thousand pages that cross-link each other, so that in total the surfer has a >1000/X chance of landing on any of them? And would that cause an issue where your top-ranked page ends up being a random one of the 1000 you created? Of course, each of those could link to fewer than the other 999 pages while still all linking to the one page you want shown in search results, which would make it the top-ranked of the set anyway.
No words. I admire you for such a masterpiece of a video.
Great video and animation!! Keep it up!
Great video. You should also teach about algorithms in your web programming course at Harvard
Damn, I had to study the mathematics of it at university to understand it; you did it in a few minutes.
very good!!
Isn't it easier to just describe this as a discrete-time Markov chain problem? You basically set up a transition matrix where unlinked pages have a transition probability of 0 (or something very small, according to the damping factor) and all the linked pages have equal probability. Then you just want to find a steady state, so you basically look at the eigenspace corresponding to eigenvalue 1.
This also neatly encodes the first problem: with no damping, you'll get a big multidimensional eigenspace when there are multiple connected components. But if you introduce some damping, the eigenspace will be one-dimensional; you'll have a unique stationary state, which is exactly the importance ranked by this Monte Carlo algorithm.
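The eigenvector view described above can be sketched in a few lines of NumPy. This is a minimal illustration on a made-up 4-page web (the link structure is purely hypothetical), not the actual Google implementation:

```python
import numpy as np

# Hypothetical 4-page toy web: links[i] = pages that page i links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = len(links), 0.85  # d is the usual damping factor from the PageRank paper

# Column-stochastic transition matrix: M[j, i] = P(surfer moves i -> j via a link).
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

# "Google matrix": follow a link with prob d, teleport uniformly with prob 1 - d.
G = d * M + (1 - d) / n

# The stationary distribution is the eigenvector of G with eigenvalue 1.
vals, vecs = np.linalg.eig(G)
k = np.argmin(np.abs(vals - 1.0))
rank = np.real(vecs[:, k])
rank = rank / rank.sum()  # normalize to a probability distribution
print(rank)
```

With damping, G is a strictly positive stochastic matrix, so (by Perron-Frobenius) the eigenvalue-1 eigenspace is one-dimensional, just as the comment says; here page 2, with the most inlinks, comes out on top and the orphan page 3 at the bottom.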
That sure sounds easier!
Or not, I don't really know. Maybe the dude/dudette that came up with it was worse at math than you? Remember that lots of programmers and 'software engineers' don't really have math degrees.
Ah yes cause everything you just said is definitely easier to follow for the average person. 🙄
Simply, Lovely. In a layman's language :)
Nice explanation.
It's easier to calculate the percentages with the algebraic matrix of the number of links
You could create lots of pages of your own to link to the main page, yes, but you could also pay other more prestigious pages to link to your page instead. If they are on the track of the random surfer, you'll be too
I'd thought about this too. You'd have to hope that these pages are prestigious enough that a) they don't need your money, or b) they wouldn't want to risk hurting their reputation by linking to a disreputable page.
Good explanation ⭐⭐⭐⭐⭐
Great vid cheers mate
You forgot that on the backend Google also bases "relevance" on ESG score.
great vid thanks!
There is a lot more to google search these days than just the linking. All kinds of content analytics and AI is involved. But yea, it all started with the links.
Very interesting
Nice!
Ah, so much better trying to focus in on your voice and the concepts without any audible distractions. 👌🏻
simply awesome......
This video should be shown b4 all data algo courses lmao
Can someone explain how pagerank is different from eigenvector centrality?
You ever noticed that Google search isn't as good as it used to be?
yup
A page's importance is relative to what is searched; it's not absolute. So how does it rank the importance of pages for any given search query?
Mark Zuckerberg explaining Google's algorithm....wow...nice
Lol
Simplicity is an amazing skill 😊 please make videos on AI 😊
That just feels very slow to compute, given the huge number of web pages and the iterations it takes to converge to a stable PageRank value.
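The convergence concern is worth a quick test. Here is a sketch on a randomly generated toy graph (the link structure and sizes are invented for illustration): with damping d, the error shrinks by roughly a factor of d per iteration, so even a tight tolerance is reached in on the order of a hundred iterations, independent of graph size:

```python
import numpy as np

# Random toy link structure (hypothetical data): ~2% of possible links exist.
rng = np.random.default_rng(0)
n, d = 500, 0.85
A = (rng.random((n, n)) < 0.02).astype(float)
A[:, A.sum(axis=0) == 0] = 1.0          # patch dangling pages: link to everyone

M = A / A.sum(axis=0)                   # column-stochastic transition matrix
r = np.ones(n) / n
for it in range(1, 1000):
    r_new = d * (M @ r) + (1 - d) / n   # one power-iteration step
    if np.abs(r_new - r).sum() < 1e-10:
        break                           # converged
    r = r_new
print(it)
```

The per-step cost is dominated by the matrix-vector product, which on the real web is done over a sparse matrix; the iteration count stays modest (here well under 300) because it depends on d, not on the number of pages.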
Does the "random surfer" refer to a real user or a googlebot?
O_o I've never been deep into math and just know how to use it in programs. I didn't know how it works; this is interesting.
You should have made this video longer. Too short to explain
Is this Brian Yu from CS50??
✨
Hm looks kinda similar to a Markov model.
The real question is what if I create hundreds of pages that all link to each other, possibly linking to one page more often than any other?
Still, the problem is that for a random surfer the teleport probability is spread across the whole web, so the chance of the surfer landing on any of those pages is negligible.
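The random-surfer argument can be sanity-checked with a tiny Monte Carlo simulation. This is a sketch on an invented 5-page graph where page 4 is a newly created page that nothing links to; it can only be reached by teleporting:

```python
import random

# Hypothetical toy graph: links[i] = pages that page i links to.
# Page 4 has no inlinks at all; it only links out to page 0.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2], 4: [0]}
n, d, steps = len(links), 0.85, 100_000

random.seed(42)
visits = [0] * n
page = 0
for _ in range(steps):
    if random.random() < d:
        page = random.choice(links[page])  # follow a random outgoing link
    else:
        page = random.randrange(n)         # teleport to a uniformly random page
    visits[page] += 1

print([v / steps for v in visits])
```

The no-inlink page's share of visits settles at the teleport floor, (1 - d)/n = 0.03 here, while the well-linked pages accumulate far more; on the real web, with billions of pages, that floor is effectively zero.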
I'm interested in controlling my YouTube content more. Does anybody have any helpful advice on how I can do that?
Or just pay Google a click fee to put your result at the top…. :)
ethelum mala
Play Robot? Cute
first
So cute play robot
You sound like Mark Zuckerberg
What if my website doesn't have many links and stays that way for many months? And what if my site isn't important (due to no links) but contains relevant keywords that more important sites don't? (This is a very in-depth comment.)
Lost terribly
How is a true random number generated here? Isn't it true that a computer-generated random number isn't truly random?