Your metaphors are next level.
here's your big mac.
Aaaah I see what you did there...
🎈 🏠 🎈
"you are not a karen"
Ben's got 99 problems, but a girlfriend ain't one.
Bruh! He says it with a straight face
@@phantomKE I know right? He just leveled up with these jokes!
🤣🤣
jokes aside, he's not a bad-looking dude at all, getting jacked wouldn't hurt tho, as long as he doesn't convert to a life coach and talk about it non-stop like that other coder YouTuber, John Sonmez.
He has a girlfriend in Canada.
I avoid client-side rendering in order to save CPU cycles for cryptocurrency mining.
hahaha
hilarious comment! but crypto mining is an inefficient form of revenue on a client's computer; see the TPB (The Pirate Bay) experiment.
@@TechdubberStudios It may pay less than ads, but it's many times better. I support websites that responsibly use cryptomining, and I block ads. Please, don't say that ads are better. They have never been any good to anybody's web browsing experience.
Oh, and you can use cryptomining along with Arc, another earning method that does not involve ads.
I'm done with Google's creepy trackers. Cryptocurrency mining is the future.
@@ezshroom I am genuinely 100% with you on the crypto movement. I hate ads. Always have. But there are at least two or three big corporations that come to mind that were built on the ads business model; with crypto mining, I can't find one. And browser crypto mining is not exactly a new technology. I really want it to replace ads. I really do. I hate the pop-ups, spying, and tracking that's going on. And the first corpo that comes to mind that should adopt the crypto model would be Netflix, because users stay on Netflix and binge-watch for hours and hours!
@@ezshroom also, do you happen to know any website/forum/subreddit focusing on browser-based mining? I would really like to join and dig deeper into this subject.
I avoided server-side rendering a meta tag by registering a subdomain, doing the server-side rendering there, and making my app compatible with only a set number of user-agents. Brilliant!
solutions:
0) pre-rendering with parcel or webpack
1) server side rendering
your solutions are not client-side rendering, though; he mentioned that.
Fantastic job explaining this! As always, the hilarious dry humor and "next level" metaphors help drive home points and keep things entertaining. Really helped clear up a bunch of stuff and get me pointed in the right direction. Many thanks!
I love the tint on your glasses, it's serial killer-ish, where can i get a pair like those?
a package arrives at your door after the 3rd kill
@@bawad respect
They are the left-behinds after each kill. That's the way you get it.
@@bawad quick scope no scopes?
Those tints are wiped off blood from killing
Love the joke about girlfriend and client side rendering at the beginning
I love how Ben roasts Angular devs. I thought of that carrot farmer line off and on all day and cracked up every time.
Of course I followed and understood all the way to the end. This is because I'm an unemployed ex-Tech Lead [who has never worked at a FANG company], and a thousandaire.
For your sake and ours, I hope you DON'T get a girlfriend too soon.
there is a workaround: just add conditional meta tags in the small server that builds your page. you can still use client-side rendering for everything except the meta tags
This was the best explanation video I've seen on the matter... Kudos to you Mister...
the girlfriend problem might be solved if you stop walking around wearing asexual flag shirts
hahaha
lmao, good catch, respect
He's just playing hard to get. Karen gets it.
But with this if he ever gets one, she will be the right one. Lol
He do check a lot of aesthetic boxes from the virgin meme... Though I probably do too 😆
Solution: NextJS, Angular Universal, Nuxt, etc.
Also check out the create-exact-app npm package (that's exact, not react). Like Next.js but with an Express-forward design: full control at the server level of what's going on.
@@jpsimons Just FYI, next.js also gives you full server side control. You can just run next as a library within an express server. In my experience, it's super ergonomic while preserving the state-of-the-art benefits of next (code splitting, automatic static optimization, incremental static generation, etc.). Having said that, I have not yet checked out create-exact-app, and am not sure how it differs from nextjs.
Why do I not like the sound of Angular Universal?
@@angshu7589 because you are not a carrot farmer. Although the color of your profile picture kinda resembles a carrot :D
@Adithya R Svelte Sapper is still in early development. I love Svelte, but Sapper is still far away from production-ready
Great video! I use Laravel on the server side to serve up everything: static HTML pages, React apps, or a combo of both. It's easy to embed a React app within a .blade template file. Meanwhile Laravel takes care of everything else, like API services, user registration and authentication, etc. Best of both worlds.
A solution to your problem could be to build a single-page application, with each endpoint for the app being pre-rendered.
It's basically Jamstack. Once a user loads one page, the others do not need to be loaded.
I guess u never heard of prerender.io
been using it for years
Lol, just killed the whole argument. Never heard of it before. Just goes to show that tech is exponential. Wonder if it will cause the cosmic crash eventually.
yep yep yep, you just commented before me
@@ayushkhanduri2384 same case, I searched for prerender just before adding a comment about it, to check if it was already mentioned, and here it was.
🤯
I had to do that once. I used a Lambda function since the site was hosted on AWS; the function intercepts the CloudFront distribution request and updates the HTML if the request comes from a robot, adding the OpenGraph tags.
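Roughly what that function looked like, as a sketch (the bot regex is trimmed down, buildOgTags is a made-up helper, and CloudFront has to be configured to forward the User-Agent header):

```js
// Lambda@Edge origin-response sketch: swap in a tags-only page for robots.
// buildOgTags is a hypothetical helper mapping a URI to its OpenGraph tags.
exports.handler = async (event) => {
  const { request, response } = event.Records[0].cf;
  const ua = (request.headers["user-agent"] || [{ value: "" }])[0].value;

  if (/facebookexternalhit|Twitterbot|Slackbot/i.test(ua)) {
    // bots only need the tags, so replace the body with a tiny page
    response.body =
      `<!doctype html><html><head>${buildOgTags(request.uri)}</head>` +
      `<body></body></html>`;
    response.headers["content-type"] = [
      { key: "Content-Type", value: "text/html" },
    ];
  }
  return response;
};
```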
Very useful, thank you for pointing to react-snap. Happy Hacking Ben 🙌🏻
I was just watching one of your videos on react native animation earlier xD
Keep up the good job 🔥
"It's like I spent a bunch of time building a house and now I want that house to fly." LMAO
this is actually the first time I understood the difference between client side and server side rendering
Sapper + svelte gives you the best of both worlds
If you manage the web server, you could use the web server's router to do the same exact hack you described without the need for a different subdomain, just a route that checks the user-agent of the client and returns different HTML based on it.
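A minimal sketch of that with Express, assuming that's what fronts the app (the bot list and renderMetaPage are placeholders):

```js
const express = require("express");
const path = require("path");

const app = express();
const BOT_RE = /facebookexternalhit|Twitterbot|LinkedInBot|Slackbot|Discordbot/i;

// serve static assets normally, but never the index page itself
app.use(express.static(path.join(__dirname, "build"), { index: false }));

app.get("*", (req, res) => {
  if (BOT_RE.test(req.get("user-agent") || "")) {
    // renderMetaPage is a hypothetical helper returning a small HTML
    // page with only the meta/OpenGraph tags for this route
    res.send(renderMetaPage(req.path));
  } else {
    // real users get the normal client-side-rendered app shell
    res.sendFile(path.join(__dirname, "build", "index.html"));
  }
});

app.listen(3000);
```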
I had a similar issue. Good thing you found a better solution.
I had this problem once, but my focus was on crawlers. I ended up using some PHP to "render" the important bits like the title, descriptions, and links. Then the JavaScript would remove those elements and do the single-page-app business. It was back in the carrot farmer code days, but I'm sure happy coders can accomplish this just as well.
I think you could easily do this in .NET Core. In the Startup class, in the routing configuration, you could match each route with the correct meta tags. You could make this an extension method and bing bang bosh, neat tidy job done
2:05 I do it this way:
The server serves a response for parsers (meta, og, schema, JSON-LD, and plain HTML content), and then the JS comes along, structures it up, and takes over routing from that point, so when you navigate you don't actually "refresh"
Netlify has a free experimental feature called pre-rendering; for me it works with Facebook, and it parses the right meta tags automatically, pictures included. My content comes from a backend via GraphQL and Apollo, the meta is set with react-helmet, the page is handled by react-router, and it's a create-react-app project. Hope this helps. You can also do pre-rendering very easily with the react-snap package, but you need to rebuild when data changes. (PS. Thanks for your work, I really like your videos)
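For reference, the react-helmet part looks roughly like this (the component and props are just an example); the prerenderer then picks the tags up from the rendered HTML:

```jsx
import React from "react";
import { Helmet } from "react-helmet";

// example page component; title/image would come from the GraphQL data
function Article({ title, image }) {
  return (
    <>
      <Helmet>
        <title>{title}</title>
        <meta property="og:title" content={title} />
        <meta property="og:image" content={image} />
      </Helmet>
      <h1>{title}</h1>
    </>
  );
}

export default Article;
```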
I use EJS, which lets you pass variables in before sending the HTML to the client, so you can change the values in the meta tags.
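Something like this, as a minimal sketch (fetchArticle is a made-up data lookup):

```js
const express = require("express");
const app = express();
app.set("view engine", "ejs");

app.get("/article/:id", async (req, res) => {
  const article = await fetchArticle(req.params.id); // hypothetical lookup
  // views/index.ejs would contain e.g.
  // <meta property="og:title" content="<%= title %>">
  res.render("index", { title: article.title, image: article.image });
});

app.listen(3000);
```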
This channel is slowly becoming one of my favorites on YouTube! 😄
Great video. Trying to wrap my head around server side rendering and this video definitely helped
This helped me a lot! I am working on a project and my backend was almost finished. I was using create-react-app with a router but switched over to Next.js! Thanks a lot
I just used a Node.js Express server to host the compiled create-react-app; this way you can modify the page and add meta tags if needed before serving it. Sort of a mix of server- and client-side rendering, as he said.
Your solution at the end is valid: use a reverse proxy to detect the request and forward it appropriately.
However, it's best to use SSR from the beginning if that's your intention.
Now you can make static pages for your dynamic, frequently updated pages with Next.js. How it works is that it looks at the requested page: if it was built at build time, it sends it back, and if it wasn't, it builds it on the fly (at runtime) and adds it to the built pages for the next request. Pretty amazing and game-changing!
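If I understand the feature right, it's roughly this (fetchPost is a made-up data fetch):

```jsx
// pages/post/[id].js
export async function getStaticPaths() {
  // build no pages up front; unknown ids get rendered on first request,
  // then cached and served statically afterwards
  return { paths: [], fallback: "blocking" };
}

export async function getStaticProps({ params }) {
  const post = await fetchPost(params.id); // hypothetical data fetch
  // revalidate: regenerate the page in the background at most once a minute
  return { props: { post }, revalidate: 60 };
}

export default function Post({ post }) {
  return <h1>{post.title}</h1>;
}
```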
Had this issue with a MeteorJS website running ReactJS on the client side. What we did was create a crawler (I think there is an npm project for this) that would go to every page, render it, and save it in our DB. When a non-human (Google) accessed the site, it would be served this rendered HTML, making it SEO-friendly. Basically our server would use the User-Agent to decide what type of content got served. Hope this helps
Can you please explain what you mean by "render it and save it in the DB"? Do you mean render the DOM elements, attach them to the HTML, store that at some link, and add the link to the DB? Or what exactly are you storing in the DB? If that's the case, wouldn't it be too much for dynamic pages like /rob/photos, /sam/photos, and so on to all be stored in the DB, or am I missing something?
Nice GatsbyJS colorway on that shirt 🤙
Nice solution. The only downside, I guess, would be that Google wants you to show the same content to their bot as to the user. But it probably doesn't matter if you don't want to index those pages.
Woah! My self esteem skyrocketed because I managed to keep up with you until the end :D Aside from that, your content is top notch, keep it coming man.
Gatsby also solves the React single-page problem, since we can generate all the individual HTML, CSS, and JS pages.
It works! I do exactly that with my React web SPAs. I use Firebase and Cloud Functions to detect user agents and serve an SSR version on the fly to robots and the CSR version to users. This is also important for SEO indexing, because some robots won't run any JS and expect HTML-only responses. Really enjoy your videos.
What about some prerender.io?
@@leisiyox I thought about using it, but never tried it. Don't know how well it works. It would also cost more than my current firebase cloud function solution.
@@cauebahia Under what conditions would you recommend using Firebase?
I thought about using it, but I'm seeking guidance
@@leisiyox I like that they integrate lots of services in a single solution. When you create a firebase project, you instantly have access to file storage, hosting, database, authentication, and some other stuff that makes it really easy. I also like that Firestore has real-time listeners for your data. Really good for a client side rendered app. Also really like their documentation and the fact that you can easily access other Google Cloud services and API. There are many videos online about it. Check it out.
I think he meant that we should use Next / Nuxt from the beginning. I used to face these problems and since then I use Nuxt for every project and never worry about these problems again
my solution for this is:
1. set up a lambda as the index
2. on the index lambda, run a headless tool (like headless Chrome) to render the page if the request is from a bot; otherwise just serve the raw JS
Hey, I saw Wes Bos do this in one of his videos: he used cloud functions to generate the preview, and Puppeteer, I guess, to take a screenshot of the URL
I think I might try react-snap. That sounds good: pre-rendering on every build. Because often the layout of a page is the same even if the content changes. What do I mean by that? Every Reddit post will have the logo, the sidebar, the footer, and a div in the middle which contains the contents of the post. So you can prerender all that with an empty div, and then hydrate it. Even with user-generated content (as long as it is simple and consistent) you could prerender. Thanks for the video
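The entry point then follows the pattern react-snap's docs suggest, something like:

```jsx
// src/index.js
import React from "react";
import ReactDOM from "react-dom";
import App from "./App";

const rootElement = document.getElementById("root");
if (rootElement.hasChildNodes()) {
  // react-snap already prerendered this page; attach to the existing markup
  ReactDOM.hydrate(<App />, rootElement);
} else {
  // no prerendered markup (e.g. local dev); render from scratch
  ReactDOM.render(<App />, rootElement);
}
```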
If the only thing that needs to change is the meta tags (not the rendered bits), you can also modify the HTML before returning it to the client, inserting the relevant meta tags.
It will probably lead to performance problems, but you could also put an if condition on the referrer of the request to determine whether you should perform such modifications.
I just put my meta tags in with variables like %PAGE_NAME% and %PAGE_IMAGE% and replace them later while serving the page with Express. It doesn't work with client-side routing, but it works for link previews.
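Basically this, as a sketch (lookupMeta is a stand-in for however you map a path to its values):

```js
const fs = require("fs");
const path = require("path");
const express = require("express");

const app = express();
// the built index.html contains %PAGE_NAME% / %PAGE_IMAGE% placeholders
const template = fs.readFileSync(
  path.join(__dirname, "build", "index.html"),
  "utf8"
);

app.use(express.static(path.join(__dirname, "build"), { index: false }));

app.get("*", (req, res) => {
  // lookupMeta is a hypothetical function mapping a path to its meta values
  const { name, image } = lookupMeta(req.path);
  res.send(
    template.replace(/%PAGE_NAME%/g, name).replace(/%PAGE_IMAGE%/g, image)
  );
});

app.listen(3000);
```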
This video is 2 years old and I don't know if it existed when you made it, but today there is a tool called Puppeteer that uses a headless Chrome browser to actually pre-render your HTML, which you can feed to bots while still feeding the client-side-rendered version to humans
I've been watching your videos and yes, the quality of the content is always awesome. New subscriber
Cool stuff about magic links. Even if you hadn't talked about that, you mentioned 🥔. Automatic upvote.
I had the same issue last week and also was thinking about moving to nextjs, but having a separate domain and server makes a lot more sense.
well, Next.js or any other SSR solution doesn't mean you're going to use one server for both the backend and the frontend.
English is not my first language, so I only understood this the second time I watched it. Thanks for the vid; I haven't built a site with link previews yet, so this is very good to know!
Rails is the best framework for this. Right now I build all my Rails views with React, but with Rails server-rendered meta tags.
Subbed for the consistent Angular claps 💀
You can use the header trick as you mentioned, and then simply use something like puppeteer to load the page on the server itself and then send the rendered HTML page to the client. If the header says it's from a normal user, then don't go to puppeteer, just return your usual index.html file.
I got this idea from here: youtube.com/watch?v=lhZOFUY1weo
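A sketch of that idea; launching a browser per request is slow, so in practice you'd cache results or reuse one browser (the bot regex and local port are assumptions):

```js
const express = require("express");
const path = require("path");
const puppeteer = require("puppeteer");

const app = express();
const BOT_RE = /facebookexternalhit|Twitterbot|Slackbot|Googlebot/i;

app.use(express.static(path.join(__dirname, "build"), { index: false }));

app.get("*", async (req, res) => {
  if (!BOT_RE.test(req.get("user-agent") || "")) {
    // normal users get the plain client-side-rendered app
    return res.sendFile(path.join(__dirname, "build", "index.html"));
  }
  // bots get the page rendered by headless Chrome first
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(`http://localhost:3000${req.originalUrl}`, {
    waitUntil: "networkidle0",
  });
  const html = await page.content();
  await browser.close();
  res.send(html);
});

app.listen(3000);
```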
You are hilarious and informative, my dude, haha; relatable. And damn, dude, the length of your link
my reaction to this video is LOL. as someone already mentioned, you could've done decent server-side routing
The problem with client-side rendering is mostly that it's used for things that don't need it. Most websites are mostly static.
You make my day better
I will learn WebD so that I can enjoy these digs by Ben😂
I would use the same client-side bundle BUT add a little bit of logic on the static assets server to insert the meta tags into the HTML shell that embeds the client-side bundle. That way you won't need to implement HTTP redirects, AND it's probably better once you start working with deep links for a mobile app.
Seriously, a balloon helicopter? That was the best analogy you could come up with?
Facebook's user agent is there for the facebook app browser as well.
I think your points on pre-rendering are slightly off. Not mentioning tools like Gatsby, Scully, or Gridsome is a miss as they can be used to render those dynamic routes you say cannot be pre-rendered. It's worth mentioning that JAMStack options are becoming really incredible for developers and give you the same benefits as server side rendering out of the box.
Your hate for angular is legendary. I love it.
The solution you came up with is just the thin end of the wedge; it's just the beginning.
Why do you want another domain for that? The server that delivers the app's index.html could also deliver the meta response instead, based on user agent. One problem you could get in both setups (separate service or combined) is that the bots might check whether the content they see differs from what a regular user sees. Whether any of them actually does this, I don't know, but I would check that possibility.
Best explanation, Ever.
Benawad officially a CHAD?!
I find nearly all dev channels really cringe, but this guy is actually entertaining lmao
Edit: wow, your solution is exactly the same one I came up with when I had this problem xD I thought you'd have a better way :O
Wow
I don't usually comment on these types of videos
But bro
This is amazing content please keep it up man
I like your humour😂😂...great lesson as well
Metaphors killing me 😂
I'm about that carrot farmer lifestyle tbh
me too 😦, that's why in my free time I enjoy building something else
@@mohdhaziq9859 I like the carrot life tho, has worked well for me
Just make a simple API where the client can send URLs and it will return a preview. Just make sure to cache the pages and add rate limits so they can't spam your API.
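A sketch of such an endpoint (extractMeta is a made-up OpenGraph parser, and the cache here is a naive in-memory one):

```js
const express = require("express");
const rateLimit = require("express-rate-limit");
const fetch = require("node-fetch");

const app = express();
const cache = new Map(); // naive in-memory cache; use redis or similar for real

app.use(rateLimit({ windowMs: 60 * 1000, max: 30 })); // 30 requests/min per IP

app.get("/preview", async (req, res) => {
  const { url } = req.query;
  if (cache.has(url)) return res.json(cache.get(url));

  const html = await (await fetch(url)).text();
  // extractMeta is a hypothetical helper that pulls og:title, og:image, etc.
  const preview = extractMeta(html);
  cache.set(url, preview);
  res.json(preview);
});

app.listen(3000);
```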
hi ben, i don't know what you are talking about, but i'm addicted to listening to you. maybe it will start making sense someday; i'm still learning react and some other frontend libraries.
I like how you explain, well done. Thank you for the quality content.
In our case, we solved this by redirecting the bots from social media to a backend service that would return HTML with all the meta tags. So, when a link to mypage/article/23 was copied, the nginx server would redirect it to backend.mypage/article/23/metatags. Not the best solution, but it worked pretty well
Another option would be to have a CDN and when you get content updates just render that single page to the CDN. Then you aren't having to rerender all the time. It would get more complex than that of course as you would need to rerender everything if you make a style change to your website. And it would be public. Unless you pay for something to make it private and have the CDN only return a partial page for bots.
Dude I was literally searching the name of this OPG yesterday. Thanks, dude!
Well, a simpler but similar way would be to always server-side generate the index.html, but only change the metadata while letting the index.js generate the actual content.
This wouldn't hurt human users and is pretty light on the server (only the entry-point HTML meta tags need to be generated depending on the link; the rest of the rendering and navigation is done client-side)
been struggling for two days to understand what SSR and SSG and CSR are, and i literally just got everything from a 10-min video about a big mac and ice cream
Thanks Man!
u watched a 9 min video in 3 ?
@@jinxblaze that is what true supporters do: they appreciate the content even before watching it. it's beautiful.
@@jinxblaze TRIPLE SPEED
@@ChrisStayte xD
7:20 Would it be possible to use the same URL but check the header value on, for example, the nginx server?
Like: is this user agent a bot (Twitter, FB, etc.)? => proxy to your slim API for the meta-data-only response,
and if it's a real user (Mac, Windows, Chrome, Firefox user agent, etc.) => proxy to your real page / default response / SSR page.
Maybe I'm forgetting something; I don't know if this could work.
Going through a similar issue myself. My static site is hosted on S3 / CloudFront, orchestrated by terraform. My plan is to use CloudFront Origin Response triggers to trigger a lambda function to add the correct open graph tags to the response. I think this is the lightest weight option.
Did you do it? I'm exploring some of these ideas :)
Every time I have a problem with my apps, I just wait for Ben to have it too so he can solve it for me.
The pre-built html approach could work. If someone tries to visit a url while something is still building, you could serve up a special "page is still being built" page... one benefit here could be that, in 20 years when a website is taken down, you have all this pre-built static html, and it's very cheap (probably free) to host that.
Or when there's another sea-change with technology, MAYBE it's easier to migrate to something new, but I'm not sure, would depend on the specific situation.
On your subdomain example: I think that's not needed / you don't need to redirect. If you are using a server-side program, it can decide to return the React app or the preview on the normal domain, using the same user-agent logic or an IP address list of known Facebook servers.
I hit the like button just after the 5 seconds in the video.
Sorry I'm new here, and I've noticed that Ben clearly hates Angular.
Can someone give a quick background please???
08:44
**Theoretically, I think it should work**
that there is a fountain of wisdom
I think one of the main points to understand can be found in the name 'create-react-app', which is NOT 'create-react-website'
This video was hilariously informational, Ben! Thanks! Haha
this was so easy to understand, i'm subscribed
I was experimenting with client-side rendering in 2012; it would generate the page based on JSON
Hey Ben,
I'm starting a new react web app for something like delivery.com and doordash.com where SEO is a major thing.
Like rich structured snippets of stars etc in google results.
Is SSR the only option?
If your content could be pre-generated (for example, a blog), one of the options would be to use Gatsby.js
yeah
In that case Gatsby or NextJS or any other you recommend?
I don't know if Gatsby will do but their web.dev score hits 💯 so was thinking that but in a dilemma
Next.js
@@bawad do you know an easy way to migrate from create-react-app to Next.js? I've found problems with Redux, SVG images, and the react-router active route (NavLink). Great video btw ;)))
McLovin teaching javascript
Nice video Ben. Try this out and make a video about the results please.
The problem I see with the magic link is that when a user just copies the URL instead of pressing a share button to get the special share.* link, there will be no meta tags, whereas the special share.* link does have them. And I think most users will just copy the URL they are on instead of looking for a share button