Host your Database for Free on GitHub Pages
- Published on Sep 21, 2024
- Databases are an essential part of many modern web applications, but running them can be really expensive. That’s why in this video I’ll show you how you can run an SQLite database completely for free, on top of GitHub Pages. GitHub Pages allows you to host files and websites, and with the JavaScript library SQL.js-HTTPVFS it is possible to use an SQLite database hosted entirely as static files.
===
Check out my project "solidtime - The modern Open-Source Time Tracker" on www.solidtime.io
===
The final result of the "On this day" website can be found here: onthisday.buff...
The GitHub repository with all the source code: github.com/buf...
===
Check out SQL.js-HTTPVFS at:
github.com/phi...
If you want to find out more about it, the author explained the initial implementation here:
phiresky.githu...
You can find the dataset used for the "On this day" website here:
www.kaggle.com...
===
Regular databases like MySQL, Postgres, or Redis/Valkey run as a service and need server-side computing power to work. SQLite is quite a bit different: it runs as a single-file database and is really useful for applications that need a performant but simple database. You can use SQL to query it just like any other database. The main difference is that SQLite only needs a filesystem to run, and we can use that fact to distribute it via static page hosting services like GitHub Pages. It is also possible to use CDNs like Cloudflare or BunnyCDN to distribute the database very cheaply. We use the HTTP protocol to our advantage and set an HTTP Range header to request only the parts of the SQLite database our application actually needs.
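To make that concrete, here is a minimal sketch of how a page could open such a statically hosted database with sql.js-httpvfs, following the library's README. The file path /db.sqlite3, the chunk size, and the events table/columns are assumptions for illustration:

```ts
import { createDbWorker } from "sql.js-httpvfs";

// Worker and WASM files ship with the library; a bundler resolves these URLs.
const workerUrl = new URL(
  "sql.js-httpvfs/dist/sqlite.worker.js",
  import.meta.url
);
const wasmUrl = new URL("sql.js-httpvfs/dist/sql-wasm.wasm", import.meta.url);

async function main() {
  const worker = await createDbWorker(
    [
      {
        from: "inline",
        config: {
          serverMode: "full",     // the database is one plain file
          url: "/db.sqlite3",     // assumed path next to the static site
          requestChunkSize: 4096, // should match the SQLite page size
        },
      },
    ],
    workerUrl.toString(),
    wasmUrl.toString()
  );

  // Only the pages needed to answer this query are fetched,
  // via HTTP Range requests against the static file.
  const rows = await worker.db.query(
    "SELECT title FROM events WHERE month = 9 AND day = 21"
  );
  console.log(rows);
}

main();
```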
Laravel 11 uses SQLite by default now. CMIIW
oh that's awesome i completely missed that. which is quite embarrassing because Laravel is my go-to backend framework 🙈
Might have to check it out since SQLite is the only DBMS I know and idk much about web app hosting
i got excited cuz i thought you figured out a way to do writes, not just reads.
Oh, okay. Well that saves me watching then.
Thx
Definitely not for every purpose. But a great option for some hobby projects for sure
Thanks man
phew, you saved me 10 mins of my life 😂
I’m so glad that I watched until the end, I was like “how the hell does it know what range of bytes to load if the data isn’t clustered”. Nice project and explanation, thanks :)
Nah I’m convinced YouTube can read my thoughts. I was just thinking about how I could host a database for free yesterday, and lo and behold… this video the next day.
you can try hosting your db on Railway for free
If your database is static and can be pre-cached (as with your byte-range feature) then it shouldn't have been a database to begin with; multiple JSONs would've been the better option.
Also, no write, so what's the point?
Clickbait of course
The only real benefit is no server side compute required
@@hartvenus if it's static then you can do the compute one time on your personal laptop
Always love it when you upload!
thank you i appreciate that! hopefully i can get on a more frequent upload schedule soon 😅
Really cool, always super interested in nodb approaches!! Thanks for sharing!
Thanks for sharing! I always wondered if a configuration like this was feasible.
Try splitting your data into per decade databases and query based on that.
You'll have small dbs to work with and maintain.
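For what it's worth, a sketch of how that per-decade routing could look on the client; the /db-<decade>s.sqlite3 naming scheme is an assumption, and the resulting URL would go into the `url` field of the sql.js-httpvfs config sketched further up:

```ts
// Map a year to the static shard covering its decade,
// e.g. 1994 -> "/db-1990s.sqlite3".
function shardUrlForYear(year: number): string {
  const decade = Math.floor(year / 10) * 10;
  return `/db-${decade}s.sqlite3`;
}
```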
Site looks awesome, I like this design
Great idea, but I would probably just make the code reference a database somewhere else and maybe cache it. That way you wouldn't have the limitation of having to re-publish with changes to the database. I was actually thinking about something similar to this yesterday and then this shows up in my feed. Awesome how stuff works. Thanks!
If you cannot do writes, it is not a database at all.
There are still a lot of use cases, but it would be nice to know that first.
Why not contribute to ActivityWatch?
Why start from scratch?
ActivityWatch has a different use case. Our primary focus is not to track every activity on your computer, but to simplify tracking for project-based freelance and agency work. Especially with teams.
We might even use parts of ActivityWatch later on in native desktop apps, but that’s not the focus right now.
even tho ur a small channel, I love your videos lol
(especially the cuts, video ideas, explanations and voice-over are good)
thank you i appreciate it!
@@bufferhead_ oida, I just noticed that you're Austrian lol haha
07:24 It's nice to see Ukraine in the list of countries ♥ And thanks for such a helpful video!
sir please make a full tutorial video if possible. i could host so many of my hobby projects, i don't wanna pay for hosting services. what you said seems like too much work for me. btw is it possible with php pages? i mean, how do you make php pages run, or do we have to change the entire code with the database?
The amount of ppl having serious dialogue about this as if it's not just entertainment is pretty surprising
8:25
Writing to the DB wouldn’t re-upload the entire 1GB. It would only upload the git patch (diff)
I say redeploy there.
What about splitting data and compiling each part into a Markdown or HTML file so that you can get data from a specific part by going to a specific page on your "free db site"? You can make a program that puts the keys in the h2 and the values in the h3 or p, and also reads these and converts them back to JSON. That solves both the issue of having a file size limit and of having to load a giant JSON file every time.
Why bother with HTML tags? There's XML
One level further, why not just have a master JSON file that points down to smaller JSONs that have the data you need? (a sketch of this is below)
That sounds like htmx
pffft just have a csv file
Yeah, you can technically just call the GitHub API to read the CSV file. Conversely, you can also just add a new commit in order to "write" new data. The downside though is no multiple concurrent writes, only one writer.
don't you need to expose your API key to do writes? you can't really do that on a website. @@jskksjjskksj
Also you can't do random reads on the CSV over HTTP without knowing what to read... The index approach is pretty cool imho
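For illustration, a minimal sketch of the master-index idea from this thread; the file names and index shape are made up for the example:

```ts
// index.json (hypothetical): { "1990s": "/data/1990s.json", ... }
type ShardIndex = Record<string, string>;

async function lookup(shardKey: string, recordId: string): Promise<unknown> {
  // One small request for the master index...
  const index: ShardIndex = await (await fetch("/index.json")).json();
  const shardUrl = index[shardKey];
  if (!shardUrl) throw new Error(`no shard for ${shardKey}`);

  // ...then one request for just the shard holding the record.
  const shard: Record<string, unknown> = await (await fetch(shardUrl)).json();
  return shard[recordId];
}

// e.g. lookup("1990s", "1994-05-06").then(console.log);
```

This still downloads a whole shard per lookup, which is exactly the coarser-grained trade-off the Range-request approach avoids.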
Dude gave out my secret
just use Turso
Wouldn't Vercel be quite fitting for this case?
That's what I was thinking. They do have some free PostgreSQL.
Before watching you should know that it's only a read-only """database""", you can't really write to it, so it's basically just an API and not a database like the video and title say
I can't find a tutorial on how to actually do this; I seek suggestions and help
Hallelujah thank YOU Jesus Christ our Holy Lord GOD Almighty ✝💝🙏
what did you use for styling in your weather project? looks good
thank you!!
Thanks for sharing
Thank you so much for explaining the approach. I find it very stimulating to learn for personal projects and look forward to replicating it and mentioning your video.
Concerning the db, I have little experience with sql.js. Would a tool like DuckDB (SQLite but OLAP), without the need for indexes, solve your issues? It has its own WASM build, and I'd advise checking it out.
Would be great to hear your opinion.
I feel like I'm at one of those masses sung in Latin.
Say it loud, dear: "READ-ONLY DATABASE"
Österreichhh😂
Super cool video
Thank you.
Is the db file publicly exposed? I mean, when deployed, if you know the name/path, is it possible to download it?
No, it's considered a security risk if your db is publicly readable.
you are supposed to query the db on a trusted, secure machine only
Yes, since you can only deploy GitHub Pages from public repositories (or private ones if you have a paid account)
It won't hurt to subscribe
Let's always do a lot of good ❤️
Don't abuse those free platforms
this is insane :)
Just use a regular database.
There are many ways to host static pages … for example a Raspberry Pi.
Firebase's free tier exists 😂 and if your project uses more reads/writes than the free tier allows, it would probably be making you some $$$
"No problemo"
0:36 Why are you showing a 'Lesbian' signature or flag? 😡
Did you show it to us on purpose?
❤❤
Just because you can doesn't mean you should.
Why shouldn’t you?
Is it bad?
Exactlyyyy 👍
@@jdubbeatz1042, because data shouldn’t be stored in a .json file.
@@bassycounter, yeah, .json files are publicly accessible; databases at least have login/authentication.
It could take much, much less time than 10 minutes...
Wasted 5 minutes of my life
This is a very bad idea to follow. Not only can you not do writes, but the read speed will be very slow compared to the normal approach
DigitalOcean: a 30 GB Postgres database with connection pooling is 19 EUR per month.
Not much viability to this idea.
Cool video though!
IDK why, I feel bad: there are a few companies letting people host things for free, and here are guys like you straight up exploiting it for your own personal gain. That's one of the reasons why many companies are closing their free tiers. What you are doing may give you a personal advantage in the short term, but it's just bad for the community and the free mindset in the long term.
correct me if i am wrong
I doubt this could produce any relevant cost, even if many people did that. As mentioned in the video, there are file limits (1GB) and traffic limits (100GB). Those are not extremely high numbers, which is why I also mentioned that you can go to a CDN.
Microsoft, as the owner of GitHub, also owns one of the biggest cloud providers, Azure. So the cost is even less for them.
I think companies closing free trials is not a result of any misuse. After all, the companies are free to adapt their rules, or just straight up shut down specific projects if they think they are acting in bad faith.
I think SQL.js-HTTPVFS has interesting technological concepts, that’s the main reason I made the video.
This is misleading, click bait
I completely disagree.