💡 Learn how to design great software in 7 steps: arjan.codes/designguide.
At 17:18, I'm glad Arjan talked about coupling and separation trade-offs. I always felt awkward with ORM classes being so tightly coupled to DB tables, which makes the domain classes hard to use in isolation; e.g., the ORM assumes you need a session even when some behavior doesn't involve the DB.
One approach to this is to flip the dependency. Rather than make your domain model depend on the ORM -- by using a class that inherits from SQLAlchemy or SQLModel to combine domain, validation, and/or DB logic -- consider making the ORM class extend your domain model.
Code example snippet:
```
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base  # declarative_base lives in sqlalchemy.orm since 1.4

Base = declarative_base()


# Domain model: plain Python, no DB dependency
class UserDomain:
    def __init__(self, name: str, email: str):
        self.name = name
        self.email = email

    def is_valid_email(self) -> bool:
        return "@" in self.email


# ORM model: depends on the domain model, not the other way around
class UserORM(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, nullable=False)
    email = Column(String, unique=True, index=True)

    def to_domain(self) -> UserDomain:
        # Rebuild the domain object from the mapped columns so this also works
        # for rows loaded from the database (SQLAlchemy bypasses __init__ on load)
        return UserDomain(name=self.name, email=self.email)

    @classmethod
    def from_domain(cls, user: UserDomain) -> "UserORM":
        return cls(name=user.name, email=user.email)


# Example usage
domain_user = UserDomain(name="John Doe", email="john@example.com")
orm_user = UserORM.from_domain(domain_user)
print(orm_user.name)  # Access ORM fields
print(orm_user.to_domain().is_valid_email())  # Delegate to domain logic
```
In general I don’t think it’s a good idea to have specific database classes be the dependency center of your app. An approach I like is the “clean architecture”.
DI
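To make the dependency-inversion idea concrete, here is a minimal sketch (all names are illustrative, not from the video): the service depends on a small repository protocol, and the SQLAlchemy-backed implementation can be plugged in at the edge.
```
# Hypothetical sketch: the service depends on an abstract repository,
# not on a concrete SQLAlchemy class. Names here are illustrative.
from typing import Protocol


class User:
    def __init__(self, name: str, email: str):
        self.name = name
        self.email = email


class UserRepository(Protocol):
    def get_by_email(self, email: str) -> User | None: ...
    def add(self, user: User) -> None: ...


class UserService:
    def __init__(self, repo: UserRepository) -> None:
        self.repo = repo  # injected dependency; could be SQL-backed, in-memory, etc.

    def register(self, name: str, email: str) -> User:
        if self.repo.get_by_email(email) is not None:
            raise ValueError("email already registered")
        user = User(name, email)
        self.repo.add(user)
        return user
```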
SQLC is the best if you already know your way around SQL. It automatically generates pydantic models from raw SQL queries. You get both the flexibility of raw queries (which you probably already have) and type safety, with no need to deal with ORMs.
Also get a small performance boost from no ORM too
IME, it is better to always keep a service layer between db transaction logic and api endpoint logic
So it should be like this:
API -> Service Layer -> DB Transaction
For better decoupling and easier testing
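Roughly what that layering can look like in FastAPI, as a sketch with made-up names (the repository stands in for whatever DB transaction code you use):
```
# Rough sketch of the layering described above; all names are made up.
# The endpoint calls the service, the service calls the data-access code,
# so the API layer never touches sessions or transactions directly.
from dataclasses import dataclass

from fastapi import Depends, FastAPI


@dataclass
class Hero:
    id: int
    name: str


class HeroRepository:
    """Stand-in for the DB-transaction layer (SQLAlchemy, raw SQL, ...)."""

    _rows = {1: Hero(id=1, name="Deadpond")}

    def get(self, hero_id: int) -> Hero | None:
        return self._rows.get(hero_id)


class HeroService:
    """Business logic; knows nothing about HTTP, only about the repository."""

    def __init__(self, repo: HeroRepository) -> None:
        self.repo = repo

    def hero_name(self, hero_id: int) -> str:
        hero = self.repo.get(hero_id)
        if hero is None:
            raise ValueError("unknown hero")
        return hero.name


app = FastAPI()


def get_service() -> HeroService:
    return HeroService(HeroRepository())


@app.get("/heroes/{hero_id}")
def read_hero(hero_id: int, service: HeroService = Depends(get_service)):
    return {"name": service.hero_name(hero_id)}
```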
Bro, are you single? 😛
@@abc_cba I am married, why do you ask?
There's nothing funny in this though. 🤔
@@papunmohanty5968 Damn it, man.
My heart is broken.
:(
But I am happy that you found your partner.
You look very cute, bro!
I don't think this is a good idea, there's no good reason to use sqlmodel:
- you're coupling your api with database
- sqlalchemy models can be declared using type hints (see the 2.0-style sketch below this comment), but you've shown essentially a deprecated way of defining them
- there are some issues with sqlmodel, e.g. with model inheritance
Also I have a lot of questions regarding the code in the video:
- why didn't you use sessionmaker as a context manager but did that with session in sqlmodel?
- why did you use deprecated query api in the sqlalchemy example?
- why are you committing your changes manually when that could be done in a dependency?
- why is autoflush disabled?
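For anyone wondering what the type-hint point above refers to, this is a minimal sketch of the modern 2.0-style declarative mapping (the model is illustrative, not the one from the video):
```
# Minimal sketch of the SQLAlchemy 2.0-style declarative mapping with type hints.
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Hero(Base):
    __tablename__ = "hero"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]
    age: Mapped[int | None] = mapped_column(default=None)
```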
Cringe when I see ppl using the old way
Also there is an issue with asynchronicity:
You're writing code with an async framework, so why do you use sync methods? And then starting it with uvicorn...
@@pavelnoryk3823 yes, async support is not unique to sqlmodel, sqlalchemy had it for approx 3 years or more
@@pavelnoryk3823 to debug locally most likely...
@@timbrap4693 I am new to this topic. Could you explain what you mean with "old way"?
Thumbnail 10/10
💯
Looking forward to the uv video.
Having a Pydantic model and an ORM model is not boilerplate; it reflects a proper separation of concerns. It is uncommon to work with database models directly in the presentation layer. Typically, database models are used exclusively in repositories, where they are mapped to domain models (Pydantic models in this case). These domain models are then utilized in the service layer and beyond, ensuring that business logic is not tightly coupled with infrastructure details.
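A compact sketch of that repository boundary, with assumed names: the ORM model stays inside the repository and a Pydantic domain model is what crosses into the service layer.
```
# Sketch of the repository boundary described above; names are assumptions.
from pydantic import BaseModel
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class UserRow(Base):  # infrastructure: ORM/table model, never leaves the repository
    __tablename__ = "users"
    id: Mapped[int] = mapped_column(primary_key=True)
    email: Mapped[str]


class User(BaseModel):  # domain model used by the service layer and beyond
    id: int
    email: str


class UserRepository:
    def __init__(self, session: Session) -> None:
        self.session = session

    def get(self, user_id: int) -> User | None:
        row = self.session.get(UserRow, user_id)
        if row is None:
            return None
        return User(id=row.id, email=row.email)  # map ORM row -> domain model
```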
That's why I like Django
Boilerplate can reflect proper separation of concerns. It is still boilerplate. Whether the design calls for it or not is something else.
I'd be interested in a video about writing database tests for an API like this. (Not necessarily SQLModel-specific).
your studio looks awesome!!
Thanks! 🙏
I don't like this. I can see the value if you're writing a prototype just to get something that works, but for anything serious, you really don't want that level of coupling between the public API and the internal database representation. The docs show how to separate the table models from the API models, but at that point you might as well just use SQLAlchemy and Pydantic separately.
💯 agree
It's highly inconvenient and less flexible
To be fair, Arjan mentions coupling as a downside at the end of the video.
@@michael_baron It's not just a downside, it is the reason why you shouldn't even touch it.
Absolutely, if you bring your database classes into your real app as models, it will be a nightmare. Simple type conversion from a DB model type to a dataclass or whatever has worked just great for centuries 😅🙌
I was going to say this, completely true.
but I do prefer using sqlmodel over sqlalchemy with separation between it and pydantic.
One thing is just so much better: documentation. SQLAlchemy documentation is HELL, just like the DRF docs. You have so many ways to achieve the same thing. SQLModel documentation is so clean and easy; SQLAlchemy documentation is "not for human consumption". Besides that, SQLModel is just an additional wrapper with additional coupling and not many benefits. Doing some kind of hybrid (an input DTO combined with a DAO) is not a great idea. Data access objects with additional validations feel so dirty and violate many good modern best practices.
You are aware that SQLModel does zero validation if you set table=True? And the documentation stays nearly silent about this very subtle "issue".
Just keep sqlalchemy and pydantic separate, adding a layer on top of sqlalchemy is just asking for weird bugs and you only need to refer to one set of documentation for sqlalchemy
Yes, but then you don't have any nice verification of the data you feed into the database when creating it. Adding that to SQLAlchemy is possible, but it's also quite some code to get right, and it requires a good understanding of the SQLAlchemy internals. I guess some verification on top of SQLAlchemy is a great idea. Except sqlmodel just got it wrong.
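For what it's worth, SQLAlchemy does ship a simple hook for this kind of verification, the @validates decorator; a minimal sketch with an illustrative model and rule:
```
# One built-in way to add simple verification on top of SQLAlchemy;
# the model and the rule are illustrative.
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, validates


class Base(DeclarativeBase):
    pass


class User(Base):
    __tablename__ = "users"

    id: Mapped[int] = mapped_column(primary_key=True)
    email: Mapped[str]

    @validates("email")
    def validate_email(self, key: str, value: str) -> str:
        # Runs whenever the attribute is set, e.g. User(email=...) or user.email = ...
        if "@" not in value:
            raise ValueError(f"invalid email: {value!r}")
        return value
```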
I just realized how much I love django orm.
One of the main reasons I still use Django. The ORM is just so much more powerful
@sofianeabdellaoui3682 exactly. I am learning spring boot now and I miss django orm so much. It's just awesome.
Django is an overcrowded mess. Am glad I stopped using it tbh, each to their own tho.
Django ORM 🤌🏽
SQLModel is really nice. You talked about Alembic; can you do a tutorial on how to connect FastAPI to multiple databases and how you would perform migrations with Alembic for each database?
The main problem I see is what you mentioned at the end of the video: we should ask ourselves whether we want to couple our domain model so tightly to the database.
I just hope people get to the end of the video and notice this issue
Uhm, I am not convinced. We are removing all the DTO/DAO distinctions with SQLModel, constraining the DB data structure to the needs of the clients (which is what a DTO actually provides)...
Love SQLModel, it's pretty useful, although a bit limiting for the more "in the weeds" SQL stuff.
what about eager loading?
Interesting topic, thanks for the video.
+1 for making a video about uv. Looking forward to it.
Coming soon!
Working on a project right now with my Dad, who is a seasoned SQL guy and has been doing it professionally since the early 90s. He scoffs at ORMs, but we're using SQLModel and I finally showed him how ludicrously simple it is. He's now stopped complaining about the concept and is willing to do it this way in the Python itself instead of just writing all this complex SQL I don't understand 😄
I thought I hit enter on a comment here, or it got deleted for some reason. Thanks for the video. I would like to discuss one aspect that everybody seems to ignore: connection pool usage. If you make a select, a connection is taken from the pool and it's given back to the pool only when you exit get_session. This limits your application concurrency to the number of SQL connections you allow, and those are quite finite.
To explore and maybe benchmark what I'm talking about, try making an async endpoint with a select, then await sleep(5) (to mock an API call) and see how many requests per second your endpoint can handle; then explicitly "begin" with an async with session: select;
We hit the ceiling in production with get_session approach too. Using session maker as a context manager solves the issue.
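A sketch of what that looks like, assuming an async SQLAlchemy 2.0 setup (the engine URL is a placeholder): the session, and therefore the pooled connection, is scoped to the query instead of the whole request.
```
# Sketch of scoping the session to the query instead of the whole request,
# so the pooled connection is returned before the slow external call.
# The engine URL is a placeholder.
import asyncio

from sqlalchemy import text
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

engine = create_async_engine("postgresql+asyncpg://user:pass@localhost/db")
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)


async def handler() -> dict:
    # Connection is checked out only inside this block...
    async with SessionLocal() as session:
        result = await session.execute(text("SELECT 1"))
        value = result.scalar_one()

    # ...and is back in the pool before the slow external call.
    await asyncio.sleep(5)  # mock API call, as in the comment above
    return {"value": value}
```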
I like the idea behind this lib but actually wouldn't use it in production because
1) coupling API schema with db schema
2) it looks a bit raw, I don't know how it will work in production
3) I'm not sure whether it supports advanced features from sqlalchemy
Yep, all backenders with experience have been bitten by 1) at some point. I don't get this obsession with endlessly simplifying CRUD.
Isn't Optional deprecated in pydantic models?
A question: I think SQLite is prone to database errors like dirty reads and has no locking mechanism. What is your understanding?
Using a base class is mandatory if you want to add your own pydantic validators. E.g. if you want to check if a field value is a valid email address. If you use a class A(SQLModel, table=True), you cannot add validators to A.
Also, working with database schemas is a pain in the a... like in sqlalchemy.
Yes, I ran into the same problem. The pending GitHub issue has 50+ hearts. For me, sqlmodel just got the design wrong for this task.
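For anyone curious, the base-class workaround mentioned above looks roughly like this, assuming a recent SQLModel built on Pydantic v2 (field names are illustrative); the validator lives on the non-table base and fires when the non-table models are validated, while the table model skips validation on instantiation:
```
# Sketch of the base-class workaround described above; assumes a recent
# SQLModel built on Pydantic v2, and the field names are illustrative.
from pydantic import field_validator
from sqlmodel import Field, SQLModel


class UserBase(SQLModel):
    name: str
    email: str

    @field_validator("email")
    @classmethod
    def email_must_contain_at(cls, value: str) -> str:
        if "@" not in value:
            raise ValueError("not a valid email address")
        return value


class User(UserBase, table=True):
    # Table model: instantiation skips validation, as the comments note.
    id: int | None = Field(default=None, primary_key=True)


class UserCreate(UserBase):
    # Non-table API model: FastAPI/Pydantic validation runs here.
    pass
```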
Hi Arjan, nice topic. I have been considering starting a FastAPI project and separating out the DB by using FastAPI together with Django, as Django has built-in capabilities that make database integration and management easy, and then using FastAPI for its asynchronous abilities. In my example, should I have SQLModel sit between FastAPI and Django, or am I overthinking this and neither SQLAlchemy nor SQLModel is needed? Take care, D
I am starting to use SQLModel. I think I'll start with a small project and send it to you, written entirely in SQLModel.
Is anybody addressing that SQLmodel is still in its 0.xx.xx version? I feel like it may not be ready for production projects…
So is FastAPI
The easiest way to see why SQLModel won't work is to try to create a more or less complicated API that does something more than just simple querying of the DB. The realization that it's not mature enough at this particular moment will then come fast. SQLAlchemy is perfect already. It has an order of magnitude more documentation, community, etc.
Nice studio 😇 . I learnt something new so thank you 🙌. A lot of different opinions in the comment section make me think critically 🤭.
The red squiggly under Field in import from sqlmodel is triggering my OCD 😭
Thank you for the nice content
+1 for using uv.
The Python db driver interface is kind of nice and lightweight already. It yields a list of dicts which can be trivially transferred into a pandas DataFrame because list of dicts is exactly the constructor argument of a DataFrame. There's no need to depend on an ORM at all. Pandas already does the heavy lifting for you.
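Roughly what that comment describes, with the caveat that most DB-API drivers return tuples by default, so you opt into mapping-like rows first; sqlite3 and the table name are just placeholders:
```
# Roughly what the comment above describes. Most DB-API drivers return
# tuples by default, so enable mapping-like rows first; sqlite3 and the
# "users" table are placeholders.
import sqlite3

import pandas as pd

conn = sqlite3.connect("example.db")
conn.row_factory = sqlite3.Row  # rows now support dict()-style access

rows = conn.execute("SELECT id, name, email FROM users").fetchall()
df = pd.DataFrame([dict(row) for row in rows])
print(df.head())
```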
0:07 application programming interface interface
I'm surprised that lots of comments are overreacting about how bad it is to mix the table definition and the validation model definition. Did you experiment with it before saying "no"? There are good examples in the SQLModel documentation which demonstrate how to deal with this. Actually, the whole point of this piece of software is dealing with that mix. So maybe just give it a try first. In the end you could be surprised and notice that it works pretty nicely.
Well, I tried it, and I realized that the FastAPI examples use the response_model attribute instead of the return type annotation of the function, because otherwise you'll get all sorts of type check errors in mypy. This is because the decorator converts the table class into the wanted API class, but this is all to avoid writing a single line in the API function. If you factor in this conversion more explicitly, you then realize that the table class and the non-table ones in fact cover two separate aspects, and you've just saved a few lines of repeated class arguments, with all sorts of drawbacks that come with this high an abstraction.
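To make that concrete, the docs-style split the two comments above refer to looks roughly like this (endpoint and field names are assumptions, not code from the video):
```
# Rough sketch of the table-model / API-model split discussed above;
# endpoint and field names are assumptions, not code from the video.
from fastapi import FastAPI, HTTPException
from sqlmodel import Field, Session, SQLModel, create_engine


class HeroBase(SQLModel):
    name: str
    age: int | None = None


class Hero(HeroBase, table=True):  # database table
    id: int | None = Field(default=None, primary_key=True)


class HeroPublic(HeroBase):  # shape exposed by the API
    id: int


app = FastAPI()
engine = create_engine("sqlite:///heroes.db")
SQLModel.metadata.create_all(engine)


@app.get("/heroes/{hero_id}", response_model=HeroPublic)
def read_hero(hero_id: int):
    with Session(engine) as session:
        hero = session.get(Hero, hero_id)
        if hero is None:
            raise HTTPException(status_code=404, detail="Hero not found")
        return hero  # converted to HeroPublic via response_model
```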
And what about Alembic? How does it integrate here?
It works, but needs a few adjustments to the config and template files. You can find a few articles online on how to do this
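For reference, the adjustments those articles usually describe are roughly the following (sketch; "app.models" is a placeholder for wherever your SQLModel classes live):
```
# Sketch of the usual SQLModel + Alembic adjustments described in those articles.

# --- alembic/env.py ---
from sqlmodel import SQLModel

from app import models  # noqa: F401  # imported for its side effect: registers the tables

target_metadata = SQLModel.metadata  # point autogenerate at SQLModel's metadata

# --- alembic/script.py.mako ---
# Add "import sqlmodel" to the template's import block so generated migrations
# that reference SQLModel column types (e.g. sqlmodel.sql.sqltypes.AutoString) compile.
```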
I'll write my comment here, but I can tell you, it does not work very well, especially if you need specific field types like BigInt, arrays of a type, jsonb... or if you have a complicated setup, like multiple primary keys.
What about eager loading?
would love to see how you integrate mongodb and fastapi, including pydantic etc.
Beware: several things require referencing sqlalchemy directly (regex validation?) and async support is not officially present yet (messy though possible to diy). I wouldn’t use this until it matures further.
uv is great! im waiting for video about it
Coming soon!
i'd also like a video about alembic tbh
So django models?
I honestly don't see the benefit of using SQLModel when 1. you can't do validations if they are tables, 2. sqlalchemy orm has the Mapped feature to allow native types.
I wonder how well SQLModel would integrate with kedro-pydantic.
What are the best FastAPI starters?
The official docs and a fun project
Maybe it is not a good fit for ORMs but it can fit the bill for data exchange.
Honestly, it burns my eyes seeing, at 1:12, two classes with the same attributes; it's probably the reason I'll champion Flask.
Can you make a video on a Django app for the frontend and FastAPI using Django models for the APIs? I don't want to use DRF.
This makes no sense, Django is not a frontend framework. If you do not want to use DRF, you should probably try Django Ninja; it takes some ideas from FastAPI and is more adapted to the framework
can you explain why that makes no sense?
Yes, I know Django is not a frontend framework. I mean to say I want to use Django to serve the traditional MVT instead of React, etc.
@@aafan.kuware oh I interpreted your "for frontend" as if you wanted to use it only for the front 😁, this is good for server side rendering indeed
wtf 😂
just use FastAPI and any frontend framework or just raw jinja templates
@@jonathanpiaget5195 It is exactly what the OP said: Django for the frontend and FastAPI for the backend. And it doesn't make sense. FastAPI and Django are both backend frameworks.
Thanks for uv🎉🎉🎉🎉🎉🎉
Coming before the end of the year.
I don't know why you used SQLAlchemy 1.4 instead of 2.0
and also, just because you don't like it doesn't mean everyone should too
Could you give us an example with one table with 28 columns and 7 foreign keys, where the other tables have pivot tables and FKs, and those have service tables with types?
Yeah... the more complex it gets, the worse it gets. SQLAlchemy, SQLModel and any other BSql can't handle an aviation crew schedule or maintenance DB, nor financial data, nor anything that has more complexity than 10 tables.
Just use SQL and do whatever you need to avoid mistakes: FE, BE, DB validation, session handling,
authorization handling. Make it simple. Let's stop adding things that can break, can't be maintained or, even worse, can become obsolete and discarded, like Flutter... Yeah, I said it!
Wow I don't hate DRF so much anymore
Python ORMs are so broken by design; all of them miss the target of a decent API. I miss ActiveRecord's sanity, but since working in Python at work I ended up creating a lightweight Pydantic + psycopg 3 database model interface in just a few hundred lines of code, and it is way more elegant than this.
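Probably something in the spirit of what that comment describes: psycopg 3's class_row row factory can hydrate Pydantic models straight from query results. A sketch, with a placeholder DSN and model rather than the commenter's actual code:
```
# Sketch in the spirit of the comment above: psycopg 3's class_row row factory
# building Pydantic models directly from query results. The DSN and model are
# placeholders, not the commenter's actual code.
import psycopg
from psycopg.rows import class_row
from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str
    email: str


with psycopg.connect("dbname=app user=app") as conn:
    with conn.cursor(row_factory=class_row(User)) as cur:
        cur.execute("SELECT id, name, email FROM users WHERE id = %s", (1,))
        user = cur.fetchone()  # -> User instance (or None if no row matched)
        print(user)
```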
what is wrong with SQLAlchemy? it is pretty flexible
Use neo4j, or any graph database ! Much more flexible and expressive than SQL !
If you are doing anything serious, don’t couple the API schema to the database. I’m working on a big legacy backend which has done this and it creates all sorts of problems for us.
Imma be "that guy" and say that the benefits of abstraction and ease of switching between backends are far outweighed by the benefits of rawdogging SQL directly and taking advantage of native functionality that may be specific to an engine
Single-responsibility principle? Never heard of it?
99% of projects are not "Hello World" CRUDs. An object lifecycle (including validation) can be extremely complicated. Such simplistic approaches would not work for that.
I am concerned about how to teach less skilled engineers how to use this "facade" library correctly.
As you pointed out, in practice the API and DAL models should never be combined, for several reasons - proper app layering policies, security, SRP, etc.
I think it is best to keep the API and DAL layer boundaries clear, and not use something like SQLModel which will definitely be the source of bugs and constant, er, teaching opportunities.
In practice, API layers should NEVER be doing CRUD directly. I blame MS for teaching devs who are just starting to do this. We are still getting bit by that blunder. Shame on you MS.
Sorry, I think this one is a pass for me.
SQLMODEL IS 💯 backward compatible with sqlalchemy.
ORMs always get back to bite you
Definitely against this. I really prefer to separate Database schema and serialization, validation/ dto creation. Mixing the two feels really like an anti pattern.
Peewee is still my favorite
I don't like SQLAlchemy in general, it's overcomplicated and slow. Piccolo ORM is faster and a lot simpler.
Poor Wilhelm
Don't do this. You're just asking for bugs to happen
My experience is to stick to the standard: SQL.
Agreed. And I still haven’t found a good way to dynamically construct SQL queries at runtime. When you want a user to be able to do complex filtering of items for example. For certain filters I need to join to other tables, but the user may have no need for that filter, so how do we prevent unnecessary joining?
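One common answer, for what it's worth: build the SQLAlchemy select() incrementally and only add the join when the relevant filter is present. A sketch with assumed tables:
```
# One common approach, sketched with assumed tables: build the select()
# incrementally and only add the join when the relevant filter is present.
from sqlalchemy import ForeignKey, select
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class Category(Base):
    __tablename__ = "category"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]


class Item(Base):
    __tablename__ = "item"
    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]
    category_id: Mapped[int] = mapped_column(ForeignKey("category.id"))


def build_item_query(name: str | None = None, category: str | None = None):
    stmt = select(Item)
    if name is not None:
        stmt = stmt.where(Item.name.ilike(f"%{name}%"))
    if category is not None:
        # Join only when the category filter is actually used
        stmt = stmt.join(Category, Item.category_id == Category.id)
        stmt = stmt.where(Category.name == category)
    return stmt
```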
No, you should not
I wouldn't recommend it.
heavily disagree
IMHO: The usefulness of using an ORM is highly questionable.
Maybe, if you're limited in SQL syntax
Layers of abstraction on top of layers of abstraction are never a good idea. If you need to be an expert in SQL, then in SQLAlchemy, and now in this, good luck with the final queries it generates.
It saves some typing? Yeah, cool story. The amount of typing is the least concerning element of software engineering. Code maintenance is the major factor software engineers should optimise for, and layers upon layers of abstractions won't help with that
Thanks I hate it
Don't listen to the author and just use SQLAlchemy. It is used everywhere. If you're looking for a job, learn SQLA.
The author probably just doesn't have any ideas for new videos and so creates ones like this.
🫸 / 👉