Conda is nice as long as you can find the packages in the repos. But sometimes you'll have to mix conda and pip packages because a package is only available on pip, and that's when it gets messy. You can do that, and even create an environment file that keeps track of those pip dependencies, but it won't check the pip packages for compatibility with the conda packages, so you lose the main feature of conda. Poetry sounds cool, but if it doesn't download binaries, that makes it less appealing.
Poetry is an abstraction layer on top of pip, so it can download pre-compiled binaries. It just won't install a FORTRAN compiler on your machine like conda can
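As a quick illustration (the package here is just an example), installing something with heavy compiled extensions through Poetry will typically pull a pre-built wheel from PyPI rather than compiling it locally:
```zsh
# Poetry fetches the pre-compiled wheel for your platform, same as pip would
poetry add numpy
```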
I like when people remind you to give the thumbs up after giving some quality information instead of asking about it at the start
You can use pip in conda environments, so I usually just install all the packages with pip and not with conda. Switching between environments is convenient
Just be careful with this one - Conda does some funky stuff with installs, so this approach won't always work, especially if you need the Fortran compilers etc. that Conda can install
I've been using Poetry for a while and really like it, but I've always wondered if I should be using Pipenv or Conda instead. This video did a great job explaining the differences. If I had one complaint about Poetry (aside from wanting faster package resolution, though I remember reading that it has to build packages to check their version info because it isn't properly noted on PyPI, which seems outside their control), it would be the lack of junctions to a common cache folder like pnpm has. I don't need a dozen copies of NumPy or Black for each of my projects taking up space on my computer. I would much rather have a common installation of those packages that gets linked where needed.
Yes! I'm not sure how well Python deals with symlinks to packages, but I'm sure it would be possible to achieve something like you've mentioned.
Poetry is open source though, so there's nothing stopping you giving it a go 😉
You are using pip for the wrong purpose. Pip is for isolating your local environments for different projects. It is not for migrating your app to production. You should use Docker for that. I use Docker (for lift and shift of applications) and pip (for Python library installation) together, and this combination has served me well.
If your application cannot be containerised for lift and shift, or if you need to manually rebuild your production environment, you should thoroughly rethink your whole architecture and tech stack.
I'm going to have to disagree here. Typically, pip is just used for dependency installation, whereas venv is used for environment creation. Also, pip, Poetry etc. are not a replacement for Docker, and would typically be used alongside it, like you mentioned, but I'm also not certain what you mean by "lift and shift".
Do you mean installing the dependencies on your local machine and then copying them into a Docker image? If not, you're always going to need to run pip/Poetry/Conda during the image build process to install dependencies anyway.
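For context, a minimal sketch of that split (venv creating the environment, pip installing into it) looks something like this:
```zsh
# venv creates the isolated environment
python -m venv .venv
source .venv/bin/activate
# pip just installs dependencies into whichever environment is active
pip install -r requirements.txt
```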
Well, the name of the user tells everything😂
I encounter a similar situation a lot of the time, though I use conda:
conda create -n env1 python=3.9
conda create -n test1 python=3.8
Now that I have three envs in my conda (base, env1, test1),
I activate the env I need and "pip install" libraries within that env,
but over time things still go wrong (especially since I use a lot of machine learning/GPU/numpy libraries; they're just not compatible with each other, or somehow even crash the other env). What have I done wrong?
If you're still using pip, you're missing out on Conda's functionality. I'd recommend using conda install to get your libraries. Most things you'll need should be available via the Anaconda repo or conda-forge.
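Something like this (the package names are just examples) keeps everything inside Conda's resolver:
```zsh
# install from the conda-forge channel so Conda can check compatibility
conda install -c conda-forge numpy scikit-learn
```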
@@IsaacHarrisHolt 🤔 I thought I was already inside a separate environment where I could simply use pip install. I am now going to try using Poetry. It reminds me a bit of Rust haha. Anyway, thanks for the great video!
Technically you are, but conda environments aren't JUST Python environments, so they don't behave in exactly the same way. If you're working with ML and GPU libraries a lot, I probably wouldn't recommend Poetry, as you'll lose the benefits conda provides.
Then again, you won't have been getting those benefits when using pip inside conda, so you might not be missing out on much.
And I'm glad to hear I was able to help!
Sorry, I don't get the premise of the issue with PIP. I've personally never had an issue that wasn't easily rectified.
Maybe I'm missing something.
I've had some really nightmarish issues with pip's dependency resolver when working with large teams in the past. Pip will warn you that things might not be compatible, sure, but since it will still install the package, people just ignore it and continue, which then leads to a mess when you need to update one of the packages or otherwise make changes. Also these other tools are typically more user-friendly and have more robust features.
+1 for Python build tools video!
Noted!
Video idea: the history of setuptools and distlib. You have such good production quality, and one cannot find that on YouTube.
Thank you! I'm curious why you'd be interested in a video like this - I would have thought that most people would want to know how to go about using the new way of doing things.
Mostly for historical purposes. Like a documentary of the history. P.S. I also enjoy the History Channel :) @@IsaacHarrisHolt
Poetry is the future, but unfortunately not all packages expose their metadata properly in the package indexes. I've seen dependency resolution take an hour or longer on my machine because it has to download and inspect big libraries. I'm sure this will be improved one day. I use Poetry when I can and switch to pip when it gives me trouble.
This is a great way of doing things. Poetry's dependency resolving can take ages, but hopefully it gets improved as packages move towards defining dependencies in the `pyproject.toml` format, which makes them easier to resolve.
You can run Poetry in verbose mode to see exactly which resolution takes so long. Then take that dependency (usually a dependency of your direct dependencies) and pin it to a specific version. Depending on the size of your project it can take some time, but it will improve the dependency resolution from hours to seconds (which is what I did for a huge project).
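As a rough sketch (the package name here is just a placeholder):
```zsh
# run the resolver with extra verbosity to find out which dependency is slow
poetry install -vvv
# then pin the offending transitive dependency to an exact version
poetry add "slow-transitive-package==1.2.3"
```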
Oh interesting! I didn't know that. I think this'll be useful for a lot of folks.
@@IsaacHarrisHolt It's a great video and helped me a lot, thank you! Do you know the best approach if you have a package that is only supported by Python 3.7+ and you need to install another package that is only compatible with Python versions up to 3.6? Is it still possible to use them together somehow?
You might be able to find a version of the first package that supports Python 3.6, but I'd generally recommend finding a replacement if you can
Would like to know how uv fits in here
Uv replaces a lot of these, but it's more a Poetry competitor than a Pixi/Conda competitor
6:04 You should have included micromamba in your list.
It's far better than conda/miniconda when it comes to setup, and it's much more lightweight.
I only heard about it recently! I'm thinking of having a look at it in a future video though. Thanks for the suggestion!
Why doesn't pipenv solve the dependency management problem? Which part does it not solve? I don't understand.
It doesn't resolve dependencies as well as some other tools available, meaning you're more likely to end up with mismatched versions that can break things
@@IsaacHarrisHolt OK, so in theory they could just swap in a better algorithm for that and it would be fine? But I guess I like this setup the best, so if I ever use Python again, I'll still try pipenv first until I run into trouble. I'm not planning to do any huge projects anyway, so maybe for a small number of dependencies it's sufficient.
You're definitely right! Ultimately it's all a matter of preference anyway, so pick what you're most comfortable with
By the way I very much liked your video on rustifying the python Fibonacci code with maturin. Good stuff.
Thank you!
What ide are you using?
PyCharm! I set it in full screen zen mode for videos, and I use the colour-blind friendly theme :)
For a second I thought I was watching a no boiler plate video, very clean editing and styling absolutely love it.
Thank you!
why
```zsh
brew install pipx
pipx install pipenv
```
rather than
```zsh
brew install pipenv
```
?
I just like pipx :) and I'm not a total fan of brew.
Oh, my goodness. Speaking of Fortran compilers. I attempted installing the Intel Fortran compiler, which is considered the gold standard against any other, like GNU Fortran or LFortran, but I could not stomach putting 26 GiB of nonsense on my laptop's 500 GiB NVMe SSD just to compare ifort with gfortran and lfortran. Nope.
Sounds like you need an external hard drive 👀
Imagine a world in which python decides to stop reinventing the world with new package managers every few years. Imagine.
I know, right? If only...
For faster package dependency resolution there is uv, made by Astral (astral.sh), the makers of Ruff.
Yes! uv is great, but it came out after this video :)
@@IsaacHarrisHolt Yeah. I started using pipenv and mamba. uv is still in development, so I am very excited about an all-in-one tool for Python, from development to release.
I have arrived at the nerdiest part of the internet, and it feels nice.
Welcome! Feel free to stay :)
I wish conda were faster at determining conflicts between dependencies
I think this is a problem that a lot of the tools ultimately face. It's a tricky problem to solve that a lot of package managers struggle with.
The "quick" option would be to do what NPM etc. do for the JS ecosystem and just download every required version of a package. It's not great for storage space though :/
The conda executable is unbearable if you use the conda-forge channel. However, there is a compiled reimplementation of the executable called "mamba", which makes resolving faster by at least an order of magnitude.
What is wrong with pip + miniconda? venvs are not very useful: too much redundancy in the day-to-day job.
It depends what you're writing! Dockerising is much easier if you're not using Conda - environment.yaml files are fine, but they're not strictly reproducible and can lead to some of the dependency resolving issues I mentioned in the video.
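If you do stick with Conda, you can at least make the export a bit more deterministic; a quick sketch of the two common options:
```zsh
# export exact pins for everything currently installed (most reproducible)
conda env export > environment.lock.yml
# or export only the packages you explicitly asked for (most portable)
conda env export --from-history > environment.yml
```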
pip-tools + docker is all you need
Pip tools is certainly great, and it's wonderful for deploying stuff. Poetry works really well for publishing packages, which is what I mostly use Python for these days, so I tend to use that more often
Poetry is great but so bloated.
To be honest, I've never had problems with pip. Maybe because I am using Docker?
I also used Docker when I had problems! But yes, if you do your development in Docker, it will help. Pip has also improved a lot recently, too.
"Story for another time". What's in the mess? How do they all work together or work around each other?
For an in-depth look, take a read of this wonderful article: pradyunsg.me/blog/2023/01/21/thoughts-on-python-packaging/
TL;DR: There are too many tools in the ecosystem that are actively competing, and not working together. They all have different ways of approaching the problem (fine) and have some different features (also fine). The problem is that these tools are pretty much all endorsed by PyPA (the Python packaging authority) and while they're all slightly different, they're all fundamentally solving the same problem, causing unnecessary competition and segregating the community.
You just have to look at some of the comments on this video, which is a mostly objective overview of available tools, to see how divided people can get over this decision.
Other modern languages have a single recommended tool, often shipped as part of the language itself. While Python has legacy concerns that these don't, Python doesn't even have a single package repository! Conda uses conda-forge or other repos instead of PyPI, so you could install the "latest" version of a package using pip and get a different version than if you'd used Conda.
Not ideal. Even JavaScript has npmjs, and you know you've fucked it when JS is doing something better than you...
@@IsaacHarrisHolt Thanks!
Wait isn't this editing style "inspired" by No Boilerplate?
Very much so! Tris has been a big influence, as have a few others. It was Tris who put me onto the idea of using Obsidian Advanced Slides for video, for sure
I like Rye, written in Rust by Armin Ronacher, the author of Flask. Very fast and sexy.
Yes! I actually made a video about it
I've avoided conda. I like to keep my Python vanilla. Every time I add something over the top, it's one more level of abstraction for me to work through. All the additions are cute, but in the end it's more complexity that I can do without.
That's totally valid! I don't like conda either, but it does have its uses. You'll see that firsthand if you do any work with data scientists. The reason I like Poetry is because it's just an abstraction on top of pip, so you can still use all the pip commands if you need to.
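A minimal example of what I mean (the wheel filename is hypothetical):
```zsh
# the Poetry-managed virtualenv still has pip available inside it
poetry run pip list
poetry run pip install --no-deps ./some_local_package-1.0-py3-none-any.whl
```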
It's a *nightmare.* We'll soon need a package manager's manager to manage the way package managers manage each other, and an AI manager to manage the AI that tries to help us remember which command names from different package managers are the same and which are different.
Disks get full of libraries we don't use. No links, copies everywhere, multiple lock files…
Companies hire people just because they know how to use the latest package manager, even if they suck at algorithms. And what do they build and sell? Guess what... a package manager.
At this point, I want to go back to the days when I had to manually place files in the file system one at a time. Back then, I only needed to know four bash commands to know where things were installed.
Why does every damn new package manager have to invent its own non-standard way and alternative vocabulary instead of extending what already exists and following standards? I hate them all.
Agreed! It's a complete nightmare
On top of that, people also FIGHT over which package manager is the best, like wtf?
Poetry causes so much grief when combined with docker though.
Ooh interesting you say that! I used to use Poetry with Docker and it worked fine for me, so I'm curious to know what issues you've faced.
Poetry had a random installation failure on some version change.
Again, *random*, which is A DECISION of its own.
Can't trust the dev team -> don't use it.
I've never seen this behaviour, so if you could share some examples that would be great!
Sometimes you want to install just the deps and then the source in separate build stages, so you can skip the deps install and only copy the source in another stage to hit cache during dev builds which speeds things up. Poetry made this quite difficult.
Where did my previous comment go... >.
Once I discovered Poetry I never looked back
It's brilliant! I reach for it almost all the time
Nice, but it needs more hands-on explanation, please. Ty
Sure! What would you like to see more of?
I prefer pdm
I've not used it! What makes it good?
Conda is slllooooooooooww. Micromamba is slightly better, but can still be slow if you run into conflicts
Good to know! I'm not that familiar with tools in the data science space, so I went with conda as the most well-known one
how are python packages worse than npm 💀
Don't get me STARTED on npm, but at least it has a lock file...
This meme salad should not be confused with an educational video.
Ooh, I'm curious why you said this! Please explain :)
Docker: "Am I a joke to you"?
These solutions can still all be used with Docker :)
@@IsaacHarrisHolt But why on earth would I run two projects in the same container?
I'm confused. Why would you need to do that?
maybe don't use python for big projects
There's nothing wrong with Python at enterprise scale! Most applications don't need the lower latency that a compiled language would afford them
lot of "env"s... as a frontend dev I feel like at home. 👎
It's not the best situation in the world, for sure
@@IsaacHarrisHolt Good video BTW. Though I can't use a tool if I'm not able to distinguish its name from other similar tools. I used Poetry once for a small project someone else created, and I don't remember whether it ever finished resolving before I uninstalled it 1000 years later. And now conda export > environment.yml keeps leaking my real name in its don't-know-what-it-does "prefix" field. Otherwise I have to conda install and manually edit the environment.yml like an ancient monkey. What is your recommendation? 😂
Take a look at some of the newer package managers like UV. That might suit your needs
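In the meantime, one way to keep the prefix line (and your username) out of the exported file is something like this (a sketch, assuming a POSIX shell):
```zsh
# drop the machine-specific "prefix:" line before committing the file
conda env export --no-builds | grep -v '^prefix:' > environment.yml
```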