So I’m no expert, but I have been a hobbyist C and Rust dev for a while now, and I’ve installed tons of programs from GitHub and whatnot that required manual compilation or other hoops to jump through, but I am constantly befuddled installing python apps. They seem to always need a very specific (often outdated) version of python, require a bunch of venv nonsense, googling gives tons of outdated info that no longer works, and generally seem incredibly not portable. As someone who doesn’t work in python, it seems more obtuse than any other language’s ecosystem. Why is it like this?
No, it’s not just you, Python’s tooling is a mess. It’s not necessarily anyone’s fault, but there are a ton of options and a lot of very similarly named things that accomplish different (but sometimes similar) tasks. (pyenv, venv, and virtualenv come to mind.) As someone who considers themselves between beginner and intermediate proficiency in Python, this is my biggest hurdle right now.
Python’s tooling is a mess.
Not only that. It’s a historic mess. Over the years, the toolset kept getting better and better, which left a lot of older projects in a very messy state. So many answers on Stack Overflow mention easy_install - I still don’t know what it is, but I guess it was some kind of proto-uv.
You’re not stupid; Python’s packaging & versioning is a PITA. As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem.
As long as you write it for yourself, you’re good. As soon as you want to share it, you have a problem.
A perfect summary of the history of computer code!
Python’s packaging is not great. Pip and venvs help, but it’s light-years behind anything you’re used to. My go-to is using a venv for everything.
Python is the only programming language that has forced me to question what the difference is between an egg and a wheel.
No, it’s not. E.g., nobody who starts a new project uses setup.py anymore.
OP seems to be trying to install older projects, rather than creating a new project.
Python developer here. Venv is good, venv is life. Every single project I create starts with
python3 -m venv venv
source venv/bin/activate
pip3 install {everything I need}
pip3 freeze > requirements.txt
Now write code!
Don’t forget to update your requirements.txt using pip3 freeze again anytime you add a new library with pip.
If you installed a lot of packages before starting to develop with virtual environments, some libraries will be in your OS python install and won’t be reflected in pip freeze and won’t get into your venv. This is the root of all evil. First of all, don’t do that. Second, you can force libraries to install into your venv despite them also being in your system by installing like so:
pip3 install --ignore-installed mypackage
If you don’t switch between Linux and Windows, most libraries will just work across systems, but if you have problems on another system, just recreate the whole venv structure:
rm -rf venv
(… make a new venv and activate it as above)
pip3 install -r requirements.txt
Once you get the hang of this you can make Python behave without a lot of hassle.
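For anyone following along, here’s a minimal sketch of the whole round trip described above, assuming a POSIX shell and using requests as a stand-in for whatever you actually need:

python3 -m venv venv                  # create the virtual environment
source venv/bin/activate              # point your shell at it
pip3 install requests                 # install your dependencies (example package)
pip3 freeze > requirements.txt        # record the exact versions
# ...later, on a fresh clone or another machine:
rm -rf venv
python3 -m venv venv
source venv/bin/activate
pip3 install -r requirements.txt      # reproduce the same set of packages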
This is a case where a strength can also be a weakness.
You have been in la-la land for too long. That list of things to do is insane. Venv is possibly one of the worst solutions around, but many Python devs are incapable of seeing how bad it is. Just for comparison, so you can understand: in Ruby, literally everything you did is covered by one command, bundle. On every system.
OP sounds like a victim of Python 3, finding various Python 2 projects on the internet; a venv isn’t going to help.
pip3 freeze > requirements.txt
I hate this. Because now I have a list of your dependencies, but also the dependencies of the dependencies, and regular dependencies and dev-dependencies are mixed up. If I’m new to Python, I would have NO idea which libraries are the important ones because it’s a jumbled mess.
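For illustration, here’s roughly what that looks like (hypothetical output, versions elided): install a single direct dependency and freeze, and the file lists its transitive dependencies right alongside it, with nothing marking which one you actually asked for.

pip3 install requests                 # the only package we actually chose
pip3 freeze
# certifi==...                        <- pulled in by requests
# charset-normalizer==...             <- pulled in by requests
# idna==...                           <- pulled in by requests
# requests==...                       <- the one direct dependency
# urllib3==...                        <- pulled in by requests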
I’ve come to love uv (coming from poetry, coming from pip with a requirements/base.txt and a requirements/dev.txt - gotta keep regular dependencies and dev-dependencies separate).
uv sync
uv run <application>
That’s it. I don’t even need to install a compatible Python version, as uv takes care of that for me. It’ll automatically create a local .venv/, and it’s blazingly fast.
This is the way
It’s a stupid way
It’s something of a “14 competing standards” situation, but uv seems to be the nerd favourite these days.
I still do the python3 -m venv venv && source venv/bin/activate
How can uv help me be a better person?
- let pyproject.toml track the dependencies and dev-dependencies you actually care about
- dependencies are what you need to run your application
- dev-dependencies are not necessary to run your app, but to develop it (formatting, linting, utilities, etc.)
- it can track exactly what’s needed to run the application via the uv.lock file that contains each and every lib that’s needed
- uv will install the needed Python version for you, completely separate from what your system is running
- uv sync and uv run <application> is pretty much all you need to get going (see the sketch below)
- it’s blazingly fast in everything
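A minimal sketch of that workflow, assuming a current uv and made-up project/package names:

uv init myapp                         # scaffolds a project with a pyproject.toml
cd myapp
uv add requests                       # regular dependency, recorded in pyproject.toml
uv add --dev ruff                     # dev-dependency, kept separate from the above
uv sync                               # creates .venv/ and installs what uv.lock says
uv run python --version               # or: uv run <your application>

Regular dependencies and the dev group stay separate in pyproject.toml, and uv.lock pins the full tree, so (at least with current versions) someone cloning the project only needs uv sync.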
And pip install -r requirements.txt
Fuck it, I just use sudo and live with the consequences.
You’ll see when you start your second project why this doesn’t work.
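A hypothetical illustration of the second-project problem, with a made-up package name:

sudo pip3 install 'somelib==1.0'      # project A works
sudo pip3 install 'somelib==2.0'      # project B works... and project A is now broken
# There's only one global site-packages, so the second install silently
# replaces the version the first project depended on. A venv per project
# (or a tool that manages one for you) avoids this entirely.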
I’m not sure this can really be fixed with Python 3; maybe we just have to hope for Python 4.
It’s fixed, and the python version had nothing to do with it. Just use hatch
Ah yes, the 15th standard we’ve been waiting for!
It’s not a standard, it’s built on standards.
You can also use Poetry (which recently grew standard metadata support) or plain uv venv if you want to do things manually but fast.
Just use this one… or any of these 4 others.
This is the issue for us Python outsiders. Each time we try, we get a different answer with new tools. We are outside of the community; we don’t know the trends, old and new, the pros and cons.
Your first recommendation is hatch… first time I’ve heard of it. uv seems trendy in this thread, but before that it was unknown to me too.
As I understand it, this should be pip’s job. When it detects I’m in a project, it installs packages into it and Python uses them. It can use any tool under the hood, but the default package manager should be able to do it on its own.
Just out of curiosity, I haven’t seen anyone recommend miniconda… Why so? Is there something wrong with it that I’m not aware of?
I’m no expert, but I totally feel you: Python packages, dependencies, and version matching are a real nightmare. Even with venv I had a hard time making everything work flawlessly, especially on macOS.
However, with miniconda everything was way easier to configure and worked as expected.
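For reference, the conda flow looks roughly like this (a sketch assuming a standard miniconda install; the env name and packages are just examples):

conda create -n myproject python=3.12   # an environment with its own Python
conda activate myproject
conda install numpy                     # install from conda channels...
pip install requests                    # ...or use pip inside the same environment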
Isn’t conda specifically for mathy things?
I haven’t heard of Mathy, but it seems to be a math tool?
From what I gathered, miniconda is like pipx or venv. It’s able to create python virtual environments.
But I’m very new to all of this, so I’m not really a good source. However, after experimenting with each of them (venv, pip, and miniconda), I found miniconda the easiest to use, but that’s also probably a skill issue.
I was genuinely asking because there could be something I wasn’t aware of, since yeah, I’m new to all of this (proprietary, bugs, not the right tool…).
You seem to be into programming, maybe you could give me some pointers here?
By mathy I mean related to math
Difficult? How so? I find compiling C and C++ stuff much more difficult than anything Python. It never works on the first try, whereas with Python the chances are much, much higher.
What is so difficult to understand about virtual envs? You have global Python packages, you can also have per-user Python packages, and you can create virtual environments to install packages into. Why do people struggle to understand this?
The global packages are found thanks to default locations, which can be overridden with environment variables. Virtual environments set those environment variables so they point to different locations.
python -m venv .venv/ means Python will execute the module venv and tell it to create a virtual environment in the .venv folder in the current directory. As mentioned above, the environment variables have to be set to actually use it. That’s when source .venv/bin/activate comes into play (there are other scripts for zsh and fish). Now you can run pip install $package and then run the package’s command if it has one.
It’s that simple. If you want to, you can make it difficult by doing sudo pip install $package and fucking up your global packages by possibly updating a dependency of another package - just like the equivalent of updating glibc from 1.2 to 1.3 and breaking every application depending on 1.2 because glibc doesn’t fucking follow goddamn semver.
As for old versions of Python, bro, give me a break. There’s pyenv for that if whatever old-ass package you’re installing depends on an ancient 10-year-old Python version. You really think building a C++ package from 10 years ago will work more smoothly than Python? Have fun tracking down all the unlocked dependency versions that "Worked On My Machine™" at the start of the century.
The only Python packages I have trouble installing are those with C/C++ dependencies which have to be compiled at install time.
Y’all have got to be meme’ing.
This is exactly how I feel about python as well… IMHO, it’s good for some advanced stuff, where bash starts to hit its limits, but I’d never touch it otherwise