Python Virtual Environments: Stop Installing Everything Globally
Why virtual environments exist, how to use venv/virtualenv/conda/poetry, and how to stop breaking your Python setup every time you start a new project.
You've probably been there. You install a library for one project, it upgrades a dependency, and suddenly a completely different project stops working. Or worse -- you run pip install and something deep in your system Python breaks. This is the problem virtual environments solve, and once you understand them, you'll never go back.
What Virtual Environments Actually Are
A virtual environment is just an isolated folder containing a Python interpreter and its own set of installed packages. When you "activate" one, your shell's PATH gets modified so that python and pip point to that folder instead of your system Python.
That's it. There's no VM, no container, no magic. It's literally a directory with a copy (or symlink) of the Python binary and a site-packages folder.
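You can verify that for yourself with the standard-library venv module, which creates an environment programmatically. A minimal sketch (with_pip=False skips bootstrapping pip so it runs fast; the temp directory is just for illustration):

```python
import tempfile
import venv
from pathlib import Path

# Create a throwaway environment in a temp directory
env_dir = Path(tempfile.mkdtemp()) / "myenv"
venv.create(env_dir, with_pip=False)

# It's just files: a pyvenv.cfg, a bin/ (Scripts\ on Windows)
# holding the python executables, and a lib/ with site-packages
print((env_dir / "pyvenv.cfg").read_text())
for entry in sorted(env_dir.iterdir()):
    print(entry.name)
```

The pyvenv.cfg file is what tells the interpreter "you're running inside a venv" and where the base installation lives.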
Why You Need Them
Say you have two projects:
- Project A needs requests==2.28.0
- Project B needs requests==2.31.0
With a single global Python, you can only have one version of requests installed at a time. With virtual environments, each project gets its own isolated site-packages. No conflicts.
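You can check which single version an environment holds using the standard-library importlib.metadata. A sketch (pip is queried here only because it's usually present; requests may or may not be installed in your environment):

```python
from importlib.metadata import PackageNotFoundError, version
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the version of a package in *this* environment, or None."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Each environment answers differently -- that's the isolation
print("pip:", installed_version("pip"))
print("requests:", installed_version("requests"))
```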
The other big reason: reproducibility. The classic "it works on my machine" problem. If your project relies on whatever random packages happen to be installed globally on your laptop, good luck deploying it anywhere else. A virtualenv plus a requirements.txt pins exactly what your project needs.
The Tools: venv vs virtualenv vs conda vs poetry
venv (the built-in option)
Ships with Python 3.3+. No install needed.
python -m venv myenv # Create it
source myenv/bin/activate # Activate (Linux/Mac)
myenv\Scripts\activate # Activate (Windows)
deactivate # Leave the environment
This is the right choice for most people most of the time. It's simple, it's built-in, it works.
virtualenv
The third-party predecessor to venv. Slightly faster at creating environments, supports Python 2 (if you still care), and has some extra features like --copies vs --symlinks control.
pip install virtualenv
virtualenv myenv
Unless you need those extras, just use venv.
conda
Different philosophy entirely. Conda manages not just Python packages but also non-Python dependencies (C libraries, CUDA toolkits, etc.). Popular in data science and ML because packages like NumPy and TensorFlow have complex native dependencies.
conda create -n myproject python=3.11
conda activate myproject
conda install numpy pandas
The downside: conda environments are heavier, the resolver can be slow (though mamba fixes this), and mixing conda install with pip install inside a conda env sometimes causes headaches.
poetry
Poetry is a full project management tool that handles virtual environments, dependency resolution, and packaging all in one.
poetry new myproject
cd myproject
poetry add requests
poetry install
poetry shell # Activate the venv
Poetry uses pyproject.toml instead of requirements.txt, which is the direction the Python ecosystem is heading. If you're starting a new project and want something modern, poetry is worth considering.
requirements.txt vs pyproject.toml
requirements.txt is the traditional approach:
requests==2.31.0
flask>=3.0,<4.0
numpy~=1.26.0
You generate it with pip freeze > requirements.txt and install with pip install -r requirements.txt. Simple, universal, everyone understands it.
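For illustration, here's a rough sketch of how those lines split into a package name and a version constraint. This hand-rolled regex is just to show the shape -- real tools parse requirements with the packaging library, not a one-liner like this:

```python
import re

# Name characters stop at the first specifier character (=, >, <, ~)
# -- a sketch only, not a spec-compliant parser
REQ = re.compile(r"^([A-Za-z0-9_.-]+)\s*(.*)$")

requirements = """\
requests==2.31.0
flask>=3.0,<4.0
numpy~=1.26.0
"""

for line in requirements.splitlines():
    name, spec = REQ.match(line).groups()
    print(f"{name:10} -> {spec or 'any version'}")
```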
The problem: pip freeze dumps everything including transitive dependencies. You wanted flask but now your requirements file lists 15 packages, and you can't tell which ones you actually depend on vs which ones Flask pulled in. pyproject.toml, the modern alternative, declares only your direct dependencies:
[project]
name = "myproject"
dependencies = [
"requests>=2.31",
"flask>=3.0",
]
[project.optional-dependencies]
dev = ["pytest", "black", "mypy"]
This separates your direct dependencies from the full resolved tree. Poetry and pip-tools both work with this approach. It's cleaner, and it's where the ecosystem is going.
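Python 3.11+ can read that file with the standard-library tomllib. A minimal sketch (the import fails on older interpreters, so the code bails out gracefully; the TOML is inlined here instead of read from disk just to keep the example self-contained):

```python
try:
    import tomllib  # standard library on Python 3.11+
except ImportError:
    tomllib = None

PYPROJECT = """\
[project]
name = "myproject"
dependencies = [
    "requests>=2.31",
    "flask>=3.0",
]

[project.optional-dependencies]
dev = ["pytest", "black", "mypy"]
"""

if tomllib is not None:
    data = tomllib.loads(PYPROJECT)
    print("direct deps:", data["project"]["dependencies"])
    print("dev extras: ", data["project"]["optional-dependencies"]["dev"])
```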
Common Mistakes
1. Installing packages globally
# Don't do this
pip install flask
# Do this instead
python -m venv venv
source venv/bin/activate
pip install flask
Every time you run pip install outside a virtualenv, you're polluting your global Python. Eventually something breaks.
2. Forgetting to activate
You create the virtualenv but then forget to activate it before installing packages. Your packages go into the global Python and you wonder why your environment is empty.
Check which Python you're using:
which python # Should point to your venv, not /usr/bin/python
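From inside Python, the same check is available: sys.executable is the path of the running interpreter, and since Python 3.3 you can detect a venv by comparing sys.prefix with sys.base_prefix. A sketch:

```python
import sys

# The running interpreter -- the Python equivalent of `which python`
print("interpreter:", sys.executable)

# In a venv, sys.prefix points at the venv directory while
# sys.base_prefix still points at the underlying installation
in_venv = sys.prefix != sys.base_prefix
print("inside a virtualenv:", in_venv)
```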
3. Committing the virtualenv folder
The venv/ or myenv/ folder should be in your .gitignore. It contains platform-specific binaries. Commit your requirements.txt or pyproject.toml instead -- that's how other people recreate the environment.
4. Not pinning versions
# Bad - installs whatever version is latest right now
requests
# Good - pinned to a specific version
requests==2.31.0
Unpinned dependencies mean your project builds differently depending on when you run pip install. Pin your versions. Your future self will thank you.
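To see what fully pinned output looks like, you can invoke pip freeze from Python against the current interpreter. A sketch (--all makes pip include itself in the listing, which it normally omits; this assumes pip is installed in the environment):

```python
import subprocess
import sys

# Run pip against *this* interpreter, not whatever `pip` is on PATH
result = subprocess.run(
    [sys.executable, "-m", "pip", "freeze", "--all"],
    capture_output=True, text=True, check=True,
)

# Every line is an exact pin: name==version
for line in result.stdout.splitlines()[:5]:
    print(line)
```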
A Practical Workflow
Here's what I actually do for every new Python project:
mkdir myproject && cd myproject
python -m venv venv
source venv/bin/activate
pip install --upgrade pip
# Install what I need
pip install flask pytest black
# Freeze dependencies
pip freeze > requirements.txt
# Add venv to gitignore
echo "venv/" >> .gitignore
When someone else clones the project:
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
Done. Same environment, same versions, no surprises.
If you want to practice writing Python in an isolated environment without worrying about any of this setup, CodeUp gives you a browser-based coding environment where you can focus on the code itself.
The Bottom Line
Virtual environments aren't optional -- they're a fundamental part of working with Python. The tool you pick matters less than actually using one consistently. Start with venv because it's built-in and simple. Graduate to poetry or conda when your projects demand it. And never, ever run pip install without checking which Python it's pointing at first.