Create virtual environments with uv


Environment Management with uv

Disclaimer: This post has been translated to English using a machine translation model. Please let me know if you find any mistakes.

So far I have been managing my environments with conda, but for a while now I have been reading a lot about Poetry and, above all, about uv. What is the main advantage of uv? Speed. uv is implemented in Rust, so it creates environments and installs packages extremely quickly.

The following table shows the speed difference between different package managers. Source: LLMs-from-scratch/setup/01_optional-python-setup-preferences/native-uv.md

Command                  Speed
conda install <pkg>      slow
pip install <pkg>        up to 10 times faster than the previous one
uv pip install <pkg>     between 5 and 10 times faster than the previous one
uv add <pkg>             between 2 and 5 times faster than the previous one

Looking at the table, it's definitely worth using uv. So let's see how to create an environment and install packages with uv.
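
If you want to get a rough feel for these differences on your own machine, an informal check (just a sketch, not a rigorous benchmark, since caching, network speed and the chosen package all affect the result) is to time the same installation with each tool:

# Informal comparison; numpy is just an arbitrary example package.
# uv pip install expects an active virtual environment (or the --system flag).
time pip install numpy
time uv pip install numpy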

Repository download

As I said, I am using LLMs-from-scratch/setup/01_optional-python-setup-preferences/native-uv.md as the source, so let's download the repository, install the proposed environment, and see how to run a script.

We use --depth 1 to download only the latest commit of the repository so that it clones faster; we are not interested in the history.

	
git clone https://github.com/rasbt/LLMs-from-scratch.git --depth 1
Cloning into 'LLMs-from-scratch'...
remote: Enumerating objects: 260, done.
remote: Counting objects: 100% (260/260), done.
remote: Compressing objects: 100% (226/226), done.
remote: Total 260 (delta 61), reused 121 (delta 22), pack-reused 0 (from 0)
Receiving objects: 100% (260/260), 1.64 MiB | 6.94 MiB/s, done.
Resolving deltas: 100% (61/61), done.

Now we move into the repository we have just downloaded.

	
cd LLMs-from-scratch

Install uv

If we are on macOS or Linux, we can install it with the command

curl -LsSf https://astral.sh/uv/install.sh | sh

If we are on Windows, we can install it from PowerShell with

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
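
Either way, once the installation finishes we can check that uv is available by printing its version (if the command is not found, opening a new terminal so the PATH update takes effect usually fixes it):

uv --version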

Create environment

If we do an ls we can see that there is a file called pyproject.toml; this is the file uv will use to create the environment.

	
ls
2025-03-10-uv.ipynb appendix-D ch04 pyproject.toml
CITATION.cff appendix-E ch05 requirements.txt
LICENSE.txt ch01 ch06 setup
README.md ch02 ch07
appendix-A ch03 pixi.toml

So let's see what the file contains

	
cat pyproject.toml
[project]
name = "llms-from-scratch"
version = "0.1.0"
description = "Implement a ChatGPT-like LLM in PyTorch from scratch, step by step"
readme = "README.md"
requires-python = ">=3.10"
dependencies = [
    "torch>=2.3.0",
    "jupyterlab>=4.0",
    "tiktoken>=0.5.1",
    "matplotlib>=3.7.1",
    "tensorflow>=2.18.0",
    "tqdm>=4.66.1",
    "numpy>=1.26,<2.1",
    "pandas>=2.2.1",
    "pip>=25.0.1",
]

[tool.setuptools.packages]
find = {}

[tool.uv.sources]
llms-from-scratch = { workspace = true }

[dependency-groups]
dev = [
    "llms-from-scratch",
]

[tool.ruff]
line-length = 140

[tool.ruff.lint]
exclude = [".venv"]
# Ignored rules (W504 removed)
ignore = [
    "C406", "E226", "E402", "E702", "E703",
    "E722", "E731", "E741"
]

As you can see, it contains metadata such as the project name and version, as well as the dependencies, which are the packages that will be installed.

To create the environment, we use the command uv sync, and we add the --dev flag to also install development dependencies and the --python flag to specify the version of Python we want to use.

	
uv sync --dev --python 3.11
Using CPython 3.11.11
Creating virtual environment at: .venv
Resolved 160 packages in 175ms
Installed 139 packages in 1.46s
+ absl-py==2.1.0
+ anyio==4.8.0
+ appnope==0.1.4
+ argon2-cffi==23.1.0
+ argon2-cffi-bindings==21.2.0
+ arrow==1.3.0
+ asttokens==3.0.0
+ astunparse==1.6.3
+ async-lru==2.0.4
+ attrs==25.1.0
+ babel==2.17.0
+ beautifulsoup4==4.13.3
+ bleach==6.2.0
+ certifi==2025.1.31
+ cffi==1.17.1
+ charset-normalizer==3.4.1
+ comm==0.2.2
+ contourpy==1.3.1
+ cycler==0.12.1
+ debugpy==1.8.13
+ decorator==5.2.1
+ defusedxml==0.7.1
+ executing==2.2.0
+ fastjsonschema==2.21.1
+ filelock==3.17.0
+ flatbuffers==25.2.10
+ fonttools==4.56.0
+ fqdn==1.5.1
+ fsspec==2025.3.0
+ gast==0.6.0
+ google-pasta==0.2.0
+ grpcio==1.70.0
+ h11==0.14.0
+ h5py==3.13.0
+ httpcore==1.0.7
+ httpx==0.28.1
+ idna==3.10
+ ipykernel==6.29.5
+ ipython==9.0.2
+ ipython-pygments-lexers==1.1.1
+ isoduration==20.11.0
+ jedi==0.19.2
+ jinja2==3.1.6
+ json5==0.10.0
+ jsonpointer==3.0.0
+ jsonschema==4.23.0
+ jsonschema-specifications==2024.10.1
+ jupyter-client==8.6.3
+ jupyter-core==5.7.2
+ jupyter-events==0.12.0
+ jupyter-lsp==2.2.5
+ jupyter-server==2.15.0
+ jupyter-server-terminals==0.5.3
+ jupyterlab==4.3.5
+ jupyterlab-pygments==0.3.0
+ jupyterlab-server==2.27.3
+ keras==3.9.0
+ kiwisolver==1.4.8
+ libclang==18.1.1
+ markdown==3.7
+ markdown-it-py==3.0.0
+ markupsafe==3.0.2
+ matplotlib==3.10.1
+ matplotlib-inline==0.1.7
+ mdurl==0.1.2
+ mistune==3.1.2
+ ml-dtypes==0.4.1
+ mpmath==1.3.0
+ namex==0.0.8
+ nbclient==0.10.2
+ nbconvert==7.16.6
+ nbformat==5.10.4
+ nest-asyncio==1.6.0
+ networkx==3.4.2
+ notebook-shim==0.2.4
+ numpy==2.0.2
+ opt-einsum==3.4.0
+ optree==0.14.1
+ overrides==7.7.0
+ packaging==24.2
+ pandas==2.2.3
+ pandocfilters==1.5.1
+ parso==0.8.4
+ pexpect==4.9.0
+ pillow==11.1.0
+ pip==25.0.1
+ platformdirs==4.3.6
+ prometheus-client==0.21.1
+ prompt-toolkit==3.0.50
+ protobuf==5.29.3
+ psutil==7.0.0
+ ptyprocess==0.7.0
+ pure-eval==0.2.3
+ pycparser==2.22
+ pygments==2.19.1
+ pyparsing==3.2.1
+ python-dateutil==2.9.0.post0
+ python-json-logger==3.3.0
+ pytz==2025.1
+ pyyaml==6.0.2
+ pyzmq==26.2.1
+ referencing==0.36.2
+ regex==2024.11.6
+ requests==2.32.3
+ rfc3339-validator==0.1.4
+ rfc3986-validator==0.1.1
+ rich==13.9.4
+ rpds-py==0.23.1
+ send2trash==1.8.3
+ setuptools==76.0.0
+ six==1.17.0
+ sniffio==1.3.1
+ soupsieve==2.6
+ stack-data==0.6.3
+ sympy==1.13.1
+ tensorboard==2.18.0
+ tensorboard-data-server==0.7.2
+ tensorflow==2.18.0
+ tensorflow-io-gcs-filesystem==0.37.1
+ termcolor==2.5.0
+ terminado==0.18.1
+ tiktoken==0.9.0
+ tinycss2==1.4.0
+ torch==2.6.0
+ tornado==6.4.2
+ tqdm==4.67.1
+ traitlets==5.14.3
+ types-python-dateutil==2.9.0.20241206
+ typing-extensions==4.12.2
+ tzdata==2025.1
+ uri-template==1.3.0
+ urllib3==2.3.0
+ wcwidth==0.2.13
+ webcolors==24.11.1
+ webencodings==0.5.1
+ websocket-client==1.8.0
+ werkzeug==3.1.3
+ wheel==0.45.1
+ wrapt==1.17.2

It has created the environment and installed all the packages almost instantly.

Moreover, if we now list the directory contents again (with ls -a, since hidden files are not shown by a plain ls) we will see a new folder called .venv: that is the virtual environment folder.

	
ls -a
. CITATION.cff ch02 pyproject.toml
.. LICENSE.txt ch03 requirements.txt
.git README.md ch04 setup
.github appendix-A ch05 uv.lock
.gitignore appendix-D ch06
.venv appendix-E ch07
2025-03-10-uv.ipynb ch01 pixi.toml
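
As a quick sanity check, we can also confirm which Python version the new environment uses; on macOS or Linux the interpreter lives inside the .venv folder:

.venv/bin/python --version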

Add packages

If we want to add packages to our environment that are not in the pyproject.toml file, we can do so with the command uv add <pkg>.

For example, if we run cat pyproject.toml | grep dotenv we will see that the python-dotenv package is not listed among the dependencies.

	
cat pyproject.toml | grep dotenv

So we add the package

	
uv add dotenv
Resolved 162 packages in 92ms
Installed 2 packages in 5ms
+ dotenv==0.9.9
+ python-dotenv==1.0.1

If we now run cat pyproject.toml | grep dotenv again, we will see that it has been added to the file.

	
cat pyproject.toml | grep dotenv
"dotenv>=0.9.9",

This is very useful, because with the updated pyproject.toml (and the uv.lock file) we can recreate the same environment with the uv sync command on any other computer.
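
For example, setting everything up on a new machine could look something like this (a minimal sketch using the same repository and Python version as in this post):

git clone https://github.com/rasbt/LLMs-from-scratch.git --depth 1
cd LLMs-from-scratch
uv sync --dev --python 3.11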

Running a script

Now that we have an environment, we can run a script in two ways. The first is with uv run python <script>.py, which activates the .venv environment and runs the script.

	
uv run python setup/02_installing-python-libraries/python_environment_check.py
[OK] Your Python version is 3.11.11
[OK] torch 2.6.0
[OK] jupyterlab 4.3.5
[OK] tiktoken 0.9.0
[OK] matplotlib 3.10.1
[OK] tensorflow 2.18.0
[OK] tqdm 4.67.1
[OK] numpy 2.0.2
[OK] pandas 2.2.3
[OK] psutil 7.0.0

However, if what we want is to run the script directly with python <script>.py, we need to activate the environment manually first.

	
source .venv/bin/activate && python setup/02_installing-python-libraries/python_environment_check.py
[OK] Your Python version is 3.11.11
[OK] torch 2.6.0
[OK] jupyterlab 4.3.5
[OK] tiktoken 0.9.0
[OK] matplotlib 3.10.1
[OK] tensorflow 2.18.0
[OK] tqdm 4.67.1
[OK] numpy 2.0.2
[OK] pandas 2.2.3
[OK] psutil 7.0.0
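
When we finish working, we can leave the manually activated environment with the standard deactivate command:

deactivate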
