Memory Calculation
Disclaimer: This post has been translated to English using a machine translation model. Please let me know if you find any mistakes.
If you want to calculate the memory you need to run a model, use this space from HuggingFace.
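As a rough back-of-the-envelope check (a simple rule of thumb, not the exact method the HuggingFace space uses), you can estimate the memory from the parameter count and the bytes per parameter of the data type, plus some overhead for activations and buffers. The ~20% overhead factor below is an assumption for illustration:

```python
# Rough estimate of the memory needed to load a model for inference.
# Assumption: memory ≈ number of parameters × bytes per parameter,
# times ~1.2 for activation/buffer overhead (a rule of thumb, not the
# exact formula of the HuggingFace space linked above).

BYTES_PER_DTYPE = {"float32": 4, "float16": 2, "bfloat16": 2, "int8": 1, "int4": 0.5}

def estimate_memory_gb(num_params: float, dtype: str = "float16", overhead: float = 1.2) -> float:
    """Return an approximate memory footprint in GB."""
    return num_params * BYTES_PER_DTYPE[dtype] * overhead / 1e9

# A 7B-parameter model in float16 needs on the order of 17 GB
print(round(estimate_memory_gb(7e9, "float16"), 1))
```

For example, a 7B-parameter model at float16 (2 bytes per parameter) comes out to roughly 14 GB of weights alone, and around 17 GB once overhead is included, which is why such models typically do not fit on a 16 GB GPU without quantization.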
Learn how to use Langchain with the most popular open-source integrations. In this post, we will explore how to integrate Langchain with ChromaDB, Ollama, and HuggingFace.
Forget about Ctrl+F! 🤯 With RAG, your documents will answer your questions directly. 😎 Step-by-step tutorial with Hugging Face and ChromaDB. Unleash the power of AI (and show off to your friends)! 💪
😠 Are your commits written in alien language? 👽 Join the club! 😅 Learn Conventional Commits in Python and stop torturing your team with cryptic messages. git-changelog and commitizen will be your new best friends. 🤝
Increase DataLoader performance with pin_memory and num_workers
Python library to get GPU data like `nvidia-smi`
Is your colleague Patric writing code that is hard to read? Share with him the code formatter I show you in this post! Come in and learn how to format code to make it more understandable. We are not going to solve Patric's problems, but at least you won't suffer when reading his code.
Hugging Face Spaces let us run models with very simple demos, but what if a demo breaks, or its creator deletes it? That's why I've created Docker containers for some interesting Spaces, so they can be used locally no matter what happens. In fact, if you click on any project's view button, it may take you to a Space that no longer works.
Dataset with jokes in English
Dataset with translations from English to Spanish
Dataset with Netflix movies and series