RAG fundamentals
Forget about Ctrl+F! 🤯 With RAG, your documents will answer your questions directly. 😎 Step-by-step tutorial with Hugging Face and ChromaDB. Unleash the power of AI (and show off to your friends)! 💪
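The full tutorial uses Hugging Face embeddings and ChromaDB; as a dependency-free taste of the core idea, here is a toy sketch of the retrieval step that scores documents by word overlap with the question instead of real embeddings. All names and documents here are made up for illustration.

```python
# Toy sketch of RAG retrieval: score documents by word overlap with the
# question instead of real embeddings (the post uses Hugging Face + ChromaDB).

def retrieve(question, documents, top_k=1):
    """Return the top_k documents sharing the most words with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

docs = [
    "The GPU monitor shows temperature and memory usage.",
    "RAG retrieves relevant documents before generating an answer.",
    "Conventional commits keep git history readable.",
]

# Retrieve context, then stuff it into the prompt for the generator model
context = retrieve("How does RAG find relevant documents?", docs)
prompt = f"Answer using this context: {context[0]}"
```

The real pipeline replaces the word-overlap score with embedding similarity, but the retrieve-then-generate shape is the same.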
The Ultimate Real-Time GPU Tracking Tool
Monitor your GPU's performance, temperature, and memory usage directly from your Ubuntu menu bar with GPU Monitor. This user-friendly and efficient application supports multiple GPUs and is fully integrated with the latest Ubuntu operating system. Get live updates and optimize your gaming or development tasks. Download now and take control of your GPU's health today!
GPU Monitor is an intuitive tool designed for developers, gamers, and professionals who need to keep an eye on their graphics card's performance and health in real time. It integrates seamlessly with the Ubuntu menu bar, providing essential information at your fingertips.
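The project's internals are not reproduced here, but a tool like this typically polls NVIDIA's management library through pynvml (one of the dependencies installed below). The following is an assumed sketch of such a polling loop, not GPU Monitor's actual code.

```python
# Sketch of how a tool like GPU Monitor can poll NVIDIA stats with pynvml.
# This is an assumption about a typical implementation, not the project's code.

def format_mem(used_bytes, total_bytes):
    """Render memory usage as 'used/total MiB' (pure helper, no GPU needed)."""
    mib = 1024 * 1024
    return f"{used_bytes // mib}/{total_bytes // mib} MiB"

def read_gpu_stats():
    """Return one dict per GPU with utilization, temperature and memory."""
    import pynvml  # imported here so format_mem works without a GPU present
    pynvml.nvmlInit()
    stats = []
    try:
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            stats.append({
                "gpu": i,
                "util_percent": util.gpu,
                "temp_c": temp,
                "memory": format_mem(mem.used, mem.total),
            })
    finally:
        pynvml.nvmlShutdown()
    return stats
```

A menu-bar app would call `read_gpu_stats()` on a timer and render the result next to the clock.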
Clone with https
git clone https://github.com/maximofn/gpu_monitor.git
or with ssh
git clone git@github.com:maximofn/gpu_monitor.git
Make sure you do not have any venv or conda environment activated
if [ -n "$VIRTUAL_ENV" ]; then
    deactivate
fi
if command -v conda &>/dev/null; then
    conda deactivate
fi
Now install the dependencies
sudo apt-get install python3-gi python3-gi-cairo gir1.2-gtk-3.0
sudo apt-get install gir1.2-appindicator3-0.1
pip3 install nvidia-ml-py3
pip3 install pynvml
Execute this script
./add_to_startup.sh
Then, when you restart your computer, GPU Monitor will start automatically.
If you like it consider giving the repository a star ⭐, but if you really like it consider buying me a coffee ☕.
😠 Are your commits written in an alien language? 👽 Join the club! 😅 Learn Conventional Commits in Python and stop torturing your team with cryptic messages. git-changelog and commitizen will be your new best friends. 🤝
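The post leans on commitizen and git-changelog for the heavy lifting; as a minimal illustration of what "Conventional Commits" means, here is a hedged sketch of a header check. The regex is an approximation of the convention, not the exact rule those tools apply.

```python
import re

# Illustrative approximation of a Conventional Commits header:
#   type(optional scope)!: subject
# The tools in the post (commitizen, git-changelog) apply richer rules.
CONVENTIONAL_RE = re.compile(
    r"^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)"
    r"(\([\w\-]+\))?(!)?: .+"
)

def is_conventional(message: str) -> bool:
    """True if the first line follows the type(scope): subject shape."""
    return bool(CONVENTIONAL_RE.match(message.splitlines()[0]))
```

So `feat(api): add user login endpoint` passes, while `fixed the bug` does not.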
Have you ever talked to an LLM and gotten an answer that sounds like it's been drinking vending-machine coffee all night long? 😂 That's what we call a hallucination in the LLM world! But don't worry, it's not that your language model is crazy (although it can sometimes seem that way 🤪). The truth is that LLMs can be a bit... creative when it comes to generating text. But thanks to DoLa, a decoding method that contrasts the outputs of different layers to improve the factuality of LLMs, we can keep our language models from turning into science fiction writers 😂. In this post, I'll explain how DoLa works and show you a code example so you can better understand how to make your LLMs more reliable and less prone to making up stories. Let's save our LLMs from insanity and make them more useful! 🚀
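The post walks through DoLa in detail; the core idea can be sketched with toy numbers. DoLa scores each candidate token by contrasting the final ("mature") layer's log-probability against an early ("premature") layer's, boosting tokens whose probability grows as information flows up the network. The logits below are invented for illustration, not real model outputs.

```python
import math

# Toy illustration of DoLa's core contrast: log p_final - log p_early.
# Tokens the final layer favors over the early layer (factual knowledge
# acquired deep in the network) get boosted; tokens the early layer already
# liked (surface patterns) get penalized. All numbers are made up.

def softmax(logits):
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def dola_scores(final_logits, early_logits):
    """Contrastive score per candidate token."""
    p_final = softmax(final_logits)
    p_early = softmax(early_logits)
    return [math.log(f) - math.log(e) for f, e in zip(p_final, p_early)]

# Candidate next tokens: ["Seattle", "Olympia", "Paris"]
final_logits = [2.0, 3.0, 0.5]   # final layer prefers the factual answer
early_logits = [2.5, 1.0, 0.5]   # early layer prefers the frequent word
scores = dola_scores(final_logits, early_logits)
best = max(range(len(scores)), key=lambda i: scores[i])
```

Here "Olympia" wins the contrast even though the early layer preferred "Seattle", which is exactly the failure mode DoLa is designed to correct.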
Space to calculate the memory needed to run a model
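The Space's exact formula isn't reproduced here, but a common back-of-the-envelope estimate is parameters × bytes per parameter, plus some overhead for activations and buffers. The helper and the 20% overhead factor below are assumptions for illustration.

```python
# Rough rule of thumb for inference memory (not the Space's exact formula):
# parameters x bytes per parameter, plus ~20% overhead (an assumed factor)
# for activations, KV cache and framework buffers.

BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "bf16": 2, "int8": 1, "int4": 0.5}

def estimate_memory_gb(n_params, dtype="fp16", overhead=1.2):
    """Approximate GPU memory (GB) needed just to load and run a model."""
    return n_params * BYTES_PER_PARAM[dtype] * overhead / 1e9

# A 7B-parameter model in fp16: roughly 7e9 * 2 * 1.2 / 1e9 ≈ 16.8 GB
```

Quantizing to int8 or int4 shrinks the estimate proportionally, which is why it is often the difference between fitting a model on your card or not.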
Does your colleague Patric write code that is hard to read? Share with him the code formatter I show you in this post! Come in and learn how to format code to make it more understandable. We are not going to solve Patric's problems, but at least you won't suffer when reading his code.
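The formatter itself is presented in the post; as a stdlib-only taste of what automated reformatting does, Python's `ast` module can already regenerate messy-but-valid code with normalized spacing (a real formatter does far more, and `ast.unparse` drops comments, so this is only a demo).

```python
import ast

# Stdlib-only taste of automated formatting: parse messy-but-valid code
# and regenerate it with normalized spacing. A real formatter preserves
# comments and handles line wrapping; this is just an illustration.

messy = "def add( a,b ):return a+b"
clean = ast.unparse(ast.parse(messy))
print(clean)
```

The round trip turns `def add( a,b ):return a+b` into a conventionally spaced, properly indented function.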
Declare neural networks clearly in Pytorch
Hugging Face Spaces let us run models with very simple demos, but what happens if the demo breaks, or its owner deletes it? That's why I've created Docker containers for some interesting Spaces, so they can be used locally no matter what happens. In fact, if you click on any project's view button, it may take you to a Space that no longer works.
Dataset with jokes in English
Dataset with translations from English to Spanish
Dataset with Netflix movies and series