Posted 2/14/2024, 6:44:00 PM
Nvidia Launches Local AI Chatbot for RTX GPUs That Keeps User Data Private
- Nvidia released a free demo of a new AI chatbot, Chat with RTX, that runs locally on PCs with GeForce RTX GPUs
- The chatbot can be personalized by connecting it to the user's own content files and URLs (it indexes them for retrieval rather than retraining the model)
- It keeps user data private since it runs locally without relying on cloud services
- It requires an RTX 30 series or higher GPU with at least 8GB VRAM and Windows 10 or 11
- The chatbot uses Nvidia's TensorRT-LLM software and retrieval-augmented generation (RAG) to pull relevant context from the user's files and feed it to open-source LLMs
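The RAG approach in the last bullet can be sketched in a few lines: index local documents, retrieve the one most relevant to a question, and prepend it to the prompt given to the language model. The word-overlap scoring below is a toy stand-in for illustration, not Nvidia's actual retrieval pipeline.

```python
# Toy sketch of retrieval-augmented generation (RAG):
# retrieve local context, then augment the prompt with it.
# Scoring is a simple word-overlap count, purely illustrative.

def score(query: str, doc: str) -> int:
    """Count words shared between the query and a document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str]) -> str:
    """Return the document most relevant to the query."""
    return max(docs, key=lambda d: score(query, d))

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend the retrieved local context to the user's question."""
    context = retrieve(query, docs)
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

docs = [
    "Chat with RTX runs locally on GeForce RTX GPUs.",
    "TensorRT-LLM accelerates LLM inference on Nvidia hardware.",
]
print(build_prompt("Which GPUs does Chat with RTX run on?", docs))
```

A real deployment would replace the word-overlap scorer with vector embeddings, but the flow (retrieve, augment, generate) is the same.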