Posted 2/13/2024, 2:00:00 PM
Nvidia Demos Local AI Chatbot for RTX GPUs, Promising Faster Queries from Personal Data
- Nvidia released an early demo of Chat with RTX, an AI chatbot that runs locally on PCs with RTX GPUs to analyze documents and videos
- Can search YouTube video transcripts and generate summaries, making it useful for analyzing personal data and documents
- Runs locally, avoiding the latency of cloud-based Copilot or ChatGPT, though as an early developer demo the app is rough around the edges
- Leverages Tensor cores on RTX GPUs to accelerate query responses; requires at least 8 GB of VRAM
- Hints at the promise of future local AI chatbots, but as a demo it has bugs and limitations, such as inaccurate responses
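Document analysis of this kind typically follows the retrieval-augmented generation (RAG) pattern: passages relevant to the user's question are first retrieved from local files, then handed to the model as context. A minimal sketch of that retrieval step, using plain bag-of-words cosine similarity rather than Nvidia's actual embedding pipeline (the file names, corpus, and scoring here are illustrative assumptions, not Chat with RTX's implementation):

```python
import math
from collections import Counter

def tokenize(text: str) -> Counter:
    """Lowercased bag-of-words vector with basic punctuation stripped."""
    return Counter(w.strip(".,!?").lower() for w in text.split())

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: dict[str, str], top_k: int = 1) -> list[str]:
    """Return the names of the top_k documents most similar to the query."""
    q = tokenize(query)
    ranked = sorted(docs,
                    key=lambda name: cosine_similarity(q, tokenize(docs[name])),
                    reverse=True)
    return ranked[:top_k]

# Hypothetical local "documents" standing in for a user's files.
docs = {
    "gpu_notes.txt": "Tensor cores on RTX GPUs accelerate matrix math for AI inference.",
    "recipe.txt": "Combine flour, sugar, and butter, then bake until golden.",
}

# The retrieved text would be prepended to the prompt before the local LLM answers.
print(retrieve("How do RTX GPUs speed up AI?", docs))
```

In a real pipeline the keyword overlap would be replaced by learned embeddings and the retrieved passages fed to a locally hosted model, but the control flow — retrieve first, then generate — is the same.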