Running local models
When building LLM applications with LangChain, you need to decide where your models will run: locally on your own hardware, or in the cloud behind a provider's API.
Advantages of local models:
- Complete data control and privacy
- No API costs or usage limits
- No internet dependency
- Control over model parameters and fine-tuning

Advantages of cloud models:
- No hardware requirements or setup complexity
- Access to the most powerful, state-of-the-art models
- Elastic scaling without infrastructure management
- Continuous model improvements without manual updates

When to choose local models:
- Applications with strict data privacy requirements
- Development and testing environments
- Edge or offline deployment scenarios
- Cost-sensitive applications with predictable, high-volume usage

Let’s start with one of the most developer-friendly options for running local models.
Getting started with Ollama
Ollama makes it straightforward to run powerful open-source models on your own machine. It provides a simple interface for...
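To give a feel for where this is headed, here is a minimal sketch of calling an Ollama-hosted model through LangChain. It assumes the langchain-ollama integration package is installed, the Ollama server is running locally, and a model such as llama3.1 has already been pulled; the model name and prompt are illustrative.

```python
# Minimal sketch: chatting with a locally running Ollama model via LangChain.
# Assumes: `pip install langchain-ollama`, the Ollama server running locally,
# and a model already pulled (e.g. `ollama pull llama3.1`).
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",  # any model tag you have pulled locally
    temperature=0.2,   # sampling temperature, the same knob as cloud chat models
)

response = llm.invoke("In one sentence, what is LangChain?")
print(response.content)
```

Because ChatOllama implements the same chat-model interface as LangChain's cloud integrations, swapping between a local and a hosted model is typically a one-line change in your application code.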