How to Run Inference

STEP 1: Install Ollama

- Windows: download and run the installer from ollama.com/download
- macOS: download the app from ollama.com/download, or install with Homebrew
- Linux: run the official one-line install script
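On Linux, Ollama publishes a one-line convenience script; Windows and macOS use the graphical installers linked above. A minimal sketch of the Linux route (the script requires root privileges to install the binary and set up the service):

```shell
# Linux: install Ollama via the official convenience script
curl -fsSL https://ollama.com/install.sh | sh

# Confirm the binary is on PATH and print its version
ollama --version
```

On macOS, `brew install ollama` is an alternative to the downloaded app.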
STEP 2: Pull and run the model with Ollama
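This guide does not name a specific model, so `llama3` below is a placeholder; substitute the model tag this document targets. `ollama run` downloads the model on first use and then loads it for inference:

```shell
# Fetch the model weights (first time only); llama3 is a placeholder tag
ollama pull llama3

# Load the model and open an interactive prompt
ollama run llama3
```

`ollama list` shows which models are already downloaded locally.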

STEP 3: Start Conversing
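Once `ollama run` is active you can type prompts directly at its interactive prompt. Ollama also exposes a local REST API on port 11434, which is useful for scripting; a sketch, again assuming the placeholder model `llama3` has been pulled and the server is running:

```shell
# Send a single prompt to the local Ollama server and get one
# complete (non-streamed) JSON response back
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The response is a JSON object whose `response` field contains the model's answer.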
