
Generative AI using Open Source LLMs

Instructor: CampusX

Language: Hindi

Validity Period: 1095 days

₹599 including 18% GST

This course shows you how to use Ollama and similar tools to run, customize, and connect open-source large language models on your own system. You’ll learn what Ollama is, how its models work, and how to use it to build real Gen AI projects without relying on cloud APIs.

We start with the basics: how models are structured, what commands to use, and how to run them locally. Then we move to practical work: using the Ollama Python library, creating your own Modelfiles, and calling models through the REST API.
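As a taste of that practical work, here is a minimal sketch, assuming a local Ollama server is running on its default port (11434), that the `ollama` and `requests` Python packages are installed, and that a model has already been pulled; the model name `llama3` is illustrative:

```python
# Minimal sketch: talking to a locally running Ollama server (model name is illustrative).
# Assumes `pip install ollama requests` and `ollama pull llama3` have been done.
import ollama
import requests

# 1) Through the ollama Python library
reply = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Explain RAG in one sentence."}],
)
print(reply["message"]["content"])

# 2) The same request through the local REST API
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Explain RAG in one sentence.", "stream": False},
)
print(resp.json()["response"])
```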

Once you’re comfortable, you’ll learn how to make Ollama do more: generate embeddings, use tools, and build RAG (Retrieval-Augmented Generation) systems that pull real data into your model’s answers.
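To make that concrete, here is a rough sketch of the basic RAG loop, assuming a local Ollama server is running and the (illustrative) models `nomic-embed-text` and `llama3` have been pulled; exact embedding function names may differ between `ollama` package versions:

```python
# Minimal RAG sketch against a local Ollama server (model names are illustrative).
# Assumes `ollama pull nomic-embed-text` and `ollama pull llama3` have been run.
import math
import ollama

documents = [
    "Ollama runs open-source LLMs locally and exposes a REST API on port 11434.",
    "A Modelfile lets you customize a base model with parameters and a system prompt.",
    "RAG retrieves relevant documents and adds them to the prompt before generation.",
]

def embed(text: str) -> list[float]:
    # Older ollama-python versions expose embeddings(model=..., prompt=...);
    # newer ones also offer embed(model=..., input=...).
    return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

question = "How do I customize a model in Ollama?"
q_vec = embed(question)

# Retrieve the single most similar document (a real system would use a vector store).
best_doc = max(documents, key=lambda d: cosine(q_vec, embed(d)))

answer = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": f"Context: {best_doc}\n\nQuestion: {question}"}],
)
print(answer["message"]["content"])
```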

Finally, we’ll cover advanced setups: running multiple models together, handling concurrency, and using GPUs for faster performance.
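As a rough illustration, the sketch below sends two prompts to the local server in parallel from Python; how many requests actually run side by side depends on how the Ollama server is configured (for example the OLLAMA_NUM_PARALLEL and OLLAMA_MAX_LOADED_MODELS environment variables, where supported) and on available GPU/CPU memory. Model names are illustrative.

```python
# Sketch: sending prompts to a local Ollama server concurrently.
# How many actually run in parallel depends on server configuration and hardware.
from concurrent.futures import ThreadPoolExecutor
import ollama

def ask(model: str, prompt: str) -> str:
    reply = ollama.chat(model=model, messages=[{"role": "user", "content": prompt}])
    return reply["message"]["content"]

jobs = [
    ("llama3", "Summarize what a Modelfile is."),
    ("mistral", "Summarize what an embedding is."),
]

with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(ask, model, prompt) for model, prompt in jobs]
    for future in futures:
        print(future.result())
```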

By the end of this course, you’ll know how to:

  • Run open-source models locally using Ollama
  • Customize models with your own configurations
  • Connect Ollama with external APIs and tools
  • Build practical Gen AI apps that use your data
  • Optimize performance with model chaining and GPU offloading

If you want to understand how Gen AI works under the hood and build real projects with it, this course is for you.

Course Duration: 19+ hours
