Local AI Chat. This project demonstrates local AI capabilities and runs on a local CPU (or GPU, if you have an NVIDIA card). I will likely update this repo later with guidance for systems with an NVIDIA card and plenty of VRAM. For now, just about anyone can run this locally.
Brian Fertig 230a3a4855 Initial Commit 2025-07-01 12:55:39 -06:00


Local AI and Automation Docker Suite

The Docker Compose Stack Contains

  • Ollama (LLM Host)
  • Open WebUI (Chat Interface)
  • MCPO Server (Model Context Protocol relay server)
  • N8N (Automation)
  • PostgreSQL (database backend for N8N)
  • Jupyter Notebook
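
The list above maps to services in docker-compose.yml. As a rough sketch only (the service names, images, and ports here are assumptions for illustration, not copied from this repo's actual compose file), the structure looks something like:

```yaml
# Hypothetical sketch of the compose layout -- consult the repo's
# docker-compose.yml for the real service definitions.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    depends_on:
      - ollama
  postgres:
    image: postgres:16
    env_file: .env            # passwords come from the .env you create below
  n8n:
    image: n8nio/n8n
    depends_on:
      - postgres

volumes:
  ollama-data:
```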

Install Instructions

  1. Copy sample.env to .env
  2. Edit the .env file to set your own secret passwords
  3. Copy config/mcpo/sample.config.json to config/mcpo/config.json (there is no need to edit config.json right away)
  4. From the root directory of this repo, run docker compose up -d
  5. If any containers fail to start, run the command again until the whole stack is up (some services depend on others that may not be ready on the first attempt)
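
The file-copy steps above can be sketched as a shell session. This example simulates the repo layout in a scratch directory so it is side-effect free; the `POSTGRES_PASSWORD` key is an assumed placeholder, not necessarily what sample.env actually contains, and the final `docker compose` command is shown commented out.

```shell
set -eu

# Simulate the repo layout in a scratch directory (assumed file names).
repo=$(mktemp -d)
mkdir -p "$repo/config/mcpo"
echo 'POSTGRES_PASSWORD=changeme' > "$repo/sample.env"
echo '{}' > "$repo/config/mcpo/sample.config.json"
cd "$repo"

# Steps 1 and 3: copy the sample files into place.
cp sample.env .env
cp config/mcpo/sample.config.json config/mcpo/config.json

# Step 2: set your own secrets (done non-interactively here; normally
# you would open .env in an editor).
sed -i 's/changeme/a-strong-password/' .env

# Step 4: bring the stack up (commented out in this sketch).
# docker compose up -d
```

In a real checkout you would skip the scratch-directory setup and run the `cp` commands directly from the repo root.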