diff --git a/README.md b/README.md
index 48adb17..6c074bd 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,9 @@
 # Local AI and Automation Docker Suite
 
+OK, let's get you started! At the end of the day, you only really need `Ollama` and an IDE (either `VS Code` or `JetBrains`), so why does this Docker setup include so many services? Because this setup is meant to get you started, and it takes a modular approach that, over time, lets you upgrade parts of the install at the component level. For example, if you love what this setup brings but want to run bigger, better LLM models, you can buy a server dedicated to LLM hosting, put Ollama on it, and simply re-point your Open WebUI container at the new server; everything else continues to work as normal.
+
+![Docker install diagram](docker.png)
+
 ## Docker Compose Contains
 
 - Ollama (LLM Host)
diff --git a/docker.png b/docker.png
new file mode 100644
index 0000000..fffe14d
Binary files /dev/null and b/docker.png differ
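
The re-pointing described above comes down to one environment variable. A minimal sketch, assuming Open WebUI's standard `OLLAMA_BASE_URL` setting and a hypothetical remote host name `llm-server.local` (substitute your own server's address):

```yaml
# docker-compose.yml excerpt -- pointing Open WebUI at a remote Ollama host
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Previously http://ollama:11434 when Ollama ran inside this compose stack;
      # now targets the dedicated LLM server instead.
      - OLLAMA_BASE_URL=http://llm-server.local:11434
    ports:
      - "3000:8080"
```

Because only Open WebUI talks to Ollama directly, the other services in the stack need no changes after this edit.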