Added image and intro
# Local AI and Automation Docker Suite
OK, let's get you started! At the end of the day, you only really need ```Ollama``` and an IDE (either ```VS Code``` or ```JetBrains```), so why does this Docker setup include so many different services? Because this setup is here to get you started, and it takes a modular approach that lets you upgrade parts of the install at a component level over time. For example, if you love what this setup brings but want to run bigger, better LLM models, you can buy a server dedicated to LLM hosting, install Ollama on it, and simply re-point your Open WebUI container at the new server; everything else continues to work as normal.
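
To make that re-pointing concrete, here is a minimal sketch of a compose override, assuming Open WebUI reads its Ollama endpoint from an ```OLLAMA_BASE_URL``` environment variable and that the service is named ```open-webui``` in your compose file; the hostname ```llm-server.local``` is a placeholder for your new LLM server, not part of this repo:

```yaml
# docker-compose.override.yml (illustrative sketch, placeholder values)
# Re-points the existing Open WebUI container at an external Ollama host
# instead of the local Ollama container.
services:
  open-webui:
    environment:
      # Remote server dedicated to LLM hosting (hostname/port are placeholders)
      - OLLAMA_BASE_URL=http://llm-server.local:11434
```

With an override like this in place, recreating just that one container (for example with `docker compose up -d open-webui`) switches it to the new endpoint while the rest of the stack keeps running as before.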

## Docker Compose Contains
- Ollama (LLM Host)