From https://github.com/eleiton/ollama-intel-arc
Open WebUI + Ollama on an Intel iGPU (Core Ultra 9 185H)
git clone https://github.com/eleiton/ollama-intel-arc.git
cd ollama-intel-arc
docker compose up -d
root@server1:~/llama-cpp# docker ps -a
CONTAINER ID   IMAGE                                              COMMAND                  CREATED          STATUS                    PORTS                                             NAMES
7e6aa0aefcf8   ghcr.io/open-webui/open-webui:latest               "bash start.sh"          11 minutes ago   Up 11 minutes (healthy)   0.0.0.0:4040->8080/tcp, [::]:4040->8080/tcp       open-webui
9d4dc3192e19   intelanalytics/ipex-llm-inference-cpp-xpu:latest   "sh -c 'mkdir -p /ll…"   11 minutes ago   Up 11 minutes             0.0.0.0:11434->11434/tcp, [::]:11434->11434/tcp   ollama-intel-arc
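The PORTS column above shows Open WebUI published on host port 4040 (container port 8080) and Ollama on 11434. If a script needs the host port from such a PORTS entry, plain POSIX parameter expansion is enough; a minimal sketch against the literal value from the listing (the variable names here are illustrative, not from the repo):

```shell
# Extract the host port from a docker ps PORTS entry using only POSIX
# shell parameter expansion (no awk/sed). Sample value copied from the
# listing above; variable names are mine.
ports='0.0.0.0:4040->8080/tcp, [::]:4040->8080/tcp'
host_port=${ports%%->*}      # drop everything from the first '->' onward
host_port=${host_port##*:}   # drop everything up to the last remaining ':'
echo "$host_port"            # prints: 4040
```

With these mappings, Open WebUI should be reachable at http://<host>:4040 and the Ollama API at http://<host>:11434.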



Add SD.Next or ComfyUI
docker login
# docker compose -f docker-compose.sdnext.yml up -d   (SD.Next, left commented out)
docker compose -f docker-compose.comfyui.yml up -d
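For reference, a ComfyUI service in such a compose file typically passes the Intel GPU through via /dev/dri. This is a hypothetical sketch only, assuming the repo's conventions; the actual docker-compose.comfyui.yml in eleiton/ollama-intel-arc may differ (image name is a placeholder; 8188 is ComfyUI's default listen port):

```yaml
# Hypothetical sketch of a ComfyUI compose service; consult the repo's
# docker-compose.comfyui.yml for the real definition.
services:
  comfyui:
    image: <comfyui-image>        # placeholder, not the repo's actual image
    devices:
      - /dev/dri:/dev/dri         # expose the Intel iGPU render device
    ports:
      - "8188:8188"               # ComfyUI's default port
    restart: unless-stopped
```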