Hello, I use AlmaLinux 10 on my server with an Intel i7-1360P (Iris Xe iGPU with 96 execution units), and I would like to use the GPU for AI.
I set up Ollama in Docker and installed some models. They run fine, but only on the CPU, so they are a bit slow. I would like to offload them to the GPU to speed them up.
But I cannot find an Intel OpenCL package to install. An AI suggested the packages below, but they are not available:
[root@telekom intel-drivers]# sudo dnf install intel-opencl-rt intel-level-zero-gpu intel-media-driver intel-mediasdk
Last metadata expiration check: 0:04:50 ago on Fri Feb 20 15:49:08 2026.
No match for argument: intel-opencl-rt
No match for argument: intel-level-zero-gpu
No match for argument: intel-media-driver
No match for argument: intel-mediasdk
I also cannot find any docs about this on Google. Which packages do I need to install so that Ollama will see my GPU?
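In case it helps, this is roughly what I have tried so far. Note that AlmaLinux's own repos do not seem to carry these packages; Intel publishes its own GPU package repo for RHEL-compatible distributions, but I am not sure a build for version 10 exists yet, so the repo path placeholder and the package names below are assumptions that need to be checked against Intel's driver documentation:

```shell
# Intel's client-GPU repo targets specific RHEL releases; get the exact .repo URL
# from Intel's GPU driver docs -- "<rhel-version>" below is only a placeholder
sudo dnf config-manager --add-repo \
  "https://repositories.intel.com/gpu/rhel/<rhel-version>/intel-gpu.repo"

# Package names are assumptions based on Intel's RHEL repo; verify what actually
# exists first, e.g. with: dnf search intel | grep -i -E "opencl|level"
sudo dnf install intel-opencl level-zero clinfo

# If the OpenCL runtime is installed correctly, clinfo should list the Iris Xe iGPU
clinfo | grep -i "device name"
```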
In case someone is interested, this is my docker-compose:
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    # On Fedora/Alma, privileged: true is the surest way to get past SELinux for GPU access
    privileged: true
    ports:
      - "11434:11434"
    devices:
      - /dev/dri:/dev/dri
    environment:
      - OLLAMA_HOST=0.0.0.0
      - OLLAMA_INTEL_GPU=1
      - OLLAMA_VULKAN=1
      - ONEAPI_DEVICE_SELECTOR=level_zero:0
      # Tell Ollama where to look for the new Intel libraries
      - LD_LIBRARY_PATH=/opt/intel/oneapi/redist/lib/intel64:/usr/lib64
    group_add:
      - "39"
      - "105"
    volumes:
      - ./ollama_data:/root/.ollama:Z # ":Z" tells SELinux that Docker may access this volume
      - /usr/lib64:/usr/lib64:ro
      - /usr/share/vulkan:/usr/share/vulkan:ro
      - /etc/OpenCL/vendors:/etc/OpenCL/vendors:ro
      - /opt/intel/oneapi:/opt/intel/oneapi:ro
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - ./open_webui_data:/app/backend/data:Z # ":Z" again because of SELinux
    depends_on:
      - ollama
    restart: unless-stopped
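For reference, this is how I check from the host whether the container can even see the GPU device nodes. These are just diagnostic commands run against my setup; the stock ollama/ollama image may not ship tools like clinfo, so I only check the device files, the environment, and the startup log:

```shell
# The render nodes from the host should be visible inside the container
docker exec ollama ls -l /dev/dri          # expect card0 and renderD128 (or similar)

# Confirm the Intel-related environment variables actually reached the container
docker exec ollama printenv ONEAPI_DEVICE_SELECTOR OLLAMA_INTEL_GPU

# Ollama logs which backend it picked at startup; look for any GPU detection lines
docker logs ollama 2>&1 | grep -i -E "gpu|vulkan|level.zero|opencl"
```

So far the log only shows CPU inference, which is why I suspect the missing Intel runtime packages on the host are the problem.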