Hello, I use AlmaLinux 10 as my server, and I have an Intel i7-1360P with 96 Xe GPU cores that I would like to use for AI.
I created an Ollama Docker container and installed some models. They run fine, but only on the CPU, so they are a bit slow; I would like to offload them to the GPU cores to speed things up.
However, I cannot find an Intel OpenCL package to install. An AI suggested installing these packages, but they are not available:
[root@telekom intel-drivers]# sudo dnf install intel-opencl-rt intel-level-zero-gpu intel-media-driver intel-mediasdk
Last metadata expiration check: 0:04:50 ago on Fri Feb 20 15:49:08 2026.
No match for argument: intel-opencl-rt
No match for argument: intel-level-zero-gpu
No match for argument: intel-media-driver
No match for argument: intel-mediasdk
I also cannot find any documentation about this on Google. Please, which packages do I need to install so that Ollama will see my GPU cores?
If someone is interested this is my docker compose:
services:
ollama:
image: ollama/ollama:latest
container_name: ollama
# On Fedora/Alma, privileged: true is the most reliable way to get past SELinux for GPU access
privileged: true
ports:
- "11434:11434"
devices:
- /dev/dri:/dev/dri
environment:
- OLLAMA_HOST=0.0.0.0
- OLLAMA_INTEL_GPU=1
- OLLAMA_VULKAN=1
- ONEAPI_DEVICE_SELECTOR=level_zero:0
# Tell Ollama where to look for the new Intel libraries
- LD_LIBRARY_PATH=/opt/intel/oneapi/redist/lib/intel64:/usr/lib64
group_add:
- "39"
- "105"
volumes:
- ./ollama_data:/root/.ollama:Z # ":Z" tells SELinux that Docker may access this volume
- /usr/lib64:/usr/lib64:ro
- /usr/share/vulkan:/usr/share/vulkan:ro
- /etc/OpenCL/vendors:/etc/OpenCL/vendors:ro
- /opt/intel/oneapi:/opt/intel/oneapi:ro
restart: unless-stopped
open-webui:
image: ghcr.io/open-webui/open-webui:main
container_name: open-webui
ports:
- "3000:8080"
environment:
- OLLAMA_BASE_URL=http://ollama:11434
volumes:
- ./open_webui_data:/app/backend/data:Z # ":Z" again, because of SELinux
depends_on:
- ollama
restart: unless-stopped
Hi, I have the same problem: no binaries in the repos, and every GitHub release is Debian-only. Are Intel's repos down or blocked? I get 403 or 404 errors, and even conda does not fix the problem. Has anyone tried installing Intel's GPU drivers for SYCL?
The standard AlmaLinux 10 repos do not include intel-opencl-rt, but EPEL 10.1 includes intel-compute-runtime, which pulls in intel-opencl / intel-level-zero / intel-ocloc / intel-igc as dependencies. So if you enable EPEL on AlmaLinux 10 and run dnf install intel-compute-runtime, you should get the required user-space runtime set, dependencies included.
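A rough sketch of those steps (assuming EPEL is enabled via the standard epel-release package, and that clinfo is available for verification; adjust to your setup):

```shell
# Enable EPEL on AlmaLinux 10
sudo dnf install -y epel-release

# Install the Intel compute runtime (pulls in intel-opencl, intel-level-zero, etc.)
sudo dnf install -y intel-compute-runtime

# Optionally verify that the OpenCL runtime registered an Intel platform
sudo dnf install -y clinfo
clinfo | grep -i intel
```

These are system-setup commands, so the exact output depends on your hardware and repo state; if clinfo lists an Intel platform, the user-space runtime is in place.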
Hi, glad intel-compute-runtime installed. Note that this provides OpenCL/Level Zero, but Ollama’s Intel GPU support on Linux is via Vulkan, not OpenCL/Level Zero.
Can you confirm Vulkan works on the host and inside the container?
Inside the container: check that /dev/dri/renderD* is present and that vulkaninfo --summary works.
Also check /usr/share/vulkan/icd.d/ has an Intel ICD JSON (Mesa ANV), otherwise Ollama will fall back to CPU.
If you paste the output of ls -l /dev/dri, vulkaninfo --summary, and the Ollama server log (with debug if possible), we can see why it’s not selecting Vulkan.
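For reference, the diagnostics requested above could be gathered like this (file and container names are typical examples, not guaranteed to match your system):

```shell
# On the host: confirm the render node exists and Vulkan sees the iGPU
ls -l /dev/dri                 # expect something like card0 and renderD128
vulkaninfo --summary           # expect an Intel device via the Mesa ANV driver

# Confirm an Intel Vulkan ICD JSON is registered
ls /usr/share/vulkan/icd.d/    # e.g. intel_icd.x86_64.json

# Inside the container: same device check, plus Ollama's own log
docker exec -it ollama ls -l /dev/dri
docker logs ollama 2>&1 | grep -iE 'vulkan|gpu'
```

For debug-level logs, setting OLLAMA_DEBUG=1 in the container's environment (in the compose file) and restarting the service should make the GPU-selection messages more verbose.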