Yet Another Bookmarks Service

Viewing mzimmerm's Bookmarks


[https://github.com/ollama/ollama/issues/2637] - - public:mzimmerm
amd, apu, gfx900, install, pytorch, rocm - 6 | id:1491350 -

Latest (0.1.27) Docker image with ROCm works for me on a Ryzen 5600G with an 8 GB VRAM allocation. Prompt processing is 2x faster than with the CPU, and generation runs at full speed even while the CPU is busy with other processes. I am on Fedora 39. Container setup: HSA_OVERRIDE_GFX_VERSION=9.0.0
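The container setup above can be sketched as a single Docker invocation; the image tag, device flags, and port below follow ollama's standard ROCm Docker instructions rather than this bookmark, so treat them as assumptions:

```shell
# Sketch: run the ollama ROCm image on a Vega (gfx900-class) APU.
# HSA_OVERRIDE_GFX_VERSION=9.0.0 is the setting from the note above.
docker run -d \
  --device /dev/kfd --device /dev/dri \
  -e HSA_OVERRIDE_GFX_VERSION=9.0.0 \
  -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama:rocm
```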

[https://repo.radeon.com/amdgpu-install/] - - public:mzimmerm
amd, amdgpu, repo, rocm - 4 | id:1489975 -

Repo of ROCm (amdgpu) packages from AMD. The last 4.x release, 4.5.2, is under 21.40.2. Try installing with kernel 5.4 (released 2020). The same packages, under different older names, are at https://repo.radeon.com/rocm/apt/
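A minimal install sketch for a Debian/Ubuntu-style system, assuming the amdgpu-install package for the 21.40.2 release has already been downloaded from the repo above (the package file name is a placeholder, not taken from the bookmark):

```shell
# Sketch: install ROCm 4.5.2 via the 21.40.2 amdgpu-install release.
# <amdgpu-install_21.40.2.deb> stands in for the package fetched from
# https://repo.radeon.com/amdgpu-install/
sudo apt install ./<amdgpu-install_21.40.2.deb>
# --usecase=rocm pulls the ROCm userspace; --no-dkms skips the out-of-tree
# kernel module when the in-kernel amdgpu driver is sufficient.
sudo amdgpu-install --usecase=rocm --no-dkms
```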

[https://civitai.com/articles/2296/how-to-install-rocm-on-opensusesuse] - - public:mzimmerm
amd, gpu, rocm - 3 | id:1489831 -

Another ROCm installation claim, for openSUSE. Interesting note: "I realize this is a bit old, but you don't really need amdgpu from the repository: it comes for free with the kernel. amdgpu-dkms is only needed if you're stuck on an older kernel version and can't upgrade for some reason. Tumbleweed users, for example, will not need it."

[https://medium.com/@rafaelmanzanom/ditching-cuda-for-amd-rocm-for-more-accessible-llm-inference-ryzen-apus-edition-92c3649f8f7d] - - public:mzimmerm
ai, amd, apu, compile, gfx902, install, pytorch, rocm - 8 | id:1489810 -

Train an LLM on an AMD APU. In this scenario, we’ll use an APU because most laptops with a Ryzen CPU include an iGPU; specifically, this post should work with iGPUs based on the “GCN 5.0” architecture, or “Vega” to friends. We’ll use an AMD Ryzen 2200G in this post, an entry-level processor with 4C/4T and an integrated GPU.
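These Vega iGPUs report as gfx902, for which ROCm ships no official kernels; the usual workaround is to report the device as the supported gfx900 via the same override used elsewhere in these bookmarks. A sketch, with a hypothetical script name standing in for the post's PyTorch workload:

```shell
# Sketch: make ROCm treat a gfx902 iGPU as the supported gfx900.
export HSA_OVERRIDE_GFX_VERSION=9.0.0
# train_llm.py is a hypothetical placeholder for the post's training script,
# run under a ROCm build of PyTorch.
python train_llm.py
```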
