Yet Another Bookmarks Service

Viewing mzimmerm's Bookmarks

[https://medium.com/@rafaelmanzanom/ditching-cuda-for-amd-rocm-for-more-accessible-llm-inference-ryzen-apus-edition-92c3649f8f7d] - - public:mzimmerm
amd, apu, compile, gfx902, install, pytorch, rocm, ai - 8 | id:1489810 -

Training an LLM on an AMD APU. This scenario uses an APU because most laptops with a Ryzen CPU include an iGPU; specifically, the post targets iGPUs based on the "GCN 5.0" architecture, known informally as "Vega". The post uses an AMD Ryzen 3 2200G, an entry-level processor with 4 cores/4 threads and an integrated GPU.
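Going by the tags (gfx902, compile, pytorch, rocm), the linked post is about getting PyTorch's ROCm build to run on a Vega iGPU. A minimal setup sketch follows; the wheel index URL and ROCm version are assumptions to be checked against the current PyTorch install instructions, and the `HSA_OVERRIDE_GFX_VERSION` override is a community workaround rather than an officially supported configuration.

```shell
# Sketch: run PyTorch on a Vega (gfx902) APU via ROCm.
# The ROCm version in the wheel index URL below is an assumption;
# check pytorch.org for the current install command.
pip install torch --index-url https://download.pytorch.org/whl/rocm5.7

# gfx902 (Raven Ridge iGPU) is not an officially supported ROCm target.
# Overriding the reported GFX version to gfx900 (discrete Vega) is a
# widely used community workaround to make the runtime load kernels.
export HSA_OVERRIDE_GFX_VERSION=9.0.0

# ROCm builds of PyTorch expose the GPU through the torch.cuda API:
python -c "import torch; print(torch.cuda.is_available())"
```

If the override works, `torch.cuda.is_available()` reports `True` and tensors can be moved to the iGPU with `.to("cuda")` as on an NVIDIA setup.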

