ROCm 5.xx ever planning to include gfx90c GPUs? · Issue #1743 · ROCm/ROCm
Suggested git-build of pytorch on gfx90c FAILED for me
venv is a built-in Python module; pyvenv was a script on top of venv (deprecated since Python 3.6); pyenv is an OS-level tool for managing multiple Python versions. All of them help you work with isolated Python environments.
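A minimal sketch with the built-in venv module (the `.venv` directory name is just a common convention):

```shell
# Create a project-local virtual environment with the built-in venv module
python3 -m venv .venv
# Activate it (bash/zsh); after this, `python` and `pip` point inside .venv
source .venv/bin/activate
python -c 'import sys; print(sys.prefix)'   # prints a path inside .venv
deactivate
```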
Document about LM Studio
180 Celsius for 30 minutes
Optimum is an extension of Transformers that provides a set of performance-optimization tools to train and run models on targeted hardware with maximum efficiency. It also hosts small, mini, and tiny model variants.
Excellent document about BERT transformer models and their parameters: L = number of layers; H = hidden-layer size (the dimension of the vector representing each token); A = number of self-attention heads; plus total parameter counts.
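As a sanity check on those definitions, here is a rough back-of-the-envelope parameter count. The vocabulary and position sizes are assumptions taken from the standard BERT configuration, and the formula ignores biases and layer norms:

```python
# Rough BERT parameter count from L (layers), H (hidden size), V (vocab size).
# Note: A (attention heads) does not change the count -- the H dimensions
# are split across heads, so the weight matrices stay H x H in total.
def bert_params(L, H, V=30522, max_pos=512):
    embeddings = (V + max_pos + 2) * H    # token + position + segment embeddings
    attention = 4 * H * H                 # Q, K, V and output projections
    ffn = 8 * H * H                       # H -> 4H -> H feed-forward block
    return embeddings + L * (attention + ffn)

# BERT-base (L=12, H=768): roughly 110M parameters
print(round(bert_params(12, 768) / 1e6, 1), "M")
```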
Repository of all BERT models, including the small variants. Start with one of these models for testing.
LM Studio can be installed on Linux with an APU or GPU and run LLMs (though it looks like it needs an AI-capable CPU??). Install on the laptop and test whether it works.
Index of all Python packages (PyPI). pip looks here first to find a package.
Install the PyTorch build matching your ROCm version from AMD's prebuilt PyTorch wheels. Also has post-install validation steps.
Test for PyTorch and ROCm after installing ROCm.
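A minimal sketch of such a post-install check, assuming a ROCm build of PyTorch (on ROCm builds the `torch.cuda` API is backed by HIP, so the same calls work):

```python
# Quick check that PyTorch is installed and can see the ROCm GPU.
try:
    import torch
    print("torch", torch.__version__)
    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))
        x = torch.rand(3, 3, device="cuda")
        print((x @ x).shape)  # tiny matmul actually executed on the GPU
    else:
        print("no GPU visible to PyTorch -- check the ROCm install")
except ImportError:
    print("PyTorch is not installed in this environment")
```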
Follow falling prices
My account on SageMaker Studio. They give out 4 hours of GPU time a day!
A program that changes the VRAM / UMA size in the BIOS on AMD APUs, even if the VRAM / UMA setting does not show up in the BIOS.
If you have an AMD GPU, as I do, you can grab the PCI ID for the device with the lspci command executed with the -D flag (shows the PCI domain) and then read the file /sys/bus/pci/devices/${pci_slot}/mem_info_vram_total, which contains the GPU VRAM size in bytes.
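A sketch of that recipe as a guarded script; the awk filter for picking out the AMD display device is an assumption and may need adjusting for a given card:

```shell
# Read total VRAM for an AMD GPU via sysfs (falls back gracefully if none found)
pci_slot=$(lspci -D 2>/dev/null | awk '/VGA.*AMD|Display.*AMD/ {print $1; exit}')
f="/sys/bus/pci/devices/${pci_slot}/mem_info_vram_total"
if [ -n "$pci_slot" ] && [ -r "$f" ]; then
    # File contains bytes; convert to MiB for readability
    echo "$(( $(cat "$f") / 1024 / 1024 )) MiB VRAM on $pci_slot"
else
    echo "no AMD GPU found, or sysfs VRAM file not readable"
fi
```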
8000G is the APU series for AI
Top of the guide describing ROCm on Linux. There are two core approaches: using the package manager (RPM) or using the AMD installer. I should use the package manager. Also single-version vs. multi-version installs; I should use single-version, latest.
The links in the “How to” guide provide instructions that look promising. Maybe start with those!
Another ROCm installation claim on openSUSE. Interesting note: “I realize this is a bit old, but you don't really need amdgpu from the repository: it comes for free with the kernel. amdgpu-dkms is only needed if you're stuck on an older kernel version and you can't upgrade for some reason. For example, Tumbleweed users will not need it.”
This guy seems to claim ROCm can run on Tumbleweed using Distrobox. But what is Distrobox?
This guy claims successful installation of ROCm on Ubuntu; this seems to be workable for Tumbleweed as well. See the comment “nav9 commented on Jul 16, 2023”.
Describes how to force JupyterLab to use a venv for its kernels!!
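The usual recipe for that is to register the venv's interpreter as a named kernel via ipykernel; a sketch, assuming network access for pip (the env name `mlenv` is illustrative):

```shell
# Create the venv and register it as a JupyterLab kernel
python3 -m venv ~/envs/mlenv
source ~/envs/mlenv/bin/activate
pip install ipykernel                     # inside the venv
python -m ipykernel install --user --name mlenv --display-name "Python (mlenv)"
# The kernel now appears in JupyterLab's launcher as "Python (mlenv)"
```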
Describes the GPU that Kaggle gives out 30 hours a month of.
Doing what a transformer does, by hand
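A toy version of that exercise: one scaled dot-product self-attention step computed with plain Python. All numbers and dimensions are made up for illustration, and Q, K, V are taken equal to the inputs to keep the arithmetic small:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# Toy sequence of 2 tokens, model dimension 2
X = [[1.0, 0.0], [0.0, 1.0]]
d_k = 2

# scores = X @ X^T / sqrt(d_k), then softmax per row, then weighted sum of values
Xt = [list(col) for col in zip(*X)]
scores = [[s / math.sqrt(d_k) for s in row] for row in matmul(X, Xt)]
weights = [softmax(row) for row in scores]
out = matmul(weights, X)
print(out)  # each output row is a mix of the input rows, dominated by itself
```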
Kaggle is like Hugging Face: it can run notebooks, and gives GPU power to notebooks.
Mini-course on the statistical foundations of ML
My account on Stability AI - it is just a link to Hugging Face
Comparison of the efficiency of all LLM models on Hugging Face
Various methods to run LLM models locally; Hugging Face is only one of them.
AMD seems to sell these accelerators, which are like video cards.