Cost of Asbestos Testing in Canada – LCS Laboratory Inc.
Use them for testing. Submit the sample in 2 ziplock bags in a padded mail envelope.
Use this to install ROCm on Tumbleweed. Does NOT talk about PyTorch; the tag is for completeness.
They describe using a Bell-theorem-like inequality to refute the claim that a real-numbers-only quantum theory is equivalent to imaginary-numbers quantum theory.
SHOWS EXACT SPECS OF CPU OR APU, INCLUDING GRAPHICS
$249 at Canada Computers; they also have an open-box unit for $224. BUY THIS, NOT THE AMD Ryzen 5 5600GT (Vega 7), BECAUSE THIS ONE HAS Vega 8, WHICH IS FASTER THAN Vega 7 AND THE SAME AS THE Vega 8 ON THE LAPTOP 2500U.
Install PyTorch on ROCm from the AMD git repo. TRY THIS!!!
Database of AMD GPUs. Here we can see that the 2500U is Vega 8 Mobile, which is GCN 5.0. This is only supported in ROCm up to 4.5.2!!
Installing PyTorch for ROCm - this document claims gfx900 compatibility
Suggested git-build of pytorch on gfx90c FAILED for me
Excellent document about BERT transformer models and their parameters: L = number of layers; H = size of the hidden layer (the dimension of the vector representing each token in the sentence); A = number of self-attention heads; plus the total parameter count.
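A rough way to connect L and H to the total parameter count (a sketch under my own assumptions: a 30522-token vocabulary and the standard ~12·H² per-layer approximation, neither taken from the linked document):

```python
def bert_params_estimate(L, H, vocab=30522, max_pos=512, segments=2):
    """Rough BERT parameter count from L (layers) and H (hidden size).

    Ignores biases and LayerNorm weights, which are small next to the H^2 terms.
    """
    embeddings = (vocab + max_pos + segments) * H
    # Per encoder layer: 4*H^2 for attention (Q, K, V, output projections)
    # plus 8*H^2 for the feed-forward block (H -> 4H -> H).
    per_layer = 12 * H * H
    return embeddings + L * per_layer

print(bert_params_estimate(L=12, H=768))  # ~1.09e8, near BERT-base's reported 110M
```

Plugging in L=24, H=1024 lands near the ~340M usually quoted for BERT-large, so the approximation tracks both published sizes.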
Top of the guide describing ROCm on Linux. There are 2 core approaches: using the RPM package manager, or using the AMD installer. I should use the package manager. Also single-version vs. multi-version: I should use single-version, latest.
This guy claims a successful installation of ROCm on Ubuntu; this seems workable for Tumbleweed as well. See the comment "nav9 commented on Jul 16, 2023".
Various methods to run LLM models locally; Hugging Face is only one of them.
Replit is a site where I can run any REPL online. Can be used for AI.
They have the 1.3B version!!! This may be the best to start with for Newspeak. Training should work even on Hugging Face.
Another possible model. For coding capabilities, DeepSeek Coder achieves state-of-the-art performance among open-source code models across multiple programming languages and benchmarks.
Article has comparison with other code-LLM models
Chat models. Not open source, but instruct and relatively small (3B). The 3B instruct may be the best to try on Newspeak.
AI code tools: good summary. Does not say which pre-trained model each one uses. One is Gemini (Bard) -> AlphaCode 2.
Natural Language Processing - full course.
Best summary of Natural Language Processing and its terms: model (a language model, e.g. BertModel, defines the encoder and decoder and their properties); transformer (a specific neural network based on the attention paper); encoder (a series of transformer layers on the input); decoder (a series of transformer layers on the output). BERT does NOT use a decoder. TensorFlow and PyTorch are possible backends for Transformers (the NN library). Summary: BERT is a highly complex and advanced language model that helps people automate language understanding.
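The attention mechanism those transformer layers are built on can be sketched in plain Python (a single head with no learned projections; purely illustrative, not the Transformers library API):

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention for one head, on plain lists of vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # output = attention-weighted average of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V)) for j in range(len(V[0]))])
    return out

# Two identical keys -> uniform weights -> output is the average of the values
print(attention([[1.0, 0.0]], [[1.0, 0.0], [1.0, 0.0]], [[1.0, 0.0], [3.0, 0.0]]))
# -> [[2.0, 0.0]]
```

An encoder stacks this (with learned Q/K/V projections, multiple heads, and feed-forward layers) L times; BERT stops there, with no decoder stack.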
Use the BERT model to train on the Yelp dataset.
Describes how to train a language model for a new language (Esperanto).
The secret, he found, was to stick the particle in a “disordered” quantum landscape, one dotted with peaks and valleys. Each location would have a random height, representing a random energy. In a real material, this disorder might come from impurities such as missing atoms or atoms of different elements.
cache.addAll treatment of '/', '/index.html', and other assets.