openSUSE Project Listed as Organization on Hugging Face - openSUSE News
Optimum
Optimum is an extension of Transformers that provides a set of performance-optimization tools for training and running models on targeted hardware with maximum efficiency. It is also a repository of small, mini, and tiny models.
google/bert_uncased_L-4_H-256_A-4 · Hugging Face
Repository of all BERT models, including the small variants. Start with this model for testing.
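A quick way to smoke-test this small BERT variant (a minimal sketch, assuming `transformers` and `torch` are installed and the model can be downloaded from the Hub):

```python
from transformers import AutoTokenizer, AutoModel

# BERT-mini: 4 layers, hidden size 256, 4 attention heads (per the repo name).
model_name = "google/bert_uncased_L-4_H-256_A-4"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a sentence and run it through the model.
inputs = tokenizer("Newspeak is a small language.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # [1, seq_len, 256]
```

Because the model is tiny, this runs comfortably on a CPU, which makes it handy for pipeline testing before committing to a larger model.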
stabilityai (Stability AI)
My account on Stability AI - it is just a link to Hugging Face
Open LLM Leaderboard - a Hugging Face Space by HuggingFaceH4
Comparison of the efficiency of all LLM models on Hugging Face
6 Ways to Run LLMs Locally (also how to use HuggingFace)
Various methods for running LLMs locally; Hugging Face is only one of them.
Training Bert on Yelp - Copy of training.ipynb - Colaboratory
Optimizing LLMs for Speed and Memory
bigcode (BigCode)
Research community developing various code models, small and big. Models may not be instruction-tuned.
WizardLM (WizardLM)
deepseek-ai (DeepSeek)
They have the 1.3B version! This may be the best one to start with for Newspeak. Training should work even on Hugging Face.
Can Ai Code Results - a Hugging Face Space by mike-ravkine
Comparison of LLM models for coding
openchat/openchat-3.5-0106 · Hugging Face
Open source with lots of information. Uses multiple underlying models. Not sure how I would train for it.
Welcome Mixtral - a SOTA Mixture of Experts on Hugging Face
The Mixtral model is new and seems to be good. Click on "Demo" to test it.
StarCoder: A State-of-the-Art LLM for Code
Article has comparison with other code-LLM models
codellama (Code Llama) - Hugging Face model for generating programs. Maybe it can be used for Newspeak?
Fine-tune a pretrained model
Use the BERT model to train on the Yelp dataset
Hugging Face – The AI community building the future.
My account and profile on Hugging Face - the home of AI transformers, models, and training sets
