eternal singularity
you can now install & run your own local ChatGPT-like model
they fine-tuned a 13B LLaMA model; it should be better than the 7B one, check it out
PanGu-Σ: Towards Trillion Parameter Language Model with Sparse Heterogeneous Computing
Presents a sparse language model with 1T parameters trained on 329B tokens