lucidrains/soundstorm-pytorch
Implementation of SoundStorm, Efficient Parallel Audio Generation from Google Deepmind, in Pytorch
Language: Python
#artificial_intelligence #attention_mechanism #audio_generation #deep_learning #non_autoregressive #transformers
Stars: 181 Issues: 0 Forks: 6
https://github.com/lucidrains/soundstorm-pytorch
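A minimal training/sampling sketch for the repo above, assuming its API follows the pattern in the lucidrains README (class names SoundStorm and ConformerWrapper, a MaskGIT-style steps/schedule configuration, and a generate method). All names and arguments here are recalled assumptions; check the current README before use.

```python
import torch
from soundstorm_pytorch import SoundStorm, ConformerWrapper

# wrap a conformer so it predicts tokens for every residual VQ level at once
conformer = ConformerWrapper(
    codebook_size = 1024,   # entries per residual codebook
    num_quantizers = 12,    # residual VQ levels produced by the audio codec
    conformer = dict(
        dim = 512,
        depth = 2
    ),
)

model = SoundStorm(
    conformer,
    steps = 18,             # parallel decoding iterations, as in MaskGIT
    schedule = 'cosine'
)

# stand-in for codec token ids from a neural audio codec such as SoundStream
codes = torch.randint(0, 1024, (2, 1024, 12))   # (batch, sequence, quantizers)

loss, _ = model(codes)
loss.backward()

# after training, tokens are sampled in parallel and decoded back to a waveform
generated = model.generate(1024, batch_size = 2)  # (2, 1024)
```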
ray-project/aviary
Ray Aviary - evaluate multiple LLMs easily
Language: Python
#distributed_systems #large_language_models #ray #serving #transformers
Stars: 279 Issues: 7 Forks: 13
https://github.com/ray-project/aviary
hiyouga/FastEdit
🩹Editing large language models within 10 seconds⚡
Language: Python
#bloom #chatbots #chatgpt #falcon #gpt #large_language_models #llama #llms #pytorch #transformers
Stars: 295 Issues: 5 Forks: 28
https://github.com/hiyouga/FastEdit
InternLM/lagent
A lightweight framework for building LLM-based agents
Language: Python
#agent #gpt #llm #transformers
Stars: 151 Issues: 2 Forks: 8
https://github.com/InternLM/lagent
tomaarsen/attention_sinks
Extend existing LLMs way beyond the original training length with constant memory usage, and without retraining
Language: Python
#llm #llms #nlp #python #transformers
Stars: 224 Issues: 2 Forks: 11
https://github.com/tomaarsen/attention_sinks
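A short, hedged sketch of the drop-in usage described above: the package presents auto classes that mirror the transformers ones, so existing generation code keeps working while the KV cache keeps a few initial "sink" tokens plus a sliding window of recent tokens. The attention_sink_size / attention_sink_window_size keyword arguments and their values are assumptions based on the README at the time and should be verified.

```python
from transformers import AutoTokenizer
from attention_sinks import AutoModelForCausalLM  # drop-in for transformers' auto class

model_id = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    attention_sink_size=4,           # initial "sink" tokens kept in the cache forever
    attention_sink_window_size=1020  # sliding window of most recent tokens
)

# generation proceeds exactly as with a plain transformers model
inputs = tokenizer("Attention sinks let a model stream text because", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```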
lucidrains/meshgpt-pytorch
Implementation of MeshGPT, SOTA Mesh generation using Attention, in Pytorch
Language: Python
#artificial_intelligence #attention_mechanisms #deep_learning #mesh_generation #transformers
Stars: 195 Issues: 0 Forks: 7
https://github.com/lucidrains/meshgpt-pytorch
kyegomez/MultiModalMamba
A novel implementation of fusing ViT with Mamba into a fast, agile, and high performance Multi-Modal Model. Powered by Zeta, the simplest AI framework ever.
Language: Python
#ai #artificial_intelligence #attention_mechanism #machine_learning #mamba #ml #pytorch #ssm #torch #transformer_architecture #transformers #zeta
Stars: 264 Issues: 0 Forks: 9
https://github.com/kyegomez/MultiModalMamba
lucidrains/self-rewarding-lm-pytorch
Implementation of the training framework proposed in Self-Rewarding Language Model, from MetaAI
Language: Python
#artificial_intelligence #beyond_human_data #deep_learning #self_rewarding #transformers
Stars: 367 Issues: 0 Forks: 17
https://github.com/lucidrains/self-rewarding-lm-pytorch
FoundationVision/VAR
[GPT beats diffusion🔥] [scaling laws in visual generation📈] Official impl. of "Visual Autoregressive Modeling: Scalable Image Generation via Next-Scale Prediction"
Language: Python
#auto_regressive_model #diffusion_models #image_generation #transformers
Stars: 440 Issues: 6 Forks: 10
https://github.com/FoundationVision/VAR
elicit/machine-learning-list
A curriculum for learning about foundation models, from scratch to the frontier
#artificial_intelligence #language_model #machine_learning #transformers
Stars: 188 Issues: 0 Forks: 9
https://github.com/elicit/machine-learning-list
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold Networks (KANs) for language modeling
Language: Python
#gpt #kanformers #kolmogorov_arnold_networks #kolmogorov_arnold_representation #llm #text_generation #transformers
Stars: 217 Issues: 2 Forks: 11
https://github.com/AdityaNG/kan-gpt
buaacyw/MeshAnythingV2
From anything to mesh like human artists. Official impl. of "MeshAnything V2: Artist-Created Mesh Generation With Adjacent Mesh Tokenization"
Language: Python
#3d #3d_generation #generative_model #mesh #transformers
Stars: 305 Issues: 3 Forks: 12
https://github.com/buaacyw/MeshAnythingV2