lucidrains/toolformer-pytorch
Implementation of Toolformer, Language Models That Can Use Tools, by MetaAI
Language: Python
#api_calling #artificial_intelligence #attention_mechanisms #deep_learning #transformers
Stars: 419 Issues: 2 Forks: 13
https://github.com/lucidrains/toolformer-pytorch
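Toolformer's core trick is splicing executable API-call annotations (e.g. `[Calculator(403*2) -> 806]`) into text so the model learns when tools help. A minimal conceptual sketch of a calculator tool and the annotation format, not the repo's actual API (function names here are illustrative):

```python
import ast
import operator

# Toy "Calculator" tool in the spirit of Toolformer's examples.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def calculator(expr: str) -> float:
    """Safely evaluate a simple arithmetic expression via the AST."""
    def ev(node):
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        if isinstance(node, ast.Constant):
            return node.value
        raise ValueError("unsupported expression")
    return ev(ast.parse(expr, mode="eval").body)

def annotate(text: str, pos: int, expr: str) -> str:
    """Splice a Toolformer-style API-call annotation into the text."""
    result = calculator(expr)
    call = f"[Calculator({expr}) -> {result:g}]"
    return text[:pos] + call + " " + text[pos:]

print(annotate("The answer is 806.", len("The answer is "), "403*2"))
# -> The answer is [Calculator(403*2) -> 806] 806.
```

In the paper, such annotations are kept only if conditioning on the tool result lowers the language-model loss on the following tokens; that filtering step is omitted here.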
SkalskiP/courses
This repository is a curated collection of links to various courses and resources about Artificial Intelligence (AI)
Language: Python
#computer_vision #deep_learning #deep_neural_networks #machine_learning #mlops #multimodal #natural_language_processing #nlp #transformers #tutorial
Stars: 323 Issues: 0 Forks: 29
https://github.com/SkalskiP/courses
0hq/WebGPT
Run a GPT model in the browser with WebGPU. An implementation of GPT inference in under ~2000 lines of vanilla JavaScript.
Language: JavaScript
#gpt #nanogpt #transformers #webgpu
Stars: 1059 Issues: 2 Forks: 44
https://github.com/0hq/WebGPT
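The kernel WebGPT implements as WebGPU compute shaders is scaled dot-product attention. A pure-Python reference of that computation (illustrative, not taken from the repo):

```python
import math

def softmax(xs):
    m = max(xs)                      # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Single-head scaled dot-product attention over lists of row vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# One query attending to two keys; the weight on the matching key dominates.
print(attention([[1.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]], [[1.0], [0.0]]))
```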
lucidrains/recurrent-memory-transformer-pytorch
Implementation of Recurrent Memory Transformer, a NeurIPS 2022 paper, in PyTorch
Language: Python
#artificial_intelligence #attention_mechanisms #deep_learning #long_context #memory #recurrence #transformers
Stars: 223 Issues: 0 Forks: 4
https://github.com/lucidrains/recurrent-memory-transformer-pytorch
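The Recurrent Memory Transformer idea: split a long sequence into segments and carry a small set of memory tokens across segment boundaries, so information flows without full-length attention. A toy sketch where a running sum stands in for the transformer (illustrative only):

```python
def segment_recurrent(tokens, segment_len, mem_size=1):
    """Process a sequence segment by segment; a fixed-size memory is
    read at the start of each segment and written at the end, letting
    later segments see a summary of earlier ones."""
    memory = [0.0] * mem_size
    outputs = []
    for start in range(0, len(tokens), segment_len):
        seg = tokens[start:start + segment_len]
        # read: each position sees memory carried from previous segments
        outputs.extend(t + memory[0] for t in seg)
        # write: fold this segment's summary into the memory
        memory[0] += sum(seg)
    return outputs, memory
```

Per-segment cost stays constant regardless of total sequence length, which is the point of the recurrence.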
lucidrains/MEGABYTE-pytorch
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in PyTorch
Language: Python
#artificial_intelligence #attention_mechanisms #deep_learning #learned_tokenization #long_context #transformers
Stars: 204 Issues: 0 Forks: 10
https://github.com/lucidrains/MEGABYTE-pytorch
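MEGABYTE's multiscale decomposition starts by grouping raw bytes into fixed-size patches: a large global model runs over patch embeddings while a small local model predicts the bytes inside each patch. A sketch of just the patchify step (illustrative, with zero-padding as an assumed choice):

```python
def patchify(data: bytes, patch_size: int):
    """Split a byte sequence into fixed-size patches, padding the last
    patch with null bytes so every patch has the same length."""
    pad = (-len(data)) % patch_size
    padded = data + b"\x00" * pad
    return [padded[i:i + patch_size] for i in range(0, len(padded), patch_size)]

print(patchify(b"hello", 4))
# -> [b'hell', b'o\x00\x00\x00']
```

The local model's sequence length is `patch_size`, so attention cost inside a patch is constant no matter how long the full byte stream is.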
PKU-Alignment/safe-rlhf
Safe-RLHF: Constrained Value Alignment via Safe Reinforcement Learning from Human Feedback
Language: Python
#ai_safety #alpaca #datasets #deepspeed #large_language_models #llama #llm #llms #reinforcement_learning #reinforcement_learning_from_human_feedback #rlhf #safe_reinforcement_learning #safe_reinforcement_learning_from_human_feedback #safe_rlhf #safety #transformers #vicuna
Stars: 279 Issues: 0 Forks: 14
https://github.com/PKU-Alignment/safe-rlhf
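Constrained RLHF of this kind is typically solved with a Lagrangian: maximize reward subject to an expected safety cost staying under a limit, with dual ascent on the multiplier. A sketch of that dual update (the function name and learning rate are illustrative, not the repo's API):

```python
def lagrange_step(lmbda, avg_cost, cost_limit, lr=0.1):
    """One dual-ascent step on the Lagrange multiplier: if measured
    safety cost exceeds the limit, the penalty on unsafe behavior
    grows; otherwise it decays toward zero (clipped at 0)."""
    return max(0.0, lmbda + lr * (avg_cost - cost_limit))
```

The policy then optimizes `reward - lmbda * cost`, so the trade-off between helpfulness and harmlessness is tuned automatically rather than by a fixed penalty weight.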
lucidrains/soundstorm-pytorch
Implementation of SoundStorm, Efficient Parallel Audio Generation from Google DeepMind, in PyTorch
Language: Python
#artificial_intelligence #attention_mechanism #audio_generation #deep_learning #non_autoregressive #transformers
Stars: 181 Issues: 0 Forks: 6
https://github.com/lucidrains/soundstorm-pytorch
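SoundStorm generates non-autoregressively with MaskGIT-style confidence-based decoding: all positions start masked, and each pass commits the most confident predictions in parallel. A toy sketch of that loop (the `predict` interface is an assumption for illustration):

```python
def iterative_decode(length, predict, steps):
    """Confidence-based parallel decoding: `predict(seq)` returns
    {pos: (value, confidence)} for every still-masked slot; each step
    commits the most confident half, so generation takes a few
    parallel passes instead of one pass per token."""
    seq = [None] * length
    for _ in range(steps):
        proposals = predict(seq)
        if not proposals:
            break  # everything is filled in
        k = max(1, len(proposals) // 2)
        best = sorted(proposals, key=lambda p: -proposals[p][1])[:k]
        for p in best:
            seq[p] = proposals[p][0]
    return seq

# Toy predictor: proposes value=pos, with earlier positions more confident.
toy = lambda seq: {i: (i, -i) for i, v in enumerate(seq) if v is None}
print(iterative_decode(4, toy, steps=10))
# -> [0, 1, 2, 3]
```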
ray-project/aviary
Ray Aviary - evaluate multiple LLMs easily (the repository has since been renamed to ray-project/ray-llm)
Language: Python
#distributed_systems #large_language_models #ray #serving #transformers
Stars: 279 Issues: 7 Forks: 13
https://github.com/ray-project/aviary
hiyouga/FastEdit
🩹Editing large language models within 10 seconds⚡
Language: Python
#bloom #chatbots #chatgpt #falcon #gpt #large_language_models #llama #llms #pytorch #transformers
Stars: 295 Issues: 5 Forks: 28
https://github.com/hiyouga/FastEdit
InternLM/lagent
A lightweight framework for building LLM-based agents
Language: Python
#agent #gpt #llm #transformers
Stars: 151 Issues: 2 Forks: 8
https://github.com/InternLM/lagent
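Agent frameworks like lagent revolve around one loop: show the model the transcript, let it either call a tool or answer, append the observation, repeat. A minimal sketch of that loop, not lagent's actual API (the `llm` and `tools` interfaces are assumptions):

```python
def run_agent(llm, tools, task, max_turns=5):
    """Minimal LLM-agent loop: `llm(transcript)` must return either
    ("tool", name, arg) to invoke a tool or ("final", text) to answer.
    Tool output is appended as an observation and the loop continues."""
    transcript = [("task", task)]
    for _ in range(max_turns):
        step = llm(transcript)
        if step[0] == "final":
            return step[1]
        _, name, arg = step
        result = tools[name](arg)
        transcript.append(("observation", result))
    return None  # turn budget exhausted without a final answer

# Scripted stand-in for a real model: call a tool once, then answer.
def scripted_llm(tr):
    if len(tr) == 1:
        return ("tool", "double", 21)
    return ("final", tr[-1][1])

print(run_agent(scripted_llm, {"double": lambda x: x * 2}, "what is 21 doubled?"))
# -> 42
```

Real frameworks add prompt formatting, tool schemas, and output parsing around this skeleton, but the control flow is the same.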
tomaarsen/attention_sinks
Extend existing LLMs way beyond the original training length with constant memory usage, and without retraining
Language: Python
#llm #llms #nlp #python #transformers
Stars: 224 Issues: 2 Forks: 11
https://github.com/tomaarsen/attention_sinks
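The attention-sink trick (from the StreamingLLM paper this repo implements): keep the first few tokens in the KV cache forever, because they soak up a large share of attention mass, plus a sliding window of recent tokens. The cache stays constant-size however long generation runs. A sketch of just the cache policy (class and method names are illustrative):

```python
from collections import deque

class SinkCache:
    """Constant-size cache: first `n_sink` tokens are kept permanently,
    everything else lives in a sliding window of the most recent tokens."""
    def __init__(self, n_sink=4, window=8):
        self.n_sink = n_sink
        self.sinks = []
        self.recent = deque(maxlen=window)  # old entries evicted automatically

    def append(self, token):
        if len(self.sinks) < self.n_sink:
            self.sinks.append(token)
        else:
            self.recent.append(token)

    def visible(self):
        """Tokens the model attends to at the current step."""
        return self.sinks + list(self.recent)
```

In the real library this policy is applied to per-layer key/value tensors rather than token ids, but the eviction logic is the same shape.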
lucidrains/meshgpt-pytorch
Implementation of MeshGPT, SOTA mesh generation using attention, in PyTorch
Language: Python
#artificial_intelligence #attention_mechanisms #deep_learning #mesh_generation #transformers
Stars: 195 Issues: 0 Forks: 7
https://github.com/lucidrains/meshgpt-pytorch
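To make meshes amenable to a transformer, vertex coordinates are quantized to a discrete vocabulary so each triangle becomes a short token sequence. A sketch of that quantization step (MeshGPT additionally learns a residual-quantized codebook; this shows only the basic coordinate binning, with illustrative names):

```python
def tokenize_triangle(vertices, n_bins=128, lo=-1.0, hi=1.0):
    """Snap each of the 9 coordinates of a triangle (3 vertices x, y, z)
    to one of `n_bins` uniform levels over [lo, hi], yielding 9 discrete
    tokens a model can predict autoregressively."""
    def q(x):
        t = (x - lo) / (hi - lo)          # normalize to [0, 1]
        return min(n_bins - 1, max(0, int(t * n_bins)))
    return [q(c) for v in vertices for c in v]

print(tokenize_triangle([(-1.0, 0.0, 1.0), (0.0, 0.0, 0.0), (1.0, -1.0, 0.0)], n_bins=4))
# -> [0, 2, 3, 2, 2, 2, 3, 0, 2]
```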
kyegomez/MultiModalMamba
A novel implementation fusing ViT with Mamba into a fast, agile, high-performance multi-modal model. Powered by Zeta, the simplest AI framework ever.
Language: Python
#ai #artificial_intelligence #attention_mechanism #machine_learning #mamba #ml #pytorch #ssm #torch #transformer_architecture #transformers #zeta
Stars: 264 Issues: 0 Forks: 9
https://github.com/kyegomez/MultiModalMamba
lucidrains/self-rewarding-lm-pytorch
Implementation of the training framework proposed in Self-Rewarding Language Model, from MetaAI
Language: Python
#artificial_intelligence #beyond_human_data #deep_learning #self_rewarding #transformers
Stars: 367 Issues: 0 Forks: 17
https://github.com/lucidrains/self-rewarding-lm-pytorch
FoundationVision/VAR
[NeurIPS 2024 Oral] [GPT beats diffusion🔥] [scaling laws in visual generation📈] Official impl. of "Visual Autoregressive Modeling: Scalable Image Generation via Next-Scale Prediction"
Language: Python
#auto_regressive_model #diffusion_models #image_generation #transformers
Stars: 440 Issues: 6 Forks: 10
https://github.com/FoundationVision/VAR
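VAR's next-scale prediction replaces next-token generation with next-map generation: the model predicts a whole token map per step, coarse to fine, each scale conditioned on an upsampled version of the previous one. A toy sketch of the scale schedule, where simple nearest-neighbour upsampling stands in for the model (illustrative only):

```python
def upsample(grid):
    """Nearest-neighbour 2x upsampling of a square grid (list of lists)."""
    out = []
    for row in grid:
        wide = [v for v in row for _ in (0, 1)]   # double each column
        out.extend([wide, list(wide)])            # double each row
    return out

def next_scale_schedule(n_scales=3):
    """Coarse-to-fine generation: start from a 1x1 token map and double
    the resolution at each step. A real model would predict residual
    tokens at each scale on top of the upsampled previous scale."""
    grid = [[0]]
    sizes = [1]
    for _ in range(n_scales - 1):
        grid = upsample(grid)
        sizes.append(len(grid))
    return sizes, grid

print(next_scale_schedule(3)[0])
# -> [1, 2, 4]
```

Because each step emits an entire map in parallel, the number of sequential steps grows with the number of scales rather than with the number of tokens.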
elicit/machine-learning-list
A curriculum for learning about foundation models, from scratch to the frontier
#artificial_intelligence #language_model #machine_learning #transformers
Stars: 188 Issues: 0 Forks: 9
https://github.com/elicit/machine-learning-list
AdityaNG/kan-gpt
The PyTorch implementation of Generative Pre-trained Transformers (GPTs) using Kolmogorov-Arnold Networks (KANs) for language modeling
Language: Python
#gpt #kanformers #kolmogorov_arnold_networks #kolmogorov_arnold_representation #llm #text_generation #transformers
Stars: 217 Issues: 2 Forks: 11
https://github.com/AdityaNG/kan-gpt
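The KAN idea behind this repo: where an MLP edge is a scalar weight, a KAN edge applies a learnable univariate function, parameterized as a spline over a grid of knots. A sketch of a single edge using piecewise-linear interpolation (KANs proper use B-splines; names here are illustrative):

```python
def pwl_edge(x, knots, values):
    """One KAN-style edge: a learnable 1-D function represented by
    `values` at `knots`, evaluated by piecewise-linear interpolation
    and clamped flat outside the grid."""
    if x <= knots[0]:
        return values[0]
    for (x0, x1), (y0, y1) in zip(zip(knots, knots[1:]), zip(values, values[1:])):
        if x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return values[-1]

print(pwl_edge(0.5, [0.0, 1.0], [0.0, 2.0]))
# -> 1.0
```

Training adjusts the `values` (and, in full KANs, the spline coefficients), so the network learns the shape of each edge's activation rather than a single multiplier.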
buaacyw/MeshAnythingV2
From anything to mesh like human artists. Official impl. of "MeshAnything V2: Artist-Created Mesh Generation With Adjacent Mesh Tokenization"
Language: Python
#3d #3d_generation #generative_model #mesh #transformers
Stars: 305 Issues: 3 Forks: 12
https://github.com/buaacyw/MeshAnythingV2
edwko/OuteTTS
Interface for OuteTTS models.
Language: Python
#gguf #llama #text_to_speech #transformers #tts
Stars: 278 Issues: 6 Forks: 13
https://github.com/edwko/OuteTTS