kyegomez/MultiModalMamba
A novel implementation fusing a ViT with Mamba into a fast, agile, high-performance multi-modal model. Powered by Zeta, the simplest AI framework ever.
Language: Python
#ai #artificial_intelligence #attention_mechanism #machine_learning #mamba #ml #pytorch #ssm #torch #transformer_architecture #transformers #zeta
Stars: 264 Issues: 0 Forks: 9
https://github.com/kyegomez/MultiModalMamba
buaacyw/MeshAnything
From anything to mesh like human artists
Language: Python
#3d #generative_ai #generative_model #mesh #mesh_generation #transformer
Stars: 405 Issues: 1 Forks: 12
https://github.com/buaacyw/MeshAnything
Official implementation of "MeshAnything: Artist-Created Mesh Generation with Autoregressive Transformers" (ICLR 2025).
InternLM/MindSearch
🔍 An LLM-based multi-agent framework for a web search engine, similar to Perplexity.ai Pro and SearchGPT
Language: Python
#ai_search_engine #gpt #llm #llms #multi_agent_systems #perplexity_ai #search #searchgpt #transformer #web_search
Stars: 792 Issues: 9 Forks: 60
https://github.com/InternLM/MindSearch
DepthAnything/Video-Depth-Anything
Video Depth Anything: Consistent Depth Estimation for Super-Long Videos
Language: Python
#depth_estimation #monocular_depth_estimation #transformer #video_depth
Stars: 234 Issues: 2 Forks: 8
https://github.com/DepthAnything/Video-Depth-Anything
Accepted as a CVPR 2025 Highlight.
MoonshotAI/MoBA
MoBA: Mixture of Block Attention for Long-Context LLMs
Language: Python
#flash_attention #llm #llm_serving #llm_training #moe #pytorch #transformer
Stars: 521 Issues: 2 Forks: 16
https://github.com/MoonshotAI/MoBA
therealoliver/Deepdive-llama3-from-scratch
Implement Llama 3 inference step by step: grasp the core concepts, follow the full process derivation, and write the code yourself.
Language: Jupyter Notebook
#attention #attention_mechanism #gpt #inference #kv_cache #language_model #llama #llm_configuration #llms #mask #multi_head_attention #positional_encoding #residuals #rms #rms_norm #rope #rotary_position_encoding #swiglu #tokenizer #transformer
Stars: 388 Issues: 0 Forks: 28
https://github.com/therealoliver/Deepdive-llama3-from-scratch