Separius/awesome-fast-attention
list of efficient attention modules
Language: Python
#attention #attention_is_all_you_need #awesome #linformer #longformer #multihead_attention #reformer #self_attention #transformer #transformer_network
Stars: 139 Issues: 0 Forks: 10
https://github.com/Separius/awesome-fast-attention
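
The list catalogs modules that reduce the quadratic cost of self-attention. As a hedged illustration of one technique it covers (Linformer, per the hashtags above), here is a minimal single-head sketch: keys and values are projected along the sequence axis from length n down to a fixed k, so the score matrix is n x k instead of n x n. The class name and dimensions are illustrative, not the API of any module in the list.

```python
import torch
import torch.nn as nn

class LinformerSelfAttention(nn.Module):
    """Single-head Linformer-style attention: learned (k x n) projections
    shrink keys/values over the sequence axis, making scores (n x k)."""
    def __init__(self, dim, seq_len, k=64):
        super().__init__()
        self.scale = dim ** -0.5
        self.to_q = nn.Linear(dim, dim, bias=False)
        self.to_k = nn.Linear(dim, dim, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)
        # learned projections over the sequence dimension (n -> k)
        self.proj_k = nn.Parameter(torch.randn(k, seq_len) / seq_len ** 0.5)
        self.proj_v = nn.Parameter(torch.randn(k, seq_len) / seq_len ** 0.5)

    def forward(self, x):                                  # x: (batch, n, dim)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        k = torch.einsum('kn,bnd->bkd', self.proj_k, k)    # (batch, k, dim)
        v = torch.einsum('kn,bnd->bkd', self.proj_v, v)    # (batch, k, dim)
        scores = torch.einsum('bnd,bkd->bnk', q, k) * self.scale
        return torch.einsum('bnk,bkd->bnd', scores.softmax(dim=-1), v)
```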
kyegomez/LongNet
Implementation of plug-and-play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Language: Python
#artificial_intelligence #attention #attention_is_all_you_need #attention_mechanisms #chatgpt #context_length #gpt3 #gpt4 #machine_learning #transformer
Stars: 381 Issues: 4 Forks: 55
https://github.com/kyegomez/LongNet
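
The paper's core idea is dilated attention: the sequence is split into segments, tokens are subsampled within each segment at a dilation rate, and attention runs only over the subsampled segments, keeping cost near-linear in sequence length. Below is a toy single-head sketch of that idea under stated assumptions; it is not this repo's API, and the real method mixes several (segment length, dilation) pairs across heads. Function name and parameters are illustrative.

```python
import torch

def dilated_attention(q, k, v, segment_len=4, dilation=2):
    """Toy dilated attention in the spirit of LongNet: attend within
    dilation-subsampled segments, then scatter outputs back in place.
    Positions skipped by the dilation are left as zeros here."""
    b, n, d = q.shape
    assert n % segment_len == 0
    # reshape to (batch, num_segments, segment_len, dim)
    q = q.view(b, -1, segment_len, d)
    k = k.view(b, -1, segment_len, d)
    v = v.view(b, -1, segment_len, d)
    # keep every `dilation`-th token inside each segment
    qs, ks, vs = q[:, :, ::dilation], k[:, :, ::dilation], v[:, :, ::dilation]
    scores = qs @ ks.transpose(-2, -1) * d ** -0.5
    out_sparse = scores.softmax(dim=-1) @ vs
    out = torch.zeros_like(q)
    out[:, :, ::dilation] = out_sparse
    return out.view(b, n, d)

x = torch.randn(1, 16, 8)
y = dilated_attention(x, x, x)   # (1, 16, 8)
```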