baichuan-inc/Baichuan-13B
A 13B large language model developed by Baichuan Intelligent Technology
Language: Python
#artificial_intelligence #benchmark #ceval #chatgpt #chinese #gpt_4 #huggingface #large_language_models #mmlu #natural_language_processing
Stars: 459 Issues: 3 Forks: 24
https://github.com/baichuan-inc/Baichuan-13B
kyegomez/LongNet
Implementation of plug-and-play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
Language: Python
#artificial_intelligence #attention #attention_is_all_you_need #attention_mechanisms #chatgpt #context_length #gpt3 #gpt4 #machine_learning #transformer
Stars: 381 Issues: 4 Forks: 55
https://github.com/kyegomez/LongNet
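The core idea behind LongNet is dilated attention: split the sequence into segments and, within each segment, attend only over every r-th token, so cost grows roughly linearly with sequence length instead of quadratically. Below is a minimal single-head sketch of that sparsification in plain Python; the function name, the single (segment_len, dilation) pair, and the list-of-floats token representation are illustrative assumptions, not the kyegomez/LongNet API (the paper combines several segment/dilation pairs so every position is covered).

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dilated_attention(q, k, v, segment_len, dilation):
    """Sketch of one (segment_len, dilation) branch of dilated attention.

    q, k, v: lists of token vectors (lists of floats), all the same length.
    Only every `dilation`-th position inside each segment attends (and is
    attended to); unselected positions are left as zero vectors here,
    whereas the real method sums several branches so all are covered.
    """
    n, d = len(q), len(v[0])
    out = [[0.0] * d for _ in range(n)]
    scale = math.sqrt(len(q[0]))
    for start in range(0, n, segment_len):
        # Sparse index set for this segment: every `dilation`-th token.
        idx = list(range(start, min(start + segment_len, n), dilation))
        for i in idx:
            # Attention restricted to the sparsified segment only.
            weights = softmax([dot(q[i], k[j]) / scale for j in idx])
            out[i] = [sum(w * v[j][t] for w, j in zip(weights, idx))
                      for t in range(d)]
    return out
```

With segment_len equal to the sequence length and dilation=1 this reduces to ordinary full attention over the sequence; larger dilation values trade coverage for speed, which is what lets the method scale to very long contexts.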