hiyouga/FastEdit
🩹Editing large language models within 10 seconds⚡
Language: Python
#bloom #chatbots #chatgpt #falcon #gpt #large_language_models #llama #llms #pytorch #transformers
Stars: 295 Issues: 5 Forks: 28
https://github.com/hiyouga/FastEdit
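FastEdit-style editors can finish in seconds because they apply a targeted, low-rank weight update instead of fine-tuning. Below is a minimal conceptual sketch of a rank-one edit in the spirit of ROME-family editors; the tensors, layer choice, and function name are illustrative assumptions, not FastEdit's actual API.

```python
# Conceptual sketch of rank-one model editing (ROME-style); NOT FastEdit's API.
# Assumption: we edit one MLP projection weight W so that a "key" activation k
# (representing the subject) now maps to a new "value" v (the edited fact).
import torch

def rank_one_edit(W: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                  C_inv: torch.Tensor) -> torch.Tensor:
    """Return an edited copy of W such that W_new @ k == v.

    W     : (d_out, d_in) weight of the chosen MLP projection
    k     : (d_in,)  key activation for the fact being rewritten
    v     : (d_out,) desired output activation encoding the new fact
    C_inv : (d_in, d_in) inverse covariance of keys, to disturb other facts less
    """
    residual = v - W @ k              # what the current weight gets wrong
    update_dir = C_inv @ k            # direction that minimally perturbs other keys
    delta = torch.outer(residual, update_dir) / (k @ update_dir)
    return W + delta                  # single rank-one update, no gradient descent

# Toy usage with random tensors (real editors derive k and v from model activations).
d_in, d_out = 16, 8
W = torch.randn(d_out, d_in)
k, v = torch.randn(d_in), torch.randn(d_out)
W_new = rank_one_edit(W, k, v, torch.eye(d_in))
print(torch.allclose(W_new @ k, v, atol=1e-5))  # True: the new fact is now stored
```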
liltom-eth/llama2-webui
Run Llama 2 locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Supports Llama-2-7B/13B/70B with 8-bit and 4-bit quantization. Supports GPU inference (6 GB VRAM) and CPU inference.
Language: Python
#llama_2 #llama2 #llm #llm_inference
Stars: 481 Issues: 2 Forks: 42
https://github.com/liltom-eth/llama2-webui
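The 8-bit/4-bit support mentioned above is the usual bitsandbytes quantization path. Here is a minimal sketch of loading a Llama 2 chat model that way with Hugging Face transformers; this is a generic illustration of the technique, not llama2-webui's own `llama2-wrapper` API, and the model ID and generation settings are assumptions.

```python
# Generic sketch of 4-bit quantized loading via transformers + bitsandbytes;
# not the llama2-webui / llama2-wrapper API. Model ID is an assumption (gated on HF).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-7b-chat-hf"

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit weights fit a 7B model in ~6 GB VRAM
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",                      # spreads layers across GPU/CPU as needed
)

# Llama 2 chat prompt format.
prompt = "[INST] Explain quantization in one sentence. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```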
xNul/code-llama-for-vscode
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot.
Language: Python
#assistant #code #code_llama #codellama #continue #continuedev #copilot #llama #llama2 #llamacpp #llm #local #meta #ollama #studio #visual #vscode
Stars: 170 Issues: 3 Forks: 6
https://github.com/xNul/code-llama-for-vscode
tairov/llama2.mojo
Inference Llama 2 in one file of pure 🔥
#inference #llama #llama2 #modular #mojo #parallelize #performance #simd #tensor #vectorization
Stars: 200 Issues: 0 Forks: 7
https://github.com/tairov/llama2.mojo
Fuzzy-Search/realtime-bakllava
llama.cpp with the BakLLaVA model describes what it sees
Language: Python
#bakllavva #cpp #demo_application #inference #llama #llamacpp #llm
Stars: 141 Issues: 1 Forks: 15
https://github.com/Fuzzy-Search/realtime-bakllava
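Setups like this typically run the llama.cpp server with the BakLLaVA weights plus the CLIP projector, then send camera frames as base64 image data. A minimal client-side sketch follows; the server URL, the `/completion` payload fields, and the prompt format are assumptions based on llama.cpp's server API around that time, not this repo's exact code.

```python
# Sketch of sending one frame to a llama.cpp server running a LLaVA/BakLLaVA model.
# Assumes the server was started separately (model + mmproj) and listens on port 8080;
# payload fields follow llama.cpp's /completion API of that era and may have changed.
import base64
import requests

def describe_image(path: str, server: str = "http://localhost:8080") -> str:
    with open(path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode("utf-8")

    payload = {
        # "[img-10]" refers to the image with id 10 in image_data below.
        "prompt": "USER: [img-10] Describe what you see.\nASSISTANT:",
        "image_data": [{"data": image_b64, "id": 10}],
        "n_predict": 128,
        "temperature": 0.1,
    }
    resp = requests.post(f"{server}/completion", json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["content"]

if __name__ == "__main__":
    print(describe_image("frame.jpg"))  # e.g. a webcam frame dumped to disk
```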
lxe/llavavision
A simple "Be My Eyes" web app with a llama.cpp/llava backend
Language: JavaScript
#ai #artificial_intelligence #computer_vision #llama #llamacpp #llm #local_llm #machine_learning #multimodal #webapp
Stars: 284 Issues: 0 Forks: 7
https://github.com/lxe/llavavision
A simple "Be My Eyes" web app with a llama.cpp/llava backend
Language: JavaScript
#ai #artificial_intelligence #computer_vision #llama #llamacpp #llm #local_llm #machine_learning #multimodal #webapp
Stars: 284 Issues: 0 Forks: 7
https://github.com/lxe/llavavision
GitHub
GitHub - lxe/llavavision: A simple "Be My Eyes" web app with a llama.cpp/llava backend
A simple "Be My Eyes" web app with a llama.cpp/llava backend - lxe/llavavision