Issues: ollama/ollama
Labels: bug (Something isn't working), feature request (New feature or request), importing (Issues relating to ollama create)

- 0.1.39 doesn't work with Nvidia Xavier NX [bug] #4693, opened May 29, 2024 by ZanMax
- Suppress spinner option or add file output option [feature request] #4686, opened May 28, 2024 by tyson-nw
- Model download eventually fails behind company firewall [bug] #4684, opened May 28, 2024 by berndgoetz
- Please support Baichuan series models [feature request] #4678, opened May 28, 2024 by Han-Huaqiao
- ollama create test -f ./Modelfile fails with Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message" [bug] #4676, opened May 28, 2024 by farmountain
- Error: llama runner process has terminated: exit status 0xc0000409 [bug] #4675, opened May 28, 2024 by FreemanFeng
- Any command but serve gets errors when using a proxy [bug] #4674, opened May 28, 2024 by lingfengchencn
- llama3 8b BF16 error [bug, importing] #4670, opened May 27, 2024 by ccbadd
- The ability to log the generated text response [feature request] #4669, opened May 27, 2024 by MarkWard0110
- Add the contents of the "/api/ps" endpoint to "docs/api.md" [feature request] #4667, opened May 27, 2024 by mili-tan
- ollama doesn't create a model from a Modelfile and gives an error [bug] #4666, opened May 27, 2024 by tMrMorgan
- Ollama support for MiniCPM-Llama3-V 2.5 [feature request] #4664, opened May 27, 2024 by zhqfdn
- Can Ollama be run with NeMo Guardrails? [bug] #4662, opened May 27, 2024 by ShreyasChhetri
- Changing seed does not change response [bug] #4660, opened May 27, 2024 by ccreutzi
- Can Ollama run as a Windows service? [feature request] #4658, opened May 27, 2024 by petersha0630
- [Windows 10] Error: llama runner process has terminated: exit status 0xc0000139 [bug] #4657, opened May 27, 2024 by bogdandinga
- invalid conversion from ‘void*’ to ‘unsigned int’ [-fpermissive] [bug] #4655, opened May 27, 2024 by Zhou-CyberSecurity-AI
- Can the model download page add a new ranking? [feature request] #4654, opened May 27, 2024 by despairTK
- Settings file in addition to environment flags [feature request] #4649, opened May 26, 2024 by chigkim