Issues: TabbyML/tabby
vscode: Option to not store edit prompts history
enhancement (New feature or request) · #2673 · opened Jul 16, 2024 by iTrooz
vscode: "Edit" command (or Ctrl+I) writes weird stuff in the editor
bug-unconfirmed · #2672 · opened Jul 16, 2024 by iTrooz
Provide a new code snippet filtering mode
enhancement (New feature or request) · #2664 · opened Jul 16, 2024 by kannae97
Can anyone give an example of using the OpenAI interface?
enhancement (New feature or request) · #2659 · opened Jul 16, 2024 by Arcmoon-Hu
Answer Engine Quality - Ideas
enhancement (New feature or request) · #2657 · opened Jul 16, 2024 by wsxiaoys
Reuse llama-server for models supporting both chat / FIM completion (e.g. Codestral)
enhancement (New feature or request), good first issue (Good for newcomers) · #2654 · opened Jul 16, 2024 by wsxiaoys
Tabby loads the same model twice when using Codestral-22B for both chat and tab completion
bug-unconfirmed · #2652 · opened Jul 15, 2024 by bubundas17
Tabby plugin for JetBrains IDE is always stuck in initialization
bug (Something isn't working) · #2650 · opened Jul 15, 2024 by danny-su
Generic Bash Autocomplete Support
enhancement (New feature or request) · #2644 · opened Jul 15, 2024 by sirebellum
Allow customizing context length in config.toml
enhancement (New feature or request), good first issue (Good for newcomers) · #2638 · opened Jul 14, 2024 by wsxiaoys
llama-server with CPU device is not working in the Docker image
bug (Something isn't working) · #2634 · opened Jul 13, 2024 by b-reich
Use int8 reranking for search scoring
enhancement (New feature or request), good first issue (Good for newcomers) · #2633 · opened Jul 13, 2024 by wsxiaoys
VS Code Extension: Chat responses are not shown in Tabby Chat Window
bug-unconfirmed · #2628 · opened Jul 12, 2024 by Eulenator
llama-server distributed with Tabby requires the AVX2 CPU instruction set
documentation (Improvements or additions to documentation) · #2597 · opened Jul 8, 2024 by KweezyCode
Add IDE information (e.g. vim / vscode / ...) to the ~/.tabby/events log
enhancement (New feature or request) · #2581 · opened Jul 5, 2024 by wsxiaoys
Please consider using standard XDG paths for configs and data
enhancement (New feature or request) · #2563 · opened Jul 2, 2024 by bendavis78
Enable repository context in Answer Engine
enhancement (New feature or request) · #2561 · opened Jul 2, 2024 by coffeebe4code
Logged out of Tabby Web after about 10 minutes even when active
bug-unconfirmed · #2557 · opened Jul 1, 2024 by ge-hall
bug: --chat-device option broken (Mixed GPU + CPU for completion + chat models)
enhancement (New feature or request) · #2527 · opened Jun 27, 2024 by jtbr
Code completion often lags; are there any configuration options to optimize it?
#2493 · opened Jun 24, 2024 by 5bug