- xAI Grok 4 free (from OpenRouter) added to the initial models
- Chat with AI with project context removed (the agent does it better)
- Chat with AI about llama-vscode now uses the agent instead of the webui
- Agent - new buttons "Tools Model" and "Agent" - possibility to view the selected model and agent and to change them.
resources/help.md (+18 −32)
@@ -2,6 +2,7 @@
 ### What are agent commands
 Agent commands are a way to reuse frequently used prompts. They can be used to describe complex workflows or for simple instructions.
+Agent commands are stored in the setting agent_commands; the llama-vscode menu item "Agent commands..." can be used to manage them.
 In the future they could be extended with additional context, specifically prepared by llama-vscode for each command.

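As a purely hypothetical sketch of how the agent_commands setting could look in settings.json (the exact setting key and field names are assumptions, not the documented schema):

```json
{
  "llama-vscode.agent_commands": [
    {
      "name": "explain",
      "prompt": "Explain the selected code step by step."
    }
  ]
}
```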
### How to use them
@@ -17,34 +18,17 @@ The agent will explain the selected code.
 ## Chat with AI about llama-vscode

 ### Required servers
-- Chat server
+- Tools server

 ### How to use it
-This is a conversation with the local AI about llama-vscode, something like a help on how to use llama-vscode.
-- From the llama-vscode menu select "Chat with AI about llama-vscode" -> a window will be opened (the conversation history overlays the actual chat window, but just click on the chat window)
+This is a conversation with the llama-vscode help agent about llama-vscode, something like a help on how to use llama-vscode.
+- From the llama-vscode menu select "Chat with AI about llama-vscode" -> the agent will be opened
 - Enter your question about llama-vscode
-The first time it could take longer to answer. The following questions will be answered faster, as the input will be cached.
+The first time it could take longer to answer. The following questions will be answered faster, as the help information will be cached.

 ## Chat with AI with project context
-
-### Required servers
-- Chat server
-- Embeddings server
-
-### How to use it
-This is a conversation with the local AI. It uses the project information and is therefore slower than Chat with AI, but it can answer questions related to the project.
-- Press Ctrl+Shift+; inside an editor (or select Chat with AI with project context from the llama.vscode menu)
-- Enter your question
-- llama-vscode collects relevant context information from the project and sends it to the AI together with your question
-- Project context information is sent to the AI only if the question is entered with Ctrl+Shift+;. If the question is written directly in the chat window, no new context information is sent to the AI.
-- If the AI answers too slowly, close the VS Code chat window and open a new one with Ctrl+Shift+;
-- Press Esc if you want to return from the chat to the editor
-
-It is possible to configure the rag_* settings to adjust the RAG search according to the models and hardware resources
+This is removed. Chat with AI with project context is equal to using the agent with the tool search_source. The agent has many other tools and is therefore a better choice.

 ## Chat with AI
@@ -56,7 +40,6 @@ It is possible to configure the rag_* settings to adjust the RAG search according to
 This is a conversation with the local AI. Mainly for asking reference questions instead of searching with Google. It doesn't use the project information and is therefore fast.
 - Press Ctrl+; inside an editor (or select Chat with AI from the llama.vscode menu) - A chat window will open inside VS Code
 - Enter your message and start the chat
-- Press Esc if you want to return from the chat to the editor

@@ -184,12 +167,12 @@ This generates a commit message, based on the current changes.

-## Version 0.0.19 is released (18.08.2025)
+## Version 0.0.27 is released (21.09.2025)
 ## What is new
-* llama.cpp already supports gpt-oss with tools! If you have 20+ GB VRAM available, you could select the env "Local, full package - min, gpt-oss 20B" and use all AI features, including Llama Agent, only with local models.
-* The Llama agent UI now shows tables correctly
-* Auto select/start the last used env if desired - setting Env_start_last_used
-* Agent (system prompt + default tools) selection is now possible ("Agents..." -> "Select/start agent" from the llama-vscode menu). All agent details are stored in the setting agents_list. Export/import of agents is also possible, i.e. they can be shared.
+- xAI Grok 4 free (from OpenRouter) added to the initial models
+- Chat with AI with project context removed (the agent does it better)
+- Chat with AI about llama-vscode now uses the agent instead of the webui
+- Agent - new buttons "Tools Model" and "Agent" - possibility to view the selected model and agent and to change them.

 ## Setup instructions for llama.cpp server
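As a rough illustration only of what the setup above involves (the model file names below are placeholders, and the flags should be checked against your llama.cpp version with `llama-server --help`), the two kinds of servers this help refers to might be started like:

```shell
# Chat/tools server (placeholder model path)
llama-server -m ./models/chat-model.gguf --port 8012

# Embeddings server, needed by the search_source tool (placeholder model path)
llama-server -m ./models/embedding-model.gguf --embedding --port 8010
```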
@@ -215,8 +198,6 @@ This generates a commit message, based on the current changes.
 ### [Chat with AI](https://github.com/ggml-org/llama.vscode/wiki/Chat-with-AI)
-### [Chat with AI with project context](https://github.com/ggml-org/llama.vscode/wiki/Chat-with-AI-with-project-context)
@@ -284,14 +265,19 @@ Llama agent asks for permission before executing a terminal command. However, if the
 ### How to use it
 The best way to prepare the environment for the agent is by selecting an Env (group of models). So, below is the standard workflow:
 1. Select "Show Llama Agent" from the llama-vscode menu or Ctrl+Shift+A to show the Llama Agent.
-2. Click the "Select Env" button (visible if there is no selected env) and select an env which supports agents, for your needs. This will download the required models and start llama.cpp servers with them. For external servers (like OpenRouter) llama-vscode will ask for an API key if needed.
+2. Click the "Select Env" button and select an env which supports agents, for your needs. This will download the required models and start llama.cpp servers with them. For external servers (like OpenRouter) llama-vscode will ask for an API key if needed.
 3. Write your request and send it with Enter or the "Send" button.
+You could also use the agent with only a tools model selected. In this case the tool search_source will not work (it requires a chat and an embeddings server).

 Optional
 - You could add files to the context with the @ button or just by entering "@".
+- You could select a command (predefined prompt) by pressing "/". Commands can be added from the llama-vscode menu - Agent commands...
 - Activating the agent (Ctrl+Shift+A or from the llama-vscode menu) adds the open file to the agent context
 - You could select source code and activate the agent (Ctrl+Shift+A or from the llama-vscode menu) to attach the selected lines to the context
 - You could choose the tools to be used from the "Select Tools" button (on the right side of the "New Chat" button). If you have installed and started MCP servers in VS Code, their tools will be available for selection too. Don't forget to click the OK button after changing the tool selection.
+- View the selected tools model from the tooltip of the "Tools Model" button and select a new tools model by clicking it.
+- View the selected agent from the tooltip of the "Agent" button and select a new agent by clicking it.

 Click the button "Deselect Env" (visible if there is a selected env with an agent model) to deselect the env and selected models and stop the servers which were started by llama-vscode.
 Click the button "Selected Models" to show details about the selected models.
src/menu.ts (+1 −1)
@@ -126,7 +126,7 @@ export class Menu {
         },
         {
             label: (this.app.configuration.getUiText("Chat with AI about llama-vscode") ?? ""),
-            description: this.app.configuration.getUiText(`Opens a chat with AI window with llama-vscode help docu context inside VS Code using the selected chat model (or setting endpoint_chat)`)
+            description: this.app.configuration.getUiText(`Selects llama-vscode help agent and opens llama agent view for asking ai about llama-vscode`)
         },
         {
             label: this.app.configuration.getUiText('How to delete models'),
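The UI-text arrays below suggest that each string shown in the menu is stored as a list of parallel translations, with English first. A minimal sketch of such a lookup — a hypothetical illustration, not the extension's actual `getUiText` implementation:

```typescript
// Hypothetical sketch: UI texts as parallel translation arrays,
// English first, other languages at fixed indices (e.g. 1 = Bulgarian).
const uiTexts: string[][] = [
  [
    "Chat with AI about llama-vscode",          // en
    "Чат с AI относно llama-vscode",            // bg
    "Chatten Sie mit der KI über llama-vscode", // de
  ],
];

// Find the entry by its English key and return the translation at
// langIndex, falling back to English when that translation is missing.
function getUiText(english: string, langIndex: number): string | undefined {
  const entry = uiTexts.find((texts) => texts[0] === english);
  return entry ? entry[langIndex] ?? entry[0] : undefined;
}

console.log(getUiText("Chat with AI about llama-vscode", 1));
// prints the Bulgarian entry
```

Keying on the English string (as the `getUiText("...")` calls in menu.ts above do) means untranslated keys still render, since the English text doubles as the fallback.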
["Opens a chat with AI window inside VS Code using the selected chat model (or setting endpoint_chat)","Отваря прозорец за чат с AI във VS Code, използвайки избрания чат модел (или задаване на endpoint_chat)","Öffnet ein Chat-Fenster mit KI in VS Code unter Verwendung des ausgewählten Chat-Modells (oder der Einstellung endpoint_chat)","Открывает окно чата с ИИ в VS Code с использованием выбранной модели чата (или настройки endpoint_chat)","Abre una ventana de chat con IA en VS Code usando el modelo de chat seleccionado (o configurando endpoint_chat)","在 VS Code 中使用选定的聊天模型(或设置 endpoint_chat)打开 AI 聊天窗口","Ouvre une fenêtre de chat avec l'IA dans VS Code en utilisant le modèle de chat sélectionné (ou en définissant endpoint_chat)"],
["Opens a chat with AI window with project context inside VS Code using the selected chat model (or setting endpoint_chat)","Отваря прозорец за чат с AI с контекст на проекта във VS Code, използвайки избрания чат модел (или задаване на endpoint_chat)","Öffnet ein Chat-Fenster mit KI und Projektkontext in VS Code unter Verwendung des ausgewählten Chat-Modells (oder der Einstellung endpoint_chat)","Открывает окно чата с ИИ с контекстом проекта в VS Code с использованием выбранной модели чата (или настройки endpoint_chat)","Abre una ventana de chat con IA con contexto del proyecto en VS Code usando el modelo de chat seleccionado (o configurando endpoint_chat)","在 VS Code 中使用选定的聊天模型(或设置 endpoint_chat)打开包含项目上下文的 AI 聊天窗口","Ouvre une fenêtre de chat avec l'IA avec le contexte du projet dans VS Code en utilisant le modèle de chat sélectionné (ou en définissant endpoint_chat)"],
["Chat with AI about llama-vscode","Чат с AI относно llama-vscode","Chatten Sie mit der KI über llama-vscode","Чат с ИИ о llama-vscode","Chatea con IA sobre llama-vscode","与 AI 聊天关于 llama-vscode","Discuter avec l'IA à propos de llama-vscode"],
["Opens a chat with AI window with llama-vscode help docu context inside VS Code using the selected chat model (or setting endpoint_chat)","Отваря прозорец за чат с AI с контекст на помощния документ на llama-vscode във VS Code, използвайки избрания чат модел (или задаване на endpoint_chat)","Öffnet ein Chat-Fenster mit KI und Hilfe-Dokumentationskontext von llama-vscode in VS Code unter Verwendung des ausgewählten Chat-Modells (oder der Einstellung endpoint_chat)","Открывает окно чата с ИИ с контекстом документации помощи llama-vscode в VS Code с использованием выбранной модели чата (или настройки endpoint_chat)","Abre una ventana de chat con IA con contexto de documentación de ayuda de llama-vscode en VS Code usando el modelo de chat seleccionado (o configurando endpoint_chat)","在 VS Code 中使用选定的聊天模型(或设置 endpoint_chat)打开包含 llama-vscode 帮助文档上下文的 AI 聊天窗口","Ouvre une fenêtre de chat avec l'IA avec le contexte de la documentation d'aide de llama-vscode dans VS Code en utilisant le modèle de chat sélectionné (ou en définissant endpoint_chat)"],
["Selects llama-vscode help agent and opens llama agent view for asking ai about llama-vscode","Избира помощния агент на llama-vscode и отваря изгледа на агента на llama за задаване на въпроси към изкуствен интелект относно llama-vscode","Wählt den llama-vscode-Hilfe-Agenten aus und öffnet die Llama-Agenten-Ansicht, um KI zu llama-vscode zu befragen","Выбирает агент справки llama-vscode и открывает представление агента llama для запросов к ИИ о llama-vscode","Selecciona el agente de ayuda de llama-vscode y abre la vista del agente de llama para consultar a la IA sobre llama-vscode","选择 llama-vscode 帮助代理并打开 llama 代理视图以向 AI 询问有关 llama-vscode 的事宜","Sélectionne l'agent d'aide de llama-vscode et ouvre la vue de l'agent llama pour interroger l'IA sur llama-vscode"],
["Shows details about the selected env","Показва подробности за избраната среда","Zeigt Details zur ausgewählten Umgebung","Показывает подробности о выбранной среде","Muestra detalles sobre el entorno seleccionado","显示所选环境的详细信息","Affiche les détails de l'environnement sélectionné"],
["Select/start env...","Избор/стартиране на среда...","Umgebung auswählen/starten...","Выбрать/запустить среду...","Seleccionar/iniciar entorno...","选择/启动环境...","Sélectionner/démarrer l'environnement..."],