Commit 1a38de5

Documentation updated (#115)

- xAI Grok 4 free (from OpenRouter) added to the initial models
- Chat with AI with project context removed (the agent does it better)
- Chat with AI about llama-vscode now uses the agent instead of the webui
- Agent: new "Tools Model" and "Agent" buttons to view the selected model and agent and to change them

1 parent dff8c28 commit 1a38de5

File tree

3 files changed: +20 -34 lines changed

resources/help.md

Lines changed: 18 additions & 32 deletions
@@ -2,6 +2,7 @@
 
 ### What are agent commands
 Agent commands are a way to reuse frequently used prompts. They could be used to describe complex workflows or for simple instructions.
+Agent commands are stored in the setting agent_commands, and the llama-vscode menu item "Agent commands..." could be used to manage them.
 In future they could be extended with additional context, specifically prepared by llama-vscode for each command.
 
 ### How to use them
@@ -17,34 +18,17 @@ The agent will explain the selected code.
 ## Chat with AI about llama-vscode
 
 ### Required servers
-- Chat server
+- Tools server
 
 ### How to use it
-This is a conversation with the local AI about llama-vscode, something like help on how to use llama-vscode.
-- From the llama-vscode menu select "Chat with AI about llama-vscode" -> a window will be opened (the conversation history overlays the actual chat window, but just click on the chat window)
+This is a conversation with the llama-vscode help agent about llama-vscode, something like help on how to use llama-vscode.
+- From the llama-vscode menu select "Chat with AI about llama-vscode" -> the agent will be opened
 - Enter your question about llama-vscode
-The first time it could take longer to answer. The following questions will be answered faster as the input will be cached.
+The first time it could take longer to answer. The following questions will be answered faster as the help information will be cached.
 
 
 ## Chat with AI with project context
-
-### Required servers
-- Chat server
-- Embeddings server
-
-### How to use it
-This is a conversation with the local AI. It uses the project information and is therefore slower than Chat with AI, but could answer questions related to the project.
-- Press Ctrl+Shift+; inside an editor (or select Chat with AI with project context from the llama.vscode menu)
-- Enter your question
-- llama-vscode collects relevant context information from the project and sends it to the AI together with your question
-- Project context information is sent to the AI only if the question is entered with Ctrl+Shift+;. If the question is written directly in the chat window, no new context information is sent to the AI.
-- If the AI answers too slowly, close the VS Code chat window and open a new one with Ctrl+Shift+;
-- Press Esc if you want to return from the chat to the editor
-
-It is possible to configure the rag_* settings to adjust the RAG search according to models and hardware resources
-
-
-![Chat with AI with project context](https://github.com/user-attachments/assets/d5753717-1d85-4e4e-a093-53b0ed5f51dc)
+This is removed. Chat with AI with project context is equal to using the agent with the tool search_source. The agent has many other tools and is therefore a better choice.
 
 
 ## Chat with AI
@@ -56,7 +40,6 @@ It is possible to configure rag_* settings to adjust the rag search according to
 This is a conversation with the local AI. Mainly for asking questions for reference instead of searching with Google. It doesn't use the project information and is therefore fast.
 - Press Ctrl+; inside an editor (or select Chat with AI from the llama.vscode menu) - A chat window will open inside VS Code
 - Enter your message and start the chat
-- Press Esc if you want to return from the chat to the editor
 
 ![Chat with AI](https://github.com/user-attachments/assets/e068f5cc-fce3-4366-9b8f-1c89e952b411)
 
@@ -184,12 +167,12 @@ This generates a commit message, based on the current changes.
 
 ![Generate a commit message](https://github.com/user-attachments/assets/25f5d1ae-3673-4416-ba52-7615969c1bb3)
 
-## Version 0.0.19 is released (18.08.2025)
+## Version 0.0.27 is released (21.09.2025)
 ## What is new
-* llama.cpp already supports gpt-oss with tools! If you have 20+ GB of VRAM available, you could select the env "Local, full package - min, gpt-oss 20B" and use all AI features, including the Llama Agent, with local models only.
-* The Llama Agent UI now shows tables correctly
-* Auto select/start the last used env if desired - setting Env_start_last_used
-* Agent (system prompt + default tools) selection is now possible ("Agents..." -> "Select/start agent" from the llama-vscode menu). All agent details are stored in the setting agents_list. Export/import of agents is also possible, i.e. they could be shared.
+- xAI Grok 4 free (from OpenRouter) added to the initial models
+- Chat with AI with project context removed (the agent does it better)
+- Chat with AI about llama-vscode now uses the agent instead of the webui
+- Agent - new buttons "Tools Model" and "Agent" - possibility to view the selected model and agent and to change them.
 
 ## Setup instructions for llama.cpp server
 
@@ -215,8 +198,6 @@ This generates a commit message, based on the current changes.
 
 ### [Chat with AI](https://github.com/ggml-org/llama.vscode/wiki/Chat-with-AI)
 
-### [Chat with AI with project context](https://github.com/ggml-org/llama.vscode/wiki/Chat-with-AI-with-project-context)
-
 ### [Generate commit message](https://github.com/ggml-org/llama.vscode/wiki/Generate-commit-message)
 
 
@@ -284,14 +265,19 @@ Llama agent asks for permission for executing terminal command. However, if the
 ### How to use it
 The best way to prepare the environment for the agent is by selecting an Env (group of models). So, below is the standard workflow:
 1. Select "Show Llama Agent" from the llama-vscode menu or press Ctrl+Shift+A to show the Llama Agent.
-2. Click the "Select Env" button (visible if there is no selected env) and select an env which supports the agent for your needs. This will download the required models and start llama.cpp servers with them. For external servers (like OpenRouter) llama-vscode will ask for an API key if needed.
+2. Click the "Select Env" button and select an env which supports the agent for your needs. This will download the required models and start llama.cpp servers with them. For external servers (like OpenRouter) llama-vscode will ask for an API key if needed.
 3. Write your request and send it with Enter or the "Send" button.
 
+You could also use the agent with only a tools model selected. In this case the tool search_source will not work (it requires a chat and an embeddings server).
+
 Optional
 - You could add files to the context with the @ button or just by entering "@".
+- You could select a command (predefined prompt) by pressing "/". The commands could be added from the llama-vscode menu - Agent commands...
 - Activating the agent (Ctrl+Shift+A or from the llama-vscode menu) adds the open file to the agent context
 - You could select source code and activate the agent (Ctrl+Shift+A or from the llama-vscode menu) to attach the selected lines to the context
-- You could choose the tools to be used from the "Select Tools" button (on the right side of the "New Chat" button). If you have installed and started MCP Servers in VS Code, their tools will be available for selection too. Don't forget to click the OK button after changing the tool selection.
+- You could choose the tools to be used from the "Select Tools" button (on the right side of the "New Chat" button). If you have installed and started MCP Servers in VS Code, their tools will be available for selection too. Don't forget to click the OK button after changing the tool selection.
+- View the selected tools model in the tooltip of the "Tools Model" button and select a new tools model by clicking it.
+- View the selected agent in the tooltip of the "Agent" button and select a new agent by clicking it.
 
 Click the "Deselect Env" button (visible if there is a selected env with an agent model) to deselect the env and selected models and stop the servers which were started by llama-vscode.
 Click the "Selected Models" button to show details about the selected models.
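The help text above describes agent commands as reusable prompts selected by pressing "/". As a purely hypothetical illustration (the interface, names, and expansion logic below are assumptions, not llama-vscode's actual implementation or the real agent_commands schema), such a command expansion could be sketched as:

```typescript
// Hypothetical shape of one stored agent command (schema assumed).
interface AgentCommand {
  name: string;   // what the user types after "/"
  prompt: string; // the reusable prompt it expands to
}

// Example commands; the real entries live in the agent_commands setting.
const agentCommands: AgentCommand[] = [
  { name: "explain", prompt: "Explain the selected code step by step." },
  { name: "review",  prompt: "Review this code for bugs and style issues." },
];

// Expand "/name rest of message" into the stored prompt plus the rest;
// input without a known "/name" prefix is passed through unchanged.
function expandCommand(input: string, commands: AgentCommand[]): string {
  const match = input.match(/^\/(\S+)\s*(.*)$/s);
  if (!match) return input;
  const cmd = commands.find(c => c.name === match[1]);
  return cmd ? `${cmd.prompt}\n${match[2]}`.trim() : input;
}
```

This is only meant to make the "predefined prompt" idea concrete; the extension may resolve commands quite differently.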

src/menu.ts

Lines changed: 1 addition & 1 deletion
@@ -126,7 +126,7 @@ export class Menu {
 },
 {
 label: (this.app.configuration.getUiText("Chat with AI about llama-vscode") ?? ""),
-description: this.app.configuration.getUiText(`Opens a chat with AI window with llama-vscode help docu context inside VS Code using the selected chat model (or setting endpoint_chat)`)
+description: this.app.configuration.getUiText(`Selects llama-vscode help agent and opens llama agent view for asking ai about llama-vscode`)
 },
 {
 label: this.app.configuration.getUiText('How to delete models'),

src/translations.ts

Lines changed: 1 addition & 1 deletion
@@ -136,7 +136,7 @@ export const translations: string[][] = [
 ["Opens a chat with AI window inside VS Code using the selected chat model (or setting endpoint_chat)", "Отваря прозорец за чат с AI във VS Code, използвайки избрания чат модел (или задаване на endpoint_chat)", "Öffnet ein Chat-Fenster mit KI in VS Code unter Verwendung des ausgewählten Chat-Modells (oder der Einstellung endpoint_chat)", "Открывает окно чата с ИИ в VS Code с использованием выбранной модели чата (или настройки endpoint_chat)", "Abre una ventana de chat con IA en VS Code usando el modelo de chat seleccionado (o configurando endpoint_chat)", "在 VS Code 中使用选定的聊天模型(或设置 endpoint_chat)打开 AI 聊天窗口", "Ouvre une fenêtre de chat avec l'IA dans VS Code en utilisant le modèle de chat sélectionné (ou en définissant endpoint_chat)"],
 ["Opens a chat with AI window with project context inside VS Code using the selected chat model (or setting endpoint_chat)", "Отваря прозорец за чат с AI с контекст на проекта във VS Code, използвайки избрания чат модел (или задаване на endpoint_chat)", "Öffnet ein Chat-Fenster mit KI und Projektkontext in VS Code unter Verwendung des ausgewählten Chat-Modells (oder der Einstellung endpoint_chat)", "Открывает окно чата с ИИ с контекстом проекта в VS Code с использованием выбранной модели чата (или настройки endpoint_chat)", "Abre una ventana de chat con IA con contexto del proyecto en VS Code usando el modelo de chat seleccionado (o configurando endpoint_chat)", "在 VS Code 中使用选定的聊天模型(或设置 endpoint_chat)打开包含项目上下文的 AI 聊天窗口", "Ouvre une fenêtre de chat avec l'IA avec le contexte du projet dans VS Code en utilisant le modèle de chat sélectionné (ou en définissant endpoint_chat)"],
 ["Chat with AI about llama-vscode", "Чат с AI относно llama-vscode", "Chatten Sie mit der KI über llama-vscode", "Чат с ИИ о llama-vscode", "Chatea con IA sobre llama-vscode", "与 AI 聊天关于 llama-vscode", "Discuter avec l'IA à propos de llama-vscode"],
-["Opens a chat with AI window with llama-vscode help docu context inside VS Code using the selected chat model (or setting endpoint_chat)", "Отваря прозорец за чат с AI с контекст на помощния документ на llama-vscode във VS Code, използвайки избрания чат модел (или задаване на endpoint_chat)", "Öffnet ein Chat-Fenster mit KI und Hilfe-Dokumentationskontext von llama-vscode in VS Code unter Verwendung des ausgewählten Chat-Modells (oder der Einstellung endpoint_chat)", "Открывает окно чата с ИИ с контекстом документации помощи llama-vscode в VS Code с использованием выбранной модели чата (или настройки endpoint_chat)", "Abre una ventana de chat con IA con contexto de documentación de ayuda de llama-vscode en VS Code usando el modelo de chat seleccionado (o configurando endpoint_chat)", "在 VS Code 中使用选定的聊天模型(或设置 endpoint_chat)打开包含 llama-vscode 帮助文档上下文的 AI 聊天窗口", "Ouvre une fenêtre de chat avec l'IA avec le contexte de la documentation d'aide de llama-vscode dans VS Code en utilisant le modèle de chat sélectionné (ou en définissant endpoint_chat)"],
+["Selects llama-vscode help agent and opens llama agent view for asking ai about llama-vscode", "Избира помощния агент на llama-vscode и отваря изгледа на агента на llama за задаване на въпроси към изкуствен интелект относно llama-vscode", "Wählt den llama-vscode-Hilfe-Agenten aus und öffnet die Llama-Agenten-Ansicht, um KI zu llama-vscode zu befragen", "Выбирает агент справки llama-vscode и открывает представление агента llama для запросов к ИИ о llama-vscode", "Selecciona el agente de ayuda de llama-vscode y abre la vista del agente de llama para consultar a la IA sobre llama-vscode", "选择 llama-vscode 帮助代理并打开 llama 代理视图以向 AI 询问有关 llama-vscode 的事宜", "Sélectionne l'agent d'aide de llama-vscode et ouvre la vue de l'agent llama pour interroger l'IA sur llama-vscode"],
 ["Show selected env", "Показва избраната среда", "Ausgewählte Umgebung anzeigen", "Показать выбранную среду", "Mostrar entorno seleccionado", "显示所选环境", "Afficher l'environnement sélectionné"],
 ["Shows details about the selected env", "Показва подробности за избраната среда", "Zeigt Details zur ausgewählten Umgebung", "Показывает подробности о выбранной среде", "Muestra detalles sobre el entorno seleccionado", "显示所选环境的详细信息", "Affiche les détails de l'environnement sélectionné"],
 ["Select/start env...", "Избор/стартиране на среда...", "Umgebung auswählen/starten...", "Выбрать/запустить среду...", "Seleccionar/iniciar entorno...", "选择/启动环境...", "Sélectionner/démarrer l'environnement..."],
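The translations table above maps one English UI string per row to its translations in a fixed language order. As a minimal sketch of how a getUiText-style lookup could work over such a table (the function name, signature, and language indices here are assumptions, not the extension's actual code):

```typescript
// Each row holds one UI string in several languages; the column order
// [en, bg, de, ru, es, zh, fr] is inferred from the table above.
type TranslationRow = string[];

const translations: TranslationRow[] = [
  ["Chat with AI about llama-vscode",
   "Чат с AI относно llama-vscode",
   "Chatten Sie mit der KI über llama-vscode"],
];

// Hypothetical lookup: find the row keyed by the English string and
// return the entry for the requested language, falling back to the
// English key when the row or the translation is missing.
function getUiText(english: string, langIndex: number): string {
  const row = translations.find(r => r[0] === english);
  return row?.[langIndex] ?? english;
}
```

A design like this explains why the diff changes the same string literal in both src/menu.ts and src/translations.ts: the English text is the lookup key, so the two must stay in sync.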
