From Seafile 13, users can enable ***Seafile AI*** to support the following features:

- File tags, file and image summaries, text translation, sdoc writing assistance
- Given an image, generate its corresponding tags (including objects, weather, color, etc.)
- Detect faces in images and encode them

!!! note "Prerequisites of Seafile AI deployment"
    To deploy Seafile AI, you must first deploy the [Metadata management](./metadata-server.md) extension. Then you can follow this manual to deploy Seafile AI.
2. Modify `.env`, insert or modify the following fields:

    === "OpenAI"

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml

        ENABLE_SEAFILE_AI=true
        SEAFILE_AI_LLM_TYPE=openai
        SEAFILE_AI_LLM_KEY=<your OpenAI API key>
        SEAFILE_AI_LLM_MODEL=gpt-4o-mini # recommended
        ```
    === "DeepSeek"

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml

        ENABLE_SEAFILE_AI=true
        SEAFILE_AI_LLM_TYPE=deepseek
        SEAFILE_AI_LLM_KEY=<your LLM access key>
        SEAFILE_AI_LLM_MODEL=deepseek-chat # recommended
        ```
    === "Azure OpenAI"

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml

        ENABLE_SEAFILE_AI=true
        SEAFILE_AI_LLM_TYPE=azure
        SEAFILE_AI_LLM_URL= # your deployment URL; leave blank to use the default endpoint
        SEAFILE_AI_LLM_KEY=<your API key>
        SEAFILE_AI_LLM_MODEL=<your_deployment_name>
        ```
    === "Ollama"

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml

        ENABLE_SEAFILE_AI=true
        SEAFILE_AI_LLM_TYPE=ollama
        SEAFILE_AI_LLM_URL=<your LLM endpoint>
        SEAFILE_AI_LLM_KEY=<your LLM access key>
        SEAFILE_AI_LLM_MODEL=<your model-id>
        ```
    === "HuggingFace"

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml

        ENABLE_SEAFILE_AI=true
        SEAFILE_AI_LLM_TYPE=huggingface
        SEAFILE_AI_LLM_URL=<LLM provider>/<your Hugging Face API endpoint>
        SEAFILE_AI_LLM_KEY=<your Hugging Face API key>
        SEAFILE_AI_LLM_MODEL=<model-id>
        ```
    === "Self-proxy Server"

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml

        ENABLE_SEAFILE_AI=true
        SEAFILE_AI_LLM_TYPE=proxy
        SEAFILE_AI_LLM_URL=<your proxy url>
        SEAFILE_AI_LLM_KEY=<your proxy virtual key>
        SEAFILE_AI_LLM_MODEL=<model-id>
        ```
    === "Other"

        Seafile AI uses [LiteLLM](https://docs.litellm.ai/docs/) to interact with LLM services. For a complete list of supported LLM providers, refer to [this documentation](https://docs.litellm.ai/docs/providers). Then fill in the following fields in your `.env`:

        ```
        COMPOSE_FILE='...,seafile-ai.yml' # add seafile-ai.yml
        ENABLE_SEAFILE_AI=true

        # according to your situation
        SEAFILE_AI_LLM_TYPE=...
        SEAFILE_AI_LLM_URL=...
        SEAFILE_AI_LLM_KEY=...
        SEAFILE_AI_LLM_MODEL=...
        ```

        For example, if you are using an LLM service with ***OpenAI-compatible endpoints***, set `SEAFILE_AI_LLM_TYPE` to `other` or `openai` and fill in the other LLM configuration items accordingly.
    !!! note "About model selection"

        Seafile AI supports any model provider available through [LiteLLM](https://docs.litellm.ai/docs/providers), as well as model services with OpenAI-compatible endpoints, so most custom large model services can be used in place of the default model (*gpt-4o-mini*). However, to keep all Seafile AI features working, you need to select a **multimodal large model** (for example, one that supports image input and recognition).
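    Before restarting, you can optionally sanity-check that the variables used above are actually present in `.env`. This is only a convenience sketch; adjust `ENV_FILE` to the location of your real `.env` file:

    ```shell
    # Optional sketch: confirm the Seafile AI variables are set before restarting.
    # ENV_FILE is an assumption; point it at your actual .env.
    ENV_FILE=.env
    for var in ENABLE_SEAFILE_AI SEAFILE_AI_LLM_TYPE SEAFILE_AI_LLM_KEY SEAFILE_AI_LLM_MODEL; do
        if grep -q "^${var}=" "$ENV_FILE" 2>/dev/null; then
            echo "ok: $var"
        else
            echo "missing: $var"
        fi
    done
    ```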

3. Restart the Seafile server:
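    For a Docker-based deployment this is typically done with docker compose (the compose working directory and service layout depend on your installation):

    ```
    docker compose down
    docker compose up -d
    ```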
| `REDIS_HOST` | Redis server host |
| `REDIS_PORT` | Redis server port |
| `REDIS_PASSWORD` | Redis server password |
| `SEAFILE_AI_LLM_TYPE` | Large Language Model (LLM) type. Default is `openai`. |
| `SEAFILE_AI_LLM_URL` | LLM API endpoint. Empty by default; leave blank to use the provider's default endpoint. |
| `SEAFILE_AI_LLM_KEY` | LLM API key |
| `FACE_EMBEDDING_SERVICE_URL` | Face embedding service URL |
| `SEAFILE_AI_LLM_MODEL` | LLM model id (or name). Default is ***gpt-4o-mini*** |
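As an illustration, a filled-in environment covering the fields above might look like the following (all values are placeholders; substitute your real hosts and keys):

```
REDIS_HOST=redis
REDIS_PORT=6379
REDIS_PASSWORD=<your redis password>
SEAFILE_AI_LLM_TYPE=openai
SEAFILE_AI_LLM_KEY=<your LLM API key>
SEAFILE_AI_LLM_MODEL=gpt-4o-mini
FACE_EMBEDDING_SERVICE_URL=<face embedding service url>
```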

Then start your Seafile AI server:
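With a docker compose based setup, starting the service typically looks like this (assuming your compose configuration includes the Seafile AI service):

```
docker compose up -d
```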