docs/GEMMA3.md (6 changes: 3 additions & 3 deletions)

@@ -12,7 +12,7 @@ The Python and HTTP APIs support sending images as:
 The Rust API takes an image from the [image](https://docs.rs/image/latest/image/index.html) crate.

 ## HTTP server
-You can find this example [here](../examples/server/gemma3.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -97,7 +97,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/gemma3/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 This is a minimal example of running the Gemma 3 model with a dummy image.

@@ -142,7 +142,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/gemma3.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
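For context, the consolidated `vision_chat.py` server example referenced above covers the pattern these docs describe: an OpenAI-compatible chat completion request carrying an image. Below is a minimal sketch of such a request, not the repository's exact example; the base URL, API key, model id, and image URL are illustrative assumptions.

```python
# Sketch of an OpenAI-compatible vision chat request against a mistral.rs server.
# The endpoint, API key, model id, and image URL are assumptions, not the repo's exact example.
from openai import OpenAI

client = OpenAI(api_key="EMPTY", base_url="http://localhost:1234/v1/")

completion = client.chat.completions.create(
    model="gemma3",  # assumed model id for illustration
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/flower.jpg"}},
                {"type": "text", "text": "What is shown in this image?"},
            ],
        }
    ],
    max_tokens=256,
)
print(completion.choices[0].message.content)
```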
docs/IDEFICS2.md (2 changes: 1 addition & 1 deletion)

@@ -137,7 +137,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/phi3v.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
docs/LLAMA4.md (6 changes: 3 additions & 3 deletions)

@@ -24,7 +24,7 @@ The Python and HTTP APIs support sending images as:
 The Rust API takes an image from the [image](https://docs.rs/image/latest/image/index.html) crate.

 ## HTTP server
-You can find this example [here](../examples/server/llama4.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -116,7 +116,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/llama4/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 This is a minimal example of running the Llama 4 model with a dummy image.

@@ -162,7 +162,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/llama4.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
docs/LLaVA.md (6 changes: 3 additions & 3 deletions)

@@ -24,7 +24,7 @@ The Rust API takes an image from the [image](https://docs.rs/image/latest/image/
 > It should be added to messages manually, and is of the format `<image>`.

 ## HTTP server
-You can find this example [here](../examples/server/llava_next.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -101,7 +101,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/llava_next/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 This is a minimal example of running the LLaVA and LLaVANext model with a dummy image.

@@ -146,7 +146,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/llava_next.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
docs/MISTRAL3.md (6 changes: 3 additions & 3 deletions)

@@ -22,7 +22,7 @@ tool calling with Mistral Small 3.1, and you can use it by specifying the `jinja

 ## HTTP server
-You can find this example [here](../examples/server/mistral3.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -107,7 +107,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/mistral3/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 This is a minimal example of running the Mistral 3 model with a dummy image.

@@ -152,7 +152,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/mistral3.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
docs/PHI3V.md (6 changes: 3 additions & 3 deletions)

@@ -19,7 +19,7 @@ The Rust API takes an image from the [image](https://docs.rs/image/latest/image/
 > They should be added to messages manually, and are of the format `<|image_{N}|>` where N starts from 1.

 ## HTTP server
-You can find this example [here](../examples/server/phi3v.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -96,7 +96,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/phi3v/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 This is a minimal example of running the Phi 3 Vision model with a dummy image.

@@ -140,7 +140,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/phi3v.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
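The Phi 3 Vision context line above notes that `<|image_{N}|>` placeholders must be added to the message text manually. Below is a minimal sketch of how such a message could be built for the OpenAI-compatible API; the endpoint, model id, image URL, and the exact pairing of the text token with an `image_url` part are illustrative assumptions, not the repository's exact example.

```python
# Sketch: manually embedding Phi 3 Vision's <|image_1|> placeholder in the text part.
# Endpoint, model id, and image URL are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(api_key="EMPTY", base_url="http://localhost:1234/v1/")

resp = client.chat.completions.create(
    model="phi3v",  # assumed model id
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}},
                # The first image is referenced as <|image_1|>, the second as <|image_2|>, and so on.
                {"type": "text", "text": "<|image_1|>\nDescribe what you see in this image."},
            ],
        }
    ],
)
print(resp.choices[0].message.content)
```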
docs/PHI4MM.md (6 changes: 3 additions & 3 deletions)

@@ -19,7 +19,7 @@ The Rust API takes an image from the [image](https://docs.rs/image/latest/image/
 > They should be added to messages manually, and are of the format `<|image_{N}|>` where N starts from 1.

 ## HTTP server
-You can find this example [here](../examples/server/phi3v.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -94,7 +94,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/phi3v/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 This is a minimal example of running the Phi 4 Multimodal model with a dummy image.

@@ -139,7 +139,7 @@ async fn main() -> Result<()> {
 ```

 ## Python
-You can find this example [here](../examples/python/phi3v.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
docs/QWEN2VL.md (6 changes: 3 additions & 3 deletions)

@@ -58,7 +58,7 @@ camellias are also known for their resilience and ability to thrive in a variety
 ```

 ## HTTP server
-You can find this example [here](../examples/server/qwen2vl.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -137,7 +137,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/qwen2vl/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 ```rust
 use anyhow::Result;

@@ -184,7 +184,7 @@ async fn main() -> Result<()> {
 ---

 ## Python
-You can find this example [here](../examples/python/qwen2vl.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
docs/VISION_MODELS.md (2 changes: 1 addition & 1 deletion)

@@ -13,4 +13,4 @@ Please see docs for the following model types:
 - Phi 4 Multimodal: [PHI4MM.md](PHI4MM.md)

 > Note for the Python and HTTP APIs:
-> We follow the OpenAI specification for structuring the image messages and allow both base64 encoded images as well as a URL/path to the image. There are many examples of this, see [this Python example](../examples/python/phi3v.py).
+> We follow the OpenAI specification for structuring the image messages and allow both base64 encoded images as well as a URL/path to the image. There are many examples of this, see [this Python example](../examples/python/vision_chat.py).
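As the note above says, an image can be supplied either as a URL/path or as a base64-encoded payload. Below is a minimal sketch of the base64 variant using a data URL in the OpenAI message format; the file path, endpoint, and model id are illustrative assumptions rather than the repository's exact example.

```python
# Sketch: sending a local image as a base64 data URL, per the OpenAI message format.
# File path, endpoint, and model id are assumptions for illustration.
import base64
from openai import OpenAI

client = OpenAI(api_key="EMPTY", base_url="http://localhost:1234/v1/")

with open("photo.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

resp = client.chat.completions.create(
    model="default",  # assumed model id
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{encoded}"}},
                {"type": "text", "text": "What is in this image?"},
            ],
        }
    ],
)
print(resp.choices[0].message.content)
```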
docs/VLLAMA.md (6 changes: 3 additions & 3 deletions)

@@ -65,7 +65,7 @@ The image appears to be of Mount Washington, which is the highest peak in the No
 ```

 ## HTTP server
-You can find this example [here](../examples/server/llama_vision.py).
+You can find this example [here](../examples/server/vision_chat.py).

 We support an OpenAI compatible HTTP API for vision models. This example demonstrates sending a chat completion request with an image.

@@ -152,7 +152,7 @@ print(resp)
 ---

 ## Rust
-You can find this example [here](../mistralrs/examples/llama_vision/main.rs).
+You can find this example [here](../mistralrs/examples/vision_chat/main.rs).

 ```rust
 use anyhow::Result;

@@ -198,7 +198,7 @@ async fn main() -> Result<()> {
 ---

 ## Python
-You can find this example [here](../examples/python/llama_vision.py).
+You can find this example [here](../examples/python/vision_chat.py).

 This example demonstrates loading and sending a chat completion request with an image.
examples/python/deepseekr1.py (23 changes: 0 additions & 23 deletions)
This file was deleted.

examples/python/deepseekv2.py (23 changes: 0 additions & 23 deletions)
This file was deleted.

examples/python/gemma3.py (37 changes: 0 additions & 37 deletions)
This file was deleted.

examples/python/llama4.py (38 changes: 0 additions & 38 deletions)
This file was deleted.

examples/python/llama_vision.py (40 changes: 0 additions & 40 deletions)
This file was deleted.