`docs/guide/guide.md` (3 additions & 3 deletions)
```diff
@@ -10,14 +10,14 @@ The user guide is organized in sections to help you get started with llm-d and t
 llm-d is an open source project providing distributed inferencing for GenAI runtimes on any Kubernetes cluster. Its highly performant, scalable architecture helps reduce costs through a spectrum of hardware efficiency improvements. The project prioritizes ease of deployment+use as well as SRE needs + day 2 operations associated with running large GPU clusters.
 
-[For more information check out the Architecture Documentation](./architecture/00_architecture.mdx)
+[For more information check out the Architecture Documentation](./architecture/00_architecture)
 
 ## Installation: Start here to minimize your frustration
 
 This guide will walk you through the steps to install and deploy the llm-d quickstart demo on a Kubernetes cluster.
 
--[Prerequisites](./guide/Installation/prerequisites.md) Make sure your compute resources and system configuration are ready
--[Quick Start](./guide/Installation/quickstart.md) If your resources are ready, "kick the tires" with our Quick Start!
+-[Prerequisites](./guide/Installation/prerequisites) Make sure your compute resources and system configuration are ready
+-[Quick Start](./guide/Installation/quickstart) If your resources are ready, "kick the tires" with our Quick Start!
```
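The prerequisites step the guide links to amounts to confirming cluster access and GPU capacity before running the quickstart. A minimal sketch of such a check — assuming `kubectl` is available and GPUs are exposed under the `nvidia.com/gpu` resource name, neither of which this diff specifies — might look like:

```shell
# Hypothetical prerequisite check, not part of the llm-d docs themselves.
# Assumes kubectl is installed/configured and GPUs are advertised as
# nvidia.com/gpu -- adjust the resource name for your accelerator vendor.
if command -v kubectl >/dev/null 2>&1; then
  # Confirm the cluster is reachable with the current kubeconfig
  kubectl cluster-info
  # List each node alongside the number of GPUs it advertises
  kubectl get nodes -o custom-columns='NODE:.metadata.name,GPUS:.status.capacity.nvidia\.com/gpu'
else
  echo "kubectl not found: install it before running the quickstart"
fi
PREREQ_CHECK_DONE=1
```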