
Conversation

@bconsolvo

Overhaul of index.rst and inst.rst for ease of use for developers

  • Keeping release notes with only actual release notes, not hardware configurations. Moved hardware configuration and model support to index (starting page), and adjusted other links elsewhere to point to this place.
  • Added HW support table to provide much more clarity on codenames and supported OS for 1.6.0 release
  • Added links for installation into table of prerequisites
  • Added code for setting environment variables in PowerShell (not obvious from documentation)
  • Added 5-step development workflow overview cycle to clarify a typical development cycle
  • Other small syntactical and structural changes

@dwithchenna dwithchenna added the documentation Improvements or additions to documentation label Oct 15, 2025
Contributor

Copilot AI left a comment


Pull Request Overview

This PR overhauls the documentation structure for index.rst and inst.rst to improve ease of use for developers. Key changes include reorganizing hardware configuration information, adding clearer installation instructions, and providing a structured development workflow overview.

  • Moved hardware configuration and model support information from release notes to the main index page
  • Enhanced installation instructions with specific links and PowerShell environment variable setup
  • Added a 5-step development workflow overview for typical Ryzen AI usage
  • Updated example titles and restructured documentation organization

Reviewed Changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 4 comments.

File summary:

- ``docs/relnotes_backup.rst``: created as a backup containing the original release notes content with hardware configurations
- ``docs/relnotes.rst``: removed hardware configuration sections to focus on actual release notes
- ``docs/inst.rst``: enhanced with a detailed prerequisites table, PowerShell setup code, and clearer installation steps
- ``docs/index.rst``: added comprehensive hardware support tables and a development workflow overview
- ``docs/getstartex.rst``: updated the tutorial title and added more detailed explanations
- ``docs/examples.rst``: simplified structure and updated example titles
- ``docs/conf.py``: added the ``sphinx_copybutton`` extension for code block copying


- Specify the name for the conda environment (default: ``ryzen-ai-1.6.0``)

The Ryzen AI Software packages are now installed in the conda environment created by the installer.
The Ryzen AI Software packages should now installed in the conda environment created by the installer.

Copilot AI Oct 15, 2025


Missing word 'be' in sentence. Should read 'should now be installed'.

Suggested change
The Ryzen AI Software packages should now installed in the conda environment created by the installer.
The Ryzen AI Software packages should now be installed in the conda environment created by the installer.


- Download and Install the NPU driver version: 32.0.203.280 or newer using the following links:
- Under "Task Manager" in Windows, go to Performance -> NPU0 to check the driver version.
- If needed, download the NPU driver version: 32.0.203.280 or the latest 32.0.203.304 here:

Copilot AI Oct 15, 2025


The instruction mentions 'here:' but the actual download links are on the following lines. Consider rephrasing to 'download the NPU driver from one of the following links:' for better clarity.

Suggested change
- If needed, download the NPU driver version: 32.0.203.280 or the latest 32.0.203.304 here:
- If needed, download the NPU driver version: 32.0.203.280 or the latest 32.0.203.304 from one of the following links:

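A side note on the driver versions quoted above: dotted versions such as 32.0.203.280 and 32.0.203.304 compare componentwise, so a plain string comparison can mislead. A minimal Python sketch (the helper name is illustrative, not from the docs):

```python
def parse_driver_version(version: str) -> tuple[int, ...]:
    """Split a dotted driver version like '32.0.203.280' into integers
    so comparisons are numeric per component, not lexicographic."""
    return tuple(int(part) for part in version.split("."))

# The "latest" driver mentioned above is newer than the minimum:
minimum = parse_driver_version("32.0.203.280")
latest = parse_driver_version("32.0.203.304")
assert latest > minimum
```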
Comment on lines +83 to 84
*The CIFAR-10 dataset consists of 60,000 32x32 colour images in 10 classes, with 6,000 images per class. There are 50,000 training images and 10,000 test images.* You can learn more about the CIFAR-10 dataset here: https://www.cs.toronto.edu/~kriz/cifar.html. This dataset is used in the subsequent steps for quantization and inference. The script also exports the provided PyTorch model into ONNX format. The following snippet from the script shows how the ONNX model is exported:


Copilot AI Oct 15, 2025


[nitpick] The CIFAR-10 dataset description is formatted with asterisks instead of proper reStructuredText formatting. Consider using proper emphasis markup or a note directive for better presentation.

Suggested change
*The CIFAR-10 dataset consists of 60,000 32x32 colour images in 10 classes, with 6,000 images per class. There are 50,000 training images and 10,000 test images.* You can learn more about the CIFAR-10 dataset here: https://www.cs.toronto.edu/~kriz/cifar.html. This dataset is used in the subsequent steps for quantization and inference. The script also exports the provided PyTorch model into ONNX format. The following snippet from the script shows how the ONNX model is exported:
.. note::
   The CIFAR-10 dataset consists of 60,000 32x32 colour images in 10 classes, with 6,000 images per class. There are 50,000 training images and 10,000 test images.
   You can learn more about the CIFAR-10 dataset here: https://www.cs.toronto.edu/~kriz/cifar.html.

This dataset is used in the subsequent steps for quantization and inference. The script also exports the provided PyTorch model into ONNX format. The following snippet from the script shows how the ONNX model is exported:

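The CIFAR-10 figures quoted in the thread above are internally consistent, which is quick to confirm:

```python
# Figures from the CIFAR-10 description: 10 classes, 6,000 images each,
# split into 50,000 training and 10,000 test images.
num_classes = 10
images_per_class = 6_000
train_images, test_images = 50_000, 10_000

total = num_classes * images_per_class
assert total == 60_000
assert train_images + test_images == total
```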
The C++ source files, CMake list files, and related artifacts are provided in the ``cpp/resnet_cifar/*`` folder. The source file ``cpp/resnet_cifar/resnet_cifar.cpp`` takes 10 images from the CIFAR-10 test set, converts them to .png format, preprocesses them, and performs model inference. The example has onnxruntime dependencies that are provided in ``%RYZEN_AI_INSTALLATION_PATH%/onnxruntime/*``.

Run the following command to build the resnet example. Assign ``-DOpenCV_DIR`` to the OpenCV build directory.


Copilot AI Oct 15, 2025


Path uses backslashes which are Windows-specific. Consider using forward slashes for cross-platform compatibility or noting this is Windows-specific.

Suggested change
.. note::
   The following command uses Windows-style backslashes and is intended for use in a Windows environment.

NPU
~~~

- :doc:`Getting Started Tutorial for INT8 models <getstartex>` - Uses a custom ResNet model to demonstrate:
Collaborator


Why remove "Getting Started" from the description?

- 2025
- ☑️
-
* - Ryzen Z2
Collaborator


I don't believe this device is supported.

Ryzen AI 1.6 Software runs on AMD processors outlined below. For a more detailed list of supported devices, refer to the `processor specifications <https://www.amd.com/en/products/specifications/processors.html>`_ page (scroll to the "AMD Ryzen™ AI" column toward the right side of the table, and select "Available" from the pull-down menu). Support for Linux is coming soon in Ryzen AI 1.6.1.

.. list-table:: Supported Ryzen AI Processor Configurations
Collaborator


This table will only grow and will be hard to maintain as we add support for more platforms. It's redundant with the https://www.amd.com/en/products/specifications/processors.html page. And we risk creating inconsistencies. Case in point, see the comment about Z2 below.

I recommend we simply link to the official processor specification page.

- GPU
- NPU
- Hybrid (NPU + iGPU)
* - Ryzen AI 300
Collaborator


We support LLMs on all STX and KRK platforms, not all Ryzen AI 300. The first column in this table is not needed and should be removed.

.. list-table::
:header-rows: 1

* - Model Type
Collaborator


Why mention CPU and GPU for LLMs and not for other models? BF16 models can run on CPU and GPU on PHX/HPT.
The way LLMs and CNN/NLPs are presented is inconsistent. It would be preferable to find a common way of presenting the information.


The Ryzen AI development flow does not require any modifications to the existing model training processes and methods. The pre-trained model can be used as the starting point of the Ryzen AI flow.
A typical Ryzen AI flow might look like the following:
Collaborator


This is accurate for CNNs, but not for BF16 NLPs (no quantization step) or LLMs (OGA flow).

- 2022 with `Desktop Development with C++` checked
* - `cmake <https://cmake.org/download/>`_
- >= 3.26
* - `Python (Miniforge preferred) <https://conda-forge.org/download/>`_
Collaborator


Should we really say "Miniforge preferred"?
Internally to AMD, we need to use Miniforge. But other companies may have different requirements.

- Miniforge: ensure that the following path is set in the System PATH variable: ``path\to\miniforge3\condabin`` or ``path\to\miniforge3\Scripts\`` or ``path\to\miniforge3\`` (The System PATH variable should be set in the *System Variables* section of the *Environment Variables* window).
$existingPath = [System.Environment]::GetEnvironmentVariable('Path', 'Machine')
.. code-block:: powershell
Collaborator


Why not put all 3 lines in the same code block?

.. code-block:: powershell
$newPaths = "C:\Users\<user>\miniforge3\Scripts;C:\Users\<user>\miniforge3\condabin"
Collaborator


This will only work for miniforge. If people have Anaconda or Miniconda, this will not work.

Collaborator

@ThomasXilinx ThomasXilinx left a comment


Some of the proposed changes need more discussion
