
Conversation

@Kaihui-intel
Contributor

@Kaihui-intel Kaihui-intel commented Nov 7, 2025

User description

Type of Change

documentation

Description

detailed description

Expected Behavior & Potential Risk

the expected behavior triggered by this PR

How has this PR been tested?

how to reproduce the test (including hardware information)

Dependency Change?

any library dependency introduced or removed


PR Type

Documentation


Description

  • Added scheme and layer_config to AutoRound documentation

  • Provided example usage of layer_config for AutoRound
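The precedence this PR documents (a preset `scheme` providing defaults, with `layer_config` overriding them per layer) can be sketched in plain Python. The `SCHEME_DEFAULTS` table and `resolve_layer_config` helper below are purely illustrative assumptions, not part of neural-compressor; they only mimic the documented behavior:

```python
# Illustrative sketch only: SCHEME_DEFAULTS and resolve_layer_config are
# hypothetical, not neural-compressor APIs. They model how a per-layer
# layer_config dict could override a preset scheme's defaults.

SCHEME_DEFAULTS = {
    "W4A16": {"data_type": "int", "bits": 4, "group_size": 128, "sym": True},
    "W8A16": {"data_type": "int", "bits": 8, "group_size": 128, "sym": True},
}

def resolve_layer_config(layer_name, scheme="W4A16", layer_config=None):
    """Return the effective quantization settings for one layer."""
    effective = dict(SCHEME_DEFAULTS[scheme])           # start from the preset scheme
    overrides = (layer_config or {}).get(layer_name, {})
    effective.update(overrides)                         # per-layer settings win
    return effective

layer_config = {"lm_head": {"data_type": "int", "bits": 3}}
print(resolve_layer_config("lm_head", layer_config=layer_config)["bits"])   # 3
print(resolve_layer_config("mlp.fc1", layer_config=layer_config)["bits"])   # 4
```

Layers absent from `layer_config` fall back to the scheme's defaults, which matches the table entry describing `layer_config` as an optional layer-wise override.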


Diagram Walkthrough

```mermaid
flowchart LR
  doc_update["Update documentation"]
  scheme_addition["Add scheme parameter"]
  layer_config_addition["Add layer_config parameter"]
  example_usage["Provide example usage of layer_config"]

  doc_update -- "includes" --> scheme_addition
  doc_update -- "includes" --> layer_config_addition
  layer_config_addition -- "example" --> example_usage
```

File Walkthrough

Relevant files
Documentation
PT_WeightOnlyQuant.md
Document `scheme` and `layer_config` for AutoRound             

docs/source/3x/PT_WeightOnlyQuant.md

  • Added scheme and layer_config to table
  • Provided example usage of layer_config for AutoRound
+19/-0   

Signed-off-by: Kaihui-intel <[email protected]>
@PRAgent4INC
Collaborator

PR Reviewer Guide 🔍

Here are some key observations to aid the review process:

⏱️ Estimated effort to review: 2 🔵🔵⚪⚪⚪
🧪 No relevant tests
🔒 No security concerns identified
⚡ Recommended focus areas for review

Incomplete Example

The example provided for layer_config in the documentation seems incomplete and inconsistent. Specifically, the example includes a commented-out block that is not properly formatted and an uncommented block that does not match the commented-out example.

```python
# layer_config = {
#      "layer1": {
#          "data_type": "int",
#          "bits": 3,
#          "group_size": 128,
#          "sym": True,
#      },
#      "layer2": {
#          "W8A16"
#       }
# }
layer_config = {"lm_head": {"data_type": "int"}}
quant_config = AutoRoundConfig(layer_config=layer_config)
quant_config.set_local("lm_head", lm_head_config)

```

@PRAgent4INC
Collaborator

PR Code Suggestions ✨

Explore these optional code suggestions:

Category: General
Suggestion: Clarify layer_config format

Ensure that the layer_config dictionary format is consistent and correctly
documented.

docs/source/3x/PT_WeightOnlyQuant.md [181-182]

```diff
 |             scheme (str)               | A preset scheme that defines the quantization configurations. | "W4A16" |
-|             layer_config (dict)               | Layer-wise quantization config | None |
+|             layer_config (dict)               | Layer-wise quantization config, e.g., `{"layer1": {"data_type": "int", "bits": 3, "group_size": 128, "sym": True}}` | None |
```
Suggestion importance [1-10]: 6

Why: The suggestion aims to provide a clearer example of the layer_config format, which improves documentation but does not address a critical issue.

Impact: Low
Category: Possible issue
Suggestion: Fix syntax error

Correct the syntax error in the layer2 configuration example.

docs/source/3x/PT_WeightOnlyQuant.md [298-299]

```diff
 #      "layer2": {
-#          "W8A16"
-#       }
+#          "scheme": "W8A16"
+#      }
```
Suggestion importance [1-10]: 5

Why: The suggestion corrects a syntax error in the example, improving the accuracy of the documentation but not addressing a critical issue.

Impact: Low
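Applying both suggestions together, the documentation example would collapse into one consistent dict. This is a sketch only: `"scheme"` as the key for layer2's shorthand form is the assumption made by the second suggestion, not a confirmed neural-compressor key name.

```python
# Consistent version of the documented layer_config example.
# Using "scheme" for layer2's shorthand is the reviewer's proposed fix,
# not a confirmed neural-compressor key name.
layer_config = {
    "layer1": {
        "data_type": "int",
        "bits": 3,
        "group_size": 128,
        "sym": True,
    },
    "layer2": {"scheme": "W8A16"},
}
```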

Signed-off-by: Kaihui-intel <[email protected]>
@Kaihui-intel Kaihui-intel requested a review from thuang6 November 7, 2025 05:51
```python
lm_head_config = RTNConfig(dtype="fp32")
quant_config.set_local("lm_head", lm_head_config)
```
3. Example of using `layer_config` for AutoRound
Contributor


If `set_local` does not work in the current implementation, we should call this out so users know that the AutoRound-specific `layer_config` should be used instead of the `set_local` API, since in `AutoRoundConfig` none of the three options works.

Contributor Author


The code change will be raised in another PR.

Contributor


Do you mean you will implement `set_local` support by converting it to `layer_config`, so that any of the three options will be valid after your other PR is merged?

Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Sure. Currently, only the third option is supported; options 1 and 2 will be implemented in phase 2.

