Commit 32f9392

rahul-tuli and claude committed
fix: Initialize no_rope_layers when missing in Eagle3 config
Handles cases where draft model configs (like LlamaConfig) don't have the no_rope_layers attribute by initializing it with proper defaults before Llama4DecoderLayer creation. This prevents an AttributeError during draft model initialization while maintaining compatibility with existing configurations.

Co-Authored-By: Claude <[email protected]>
Signed-off-by: Rahul Tuli <[email protected]>
1 parent 60501f8 commit 32f9392
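
A rough sketch of the failure mode the commit message describes, using a bare transformers LlamaConfig and a made-up start_layer_id (the real padding happens inside _validate_and_update_config, and vLLM's draft config object may differ):

from transformers import LlamaConfig

# A plain LlamaConfig used as an Eagle3 draft config defines no
# no_rope_layers attribute, so padding it unconditionally fails.
draft_config = LlamaConfig(num_hidden_layers=1)
start_layer_id = 48  # hypothetical layer offset into the target model

try:
    draft_config.no_rope_layers = [0] * start_layer_id + draft_config.no_rope_layers
except AttributeError as exc:
    # e.g. "'LlamaConfig' object has no attribute 'no_rope_layers'"
    print(exc)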

1 file changed (+10, -3 lines)


vllm/model_executor/models/llama4_eagle3.py

Lines changed: 10 additions & 3 deletions
@@ -275,9 +275,16 @@ def _validate_and_update_config(
 
         # Draft model layer index is increased by start_layer_id,
         # so we need to pad relevant configs accordingly
-        self.config.no_rope_layers = [
-            0
-        ] * start_layer_id + self.config.no_rope_layers
+        if not hasattr(self.config, 'no_rope_layers'):
+            # Initialize no_rope_layers if it doesn't exist
+            # Default to all layers having rope (value of 1)
+            total_layers = start_layer_id + getattr(self.config, 'num_hidden_layers', 1)
+            self.config.no_rope_layers = [1] * total_layers
+        else:
+            # Pad existing no_rope_layers for the layer offset
+            self.config.no_rope_layers = [
+                0
+            ] * start_layer_id + self.config.no_rope_layers
 
         # Update quantization configuration for layer offset
         if isinstance(quant_config, TorchAOConfig):
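
A minimal, self-contained sketch of how the two branches above behave, with SimpleNamespace objects and a small start_layer_id standing in for real draft-model configs:

from types import SimpleNamespace

def pad_no_rope_layers(config, start_layer_id):
    # Mirrors the diff: when the attribute is missing, build the list with
    # rope enabled for every layer; otherwise left-pad the existing list with
    # zeros so its indices line up with the offset draft layers.
    if not hasattr(config, 'no_rope_layers'):
        total_layers = start_layer_id + getattr(config, 'num_hidden_layers', 1)
        config.no_rope_layers = [1] * total_layers
    else:
        config.no_rope_layers = [0] * start_layer_id + config.no_rope_layers
    return config.no_rope_layers

print(pad_no_rope_layers(SimpleNamespace(num_hidden_layers=1), 3))  # [1, 1, 1, 1]
print(pad_no_rope_layers(SimpleNamespace(no_rope_layers=[1]), 3))   # [0, 0, 0, 1]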

0 commit comments