Problem Description

I tried to run the Llama 3.2 3B model on the MTK platform (D9400) and encountered an error when exporting the .pte model, following the instructions in examples/mediatek/README.md.

Error Log
```
Loading weights from disk
Shape: 128t512c
Shape: 1t512c
export shapes: {'128t512c': [128, 512], '1t512c': [1, 512]}
Max Num Token: 128
Max Cache Size: 512
Instantiating submodels
Loading weights for chunk 0
Loading weights for chunk 1
Loading weights for chunk 2
Loading weights for chunk 3
Exporting Chunk 0 to PTE
Getting pre autograd ATen Dialect Graph
Getting ATen Dialect Graph for Llama-3.2-3B-Instruct_A16W4_dummy_cal_4_chunks 128t512c chunk 0
Getting ATen Dialect Graph for Llama-3.2-3B-Instruct_A16W4_dummy_cal_4_chunks 1t512c chunk 0
Delegating Edge Program to Neuropilot Backend
Traceback (most recent call last):
  File "/mnt/f/Code/executorch/examples/mediatek/model_export_scripts/llama.py", line 499, in <module>
    main()
  File "/mnt/f/Code/executorch/examples/mediatek/model_export_scripts/llama.py", line 485, in main
    export_to_et_ir(
  File "/mnt/f/Code/executorch/examples/mediatek/model_export_scripts/llama.py", line 369, in export_to_et_ir
    delegated_program = to_backend(
  File "/usr/lib/python3.10/functools.py", line 889, in wrapper
    return dispatch(args[0].__class__)(*args, **kw)
  File "/home/matt823/.local/lib/python3.10/site-packages/executorch/exir/backend/backend_api.py", line 762, in _
    lower_all_submodules_to_backend(
  File "/home/matt823/.local/lib/python3.10/site-packages/executorch/exir/backend/backend_api.py", line 591, in lower_all_submodules_to_backend
    backend_name_to_subclass[backend_id].preprocess_multimethod(
  File "/home/matt/.local/lib/python3.10/site-packages/executorch/backends/mediatek/preprocess.py", line 195, in preprocess_multimethod
    blob, new_models = mtk_neuron.extract_shared_data(
AttributeError: module 'mtk_neuron' has no attribute 'extract_shared_data'
```
Additional Information
I checked the mtk_neuron module and confirmed it does not have an attribute named extract_shared_data. A similar interface named extract_shared_hint does exist.
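For reference, this is how I checked the installed module's API surface. The `probe` helper below is just a throwaway diagnostic, not part of ExecuTorch or the NeuroPilot SDK; since mtk_neuron is a proprietary wheel, the example call runs against a stdlib module instead:

```python
import importlib

def probe(module_name, candidates):
    """Return which of the candidate attribute names the module actually exposes."""
    mod = importlib.import_module(module_name)
    return {name: hasattr(mod, name) for name in candidates}

# Demonstrated on the stdlib; on my setup,
# probe("mtk_neuron", ["extract_shared_data", "extract_shared_hint"])
# reports extract_shared_data as False and extract_shared_hint as True.
print(probe("json", ["dumps", "extract_shared_data"]))
```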
Command Used:
source shell_scripts/export_llama.sh llama3.2-3b 4 128 512 None

The tools were downloaded from:
https://s3.ap-southeast-1.amazonaws.com/mediatek.neuropilot.com/06302508-4c94-4bf2-9789-b0ee44e83e27.gz

Request for Assistance

Could you please help investigate this issue? It appears there might be a mismatch between the expected API and the actual implementation in the mtk_neuron module. Thank you.