Distill BLOOM - tentative 2 #354
base: main
Conversation
megatron/model/transformer.py
Outdated
                self.alibi = self.alibi.to(torch.bfloat16)
        else:
            self.alibi = None

    @torch.no_grad()
nuke this
modules = _filter_for_teacher_student(modules)

for module in modules:
Can we do the filtering in the for loop?
- check before the first `if` whether it is a Student module
- check with `isinstance` instead of the class name
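For illustration, a minimal sketch of the suggested refactor, assuming a hypothetical `StudentModule` marker class (the PR's actual class name may differ); the separate `_filter_for_teacher_student` pass would then go away:

```python
# Hypothetical sketch: filter inside the loop with isinstance instead of a
# separate _filter_for_teacher_student pass. StudentModule is an assumed
# name for whatever class marks student submodules in this PR.
for module in modules:
    if not isinstance(module, StudentModule):
        continue
    ...  # existing per-module logic, now applied only to student modules
```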
megatron/optimizer/__init__.py
Outdated
for module in modules:
    for module_ in module.modules():
        if "Student" in module_.__class__.__name__:
Use `isinstance` instead.
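A hedged sketch of that change, again assuming a hypothetical `StudentModule` class; unlike the string match on `__class__.__name__`, `isinstance` also covers subclasses:

```python
# Hypothetical sketch: replace the string match on __class__.__name__ with
# an isinstance check. StudentModule is an assumed marker class.
for module in modules:
    for module_ in module.modules():
        if isinstance(module_, StudentModule):
            ...  # existing handling of student submodules
```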
partition_method = 'type:transformer'

from deepspeed.runtime.pipe.topology import PipeModelDataParallelTopology
cc @thomasw21
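For context, a hedged sketch of how these two pieces usually fit together when building a DeepSpeed pipeline model; the parallelism degrees and the `layers` variable are assumptions, not the PR's actual wiring:

```python
from deepspeed.pipe import PipelineModule
from deepspeed.runtime.pipe.topology import PipeModelDataParallelTopology

# Assumed parallelism degrees, for illustration only. Constructing a
# PipelineModule requires an initialized torch.distributed environment.
topo = PipeModelDataParallelTopology(num_pp=4, num_mp=2, num_dp=8)

model = PipelineModule(
    layers=layers,  # assumed: the model's list of layer specs
    topology=topo,
    # Balance pipeline stages by counting transformer layers rather than
    # raw parameter counts.
    partition_method='type:transformer',
)
```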
Attempt at applying teacher-student distillation using Megatron-DeepSpeed.
WIP draft PR - not meant to be merged.
cc @thomasw21
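As background, a minimal sketch of the standard teacher-student objective this kind of PR typically implements; the loss actually wired into Megatron-DeepSpeed here may differ:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label knowledge distillation (Hinton et al., 2015).

    Sketch only; an assumed stand-in for the PR's actual loss. The teacher's
    temperature-softened distribution supervises the student via KL divergence.
    """
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature**2
```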