
Conversation

@kozistr (Contributor) commented Sep 9, 2025

What does this PR do?

related to #596

I've finished testing the fused MoE kernel, which was originally implemented here by @EricLBuehler (thanks!).

The fused MoE implementation lives in this repository: https://github.com/kozistr/candle-moe (I adapted and extended Eric's baseline to work with the Nomic MoE variant).

Main Changes

  • Nomic MoE model
    • topk_softmax (a rough reference sketch of this routing step follows the list)
    • fused MoE
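
For context, the routing step that topk_softmax covers can be sketched on the CPU roughly like this. This is a minimal illustrative reference in plain Rust, not the actual CUDA kernel, and whether the selected weights are renormalized depends on the model configuration:

```rust
// Reference routing for one token: softmax over the router logits,
// keep the top-k experts, and renormalize their weights.
fn topk_softmax(router_logits: &[f32], top_k: usize) -> Vec<(usize, f32)> {
    // Numerically stable softmax over the expert logits.
    let max = router_logits.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = router_logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f32 = exps.iter().sum();

    // Pair each expert index with its probability and sort descending.
    let mut probs: Vec<(usize, f32)> = exps.iter().map(|&e| e / sum).enumerate().collect();
    probs.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    probs.truncate(top_k);

    // Renormalize the selected weights so they sum to 1.
    let norm: f32 = probs.iter().map(|&(_, w)| w).sum();
    probs.into_iter().map(|(i, w)| (i, w / norm)).collect()
}

fn main() {
    // One token routed over 8 experts, keeping the top 2.
    let logits = [0.1, 2.3, -1.0, 0.7, 1.5, -0.2, 0.0, 0.9];
    println!("{:?}", topk_softmax(&logits, 2));
}
```

The fused kernel does the same routing (plus the expert GEMMs and the weighted combination) on the GPU in far fewer launches than the naive per-expert loop.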

I've also verified that it produces (almost) identical outputs to the naive implementation.
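
As a rough illustration of the kind of check involved (the function name, values, and tolerance below are hypothetical, not the actual test code in this PR), the comparison boils down to the maximum absolute difference between the two outputs:

```rust
// Hypothetical sanity check: compare the fused and naive MoE outputs
// element-wise and assert the maximum absolute difference is small.
fn max_abs_diff(fused: &[f32], naive: &[f32]) -> f32 {
    assert_eq!(fused.len(), naive.len());
    fused
        .iter()
        .zip(naive.iter())
        .map(|(a, b)| (a - b).abs())
        .fold(0.0_f32, f32::max)
}

fn main() {
    // Stand-in values; in practice these would be the flattened output
    // tensors of the fused kernel and the naive implementation.
    let fused = [0.1234_f32, -0.5678, 0.9012];
    let naive = [0.1235_f32, -0.5677, 0.9010];
    let diff = max_abs_diff(&fused, &naive);
    // Reduced-precision accumulation makes a small tolerance necessary.
    assert!(diff < 1e-3, "outputs diverge: max abs diff = {diff}");
}
```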

Honestly, I haven't run extensive benchmarks across multiple settings yet due to time and resource constraints :(, but I have observed a latency improvement based on wall-clock time (this still needs more verification, along with precise kernel-level timing).
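
For reference, the wall-clock measurement mentioned above amounts to something like the following sketch (moe_forward is a placeholder, not an actual function from this PR); precise kernel-level timing would instead need CUDA events or a profiler such as Nsight Compute:

```rust
use std::time::Instant;

// Stand-in for the fused MoE forward pass being measured.
fn moe_forward() {}

fn main() {
    let iters: u32 = 100;
    let start = Instant::now();
    for _ in 0..iters {
        moe_forward();
        // With CUDA in the loop, the device must be synchronized (or the
        // output copied back to the host) before reading the timer,
        // otherwise only the launch overhead is measured.
    }
    let elapsed = start.elapsed();
    println!("avg wall-clock time per pass: {:?}", elapsed / iters);
}
```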

Also, I'm very new to CUDA programming, so any feedback or suggestions would be greatly appreciated :)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines.
  • Did you write any new necessary tests? If applicable, did you include or update the insta snapshots?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@Narsil OR @alvarobartt

@alvarobartt (Member) commented:

Hey @kozistr, thanks for this PR! I'll try to run some benchmarks on my end in order to compare both solutions!

Also, do you think it would make sense to add this to https://github.com/huggingface/candle-extensions instead of a separate repository? Asking because that way it might be easier to maintain in the long term (cc @ivarflakstad on the latter, too)!

Thanks again, I'll come back to you once I've tested + reviewed @kozistr 🤗

@kozistr (Contributor, Author) commented Sep 9, 2025


Hi @alvarobartt! I also think it'd be much better to add the MoE kernel to the candle-extensions repository!

Actually, I had previously opened an incomplete version of the MoE kernel PR, which was only partially working. This would be a good time to refresh that PR with the new implementation 🤗

I'll get back to you once I'm ready to reopen the PR against candle-extensions! Thanks for the suggestion :)

@alvarobartt (Member) commented:

Hey @kozistr, thanks for flagging, I missed that! Thanks for the work, and please do let us know if there's anything other than reviews that we can do to help 🤗

@kozistr (Contributor, Author) commented Sep 10, 2025


Thanks for your help :) I'll definitely reach out if I need anything 🤗

By the way, I've just opened a PR in candle-extensions! Could you please review it when you have the bandwidth? Thanks!
