Conversation

merrymercy (Contributor)

No description provided.

@merrymercy merrymercy merged commit 199e82a into main Jan 19, 2024
@merrymercy merrymercy deleted the format branch January 19, 2024 07:51
timethink pushed a commit to timethink/sglang that referenced this pull request Mar 9, 2025

chunyuan-w added a commit to chunyuan-w/sglang that referenced this pull request Apr 15, 2025

* use native MoE and FP8 scaled_mm
* disable shared_expert for FP8
* disable forward_absorb_fused_mla_rope_cpu for FP8

NorthmanPKU pushed a commit to NorthmanPKU/sglang that referenced this pull request May 16, 2025

chunyuan-w added a commit to chunyuan-w/sglang that referenced this pull request May 28, 2025 (same commit message as above)

chunyuan-w added a commit to chunyuan-w/sglang that referenced this pull request Jun 6, 2025 (same commit message as above)

chunyuan-w added a commit to chunyuan-w/sglang that referenced this pull request Jun 25, 2025 (same commit message as above)

blzheng added a commit to blzheng/sglang that referenced this pull request Jul 31, 2025

liupeng374 pushed a commit to liupeng374/sglang that referenced this pull request Aug 11, 2025