TAG-MoE: Task-Aware Gating for Unified Generative Mixture-of-Experts
Intermediate · Yu Xu, Hongbin Yan et al. · Jan 12 · arXiv
TAG-MoE is a new way to steer Mixture-of-Experts (MoE) models using explicit task hints, so the right "mini-experts" handle the right parts of an image-generation job (see the sketch below).
#Task-Aware Gating #Mixture-of-Experts #Unified Image Generation
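
A minimal sketch of what task-aware gating could look like, assuming the task hint is an integer ID that gets embedded and added to the router's input so routing (and thus expert specialization) depends on the task. All names here (`TaskAwareGate`, `task_emb`, the expert MLP shape) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a task-aware MoE gate (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskAwareGate(nn.Module):
    def __init__(self, d_model: int, num_experts: int, num_tasks: int, top_k: int = 2):
        super().__init__()
        self.task_emb = nn.Embedding(num_tasks, d_model)   # one embedding per task hint
        self.router = nn.Linear(d_model, num_experts)      # scores each token over experts
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(num_experts)
        ])
        self.top_k = top_k

    def forward(self, x: torch.Tensor, task_id: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model); task_id: (batch,) integer task labels.
        gate_in = x + self.task_emb(task_id)[:, None, :]    # condition routing on the task
        logits = self.router(gate_in)                        # (batch, seq, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)       # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = idx[..., k] == e                       # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask, None] * self.experts[e](x[mask])
        return out
```

The key point the sketch illustrates is that the task hint only changes which experts are selected and how they are weighted; the experts themselves still see the plain token features.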