Summary of Differentiable Combinatorial Scheduling at Scale, by Mingju Liu et al.
Differentiable Combinatorial Scheduling at Scale
by Mingju Liu, Yingjie Li, Jiaqi Yin, Zhiru Zhang, Cunxi Yu
First submitted to arXiv on: 6 Jun 2024
Categories
- Main: Machine Learning (cs.LG)
- Secondary: Artificial Intelligence (cs.AI); Hardware Architecture (cs.AR)
GrooveSquid.com Paper Summaries
GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. The summaries below all cover the same AI paper, each written at a different level of difficulty. The medium and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!
Summary difficulty | Written by | Summary |
---|---|---|
High | Paper authors | Read the original abstract on arXiv. |
Medium | GrooveSquid.com (original content) | The proposed differentiable combinatorial scheduling framework addresses resource-constrained scheduling, an NP-hard problem central to chip design and high-performance computing. Using a Gumbel-Softmax differentiable sampling technique, the approach makes linear programming (LP) based scheduling fully differentiable and extends it to a broader range of LP formulations. The paper introduces a constrained Gumbel Trick that encodes inequality constraints on scheduling tasks, enabling efficient and scalable scheduling via gradient descent without any training data (see the sketch after this table). Evaluations on both synthetic and real-world benchmarks show significant gains in optimization efficiency, surpassing state-of-the-art commercial and open-source solvers. |
Low | GrooveSquid.com (original content) | The paper proposes a new way to schedule tasks more efficiently using math and computers. Traditional methods struggle with scalability and applicability. The researchers built a framework based on differentiable combinatorial scheduling, which brings more flexibility to solving linear programming problems. The new method can handle inequality constraints, is efficient and scalable, and outperforms existing commercial and open-source solvers on many designs. |
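To give a feel for the core mechanism, here is a minimal sketch of Gumbel-Softmax-based differentiable scheduling using PyTorch's stock `gumbel_softmax`. This is not the paper's constrained Gumbel Trick or its benchmark setup: the problem size, the toy latency objective, and the soft resource penalty (`cap`) are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

# Illustrative setup (assumption): 4 operations, each assigned one of 3 time steps.
# `logits` are learnable scores over time steps for each operation.
num_ops, num_steps = 4, 3
logits = torch.randn(num_ops, num_steps, requires_grad=True)
optimizer = torch.optim.Adam([logits], lr=0.1)

for _ in range(200):
    # Gumbel-Softmax draws a differentiable, near-one-hot time step per
    # operation (hard=True applies the straight-through estimator, so
    # gradients still flow back to `logits`).
    assign = F.gumbel_softmax(logits, tau=0.5, hard=True)  # (num_ops, num_steps)

    # Toy objective (assumption): total latency = sum of sampled step indices,
    # plus a soft penalty when more than `cap` ops share a step -- a stand-in
    # for an inequality (resource) constraint.
    steps = torch.arange(num_steps, dtype=torch.float32)
    latency = (assign * steps).sum()
    cap = 2.0
    overuse = torch.relu(assign.sum(dim=0) - cap).sum()
    loss = latency + 10.0 * overuse

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(assign.argmax(dim=1))  # final time step chosen for each operation
```

Note how this mirrors the summary's key claim: the loss is optimized directly by gradient descent over the sampling distribution, with no training data involved; the paper's contribution lies in encoding the scheduling constraints exactly rather than via a soft penalty as done here.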
Keywords
» Artificial intelligence » Gradient descent » Optimization » Softmax