
SketchML

Sub-Linear Memory Training via Randomized Sketching

Randomized sketching methods for training neural networks with sub-linear memory complexity.
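To make the idea concrete, here is a minimal, hypothetical sketch of the core mechanism: compressing a length-d gradient into a much smaller Count Sketch before storing it, then recovering a per-coordinate estimate on demand. The function names and parameters are illustrative assumptions, not the project's actual API.

```python
# Illustrative only: Count Sketch compression of a gradient vector.
# Memory for the stored sketch is O(k) with k << d, which is the sense
# in which sketching enables sub-linear memory.
import numpy as np

rng = np.random.default_rng(0)

def make_count_sketch(d, k, rng):
    """Draw a random hash bucket and random sign for each of d coordinates."""
    buckets = rng.integers(0, k, size=d)      # which of k rows each coordinate maps to
    signs = rng.choice([-1.0, 1.0], size=d)   # random +/-1 sign per coordinate
    return buckets, signs

def sketch(g, buckets, signs, k):
    """Compress a length-d vector g into a length-k sketch."""
    s = np.zeros(k)
    np.add.at(s, buckets, signs * g)          # signed sum within each bucket
    return s

def unsketch(s, buckets, signs):
    """Unbiased per-coordinate estimate of the original vector."""
    return signs * s[buckets]

d, k = 10_000, 256
g = rng.standard_normal(d)
buckets, signs = make_count_sketch(d, k, rng)
s = sketch(g, buckets, signs, k)              # only k floats are stored, not d
g_hat = unsketch(s, buckets, signs)           # approximate recovery when needed
```

The estimate is unbiased but noisy; collisions between coordinates hashed to the same bucket add zero-mean error, which in practice is reduced by averaging several independent sketches.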


Related Publications

2024
SketchGrad: Memory-Efficient Training at Scale
G. Chen, M. Kowalski
ICML 2024 · Vienna, Austria
2025
Differential Privacy Bounds for Distributed Stochastic Gradient Descent
G. Chen, T. Nguyen, H. Wang
Journal of Machine Learning Research · Vol. 26