The Automated Deep Learning library (AutoDL-Projects) is an open-source, lightweight, and powerful project. It implements several neural architecture search (NAS) and hyperparameter optimization (HPO) algorithms.
## Who should consider using AutoDL-Projects
- Beginners who want to try out different AutoDL algorithms
- Engineers who want to evaluate how effective AutoDL is on a specific problem
- Researchers who want to implement and experiment with new AutoDL algorithms with minimal effort
## Why use AutoDL-Projects
- Minimal Python dependencies
- All algorithms live in a single codebase
- Actively maintained
## AutoDL-Projects Capabilities in a Nutshell
At the moment, this project provides the following algorithms and their corresponding run scripts. Please click the link next to each algorithm for a detailed description.
| Type  | ABBRV         | Algorithms                                                                     | Description |
|-------|---------------|--------------------------------------------------------------------------------|-------------|
| NAS   | TAS           | Network Pruning via Transformable Architecture Search                          | NeurIPS-2019-TAS.md |
| NAS   | DARTS         | DARTS: Differentiable Architecture Search                                       | ICLR-2019-DARTS.md |
| NAS   | GDAS          | Searching for A Robust Neural Architecture in Four GPU Hours                    | CVPR-2019-GDAS.md |
| NAS   | SETN          | One-Shot Neural Architecture Search via Self-Evaluated Template Network         | ICCV-2019-SETN.md |
| NAS   | NAS-Bench-201 | NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search   | NAS-Bench-201.md |
| NAS   | NATS-Bench    | NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size      | NATS-Bench.md |
| NAS   | ...           | ENAS / REA / REINFORCE / BOHB (please check the original papers)                | NAS-Bench-201.md, NATS-Bench.md |
| HPO   | HPO-CG        | Hyperparameter optimization with approximate gradient                           | coming soon |
| Basic | ResNet        | Deep Learning-based Image Classification                                        | BASELINE.md |
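For a sense of how the benchmark entries above are typically consumed, here is a minimal sketch that queries NAS-Bench-201 through its Python API. It assumes the `nas_201_api` package is installed and that the benchmark file (assumed here to be `NAS-Bench-201-v1_1-096897.pth`, placed under `$TORCH_HOME`) has already been downloaded; NAS-Bench-201.md is the authoritative reference for the exact file names and API.

```python
# Minimal sketch of querying NAS-Bench-201 (see NAS-Bench-201.md for details).
# Assumptions: the `nas_201_api` package is installed and the benchmark file
# has been downloaded into $TORCH_HOME (the file name may differ per version).
import os

from nas_201_api import NASBench201API as API

benchmark_file = os.path.join(os.environ["TORCH_HOME"], "NAS-Bench-201-v1_1-096897.pth")
api = API(benchmark_file)

# Number of architectures in the search space.
print(f"Benchmark contains {len(api)} architectures")

# The architecture string of the 0-th candidate.
print(f"Architecture 0: {api[0]}")

# Recorded training results of this architecture on CIFAR-100.
results = api.query_by_index(0, "cifar100")
print(results)
```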
## Requirements and Preparation

Please use Python 3.6 or higher; the additional Python packages are listed in requirements.txt. Please download and extract the CIFAR and ImageNet datasets into $TORCH_HOME.
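The snippet below is a small sanity-check sketch (not part of the project) for these prerequisites; the dataset directory names used here are illustrative assumptions and may differ from what the run scripts actually expect.

```python
# Quick sanity check of the prerequisites described above.
# Note: the dataset directory names below are illustrative assumptions;
# consult the data-preparation notes in this repository for the exact layout.
import os
import sys

assert sys.version_info >= (3, 6), "Python 3.6 or higher is required."

torch_home = os.environ.get("TORCH_HOME")
assert torch_home is not None, "Please set the $TORCH_HOME environment variable."

for name in ("cifar-10-batches-py", "cifar-100-python", "ILSVRC2012"):
    path = os.path.join(torch_home, name)
    status = "found" if os.path.isdir(path) else "MISSING"
    print(f"{status:>7s} : {path}")
```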
## Citation

If this project helps your research or engineering work, please consider citing some of the following papers:
@inproceedings{dong2021autohas,
title = {{AutoHAS}: Efficient Hyperparameter and Architecture Search},
author = {Dong, Xuanyi and Tan, Mingxing and Yu, Adams Wei and Peng, Daiyi and Gabrys, Bogdan and Le, Quoc V},
booktitle = {2nd Workshop on Neural Architecture Search at International Conference on Learning Representations (ICLR)},
year = {2021}
}
@article{dong2021nats,
title = {{NATS-Bench}: Benchmarking NAS Algorithms for Architecture Topology and Size},
author = {Dong, Xuanyi and Liu, Lu and Musial, Katarzyna and Gabrys, Bogdan},
doi = {10.1109/TPAMI.2021.3054824},
journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)},
year = {2021},
note = {\mbox{doi}:\url{10.1109/TPAMI.2021.3054824}}
}
@inproceedings{dong2020nasbench201,
title = {{NAS-Bench-201}: Extending the Scope of Reproducible Neural Architecture Search},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {International Conference on Learning Representations (ICLR)},
url = {https://openreview.net/forum?id=HJxyZkBKDr},
year = {2020}
}
@inproceedings{dong2019tas,
title = {Network Pruning via Transformable Architecture Search},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Neural Information Processing Systems (NeurIPS)},
pages = {760--771},
year = {2019}
}
@inproceedings{dong2019one,
title = {One-Shot Neural Architecture Search via Self-Evaluated Template Network},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
pages = {3681--3690},
year = {2019}
}
@inproceedings{dong2019search,
title = {Searching for A Robust Neural Architecture in Four GPU Hours},
author = {Dong, Xuanyi and Yang, Yi},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
pages = {1761--1770},
year = {2019}
}
## Others

If you would like to contribute to this codebase, please read CONTRIBUTING.md. In addition, please refer to CODE-OF-CONDUCT.md for the code of conduct.
## License

The entire codebase is under the MIT license.