update README
commit a415c613de
parent c2ba1bae57
@@ -16,6 +16,7 @@ Some methods use knowledge distillation (KD), which requires pre-trained models.
## [Network Pruning via Transformable Architecture Search](https://arxiv.org/abs/1905.09717)
In this paper, we propose a differentiable search strategy for transformable architectures, i.e., searching for the depth and width of a deep neural network.
<img src="https://d-x-y.github.com/resources/paper-icon/NIPS-2019-TAS.png" width="700">
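The diff does not show the search code itself; as a rough illustration of the idea named above (treating the width of each layer as a differentiable choice), here is a minimal PyTorch-style sketch. `DifferentiableWidthConv`, its candidate widths, and the Gumbel-softmax mixing are illustrative assumptions, not this repository's implementation; the TAS method in the linked paper further uses channel-wise interpolation to aggregate candidate widths and knowledge distillation to train the pruned network.

```python
# Illustrative sketch only (not the repository's code): each candidate channel
# count gets a learnable logit, and the layer output is a Gumbel-softmax-weighted
# mixture of the convolution restricted to each candidate width, so the width
# choice receives gradients from the task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DifferentiableWidthConv(nn.Module):
    def __init__(self, in_channels, candidate_widths):
        super().__init__()
        self.candidate_widths = candidate_widths
        self.max_width = max(candidate_widths)
        self.conv = nn.Conv2d(in_channels, self.max_width, 3, padding=1)
        # one architecture logit per candidate width
        self.arch_logits = nn.Parameter(torch.zeros(len(candidate_widths)))

    def forward(self, x, tau=1.0):
        out = self.conv(x)  # (B, max_width, H, W)
        # soft, differentiable sample over the candidate widths
        weights = F.gumbel_softmax(self.arch_logits, tau=tau, hard=False)
        mixed = 0.0
        for w, width in zip(weights, self.candidate_widths):
            # keep only the first `width` channels and weight them by the sample
            mask = torch.zeros(1, self.max_width, 1, 1, device=out.device)
            mask[:, :width] = 1.0
            mixed = mixed + w * (out * mask)
        return mixed

# after search, the preferred width would be read off as arch_logits.argmax()
layer = DifferentiableWidthConv(16, candidate_widths=[8, 16, 24, 32])
y = layer(torch.randn(2, 16, 32, 32))
```

Depth can be handled in the same spirit by softly weighting the outputs taken after different numbers of layers.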
@@ -42,7 +43,8 @@ args: `cifar10` indicates the dataset name, `ResNet56` indicates the basemodel name
## One-Shot Neural Architecture Search via Self-Evaluated Template Network
-<img src="https://d-x-y.github.com/resources/paper-icon/ICCV-2019-SETN.png" width="550">
+<img src="https://d-x-y.github.com/resources/paper-icon/ICCV-2019-SETN.png" width="450">
Train the SETN-searched CNN on CIFAR-10, CIFAR-100, and ImageNet.
```
CUDA_VISIBLE_DEVICES=0 bash ./scripts/nas-infer-train.sh cifar10 SETN 96 -1
```
@@ -55,7 +57,7 @@ Searching codes are coming soon!
## [Searching for A Robust Neural Architecture in Four GPU Hours](http://openaccess.thecvf.com/content_CVPR_2019/papers/Dong_Searching_for_a_Robust_Neural_Architecture_in_Four_GPU_Hours_CVPR_2019_paper.pdf)
-<img src="https://d-x-y.github.com/resources/paper-icon/CVPR-2019-GDAS.png" width="450">
+<img src="https://d-x-y.github.com/resources/paper-icon/CVPR-2019-GDAS.png" width="350">
The old version is located at [`others/GDAS`](https://github.com/D-X-Y/NAS-Projects/tree/master/others/GDAS) and a PaddlePaddle implementation is located at [`others/paddlepaddle`](https://github.com/D-X-Y/NAS-Projects/tree/master/others/paddlepaddle).
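The GDAS section in this README only links the paper, so as a reminder of its core idea: the search stays differentiable while, in each forward pass, only one candidate operation per edge is executed, sampled with a hard Gumbel-softmax whose straight-through estimator still passes gradients to the architecture parameters; this sampling is what keeps the search within a few GPU hours. The sketch below is an illustrative simplification (the `GumbelSampledEdge` module and its three-operation candidate set are assumptions), not the code in this repository.

```python
# Illustrative sketch only (not the repository's code): a hard Gumbel-softmax
# picks one candidate operation per forward pass; the straight-through gradient
# still reaches the architecture logits, so they can be trained with SGD.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GumbelSampledEdge(nn.Module):
    def __init__(self, channels):
        super().__init__()
        # hypothetical candidate operations; the real search space is larger
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.AvgPool2d(3, stride=1, padding=1),
        ])
        self.arch_logits = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x, tau=1.0):
        # hard=True gives a one-hot sample with a straight-through gradient
        weights = F.gumbel_softmax(self.arch_logits, tau=tau, hard=True)
        index = int(weights.argmax().item())
        # only the sampled operation runs, which keeps the search cheap
        return weights[index] * self.ops[index](x)

edge = GumbelSampledEdge(16)
y = edge(torch.randn(2, 16, 8, 8))
```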