update decompress code and figures

This commit is contained in:
Xuanyi Dong 2019-04-02 17:06:25 +08:00
parent 3f483c37e7
commit d9026be4b2
7 changed files with 33 additions and 12 deletions

View File

@@ -1,6 +1,6 @@
MIT License
Copyright (c) 2018 Xuanyi Dong
Copyright (c) 2019 Xuanyi Dong
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal

View File

@@ -1,8 +1,11 @@
# Searching for A Robust Neural Architecture in Four GPU Hours
## Searching for A Robust Neural Architecture in Four GPU Hours
We propose a Gradient-based neural architecture search approach using a Differentiable Architecture Sampler (GDAS).
## Requirements
<img src="data/GDAS.png" width="520">
Figure 1. We utilize a DAG to represent the search space of a neural cell. Different operations (colored arrows) transform one node (square) into its intermediate features (small circles), and each node is the sum of the intermediate features transformed from the previous nodes. As indicated by the solid connections, the neural cell in the proposed GDAS is a sampled sub-graph of this DAG: among the intermediate features between every two nodes, GDAS samples exactly one in a differentiable way.
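As a concrete illustration of this differentiable sampling, the snippet below is a minimal sketch (not the code in this repository): each edge keeps one architecture logit per candidate operation and draws a hard Gumbel-softmax sample, so a single operation is used in the forward pass while gradients still reach every logit. The candidate operations, channel count, and temperature are assumptions made only for the example.
```
# Minimal sketch of differentiable operation sampling on one edge (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GumbelEdge(nn.Module):
  def __init__(self, channels):
    super(GumbelEdge, self).__init__()
    # assumed candidate operations, for illustration only
    self.ops = nn.ModuleList([
      nn.Conv2d(channels, channels, 3, padding=1, bias=False),
      nn.Conv2d(channels, channels, 1, bias=False),
      nn.AvgPool2d(3, stride=1, padding=1),
    ])
    # one architecture logit per candidate operation
    self.arch_logits = nn.Parameter(torch.zeros(1, len(self.ops)))

  def forward(self, x, tau=10.0):
    # hard one-hot sample in the forward pass, straight-through gradients in the backward pass
    weights = F.gumbel_softmax(self.arch_logits, tau=tau, hard=True)[0]
    return sum(w * op(x) for w, op in zip(weights, self.ops))

edge = GumbelEdge(channels=16)
out = edge(torch.randn(2, 16, 8, 8))  # only the sampled operation contributes to the output
```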
### Requirements
- PyTorch 1.0.1
- Python 3.6
- opencv
@@ -10,7 +13,7 @@ We propose A Gradient-based neural architecture search approach using Differenti
conda install pytorch torchvision cuda100 -c pytorch
```
## Usages
### Usages
Train the searched CNN on CIFAR
```
@@ -41,10 +44,14 @@ CUDA_VISIBLE_DEVICES=0 bash ./scripts-rnn/train-WT2.sh DARTS_V2
CUDA_VISIBLE_DEVICES=0 bash ./scripts-rnn/train-WT2.sh GDAS
```
## Training Logs
### Training Logs
Some training logs can be found in `./data/logs/`, and some pre-trained models can be found on [Google Drive](https://drive.google.com/open?id=1Ofhc49xC1PLIX4O708gJZ1ugzz4td_RJ).
## Citation
### Experimental Results
<img src="data/imagenet-results.png" width="600">
Figure 2. Top-1 and top-5 errors on ImageNet.
### Citation
```
@inproceedings{dong2019search,
title={Searching for A Robust Neural Architecture in Four GPU Hours},

BIN
data/GDAS.png Executable file

Binary file not shown.


View File

@@ -16,7 +16,7 @@ def execute(cmds, idx, num):
def command(prefix, cmd):
  #print ('{:}{:}'.format(prefix, cmd))
  #if execute: os.system(cmd)
  xcmd = '(echo {:}; {:}; sleep 0.1s)'.format(prefix, cmd)
  xcmd = '(echo {:} $(date +\"%Y-%h-%d--%T\") \"PID:\"$$; {:}; sleep 0.1s)'.format(prefix, cmd)
  return xcmd
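For reference, a short usage sketch of the updated helper; the prefix and command strings are invented for the example. The generated shell snippet now echoes a timestamp and the caller's PID before running the command, which makes interleaved job logs easier to attribute.
```
# Sketch of the string the updated command() builds; 'JOB-3' and 'ls ./data'
# are made-up arguments for illustration.
def command(prefix, cmd):
  xcmd = '(echo {:} $(date +\"%Y-%h-%d--%T\") \"PID:\"$$; {:}; sleep 0.1s)'.format(prefix, cmd)
  return xcmd

print(command('JOB-3', 'ls ./data'))
# (echo JOB-3 $(date +"%Y-%h-%d--%T") "PID:"$$; ls ./data; sleep 0.1s)
```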

BIN
data/imagenet-results.png Executable file

Binary file not shown.


View File

@@ -19,14 +19,16 @@ else
fi
echo "CHECK-DATA-DIR DONE"
PID=$$
# config python
PYTHON_ENV=py36_pytorch1.0_env0.1.3.tar.gz
wget -e "http_proxy=cp01-sys-hic-gpu-02.cp01:8888" http://cp01-sys-hic-gpu-02.cp01/HGCP_DEMO/$PYTHON_ENV > screen.log 2>&1
tar xzf $PYTHON_ENV
echo "JOB-PWD : " `pwd`
echo "JOB-files : " `ls`
echo "JOB-PID : "${PID}
echo "JOB-PWD : "$(pwd)
echo "JOB-files : "$(ls)
echo "JOB-CUDA_VISIBLE_DEVICES: " ${CUDA_VISIBLE_DEVICES}
./env/bin/python --version
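The same bootstrap step, sketched in Python for readers who prefer it; the archive name, URL, and proxy are taken from the script above, while the use of urllib/tarfile is an assumption for illustration rather than how the job actually runs.
```
# Hedged Python sketch of the bootstrap above: download the packed conda env
# through the proxy, unpack it, and report the job context.
import os
import tarfile
import urllib.request

PYTHON_ENV = 'py36_pytorch1.0_env0.1.3.tar.gz'
URL = 'http://cp01-sys-hic-gpu-02.cp01/HGCP_DEMO/' + PYTHON_ENV

proxy = urllib.request.ProxyHandler({'http': 'http://cp01-sys-hic-gpu-02.cp01:8888'})
urllib.request.install_opener(urllib.request.build_opener(proxy))

urllib.request.urlretrieve(URL, PYTHON_ENV)         # counterpart of the wget call
with tarfile.open(PYTHON_ENV, 'r:gz') as archive:   # counterpart of `tar xzf`
  archive.extractall('.')

print('JOB-PWD :', os.getcwd())
print('JOB-PID :', os.getpid())
```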

View File

@@ -18,6 +18,7 @@ layers=$3
SAVED=./output/NAS-CNN/${arch}-${dataset}-C${channels}-L${layers}-E250
PY_C="./env/bin/python"
#PY_C="$CONDA_PYTHON_EXE"
if [ ! -f ${PY_C} ]; then
echo "Local Run with Python: "`which python`
@@ -27,12 +28,23 @@ else
echo "Unzip ILSVRC2012"
tar --version
#tar xf ./hadoop-data/ILSVRC2012.tar -C ${TORCH_HOME}
#${PY_C} ./data/decompress.py ./hadoop-data/ILSVRC2012-TAR ./data/data/ILSVRC2012 tar > ./data/data/get_imagenet.sh
${PY_C} ./data/decompress.py ./hadoop-data/ILSVRC2012-ZIP ./data/data/ILSVRC2012 zip > ./data/data/get_imagenet.sh
bash ./data/data/get_imagenet.sh
commands="./data/data/get_imagenet.sh"
${PY_C} ./data/decompress.py ./hadoop-data/ILSVRC2012-TAR ./data/data/ILSVRC2012 tar > ${commands}
#${PY_C} ./data/decompress.py ./hadoop-data/ILSVRC2012-ZIP ./data/data/ILSVRC2012 zip > ./data/data/get_imagenet.sh
#bash ./data/data/get_imagenet.sh
count=0
while read -r line; do
temp_file="./data/data/TEMP-${count}.sh"
echo "${line}" > ${temp_file}
bash ${temp_file}
count=$((count+1))
done < "${commands}"
echo "Unzip ILSVRC2012 done"
fi
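The `./data/decompress.py` script itself is not part of this diff; judging only from how it is invoked above (source directory, target directory, archive mode), it prints one shell command per archive, which the loop then runs and logs individually through its own temporary script. A hypothetical sketch of such a generator, with invented details, might look like this:
```
# Hypothetical sketch of a decompress-command generator (the real
# ./data/decompress.py is not shown in this commit): print one shell
# command per archive found in the source directory.
import os
import sys

def main(src_dir, dst_dir, mode):
  assert mode in ('tar', 'zip'), 'unsupported mode: {:}'.format(mode)
  for name in sorted(os.listdir(src_dir)):
    if not name.endswith('.' + mode):
      continue
    archive = os.path.join(src_dir, name)
    target = os.path.join(dst_dir, os.path.splitext(name)[0])
    if mode == 'tar':
      print('mkdir -p {:} && tar xf {:} -C {:}'.format(target, archive, target))
    else:
      print('mkdir -p {:} && unzip -q {:} -d {:}'.format(target, archive, target))

if __name__ == '__main__':
  main(sys.argv[1], sys.argv[2], sys.argv[3])
```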
exit 1
${PY_C} --version
${PY_C} ./exps-cnn/train_base.py \