# Neural Architecture Search Without Training
**IMPORTANT**: our codebase relies on the NAS-Bench-201 dataset, and as such it includes code cloned from [this repository](https://github.com/D-X-Y/AutoDL-Projects). We have left the copyright notices in the cloned code, including the name of the author of the open-source library that our code relies on.
This repository contains code for replicating our paper on NAS without training.
The datasets can also be downloaded by following the instructions in the [NAS-Bench-201 README](https://github.com/D-X-Y/NAS-Bench-201).
## Setup
1. Download the [datasets](https://drive.google.com/drive/folders/1L0Lzq8rWpZLPfiQGd6QR8q5xLV88emU7).
2. Download [NAS-Bench-201](https://drive.google.com/file/d/1OOfVPpt-lA4u2HJrXbgrRd42IbfvJMyE/view).
3. Install the requirements in a conda environment with `conda env create -f environment.yml`.
We also refer the reader to the instructions in the official [NAS-Bench-201 README](https://github.com/D-X-Y/NAS-Bench-201). A command-line sketch of these setup steps is given below.
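The following is a minimal sketch of the setup from the command line. It assumes the `gdown` utility is used to fetch the files from Google Drive; the Drive IDs come from the links above, and the output filename is a placeholder.

```
# Sketch only: install gdown (not part of environment.yml) to pull files from Google Drive.
pip install gdown
# Datasets (Google Drive folder linked in step 1); downloads into a local folder.
gdown --folder https://drive.google.com/drive/folders/1L0Lzq8rWpZLPfiQGd6QR8q5xLV88emU7
# NAS-Bench-201 benchmark file (step 2); the output filename here is a placeholder.
gdown https://drive.google.com/uc?id=1OOfVPpt-lA4u2HJrXbgrRd42IbfvJMyE -O NAS-Bench-201.pth
# Create and activate the conda environment (step 3).
conda env create -f environment.yml
conda activate nas-wot
```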
## Reproducing our results
To reproduce our results:
```
conda env create -f environment.yml
conda activate nas-wot
./reproduce.sh 3 # average accuracy over 3 runs
./reproduce.sh 500 # average accuracy over 500 runs (this will take longer)
```

To try different sample sizes, simply change the `--n_samples` argument, as shown in the sketch below.
Note that search times may vary from those reported depending on your hardware setup.
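As a hypothetical example, a run with a larger sample size might be invoked as follows; the script name `search.py` is an assumption, so check `reproduce.sh` for the exact command used in our experiments.

```
# Hypothetical invocation with a larger sample size; the script name is an
# assumption -- see reproduce.sh for the command actually used.
python search.py --n_samples 100
```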
The code is licensed under the MIT licence.
## Acknowledgements
This repository makes liberal use of code from the [AutoDL](https://github.com/D-X-Y/AutoDL-Projects) library. We also rely on [NAS-Bench-201](https://github.com/D-X-Y/NAS-Bench-201).