diff --git a/README.md b/README.md
index a23067c..d7b62a3 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-# Nueral Architecture Search
+# Neural Architecture Search (NAS)
This project contains the following neural architecture search algorithms, implemented in [PyTorch](http://pytorch.org). More NAS resources can be found in [Awesome-NAS](https://github.com/D-X-Y/Awesome-NAS).
@@ -52,7 +52,7 @@ args: `cifar10` indicates the dataset name, `ResNet56` indicates the basemodel n
-Highlight: we equip one-shot NAS with an architecture sampler and train network weights using uniformly sampling.
+Highlight: we equip one-shot NAS with an architecture sampler and train network weights using uniform sampling.
### Usage
@@ -72,10 +72,12 @@ Searching codes come soon!
-We proposed a gradient-based searching algorithm using differentiable architecture sampling (improving DARTS with Gumbel-softmax sampling).
+We proposed a Gradient-based search algorithm using Differentiable Architecture Sampling (GDAS). GDAS is based on DARTS and improves it with Gumbel-softmax sampling.
+Experiments on CIFAR-10, CIFAR-100, ImageNet, PTB, and WT2 are reported.
The old version is located at [`others/GDAS`](https://github.com/D-X-Y/NAS-Projects/tree/master/others/GDAS) and a PaddlePaddle implementation is located at [`others/paddlepaddle`](https://github.com/D-X-Y/NAS-Projects/tree/master/others/paddlepaddle).
+
### Usage
Please use the following scripts to train the GDAS-searched CNN on CIFAR-10, CIFAR-100, and ImageNet.
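
The Gumbel-softmax sampling step that GDAS builds on can be sketched in plain Python (an illustrative sketch with hypothetical names, not code from this repository; the real implementation operates on PyTorch tensors):

```python
import math
import random

def gumbel_softmax(logits, tau=1.0):
    """Draw a relaxed (differentiable in the tensor version) sample over
    candidate operations given architecture logits and temperature tau."""
    eps = 1e-20  # guard against log(0)
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    noise = [-math.log(-math.log(random.random() + eps) + eps) for _ in logits]
    # Perturb logits with the noise, scale by temperature, then softmax
    z = [(l + g) / tau for l, g in zip(logits, noise)]
    m = max(z)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

# Sample a distribution over three candidate ops and pick the argmax
probs = gumbel_softmax([2.0, 0.5, -1.0], tau=1.0)
chosen = probs.index(max(probs))
```

Lower temperatures make the sample closer to one-hot (a single discrete operation), while keeping the relaxation smooth enough for gradient-based search; in practice PyTorch provides `torch.nn.functional.gumbel_softmax` for this.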