# RAFT
This repository contains the source code for our paper:
[RAFT: Recurrent All-Pairs Field Transforms for Optical Flow](https://arxiv.org/pdf/2003.12039.pdf)<br/>
ECCV 2020 <br/>
Zachary Teed and Jia Deng<br/>
<img src="RAFT.png">
## Requirements
The code has been tested with PyTorch 1.5.1 and PyTorch Nightly. If you want to train with mixed precision, you will have to install the nightly build.
```Shell
conda create --name raft
conda activate raft
conda install pytorch torchvision cudatoolkit=10.1 -c pytorch-nightly
conda install matplotlib
conda install tensorboard
conda install scipy
conda install opencv
```
## Demos
Pretrained models can be downloaded by running
```Shell
./scripts/download_models.sh
```
or downloaded from [Google Drive](https://drive.google.com/file/d/10-BYgHqRNPGvmNUWr8razjb1xHu55pyA/view?usp=sharing).

You can demo a trained model on a sequence of frames:
2020-03-27 04:19:08 +01:00
```Shell
python demo.py --model=models/raft-things.pth --path=demo-frames
```
## Required Data
To evaluate/train RAFT, you will need to download the required datasets.
* [FlyingChairs](https://lmb.informatik.uni-freiburg.de/resources/datasets/FlyingChairs.en.html#flyingchairs)
* [FlyingThings3D](https://lmb.informatik.uni-freiburg.de/resources/datasets/SceneFlowDatasets.en.html)
* [Sintel](http://sintel.is.tue.mpg.de/)
* [KITTI](http://www.cvlibs.net/datasets/kitti/eval_scene_flow.php?benchmark=flow)
* [HD1K](http://hci-benchmark.iwr.uni-heidelberg.de/) (optional)

By default `datasets.py` searches for the datasets in the locations below. You can create symbolic links from the `datasets` folder to wherever the datasets were downloaded:
2020-03-27 04:19:08 +01:00
```Shell
├── datasets
    ├── Sintel
        ├── test
        ├── training
    ├── KITTI
        ├── testing
        ├── training
        ├── devkit
    ├── FlyingChairs_release
        ├── data
    ├── FlyingThings3D
        ├── frames_cleanpass
        ├── frames_finalpass
        ├── optical_flow
```
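For example, the links can be created like this. `DATA_ROOT` below is a placeholder for wherever you actually extracted the datasets; replace it with your own path:

```Shell
# DATA_ROOT is a placeholder -- point it at wherever the datasets were extracted.
DATA_ROOT=/tmp/raft_data_example
mkdir -p "$DATA_ROOT/Sintel" "$DATA_ROOT/KITTI" "$DATA_ROOT/FlyingChairs_release" "$DATA_ROOT/FlyingThings3D"

# Link each dataset into the datasets/ folder that datasets.py expects.
mkdir -p datasets
for d in Sintel KITTI FlyingChairs_release FlyingThings3D; do
    ln -sfn "$DATA_ROOT/$d" "datasets/$d"
done
ls -l datasets
```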
## Evaluation
You can evaluate a trained model using `evaluate.py`:
```Shell
python evaluate.py --model=models/raft-things.pth --dataset=sintel
```
## Training
Training code will be made available in the next few days.
<!-- We used the following training schedule in our paper (note: we use 2 GPUs for training). Training logs will be written to the `runs` directory, which can be visualized using tensorboard.
```Shell
./train_standard.sh
```
If you have an RTX GPU, training can be accelerated using mixed precision. You can expect similar results in this setting (1 GPU).
```Shell
./train_mixed.sh
``` -->