# RAFT
This repository contains the source code for our paper:

[RAFT: Recurrent All-Pairs Field Transforms for Optical Flow](https://arxiv.org/pdf/2003.12039.pdf)
ECCV 2020
Zachary Teed and Jia Deng
## Requirements
The code has been tested with PyTorch 1.5.1 and PyTorch Nightly. If you want to train with mixed precision, you will have to install the nightly build.
```Shell
conda create --name raft
conda activate raft
conda install pytorch torchvision cudatoolkit=10.1 -c pytorch-nightly
conda install matplotlib
conda install tensorboard
conda install scipy
conda install opencv
```

## Demos
Pretrained models can be downloaded by running
```Shell
./download_models.sh
```
or downloaded from [google drive](https://drive.google.com/file/d/10-BYgHqRNPGvmNUWr8razjb1xHu55pyA/view?usp=sharing)

You can demo a trained model on a sequence of frames
```Shell
python demo.py --model=models/raft-things.pth --path=demo-frames
```

## Required Data
To evaluate/train RAFT, you will need to download the required datasets.
* [FlyingChairs](https://lmb.informatik.uni-freiburg.de/resources/datasets/FlyingChairs.en.html#flyingchairs)
* [FlyingThings3D](https://lmb.informatik.uni-freiburg.de/resources/datasets/SceneFlowDatasets.en.html)
* [Sintel](http://sintel.is.tue.mpg.de/)
* [KITTI](http://www.cvlibs.net/datasets/kitti/eval_scene_flow.php?benchmark=flow)
* [HD1K](http://hci-benchmark.iwr.uni-heidelberg.de/) (optional)

By default `datasets.py` will search for the datasets in these locations. You can create symbolic links to wherever the datasets were downloaded in the `datasets` folder.

```Shell
├── datasets
    ├── Sintel
        ├── test
        ├── training
    ├── KITTI
        ├── testing
        ├── training
        ├── devkit
    ├── FlyingChairs_release
        ├── data
    ├── FlyingThings3D
        ├── frames_cleanpass
        ├── frames_finalpass
        ├── optical_flow
```

## Evaluation
You can evaluate a trained model using `evaluate.py`
```Shell
python evaluate.py --model=models/raft-things.pth --dataset=sintel
```

## Training
Training code will be made available in the next few days.
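
As a supplement to the Required Data section above, the expected layout in the `datasets` folder can be created with symbolic links. This is only a sketch; the `/path/to/...` locations are placeholders for wherever you actually downloaded each dataset.

```Shell
# Create the datasets folder and link each downloaded dataset into it.
# The /path/to/... targets are placeholders, not real paths.
mkdir -p datasets
ln -sfn /path/to/Sintel datasets/Sintel
ln -sfn /path/to/KITTI datasets/KITTI
ln -sfn /path/to/FlyingChairs_release datasets/FlyingChairs_release
ln -sfn /path/to/FlyingThings3D datasets/FlyingThings3D
```

`ln -sfn` replaces an existing link in place, so re-running the commands after moving a dataset is safe.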
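
Datasets such as Sintel and FlyingChairs store ground-truth flow in the Middlebury `.flo` format: a 4-byte magic number (202021.25), the width and height as 32-bit integers, then the (u, v) flow values as little-endian float32 in row-major order. A minimal reader sketch (the function name `read_flo` is ours, not part of this repository):

```python
import struct
import numpy as np

TAG_FLOAT = 202021.25  # magic number at the start of a Middlebury .flo file

def read_flo(path):
    """Read a Middlebury .flo file into an (H, W, 2) float32 array of (u, v) flow."""
    with open(path, "rb") as f:
        magic, = struct.unpack("<f", f.read(4))
        if abs(magic - TAG_FLOAT) > 1e-3:
            raise ValueError("invalid .flo file: bad magic number")
        width, = struct.unpack("<i", f.read(4))
        height, = struct.unpack("<i", f.read(4))
        data = np.frombuffer(f.read(4 * 2 * width * height), dtype="<f4")
    return data.reshape(height, width, 2)
```

Note that values larger than 1e9 in `.flo` files conventionally mark pixels with unknown flow and are typically masked out before computing error metrics.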