diff --git a/README.md b/README.md
index a80a934b995c2fbcd2ca5ecc3785880a73e998ed..7170ecaee84c4fa462de08e42682ad221bc929fd 100644
--- a/README.md
+++ b/README.md
@@ -1,19 +1,29 @@
-# ECML-PKDD2021 CNN Boundary Conditions for spatio-temporal dynamics
+# Effects of Boundary Conditions in Fully Convolutional Networks for Learning Spatio-temporal Dynamics (ECML-PKDD 2021)
 
-This repository is the supplementary material of the article **"Effects of boundary conditions in fully convolutional networks for learning spatio-temporal dynamics"**, submitted to the Applied Data Science Tracks at ECML-PKDD 2021. It contains the complete description of the neural network and of the computing environement, used code, implementation details and supplementary results.
+This repository contains the data, code and additional results of our [paper](https://arxiv.org/abs/2106.11160) accepted to the Applied Data Science Track at ECML-PKDD 2021. If you find this code useful in your research, please consider citing:
+
+    @misc{alguacil2021effects,
+          title={Effects of boundary conditions in fully convolutional networks for learning spatio-temporal dynamics},
+          author={Antonio Alguacil and Wagner Gonçalves Pinto and Michael Bauerheim and Marc C. Jacob and Stéphane Moreau},
+          year={2021},
+          eprint={2106.11160},
+          archivePrefix={arXiv},
+          primaryClass={cs.LG}
+    }
 
-Repository is organized as follows:
-- [images](./images): folder containing the figures shown in this page
-- [network](./network): implementation of the neural network, train and testing scripts
+The repository is organized as follows:
+- [data_generation](./data_generation): code for the generation of the database using Palabos
+- [network](./network): implementation of the neural network, train and testing scripts
 
-More details are availble in the subfolders
+You can browse the different subfolders to generate the data with an open-source CFD code, train the neural network, or
+test the method.
 
 Network architecture
 ------------
 
-Neural network is multi-scale (field dimensions of N, N/2 and N/4), composed by 17 two-dimensional convolution operations, for a total of 422,419 trainable parameters. ReLUs are used as activation function and replication padding is used to maintain layers size unchanged inside each scale.
+The employed neural network is a multi-scale architecture following [this paper](https://arxiv.org/abs/1511.05440). Three scales are used, with field dimensions N, N/2 and N/4, comprising 17 two-dimensional convolution operations for a total of 422,419 trainable parameters. ReLUs are used as activation functions, and replication padding is used to keep layer sizes unchanged within each scale.
 
 <p align="center">
   <img alt="Neural network architecture" src="./images/drawing_network_architecture.png" width="800"/>
diff --git a/icml21_sm.code-workspace b/icml21_sm.code-workspace
deleted file mode 100644
index 876a1499c09dc083612f43c53c0ae71b9c30c5b1..0000000000000000000000000000000000000000
--- a/icml21_sm.code-workspace
+++ /dev/null
@@ -1,8 +0,0 @@
-{
-    "folders": [
-        {
-            "path": "."
-        }
-    ],
-    "settings": {}
-}
\ No newline at end of file
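
For readers who want a concrete picture of the multi-scale idea described in the updated README text above, the following is a minimal, illustrative PyTorch sketch. It is an assumption-based example, not the repository's actual model (which lives in [network](./network)): the class name, layer counts and channel widths are invented for illustration and do not reproduce the paper's exact 17-convolution, 422,419-parameter network.

```python
# Illustrative sketch of a multi-scale fully convolutional network with
# replication padding, so spatial dimensions stay unchanged inside each scale.
# Not the repository's implementation; layer/channel choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch, kernel_size=3):
    # Replication padding keeps the field size constant after the convolution.
    pad = kernel_size // 2
    return nn.Sequential(
        nn.ReplicationPad2d(pad),
        nn.Conv2d(in_ch, out_ch, kernel_size),
        nn.ReLU(inplace=True),
    )


class MultiScaleCNN(nn.Module):
    """Processes the input at resolutions N/4, N/2 and N, coarse to fine."""

    def __init__(self, in_ch=4, out_ch=1, width=32):
        super().__init__()
        # Coarsest scale (N/4) sees only the downsampled input fields.
        self.scale4 = nn.Sequential(conv_block(in_ch, width), conv_block(width, out_ch))
        # Intermediate scale (N/2) also receives the upsampled coarse prediction.
        self.scale2 = nn.Sequential(conv_block(in_ch + out_ch, width), conv_block(width, out_ch))
        # Finest scale (N) refines the full-resolution prediction.
        self.scale1 = nn.Sequential(conv_block(in_ch + out_ch, width), conv_block(width, out_ch))

    def forward(self, x):
        x2 = F.avg_pool2d(x, 2)  # N/2 resolution
        x4 = F.avg_pool2d(x, 4)  # N/4 resolution
        y4 = self.scale4(x4)
        y2 = self.scale2(torch.cat([x2, F.interpolate(y4, scale_factor=2)], dim=1))
        y1 = self.scale1(torch.cat([x, F.interpolate(y2, scale_factor=2)], dim=1))
        return y1


if __name__ == "__main__":
    model = MultiScaleCNN()
    frames = torch.randn(1, 4, 64, 64)  # e.g. 4 past scalar fields on a 64x64 grid
    print(model(frames).shape)          # -> torch.Size([1, 1, 64, 64])
```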