In a previous article published on this blog, we introduced our work on convolutional neural networks.
We are now happy to announce the 0.4 release of this R&D project!
Up to the last released version (0.3.2), neural networks were built directly with the TensorFlow library. The major change in this release is the transition to the Keras library. Taking advantage of the simplicity of the Keras API, deeposlandia is now easier to use.
It is also more reliable, as it now ships as a standalone package with a set of unit tests.
The API has been simplified and behaves more intuitively, with a datagen command for generating ready-to-train datasets, a train command to train a model and run predictions after the training process, and an inference command that infers labels from a set of input images.
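To make the shape of this API concrete, here is a hypothetical sketch of how the three commands could be wired up with the standard library's argparse; the flag names mirror the examples in this post, but the actual deeposlandia entry points are separate scripts and may differ.

```python
import argparse

# Hypothetical CLI sketch: one parser with a subcommand per deeposlandia step.
# Flag names (-D, -s, -M, -e, -i) follow the examples shown in this post.
def build_parser():
    parser = argparse.ArgumentParser(prog="deeposlandia")
    subparsers = parser.add_subparsers(dest="command")

    datagen = subparsers.add_parser("datagen", help="generate a ready-to-train dataset")
    datagen.add_argument("-D", "--dataset", required=True)
    datagen.add_argument("-s", "--image-size", type=int, default=224)

    train = subparsers.add_parser("train", help="train a model")
    train.add_argument("-M", "--model", required=True)
    train.add_argument("-D", "--dataset", required=True)
    train.add_argument("-s", "--image-size", type=int, default=224)
    train.add_argument("-e", "--epochs", type=int, default=1)

    inference = subparsers.add_parser("inference", help="infer labels from input images")
    inference.add_argument("-M", "--model", required=True)
    inference.add_argument("-i", "--images", nargs="+")

    return parser

# Parse a training invocation like the one shown later in this post.
args = build_parser().parse_args(
    ["train", "-M", "semantic_segmentation", "-D", "mapillary", "-s", "224", "-e", "5"]
)
print(args.command, args.model, args.dataset, args.image_size, args.epochs)
```

Subcommands keep each step's options separate while sharing flags such as the dataset name and image size across steps.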
How to use the code?
The project is easy to clone and install on a system from scratch, using a virtual environment:
$ git clone https://github.com/Oslandia/deeposlandia
$ cd deeposlandia
$ virtualenv -p /usr/bin/python3 venv
$ source venv/bin/activate
(venv)$ pip install -r requirements-dev.txt
After downloading the Mapillary Vistas dataset from the Mapillary website and storing it in a data/mapillary/input/ directory, the following command builds a preprocessed version of the dataset:
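For reference, the expected input layout looks roughly like the following; this is a sketch based on how the Mapillary Vistas archive typically unpacks, and exact folder names may differ between dataset versions:

```
data/mapillary/input/
    training/
        images/
        labels/
    validation/
        images/
        labels/
    testing/
        images/
```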
python deeposlandia/datagen.py -D mapillary -s 224
The preprocessed dataset can then be used to train a model:
python deeposlandia/train.py -M semantic_segmentation -D mapillary -s 224 -e 5
This produces a trained model saved as a .h5 file on the file system. This backup can be reloaded to train the model for additional epochs and/or to predict image labels as follows:
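As a quick illustration of this save/restore cycle, here is a minimal sketch assuming a TensorFlow/Keras installation; the toy model and file name are placeholders, not deeposlandia's actual architecture.

```python
from tensorflow.keras import layers, models

# Build and save a toy model as an .h5 backup.
model = models.Sequential([layers.Input(shape=(4,)), layers.Dense(2)])
model.save("toy_model.h5")

# load_model rebuilds both the architecture and the weights, so the restored
# model can be trained further with fit() or used directly with predict().
restored = models.load_model("toy_model.h5")
print(restored.count_params())
```

Because the .h5 file stores the full model and not only the weights, resuming training does not require re-declaring the network in code.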
python deeposlandia/inference.py -M semantic_segmentation -D mapillary -i picture.png
These commands are highly configurable; do not hesitate to read the project README on GitHub.
If you have questions about using the code, please contact us by email (firstname.lastname@example.org) or through GitHub issues. And if you want to add some nice new features to the code, do not hesitate to contribute!