Tags: python, tensorflow, deep-learning, skflow, tflearn

Which higher-layer abstraction should I use for TensorFlow?


I am looking for higher-layer abstractions for my deep-learning project.

A few doubts lately:

  1. I am really confused about which is more actively maintained: tflearn (docs) or tensorflow.contrib.learn. The two projects are separate, yet both are actively contributed to on GitHub. I could not find out why people work this way: same goal, nearly the same name, but developed independently.

  2. As if that were not enough, we also have skflow. Why does this project exist separately? It aims to provide scikit-learn-like functionality for deep learning (just like tflearn does).

  3. More of these keep appearing. Which one should I choose, and which one will still be maintained in the future?
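For context on what "scikit-learn-like functionality" means here: all of these wrappers expose an estimator object with `fit`/`predict` methods that hide the TensorFlow graph entirely. The sketch below is a hypothetical, dependency-free illustration of that interface; the class name and internals are invented and stand in for the real training logic.

```python
# Hypothetical sketch of the scikit-learn-style estimator interface that
# skflow / tflearn / tensorflow.contrib.learn all mimic. No TensorFlow
# here; the internals are placeholders for graph building and training.
class DNNClassifier:
    def __init__(self, hidden_units):
        self.hidden_units = hidden_units  # e.g. [32, 16]
        self._trained = False

    def fit(self, X, y):
        # A real implementation would build and train a TF graph here.
        self._classes = sorted(set(y))
        self._trained = True
        return self  # chaining, as in scikit-learn

    def predict(self, X):
        if not self._trained:
            raise RuntimeError("call fit() before predict()")
        # Placeholder: always predicts the first class seen during fit.
        return [self._classes[0] for _ in X]


clf = DNNClassifier(hidden_units=[16])
clf.fit([[0.0], [1.0]], [0, 1])
print(clf.predict([[0.5]]))
```

The appeal of this style is that swapping a deep model for, say, a scikit-learn `LogisticRegression` requires no changes to the surrounding training code.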

Any ideas?

PS: I know this might get closed, but I would definitely like some answers first. If you vote to close, please drop a reason/hint/link in the comments.


Solution

  • What about Keras (https://keras.io/)? It is easy to use, yet you can do pretty much everything you want with it. It uses either Theano or TensorFlow as its backend. Kaggle contests are often solved using Keras (e.g. https://github.com/EdwardTyantov/ultrasound-nerve-segmentation).

    Edit:

    Because you did not specify Python, I would also recommend MatConvNet if you are looking for more abstraction.
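To show how little boilerplate Keras needs, here is a minimal sketch of defining and compiling a small feed-forward network. It assumes a TensorFlow installation where Keras is available as `tensorflow.keras`; the layer sizes, activations, and optimizer are arbitrary choices for illustration.

```python
# Minimal Keras sketch: a two-layer feed-forward binary classifier.
# Assumes TensorFlow is installed (Keras ships with it as tf.keras).
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(10,)),                      # 10 input features
    keras.layers.Dense(32, activation="relu"),     # hidden layer
    keras.layers.Dense(1, activation="sigmoid"),   # binary output
])
model.compile(optimizer="sgd", loss="binary_crossentropy")

# Training is then a single call:
#   model.fit(X, y, epochs=10, batch_size=32)
```

Compare this with raw TensorFlow, where you would manage placeholders, variables, the loss op, and the training loop yourself; that gap is exactly what all the wrappers in the question are trying to close.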