Is CUDA, cuBLAS or cuBLAS-XT the right place to start for machine learning?

Posted by Stefan R. Falk on Programmers, 2014-05-30

I am not sure if this is the right forum for this question, but it is surely not a question for Stack Overflow.

I am working on my bachelor thesis, for which I am implementing a so-called Echo State Network. This is basically an artificial neural network with a large reservoir of randomly initialized neurons and only a few input and output neurons, but I think we can skip the details.
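
Just for context, the standard reservoir state update looks roughly like the following NumPy sketch (this is the textbook formulation with made-up sizes, not my actual thesis code); it is dominated by dense matrix products, which is why the GPU question comes up at all:

    # Minimal NumPy sketch of the standard ESN reservoir update (illustrative only).
    import numpy as np

    n_in, n_res = 3, 500                           # arbitrary example sizes
    rng = np.random.RandomState(0)

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # input weights (fixed, random)
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # reservoir weights (fixed, random)
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # rescale spectral radius below 1

    x = np.zeros(n_res)                            # reservoir state
    u = rng.randn(n_in)                            # one input sample

    # one state update: x(t+1) = tanh(W_in * u(t+1) + W * x(t))
    x = np.tanh(np.dot(W_in, u) + np.dot(W, x))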

The thing is, there is a Python library called Theano which I am using for this implementation. It wraps the CUDA API and offers a quite comfortable way to access the power of an NVIDIA graphics card.
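
To illustrate what I mean, a minimal Theano sketch of a matrix product looks like this (sizes are arbitrary); the same code runs on the GPU when Theano is configured with device=gpu and floatX=float32:

    # Minimal Theano sketch: a symbolic matrix product compiled into a callable function.
    import numpy as np
    import theano
    import theano.tensor as T

    x = T.matrix('x')
    y = T.matrix('y')
    dot = theano.function([x, y], T.dot(x, y))   # compiled function; uses the GPU if enabled

    a = np.random.randn(500, 500).astype(theano.config.floatX)
    b = np.random.randn(500, 500).astype(theano.config.floatX)
    c = dot(a, b)                                # plain NumPy arrays in, NumPy array out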

CUDA ships with a sub-library called cuBLAS (Basic Linear Algebra Subprograms) for linear algebra operations, and since CUDA 6.0 there is also cuBLAS-XT, an extension which allows calculations to run on multiple graphics cards.
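
For comparison, here is a rough sketch of what a single cublasSgemm call looks like when accessed from Python via PyCUDA and the scikit-cuda wrappers (those wrapper packages are just an assumption for this example; the native cuBLAS API is C). The handle, leading-dimension and column-major bookkeeping is exactly the kind of complexity I mean:

    # Rough sketch of a direct cuBLAS matrix multiply via PyCUDA + scikit-cuda (assumed packages).
    import numpy as np
    import pycuda.autoinit                     # creates a CUDA context
    import pycuda.gpuarray as gpuarray
    import skcuda.cublas as cublas

    m, k, n = 500, 500, 500
    a = np.random.randn(m, k).astype(np.float32)
    b = np.random.randn(k, n).astype(np.float32)

    a_gpu = gpuarray.to_gpu(a)
    b_gpu = gpuarray.to_gpu(b)
    c_gpu = gpuarray.empty((m, n), np.float32)

    handle = cublas.cublasCreate()
    # cuBLAS is column-major; computing C^T = B^T * A^T yields the row-major C = A * B.
    cublas.cublasSgemm(handle, 'n', 'n', n, m, k,
                       1.0, b_gpu.gpudata, n,
                       a_gpu.gpudata, k,
                       0.0, c_gpu.gpudata, n)
    cublas.cublasDestroy(handle)

    c = c_gpu.get()                            # copy the result back to the host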

My question at this point is whether it makes sense to start using cuBLAS and/or cuBLAS-XT directly right now, since the API is quite complex, or whether I should rather wait for libraries that build on top of them (the way Theano builds on basic CUDA).


If you think this is the wrong place for this question, please tell me which one is. Thank you.
