Questions tagged [gpu]
12 questions
9
votes
5 answers
Significance of single precision floating point
I've been looking at some of the packages from the High Performance Computing task view dealing with GPU computations, and given that most GPUs seem to be an order of magnitude faster at single-precision arithmetic than at double precision, I was wondering:
why…

user603
- 21,225
- 3
- 71
- 135
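The speed gap this question refers to is easy to check empirically. A minimal sketch, assuming PyTorch and a CUDA-capable GPU are available (the matrix size and repetition count are illustrative):

```python
import time
import torch

def time_matmul(dtype, n=4096, reps=10):
    """Average time of an n x n matrix multiply on the GPU for a given dtype."""
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()              # finish allocation before timing
    start = time.time()
    for _ in range(reps):
        a @ b
    torch.cuda.synchronize()              # wait for all kernels to complete
    return (time.time() - start) / reps

if torch.cuda.is_available():
    t32 = time_matmul(torch.float32)
    t64 = time_matmul(torch.float64)
    print(f"float32: {t32:.4f}s  float64: {t64:.4f}s  slowdown: {t64 / t32:.1f}x")
```

On consumer cards the float64 slowdown is typically much larger than on data-center GPUs, which is the asymmetry the question asks about.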
7
votes
3 answers
Deep learning libraries/software with good tutorials or examples
I am looking for deep learning libraries that have very good tutorials or examples, since I would like to learn by working through examples or tutorials, even if these are not the best deep learning libraries. If they also run on GPUs, so much the better.

Open the way
- 225
- 1
- 8
7
votes
2 answers
Gputools for R: how to interpret the experimental procedure?
The following paper describes a parallel implementation of R routines on a graphics processing unit (GPU).
Buckner et al., "The gputools package enables GPU computing in R", Bioinformatics, Vol. 26, No. 1, 2010, pp. 134–135
In the experimental section,…

Douglas S. Stones
- 6,931
- 4
- 16
- 18
3
votes
0 answers
What are key differences between Theano (Python) and Torch (Lua) for deep learning?
Theano and Torch both support GPU computation. My question is whether Theano and Torch differ significantly in:
performance
ease of use (assuming one knows the programming language)
library support for optimization algorithms (e.g.…

Mark Horvath
- 795
- 1
- 8
- 9
2
votes
1 answer
Run multiple deep learning models on the same GPU
Can I use a single GPU to train a model and serve image predictions at the same time? I want to host a website for image predictions, so the GPU would be used for inference continuously. At the same time, I may want to use it to train some models. Is that doable? Or do I need…

Jane
- 41
- 1
- 4
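A common way to make this work is to keep each process from claiming all of the GPU's memory, so a serving process and a training job can share the card. A minimal sketch, assuming TensorFlow 2.x (other frameworks expose similar per-process memory controls):

```python
import tensorflow as tf

# Allocate GPU memory on demand instead of grabbing it all at startup,
# so another process (e.g. a training job) can use the rest of the card.
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)

# Alternative: cap this process at a fixed slice of the GPU (here 2 GB).
# tf.config.set_logical_device_configuration(
#     gpu, [tf.config.LogicalDeviceConfiguration(memory_limit=2048)])
```

Latency of the hosted predictions will still suffer while training runs, since both workloads compete for the same compute units.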
1
vote
1 answer
Why do neural networks outperform SVMs on image recognition if SVMs have lower generalization error?
Why do neural networks outperform SVMs if SVMs have lower generalization error according to Vapnik?
Is generalization error only useful in data-scarce environments?
Is it because neural networks are unfairly given an advantage by GPUs?
user294054
1
vote
0 answers
GPU for fully-connected network?
I'm interested in neural networks from a general machine learning and pattern recognition perspective, not so much from the perspective of image processing or NLP data. If I want to train a neural network with many layers, each of which has many…

JacKeown
- 628
- 1
- 6
- 17
1
vote
0 answers
Why is the number of neurons in hidden layers a power of two?
There is a statement in this Quora answer:
Layer depth is usually a power of 2 because it is convenient for the GPU.
Also, in fully connected layers the number of neurons in every hidden layer corresponds to a power of 2.
But, why is the power of 2…

Alina
- 915
- 2
- 10
- 21
0
votes
1 answer
Are notebooks with NVIDIA GPUs usually gaming notebooks?
I am planning to purchase a notebook for some machine learning development, and I would ideally like one with an NVIDIA GPU so that I can use CUDA as well.
Searching for the best notebooks for this, I find that (almost) all of them are gaming…

KansaiRobot
- 103
- 6
0
votes
1 answer
How to compare the performance of different SVMs and CNNs?
I'm a beginner in machine learning and I'm having trouble finding the best way to compare the performance (accuracy) of different SVMs and CNNs (in a Jupyter Notebook).
I train the CNNs in Google Colab with a GPU.
So far I've tried to make the models…

Code Now
- 37
- 4
0
votes
2 answers
What is causing the GPU out-of-memory (OOM) error for my sequence-to-sequence network with LSTMs?
I'm currently attempting to build a Seq2Seq chatbot with LSTMs. The data I use is from the Cornell Movie-Dialogs Corpus.
Here's the link to my code on GitHub, I would appreciate it if you took a look at it: Seq2Seq Chatbot
You need to change the path…

narutatsuri
- 232
- 2
- 8
-1
votes
2 answers
What is used to parallelize machine learning (Random Forest, SVM, etc.) training and to accelerate inference?
I would like to know what is used to parallelize ML model training and to optimize inference.
DEEP LEARNING
When using deep learning, a GPU is usually used for training and inference (deployment). This is understandable, as GPUs have thousands of cores and…

Aizzaac
- 989
- 2
- 11
- 21
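For classical models such as the random forest mentioned in the question, parallelism usually means spreading work across CPU cores rather than using a GPU. A minimal scikit-learn sketch (the synthetic dataset and hyperparameters are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

# n_jobs=-1 builds the individual trees on all available CPU cores in parallel;
# the same flag also parallelizes predict() at inference time.
clf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```

Libraries such as cuML or ThunderSVM offer GPU-accelerated versions of these estimators, but plain scikit-learn parallelism is CPU-based.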