graph algorithms on GPU
Posted by scatman on Stack Overflow, 2010-03-12
Current GPU threads are limited in several ways: memory limits, restrictions on data structures, no recursion, and so on.
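For context, the no-recursion limitation is usually worked around with an explicit stack. Here is a minimal sketch (all names hypothetical, graph assumed in CSR form, `MAX_DEPTH` an assumed cap on stack size) of a depth-first traversal inside a CUDA kernel with the recursion replaced by a manual stack:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

#define MAX_DEPTH 64  // assumed cap on the manual stack (graph-dependent)

// Iterative DFS: an explicit stack replaces the recursion GPUs historically lack.
__global__ void dfsKernel(const int *rowPtr, const int *colIdx,
                          int *visited, int root) {
    if (threadIdx.x != 0 || blockIdx.x != 0) return;  // one traversal, for clarity

    int stack[MAX_DEPTH];
    int top = 0;
    stack[top++] = root;

    while (top > 0) {
        int v = stack[--top];
        if (visited[v]) continue;
        visited[v] = 1;
        // push unvisited neighbours instead of recursing into them
        for (int e = rowPtr[v]; e < rowPtr[v + 1]; ++e)
            if (!visited[colIdx[e]] && top < MAX_DEPTH)
                stack[top++] = colIdx[e];
    }
}

int main() {
    // toy graph 0-1, 0-2, 1-3 in CSR form (undirected, both directions listed)
    int h_rowPtr[] = {0, 2, 4, 5, 6};
    int h_colIdx[] = {1, 2, 0, 3, 0, 1};
    const int n = 4;
    int *d_rowPtr, *d_colIdx, *d_visited;
    cudaMalloc(&d_rowPtr, sizeof(h_rowPtr));
    cudaMalloc(&d_colIdx, sizeof(h_colIdx));
    cudaMalloc(&d_visited, n * sizeof(int));
    cudaMemcpy(d_rowPtr, h_rowPtr, sizeof(h_rowPtr), cudaMemcpyHostToDevice);
    cudaMemcpy(d_colIdx, h_colIdx, sizeof(h_colIdx), cudaMemcpyHostToDevice);
    cudaMemset(d_visited, 0, n * sizeof(int));

    dfsKernel<<<1, 1>>>(d_rowPtr, d_colIdx, d_visited, 0);

    int h_visited[n];
    cudaMemcpy(h_visited, d_visited, n * sizeof(int), cudaMemcpyDeviceToHost);
    for (int i = 0; i < n; ++i)
        printf("vertex %d visited: %d\n", i, h_visited[i]);
    return 0;
}
```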
Do you think it would be feasible to implement graph theory problems on a GPU? For example: vertex cover, dominating set, independent set, max clique, ...?
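For vertex cover specifically, a data-parallel heuristic does seem to map naturally onto the GPU model. A minimal sketch, assuming an edge-list representation (`edgeU`/`edgeV` and all names are hypothetical): each thread inspects one edge and, if it is still uncovered, claims both endpoints. Races can only make the cover larger, never invalid, so this is not the classic sequential 2-approximation, just a plausible one-pass parallel variant:

```cuda
// One-pass parallel heuristic: each thread handles one edge and claims both
// endpoints if the edge is still uncovered. Under any interleaving, every
// edge ends up covered, so the result is always a valid vertex cover.
__global__ void coverKernel(const int *edgeU, const int *edgeV,
                            int *inCover, int numEdges) {
    int e = blockIdx.x * blockDim.x + threadIdx.x;
    if (e >= numEdges) return;
    int u = edgeU[e], v = edgeV[e];
    if (inCover[u] == 0 && inCover[v] == 0) {
        atomicExch(&inCover[u], 1);  // atomics keep concurrent writes well-defined
        atomicExch(&inCover[v], 1);
    }
}
```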
Is it also feasible to run branch-and-bound algorithms on GPUs? What about recursive backtracking?
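Branch-and-bound can at least be expressed without recursion by keeping subproblem frames on an explicit per-thread stack. A hedged sketch for max clique on small graphs (adjacency rows stored as 32-bit masks, so n <= 31 is assumed; `STACK_CAP` and all names are hypothetical): each thread roots the search at a different vertex and prunes with the standard size-plus-remaining-candidates bound:

```cuda
#define STACK_CAP 64  // assumed cap; a real version needs a safe bound or spill

struct Frame { unsigned cand; int size; };  // candidate set, current clique size

// *best must be initialized to 0 on the host before launch.
__global__ void maxCliqueKernel(const unsigned *adj, int n, int *best) {
    int v = blockIdx.x * blockDim.x + threadIdx.x;
    if (v >= n) return;

    Frame stack[STACK_CAP];
    int top = 0;
    // thread v enumerates cliques whose lowest-numbered vertex is v
    unsigned higher = ~((1u << (v + 1)) - 1);  // vertices above v (needs n <= 31)
    stack[top++] = Frame{ adj[v] & higher, 1 };

    while (top > 0) {
        Frame f = stack[--top];
        atomicMax(best, f.size);  // record the best clique size found so far
        // bound: even adding every remaining candidate cannot beat *best
        if (f.size + __popc(f.cand) <= *best) continue;  // stale reads only weaken pruning
        for (unsigned c = f.cand; c; c &= c - 1) {
            int w = __ffs((int)c) - 1;
            unsigned aboveW = ~((1u << (w + 1)) - 1);
            if (top < STACK_CAP)  // sketch only: branches past the cap are dropped
                stack[top++] = Frame{ f.cand & adj[w] & aboveW, f.size + 1 };
        }
    }
}
```

The same frame-on-a-stack pattern would seem to cover recursive backtracking in general; the open question is less expressibility than whether the irregular, divergent search tree keeps the GPU busy.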