Message boards : Graphics cards (GPUs) : GPU grid??! Which GPU is supported?
I saw \"GPU Grid - PS3 Grid\" in BOINC project list. | |
ID: 1124
I saw \"GPU Grid - PS3 Grid\" in BOINC project list. We will support NVIDIA GPUs, but we cannot say more now. g | |
ID: 1127
I saw \"GPU Grid - PS3 Grid\" in BOINC project list. WOOT! Please keep us updated. Got a whole bunch of NVIDIA cards here just itching to crunch. :D ____________ ![]() | |
ID: 1154
Any news?
ID: 1188
> Any news?
For Linux 64-bit:
1) Install the latest NVIDIA driver from http://www.nvidia.com/object/cuda_get.html
2) Download this BOINC client: http://boinc.berkeley.edu/dl/boinc_ubuntu_6.3.5_x86_64-pc-linux-gnu.sh
3) Attach to ps3grid.net
Supported NVIDIA GPUs: http://www.nvidia.com/object/cuda_learn_products.html
GDF
ID: 1210
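For anyone who wants to sanity-check step 1 before attaching, here is a minimal sketch, assuming Linux and the /proc/driver/nvidia/version file that NVIDIA's kernel module exposes:

```python
# Minimal check that the NVIDIA kernel driver from step 1 is loaded.
# Assumes Linux; the /proc path is specific to NVIDIA's driver.
from pathlib import Path

version_file = Path("/proc/driver/nvidia/version")
if version_file.exists():
    # The first line names the kernel module and driver version.
    print(version_file.read_text().splitlines()[0])
else:
    print("NVIDIA driver not loaded - rerun the installer from step 1")
```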
All in and working, test underway.
ID: 1211
> Hi
Hi, do you have a file called libcudart64.so in your BOINC dir?
Thanks, gdf
ID: 1212
> Do you have a file called libcudart64.so in your BOINC dir?
You're too quick :D You're quite right, I failed to copy libcudart64.so into my BOINC folder, but realised after I posted. It's up and running now; I'll post an update when I have some results. Nice work.
ID: 1213
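For anyone hitting the same problem, a quick way to verify the fix is to try loading the library from the BOINC directory with ctypes. A minimal sketch; the ~/BOINC path is only an assumption, so point it at your actual client directory:

```python
# Try to load libcudart64.so from the BOINC directory, as discussed above.
import ctypes
import os

boinc_dir = os.path.expanduser("~/BOINC")  # hypothetical install location
lib_path = os.path.join(boinc_dir, "libcudart64.so")
try:
    ctypes.CDLL(lib_path)  # raises OSError if the file is missing or broken
    print("libcudart64.so loaded OK")
except OSError as err:
    print(f"could not load {lib_path}: {err}")
```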
What about Windows support?
ID: 1214
> What about Windows support?
We will support Windows, but I don't know when. It could be in a month or later.
gdf
ID: 1215
There are 7 users with failed workunits at startup.
ID: 1217
Hi,
ID: 1218
> Hi,
Hi, I have sent you a private message asking for info on your machine.
gdf
ID: 1219
Answered
ID: 1220
Hi
ID: 1222
Hello, can you tell us which GPU and driver you use, for comparison? Thanks
ID: 1223
I would say that any GPU with over 100 cores is good. We have an 8800 GT and a 9800 GX2. We want to support multiple GPUs not in SLI mode, but I have not tested that yet. A machine with a quad-core CPU and two 9800 GX2s would crunch more or less 20,000 credits/day.
ID: 1225
> There are 7 users with failed workunits at startup.
Users with an old version of the client will not receive work now. They will have to update to version 6.3.5.
gdf
ID: 1226
> Can you tell us which GPU and driver you use, for comparison?
Hi,
Fedora 7 64-bit
Asus 8800GS
NVIDIA-Linux-x86_64-169.09-pkg2.run
ID: 1227
Hi,
The 4th WU has just completed after 4 hours but failed validation, with a "process exited with code 1" message in stderr out (same as the 2nd and 3rd WUs). I've set No New Work for the moment.
ID: 1228
Thanks. I tested two drivers, 173.14 and 177.13, with no difference. I will test 169.09 ... maybe.
ID: 1229
> I tested two drivers, 173.14 and 177.13, with no difference.
I initially had 173.14 installed (from here) but wasn't sure whether that was a CUDA driver or not, so I followed the http://www.nvidia.com/object/cuda_get.html link and it suggested the 169.09 package for my setup.
ID: 1230
So what do you guys think? It appears my older NVIDIA cards don't support CUDA. :(
ID: 1235
The 9800 GTX shouldn't be a bad cruncher; you can see the runtimes per WU for my card in the other thread you started...
ID: 1237
> The 9800 GTX shouldn't be a bad cruncher; you can see the runtimes per WU for my card in the other thread you started...
A way to know which card is better for the money: compute the peak Gflops of the card.
Peak Gflops = (shader clock) x (number of stream processors) x (3 flops)
The higher, the better.
GDF
ID: 1239
It's x2 flops, I think.
ID: 1241
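To make the formula concrete, here is a small sketch covering both the x3 and x2 conventions from the two posts above. The 8800 GT numbers (112 stream processors at a 1500 MHz shader clock) are NVIDIA's published specs; the x3 count includes the extra MUL alongside the MAD, while the x2 count is MAD only:

```python
# Peak single-precision Gflops per GDF's formula above.
def peak_gflops(shader_clock_mhz, stream_processors, flops_per_clock):
    return shader_clock_mhz * stream_processors * flops_per_clock / 1000.0

# GeForce 8800 GT: 1500 MHz shader clock, 112 stream processors.
print(peak_gflops(1500, 112, 3))  # 504.0 - NVIDIA's quoted figure (MAD + MUL)
print(peak_gflops(1500, 112, 2))  # 336.0 - counting the MAD only
```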
Thanks guys!
ID: 1242
> Thanks guys!
Actually the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor... I really can't say how important the CPU speed is for the GPU application, but earlier tests have shown that if the app uses only 50% of a CPU core, the WUs were a good bit slower. It would be interesting to see some comparisons with the same graphics cards but other CPUs in the thread you started...
ID: 1244
> Thanks guys!
The CPU is not important at all. It appears to be using 100% of resources just because it is polling, waiting for the accelerated kernel to finish. So any CPU should do. We have built a machine with 6 GPU cores by putting together 3 GeForce 9800 GX2s, a 1200 W power supply, an NVIDIA 780i motherboard with 3 PCI-E 16x slots and a quad-core CPU. We are looking into building another one with the GeForce 280.
GDF
ID: 1245
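To see why a polling loop shows up as 100% CPU even though the GPU does the work, here is a toy sketch (an analogy only, not the project's actual code): the spin loop pegs one core until the stand-in "kernel" signals completion, whereas a blocking wait would leave the core idle:

```python
# Toy model of spin-polling: a timer thread stands in for a GPU kernel.
import threading

done = threading.Event()
threading.Timer(2.0, done.set).start()  # "kernel" finishes after 2 seconds

# Busy-wait: this loop keeps one CPU core at 100% while it polls.
while not done.is_set():
    pass

# A blocking wait, done.wait(), would yield the core instead.
print("kernel finished")
```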
Wow! This sounds like a nice machine! :D Want to give it away to me as a present? ;-)
ID: 1255
> Actually the application also uses one core of the CPU at 100%, plus the GPU as a coprocessor...
Please excuse my ignorance, but does that mean the remaining cores are left free to do 'traditional' BOINC projects?
ID: 1258
Here's to hoping this can get ported to Windows soon. I'd love to donate my new 8800 GT to the cause. :)
ID: 1259
> Does that mean the remaining cores are left free to do 'traditional' BOINC projects?
Yes, they are.
gdf
ID: 1260
> Here's to hoping this can get ported to Windows soon. I'd love to donate my new 8800 GT to the cause. :)
No, it should not.
gdf
ID: 1261
> We have built a machine with 6 GPU cores by putting together 3 GeForce 9800 GX2s, a 1200 W power supply, an NVIDIA 780i motherboard with 3 PCI-E 16x slots and a quad-core CPU.
Could you clarify this? Does a GPU task use only 1 CPU core and all available GPU cores (in this case 6), or does each GPU task use 1 CPU core and 1 graphics card (2 GPU cores), or just 1 GPU core? For your rig above, how many GPU tasks could run at the same time to use all cores, how many CPU cores would that use, and how many CPU cores would be left for other BOINC projects?
ID: 1267
> We have built a machine with 6 GPU cores by putting together 3 GeForce 9800 GX2s, a 1200 W power supply, an NVIDIA 780i motherboard with 3 PCI-E 16x slots and a quad-core CPU.
This is tunable. At the moment we are using 1 CPU core for each GPU core. Regarding the machine, we are using it mainly outside BOINC.
gdf
ID: 1270
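Assuming the 1-CPU-core-per-GPU-core figure above, the arithmetic for the 3x 9800 GX2 box comes out as follows; the helper function is hypothetical, just to make the numbers explicit:

```python
# How many GPU tasks fit, and what is left for other BOINC projects,
# given 1 CPU core per GPU core (as stated above).
def task_split(cpu_cores, gpu_cores, cpu_per_gpu_task=1):
    gpu_tasks = min(gpu_cores, cpu_cores // cpu_per_gpu_task)
    free_cpus = cpu_cores - gpu_tasks * cpu_per_gpu_task
    return gpu_tasks, free_cpus

# Quad-core CPU with 3x 9800 GX2 (6 GPU cores):
print(task_split(4, 6))  # (4, 0): only 4 of 6 GPU cores busy, no CPU left over
```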
> Any news?
Yes sir :-)) Thanks for your efforts ;-)
ID: 1272