World Community Grid Forums
Thread Status: Active | Total posts in this thread: 200
Grumpy Swede
Master Cruncher | Svíþjóð | Joined: Apr 10, 2020 | Post Count: 2182 | Status: Offline
Thanks Uplinger!!
When Beta 7.27 comes online, I'll test it with my so-far-failing GTX 660M. With some luck it failed due to the DP issue. My other GPUs, a GTX 980 and an iGPU HD 4600, don't seem to have any problems at all.
nanoprobe
Master Cruncher | Classified | Joined: Aug 29, 2008 | Post Count: 2998 | Status: Offline
> When I checked TechPowerUp GPU-Z and placed my cursor over OpenCL, it indicates ver. 1.2 CUDA. Yet the GPU fails the Beta tests. So go figure... Clive

OpenCL and CUDA are not the same thing. Your driver may support OpenCL 1.2, but the card may not, which means the tasks won't run.

In 1969 I took an oath to defend and protect the U.S. Constitution against all enemies, both foreign and domestic. There was no expiration date.
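nanoprobe's point is that the OpenCL version advertised by the driver (the platform) and the version a specific device actually implements are separate things, and only the device's own capability decides whether tasks can run. A minimal sketch of that logic, with made-up version strings rather than real GPU-Z output:

```python
# Toy illustration: the OpenCL version reported by the *driver* (platform)
# is not the same as what a given *device* supports. The version strings
# here are illustrative examples, not queried from real hardware.

def can_run_tasks(platform_opencl, device_opencl, required="1.2"):
    """Tasks can run only if the device itself meets the required
    OpenCL version; the platform/driver version alone is not enough."""
    def ver(s):
        return tuple(int(p) for p in s.split("."))
    return ver(device_opencl) >= ver(required)

# Driver advertises OpenCL 1.2, but this older card only implements 1.1:
print(can_run_tasks(platform_opencl="1.2", device_opencl="1.1"))  # False
print(can_run_tasks(platform_opencl="1.2", device_opencl="1.2"))  # True
```

In a real query (e.g. via clinfo or an OpenCL binding) the two values correspond to `CL_PLATFORM_VERSION` and `CL_DEVICE_VERSION` respectively, and it is the device value that matters here.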
goben_2003
Advanced Cruncher | Joined: Jun 16, 2006 | Post Count: 145 | Status: Offline
I have a question about runtime credit for GPU units. Is it just the CPU time spent on each unit? That is what it appears to be from observing the beta units my computers have crunched.
Andrew80431
Cruncher | Joined: Nov 25, 2005 | Post Count: 36 | Status: Offline
> @Jorlin Out of curiosity: has anyone run this on an NVIDIA Quadro P400? On my laptop the WUs ran on a P600 without problems. It's the same architecture as the P400, only slightly faster... Thanks :)
>
> Anything to compare the P600 to in terms of performance? Is it worth the "loss" of CPU time?

I would not go for the P400 or P600. Compared to other NVIDIA cards in that price range, their only advantage is their 3 and 4 DisplayPort outputs, respectively. In terms of 3D performance they are low-end. My GT740 is over six years old and is probably 1.5 times as fast as a P600. I'd suggest waiting until GTX 1650s are at ~$130-150 again; getting one of those is probably a lot better if you are serious about donating GPU time to the project. Just my 2 cents...
Jorlin
Advanced Cruncher | Deutschland | Joined: Jan 22, 2020 | Post Count: 89 | Status: Offline
> I would not go for the P400, or P600. [...] Getting one of those is probably a lot better if you are serious about donating GPU time to the project. Just my 2 cents...

Thanks again.
Vester
Senior Cruncher | USA | Joined: Nov 18, 2004 | Post Count: 325 | Status: Offline
Bryn Mawr
Senior Cruncher | Joined: Dec 26, 2018 | Post Count: 345 | Status: Offline
Richard Haselgrove
Senior Cruncher | United Kingdom | Joined: Feb 19, 2021 | Post Count: 360 | Status: Offline
The number of shaders (the first number in the final column) is a good indication. It depends whether the app can use them all, though.
Bryn Mawr
Senior Cruncher | Joined: Dec 26, 2018 | Post Count: 345 | Status: Offline
> Number of shaders (first number in final column) is a good indication. Depends whether the app can use them all, though.

Thanks, when it comes to GPUs I'm a newbie. Does the clock speed act as a secondary indicator?

[Edit 2 times, last edit by Bryn Mawr at Mar 11, 2021 12:56:39 PM]
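Shader count and clock speed can be combined into a rough rule of thumb: peak single-precision throughput is roughly shaders × clock × 2 FLOPs per cycle (one fused multiply-add per core). A sketch, using approximate published spec-sheet numbers for two of the cards mentioned in this thread; real project throughput will be lower, since (as noted above) the app may not keep all shaders busy:

```python
# Rough rule of thumb for comparing GPUs: peak FP32 throughput scales with
# shader (CUDA core) count times clock speed, at 2 FLOPs per core per cycle.
# The core counts and boost clocks below are approximate spec-sheet values.

def peak_gflops(shaders, boost_clock_mhz, flops_per_cycle=2):
    """Theoretical peak single-precision GFLOPS."""
    return shaders * boost_clock_mhz * flops_per_cycle / 1000

p600 = peak_gflops(shaders=384, boost_clock_mhz=1557)     # Quadro P600, ~1.2 TFLOPS
gtx1650 = peak_gflops(shaders=896, boost_clock_mhz=1665)  # GTX 1650, ~3.0 TFLOPS
print(round(p600), round(gtx1650))  # 1196 2984
```

So yes, the clock acts as a secondary multiplier on the shader count, which is why the GTX 1650 comes out roughly 2.5× ahead of the P600 on paper.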
widdershins
Veteran Cruncher | Scotland | Joined: Apr 30, 2007 | Post Count: 674 | Status: Offline
OK, I've been doing a little more digging into my GT730 card. Nvidia's marketing department likes to confuse things, it seems. They and the OEMs who make their own cards have apparently sold cards called GT730 at around the same time with three different processors, two different bus widths, and two different types of memory. They also seem to mix and match those pretty much as they please, so there were probably a dozen or so different permutations on sale at around the same time. That of course excludes the extra variations due to memory size.

That's just one (supposed) card! I imagine they've been doing the same with each and every Nvidia card model. I've found that, according to Wikipedia (I know, not the most reliable source, but the best I could find), my particular card is supposedly capable of OpenCL 1.2 and also of double-precision calculations, based on the GPU info about processor type etc. Not sure what to make of that, given the suspicion that some double-precision items were the main culprit.
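Rather than trusting spec sheets or Wikipedia, one way to see whether a particular card actually exposes double precision is to check its OpenCL device extension list (as reported by a tool such as clinfo) for `cl_khr_fp64`. A toy sketch of that check, using a made-up extension string as an example:

```python
# Check an OpenCL device's extension list for double-precision support.
# A device advertises FP64 via the "cl_khr_fp64" extension. The strings
# below are made-up examples of the space-separated lists that tools
# like clinfo report, not output from any real card.

def supports_fp64(extensions: str) -> bool:
    """True if the space-separated extension list includes cl_khr_fp64."""
    return "cl_khr_fp64" in extensions.split()

example = "cl_khr_global_int32_base_atomics cl_khr_fp64 cl_khr_icd"
print(supports_fp64(example))  # True
print(supports_fp64("cl_khr_icd cl_nv_compiler_options"))  # False
```

If the extension is missing from the device's list, DP work units would be expected to fail on that card regardless of what the driver's OpenCL version suggests.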