World Community Grid Forums
Thread Status: Active | Total posts in this thread: 613
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
> I can't get BOINC 6.3.21 installed. It says "Error reading setup initialization file".

I've had that happen when the download came from GPUGrid; only part of the file may have downloaded. Check your file size: it should be 7,695 KB for that build. If not, delete it, clear your temporary internet files, and download the version you need (Vista 32 or 64 bit) directly from Berkeley: http://boinc.berkeley.edu/dl/ Fun, eh?

Edit: I installed the 32-bit version and it seems to pick up my old work units. Does this look right? It downloaded a ton of GPUGrid work units but it only shows 2? I hate beta software... Also, why does it do a "Suspending computation - initial delay" when I start it up?

> Yes, there are several files downloaded for one wu. It will download 4 wu's initially and get one more as each one completes. Do you have 4 cores and 1 GPU running? I don't know about the suspending computation... post the message and maybe I can help. Yes, Vincent, all GPU crunching is beta... 6.4.1 is alpha...

Ah, this is confusing. It had better not burn out my video card. When I start the BOINC client it suspends for about 2 minutes before starting. Is there a way to turn that option off? And it's fine having a 50% resource share on WCG and GPUGrid, right?

> Yes, it is confusing. No, you can't turn it off (I think it will be in all future releases, to let your startup deck complete before BOINC eats the machine).

My 9800GTX is still not working. I'm going to try completely uninstalling/reinstalling 6.3.21 today if I have time. ~5 hours per work unit... sounds about right? My card is calling to be overclocked, but then it'll put out more heat, more heat means louder fans, and louder fans will annoy me.
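The "check your file size" advice above can be automated. A minimal sketch, assuming 1 KB = 1024 bytes and a 16 KB tolerance of my own choosing (the 7,695 KB figure is the size quoted in the thread for this build; the function and path names are hypothetical):

```python
import os

# 7,695 KB is the installer size quoted in the thread for BOINC 6.3.21,
# taken here as 1 KB = 1024 bytes.
EXPECTED_KB = 7695

def download_looks_complete(path, expected_kb=EXPECTED_KB, tolerance_kb=16):
    """Return True if the file on disk is roughly the expected size.

    A truncated download (the usual cause of "Error reading setup
    initialization file") will come up well short of the expected size.
    """
    if not os.path.exists(path):
        return False
    actual_kb = os.path.getsize(path) / 1024
    return abs(actual_kb - expected_kb) <= tolerance_kb
```

If the check fails, delete the file and re-download from the Berkeley page linked above.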
Dataman
Ace Cruncher | Joined: Nov 16, 2004 | Post Count: 4865 | Status: Offline
> ~5 hours per work unit.. Sounds about right? My card is calling to be overclocked but then it'll put out more heat and more heat means louder fans and louder fans means I'll annoy me.

Wow ... that is fast. It takes at least 7 hours on the 9800 and 10+ on the 8800. I have the 9800 up and running, but it has not completed a wu yet, so I don't know if the problem has been resolved. It may be that the wu's are bad, as there have been some other posts with the same error message I get:

<core_client_version>6.3.21</core_client_version>
<![CDATA[
<message>
Incorrect function. (0x1) - exit code 1 (0x1)
</message>
<stderr_txt>
# Using CUDA device 0
# Device 0: "GeForce 9800 GTX/9800 GTX+"
# Clock rate: 1944000 kilohertz
# Number of multiprocessors: 16
# Number of cores: 128
MDIO ERROR: cannot open file "restart.coor"
</stderr_txt>
]]>
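For what it's worth, the "Clock rate: 1944000 kilohertz" line in that log is the shader clock in kHz, which converts to the "1.95 GHz" figure discussed later in the thread. A small sketch of that extraction and conversion (the regex and function name are mine, not part of BOINC or the GPUGrid application):

```python
import re

# The stderr_txt from the failed GPUGrid work unit, as posted above.
STDERR_LOG = """\
# Using CUDA device 0
# Device 0: "GeForce 9800 GTX/9800 GTX+"
# Clock rate: 1944000 kilohertz
# Number of multiprocessors: 16
# Number of cores: 128
MDIO ERROR: cannot open file "restart.coor"
"""

def shader_clock_ghz(log_text):
    """Extract the reported clock rate and convert kilohertz to gigahertz."""
    m = re.search(r"Clock rate:\s*(\d+)\s*kilohertz", log_text)
    if m is None:
        return None
    # 1,944,000 kHz / 1,000,000 = 1.944 GHz
    return int(m.group(1)) / 1_000_000
```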
Dataman
Ace Cruncher | Joined: Nov 16, 2004 | Post Count: 4865 | Status: Offline
> Wow ... that is fast. It takes at least 7 hours on the 9800 and 10+ on the 8800. [...] It may be the wu are bad as there have been some other posts with the same error message I get. [...]

My GPU problem continues. @ Blizzie/Jonathan: the admin @ GPUGrid gave me this information: "The standard clock for 9800GTX+ is 1.83 GHz; for 9800GTX it's 1.69 GHz. Your card runs at 1.95 GHz and probably ~750 MHz core, so it's way above spec, which may cause problems." The machine is not OC'ed and the card is just as it was out of the box. Is this true? If so, can you tell me how to set it back to 1.69 GHz? I have no idea.
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
> [...] The machine is not OC'ed and the card is just as it was out of the box. Is this true? If so, can you tell me how to set it back to 1.69 GHz? I have no idea.

http://www.techpowerup.com/gpuz/ You can use this to see the core/memory/shader clocks of your card.
Dataman
Ace Cruncher | Joined: Nov 16, 2004 | Post Count: 4865 | Status: Offline
Thanks (neat utility). For both the GPU clock and the Default clock it shows 792 MHz, Memory 1152 MHz, and Shader 1944 MHz. ???
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
> Thanks (neat utility). For both the GPU clock and the Default clock it shows 792 MHz, Memory 1152 MHz, and Shader 1944 MHz. ???

These are the stock speeds:

9800 GTX:  Graphics 675 MHz | Processor (shader) 1688 MHz | Memory 1100 MHz
9800 GTX+: Graphics 738 MHz | Processor (shader) 1836 MHz | Memory 1100 MHz

Your clock speeds are a little weird; even the Superclocked versions don't run that fast. You can use EVGA Precision, an overclocking utility, to set your card back to whatever the stock speeds are; look up your card's model. I'll upload it for you, since you can't download it without registering with EVGA: http://files.blizzie.com/EVGAPrecision.ZIP

[Edit 1 times, last edit by Former Member at Nov 24, 2008 12:18:28 AM]
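Putting the GPU-Z readings next to those stock speeds, the card sits several percent above even the GTX+ spec on every clock, which is consistent with the GPUGrid admin's "way above spec" remark. A quick sketch of that arithmetic (the dictionary and function names are my own; the MHz figures are the ones posted in this exchange):

```python
# Stock clocks quoted in the thread, in MHz.
STOCK = {
    "9800 GTX":  {"graphics": 675, "shader": 1688, "memory": 1100},
    "9800 GTX+": {"graphics": 738, "shader": 1836, "memory": 1100},
}

# What GPU-Z reported for the card in question, in MHz.
REPORTED = {"graphics": 792, "shader": 1944, "memory": 1152}

def percent_over_stock(reported, stock):
    """Percent each reported clock sits above (or below) its stock value."""
    return {name: round(100.0 * (reported[name] - stock[name]) / stock[name], 1)
            for name in stock}
```

Against the GTX+ figures this gives roughly +7.3% graphics, +5.9% shader, and +4.7% memory: a factory overclock, even though the card was never manually OC'ed.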
Dataman
Ace Cruncher | Joined: Nov 16, 2004 | Post Count: 4865 | Status: Offline
> [...] You can use EVGA Precision, an overclocking utility, to set your card back to whatever the stock speeds are. [...] http://files.blizzie.com/EVGAPrecision.ZIP

Way cool! Thanks again. I guess I will be reading some readme files tonight.
Dataman
Ace Cruncher | Joined: Nov 16, 2004 | Post Count: 4865 | Status: Offline
> Way cool! Thanks again. I guess I will be reading some readme files tonight.

Short read ...
Former Member
Cruncher | Joined: May 22, 2018 | Post Count: 0 | Status: Offline
How come GPUGrid isn't downloading new tasks for the queue? I have one work unit only and it's the one I'm currently crunching.
RT
Master Cruncher | USA - Texas - DFW | Joined: Dec 22, 2004 | Post Count: 2636 | Status: Offline
I suppose that you guys have noticed the daily stats are missing. That is because Keithhenry is out of pocket. I am collecting the numbers each day and will get them to him upon his return.