TPCBF
Master Cruncher
USA
Joined: Jan 2, 2011
Post Count: 1950
Re: May 2023 workunit update

+1 Adri! Good write-up!

Unfortunately I just received an SCC1 4175 MyoD1-C :(
Yeah, kind of a bad sign that after at least 5 days of those WUs being sent out, they still haven't managed to completely remove this bad batch from distribution... sad


Ralf
----------------------------------------

[May 19, 2023 1:48:23 AM]
Sgt.Joe
Ace Cruncher
USA
Joined: Jul 4, 2006
Post Count: 7660
Re: May 2023 workunit update

They must be rationing them very slowly, as they are few and far between. I concur with Adri that only the "A" and "B" varieties validate and every "C" errors out. I do not concur that the flow is normal. To say they are scarce would be an understatement.
Cheers
----------------------------------------
Sgt. Joe
*Minnesota Crunchers*
[May 19, 2023 2:27:05 AM]
adriverhoef
Master Cruncher
The Netherlands
Joined: Apr 3, 2009
Post Count: 2155
Re: May 2023 workunit update

Sgt.Joe:
They must be rationing them very slowly, as they are few and far between. I concur with Adri that only the "A" and "B" varieties validate and every "C" errors out. I do not concur that the flow is normal. To say they are scarce would be an understatement.

True. I think what we see here is that our clients are considered unreliable after processing one Error task, so we get fewer SCC1 tasks. The server then has to search for other clients, and because there are so few reliable clients left, it has difficulty (at least, that's what I think) assigning tasks, especially those from the faulty batch SCC1_0004175_MyoD1-C, to reliable hosts; hence all the "Waiting to be sent" entries.
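
To make that idea concrete, here is a toy sketch in Python. This is not the real BOINC/WCG scheduler code; the one-error threshold and the names Host, Task and assign are all made up for illustration. It only shows how an "only reliable hosts get retries" rule starves retries of a bad batch once most hosts have already errored on it:

# Toy model of the "reliable host" idea described above; NOT the real
# BOINC/WCG scheduler. Assumption for illustration only: one recent error
# already marks a host as unreliable, and retries (_1, _2, ...) of a failed
# workunit are only handed to hosts still considered reliable.
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    recent_errors: int = 0

    @property
    def reliable(self) -> bool:
        return self.recent_errors == 0

@dataclass
class Task:
    name: str
    is_retry: bool
    status: str = "Waiting to be sent"

def assign(tasks, hosts):
    for task in tasks:
        # Retries are restricted to reliable hosts; fresh tasks can go anywhere.
        pool = [h for h in hosts if h.reliable] if task.is_retry else list(hosts)
        if pool:
            host = pool.pop(0)
            hosts.remove(host)            # one task per host in this toy model
            task.status = f"In Progress on {host.name}"
        # else: no eligible host, the task keeps waiting

hosts = [Host("linux-debian", recent_errors=1),   # errored on a _C result
         Host("fedora", recent_errors=1),
         Host("mswin-10")]
tasks = [Task("SCC1_0004175_MyoD1-C_24198_3", is_retry=True),
         Task("SCC1_0004175_MyoD1-C_24243_1", is_retry=True),
         Task("SCC1_0004083_MyoD1-A_1416_0",  is_retry=False)]
assign(tasks, hosts)
for t in tasks:
    print(t.name, "->", t.status)

With only one reliable host left, the first _C retry gets sent, the second one stays "Waiting to be sent", and the fresh _A task still goes out; roughly the pattern visible in the listings below.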

Or maybe the flow was halted altogether at 2023-05-19T03:44:42. See the listings below (a sketch of this narrowing procedure follows after them):
Workunits spaced 1000 apart:
workunit 298089814
SCC1_0004083_MyoD1-A_1416_0                Waiting to be sent  
workunit 298088814
SCC1_0004083_MyoD1-A_1156_0                Waiting to be sent  
workunit 298087814
SCC1_0004083_MyoD1-A_0931_0                Waiting to be sent  
workunit 298086814
SCC1_0004138_MyoD1-B_23600_0                Waiting to be sent  
workunit 298085814
SCC1_0004083_MyoD1-A_0292_0                Waiting to be sent  
workunit 298084814
SCC1_0004138_MyoD1-B_22910_0                Waiting to be sent  
workunit 298083814
SCC1_0004138_MyoD1-B_22437_0                Waiting to be sent  
workunit 298082814
SCC1_0004138_MyoD1-B_21918_0                Waiting to be sent  
workunit 298081814
SCC1_0004175_MyoD1-C_25205_0                Waiting to be sent  
workunit 298080814
SCC1_0004175_MyoD1-C_24702_0                Waiting to be sent  
workunit 298079814
SCC1_0004175_MyoD1-C_24198_0  Linux Debian  Error                 2023-05-19T03:29:26  2023-05-19T07:20:57
SCC1_0004175_MyoD1-C_24198_1  Fedora Linux  Error                 2023-05-19T03:29:40  2023-05-19T03:31:52
SCC1_0004175_MyoD1-C_24198_2  Arch Linux    In Progress           2023-05-19T03:33:33  2023-05-22T03:33:33
SCC1_0004175_MyoD1-C_24198_3                Waiting to be sent

Zooming in, workunits spaced 100 apart:
workunit 298080813
SCC1_0004175_MyoD1-C_24688_0                Waiting to be sent  
workunit 298080713
SCC1_0004175_MyoD1-C_24650_0                Waiting to be sent  
workunit 298080613
SCC1_0004138_MyoD1-B_20833_0                Waiting to be sent  
workunit 298080513
SCC1_0004138_MyoD1-B_20765_0                Waiting to be sent  
workunit 298080413
SCC1_0004175_MyoD1-C_24490_0                Waiting to be sent  
workunit 298080313
SCC1_0004175_MyoD1-C_24450_0                Waiting to be sent  
workunit 298080213
SCC1_0004138_MyoD1-B_20641_0                Waiting to be sent  
workunit 298080113
SCC1_0004138_MyoD1-B_20574_0                Waiting to be sent  
workunit 298080013
SCC1_0004138_MyoD1-B_20532_0  MSWin 10      In Progress           2023-05-19T03:42:22  2023-05-25T03:42:22
SCC1_0004138_MyoD1-B_20532_1  MSWin 10      Pending Validation    2023-05-19T03:42:27  2023-05-19T08:25:22
workunit 298079913
SCC1_0004175_MyoD1-C_24243_0  Linux Ubuntu  Error                 2023-05-19T03:31:47  2023-05-19T06:37:49
SCC1_0004175_MyoD1-C_24243_1                Waiting to be sent
workunit 298079813
SCC1_0004175_MyoD1-C_24184_0  MSWin 10      Error                 2023-05-19T03:29:26  2023-05-19T07:42:52
SCC1_0004175_MyoD1-C_24184_1  MSWin 10      In Progress           2023-05-19T03:29:32  2023-05-25T03:29:32
SCC1_0004175_MyoD1-C_24184_2                Waiting to be sent

Zooming in, workunits spaced 10 apart:
workunit 298080112
SCC1_0004138_MyoD1-B_20573_0                Waiting to be sent  
workunit 298080102
SCC1_0004138_MyoD1-B_20561_0                Waiting to be sent  
workunit 298080092
SCC1_0004175_MyoD1-C_24340_0                Waiting to be sent  
workunit 298080082
SCC1_0004175_MyoD1-C_24338_0                Waiting to be sent  
workunit 298080072
SCC1_0004175_MyoD1-C_24324_0                Waiting to be sent  
workunit 298080062
SCC1_0004175_MyoD1-C_24302_0                Waiting to be sent  
workunit 298080052
SCC1_0004175_MyoD1-C_24308_0  MSWin 10      In Progress           2023-05-19T03:44:42  2023-05-25T03:44:42
workunit 298080042
SCC1_0004175_MyoD1-C_24310_0  MSWin 10      Error                 2023-05-19T03:43:41  2023-05-19T08:30:04
SCC1_0004175_MyoD1-C_24310_1  MSWin 10      In Progress           2023-05-19T03:44:34  2023-05-25T03:44:34
SCC1_0004175_MyoD1-C_24310_2                Waiting to be sent
workunit 298080032
SCC1_0004138_MyoD1-B_20556_0  MSWin 10      Pending Validation    2023-05-19T03:43:05  2023-05-19T08:08:40
SCC1_0004138_MyoD1-B_20556_1  MSWin 11      In Progress           2023-05-19T03:43:21  2023-05-25T03:43:21
workunit 298080022
SCC1_0004138_MyoD1-B_20548_0  MSWin Server  In Progress           2023-05-19T03:42:33  2023-05-25T03:42:33
SCC1_0004138_MyoD1-B_20548_1  MSWin 10      In Progress           2023-05-19T03:42:42  2023-05-25T03:42:42
workunit 298080012
SCC1_0004138_MyoD1-B_20518_0  MSWin 10      In Progress           2023-05-19T03:42:20  2023-05-25T03:42:20

Zooming in, workunits spaced 1 apart:
workunit 298080061
SCC1_0004175_MyoD1-C_24303_0                Waiting to be sent  
workunit 298080060
SCC1_0004175_MyoD1-C_24295_0                Waiting to be sent  
workunit 298080059
SCC1_0004175_MyoD1-C_24305_0                Waiting to be sent  
workunit 298080058
SCC1_0004175_MyoD1-C_24293_0                Waiting to be sent  
workunit 298080057
SCC1_0004175_MyoD1-C_24294_0                Waiting to be sent  
workunit 298080056
SCC1_0004175_MyoD1-C_24297_0                Waiting to be sent  
workunit 298080055
SCC1_0004175_MyoD1-C_24296_0                Waiting to be sent  
workunit 298080054
SCC1_0004175_MyoD1-C_24298_0                Waiting to be sent  
workunit 298080053
SCC1_0004175_MyoD1-C_24300_0                Waiting to be sent  
workunit 298080052
SCC1_0004175_MyoD1-C_24308_0  MSWin 10      In Progress           2023-05-19T03:44:42  2023-05-25T03:44:42
workunit 298080051
SCC1_0004175_MyoD1-C_24299_0  MSWin 7       In Progress           2023-05-19T03:44:42  2023-05-25T03:44:42
SCC1_0004175_MyoD1-C_24299_1                Waiting to be sent
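
For anyone wondering how the listings above were narrowed down, here is a rough sketch of the step-1000/100/10/1 procedure in Python. The function is_sent() is a stand-in for however you actually look up a workunit's status (e.g. via the result status pages); it is not a real WCG or BOINC API. The sketch also assumes that, roughly, everything below the cut-off has already been handed out:

# Sketch of the "zoom in" narrowing used above; is_sent(wu_id) is a
# hypothetical lookup returning True if a workunit has been sent to at
# least one host and False if it is still "Waiting to be sent".
def find_highest_sent(start_id, is_sent, steps=(1000, 100, 10, 1)):
    """start_id must be a workunit id that is still waiting to be sent.
    Walk downwards in ever smaller steps and return the highest id that
    has actually been sent out (assuming sent/waiting are roughly split
    at a single boundary)."""
    hi = start_id                      # known to be waiting
    for step in steps:
        wu = hi - step
        while not is_sent(wu):         # step down until we hit a sent workunit
            hi = wu
            wu -= step
        # wu was sent and hi (= wu + step) was waiting: refine inside that gap
    return wu

# Example with a fake oracle where everything at or below 298080052 was sent:
print(find_highest_sent(298089814, lambda wu: wu <= 298080052))   # 298080052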


This is the last SCC1 workunit that I have received up to now:
workunit 298079959:
SCC1_0004175_MyoD1-C_24260_0  Linux Ubuntu  Error                 2023-05-19T03:38:07  2023-05-19T03:40:25
SCC1_0004175_MyoD1-C_24260_1  Fedora Linux  Error                 2023-05-19T03:38:14  2023-05-19T03:40:22
SCC1_0004175_MyoD1-C_24260_2  Linux Ubuntu  In Progress           2023-05-19T03:40:33  2023-05-22T03:40:33
SCC1_0004175_MyoD1-C_24260_3  Linux         In Progress           2023-05-19T03:40:35  2023-05-22T03:40:35

Adri
[May 19, 2023 9:55:39 AM]
Sgt.Joe
Ace Cruncher
USA
Joined: Jul 4, 2006
Post Count: 7660
Re: May 2023 workunit update

The last one I got was SCC1_0004175_MyoD1-C_24333_1, which of course errored out. You would think that, with the number of errors on the "C" variety, some tech would have noticed after all these days and put a stop to them until they are fixed. It makes no sense to keep sending out the "C" type and getting the same result for all of them. I fear nobody is watching the henhouse.
Cheers
----------------------------------------
Sgt. Joe
*Minnesota Crunchers*
[May 19, 2023 1:09:46 PM]
bfmorse
Senior Cruncher
US
Joined: Jul 26, 2009
Post Count: 297
Re: May 2023 workunit update

@Cyclops

Will you give us an update on the current status of the projects with regard to:
1) General availability of WUs.
2) Release plans (approximate timelines) for the next phases (i.e., ARP, HSTB, OPN and SCC).
3) Any additional information about your team's findings, whether the issues are resolved or still in the process of being resolved, would be welcome.

We appreciate your attention to passing on device information to your team to help the devices become more visible.

Some of us treat WCG as an appliance. But others, myself included, like to be kept aware of what is going on behind the scenes, both as a learning experience and out of plain curiosity.

Thanks!
[May 19, 2023 2:00:09 PM]
TPCBF
Master Cruncher
USA
Joined: Jan 2, 2011
Post Count: 1950
Re: May 2023 workunit update

Sgt.Joe:
The last one I got was SCC1_0004175_MyoD1-C_24333_1, which of course errored out. You would think that, with the number of errors on the "C" variety, some tech would have noticed after all these days and put a stop to them until they are fixed. It makes no sense to keep sending out the "C" type and getting the same result for all of them. I fear nobody is watching the henhouse.
Cheers
I don't think it's a matter of that "C" variant. It's the ".._0004175_..." batch that keeps crapping out with that error, and that has been happening since Sunday; at least, that's when I first noticed it and tried to report it. But there has been no reaction/reply to my email (which is supposedly working again), and my posts on the forum are still being censored...

Ralf
----------------------------------------

[May 19, 2023 10:47:01 PM]
Dayle Diamond
Senior Cruncher
Joined: Jan 31, 2013
Post Count: 452
Re: May 2023 workunit update

"There are 32.5 days of batches for MCM1 remaining based on 1,075 batches available and an estimated rate of 33/day"


This figure has not been amended since May 5th, so possibly we're looking at MCM becoming an intermittent project for the first time.
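
For reference, the quoted estimate is just the available batch count divided by the estimated daily rate, using the numbers from the May 5th update:

# Remaining-days estimate behind the quoted sentence (May 5th numbers).
batches_available = 1075
batches_per_day = 33
print(round(batches_available / batches_per_day, 1), "days")   # 32.6, roughly the quoted 32.5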

Looking at other BOINC projects, the default web page shows exactly how many work units per project are unsent at any given time. Their forums have a lot less speculation about work unit availability and need less monitoring by the administrators, because contributors can get answers in real time.
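
By way of illustration only: stock BOINC projects usually publish these counters at server_status.php, which can also be fetched in machine-readable form. The endpoint and element name below are the generic BOINC ones, not anything WCG currently exposes, and they can differ per project or server version:

# Sketch: reading the project-wide "results ready to send" counter from a
# stock BOINC server status page. Assumes the project serves the standard
# server_status.php?xml=1 endpoint; element names may vary by server version.
import urllib.request
import xml.etree.ElementTree as ET

def results_ready_to_send(project_url: str):
    url = project_url.rstrip("/") + "/server_status.php?xml=1"
    with urllib.request.urlopen(url, timeout=30) as resp:
        root = ET.fromstring(resp.read())
    node = root.find(".//results_ready_to_send")   # unsent-task counter
    return int(node.text) if node is not None else None

# Hypothetical usage:
# print(results_ready_to_send("https://example-boinc-project.org/example_project"))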
----------------------------------------
[May 26, 2023 10:51:32 PM]
erich56
Senior Cruncher
Austria
Joined: Feb 24, 2007
Post Count: 295
Re: May 2023 workunit update

Dayle Diamond:
Looking at other BOINC projects, the default web page shows exactly how many work units per project are unsent at any given time. Their forums have a lot less speculation about work unit availability and need less monitoring by the administrators, because contributors can get answers in real time.

This is a very good point.
I have been wondering all along why WCG seems to be the only project without such a "Server Status Page" showing valuable information, e.g. the number of unsent tasks.
----------------------------------------
[May 27, 2023 2:25:21 PM]
Yavanius
Senior Cruncher
Antarctica
Joined: Jan 21, 2015
Post Count: 191
Re: May 2023 workunit update

A server status page would be nice...

But despite requests/suggestions since the beginning of time (or at least since the transition), it doesn't seem very likely. They don't seem very PROactive...
[May 27, 2023 4:27:19 PM]