Total posts in this thread: 35
This topic has been viewed 3032 times and has 34 replies.
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Big size Work Units

About 27 MB zip-compressed, 44 MB decompressed... I'm not complaining, but I've never seen such a huge result file. If this project wants it to have less impact on traffic (some helpers might find it annoying), it could use 7-Zip (2.90), which is freeware. I have Win7 64-bit, so I use the 64-bit version. With better compression the file came out to 17.2 MB, using these settings in 7-Zip:

Format: 7z
Compression level: Ultra
Method: LZMA2
Dictionary size: 1 GB
Word size: 273
Solid block size: 4 GB
Number of CPU threads: 2 of 4 (4 of 4 would be a waste; it isn't optimized for more than 2 threads for now, though an improvement is planned)

It only took about 10 seconds to compress on my Phenom II X4 965 BE 3.4 GHz (overclocked to a stable 3.74 GHz).
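For anyone wanting a feel for the gap between the two algorithm families without installing 7-Zip, Python's standard library exposes both: gzip/zip-style DEFLATE and the LZMA2 method behind the 7z format. A minimal sketch, using synthetic repetitive data rather than a real result file:

```python
# Rough illustration of the gain LZMA2 can give over ordinary gzip/zip
# compression on redundant scientific output. The input is synthetic
# (a repeated tabulated-text line); real work-unit files will differ.
import gzip
import lzma

# Synthetic "result file": ~1 MB of repetitive numeric text output.
data = ("step 000123 energy -1.234567 gradient 0.000987\n" * 20000).encode()

gz = gzip.compress(data, compresslevel=9)

# LZMA2 with an aggressive preset, similar in spirit to 7-Zip's "Ultra".
filters = [{"id": lzma.FILTER_LZMA2, "preset": 9 | lzma.PRESET_EXTREME}]
xz = lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters)

print(f"raw:  {len(data)} bytes")
print(f"gzip: {len(gz)} bytes")
print(f"lzma: {len(xz)} bytes")
```

The exact ratios depend entirely on the input; CEP2 result files would compress differently than this synthetic text.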
----------------------------------------
[Edit 2 times, last edit by Former Member at Nov 23, 2011 4:49:58 AM]
[Nov 23, 2011 1:38:01 AM]
gb009761
Master Cruncher
Scotland
Joined: Apr 6, 2005
Post Count: 2955
Re: Big size Work Units

THAT'S why this project is an OPT-IN project (along with the very long duration between checkpoints and several other reasons).
----------------------------------------

[Nov 23, 2011 1:57:35 AM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Re: Big size Work Units

This particular project has some hefty requirements, so a download of that size is to be expected.

The upload for a completed work unit on this project tends to be in the 30 MB or more category.
[Nov 23, 2011 3:30:48 AM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
CEP2: Big size Work Units

Saul,

Fair enough: an individual file can be compressed better with the right archiver. The question is what is unlicensed / open source, ships with BOINC, and works on all platforms, i.e. when it's time to compress, Windows, Linux, and Mac need to follow the same program routine. I don't know if they ran tests on how fast a current zip would transit versus a 7z file; transfer protocols seem to waste time trying to compress what's already compressed. My latest calculation was that of the total estimated 880 GB WCG transfers up/down daily, CEP2 [compressed] accounts for about 625 GB (it fills the researcher's Jabber storage unit in 200 days). Presently an average of 114.6 MB is used per core, crunching 24/7 (see the WCG Dashboard chart). Maybe cleanenergy or one of the techs would like to chip in, as a 40% reduction would surely be of interest.

Besides, if you've never seen more than 27 MB, you haven't seen a full 16-job result, or the much bigger ones from early in the project. Presently the zip runs 30-31 MB on my Windows and Linux crunchers, and it has been gradually increasing for a long time; going by past experience and the System Requirements page, it may grow to 80 MB as we get further into the project.

2 Euro cents.

--//--

edit: spellcheck
----------------------------------------
[Edit 1 times, last edit by Former Member at Nov 23, 2011 7:19:29 AM]
[Nov 23, 2011 7:09:26 AM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Re: CEP2: Big size Work Units

Wow, then 7-Zip is needed more than ever if the compressed WUs are going to be 80 MB zipped.

Confirmed: 7-Zip is open source and freeware, so it's up to IBM now.
[Nov 23, 2011 2:36:37 PM]
sk..
Master Cruncher
Joined: Mar 22, 2007
Post Count: 2324
Re: CEP2: Big size Work Units

That's a good idea. I think someone might have raised this before (or maybe I just read a review of zip package systems). Is it cross-platform? I definitely remember reading that W7 and Vista were poor compared to XP and Linux when it came to zipping and unzipping. So the time taken to zip/unzip, a consideration in itself, does vary, and as pointed out, network compression needs to be considered.

The more files that are zipped together, the bigger the file and the greater the chance of upload/download failure, task shortages, and frustration. So smaller files would be great.

I remember suggesting that tasks should be allowed to continue uploading from where they left off if interrupted. I don't know whether that was or will be implemented, even if just for some big CEP2 files. Anyone? I only have 100 Kb to 130 Kb uploads on this system atm, or I would test it.
[Nov 23, 2011 3:03:48 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Reply to this Post  Reply with Quote 
Re: CEP2: Big size Work Units

? We know the WCG servers do not seem to support UL resume, and most files don't really need it, but what about CEP2 part _4, which goes directly to the scientists? Anyway, I have one ready to finish, so I suspended network activity; I'll interrupt the UL and see how much of the final value was transmitted. [Does anyone know the log flag to set so the UL value is recorded?]
--//--

edit: Figured it out and got the anticipated answer printed in the client message log. Spock logical.
----------------------------------------
[Edit 1 times, last edit by Former Member at Nov 23, 2011 6:03:18 PM]
[Nov 23, 2011 3:40:17 PM]
Ingleside
Veteran Cruncher
Norway
Joined: Nov 19, 2005
Post Count: 974
Re: CEP2: Big size Work Units

I remember suggesting that tasks should be allowed to continue uploading from where they left off, if interrupted. Don't know if that was/will be implemented, even if just for some big CEP2 files. Anyone? I only have 100Kb to 130Kb uploads on this system atm or I would test it.

Resuming of transfers has been part of BOINC for a very long time, but WCG doesn't allow resuming of downloads. For uploads, on the other hand, I don't remember ever having any problems with WCG, and a quick test with an HPF2 upload shows that upload resuming works as it should (even though the file wasn't very large). As for CEP2, I don't have any CEP2 tasks at the moment, but I don't remember any problems the last time I tried upload resuming here...
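The resume behavior described above boils down to: ask the server how many bytes it already holds, then send only the remainder. A minimal sketch of that idea; query_offset and the in-memory "server" here are stand-ins for illustration, not real BOINC or WCG APIs:

```python
# Sketch of resume-from-offset upload logic: skip whatever the server
# already received and transmit only the remaining bytes in chunks.
import io


def query_offset(server_file: io.BytesIO) -> int:
    """Stand-in for asking the server how many bytes it already has."""
    # seek to end returns the current size of the stored data
    return server_file.seek(0, io.SEEK_END)


def resume_upload(local: bytes, server_file: io.BytesIO, chunk: int = 4096) -> None:
    offset = query_offset(server_file)       # bytes already on the server
    remainder = memoryview(local)[offset:]   # skip what was already sent
    for i in range(0, len(remainder), chunk):
        server_file.write(remainder[i:i + chunk])  # stand-in for a network send


# Simulate an interrupted transfer: the server got the first 10000 bytes.
payload = bytes(range(256)) * 200            # a 51200-byte "result file"
server = io.BytesIO(payload[:10000])
resume_upload(payload, server)
assert server.getvalue() == payload          # file completed without re-sending
```

The real BOINC client negotiates the offset over its own upload protocol, but the principle is the same: an interrupted transfer costs only the unsent tail, not the whole file.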
----------------------------------------


"I make so many mistakes. But then just think of all the mistakes I don't make, although I might."
[Nov 23, 2011 9:47:39 PM]
Former Member
Cruncher
Joined: May 22, 2018
Post Count: 0
Status: Offline
Reply to this Post  Reply with Quote 
Re: Big size Work Units

7-Zip is available for Linux too, and with small modifications for Mac as well, since Mac OS X is practically Unix.

I was recently experiencing BSODs again, but it wasn't my new MoBo or anything else: my PSU was overheating because of the overclocking, and it doesn't have a second 2x2 connector (I need a better PSU). Anyway, when I restarted my PC, BOINC (set to 3-minute checkpoints) resumed its uploads and downloads.
[Nov 24, 2011 2:44:54 AM]
knreed
Former World Community Grid Tech
Joined: Nov 8, 2004
Post Count: 4504
Re: Big size Work Units

gzip is what is built into BOINC, so that is what we use for file transfers.

For long-term storage, both we and, I believe, Harvard decompress and re-package the data for longer-term storage/transfer/backup/analysis.
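A hedged sketch of that decompress-and-repackage step, assuming gzip on the transfer side and an xz/LZMA archive on the storage side (the actual archival format used by WCG or Harvard is not stated here):

```python
# Illustration of "decompress and re-package": take a gzip payload
# (the format BOINC transfers use) and rewrite it as xz for storage.
import gzip
import lzma

original = b"example result payload line\n" * 1000

transferred = gzip.compress(original)                    # as received via BOINC
repacked = lzma.compress(gzip.decompress(transferred))   # long-term storage copy

assert lzma.decompress(repacked) == original             # lossless round trip
```

The repackaging is lossless either way; the storage format is chosen for density and archival convenience rather than transfer speed.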
[Nov 27, 2011 3:49:26 AM]