Extract Speeds

Post your questions here if you need help to use NewsLeecher or if you have a question about a feature.

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Extract Speeds

Post by Ytterbium »

I have my NL Download and Extract folders set to two different SSDs.

I had some older 3Gb/s drives and upgraded to some new 6Gb/s ones.

However, I always seem to get about 200MB/s extract speed. Is there some limit to how fast the extraction can be?

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

It is very possible the extraction library has a maximum throughput.
Just upgrading your drives doesn't mean you will get faster speeds unless the rest of your hardware also supports it. The SATA drive controllers may only be able to support 3Gb/s speeds. Have you run drive benchmarking tests to see what read and write speeds you actually get? Run multiple tests: between the drives, and reading from and writing to the same drive at the same time.
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Post by Ytterbium »

I bought another SSD to see if I could get it faster.

I have my source and extract drives set up as different drives so there are fewer throughput issues; they're both SATA 6Gb/s.

I see about 50% active time on the source drive and virtually none on the destination drive.

Still seeing about 200MB/s extraction speed. I'm happy to collect some logs or other info to help get this improved.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

Try a third-party benchmarking utility, like WinBench, to see the actual speeds of the drives.

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Post by Ytterbium »

If I just copy a file from my Download Drive to my Extract Drive, I see about 340MB/s in the Windows file copy dialog.

Using CrystalDiskMark, my Download Drive (source) shows 375MB/s sequential read, and my Extract Drive (destination) shows 523MB/s sequential write.

WinBench turned out to be pretty old (from 1999), so I skipped that benchmark.
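For anyone who wants a quick sanity check without a dedicated benchmark, a rough sequential read/write test can be scripted. Here's a Python sketch; note it goes through the OS cache (unlike CrystalDiskMark), so treat the numbers as an upper bound:

```python
# Rough sequential throughput check. Unlike CrystalDiskMark this does not
# bypass the OS cache, so the figures are optimistic, not benchmark-grade.
import os
import tempfile
import time

SIZE_MB = 64
chunk = os.urandom(1024 * 1024)  # 1 MiB block, written repeatedly

with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name
    t0 = time.perf_counter()
    for _ in range(SIZE_MB):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())  # make sure the data actually hits the drive
    write_mbs = SIZE_MB / (time.perf_counter() - t0)

t0 = time.perf_counter()
with open(path, "rb") as f:
    while f.read(1024 * 1024):
        pass
read_mbs = SIZE_MB / (time.perf_counter() - t0)
os.remove(path)

print(f"sequential write ~{write_mbs:.0f} MB/s, read ~{read_mbs:.0f} MB/s")
```

Point it at the drive you care about (via the temp file's location) and run it a few times; a single pass can be skewed by caching or background I/O.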

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Post by Ytterbium »

I created a 7zip multipart archive in LZMA2 format and it was very slow to extract (~50MB/s in the 7-Zip dialog).

If I just made a multipart archive in 'store' format, it was insanely fast to extract (700MB/s).

So could it be the decompression that is the bottleneck?
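That test is easy to reproduce in miniature. The sketch below (stdlib Python, nothing to do with NewsLeecher's actual extractor) times LZMA decompression of a payload against a plain "store"-style copy of the same bytes; the LZMA path is much slower because the decoder is CPU-bound:

```python
# Compare LZMA decompression speed against a plain "store" copy of the
# same payload, mirroring the 7-Zip LZMA2-vs-store test described above.
import lzma
import os
import time

SIZE_MB = 16
payload = os.urandom(SIZE_MB * 1024 * 1024)  # incompressible test data
blob = lzma.compress(payload, preset=1)

t0 = time.perf_counter()
out = lzma.decompress(blob)                  # CPU-bound range decoding
lzma_mbs = SIZE_MB / (time.perf_counter() - t0)

t0 = time.perf_counter()
copy = bytes(payload)                        # "store" is just a byte copy
store_mbs = SIZE_MB / (time.perf_counter() - t0)

print(f"LZMA decompress ~{lzma_mbs:.0f} MB/s, store ~{store_mbs:.0f} MB/s")
```

The absolute numbers depend on the CPU, but the gap between the two paths is the point: when the archive is actually compressed, the disk is rarely the limiting factor.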

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

Sounds like it could be. Decompression is CPU intensive, so a slower processor would cause the low speeds.

Also try extracting to a different drive than the one you are reading from. Sometimes the bottleneck is the I/O bus, so reading from and writing to the same drive is slow.

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Post by Ytterbium »

I already have the extract drive set to a different drive from the download/read drive.

Yeah, it seems like a decompression thing. I have an i7-3970X running at 4GHz, so since NL's extraction is single-threaded, it's probably going as fast as it can on one thread.

D-Kay
Posts: 12
Joined: Tue Jul 23, 2013 1:14 pm

Post by D-Kay »

Try updating your unrar library to version 5, which should improve speed.

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Post by Ytterbium »

I manually swapped out the unrar.dll & unrar.exe.

The extraction of 900MB input files finished almost before I could see it; it was around 600MB/s or something.

So I'd say it would probably be a good idea for NL to bundle a newer version of these files?

I'll report back if I see any bugs; I'm not sure whether the APIs have changed in the DLLs.
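For reference, the swap itself is just a backup-and-replace of two files in the install directory (typically under \Program Files\NewsLeecher; the exact path is an assumption, adjust to your setup). Sketched below against a scratch directory with dummy files, so it is safe to run as-is:

```shell
# Sketch of the manual unrar swap. NL_DIR stands in for the real
# NewsLeecher install directory; dummy files simulate the bundled library.
NL_DIR="$(mktemp -d)"
printf 'unrar v4' > "$NL_DIR/unrar.dll"
printf 'unrar v4' > "$NL_DIR/unrar.exe"

# 1. Back up the bundled files before overwriting anything.
cp "$NL_DIR/unrar.dll" "$NL_DIR/unrar.dll.bak"
cp "$NL_DIR/unrar.exe" "$NL_DIR/unrar.exe.bak"

# 2. Drop in the newer files (in reality, the v5 build from rarlab.com).
printf 'unrar v5' > "$NL_DIR/unrar.dll"
printf 'unrar v5' > "$NL_DIR/unrar.exe"

ls "$NL_DIR"
```

Keeping the .bak copies means a failed extraction can be diagnosed by swapping the originals straight back in.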

D-Kay
Posts: 12
Joined: Tue Jul 23, 2013 1:14 pm

Post by D-Kay »

I have gone through several archives, no problems so far.

Ytterbium
Posts: 10
Joined: Tue Sep 05, 2006 12:15 am

Post by Ytterbium »

Yeah, no issues here.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

Good to know it is working well. We hope to have it officially part of NewsLeecher soon.

garberfc
Posts: 34
Joined: Wed Jul 28, 2004 5:48 pm

Post by garberfc »

Ytterbium wrote:I manually swapped out the unrar.dll & unrar.exe.

The extraction of 900MB input files finished almost before I could see it; it was around 600MB/s or something.

So I'd say it would probably be a good idea for NL to bundle a newer version of these files?

I'll report back if I see any bugs; I'm not sure whether the APIs have changed in the DLLs.
Where are you seeing the unrar.dll & unrar.exe? They're not in the \Program Files\Newsleecher directory. I only see "ztvunrar38.dll" in there.

D-Kay
Posts: 12
Joined: Tue Jul 23, 2013 1:14 pm

Post by D-Kay »

garberfc wrote:
Ytterbium wrote:I manually swapped out the unrar.dll & unrar.exe.

The extraction of 900MB input files finished almost before I could see it; it was around 600MB/s or something.

So I'd say it would probably be a good idea for NL to bundle a newer version of these files?

I'll report back if I see any bugs; I'm not sure whether the APIs have changed in the DLLs.
Where are you seeing the unrar.dll & unrar.exe? They're not in the \Program Files\Newsleecher directory. I only see "ztvunrar38.dll" in there.
You must be running an old version, like the 5.0 final, I'd guess. I don't know if it's different there, but it works fine on the newest beta (where it's already integrated).
