Optimize memory usage for sharing a huge library of files
on 2017/12/08 11:43:33 PM
I have been collecting books and other information for a decade.
I have almost 20 TB of files in my collection, and I would like to make it all available.
But sharing about 20 TB / 500,000 files takes too much memory. Because I don't know
how Fopnu's network works, I'm not sure whether sharing this library makes sense or not. But since
I have a 100 Mbit connection and a powerful server with lots of memory, I would like to share it.
My current configuration shares 5.2 TB / 220,000 files and takes 3.3 GB of memory. And while
I have 48 TB on my server, I can't run Fopnu if it takes 13 GB.
So I would appreciate it if you could reduce memory usage and optimize Fopnu for sharing huge
numbers of files. I'll pay it back by sharing my 20 TB library 24/7 for all Fopnu users =)
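For scale, the numbers in this post imply roughly the following per-file cost (my back-of-the-envelope arithmetic from the reported figures, not a measurement of Fopnu internals):

```python
# Back-of-the-envelope: per-file memory cost implied by the reported numbers.
mem_gb = 3.3            # reported RAM usage
files = 220_000         # reported shared-file count
per_file_kb = mem_gb * 1024 ** 2 / files    # ~15.7 KB per shared file

# Extrapolated linearly to the full 500,000-file library:
est_gb = per_file_kb * 500_000 / 1024 ** 2  # ~7.5 GB
print(round(per_file_kb, 1), round(est_gb, 1))
```

So even purely linear scaling would put the full library at roughly 7.5 GB of RAM on this setup.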
on 2017/12/11 11:46:39 PM
Have you tried sharing such a big collection on eMule, for example? How's the memory consumption there?
on 2017/12/16 04:45:07 AM
I have shared almost the same number of files with eMule, and it took about 2 GB if I remember correctly.
However, because KAD is so slow, I made a script that "constructed" 60 eMule instances, each sharing 1/60 of the total number of files.
That way I could make KAD publishing up to 60 times faster. Anything beyond that starts to build up too much overhead.
I'm not in any way trying to say that Fopnu is doing something wrong or using too much memory, but I just wanted to point out that there are people who would benefit from lower memory usage, if it's possible to optimize it in any way. I already have the maximum amount of RAM that my mobo/chipset supports; well, in fact I have double what the mobo/chipset supports, and it works.
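The splitting trick described above is easy to sketch. Assuming a flat list of file paths (my illustration, not the poster's actual script), a round-robin partition gives each client instance a roughly equal share:

```python
def shard(paths, n):
    """Round-robin split a list of file paths into n roughly equal groups,
    one group per client instance (e.g. 60 eMule instances)."""
    return [paths[i::n] for i in range(n)]

files = [f"book_{i}.pdf" for i in range(10)]
groups = shard(files, 3)
# groups[0] == ['book_0.pdf', 'book_3.pdf', 'book_6.pdf', 'book_9.pdf']
```

Round-robin (rather than contiguous slices) keeps the group sizes within one element of each other regardless of the list length.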
on 2018/04/23 10:52:21 PM
I'm on W10 x64; 200 GB (22,676 shared files) -> 350 MB of RAM used. I think that is too much.
on 2018/04/27 05:55:41 AM
Unusable for large databases. You must have at least 16 GB of RAM, or better 32 GB! I have only 8 GB of RAM (W7 x64) and can't share more than 8 TB (over 750,000 files, and that's provided no other software is running!)
on 2018/05/03 03:02:18 PM
Today, after updating to v1.29 (W10 x64), the situation is even worse! 300 GB of shared files takes up 3 GB of RAM! :O
I like Fopnu and want to use it, please help!
on 2018/05/05 08:35:04 PM
Odd, I'm sharing roughly 500 GB of files, nearly 40 thousand, and Fopnu is using much less memory than my web browser: Fopnu at 243,563 K vs. a single instance of Google Chrome at 2,566,237 K, and there are more than 10 Chrome processes showing. Glad I've got lots of cores and RAM for Chrome, but Fopnu is just idling along at 0% CPU.
on 2018/05/06 04:48:39 PM
If you think I'm wrong, check here:
This is the current RAM usage on my W10 x64, i7-6700HQ, 12 GB RAM.
on 2018/05/17 03:45:01 PM
It's a bit better with Linux, but not by much.
This is my current RAM usage on Linux Mint 17.3 Rosa (Intel Core2 Duo CPU T6400 @ 2.00 GHz, RAM 8,174,676 kB)
for 332,845 shared files totalling around 3 TB.
There are still folders and files left to scan and hash.
on 2018/07/10 03:24:39 PM
This is the fundamental purpose of Fopnu as I see it: to create a huge, overwhelming, decentralized public library.
How to manage the index to all that data will be key.
on 2019/05/07 02:04:56 PM
It is an interesting theoretical problem: how to publish the information about all of your files as economically as possible,
and how to index those files as economically as possible.
I have a feeling that the most efficient way would be really complex.
I would be willing to sacrifice whatever CPU time and RAM it requires.
It seems to me that P2P could become the vault of information and evidence. That would mean a huge number of files.
As a researcher of secret societies and conspiracies of past centuries, I would love to see a network where all of
my peers could share everything they have collected, and a place where sharing new sensitive material would be possible.
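As one example of what "economical indexing" can look like, a Bloom filter answers "might this peer have file X?" using a fixed, tiny amount of memory, at the cost of occasional false positives. A minimal sketch (my illustration only, nothing to do with Fopnu's actual internals):

```python
import hashlib

class BloomFilter:
    """Fixed-size probabilistic set: membership tests may give rare false
    positives, but never false negatives. 2^20 bits = 128 KiB total."""
    def __init__(self, size_bits=1 << 20, hashes=4):
        self.size = size_bits
        self.k = hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive k independent bit positions from salted SHA-256 digests.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))
```

With a suitably sized bitmap, a peer could advertise hundreds of thousands of filenames or hashes in well under a megabyte; the trade-off is that a positive answer must still be confirmed against the real file list.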