Fopnu 1.67 consuming large amounts of RAM
by
ASmith
on 2025/01/09 11:05:21 PM
Fopnu is now consuming 8 GB of RAM on my system. Is there anything I can look at or adjust to lower that? I don't recall Fopnu gobbling up so much RAM previously.
by
Guest
on 2025/01/10 06:05:12 AM
Did you update recently?
When did you notice the increase?
How much is in your library?
Do you run any channels? Lots of people in them?
I don't run any channels, but I have almost 9 TB in 123,000 files and I use about 2-3 GiB.
by
Guest
on 2025/01/26 12:35:48 AM
Fopnu is now consuming 8 GB of RAM on my system
Given the size of your library, this honestly seems tame.
What did it eat previously?
by
ASmith
on 2025/04/25 12:10:45 AM
Normally around 6 GB, and that is for over 100 TB of shared files and nearly half a million files. I had also been sharing my music singles collection, a little over 1 million files across 12 different musical genres, but after seeing my available RAM disappearing, I ditched that offering. Perhaps indexing that music library addition alone was draining 2 GB, I don't know.
by
Guest
on 2025/04/27 11:45:49 PM
My (unscientific) findings are that the quantity of files has a bigger bearing on RAM usage than the storage size of the library.
So, a 1 TB library of 1,000 1 GB files will use up less RAM than a 1 GB library of 1 million 1 KB files.
I assume what's stored in RAM is a file path to file hash map, each with a relatively similar size regardless of file storage size.
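As a rough, unscientific sketch of that assumption: if each shared file costs one path string plus one fixed-size hash in memory, RAM usage grows with file count, not storage size. The names and the SHA-1 stand-in below are illustrative only; Fopnu's actual internals are not public.

```python
import hashlib
import sys

def entry_size(path: str) -> int:
    # Stand-in for a per-file library entry: the path string plus a
    # hex digest, roughly what a path -> hash map would hold per file.
    digest = hashlib.sha1(path.encode()).hexdigest()
    return sys.getsizeof(path) + sys.getsizeof(digest)

# 100,000 hypothetical file paths; actual file sizes never enter the math.
paths = [f"/music/genre{i % 12}/track{i}.mp3" for i in range(100_000)]
total = sum(entry_size(p) for p in paths)
print(f"~{total / 1024 / 1024:.1f} MiB for {len(paths):,} entries "
      f"(excluding dict overhead)")
```

Note the per-entry cost is the same whether each file is 1 KB or 1 GB, which is consistent with file quantity mattering more than library size.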
I wish there was an option to store this library info to disk rather than RAM. A SQLite database or similar would be ideal. But I have a feeling the dev isn't keen on dependencies. Most storage/config solutions implemented across the suite tend to be custom, text-based formats.
A text-based approach wouldn't really work for massive libraries, because the whole file would need to be scanned on each lookup.
(Note: There is a lot of technical and implementation assumption in the above post)
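The on-disk store suggested above could look something like this. A minimal sketch, assuming an SQLite table keyed by path so lookups hit an on-disk index instead of a RAM-resident map; the table and column names are made up for illustration and reflect nothing about Fopnu itself.

```python
import sqlite3

# ":memory:" keeps this demo self-contained; a real store would pass a
# file path here so the library survives restarts and stays out of RAM.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE library (path TEXT PRIMARY KEY, hash TEXT, size INTEGER)"
)
con.executemany(
    "INSERT INTO library VALUES (?, ?, ?)",
    [
        ("/music/a.mp3", "abc123", 4_200_000),
        ("/video/b.mkv", "def456", 1_500_000_000),
    ],
)
con.commit()

# The PRIMARY KEY index makes this a B-tree lookup, not a full scan --
# the failure mode of a flat text file.
row = con.execute(
    "SELECT hash FROM library WHERE path = ?", ("/music/a.mp3",)
).fetchone()
print(row[0])  # abc123
```

SQLite is also dependency-light (Python ships it in the standard library, and the C library is a single file), which may matter if the developer avoids external dependencies.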
by
ASmith
on 2025/05/30 08:42:18 PM
Thanks, that pretty much sums up the choke point in the software. I had to suspend sharing 1 million music singles spanning 12 musical genres, plus my massive collection of movie .torrent files, to drop my shares to around 400,000 files. An SQLite database would be the logical step; many, many trusted and widely used platforms and applications use SQLite. Then it's a delicate balance against any lookup lag from using an HDD for the library instead of sheer RAM.