QNX 4 max files in directory

The OS in question is QNX 4.25

Does anyone know the maximum number of files that can be placed in a single directory? And would the performance of the system suffer if files are constantly added to a directory that already holds an enormous number of files?


C. Scott

C. Scott <scott.anderson@fkilogistex_nospam_.com> wrote:

CS > The OS in question is QNX 4.25

CS > Does anyone know the maximum number of files that can be placed in a
CS > single directory? And would the performance of the system suffer if
CS > files are constantly added to a directory that already holds an
CS > enormous number of files?

CS > --
CS > C. Scott


I don’t know if there is an official limit. If there is, it is quite
large. I’ve had directories with thousands of files in them, though
fewer than 10,000.

YES, performance will suffer! Opens will take forever, and so will
renames and deletes. Once your file is open, performance should be the
same as always (I think). Another performance hit comes in the form of
a fragmented directory. If you know the directory is going to be huge
up front, try to preallocate it with:
mkdir -s number_of_expected_files my_big_fat_dir

There is another approach: impose a directory tree even where one may
not be required. That is, figure out what all of the possible first
characters of the file names will be and create a directory whose name
is just that letter. Figure out the second letters and make a
directory for each of those as well. Do this as many times as
necessary.

Example:
Instead of /mydir/system.cfg
use /mydir/s/y/system.cfg
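
If it helps, here is a minimal C sketch of one way such a bucketing
scheme could be implemented. The base path, file name, and function
name below are made up for illustration, and standard POSIX mkdir()
and snprintf() are assumed to be available on your target:

/* Build /mydir/s/y/system.cfg from a base directory and a file name,
 * creating the single-letter subdirectories if they do not exist yet.
 * make_bucketed_path() and the paths used here are hypothetical. */
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <sys/types.h>
#include <sys/stat.h>

static int make_bucketed_path(const char *base, const char *name,
                              char *out, size_t outlen)
{
    char dir1[256], dir2[256];

    if (strlen(name) < 2)
        return -1;                          /* too short to bucket */

    snprintf(dir1, sizeof(dir1), "%s/%c", base, name[0]);
    snprintf(dir2, sizeof(dir2), "%s/%c", dir1, name[1]);

    /* EEXIST is fine: the buckets may already be there. */
    if (mkdir(dir1, 0755) == -1 && errno != EEXIST) return -1;
    if (mkdir(dir2, 0755) == -1 && errno != EEXIST) return -1;

    snprintf(out, outlen, "%s/%s", dir2, name);
    return 0;
}

int main(void)
{
    char path[256];

    if (make_bucketed_path("/mydir", "system.cfg", path, sizeof(path)) == 0)
        printf("%s\n", path);               /* /mydir/s/y/system.cfg */
    return 0;
}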

Good luck


Bill Caroselli – Q-TPS Consulting
1-(626) 824-7983
qtps@earthlink.net

I have made crude measurements of the time required to open a file,
depending on the number of files in the directory (QNX 6).
Suppose the time to open a file in a directory with 100 files is 1.
Then it is 264 in a directory with 1000 files and 3300 in a directory
with 10000 files.
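
For what it’s worth, a measurement like this can be approximated with
a small POSIX C program along the following lines. The directory name,
file-name pattern, and count of 1000 opens are placeholders, not the
setup actually used for the numbers above:

/* Time open()+close() on files in one directory. /bigdir and the
 * file_%04d naming are placeholders; the files must already exist. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <time.h>

int main(void)
{
    struct timespec t0, t1;
    char path[256];
    double us;
    int i, fd;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (i = 0; i < 1000; i++) {
        snprintf(path, sizeof(path), "/bigdir/file_%04d", i);
        fd = open(path, O_RDONLY);
        if (fd != -1)
            close(fd);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    us = (t1.tv_sec - t0.tv_sec) * 1e6 + (t1.tv_nsec - t0.tv_nsec) / 1e3;
    printf("average open+close: %.1f us\n", us / 1000.0);
    return 0;
}

Running the same loop against directories pre-populated with 100, 1000,
and 10000 files should give comparable relative figures.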

"Bill Caroselli" <qtps@earthlink.net> wrote in message
news:bij4lp$qsu$1@inn.qnx.com
