What's the best way to write to more files than the kernel allows open at a time?
Posted by Elpezmuerto on Stack Overflow, published 2010-06-16T15:45:50Z
I have a very large binary file and need to create separate output files based on an id within the input file. There are 146 output files, and I am using cstdlib with fopen and fwrite. FOPEN_MAX is 20, so I can't keep all 146 output files open at the same time, and I also want to minimize the number of times I open and close each output file.

How can I write to the output files efficiently?

I must also use the cstdlib library because of legacy code.
© Stack Overflow or respective owner