I need to process some big files. The kind of processing I do cannot be done line by line, but it can be done in small chunks. So why not define $/ and read one chunk at a time?
It seemed like a good solution, but Perl uses too much memory (even though the chunks are only 300 bytes long).
An elegant solution that does not work... time to think of another one...
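A minimal sketch of what was presumably attempted, assuming the file is read through a lexical filehandle (the filename bigfile.dat is made up for illustration):

    use strict;
    use warnings;

    # $/ stringifies to "300", so readline treats the literal string
    # "300" as the record separator, not as a byte count.
    local $/ = 300;

    open my $fh, '<', 'bigfile.dat' or die "Cannot open bigfile.dat: $!";
    if (defined(my $chunk = <$fh>)) {
        # Unless the string "300" actually occurs in the data, this
        # slurps the entire file into memory in a single read.
        printf "read %d bytes\n", length $chunk;
    }
    close $fh;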
You are setting "$/ = 300;" when you should be setting "$/ = \300;". With the former, <> thinks the string "300" is the line terminator (so you could get very large chunks). The latter reads 300 bytes at a time.
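For contrast, a sketch of the fixed-size record approach, with $/ set to a reference to an integer (again using a made-up filename):

    use strict;
    use warnings;

    # A reference to an integer puts readline into record mode:
    # each read returns at most 300 bytes.
    local $/ = \300;

    open my $fh, '<', 'bigfile.dat' or die "Cannot open bigfile.dat: $!";
    while (my $chunk = <$fh>) {
        # Every $chunk is exactly 300 bytes, except possibly the last.
        printf "read %d bytes\n", length $chunk;
    }
    close $fh;

Because readline never scans for a terminator in record mode, memory use stays bounded by the record size regardless of the file's contents.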