In my PHP application, I receive mail via Postmark's inbound hook. This service receives the mail and POSTs it JSON-encoded to a URL on my server, which works fine.
The issue I have is that when a mail has attachments of more than 10 MB, the request fails with:
PHP Fatal error: Allowed memory size of 104857600 bytes exhausted (tried to allocate 1821693 bytes)
What I'm doing in this line is:
$in = json_decode(file_get_contents("php://input"));
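One way to keep that line from ever hitting the memory limit is to reject oversized payloads before buffering them. This is only a sketch, not the original handler: the 10 MB cutoff is an assumed value, and it presumes the webhook sends a `Content-Length` header.

```php
<?php
// Sketch: reject oversized webhook payloads before buffering them,
// so the script never approaches the memory limit.
// The 10 MB cutoff is an assumed value, not from the question.
$maxBytes = 10 * 1024 * 1024;
$length   = (int) ($_SERVER['CONTENT_LENGTH'] ?? 0);

if ($length > $maxBytes) {
    http_response_code(413);  // 413 Payload Too Large
    exit("Payload of {$length} bytes exceeds the {$maxBytes}-byte limit");
}

$in = json_decode(file_get_contents("php://input"));
```

Whether you can simply drop such requests depends on whether losing large mails is acceptable; otherwise the memory limit itself has to be raised.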
I have two questions:
Edit after debugging with memory_get_usage():
Script start: 47 MB memory usage.
After $in = file_get_contents("php://input");: 63 MB memory usage.
After json_decode($in);: PHP terminates because the memory limit is exhausted.
Interestingly, the script already starts with 47 MB of memory usage before issuing any command. I guess this is due to the large input data; maybe because PHP stores it in $HTTP_RAW_POST_DATA?
Is there any php.ini directive I could use to make PHP create fewer variables?
E-mail attachments are stored as base64, so the encoded attachment data is roughly 1.4 times the original size; for 10 MB of attachments that is about 14 MB in the mail body. json_encode (on the sender's side) can add further overhead, so a single file_get_contents() call may already hold tens of megabytes, and json_decode() then needs a comparable amount again for the decoded structure. Add some local variables and at least one loop, and 100 MB is exhausted.
I suggest you read about memory_get_usage(): use it to trace where PHP allocates memory.
Then use unset() and gc_collect_cycles() to free what you no longer need.
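Put together, the suggestion might look like the sketch below: trace allocations at each step, and drop the raw JSON string as soon as it has been decoded so only one large copy remains.

```php
<?php
// Sketch: trace allocations with memory_get_usage() and free the raw
// JSON string as soon as it has been decoded.
printf("start: %.1f MB\n", memory_get_usage() / 1048576);

$raw = file_get_contents("php://input");
printf("after read: %.1f MB\n", memory_get_usage() / 1048576);

$in = json_decode($raw);
unset($raw);          // drop the raw string; only the decoded data remains
gc_collect_cycles();  // reclaim collectable memory right away
printf("after decode: %.1f MB\n", memory_get_usage() / 1048576);
```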
UPDATE: I'm not sure why json_decode needs so much memory (maybe a bug; try updating PHP?). Anyway, in php.ini:
register_globals = Off
register_long_arrays = Off
register_argc_argv = Off
auto_globals_jit = On
always_populate_raw_post_data = Off
Your 2nd question: base64.
Thus, the actual length of MIME-compliant Base64-encoded binary data is usually about 137% of the original data length
JSON itself should not add big overhead, but additional encoding of the mail body into JSON could well apply base64 again.
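The 137% figure quoted above can be checked directly: raw base64 output is 4/3 of the input, and MIME's CRLF line break after every 76 characters adds the rest. A minimal sketch (random_bytes() needs PHP 7 or later):

```php
<?php
// Sketch: measure the MIME base64 overhead. Raw base64 is 4/3 of the
// input; a CRLF after every 76 characters adds roughly another 2.6%.
$original = random_bytes(3 * 1024 * 1024);  // 3 MB sample of binary data
$encoded  = chunk_split(base64_encode($original), 76, "\r\n");
printf("size: %.1f%%\n", 100 * strlen($encoded) / strlen($original));
// prints a size of roughly 136.8% of the original
```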