I have a PHP script that processes a large amount of data from a database. It reads rows from one table in batches of N (I tried values from 100 to 100000) and inserts them into another table. max_execution_time is set to 0, and every iteration is wrapped in a transaction. I select each batch with pg_query(). But after 1-2 hours the script fails with "Maximum execution time of 0 seconds exceeded", and the error message points to the line with pg_query(). Has anyone had this issue? Any cure?
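For context, the loop is roughly structured like this (connection details, table and column names, and the batch size are simplified placeholders, not my actual code):

    <?php
    // Simplified sketch of the copy loop; names and sizes are placeholders.
    set_time_limit(0); // same effect as max_execution_time = 0

    $conn = pg_connect("host=localhost dbname=mydb user=myuser");
    $batchSize = 10000;
    $offset = 0;

    while (true) {
        pg_query($conn, "BEGIN");

        // Select the next batch of rows from the source table.
        $result = pg_query(
            $conn,
            "SELECT id, data FROM source_table ORDER BY id LIMIT $batchSize OFFSET $offset"
        );

        if (pg_num_rows($result) === 0) {
            pg_query($conn, "COMMIT");
            break; // nothing left to copy
        }

        // Insert the batch into the target table.
        while ($row = pg_fetch_assoc($result)) {
            pg_query(
                $conn,
                "INSERT INTO target_table (id, data) VALUES (" .
                pg_escape_literal($conn, $row['id']) . ", " .
                pg_escape_literal($conn, $row['data']) . ")"
            );
        }

        pg_query($conn, "COMMIT");
        $offset += $batchSize;
    }

The error is reported on the pg_query() call that runs the SELECT.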
UPD:
Having tried the answer proposed here -- setting max_input_time to -1 -- I still have no luck. The error just moved from the pg_query line to another line, which seems to be a pretty random one. So I guess pg_query has nothing to do with it, and neither does max_input_time.
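In case it helps anyone reproduce this, the limits actually in effect can be checked from inside the script like this (just a sanity-check sketch, not part of my original code):

    <?php
    // Sanity check: print the limits the script really runs with.
    // Expect "0" for max_execution_time and "-1" for max_input_time
    // if the settings took effect for the SAPI being used.
    echo ini_get('max_execution_time'), "\n";
    echo ini_get('max_input_time'), "\n";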