
I have a PHP script processing a large amount of data from a database. It reads rows from one table in batches of N (I tried values from 100 to 100000) and inserts them into another table. max_execution_time is set to 0. Every iteration is wrapped in a transaction, and each batch is selected with pg_query(). But after 1-2 hours my script fails with "Maximum execution time of 0 seconds exceeded", with the error message pointing to the line with pg_query(). Has anyone had this issue? Any cure?

UPD:

Having tried the answer proposed here -- setting max_input_time to -1 -- I still have no luck. The error has moved from the pg_query() line to another, seemingly random, line. So I guess pg_query() has nothing to do with it, and neither does max_input_time.
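For reference, the setup described above looks roughly like this (a minimal sketch only; the connection string, table and column names, and batch size are made-up placeholders, and error handling is omitted):

```php
<?php
// Minimal sketch of the batch-copy loop described in the question.
// All identifiers here are illustrative, not taken from the original script.
$conn = pg_connect('host=localhost dbname=mydb');
$batchSize = 10000;
$offset = 0;

while (true) {
    pg_query($conn, 'BEGIN');

    // Select the next batch from the source table.
    $res = pg_query($conn, sprintf(
        'SELECT * FROM source_table ORDER BY id LIMIT %d OFFSET %d',
        $batchSize, $offset
    ));

    if (pg_num_rows($res) === 0) {
        pg_query($conn, 'COMMIT');
        break; // no rows left, done
    }

    // Insert each row of the batch into the target table.
    while ($row = pg_fetch_assoc($res)) {
        pg_insert($conn, 'target_table', $row);
    }

    pg_query($conn, 'COMMIT');
    $offset += $batchSize;
}
```

Note that with this structure the timeout error can surface at whatever statement happens to be executing when the limit is hit, which would match the "random line" behavior described in the update.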


2 Answers


max_execution_time = 0 means the script may run forever.

However, other things may still stop your script. For example, Apache has a default script execution timeout of 5 minutes.

See this: Is ini_set('max_execution_time', 0) a bad idea?
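If the script is served through Apache rather than run from the CLI, both limits have to cooperate. A sketch of the two relevant settings (the values shown are illustrative assumptions, not taken from the question):

```php
<?php
// PHP side: remove the PHP execution limit for this request.
// Equivalent to set_time_limit(0).
ini_set('max_execution_time', '0');

// Apache side (this lives in httpd.conf, NOT in PHP, so it is shown
// here only as a comment):
//
//   Timeout 300
//
// Raising Apache's Timeout -- or running the script from the CLI with
// `php script.php` -- avoids the web server killing the request.
```

For an hours-long batch job, running it from the CLI is usually the simpler choice, since the CLI SAPI has no web-server timeout at all.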


Where do you get that setting from? From a php.ini file? If so, search your project code for ini_set() calls; they take priority over php.ini. I bet one silently crept in.
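One way to confirm this is to print the effective value from inside the running script itself, right where the long loop executes; a short diagnostic sketch:

```php
<?php
// Print the value that is actually in effect at this point in the code.
// If some ini_set() elsewhere has overridden php.ini, it shows up here.
var_dump(ini_get('max_execution_time'));

// If something did override it, force it back for this request only:
set_time_limit(0); // same effect as ini_set('max_execution_time', '0')
```

Calling set_time_limit() inside the loop also resets the timer each iteration, which can help if a library re-applies its own limit mid-run.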