
Specs: CakePHP 2.4.6, PHP 5.3.13, Windows 7 x64, Apache 2.4.9.

I keep getting this error:

Allowed memory size of xxxx bytes exhausted

I have tried everything I could find: increasing memory_limit or max_execution_time inside php.ini does not help, no matter how high I set the limits. My Apache build has OpenSSL enabled (some say that might be related), and php_pdo_mysql is enabled too. I am serving PHP through FastCGI, and switching to php5_module does not help either.
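
Since raising the limits in php.ini seems to have no effect, it is worth confirming which php.ini the web server's PHP actually loads (FastCGI setups often read a different one than the CLI). A quick throwaway check script (the file name is arbitrary):

<?php
// check.php: print which php.ini is loaded and the effective limits
echo 'Loaded php.ini: ', php_ini_loaded_file(), "\n";
echo 'memory_limit: ', ini_get('memory_limit'), "\n";
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";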

The strange thing is that it worked on my localhost for a while but not on the dev server, and recently it stopped working on my localhost as well (sometimes I now get a maximum-execution-time error instead).

Here is the action I am using:

public function index() {
    // Containable trims the associated data that gets fetched,
    // but this still loads every matching Certificate row at once.
    $this->Certificate->recursive = 0;
    $certificates = $this->Certificate->find('all', array(
        'conditions' => array('course_id !=' => '1'),
        'contain'    => array(
            'CertificateStatus.status',
            'Course.title',
            'Student.full_name',
            'User.name',
        ),
    ));
    $this->set(compact('certificates'));
}

Any ideas as to what could be causing this problem? How would I find memory leaks in CakePHP, and what could be done to fix or optimize it?
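
To get a rough picture of where the memory actually goes, core PHP's memory functions can bracket the expensive call; a sketch (Controller::log() is CakePHP 2's built-in logging shortcut):

$before = memory_get_usage();
$certificates = $this->Certificate->find('all', array(/* ... as above ... */));
$this->log(sprintf(
    'find() used %.1f MB, peak %.1f MB',
    (memory_get_usage() - $before) / 1048576,
    memory_get_peak_usage() / 1048576
), 'debug');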

**Update:**

After limiting the `fields` in the find() and increasing some Fcgid limits inside the httpd configuration, I can now fetch the whole table, but the page takes 5 minutes to load completely, which is a failure performance-wise!
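
Roughly, the trimmed-down action now looks like this (the exact `fields` list below is illustrative, not my real one):

public function index() {
    $certificates = $this->Certificate->find('all', array(
        // fetch only the columns the page actually displays
        'fields'     => array('Certificate.id', 'Certificate.course_id'),
        'conditions' => array('course_id !=' => '1'),
        'contain'    => array(
            'CertificateStatus.status',
            'Course.title',
            'Student.full_name',
            'User.name',
        ),
    ));
    $this->set(compact('certificates'));
}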

Is this related to CakePHP caching? If so, is there any way to optimize it for this number of records? Everyone tells me that 26,000 text records is not a "huge" data set!

– salouri
  • possible duplicate of [CakePHP error Allowed memory size exhausted](http://stackoverflow.com/questions/6483162/cakephp-error-allowed-memory-size-exhausted) – Nunser Jul 21 '14 at 18:47
  • I already read the question you linked ([CakePHP error Allowed memory size exhausted](http://stackoverflow.com/questions/6483162/cakephp-error-allowed-memory-size-exhausted)), Nunser, and none of the suggestions there worked. – salouri Jul 21 '14 at 19:16
  • Then you should post the particular action that's draining all the memory so we can help you optimize it. I don't believe that, with the Containable behavior implemented as the other question suggests, you could be running out of memory on *every* action. – Nunser Jul 21 '14 at 19:19
  • Ok, I will check that out again and I will get back with the results – salouri Jul 21 '14 at 23:20
  • Nunser, the code (using Containable) is provided, and again, it's not helping. BTW, the number of records I am fetching is > 26,000. – salouri Jul 22 '14 at 01:42
  • What are you trying to do? Dealing with 26,000 records in a single HTTP request sounds like something that should be avoided. As you seem to pass them to the view, I guess you want to show them, so why not use pagination? And if your goal is not to show them but to do some processing, a shell script would certainly be more appropriate. – nIcO Jul 22 '14 at 08:19
  • nIcO, this is supposed to be an online database interface. I am not using CakePHP pagination; instead I am using the jQuery DataTables plugin. There is a need to show these records to be able to search for, view, edit, and delete each one. The problem is, it was working fine on my localhost (only) with a 128M memory_limit, and now it's not working no matter what I set the limit to (including -1)! – salouri Jul 22 '14 at 14:20
  • Let's compare with phpMyAdmin: you can search, view, edit and delete records, but it doesn't use queries that return all records at the same time. Even if you manage to make your page work the way you want, what will happen when you have 500,000 records? At some point it will break the memory again (Apache's, or even worse, the server's). Using a jQuery plugin doesn't change anything: you should fetch the records in batches. Also think of how much memory browsers will consume with all the records in the page, whether they are shown or not. I think you should really change your design. – nIcO Jul 22 '14 at 15:03
  • nIcO, what exactly are you suggesting to change, and how do I implement the batch-fetching approach? I am aware of the idea, but I haven't seen how to apply it in CakePHP. Would you direct me to a reference, please? Thanks. – salouri Jul 22 '14 at 15:24
  • Does that mean CakePHP was never meant to build online interfaces for databases? That would be a shame! – salouri Aug 06 '14 at 21:17
  • SQL's purpose is to search large datasets (using `WHERE` clauses and the like) to produce human-sized result sets. Your CakePHP code shouldn't be trying to buffer 26K records in web server RAM. Rather, you should be passing your search criteria from jQuery to PHP to the database, and rendering subsets of the dataset. It's true that 26K records is a medium-size *table*, but it's an over-the-top, too-large *result set*. – O. Jones Jan 10 '15 at 14:59
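
A minimal sketch of the batch/pagination approach suggested in these comments, using CakePHP 2's built-in paginator (the page size and `fields` list are assumptions, not from the question):

public function index() {
    $this->Certificate->recursive = -1; // let Containable drive the joins
    $this->paginate = array(
        'fields'     => array('Certificate.id', 'Certificate.course_id'),
        'conditions' => array('course_id !=' => '1'),
        'contain'    => array(
            'CertificateStatus.status',
            'Course.title',
            'Student.full_name',
            'User.name',
        ),
        'limit' => 100, // 100 rows per request instead of all 26,000
    );
    $certificates = $this->paginate('Certificate');
    $this->set(compact('certificates'));
}

jQuery DataTables can then request each page through its server-side processing mode instead of loading all rows at once.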

4 Answers

ini_set('memory_limit', '-1');

Setting the limit to -1 removes PHP's memory limit entirely.

– apurav gaur
  • I tried this before, and then I got the error: `Maximum execution time of 60 seconds exceeded` (I had already increased it from 30 to 60 seconds). – salouri Jul 22 '14 at 15:03

Try this code:

ini_set('memory_limit', '1024M');
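
If you try this, one possible place to put it in CakePHP 2 is app/Config/bootstrap.php, so it runs early on every request (the placement is a judgment call, not something CakePHP mandates):

// app/Config/bootstrap.php
ini_set('memory_limit', '1024M');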
– Indrajeet Singh

Indrajeet's answer worked for me.

To build on it: open lib/Cake/I18n/I18n.php and, inside `public function __construct()`, add the line

ini_set('memory_limit', '1024M');
– clod986
    set_time_limit(0);
    ini_set('memory_limit', '2048M');

I used set_time_limit(0) to remove the script execution time limit and ini_set() to raise the amount of memory the script may use.

– Naresh Dudhat