
I am trying to develop a Node.js function on Google Cloud Functions that reads a CSV file from Cloud Storage, converts its character encoding, and writes the encoded CSV file back to the storage. The function generates the output file successfully when the target file is small (15 KB). However, when the file is large (>100 MB), the function generates nothing.

  • Is there an upper limit to the file size when reading and writing files with Node.js on Google Cloud Functions?
  • If you know how to deal with this problem, I would appreciate it if you could let me know.
    Most likely the code is trying to read the entire file (or at least too much of it) into memory, and running out of memory. But there's not much more we can say without seeing the error you get, and the [minimal, complete/standalone code that reproduces that error](http://stackoverflow.com/help/mcve). – Frank van Puffelen Jul 26 '21 at 06:11
  • Did you try to increase the memory size of your cloud function and try again? Do you have error logs when you generate large file? – guillaume blaquiere Jul 26 '21 at 07:30

2 Answers


Though Google Sheets is completely free from Google, it still has some limitations.

Google's help pages list these limitations but no explicit file-size limit; from them you can estimate that the maximum file size is somewhere around 20 MB, though it varies depending on the data types of your fields (string, integer, and so on).

Google Sheets limitations as of 26-Jul-2021:

Up to 5 million cells or 18,278 columns (column ZZZ) for spreadsheets that are created in or converted to Google Sheets.

Up to 5 million cells or 18,278 columns for spreadsheets imported from Microsoft Excel. The limits are the same for Excel and CSV imports.

If any single cell has more than 50,000 characters, that cell will not be uploaded.

A file larger than 100 MB will already have crossed some of these limits. Google Cloud sometimes returns such errors for your requests; you might notice them in the Google Sheets responses to your batch requests.

Ajay
  • Thank you for your response, Ajay-san! Indeed, I have just found an article about it: https://cloud.google.com/functions/quotas I'd appreciate it if you could share an alternative to deal with it. – Satoshi Nakamura Jul 26 '21 at 06:35
  • An alternative that I would suggest is to split your data into multiple spreadsheets to overcome this. In case that's not possible, look for a database instead of spreadsheets. – Ajay Jul 26 '21 at 07:55

From this documentation it is clear that Google Cloud Functions has a quota limit of 10 MB on data sent to HTTP functions in an HTTP request, data sent from HTTP functions in an HTTP response, and data sent in events to background functions. This Stack Overflow thread shows how you can compress the files to stay under this limit.

Prabir