
There are several occasions when I need to collect data in the field, in situations where I do not always have access to my Postgres database.

To keep things in sync, it would be excellent if I could use psycopg2 functions offline to generate queries that can be held back, and then, once I am able to connect to the database, process everything that was held back.

One thing I am currently struggling with is that the psycopg2 cursor requires a connection to be constructed.

My question is:

Is there a way to use a cursor to do things like mogrify without an active connection object? Or with a connection object that is not connected to a database? I would then like to write the mogrify results temporarily to file so they can be processed later.

milovanderlinden

2 Answers


It would be a fragile approach, as the connection is used to detect some parameters used for escaping (encoding, standard-conforming strings, etc.).

You can do a "manual mogrify" by calling `psycopg2.extensions.adapt(x).getquoted()` on each of your parameters and then merging them into the query using the regular Python `%` operator.

Currently released psycopg versions (up to 2.3.2) may fail on None -> NULL conversion though. You can either convert these values manually or register an adapter for None: you can see how in this commit.

piro
  • Thank you for your answer. I understand the fragility and will indeed look for another option. – milovanderlinden Feb 04 '11 at 08:32
  • This was in 2011; now, in 2017, is there any way to supply those parameters (encoding, standard-conforming strings, etc.) or fake them in order to call mogrify without a connection? I see the same photo on GitHub under dvarrazzo, so I believe you are the same developer; if so, could you explain? Thanks – Sk. Feb 22 '18 at 15:37
  • @Sk. no, there isn't a way to fake the parameters. If you know what you are doing, mostly in terms of what your server is able to accept, you can still use `adapt()` manually. It is not a "generic" solution, a string you create once and run anywhere, but if you know how your server is configured the solution is probably good enough. I think the accepted answer is right: an SQL string is a poor choice to store data, better store it into a proper data structure and just convert them to SQL when sending it to the server. – piro Feb 22 '18 at 17:02

It seems like it would be easier and more versatile to store the data to be inserted later in another structure, perhaps a CSV file. Then, when you connect, you can run through that file, and you can also easily do other things with the CSV if necessary.

chmullig
  • I will use this approach: create one (or more) CSV files that can be copied into PostgreSQL when online: http://initd.org/psycopg/docs/cursor.html#cursor.copy_from. Thank you – milovanderlinden Feb 04 '11 at 08:31