SQL request size, limits?

Hi, I have an issue with large INSERTs and UPDATEs on Postgres. Large SQL statements are returning an unterminated quoted string error, yet I have tested the exact same SQL in psql at the command line and it works fine. Sending it through the PostgreSQL driver in Xojo as straight SQL errors out. Is there some limit on the size of a query? I thought I remembered some of the drivers putting an artificial limit on SQL size.

This is for a bytea field, so the inserts and updates will be reasonably large.

I don’t know of any limit. Are you using a PreparedStatement?

Not right now; these are just simple, run-of-the-mill SQL statements, around 28 KB.

How do you escape/unescape your binary data? I didn’t find bytea support (proper escape/unescape methods, for example) in the Xojo docs.

Well, I have been using the built-in routines in the older PG4SQL driver I have traditionally used. With Xojo I have been working on my own version of the escape routines that will compile in the newer versions. During testing I found this current issue: by hand the UPDATE/INSERT works in psql, but it fails in Xojo using the hex encoding method from the 9+ series of PostgreSQL. The older escape format seemed more problematic, so I tackled hex first; if this turns out to be a road block, the next step is using the MBS SQL drivers and wrapping them in Xojo code to give a consistent interface.
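For reference, the PostgreSQL 9.0+ hex format for a bytea literal is a leading \x followed by two hex digits per byte, wrapped in single quotes. A minimal sketch in Python (purely to illustrate the format, not the Xojo routines; the function name is my own, and it assumes standard_conforming_strings is on so the backslash is taken literally):

```python
def bytea_hex_literal(data: bytes) -> str:
    # PostgreSQL 9.0+ hex format: a leading \x, then two hex digits per byte,
    # inside a single-quoted string literal.
    return "'\\x" + data.hex() + "'"

# Hypothetical table/column names, for illustration only.
sql = "INSERT INTO images (body) VALUES (" + bytea_hex_literal(b"\xff\xd8\xff") + ")"
print(sql)  # INSERT INTO images (body) VALUES ('\xffd8ff')
```

Since the output contains only hex digits, no quote characters from the binary data can ever leak into the literal, which is what makes the hex format attractive over the older escape format.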

I had an old feature request in the system asking that the escape routines for all the supported DB types be exposed for direct use, for exactly this type of problem.

We need some words from Xojo here about this. :frowning:

At the very least, an example that writes pure binary data into a bytea column and pulls it back out, with native storage on the backend. I have a workaround based on Base64 encoding that stores the data essentially as a string, but that increases the storage requirements by roughly a third, and it is totally unnecessary if we have access to the libpq functions or a “Xojo DB” driver solution that handles bytea properly behind the scenes as a blob field.
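The Base64 overhead is easy to quantify: Base64 emits four characters for every three input bytes, so a text column holding Base64 grows to about 4/3 of the raw size, while a native bytea column stores the raw bytes. A quick check in Python:

```python
import base64

payload = bytes(range(256)) * 4      # 1 KiB of arbitrary binary data
encoded = base64.b64encode(payload)  # what a Base64-in-a-text-column scheme stores
print(len(payload), len(encoded))    # 1024 vs 1368: roughly 33% larger on disk
```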

Using the newer hex-encoded method to access bytea columns in PostgreSQL 9.x, I just read the field as NativeValue from the RecordSet (StringValue should work as well), strip the leading “\x”, pass the value through REALbasic.DecodeHex, and assign it to a MemoryBlock. For writing I use prepared statements and essentially do the reverse operation. This works fine for me. What is the exact error message from db.ErrorMessage? Have you had a look at the log on the server? Does the error you receive actually depend on the size of the binary data?
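That read path translates almost line for line into other languages. A sketch in Python, standing in for REALbasic.DecodeHex and the MemoryBlock assignment (the function name is my own):

```python
def decode_bytea_hex(field_value: str) -> bytes:
    # PostgreSQL 9.x returns bytea as "\x" followed by hex digits.
    if not field_value.startswith("\\x"):
        raise ValueError("expected hex-format bytea value")
    # Strip the leading "\x", then hex-decode the rest.
    return bytes.fromhex(field_value[2:])

# Round trip: the first bytes of a JPEG (FF D8 FF) survive intact.
assert decode_bytea_hex("\\xffd8ff") == b"\xff\xd8\xff"
```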

Forgot to say: I’m building the INSERT/UPDATE statements myself, not relying on DatabaseRecord for inserts or RecordSet.Edit/Update for updates…

It is not a prepared statement, just regular SQL being executed. The actual SQL was about 28 KB during the initial test (a small JPG file), and it returns an “improperly terminated quoted string” error or something similar; it is not in front of me at the moment. But copying the same SQL out of the debugger and running it through psql works great. I will try your approach and see if a prepared statement makes any difference.