Best Way To Save Lengthy String Data to MS SQL Server DB

I’m still wrestling with trying to save lengthy RTF text string data to my MS SQL Server database, so I’m going to try this from a slightly different angle than the post I made yesterday.

I have read all about the “8K limit” but still don’t understand why I can’t save over 2,000 characters of Unicode string data to the db. The involved table’s column is set to NVARCHAR(MAX), and I set the “large value types out of row” table option to 1 (i.e., “on”) via the sp_tableoption system stored procedure, which, from what I read, allows storage beyond the 8K limit (only a 24-byte pointer is stored in the main row, and an additional data page is allocated in the LOB storage space for the large data itself). This didn’t seem to make any difference, though.

The string involved has RTF text in it. When the total character count reaches 2,000, it just doesn’t save anything (no error messages, no overwriting of data that’s already there … just NOTHING!). Anything under 2,000 characters saves just fine. 2,000 Unicode characters equal 4,000 bytes (2 bytes per Unicode character), which is only half the 8K limit … so why does it stop there???
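As a sanity check on the byte arithmetic above, here is a small sketch (Python is used purely for illustration; the 8,060-byte figure is SQL Server’s approximate per-row limit, and the 2-bytes-per-character rule ignores surrogate pairs):

```python
# NVARCHAR stores text as UCS-2/UTF-16: 2 bytes per character
# (outside surrogate pairs), so 2,000 characters is only 4,000 bytes.

def nvarchar_bytes(char_count: int) -> int:
    """Approximate storage size of an NVARCHAR value, in bytes."""
    return char_count * 2

IN_ROW_LIMIT = 8060  # approx. max bytes SQL Server fits in one data row

for chars in (2000, 4000, 5000):
    size = nvarchar_bytes(chars)
    fits = "within" if size <= IN_ROW_LIMIT else "over"
    print(f"{chars} chars -> {size} bytes ({fits} the in-row limit)")
```

So a 2,000-character string is nowhere near the row limit, which supports the suspicion in the thread that something other than the database is truncating at 2,000.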

How do all of you out there save lengthy string data to a database? I really would like to avoid having to use individual RTF files if at all possible. I’m doing that with the images in this program right now (just using the db to store ‘pointers’ to the images on the server) and there are hundreds of them (and it would be the same for RTF files if I went that way instead of saving the RTF text to the db itself).

Is there something “basic” that I’m just missing here (and haven’t found in 2 days of reading!)???

If you try to do this manually in a db tool, what happens?
That would at least help isolate whether it’s a Xojo-only issue or a db issue.

Thanks Norman … Good idea! … let me try SSMS right now and see what I get.

When you say 2,000 characters in RTF format, I suppose you’re talking about the raw code? RTF is awfully verbose, and the actual code underneath a simple string can be really overwhelming:

[code]{\rtf1\ansi\ansicpg1252
{\fonttbl\f0\fswiss\fcharset0 Helvetica;}
\f0\fs24 \cf0 Hello World}[/code]

Hi Michel … The 2,000 limit I’m running up against is the “total” character count including all the RTF overhead. I displayed the “len” of the string containing the RTF data (which counts all the characters, not just the ASCII text I entered) and that is what I am referring to when I say 2,000 … and you are certainly correct about the verbose nature … the RTF stuff has more characters than the actual text content itself at times.
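To put a number on that verbosity, here is a small sketch (Python for illustration only) comparing a minimal RTF document, modeled on the snippet quoted above, against the plain text it actually carries:

```python
# Compare the length of a minimal RTF document with the length of the
# plain text it contains ("Hello World"). The markup alone can easily
# outweigh the content, as noted in the thread.

plain_text = "Hello World"
rtf = (r"{\rtf1\ansi\ansicpg1252"
       r"{\fonttbl\f0\fswiss\fcharset0 Helvetica;}"
       r"\f0\fs24 \cf0 Hello World}")

overhead = len(rtf) - len(plain_text)
print(f"plain text: {len(plain_text)} chars")
print(f"full RTF:   {len(rtf)} chars")
print(f"markup overhead: {overhead} chars")
```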

OK, Norman … I AM able to paste MUCH larger (I went substantially over 3,500 total characters with one try) RTF text strings directly into the appropriate table cell using SSMS 2012. And, after that, I can open them successfully in their entirety with the same application I’m creating that won’t save them properly over 2,000 characters to begin with. Other than probably ruling out a database issue, what’s that telling me??? Where do I look next???

Good question
I’m not at home right now and I don’t have MS SQL here
I’d actually file a bug report about this with the steps you’ve tried (basically what you have written in this thread) and one of the guys can look at it
It may be an issue in the plugin but I don’t really know

I noticed on another thread you’re using the MBS plugin
Is that the case here ?
Maybe christian has an idea about this
And if that is the case then don’t bother filing a bug report about xojo unless we can isolate it to the Xojo plugin

Have you tried using a BLOB instead?

[quote=126574:@Norman Palardy]I noticed on another thread you’re using the MBS plugin
Is that the case here ?
Maybe christian has an idea about this [/quote]
Thanks again, Norman … Yes, I am using the MBS SQL plugin. I have already sent Christian a few emails on it yesterday and am awaiting his response now. I don’t really have a good workaround, so I’m stuck in the water at the moment.

I keep narrowing it down to something happening with the MBS prepared statements. The data going into the binding step is correct (even for strings much larger than 2,000 characters) … then the prepared statements are executed and nothing results (no data saved) when the string size exceeds 2,000 characters.

On the other end, I can manually put a large RTF text string (substantially more than 4,000 characters, at that) into the database and retrieve it successfully with the same program using the same MBS SQL plugin (but obviously executing read operations, not writes). However, if I immediately try saving that very same data back to the database via the MBS prepared statements, the problem occurs again (the RTF string won’t save) once the string size is over 2,000 characters, and what was previously in the database table can no longer be retrieved.

Hi Beatrix … thanks for taking the time to respond. MS SQL does not have a “BLOB” data type to choose from (I think that’s a MySQL/SQLite thing) in the table column data type choices; the closest they come is VARBINARY(MAX). Going the binary-stream route is one of the few things I haven’t tried yet, but at the moment it appears to be more related to something byte-count-related in the MBS SQL plugin than to which data type is used … but believe me, I’ll give the binary route a go just to prove one way or the other whether it has any impact.
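For what it’s worth, the “binary route” amounts to storing the same bytes SQL Server keeps inside an NVARCHAR column, which are UTF-16LE. A minimal round-trip sketch (Python for illustration only; the actual binding to a VARBINARY(MAX) parameter would go through the plugin):

```python
# Encode a long RTF-ish string to UTF-16LE bytes (the encoding NVARCHAR
# uses internally), as one would bind them to a VARBINARY(MAX) column,
# then decode them back and verify nothing was lost.

text = "{\\rtf1 some long RTF payload}" * 200   # well over 2,000 chars

blob = text.encode("utf-16-le")    # bytes to store in VARBINARY(MAX)
restored = blob.decode("utf-16-le")

assert restored == text
print(f"{len(text)} chars -> {len(blob)} bytes; round trip OK")
```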

Well you could try the ODBC plugin on Windows and target MS SQL

You lost me there, Norman … could you please elaborate a bit on how I’d do that … Thks!

The MBS plugins, I think, have a way for you to use either ODBC or DB-Library as the underlying mechanism they use to talk to the database, since they use SQLAPI++ under the hood and those libraries let you specify which mechanism to use.

I can see how to do it using SQLAPI++ directly, but I’m not sure what the equivalent in the MBS plugin is. It should be very close to:

[code]SAConnection con;
con.setOption("UseAPI") = "DB-Library";
con.setClient(SA_SQLServer_Client);[/code]

Gotcha! Now I understand exactly. Worth a try, that’s for sure …

I simply did my tests with OLEDB:

[code]dim db as new SQLDatabaseMBS

db.Option("UseAPI") = "OLEDB"
db.Option("SQLNCLI.LIBS") = "sqlsrv32.dll" // library included in Windows Vista and newer
db.RaiseExceptions = true

if db.Connect then
  db.AutoCommit = db.kAutoCommitOff
  db.Scrollable = false
  // ... run the inserts here ...
end if[/code]

That worked with 100000 characters of text in a field with type “ntext”.

@Don Lyttle Were you ever able to resolve this successfully? I’m running into a similar problem here, but I’m accessing MSSQL 2005 via ODBC.

Not a lot of help, but it is trivial to do this in .NET, and I have seen, somewhere, T-SQL procs that do it as well. I think the key is just handing the storage off to a routine that compresses and writes it.

One thing that works for Oracle CLOBs is to use the Xojo DatabaseRecord. Might be worth trying that; there is a chance it will work with MSSQL.

Certainly you are nowhere near hitting the max size of the field, which is about 2GB. It is more like you are hitting the max size of an SQL statement.

Sorry I cannot be more help, I do not have a test MSSQL server up right now.