SQLite column width vs 2D Array

I need to create either 2D arrays or, preferably, an in-memory database to perform data manipulation on files whose column count exceeds 3000 and whose row count exceeds 6000. These are the files I receive and must work with. Splitting the files is not an option because the column headers would not line up, or differ between files.

I currently have two 2D arrays in which I dynamically map row 0 (the header columns from the CSV files) in order to line up the cell that needs to be worked on or that holds the needed info.

This process is SLOW.

So I was looking into an in-memory SQLite table, because then I could do things like record sets and find data quickly. The problem is that SQLite cannot handle tables with more than 2000 columns by default. I have read that you can raise this limit beyond 2000.
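For anyone wanting to see the limit in action, here is a quick sketch using Python's standard sqlite3 module (the behavior is SQLite's, so a Xojo SQLiteDatabase hits the same wall): a 2000-column table is accepted, one more column is rejected.

```python
import sqlite3

# SQLite's column cap is the compile-time constant SQLITE_MAX_COLUMN,
# which defaults to 2000. Table and column names here are made up.
db = sqlite3.connect(":memory:")

cols_ok = ", ".join(f"c{i} TEXT" for i in range(2000))
db.execute(f"CREATE TABLE wide_ok ({cols_ok})")  # 2000 columns: accepted

cols_bad = ", ".join(f"c{i} TEXT" for i in range(2001))
try:
    db.execute(f"CREATE TABLE wide_bad ({cols_bad})")  # 2001 columns
except sqlite3.OperationalError as e:
    print(e)  # SQLite refuses with a "too many columns" error
```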

Has anyone actually raised the column limit of the SQLite that runs inside Xojo?


Does anyone have other suggestions?



You would have to recompile SQLite to make it possible to use more than 2000 columns in a table; the limit is the compile-time constant SQLITE_MAX_COLUMN, which defaults to 2000.
And at that point you would also have to create your own SQLite plugin, as you can't just replace a portion of a plugin.

You could instead make an in-memory SQLite table with three columns:

row integer, column integer, value text

Make an index on (row, column).
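That schema might look like the sketch below. I'm using Python's sqlite3 module so it's easy to try, but the identical SQL can be sent through Xojo's SQLiteDatabase class; the table and index names are just placeholders. Note that ROW and COLUMN are SQL keywords, hence the quoting.

```python
import sqlite3

db = sqlite3.connect(":memory:")  # in-memory database
db.execute('CREATE TABLE cells ("row" INTEGER, "column" INTEGER, value TEXT)')

# A UNIQUE index makes each (row, column) pair a key, so a coordinate
# lookup uses the index instead of scanning all ~18 million cells.
db.execute('CREATE UNIQUE INDEX cell_key ON cells ("row", "column")')

db.execute("INSERT INTO cells VALUES (?, ?, ?)", (0, 2999, "Header3000"))
print(db.execute('SELECT value FROM cells WHERE "row"=? AND "column"=?',
                 (0, 2999)).fetchone()[0])  # prints "Header3000"
```

A side benefit of this layout: empty cells take no storage at all, which matters with a grid this size.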

Then make accessor methods with the same names as your current arrays, e.g.:

Sub array1(row as integer, column as integer, assigns value as string)
Function array1(row as integer, column as integer) as string

The accessor methods can update/look up the data in the database, and you don't have to change any of your existing code.
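To show the idea end to end, here is a small Python stand-in for that pair of Xojo methods (the setter with `assigns` and the matching getter); the class and names are hypothetical. INSERT OR REPLACE leans on the unique index to overwrite a cell that already exists.

```python
import sqlite3

class SparseGrid:
    """Stand-in for the paired array1 setter/getter methods."""

    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute('CREATE TABLE cells ("row" INTEGER, "column" INTEGER, value TEXT)')
        self.db.execute('CREATE UNIQUE INDEX cell_key ON cells ("row", "column")')

    def __setitem__(self, key, value):          # like: array1(row, column) = value
        row, column = key
        # OR REPLACE deletes the conflicting (row, column) entry, if any,
        # then inserts the new one -- i.e., an overwrite.
        self.db.execute("INSERT OR REPLACE INTO cells VALUES (?, ?, ?)",
                        (row, column, value))

    def __getitem__(self, key):                 # like: value = array1(row, column)
        row, column = key
        got = self.db.execute('SELECT value FROM cells WHERE "row"=? AND "column"=?',
                              (row, column)).fetchone()
        return "" if got is None else got[0]    # unset cells read as "", like a fresh array

grid = SparseGrid()
grid[5, 2999] = "hello"
print(grid[5, 2999])  # prints "hello"
```

One Xojo-specific tip: when loading the initial 3000 x 6000 grid, wrap the inserts in a single transaction (SQLBegin/SQLCommit, or SQLExecute of BEGIN/COMMIT) or the load itself will be slow.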