MySQLPreparedStatement compile error


I’m trying to use a MySQL prepared statement with a variable number of fields. I get a csv file of data, top row is field names, the number of columns varies. I create the table and then need to import the data. I’m getting an error when trying to do the BindType using a string variable for that line. I’m using the example from the docs and trying to get that working.

The error I get is: There is more than one item with this name and it’s not clear to which it refers.

As in the docs I want to reuse the prepared statement to insert new rows of data.

'ps.BindType(0, MySQLPreparedStatement.MYSQL_TYPE_STRING)
'ps.BindType(1, MySQLPreparedStatement.MYSQL_TYPE_LONG)

Dim vBind1 As String = "0, MySQLPreparedStatement.MYSQL_TYPE_STRING"
Dim vBind2 As String = "1, MySQLPreparedStatement.MYSQL_TYPE_LONG"

If I reduce this to just one BindType I still get the error. It has something to do with trying to use a variable string it seems.

Can’t figure this one out.

you can’t do it that way.
ps.BindType(0, MySQLPreparedStatement.MYSQL_TYPE_STRING)

would work.

You can’t simply put both parameters in a string and pass them.
You can put them in variables if needed:

[code]Dim vBind1 As String = MySQLPreparedStatement.MYSQL_TYPE_STRING
Dim vBind2 As String = MySQLPreparedStatement.MYSQL_TYPE_LONG

ps.BindType(0, vBind1)
ps.BindType(1, vBind2)[/code]

That doesn’t work, same error.

it should work. What lines do you use and what error?

Well the second argument in BindType is an integer, so I’m not sure if the snippet provided would work directly.

Do the CSV column titles match the Database columns? You might be able to write something clever that iterates over the columns and finds where data should go.

I’m using the exact lines as the example. Here’s a screen shot of the lines of code and the compile error.

Well I guess I won’t publish the screenshots because the service I’ve used, Photobucket, is impossible to work with. I’ll try something else but you can take my word for it that I am not going outside the published documentation.

I can see the bind type is integer. The first row of the csv data is the column titles/field names. I save that in an array and then I have another array for each data row. My intent is to iterate through them and they will match up exactly.

I’m now working with the Database.InsertRecord approach.

Sorry, my snippet was wrong:

[code]Dim vBind1 As Integer = MySQLPreparedStatement.MYSQL_TYPE_STRING
Dim vBind2 As Integer = MySQLPreparedStatement.MYSQL_TYPE_LONG

ps.BindType(0, vBind1)
ps.BindType(1, vBind2)[/code]

The types are integers, not string.

These two aren’t the same at all

The example is

      ps.BindType(Integer, Integer)

and what you’re trying is

      ps.BindType(String)

Strings do not automagically convert themselves from a string into two integer parameters

Suppose you have the names of the columns in an array called names
And somewhere you need to have the types of the destination columns (that's VERY useful)

     dim columnList as string
     dim valuesMarkers as string
     for i as integer = 0 to names.Ubound
             if columnList <> "" then columnList = columnList + ","
             if valuesMarkers <> "" then valuesMarkers = valuesMarkers + ","
             columnList = columnList + names(i)
             valuesMarkers = valuesMarkers + "?"
     next

      // now build the insert cmd
      dim insertCmd as string = "insert into " + tableName + " (" + columnList + ") values (" + valuesMarkers + ")"

      // create the prepared stmt (Prepare returns a PreparedSQLStatement, so cast it)
      dim ps as MySQLPreparedStatement = MySQLPreparedStatement( db.Prepare( insertCmd ) )

     //  now bind the types (BindType indexes are zero based)
     for i as integer = 0 to types.Ubound
             ps.BindType(i, types(i))
     next

    //  now use the heck out of the statement
WARNING THIS IS FORUM CODE so there may be syntax errors !!!

@Christian Schmitz [quote]The types are integers, not string.[/quote]

Thank you. A fine detail that I was overlooking. Both the bind types are integer even if the type being bound is a string. That got past me.

@Norman Palardy [quote]// now use the heck out of the statement[/quote]

Thanks for taking the time to do this. I’ve got something similar but could not get it to compile. Now it does.

I’ll work this up and post back.

I’m hung up now on the ps.SQLExecute piece of this. It doesn’t seem to like a variable in place of the actual data. I get the following error when trying to do the insert: DB Error: 1 parameters are being bound, but 2 types were specified.
Referring to the docs for MySQLPreparedStatement:

I’m trying to replace these with a loop to reuse the preparation already done. BTW, these will work so I know the prepare and bind statements work.

ps.SQLExecute("john", 20)
ps.SQLExecute("john", 21)
ps.SQLExecute("john", 22)
ps.SQLExecute("john", 20)
ps.SQLExecute("john", 21)
ps.SQLExecute("john", 22)

I have a csv text file with exactly the data above (john,20), I read a line, I build the data string to be inserted, and I put it inside the ps.SQLExecute. Here’s the code.

[code]While Not input.EOF
  Dim line, vRecordData() As String
  line = input.ReadLine
  vRecordData = Split(line, ",")

  // build one string that looks like the parameter list
  Dim vSQLRecordInsert As String
  For i As Integer = 0 To UBound(vRecordData)
    vSQLRecordInsert = vSQLRecordInsert + """" + vRecordData(i) + """"
    If i <> UBound(vRecordData) Then
      vSQLRecordInsert = vSQLRecordInsert + ", "
    End If
  Next

  ps.SQLExecute(vSQLRecordInsert)

  If db.Error Then
    MsgBox("DB Error: " + db.ErrorMessage)
  End If
Wend[/code]
This line works: ps.SQLExecute("john", "20"). Note my code generates "" around each field's data.
But this does not: ps.SQLExecute(vSQLRecordInsert) where vSQLRecordInsert = "john", "20".

It seems to be seeing the first example as having 2 pieces of data while in the second it sees the string variable as just one piece of data.

I assume the prepared statement doesn’t like this sort of variable for some reason. Does it want something other than a string? Like the BindType?

Same problem as before
You CANNOT replace two parameters with one that holds a value that LOOKS right
It HAS to be 2 separate parameters (or if you build up a statement that requires 10 then you need to pass 10 parameters)

the one that works
ps.SQLExecute("john", "20") // two string parameters

the one that doesn’t
ps.SQLExecute(vSQLRecordInsert) // 1 string parameter

it’s literally that simple
if you say you bound 1 parameter you MUST pass 1 parameter to sqlexecute
if you say you bound 2 parameters you MUST pass 2 parameters to sqlexecute
and so on

one variable with contents that look like they are parameters is NOT the same as X many parameters
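
One way around the variable column count, instead of passing values to SQLExecute, is to bind each value by position with Bind and then call SQLExecute with no arguments, so the statement runs with whatever was bound. A rough sketch (forum code, untested; it assumes vRecordData already holds the split fields of one CSV line and ps was prepared with one ? per column, everything bound as a string):

[code]// bind each field by position instead of passing them all to SQLExecute
For i As Integer = 0 To UBound(vRecordData)
  ps.BindType(i, MySQLPreparedStatement.MYSQL_TYPE_STRING)
  ps.Bind(i, vRecordData(i))
Next

// no parameters here - the values bound above are used
ps.SQLExecute()

If db.Error Then
  MsgBox("DB Error: " + db.ErrorMessage)
End If[/code]

That keeps the loop the same shape no matter how many columns the CSV has.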

Makes sense. I’ll have to come up with a different approach to building this statement. Thanks.

why not, for the purposes of the import, use a temporary table ?
then you can write a loop that is more like

    create temporary table with right columns
    for each line in the csv
           for each column in this line of the csv
                  if column = 1 then
                        insert row to temporary with first column & retain rowid (or primary key, whatever that might be)
                  else
                        update row with primary key from above with column N
                  end if
    now insert all rows from temporary to real table (using select into)

this is just the outline but it makes it so you don't need to try & craft a sql statement that needs an arbitrary number of columns
you can insert / update one at a time

and you get actual sql error checks while you’re at it in both the loading of the csv itself AND when inserting to the real table

That’s a much better idea. Thanks for the advice. That’s what I’m going to do.

Something like this might work better since you can invent the primary key column for your own purposes and make life easier

    figure out a UNIQUE primary key column name (use anything not in the list of columns from the CSV)
    create temporary table with unique primary key + columns from csv
    set counter = 1
    for each line in the csv
           insert into temporary (UNIQUE primary key column name) values ( counter )
           for each column in this line of the csv
                     update row set column N = value where UNIQUE primary key column name = counter
           counter = counter + 1
    now insert all rows from temporary to real table (using select into)

Now, this is likely to run slower since it's piles of updates, but at least you can write a simple loop to import tables with an unknown number of columns

About the only thing you might want to do is not make everything strings, but there's no real way to know what types things should be from a CSV
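
The final "select into" step of the outline above could look something like this (forum-code sketch, untested; tmp_import is a made-up name for the temporary table, and tableName / columnList are assumed to be the real table name and the comma-separated column list built from the CSV header row):

[code]// once the temporary table is fully loaded, copy everything across in one statement
db.SQLExecute("INSERT INTO " + tableName + " (" + columnList + ") SELECT " + columnList + " FROM tmp_import")

If db.Error Then
  MsgBox("DB Error: " + db.ErrorMessage)
End If

// clean up - the temporary table goes away on disconnect anyway
db.SQLExecute("DROP TEMPORARY TABLE tmp_import")[/code]

Doing the copy as one INSERT ... SELECT also means the real table only sees complete rows, so a bad CSV line can be caught while loading the temporary table instead.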