I’m getting an error whenever I try to execute the MySQL “load data local infile” command from Xojo to import a file of CSV data. The same command works when run from Sequel Pro or MySQL Workbench. The error I get is: “The used command is not allowed with this MySQL version.”
Well, it is allowed, because I’m using it.
I understand it may be a my.cnf issue as well as a security issue.
If someone can share some experience with this, I would appreciate it. I know there are other ways to read the data in.
This issue has come up again. Why can’t I run “load data local infile” from Xojo when I can run it from MySQL Workbench and Sequel Pro on the same local OS X machine?
I’m a newbie to AWS RDS as well, and their docs are hard to follow, so I don’t know. Looking at that now. I just think that if it weren’t allowed, I would not be able to run the command in Workbench or Sequel Pro. Puzzling.
You have a point. Because it is a security concern, local_infile needs to be enabled on BOTH sides of the connection in order for it to work. My guess is that it is not enabled by default in the Xojo MySQL plugin.
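For what it’s worth, the server side is easy to check from any client. This is just a sketch; note that on RDS you normally can’t use SET GLOBAL, and the setting lives in the instance’s parameter group instead (worth verifying in the RDS console):

```sql
-- Check whether the server accepts LOCAL INFILE at all
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- Enable it (needs privileges; on RDS, set local_infile in the
-- DB parameter group rather than running this statement)
SET GLOBAL local_infile = 1;
```

Since Workbench and Sequel Pro can run the command against the same server, the server side is presumably already on, which points back at the client-side (plugin) setting.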
Maybe someone from Xojo will chime in and clarify the use of client configuration options besides the basic host, port, etc.?
With the mysql command line client you can put a file in your home directory “~/.my.cnf” and put options in a “[client]” section. You could give that a try and see if the plugin reads it.
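In case it helps, the file I mean looks like this (whether the plugin reads ~/.my.cnf at all is the open question):

```ini
# ~/.my.cnf
[client]
local-infile=1
```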
Tried creating the .my.cnf file. No joy. I’m just fishing around on this, so I’ll wait and see what someone else may say. I assume I’m not the first person to try this. Thanks.
Basically I’m building the SQL string which I also test in Sequel Pro and/or Workbench. That string is like this:
vSQL="LOAD DATA LOCAL INFILE '/Users/dm/Desktop/TestTargetList3000Tabs.txt' INTO TABLE temp FIELDS TERMINATED BY '\\t' LINES TERMINATED BY '\\r\\n' IGNORE 1 LINES;"
And I’m trying to execute it as both a prepared statement and regular execute. I’ll try both.
Dim ps As MySQLPreparedStatement
db.SQLExecute(vSQL)     ' plain execute
'ps = db.Prepare(vSQL)  ' prepared-statement variant
'ps.SQLExecute
Not sure if this is worth pursuing, since I can’t download the results as a file. It takes about 3.5 minutes to create a 3000-record table using Xojo to read the tab file one row at a time; using LOAD DATA LOCAL INFILE it takes 0.75 seconds. For now we can live with that, and I’ll look elsewhere for performance improvements. Thanks for the input.
Duane, you could get significantly better performance by writing fewer, larger SQL INSERTs instead of going line by line, inserting each one (if that is what you mean). I do some much larger tables like this, and the only way to get reasonable speed is to reduce the number of inserts by combining lines. Locking the tables also helps a lot. 3000 records is not very many, so this could be greatly improved, I think.
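Something like this is what I mean. It’s a rough, untested sketch: the table and column names (temp, col1, col2), the rows() array, and the batch size are all made up for illustration, and it assumes each row’s values are already escaped and quoted:

```
' Build one multi-row INSERT per batch instead of one INSERT per line.
Const kBatchSize = 500
Dim values() As String

db.SQLExecute("START TRANSACTION")
For i As Integer = 0 To rows.Ubound
  values.Append("(" + rows(i) + ")")
  ' Flush the batch when it is full, or on the last row
  If values.Ubound + 1 = kBatchSize Or i = rows.Ubound Then
    db.SQLExecute("INSERT INTO temp (col1, col2) VALUES " + Join(values, ","))
    ReDim values(-1)
  End If
Next
db.SQLExecute("COMMIT")
```

Wrapping the whole thing in one transaction matters almost as much as the batching itself, since otherwise each INSERT is its own commit.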
This is what I will do, but I’m not sure exactly how. I’ll look around for a solution, and if I can’t find one I’ll come back here and post.
Yes, unfortunately, that is what I mean. I’ve not done the larger transactions before, so I will have to figure it out. To keep this thread on topic, I’ll post separately on the subject of making these sorts of larger SQL inserts.
I agree. I think you’ll see me back here shortly asking about this subject.