Version 2023r2 on Windows 10
I am creating a CSV file from an SQLite database query. It works fine up to about 6500 records, but if I allow more records, the save file dialog never shows up. I have to use the save file dialog because the users want to decide where to save the CSV file. I think this is a memory or scope issue. If I remove the part that adds the commas, the whole file will save (about 9000 records). Can anyone see if I did something wrong in the code below? I have tried several options but nothing seems to work. TextFile and textstr are both properties of the web page.
var sql as string
var rs as rowset
var cc as integer
var rowstr as string
var cleanStr as string
sql = "Select dispatchrequest.*, dispatchequip.* FROM dispatchrequest Left Join DispatchEquip On DispatchEquip.ticketdate = dispatchrequest.ticketdate LIMIT 6000"
try
rs = session.db.SelectSQL(sql) //get all the records in the query
catch e as DatabaseException
messagebox(e.message)
return //stop processing if an error occurred.
end try
if rs <> Nil then //make sure the recordset is not Nil
TextFile = New WebFile //create a new webfile
TextFile.MimeType = "text/csv" //set the MimeType of the webfile
TextFile.ForceDownLoad = True //force the file to download
TextFile.FileName = "logistics_export.csv" //name the webfile
cc = rs.ColumnCount //Get the column count of the recordset for later use
if not rs.AfterLastRow then //make sure the recordset has records
//write the column header name in the first row
for j as integer = 0 to cc - 1 //loop the columns; ColumnAt is zero-based, so stop at ColumnCount - 1
rowstr = rowstr + rs.columnat(j).name + "," //add each column name to the rowstr
next j
textstr = textstr + rowstr + EndOfLine //add Endofline to the end of the row to start a new row
rowstr = "" //clear the variable rowstr to get ready for a new row
//write all the rows in the recordset
for each row as Databaserow in rs //loop each row in the rowset
For i As Integer = 0 To cc - 1 //loop each column to get the text in each field of this row
cleanStr = rs.ColumnAt(i).stringvalue.replaceLineEndings(" ").ReplaceAll(",", " ")
rowstr = rowstr + cleanStr + "," //add all the fields together to make a row string
cleanStr = ""
next i
textstr = textstr + rowstr + EndOfLine //add an endofline and add the row to the other rows
rowstr = "" //clear the rowstr variable
next
textfile.data = textstr //add the entire list of text to the data for download.
GotoURL(TextFile.URL) //this will show a file save dialog for the download.
end if
rs.close
end if
String concatenation becomes slower as the string gets longer. It’s often best to put the values in an array and use FromArray to get the string. Put each rowstr in another array and join them all together at the end.
var rowarray() as string
var textarray() as string
...
rowarray.Add(cleanStr)
...
rowstr = String.FromArray(rowarray, ",")
textarray.Add(rowstr)
...
textstr = String.FromArray(textarray, EndOfLine)
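For context, here is a minimal sketch of how the two arrays could wrap your existing loops; it assumes the same rs, cc, and textstr from the code above.
var rowArray() as string
var textArray() as string
//header row: collect the column names, then join them with commas
for j as integer = 0 to cc - 1
rowArray.Add(rs.ColumnAt(j).Name)
next j
textArray.Add(String.FromArray(rowArray, ","))
//data rows: build one joined string per row and append it to textArray
for each row as DatabaseRow in rs
rowArray.RemoveAll
for i as integer = 0 to cc - 1
rowArray.Add(rs.ColumnAt(i).StringValue.ReplaceLineEndings(" ").ReplaceAll(",", " "))
next i
textArray.Add(String.FromArray(rowArray, ","))
next
//join everything once at the end
textstr = String.FromArray(textArray, EndOfLine)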
Tim, I tried the arrays like you suggested, but it still would not work with over 6000 rows. I changed my code to write to a temp file on the server and then download that file. I still concatenate the text within each row, but there aren't many fields; then I write the row and repeat. It seems to be very fast. Here is the code in case someone else has this problem. Thanks for your help.
var sql as string
var rs as rowset
var cc as integer
var rowstr as string
var cleanStr as string
sql = "Select dispatchrequest.*, dispatchequip.* FROM dispatchrequest Left Join DispatchEquip On DispatchEquip.ticketdate = dispatchrequest.ticketdate"
try
rs = session.db.SelectSQL(sql) //get all the records in the query
catch e as DatabaseException
messagebox(e.message)
return //stop processing if an error occurred.
end try
Var fileStream As TextOutputStream
Var f as folderitem = SpecialFolder.Temporary.child("logistics_export.csv")
If f <> Nil Then
fileStream = TextOutputStream.Create(f)
if rs <> Nil then //make sure the recordset is not Nil
cc = rs.ColumnCount //Get the column count of the recordset for later use
if not rs.AfterLastRow then //make sure the recordset has records
//write the column header name in the first row
for j as integer = 0 to cc - 1 //loop the columns; ColumnAt is zero-based, so stop at ColumnCount - 1
rowstr = rowstr + rs.columnat(j).name + "," //add each column name to the rowstr
next j
textstr = rowstr //copy the header row into textstr; WriteLine below adds the line ending
rowstr = "" //clear the variable rowstr to get ready for a new row
filestream.writeline(textstr)
for each row as Databaserow in rs //loop each row in the rowset
For i As Integer = 0 To cc - 1 //loop each column to get the text in each field of this row
cleanStr = rs.ColumnAt(i).stringvalue.replaceLineEndings(" ").ReplaceAll(",", " ") + ","
rowstr = rowstr + cleanStr //add all the fields together to make a row string
cleanStr = ""
next i
filestream.Writeline(rowstr) //write the row to the file
rowstr = "" //clear the rowstr variable
next
end if
//when finished writing all the rows, close the filestream
fileStream.Close
end if
End If
If f <> Nil And f.Exists Then
TextFile = WebFile.Open(f) //open the file as a webfile
TextFile.ForceDownload = True
End If
if TextFile <> Nil Then
GotoURL(TextFile.URL) //shows the savefile dialog to the user
end if
Greg, I took your note and decided to change the code and do away with concatenation. I am now using the array as Tim suggested. With over 7000 rows it takes about 3 seconds to create the file. Is there any way to make it faster than that? I always appreciate all the help on this forum. Thanks
var sql as string
var rs as rowset
var cc as integer
var rowArray() as string
var rowstr as string
var cleanStr as string
sql = "Select dispatchrequest.*, dispatchequip.* FROM dispatchrequest Left Join DispatchEquip On DispatchEquip.ticketdate = dispatchrequest.ticketdate"
try
rs = session.db.SelectSQL(sql) //get all the records in the query
catch e as DatabaseException
messagebox(e.message)
return //stop processing if an error occurred.
end try
Var fileStream As TextOutputStream
Var f as folderitem = SpecialFolder.Temporary.child("logistics_export.csv")
If f <> Nil Then
fileStream = TextOutputStream.Create(f)
if rs <> Nil then //make sure the recordset is not Nil
cc = rs.ColumnCount //Get the column count of the recordset for later use
if not rs.AfterLastRow then //make sure the recordset has records
//write the column header name in the first row
for j as integer = 0 to cc - 1 //loop the columns; ColumnAt is zero-based, so stop at ColumnCount - 1
rowArray.add(rs.columnat(j).name)
next j
rowstr = string.FromArray(rowArray, ",")
filestream.writeline(rowstr) //writes the line to the file
rowArray.RemoveAll //empty the array
for each row as Databaserow in rs //loop each row in the rowset
For i As Integer = 0 To cc - 1 //loop each column to get the text in each field of this row
cleanStr = rs.ColumnAt(i).stringvalue.replaceLineEndings(" ").ReplaceAll(",", " ")
rowArray.add(cleanStr)
next i
rowstr = string.FromArray(rowarray, ",") //add all the fields together to make a row string
filestream.Writeline(rowstr) //write the row to the file
rowArray.RemoveAll //empty the array
next
end if
//when finished writing all the rows, close the filestream
fileStream.Close
end if
End If
If f <> Nil And f.Exists Then
TextFile = WebFile.Open(f) //open the file as a webfile
TextFile.ForceDownload = True
End If
if TextFile <> Nil Then
GotoURL(TextFile.URL) //shows the savefile dialog to the user
end if
I’d start by doing away with the rowStr variable and just concatenating right on the line that writes to the file.
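For example, something along these lines (using the rowArray from your code):
fileStream.WriteLine(String.FromArray(rowArray, ",")) //join and write in one step, no rowstr needed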
Individual write operations are costly. You may find that instead of writing to the file on each row, it is faster to append each row to the end of an array and then join and write all of the lines at once.
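A rough sketch of that idea against the row loop above; lineArray is a new local here, not something from your code:
var lineArray() as string
for each row as DatabaseRow in rs
rowArray.RemoveAll
for i as integer = 0 to cc - 1
rowArray.Add(rs.ColumnAt(i).StringValue.ReplaceLineEndings(" ").ReplaceAll(",", " "))
next i
lineArray.Add(String.FromArray(rowArray, ",")) //keep each finished row in memory
next
fileStream.Write(String.FromArray(lineArray, EndOfLine) + EndOfLine) //one write for the whole file instead of one per row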