XML Import to MS SQLServer

I have a need to import a very large XML file into an MS SQL Server table, possibly multiple times per day. It is a list of university applications from students. The SQL Server table has about 400 columns and the XML is well structured. I believe the elements (if I am using that XML term correctly) are fixed until there is a new release of the XML file format, which changes only infrequently (maybe yearly).

I know that MS SQL will store XML in a single column, but then XOJO would have to parse that column with every SQL SELECT. Not an option.

Without having to write XOJO code to parse every XML element and create a SQL INSERT, is there an easy way to import XML into a SQL table so that each element ends up in a table column?
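The basic shape of what a generic loader has to do is not complicated. Here is a minimal Python sketch (element, table, and column names are made up for illustration), assuming each row element is flat and its child element names match the table's column names one-for-one:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample; the real file would have ~400 child elements per row.
SAMPLE = """
<Applications>
  <Application>
    <StudentID>1001</StudentID>
    <LastName>Smith</LastName>
    <Major>Physics</Major>
  </Application>
</Applications>
"""

def rows_to_inserts(xml_text, table="Applications"):
    """Yield (sql, values) pairs, one parameterized INSERT per row element."""
    root = ET.fromstring(xml_text)
    for row in root:
        cols = [child.tag for child in row]           # column names from tags
        vals = [child.text for child in row]          # cell values from text
        placeholders = ", ".join("?" * len(cols))
        sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
        yield sql, vals

for sql, vals in rows_to_inserts(SAMPLE):
    print(sql, vals)
```

The same loop would work in Xojo with its built-in XML classes and a prepared statement; the point is that the column list falls out of the tag names, so nothing needs to be hand-written per element.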

I see two distinct issues.
One is creating the empty table with all of the correct column names. Tedious but a one-time activity, with many opportunities for errors.
The second is processing incremental additions (rows) to the table, possibly multiple times per day.
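The first issue (the one-time table setup) can itself be scripted from a sample file, which removes most of the opportunities for typos. A Python sketch, assuming flat rows as above; every column defaults to NVARCHAR(255), so real types (dates, numbers) would still need tightening by hand:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample file with two of the ~400 elements.
SAMPLE = """<Applications>
  <Application>
    <StudentID>1001</StudentID>
    <LastName>Smith</LastName>
  </Application>
</Applications>"""

def create_table_sql(xml_text, table="Applications"):
    """Derive a CREATE TABLE statement from the first row element of a
    sample file.  All columns default to NVARCHAR(255) NULL."""
    first_row = ET.fromstring(xml_text)[0]
    cols = ",\n  ".join(f"[{child.tag}] NVARCHAR(255) NULL" for child in first_row)
    return f"CREATE TABLE [{table}] (\n  {cols}\n)"

print(create_table_sql(SAMPLE))
```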

Previously this service offered a CSV download; the XML feed is eventually going to replace it.

I have not used it, but SQL Server has a “Bulk Load Component” that seems designed for this sort of thing.

Yep … but it still involves lots of manual steps, like creating the table and all of the columns and creating an XML-to-column map.

Being a “lazy” programmer, I was hoping to find something that could read the XML and create a SQL schema that I could use to create an empty table, then a tool to read the XML and do the INSERTs.

Bulk Load has some merit because my previous experience says it is “crazy fast”.

I found an open source Java tool that claims to generate a SQL schema from XML, but I don’t know how it works yet.

I will “throw this in the pot” along with what else I find and see what “cooks up”.

I believe that you can import an XML file into Access 2013 and then use the SQL Server Migration Assistant for Access (a free download directly from Microsoft) to upsize your Access database to SQL Server.

I tried this tool on a small database just to see how it works. It was flawless in my test.

Perhaps it is one approach that you could try.

That would at least get the tables set up, but I need a way to automate the data load.

I will give it a try on my XML and see what happens.


Normally I would use SAX or some other XML parser, but it looks like Xojo has an XML parser built right in.

It should not be too difficult to parse the XML and define an RDBMS table structure. One of the fields could, of course, be a CLOB or the equivalent and contain the entire input XML. That way, if you wind up with malformed XML or with an element you are unprepared to cope with, you can log it and store the entire XML.
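The store-the-raw-XML fallback could be sketched like this in Python (the `quarantine` list stands in for a text column where the whole input would be kept for later review):

```python
import xml.etree.ElementTree as ET

def load_or_quarantine(xml_text, quarantine):
    """Try to parse one row's XML; on failure, keep the raw text so it can
    be logged and stored whole instead of being silently lost."""
    try:
        return ET.fromstring(xml_text)
    except ET.ParseError:
        quarantine.append(xml_text)   # would be the raw-XML column in the table
        return None
```

Usage: call it per row, insert the parsed rows normally, and write anything in `quarantine` to the fallback column.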


Mark, in my tests the data is moved along with the structure of the tables when upsizing an Access DB to MSSQL with the migration assistant.

Thanks for the tips.

Louis - I need to move “fresh” additional data probably a few times each day so the process needs to be automated. Do you think that is possible with the technique you suggested?

Paul - The CLOB idea would be good just in case something new turns up, but this is a pretty rigid structure that “in theory” would not change unless we were notified in advance, so I probably would not build in lots of error checking for unexpected XML. The only difficulty with using the XOJO XML parser is the quantity of XML elements: over 400. I would need to generalize my XOJO parser and have it walk the XML element tree automatically. I tested with one parser I found and it created about 10 tables. That was OK. Unfortunately, the parser did not know exactly how to link the tables, so it generated its own key as a counter (1, 2, 3, etc. for each set). The next load would start over with the same counter (1, 2, 3, etc.). There was an identity element in the XML data but the parser did not use it.
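Walking the element tree generically, so that 400+ elements don't each need a hand-written parse step, is a short recursive function. A Python sketch of the idea (the nested sample is invented; a Xojo version over XmlNode children would look much the same):

```python
import xml.etree.ElementTree as ET

# Hypothetical nested row element.
SAMPLE = """<Application>
  <Name><First>Ann</First><Last>Lee</Last></Name>
  <Major>Physics</Major>
</Application>"""

def walk_leaves(elem, path=""):
    """Recursively yield (path, text) for every leaf element, flattening
    nested elements into path-style column names."""
    here = f"{path}/{elem.tag}" if path else elem.tag
    children = list(elem)
    if not children:
        yield here, (elem.text or "").strip()
    for child in children:
        yield from walk_leaves(child, here)

print(list(walk_leaves(ET.fromstring(SAMPLE))))
```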

I am far from an XML expert but I may become an expert or have to hire one before this project is done. ;-))

No, I would expect that the migration assistant is more of a one-time thing. You need a live XML parser-data loader.

Just to not leave this dangling in case somebody has the same problem in the future.

I found an XML parser from CoolUtils. I ran my sample XML file through it and got 138 tables. It generated the table CREATEs and the INSERTs for all of the data. There is a key issue, since it seems to just generate its own primary key, renumbering from 1 each time it is run, and sometimes I will have multiple loads in one day. I could work around that issue by using a temp table and copying to the live table while renumbering, but it appears that XOJO can do what I need for the load.
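The temp-table workaround amounts to offsetting the generated keys by the live table's current maximum before copying the rows over. A minimal sketch (field names are made up):

```python
def renumber(rows, key_field, live_max):
    """Shift keys that the parser restarted from 1 so they land past the
    live table's current maximum key (rows as dicts from a temp table)."""
    return [{**row, key_field: row[key_field] + live_max} for row in rows]

staged = [{"AppKey": 1, "Major": "Math"}, {"AppKey": 2, "Major": "Art"}]
print(renumber(staged, "AppKey", 500))
```

In SQL this would be a single INSERT ... SELECT with `AppKey + live_max`, run after looking up `MAX(AppKey)` on the live table.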

At this point I think I will take the CREATE statements from the tool and generate the DB, but after exploring one of the XOJO example programs that processes XML data, I think I will write my own loader in XOJO. The amount of XOJO code in the XMLexample sample program is very small and it was able to process my 138-table XML file just fine. The XMLexample program is just an XML browser, but it will be a good start on creating a batch-mode loader.

There is a tool in MS SQL Server called SSIS (SQL Server Integration Services) that allows you to do these things.
You can create mappings to fields, specify types, do transformations, and also schedule recurring jobs.

however, it has a steep learning curve if the wizards don’t quite work.

Just thought I’d mention it.