Importing And Parsing A Large XML File In SQL Server (when "normal" Methods Are Rather Slow)
Solution 1:
OK. I created an XML Index on the XML data column. (Just a primary one for now).
A query that took ~4:30 minutes before takes ~9 seconds now! It seems that storing the XML in a table with a proper XML index and parsing the data with the xml.nodes() function is a feasible solution.
Thank you all.
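For anyone trying the same approach, a minimal sketch of what this looks like in T-SQL. The table, column, and XPath names (XmlImport, XmlData, /Root/Item) are placeholders, not from the original question:

```sql
-- Sketch only: table/column/XPath names are hypothetical.
-- A primary XML index requires a clustered primary key on the table.
CREATE TABLE dbo.XmlImport
(
    Id      INT IDENTITY(1,1) CONSTRAINT PK_XmlImport PRIMARY KEY CLUSTERED,
    XmlData XML NOT NULL
);

CREATE PRIMARY XML INDEX PXML_XmlImport_XmlData
    ON dbo.XmlImport (XmlData);

-- Shred the XML into rows with .nodes() and .value()
SELECT
    n.value('(@id)[1]',  'INT')           AS ItemId,
    n.value('(Name)[1]', 'NVARCHAR(100)') AS ItemName
FROM dbo.XmlImport AS x
CROSS APPLY x.XmlData.nodes('/Root/Item') AS t(n);
```

The primary XML index materializes the shredded node table on disk, which is what lets the .nodes() query avoid re-parsing the whole document at query time.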
Solution 2:
Since you want a tabular structure, you could convert the XML to a CSV file (using this Java or this .NET tool, or even an XSLT transformation) and then perform a bulk insert.
Of course, all that depends on your XML being properly formed.
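Once the CSV exists, the load itself is a plain BULK INSERT. A minimal sketch; the file path, target table, and terminators are assumptions you would adjust to your data:

```sql
-- Sketch: path, table definition, and terminators are assumptions.
CREATE TABLE dbo.ImportTarget
(
    ItemId   INT,
    ItemName NVARCHAR(100)
);

BULK INSERT dbo.ImportTarget
FROM 'C:\data\converted.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip the header row
);
```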
Solution 3:
Well, first of all, I don't really understand why you would use OpenXml to load the file. I am pretty sure that doing so will internally trigger a whole bunch of validity checks against the OpenXml ISO standard.
But Xml.Nodes() (I assume that means the DOM way of loading data) is by far the slowest way to load and parse XML data. Consider instead a SAX-style approach using XmlReader or similar. I realize that the article is from 2004, but it still explains the topic pretty well.
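For contrast, SQL Server's legacy OPENXML path is the DOM-style approach inside the database: sp_xml_preparedocument parses the entire document into an in-memory tree before any rows are returned, which is why it tends to be slow and memory-hungry on large files. A minimal sketch (the element and attribute names are placeholders):

```sql
-- Sketch: shows the legacy OPENXML path for contrast with .nodes().
-- sp_xml_preparedocument builds a full in-memory DOM of the document.
DECLARE @doc XML = N'<Root><Item id="1"><Name>Widget</Name></Item></Root>';
DECLARE @hdoc INT;

EXEC sys.sp_xml_preparedocument @hdoc OUTPUT, @doc;

SELECT ItemId, ItemName
FROM OPENXML(@hdoc, '/Root/Item', 2)
WITH
(
    ItemId   INT           '@id',
    ItemName NVARCHAR(100) 'Name'
);

-- Always release the handle, or the parsed DOM stays in memory
EXEC sys.sp_xml_removedocument @hdoc;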