Hi

I am having a problem importing data from a CSV file using a DTS package. Typically when I import data from a CSV it works successfully, but this particular file contains columns with commas as part of the data. When SQL Server imports directly from the file, additional columns are created, knocking the rest of the row off track.

ROW 1
"THOMAS J","Person","4 street","town","city","country",000,000,000,9999
ROW 2
"person2","person2","59 street hall,","Mount Saint Annes,",000,000,000,0000

ROW 1 creates ten columns, which is what is expected; however, ROW 2 creates twelve columns (a small repro of the effect is at the end of this post).

Not all of the cells in the CSV have double quotes as text qualifiers, so when the double quote is set as the text qualifier an error is thrown (invalid delimiter). The file is far too large to adjust manually (replace all), and the data cannot be adjusted prior to import. Nor can the file be converted to another format such as an Excel document.

All the quotes are double quotes. Turning off all delimiters creates a single column. The result would then have to be parsed after it has been loaded to break it back into the separate columns, which could get messy (the second sketch at the end of this post shows the kind of logic that would take).

The delimiter type (comma) cannot be changed either. Removing all the quotes does not work; it returns rows with varying numbers of columns.

About 5,000 to 15,000 rows need to be inserted each time the DTS package runs. Currently this is done by a series of individual inserts, which takes forever to run! Any ideas?
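
To make the mis-split concrete, here is a minimal repro in Python using the two sample rows above (Python purely for illustration; the import itself runs through DTS). The naive split on commas is effectively what a plain comma delimiter does; csv.reader honours the double quotes. The counts differ from the twelve columns I see on the full file, presumably because the sample row is abbreviated, but the effect is the same: one extra column per embedded comma.

import csv

rows = [
    '"THOMAS J","Person","4 street","town","city","country",000,000,000,9999',
    '"person2","person2","59 street hall,","Mount Saint Annes,",000,000,000,0000',
]

for line in rows:
    naive = line.split(",")            # what a plain comma delimiter does
    proper = next(csv.reader([line]))  # keeps commas inside double quotes as data
    print(len(naive), len(proper))

# ROW 1 prints "10 10": no embedded commas, so both methods agree.
# ROW 2 prints "10 8": the naive split gains one extra field per
# embedded comma, which is exactly the column drift the import shows.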
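
And this is a rough sketch of the parse-after-import idea combined with batched inserts, again in Python just to show the logic; the real thing would have to live in the DTS package or a stored procedure. The server, database, table and file names are placeholders, and it assumes ten columns per row as in ROW 1.

import csv
import pyodbc

EXPECTED_COLUMNS = 10  # as in ROW 1; placeholder assumption

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
cursor = conn.cursor()

good, bad = [], []
with open("import.csv", newline="") as f:
    for fields in csv.reader(f):  # handles the mix of quoted and unquoted fields
        (good if len(fields) == EXPECTED_COLUMNS else bad).append(fields)

# One parameterised statement executed over the whole batch, instead of
# 5,000-15,000 individual INSERT statements.
cursor.executemany(
    "INSERT INTO ImportTable VALUES (?,?,?,?,?,?,?,?,?,?)",
    good,
)
conn.commit()
print(len(good), "rows loaded,", len(bad), "rows need review")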