If you have a big flat file, perhaps a CSV with lots of records weighing a couple of hundred megabytes, and you want to transfer that data into SQL Server, then you are at the right article: BULK INSERT is one way to get this kind of heavy lifting done.
It is very efficient and fast because the operation can be minimally logged, so the data goes almost directly into the data file. Let us try it by creating one .CSV file (a.k.a. comma-delimited file or comma-separated values) and loading it into a SQL Server database table.
— Open Notepad.
— Copy the following data into it. The dummy data below represents name, department, and company name.
— Save the Notepad file as “emp.txt” on your “C” drive.
— Open SSMS, create the following table, and run the BULK INSERT command as follows:
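As a sketch of what the file could contain (every name, department, and company below is made up purely for illustration; the first line is a header, which the BULK INSERT options can skip):

```csv
name,department,company
John Smith,Accounts,Acme Corp
Jane Doe,HR,Acme Corp
Bob Brown,IT,Contoso Ltd
```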
--create one table for the demo (column sizes are illustrative)
CREATE TABLE emps (name VARCHAR(50), department VARCHAR(50), company VARCHAR(50));
--BULK INSERT the file's data into the emps table
BULK INSERT emps
FROM 'c:\emp.txt'
WITH (FIELDTERMINATOR=',', FIRSTROW=2, ROWTERMINATOR='\n');
--FIELDTERMINATOR defines the column separator; it could be any character, e.g. ',' or '|'.
--FIRSTROW defines the row from which SQL Server starts reading (rows are counted
--from 1); the first line of the file may be a header, so set FIRSTROW=2 so SQL
--Server doesn't insert it.
--ROWTERMINATOR is the new-line character '\n'; it may differ if you received the
--file from a mainframe or another system, and you can check it in a hex editor.
--test whether the data has really been copied
SELECT * FROM emps
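The comments above suggest checking the row terminator in a hex editor when the file comes from another system. As a minimal sketch, the same check can be done programmatically; here in Python, with the file name emp.txt and its sample contents as hypothetical stand-ins:

```python
# Sketch: detect the row terminator of a data file before running BULK INSERT.
# The file name and contents are made-up stand-ins for illustration.
sample = b"name,department,company\r\nJohn Smith,Accounts,Acme Corp\r\n"
with open("emp.txt", "wb") as f:
    f.write(sample)

# Read the first line in binary mode so the terminator bytes are preserved.
with open("emp.txt", "rb") as f:
    line = f.readline()

# Windows files typically end lines with \r\n, Unix files with \n.
terminator = "\\r\\n" if line.endswith(b"\r\n") else "\\n"
print(terminator)  # pass this value to ROWTERMINATOR
```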
So this is all about BULK INSERT. It is really fast: once I did a BULK INSERT of about 100,000 records, and it hardly took 2 minutes. Enjoy bulk operations!
Reference: Ritesh Shah