Export table to file with column headers (column names) using the bcp utility and SQL Server 2008


Solution 1

The easiest approach is to use the queryout option with a union all that prepends a column list to the actual table content:

    bcp "select 'col1', 'col2',... union all select * from myschema.dbo.myTableout" queryout myTable.csv /SmyServer01 /c /t, -T

An example:

create table Question1355876
(id int, name varchar(10), someinfo numeric(10,2))

insert into Question1355876
values (1, 'a', 123.12)
     , (2, 'b', 456.78)
     , (3, 'c', 901.12)
     , (4, 'd', 353.76)

This query will return the information with the headers as the first row (note the casts of the numeric values):

select 'col1', 'col2', 'col3'
union all
select cast(id as varchar(10)), name, cast(someinfo as varchar(28))
from Question1355876

The bcp command will be:

bcp "select 'col1', 'col2', 'col3' union all select cast(id as varchar(10)), name, cast(someinfo as varchar(28)) from Question1355876" queryout myTable.csv /SmyServer01 /c /t, -T

Solution 2

This method automatically outputs column names with your row data using BCP.

The script writes the column headers (read from the INFORMATION_SCHEMA.COLUMNS view) to one file, then writes the table data to a second file.

The two are combined into TableData.csv, which contains both the headers and the row data. Just replace the environment variables at the top to specify the server, database and table name.

set BCP_EXPORT_SERVER=put_my_server_name_here
set BCP_EXPORT_DB=put_my_db_name_here
set BCP_EXPORT_TABLE=put_my_table_name_here

BCP "DECLARE @colnames VARCHAR(max);SELECT @colnames = COALESCE(@colnames + ',', '') + column_name from %BCP_EXPORT_DB%.INFORMATION_SCHEMA.COLUMNS where TABLE_NAME='%BCP_EXPORT_TABLE%'; select @colnames;" queryout HeadersOnly.csv -c -T -S%BCP_EXPORT_SERVER%

BCP %BCP_EXPORT_DB%.dbo.%BCP_EXPORT_TABLE% out TableDataWithoutHeaders.csv -c -t, -T -S%BCP_EXPORT_SERVER%

set BCP_EXPORT_SERVER=
set BCP_EXPORT_DB=
set BCP_EXPORT_TABLE=

copy /b HeadersOnly.csv+TableDataWithoutHeaders.csv TableData.csv

del HeadersOnly.csv
del TableDataWithoutHeaders.csv

Note that if you need to supply credentials, replace the -T option with -U my_username -P my_password.

This method has the advantage of always having the column names in sync with the table by using INFORMATION_SCHEMA.COLUMNS. The downside is that it creates temporary files. Microsoft should really fix the bcp utility to support this.

This solution uses the SQL row concatenation trick from here, combined with bcp ideas from here.
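
One caveat with the header query above: it reads INFORMATION_SCHEMA.COLUMNS without an ORDER BY, and the SELECT @colnames = ... concatenation trick does not guarantee ordering either, so in principle the header order could drift out of sync with bcp's column order. A more deterministic sketch of the header query, using FOR XML PATH with an explicit ORDER BY ORDINAL_POSITION (the table name is a placeholder):

-- Build the comma-separated header row in ordinal column order
SELECT STUFF((SELECT ',' + COLUMN_NAME
              FROM INFORMATION_SCHEMA.COLUMNS
              WHERE TABLE_NAME = 'put_my_table_name_here'
              ORDER BY ORDINAL_POSITION
              FOR XML PATH('')), 1, 1, '');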

Solution 3

For:

  • Windows, 64 bit
  • SQL Server (tested with SQL Server 2017 and it should work for all versions):

Option 1: Command Prompt

sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"

Where:

  • [DATABASE].[dbo].[TABLENAME] is the table to export.
  • c:\dirname\file.csv is the file to write to (surrounded in quotes to handle a path with spaces).
  • Output .csv file includes headers.
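
As written, the command targets the default local instance with Windows authentication; to point it at a specific server and database, sqlcmd's standard -S and -d switches can be added (the server and database names below are hypothetical):

sqlcmd -S myServer01 -d myDatabase -s, -W -Q "set nocount on; select * from [dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"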

Note: I tend to avoid bcp; it is legacy, it predates sqlcmd by a decade, and it never seems to work without causing a whole raft of headaches.

Option 2: Within SQL Script

-- Export table [DATABASE].[dbo].[TABLENAME] to .csv file c:\dirname\file.csv
exec master..xp_cmdshell 'sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /v /c:"-" /b > "c:\dirname\file.csv"'

Troubleshooting: xp_cmdshell must be enabled within SQL Server.
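
If it is not already enabled, xp_cmdshell can be switched on with sp_configure (it is off by default for security reasons):

-- Enable xp_cmdshell (requires sysadmin or ALTER SETTINGS permission)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;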

Sample Output

File: file.csv:

ID,Name,Height
1,Bob,192
2,Jane,184
3,Harry,186

Speed

About as fast as this theoretically gets: in the same ballpark as bcp, and many times faster than manually exporting from SSMS.

Parameter Explanation (optional - can ignore)

In sqlcmd:

  • -s, puts a comma between each column.
  • -W eliminates padding either side of the values.
  • set nocount on eliminates the "(N rows affected)" count line at the end of the output.

For findstr:

  • All this does is remove the dashed underline beneath the header row, e.g. --- ----- ---- ---- ----- --.
  • /v prints only the lines that do not contain a match.
  • /c:"-" uses the literal string "-" as the search pattern.
  • /b matches the pattern only at the beginning of a line.
  • Together, these drop any line that starts with "-".
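
One edge case, also noted in the comments below: a data row whose first column starts with "-" (a negative number, say) would be dropped by this filter as well. A slightly safer sketch uses findstr's regular-expression mode to drop only lines beginning with two dashes, which real data rarely does:

sqlcmd -s, -W -Q "set nocount on; select * from [DATABASE].[dbo].[TABLENAME]" | findstr /r /v /c:"^--" > "c:\dirname\file.csv"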

Importing into other programs

In Excel:

  • Can directly open the file in Excel.

In Python:

import pandas as pd
df_raw = pd.read_csv(r"c:\dirname\file.csv")  # raw string so the backslashes aren't treated as escape sequences

Solution 4

A good alternative is SqlCmd, since it does include headers, but it has the downside of adding space padding around the data for human readability. You can combine SqlCmd with the GnuWin32 sed (stream editing) utility to clean up the results. Here's an example that worked for me, though I can't guarantee that it's bulletproof.

First, export the data:

sqlcmd -S Server -i C:\Temp\Query.sql -o C:\Temp\Results.txt -s"    "

The -s" " is a tab character in double quotes. I found that you have to run this command via a batch file, otherwise the Windows command prompt will treat the tab as an automatic completion command and will substitute a filename in place of the tab.

If Query.sql contains:

SELECT name, object_id, type_desc, create_date
FROM MSDB.sys.views
WHERE name LIKE 'sysmail%'

then you'll see something like this in Results.txt:

name                                          object_id   type_desc           create_date            
-------------------------------------------   ----------- ------------------- -----------------------
sysmail_allitems                               2001442204 VIEW                2012-07-20 17:38:27.820
sysmail_sentitems                              2017442261 VIEW                2012-07-20 17:38:27.837
sysmail_unsentitems                            2033442318 VIEW                2012-07-20 17:38:27.850
sysmail_faileditems                            2049442375 VIEW                2012-07-20 17:38:27.860
sysmail_mailattachments                        2097442546 VIEW                2012-07-20 17:38:27.933
sysmail_event_log                              2129442660 VIEW                2012-07-20 17:38:28.040

(6 rows affected)

Next, parse the text using sed:

sed -r "s/ +\t/\t/g" C:\Temp\Results.txt | sed -r "s/\t +/\t/g" | sed -r "s/(^ +| +$)//g" | sed 2d | sed $d | sed "/^$/d" > C:\Temp\Results_New.txt

Note that the 2d command deletes the second line (the dashed underline), the $d command deletes the last line (here the "(6 rows affected)" count), and "/^$/d" deletes any blank lines.

The cleaned up file looks like this (though I replaced the tabs with | so they could be visualized here):

name|object_id|type_desc|create_date
sysmail_allitems|2001442204|VIEW|2012-07-20 17:38:27.820
sysmail_sentitems|2017442261|VIEW|2012-07-20 17:38:27.837
sysmail_unsentitems|2033442318|VIEW|2012-07-20 17:38:27.850
sysmail_faileditems|2049442375|VIEW|2012-07-20 17:38:27.860
sysmail_mailattachments|2097442546|VIEW|2012-07-20 17:38:27.933
sysmail_event_log|2129442660|VIEW|2012-07-20 17:38:28.040

Solution 5

I was trying to figure out how to do this recently, and while I like the most popular solution at the top, it simply would not work for me, because I needed the headers to be the aliases I entered in the script. So I used some batch files (with some help from a colleague) to produce custom column headers.

The batch file that initiates the bcp export ends with a line that runs a second script, which merges a template file containing the header names with the file bcp just exported, using the code below. Hope this helps someone else in my situation.

echo Add headers from template file to exported sql files....
Echo School 0031
copy e:\genin\templates\TEMPLATE_Courses.csv + e:\genin\0031\courses0031.csv e:\genin\finished\courses0031.csv /b
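
For context, a minimal sketch of the full sequence that batch file belongs to (the query, paths and server name are all placeholders): the template file holds a single line with the desired header aliases, bcp exports the data rows without headers, and copy /b stitches the two together.

rem Export the data rows only (no headers); query and server are hypothetical
bcp "select StudentID, CourseCode from SchoolDB.dbo.Courses where School = '0031'" queryout e:\genin\0031\courses0031.csv -c -t, -T -SmyServer01

rem TEMPLATE_Courses.csv contains one line: the comma-separated header aliases
copy e:\genin\templates\TEMPLATE_Courses.csv + e:\genin\0031\courses0031.csv e:\genin\finished\courses0031.csv /b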

Comments

  • JD Long
    JD Long about 2 years

    I have seen a number of hacks to try to get the bcp utility to export column names along with the data. If all I am doing is dumping a table to a text file what is the most straightforward method to have bcp add the column headers?

    Here's the bcp command I am currently using:

    bcp myschema.dbo.myTable out myTable.csv /SmyServer01 /c /t, -T
    
    • kanagaraj palanisamy
      kanagaraj palanisamy over 4 years
      I used a with clause along with a dummy column OrderCol to guarantee the order. with temp as ( SELECT 'colName1','colName2','colName3','colName4','colName5','colName6','colName7','colName8', 1 OrderCol Union all SELECT col1,col2,col3,col4,col5,col6,col7,col8, 2 OrderCol FROM TableName ) select col1,col2,col3,col4,col5,col6,col7,col8 from Temp order by OrderCol
  • Galled
    Galled almost 12 years
    Throws an error: Conversion failed when converting the varchar value 'COL001' to data type int.
  • Galled
    Galled almost 12 years
    I use select substring(@colnames,2,1000000000); instead of select @colnames; because the @colnames variable is printed with a comma at the beginning of the headers.
  • Filip De Vos
    Filip De Vos almost 12 years
    I added an example which contains numeric values. You need to cast the numeric columns to varchar (or nvarchar).
  • CrazyTim
    CrazyTim about 11 years
    Awesome script dude. I would vote up twice if I could. Any way to make the names of the tables and databases as batch script variables?
  • Dylan Hogg
    Dylan Hogg over 10 years
    @CrazyTim, I've made the names env variables for ease of use. Glad it's of use to you.
  • thchaver
    thchaver over 9 years
    I get this error when I try it: Error = [Microsoft][SQL Native Client]Host-file columns may be skipped only when copying into the Server
  • Steve
    Steve over 8 years
    This is the best answer. Just remember you don't have to do this from a batch file, you can call the copy command using 'xp_cmdshell' as well, if that's how you are calling BCP anyway.
  • Srinivas
    Srinivas over 8 years
    Order By clause fails!
  • Filip De Vos
    Filip De Vos over 8 years
    order by is perfectly supported by this. If you encounter an error it must be in your sql statement.
  • Dylan Hogg
    Dylan Hogg almost 8 years
    @thchaver what version of SQL Server are you using? This issue may be related to SQL Server Express.
  • MichaelEvanchik
    MichaelEvanchik over 6 years
    this does not work if datatypes are int in source table
  • MichaelEvanchik
    MichaelEvanchik over 6 years
    what if the data types are different
  • Filip De Vos
    Filip De Vos over 6 years
    Cast the columns to varchar in the query.
  • r2evans
    r2evans over 5 years
    @MichaelEvanchik, just to confirm, you're talking about getting an error like Unable to resolve column level collations, right?
  • Filip De Vos
    Filip De Vos over 5 years
    you can't concat numbers & strings and some other types. To do that you need to cast. Columns with different collations probably also throw errors.
  • Geoff Griswald
    Geoff Griswald over 4 years
    if the data types are different, CAST anything in your SELECT to VARCHAR which isn't already a VARCHAR or NVARCHAR. Since you're writing to .CSV anyway, it won't make any difference to the output.
  • Geoff Griswald
    Geoff Griswald over 4 years
    Not a terrible solution, but this would mean you need to create a view for every type of .CSV file you want to export, and modify both the view and the BCP command when you wanted to change fields, column names etc. Using Dynamic SQL and performing the UNION as part of a dynamic statement rather than in a view is much more flexible.
  • Peter Mortensen
    Peter Mortensen about 4 years
    An explanation would be in order.
  • Geoff Griswald
    Geoff Griswald about 4 years
    This is great. I particularly like how it's a single line of SQL to run the export. I will probably start using this instead of bcp. Only thing to watch out for is if any of your values in column 1 start with "-" they will be removed from the export. Could get messy for negative numbers converted to varchar.
  • abhijitcaps
    abhijitcaps over 3 years
    This is great. Helped me a lot. Thanks
  • vukis
    vukis about 3 years
    Nice. Thanks! With "@CmdExc varchar(max)" I was getting this error: Procedure expects parameter 'command_string' of type 'varchar'. So I changed from max to 8000 and it works.