Teradata BTEQ is the oldest available tool from Teradata for executing SQL queries. It can be used to run queries and export and import data.
Teradata BTEQ offers more possibilities than Teradata SQL Assistant and should be one of your essential tools as a Teradata developer. We can use BTEQ interactively or execute batch scripts.
ETL-Tools versus BTEQ
BTEQ is an excellent alternative to ETL tools, primarily because ETL tools are often not used properly, so theoretical advantages such as data lineage never materialize. I often notice that ETL tools are used merely as scheduling tools and not in the way they were designed to be used.
In practice, data lineage with ETL tools is rarely achieved; it is often sacrificed for performance reasons.
Especially the combination of BTEQ and an ETL tool within the same ETL process causes all advantages of the ETL tool to be lost. I know customers who use an ETL tool, for example, only because they see no other way to load EBCDIC files on Linux.
Also, remember that SQL and Linux developers are a dime a dozen, while the supply of experts for ETL tools can be tight. Think of Ab Initio, for which documentation is hard to find: the market of available experts is tiny.
Simple tasks can be coded in BTEQ in a few minutes but frequently take much longer when implemented in an ETL tool. This undermines one of the main arguments for ETL tools, namely that they make development easier and more efficient.
To sum it up: in my more than two decades of experience with data warehousing, I have never met a customer who used an ETL tool (whichever one) correctly and was able to achieve benefits.
ETL tools are not generally bad. But developers without the right knowledge and training can quickly undo the potential benefits of these tools.
Therefore, I always recommend first checking whether the Teradata tools like BTEQ, TPT Load, etc. are sufficient for the tasks to be solved before reaching for an ETL tool.
Frequently Asked Questions about Teradata BTEQ
How is data imported and exported with BTEQ?
BTEQ is not a bulk load utility like FastLoad, MultiLoad, or FastExport. Data is exported or imported transactionally, row by row. BTEQ import and export are therefore not suitable for large amounts of data.
Can Teradata BTEQ perform loops?
Loops are not possible in BTEQ; the .GOTO command can only jump forward in a script.
Can I define variables in BTEQ?
Variables cannot be defined within BTEQ itself. A common workaround is to place parameters as placeholders in the BTEQ script and have a Linux wrapper script replace them before execution.
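As a minimal sketch (the database DWH_TARGET, the table Customer_Stage, the column LoadDate, and the ${LOAD_DATE} placeholder are made up for illustration), the BTEQ script contains only literal placeholders, and the wrapper replaces them, for example with sed or envsubst, before calling bteq:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
/* ${LOAD_DATE} is replaced by the wrapper before this file reaches bteq */
DELETE FROM DWH_TARGET.Customer WHERE LoadDate = DATE '${LOAD_DATE}';
INSERT INTO DWH_TARGET.Customer
SELECT * FROM DWH_TARGET.Customer_Stage
WHERE LoadDate = DATE '${LOAD_DATE}';
.LOGOFF
.QUIT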
What types of conditional logic does BTEQ handle?
We can use the “.GOTO” command to jump to a specific label (defined with “.LABEL”).
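As a minimal sketch (the staging table Customer_Stage is only assumed for illustration), the load step is skipped when the staging table is empty; note that “.GOTO” can only jump forward in the script:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
/* ACTIVITYCOUNT is 0 if the staging table returned no rows */
SELECT TOP 1 1 FROM Customer_Stage;
.IF ACTIVITYCOUNT = 0 THEN .GOTO NothingToLoad
INSERT INTO Customer SELECT * FROM Customer_Stage;
.LABEL NothingToLoad
.LOGOFF
.QUIT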
Which methods for conditional statements and branches exist in BTEQ?
The conditional statements available in most procedural languages are supported: IF, THEN, ELSE, ENDIF.
Can I determine how many rows are affected by an INSERT, UPDATE or DELETE?
BTEQ returns the result in the following variables: INSERTCOUNT, UPDATECOUNT, and ACTIVITYCOUNT. These can be used in a “.IF” command, for example:
.IF UPDATECOUNT = 0 THEN .GOTO <NoRowsUpdated>
How is error handling implemented in a Teradata BTEQ script?
We can use the variable ERRORCODE, which holds the return code of each executed statement. This ERRORCODE can be used for different actions:
— Terminate the script and return error code 255 to the client
.IF ERRORCODE <> 0 THEN .QUIT 255
— If the step was successful, jump to a certain label
.IF ERRORCODE = 0 THEN .GOTO <NextStep>
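Put together, a typical pattern looks like the following sketch (the staging table Customer_Stage is only illustrative): the ERRORCODE check directly follows each statement, and the script aborts with return code 255 on the first error:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
INSERT INTO Customer SELECT * FROM Customer_Stage;
.IF ERRORCODE <> 0 THEN .QUIT 255
DELETE FROM Customer_Stage;
.IF ERRORCODE <> 0 THEN .QUIT 255
.LOGOFF
.QUIT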
Is there a possibility to execute Linux commands from BTEQ?
You can execute the command “.OS <linux_command>”.
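For example, an old export file could be removed before a new export is written (the path is only illustrative):
.OS rm /tmp/dwhpro.txt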
Can I run another BTEQ from within BTEQ?
Yes, with the command “.RUN <other_bteq>”.
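For example, a shared logon file could be executed at the beginning of every script (the path is only illustrative):
.RUN FILE = /home/dwhpro/logon.bteq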
Teradata BTEQ Export
BTEQ can export data in various formats. Here are the four most important ones:
- EXPORT DATA: Exports data in record mode. You can use this format to exchange data between FastLoad, FastExport, MultiLoad, and BTEQ.
- EXPORT REPORT: The data is exported as a report in a readable format and contains the column names as headers. This is the default mode.
- EXPORT INDICDATA: Like record mode, but NULL values are marked with indicator bytes.
- EXPORT DIF: Creates a file in DIF (Data Interchange Format), which can be read on any Windows system.
A Teradata BTEQ Export Example
In the example below, BTEQ exports the table Customer in record mode into a text file:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.EXPORT DATA FILE = c:\temp\dwhpro.txt
SELECT * FROM Customer;
.EXPORT RESET
.LOGOFF
.QUIT
In the following example, BTEQ exports the table Customer in report mode into a text file readable by users, including column headers:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.EXPORT REPORT FILE = c:\temp\dwhpro.txt
SELECT * FROM Customer;
.EXPORT RESET
.LOGOFF
.QUIT
In the next example, BTEQ exports the table Customer in report mode into a text file but suppresses the column headers using the TITLE attribute (and .SET TITLEDASHES OFF removes the dashed underlines):
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.SET TITLEDASHES OFF
.EXPORT REPORT FILE = c:\temp\dwhpro.txt
SELECT FirstName (TITLE ''), LastName (TITLE '') FROM Customer;
.EXPORT RESET
.LOGOFF
.QUIT
To create a text file that uses a comma as a separator, we can do the following:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.SET TITLEDASHES OFF
.EXPORT REPORT FILE = c:\temp\dwhpro.txt
SELECT FirstName || ',' || LastName (TITLE '') FROM Customer;
.EXPORT RESET
.LOGOFF
.QUIT
A Teradata BTEQ Import Example
A text file in record mode can easily be imported with BTEQ. The command .REPEAT * means that all rows of the file should be read in:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.IMPORT DATA FILE = c:\temp\dwhpro.txt
.REPEAT *
USING (FirstName VARCHAR(200), LastName VARCHAR(200))
INSERT INTO Customer VALUES (:FirstName,:LastName);
.LOGOFF
.QUIT
A comma-separated text file can be imported, as shown below. It is necessary to define all columns as VARCHAR when importing CSV files:
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.IMPORT VARTEXT ',' FILE = c:\temp\dwhpro.txt
.REPEAT *
USING (FirstName VARCHAR(200), LastName VARCHAR(200))
INSERT INTO Customer VALUES (:FirstName,:LastName);
.LOGOFF
.QUIT
Teradata BTEQ Sessions
Multiple sessions behave like several users working in parallel with the same credentials. Each of these sessions belongs to the same user and can run one or more sequential transactions:
.SET SESSIONS 100
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
<...>
.LOGOFF
The use of multiple sessions can improve the performance of a BTEQ script.
Multiple sessions only make sense for data access via the Primary Index (PI) or a Unique Secondary Index (USI).
Using multiple sessions can make sense for BTEQ imports, but not for BTEQ exports.
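As a sketch, the VARTEXT import shown above could be combined with multiple sessions so that the single-row INSERTs are spread across parallel sessions:
.SET SESSIONS 20
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
.IMPORT VARTEXT ',' FILE = c:\temp\dwhpro.txt
.REPEAT *
USING (FirstName VARCHAR(200), LastName VARCHAR(200))
INSERT INTO Customer VALUES (:FirstName,:LastName);
.LOGOFF
.QUIT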
Teradata BTEQ Fast Path Insert
Suppose several INSERT statements load the same target table, and both the source and target tables have the same primary index. The following technique should be used, since it packages the statements into one multi-statement request and bypasses the transaction log when writing to an empty table.

Place a semicolon at the beginning of the following lines (“;INSERT INTO”), as shown in the example below:
.SET SESSIONS 100
.LOGON SYSTEM1/DWHPRO,DWHPRO_PWD
INSERT INTO CUSTOMER SELECT * FROM Customer1
;INSERT INTO CUSTOMER SELECT * FROM Customer2;
.LOGOFF
The figure below shows, for comparison, how the Slow Path INSERT works when the semicolon is not at the beginning of the line or the tables do not have the same primary index.
