
Saturday, 12 March 2016

TPT export script - How to use the TPT export operator and understanding Data Streams

TPT (Teradata Parallel Transporter) is a standard Teradata utility that can be used to export data from a Teradata system to a file.

The diagram below shows how TPT operates:

[Figure: TPT Export in Teradata]

Data is generated by the Export operator. Once posted onto the Data Stream, it is consumed by the consumer operator.

This is a widely used pattern. Spring XD calls it Source-Sink, and Kafka's publisher-subscriber model follows a similar architecture (though Kafka is a message broker service). The underlying idea is the same.
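The producer/consumer idea can be sketched with an ordinary UNIX named pipe standing in for the data stream. This has nothing to do with TPT internals; the file names and record counts below are purely illustrative:

```shell
# A named pipe plays the role of the TPT data stream: the producer writes
# records onto it while the consumer drains them concurrently.
PIPE=/tmp/demo_stream.$$
mkfifo "$PIPE"

# Consumer (like file_writer): counts the records it receives
wc -l < "$PIPE" > /tmp/demo_consumed.$$ &

# Producer (like export_item): generates 100 records onto the stream
seq 1 100 > "$PIPE"
wait

COUNT=$(tr -d ' ' < /tmp/demo_consumed.$$)
echo "consumer received $COUNT records"
rm -f "$PIPE" /tmp/demo_consumed.$$
```

Both processes run at the same time, which is exactly why TPT scales: the consumer does not wait for the producer to finish before it starts writing.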

Below is a sample TPT export script for your reference. As you can see, once the operators are defined, the producer's output is applied to the consumer.

/**************************************
* Script Name - export_data.tpt
* Usage - tbuild -f export_data.tpt
* Author - Tduser
* Description - exports data from
*               retail.item to
*               export_tpt_item.txt file
* Version - Release 1 - 12-Mar-2016
**************************************/

Define Job export_to_file
description 'export records from item table'
(
define schema item_schema
(
item_id char(30),
country_code char(30)
);

/* Consumer operator: in an export job, the final target is the file */

define operator file_writer
type dataconnector consumer
schema item_schema
attributes
(
varchar filename = 'export_tpt_item.txt',
varchar format = 'text',
varchar openmode = 'write',
varchar indicatormode='N'
);


/* Producer operator: reads data from a Teradata table */

define operator export_item
type export
schema item_schema
attributes
(
varchar username = 'dbc',
varchar userpassword = 'dbc',
varchar tdpid = '127.0.0.1',
integer maxsessions = 8,
varchar selectstmt = '
select cast(l_orderkey as char(30)),
cast(l_partkey as char(30))
from retail.item;');


/* apply the producer output to the consumer */

apply to operator(file_writer[1])
select item_id,country_code
from operator(export_item[1]);

);
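The logon details above are hard-coded into the script. tbuild also accepts a job variables file via the -v option, which keeps credentials out of the script; the file name and values below are illustrative:

```
/* jobvars.txt - illustrative job variables file */
username     = 'dbc',
userpassword = 'dbc',
tdpid        = '127.0.0.1'
```

In the operator definition you would then write varchar username = @username (and similarly for the other attributes), and run the job as tbuild -f export_data.tpt -v jobvars.txt.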

Execution command: use a wrapper to call the script, for example:


$> tbuild -f export_data.tpt | tee export_data.tpt.log
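A minimal wrapper sketch: run_job is a hypothetical helper (not a Teradata utility) that runs any command and maps its return code to a status message. Teradata utilities conventionally return 0/4/8/12 for success/warning/error/fatal:

```shell
# Hypothetical wrapper helper: run a job command and report its outcome.
run_job() {
    "$@"
    RC=$?
    case "$RC" in
        0) echo "job completed successfully" ;;
        4) echo "job completed with warnings" ;;
        *) echo "job failed (rc=$RC)" ;;
    esac
    return "$RC"
}

# The real call would be:
#   run_job tbuild -f export_data.tpt > export_data.tpt.log 2>&1
run_job true
```

Checking the return code in the wrapper lets a scheduler stop the batch instead of silently continuing after a failed export.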

Log entries will be as follows:

Teradata Parallel Transporter Version 15.00.00.02
Job log: /opt/teradata/client/15.00/tbuild/logs/root-38.out
Job id is root-38, running on TDExpress150001_Sles10
Teradata Parallel Transporter file_writer[1]: TPT19006 Version 15.00.00.02
Teradata Parallel Transporter SQL Selector Operator Version 15.00.00.02
export_item: private log not specified
file_writer[1]: TPT19010 Instance 1 directing private log report to 'dtacop-root-1447-1'.
file_writer[1]: TPT19007 DataConnector Consumer operator Instances: 1
file_writer[1]: TPT19003 ECI operator ID: 'file_writer-1447'
file_writer[1]: TPT19222 Operator instance 1 processing file 'export_tpt_item.txt'.
export_item: connecting sessions
export_item: sending SELECT request
export_item: retrieving data
export_item: Total Rows Exported:  60175
export_item: finished retrieving data
file_writer[1]: TPT19221 Total files processed: 1.
export_item: disconnecting sessions
export_item: Total processor time used = '0.07 Second(s)'
export_item: Start : Sat Mar 12 11:05:33 2016
export_item: End   : Sat Mar 12 11:05:38 2016
Job step MAIN_STEP completed successfully
Job root completed successfully
Job start: Sat Mar 12 11:05:27 2016
Job end:   Sat Mar 12 11:05:38 2016
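Since the wrapper tees the tbuild output to a log, a quick post-run check can pull the row count out of it. The sample log lines below are copied from the run above; the awk/grep patterns are just one way to match them:

```shell
# Stand-in log file containing the two lines the check cares about
LOG=/tmp/export_demo.log.$$
cat > "$LOG" <<'EOF'
export_item: Total Rows Exported:  60175
Job root completed successfully
EOF

# Extract the row count and confirm the job finished cleanly
ROWS=$(awk -F':' '/Total Rows Exported/ {gsub(/ /, "", $NF); print $NF}' "$LOG")
grep -q 'completed successfully' "$LOG" && echo "exported $ROWS rows"
rm -f "$LOG"
```

The row count can then be reconciled against the source table as a sanity check.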

Reach out to me if you have any doubts.
Special offer: register for the online 7-video course for just USD 15.




Friday, 17 August 2012

Fast-export Utility in Teradata with example: How to effectively send data from Teradata to client applications


Fast-export example:

As the definition says, FastExport is a command-driven utility that uses multiple sessions to quickly transfer large amounts of data from tables and views of the Teradata Database to a client-based application.

Example of Fast-export:


.logtable $LOADDB.LOGTABLE_ARRANGEMENT;
.run File $LOGON_FILE;
.begin export sessions 2;
.export outfile $VAR6 FORMAT TEXT MODE RECORD;
 
 
SELECT TRIM(CAST(HEAD1.HEADER AS CHAR(100))) FROM
(
SELECT
'¿'||
TRIM(A.ID_HEADER)
|| '|' ||TRIM(A.CLUSTER_NAME)
|| '|' || TRIM(A.BUS_DATE)
|| '|' ||TRIM(A.PRODUCT_NAME)
|| '|' ||TRIM(A.PROCESS_NAME)
|| '|' ||TRIM(A.FREQUENCY_NAME)
|| '|' ||TRIM(A.VERSION_NO)
|| '¿'
AS HEADER
FROM
(
SELECT
'01' AS ID_HEADER
,TRIM(CLUSTER_NAME) AS CLUSTER_NAME
, CAST(CAST(CAST('$BUS_DATE' AS DATE FORMAT 'YYYY-MM-DD') AS DATE FORMAT 'DDMMMYYYY') AS VARCHAR(12)) AS BUS_DATE
,TRIM(PRODUCT_NAME) AS PRODUCT_NAME
,TRIM(PROCESS_NAME) AS PROCESS_NAME
,TRIM(FREQUENCY_NAME) AS FREQUENCY_NAME
,MAX(VERSION_NO)+1 AS VERSION_NO
FROM
$LOADDB.VERSION_HEAD
WHERE CAST('$BUS_DATE' AS DATE FORMAT 'YYYY-MM-DD')= RUNDATE
AND PRODUCT_NAME='PT_007_LEA'
group by 1,2,3,4,5,6
) AS A
) AS HEAD1
;
.end export;
.logoff;
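The .run File command above pulls the logon from a separate file, so credentials stay out of the script. A minimal sketch of what $LOGON_FILE might contain (tdpid, username and password are placeholders):

```
.logon tdpid/username,password;
```

The script itself is typically fed to the utility from a wrapper, for example fexp < export_arrangement.fx > export_arrangement.log 2>&1 (file names illustrative).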


TIPS:

    • Convert date fields to varchar(12) before export.

    • In RECORD mode, FastExport prepends 2 bytes of record-length information to each record. To make those bytes easy to strip, the query wraps each record in the '¿' symbol.

    • Use a post-treatment step to remove the '¿' characters. Include the following command in the UNIX wrapper script:
      cat $VAR6 | cut -f 2 -d '¿' > $VAR6.txt
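One caveat with cut: '¿' is a multi-byte character in UTF-8, and some cut implementations only accept single-byte delimiters. A sed-based alternative keeps just the text between the two '¿' markers; the sample record below stands in for the contents of $VAR6:

```shell
IN=/tmp/fexp_demo.$$
# 'XX' stands in for the 2 header bytes FastExport prepends to each record
printf 'XX¿01|CLUSTER01|12MAR2016¿\n' > "$IN"

# Keep only the text between the first and last '¿' on each line
CLEAN=$(sed 's/.*¿\(.*\)¿.*/\1/' "$IN")
echo "$CLEAN"
rm -f "$IN"
```

Either approach works; the point is that the length bytes and the sentinel characters must both be gone before downstream systems parse the pipe-delimited fields.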

    Please let me know if you need more information.