Post 302899630 by islanderman on Wednesday 30th of April 2014 11:43:24 AM
Using BTEQ with Perl to Teradata

Has anyone used either bteq or mload to insert into a Teradata database from Perl? We're running Red Hat Enterprise Linux 5.7. I have the DBD::ODBC driver installed and running, and I installed the Teradata ICU, TeraGSS and ODBC packages. We are using Perl to insert data across the network into Teradata, but I wanted to know if there is a way to speed up the transfer. I am told that FastLoad will not work with Perl. (A batching sketch follows below.)
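There is no Perl binding for FastLoad, but there are two common workarounds: batch the inserts on the DBI side, or drive the command-line utilities (bteq, mload) through a pipe, as sketched in several of the discussions below. Here is a minimal DBI batching sketch; the DSN "tdprod", the table mydb.mytable, and the sample data are all hypothetical. execute_array() is standard DBI (DBD::ODBC may emulate it client-side), and disabling AutoCommit so the batch commits once still cuts per-row transaction overhead.

#!/usr/bin/perl
# Sketch: batch inserts through DBI instead of committing row by row.
# Hypothetical names: DSN "tdprod", table mydb.mytable (c1 INTEGER, c2 VARCHAR(50)).
use strict;
use warnings;
use DBI;

my @c1 = (1, 2, 3);                      # stand-in for the real data feed
my @c2 = ('alpha', 'beta', 'gamma');

my $dbh = DBI->connect('dbi:ODBC:tdprod', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });
my $sth = $dbh->prepare('INSERT INTO mydb.mytable (c1, c2) VALUES (?, ?)');

# execute_array() binds whole columns at once; ArrayTupleStatus records
# per-row outcomes so one bad row does not mask the rest.
$sth->execute_array({ ArrayTupleStatus => \my @status }, \@c1, \@c2);
$dbh->commit;
$dbh->disconnect;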
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Connectivity to Bteq through Perl

I want to connect to bteq from a Perl script and I am unsure how to do this. In a shell script it is very simple: Bteq<<-END .logon ..... ... .quit ... END. But what is the syntax for Perl? Please help me out. Thanks, Kunal (1 Reply; a Perl sketch follows below)
Discussion started by: kunal_dixit
1 Reply
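The direct Perl equivalent of that shell here-document is to open a pipe to bteq and print the same commands down it. A minimal sketch, with a placeholder logon string and query:

#!/usr/bin/perl
# Sketch: drive bteq from Perl through a pipe, mirroring the shell
# here-document. Logon string and query are placeholders.
use strict;
use warnings;

open my $bteq, '|-', 'bteq' or die "cannot start bteq: $!";
print $bteq <<'END_BTEQ';
.LOGON tdpid/username,password
SELECT DATE;
.QUIT
END_BTEQ
close $bteq or warn "bteq exited with status $?\n";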

2. Shell Programming and Scripting

Shell/perl script to connect to different servers in single login in teradata

Hi, I want to write a shell script to compare two tables in Teradata. These tables are on different servers, and I want to connect to both servers in a single login in order to fetch and compare the data in one go. Thanks (1 Reply; a two-connection sketch follows below)
Discussion started by: monika
1 Reply
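One way to reach both servers from a single script is to hold two DBI connections open at once and compare from Perl. A minimal sketch; the DSNs, credentials, and table names are all hypothetical, and it compares only row counts:

#!/usr/bin/perl
# Sketch: two simultaneous Teradata connections, compared in one pass.
# DSNs, credentials, and table names are placeholders.
use strict;
use warnings;
use DBI;

my $dbh1 = DBI->connect('dbi:ODBC:tdserver1', 'user1', 'pass1', { RaiseError => 1 });
my $dbh2 = DBI->connect('dbi:ODBC:tdserver2', 'user2', 'pass2', { RaiseError => 1 });

my ($n1) = $dbh1->selectrow_array('SELECT COUNT(*) FROM db1.mytable');
my ($n2) = $dbh2->selectrow_array('SELECT COUNT(*) FROM db2.mytable');

print $n1 == $n2 ? "row counts match ($n1)\n"
                 : "row counts differ: $n1 vs $n2\n";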

3. UNIX for Dummies Questions & Answers

Log file not getting populated when using BTEQ

Hi, I am trying to run some SQL scripts on a UNIX server using BTEQ. When I try to create a log file, the file gets populated only up to the point where I log into BTEQ; the log for the run of the actual script does not seem to be stored. Would anyone know why this could be... (3 Replies; a capture sketch follows below)
Discussion started by: zsrinathz
3 Replies
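One thing worth ruling out: if the log is built from stdout alone, anything BTEQ sends to stderr after the logon never reaches it. A minimal Perl sketch that merges both streams into one log; the paths are placeholders:

#!/usr/bin/perl
# Sketch: run a BTEQ script and capture stdout and stderr in a single
# log file, so nothing printed after .LOGON is lost. Paths are placeholders.
use strict;
use warnings;

open my $bteq, '-|', 'bteq < /path/to/script.bteq 2>&1'
    or die "cannot start bteq: $!";
open my $log, '>', '/path/to/run.log' or die "cannot open log: $!";
print {$log} $_ while <$bteq>;
close $bteq;
close $log;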

4. Programming

Unix - Teradata

I am trying to execute a SQL file from the script, and the SQL file has the following code snippet, which throws the error given below: FOR C_FINELINE_LP AS CURSOR C_SLS FOR SELECT * FROM WM_UTIL.FLT_DEP WHERE LOAD_IND = 'N' DO ..... ..... .... END FOR; FOR C_FLTSLS_STR_LP AS... (0 Replies)
Discussion started by: yschd
0 Replies

5. Programming

Connect To Teradata

How do I connect from a C program to a Teradata database? The C program is executed from a UNIX (AIX) shell script, and it runs some SQL against the Teradata database. (3 Replies)
Discussion started by: yschd
3 Replies

6. Shell Programming and Scripting

Teradata connectivity through UNIX using bteq

Hi, I want a script that connects to Teradata and loads a file into a Teradata table. Can you please help me out? Thanks in advance. (1 Reply; an .IMPORT sketch follows below)
Discussion started by: victory
1 Reply
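A minimal sketch of that flow, feeding a BTEQ .IMPORT script through a pipe from Perl. The logon string, file path, table, and columns are hypothetical; note that with VARTEXT every field in the USING clause must be declared VARCHAR:

#!/usr/bin/perl
# Sketch: load a pipe-delimited file into a Teradata table via BTEQ .IMPORT.
# Logon string, file path, table, and columns are placeholders.
use strict;
use warnings;

open my $bteq, '|-', 'bteq' or die "cannot start bteq: $!";
print $bteq <<'END_BTEQ';
.LOGON tdpid/username,password
.IMPORT VARTEXT '|' FILE = /data/incoming/feed.txt
.REPEAT *
USING (c1 VARCHAR(10), c2 VARCHAR(50))
INSERT INTO mydb.mytable (col1, col2) VALUES (:c1, :c2);
.QUIT
END_BTEQ
close $bteq or warn "bteq exited with status $?\n";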

7. Shell Programming and Scripting

How to pass parameter to bteq?

I am using the code below to connect to Teradata and write the query result to a file. Now I want to use the same code for different tables; please tell me how to pass the table name as a parameter. I tried the code below but it is not working: bteq < /download/viv/dev/ops/Scripts/ter.sh FLTORGTKR_ORG_etc.. ... (1 Reply; a parameterized sketch follows below)
Discussion started by: katakamvivek
1 Reply
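From Perl, the table name can be taken off the command line and interpolated into the script text before it is piped to bteq. A minimal sketch; the logon string, database, and output path are placeholders:

#!/usr/bin/perl
# Sketch: pass the table name as a command-line argument and splice it
# into the BTEQ script. Logon string, database, and paths are placeholders.
use strict;
use warnings;

my $table = shift @ARGV or die "usage: $0 TABLE_NAME\n";

open my $bteq, '|-', 'bteq' or die "cannot start bteq: $!";
print $bteq <<"END_BTEQ";
.LOGON tdpid/username,password
.EXPORT REPORT FILE = /tmp/$table.out
SELECT * FROM mydb.$table;
.EXPORT RESET
.QUIT
END_BTEQ
close $bteq or warn "bteq exited with status $?\n";

Invoked as, say, "perl export_table.pl MY_TABLE", the same script then serves every table.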

8. Shell Programming and Scripting

Teradata FastExport in ksh

Hi, I am trying to use Teradata FastExport in ksh, but I am getting the error below: temp1.ksh: line 7: syntax error at line 10: `newline' unexpected. Below is my code: #!/bin/ksh LOGON_STR="TDDB/user,paswd;" DATAFILE=/path/a.lst; DEBUG=0 >$DATAFILE fexp > /dev/null... (3 Replies)
Discussion started by: usrrenny
3 Replies

9. Shell Programming and Scripting

How to write BTEQ batch scripts in UNIX?

Hi all, I need to write a Unix shell script. To start with, I need to do some file checking on the Unix file system; then, based on file existence, I need to run different SQL in Teradata BTEQ. After that, depending on the results of the SQL, I need to do other shell scripting, like moving the file, within the same... (4 Replies; a flow sketch follows below)
Discussion started by: Shilpi Gupta
4 Replies
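A minimal Perl sketch of that flow: check for the file, run the BTEQ script, then branch on the exit status. All paths and the script name are placeholders:

#!/usr/bin/perl
# Sketch: file check -> run BTEQ -> act on its exit status.
# All paths are placeholders.
use strict;
use warnings;
use File::Copy qw(move);

my $infile = '/data/incoming/feed.txt';
exit 0 unless -e $infile;                 # nothing to do without the file

my $rc = system('bteq < /scripts/check.bteq > /logs/check.log 2>&1');
if ($rc == 0) {
    move($infile, '/data/processed/feed.txt') or die "move failed: $!";
} else {
    warn 'bteq returned ', $rc >> 8, ", leaving the file in place\n";
}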

10. Shell Programming and Scripting

MINUTE(4) issue in Teradata

I have the values below, for which the diff field gives an error like "invalid time interval" in Teradata. It might be that it stops doing the calculation once the difference exceeds what MINUTE(4) can hold. END_TS 2/2/2018 08:50:49.000000 START_TS 1/5/2018 17:30:02.000000 SLA_TIME 23:59:59.000000 select... (0 Replies)
Discussion started by: himanshupant
0 Replies