UNIX and Linux Applications: Does anybody know how to store my tables to a csv file?
Post 302429617 by zaxxon on Tuesday 15th of June 2010, 03:28:42 AM
I guess that's a basic task: use sqlplus to connect, fire off your SQL statement (a SELECT), and then either use one of sqlplus' own commands to write to an output file (SPOOL) or just redirect stdout to a file. There should be plenty of examples on the web. Strange you did not find any.

My first result on Google for "sqlplus script":
http://www.orafaq.com/wiki/SQL*Plus_FAQ
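As a minimal sketch (assuming a table called emp with columns empno, ename and sal, and a connect string scott/tiger@orcl - replace these with your own schema and credentials), a ksh snippet along these lines spools the rows as comma-separated values:

#!/bin/ksh
# Sketch: export one table to CSV with sqlplus and SPOOL.
# scott/tiger@orcl, the table emp and its columns are placeholders - adjust to your schema.
sqlplus -s scott/tiger@orcl <<'EOF'
set pagesize 0
set feedback off
set heading off
set echo off
set verify off
set trimspool on
set linesize 32767
spool /tmp/emp.csv
select empno || ',' || ename || ',' || sal from emp;
spool off
exit
EOF

Alternatively, drop the spool/spool off lines and redirect the whole here-document's output to a file instead, e.g. sqlplus -s user/pass@db <<'EOF' > table.csv.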
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Converting tables of row data into columns of tables

I am trying to transpose tables listed in the input format below into the output format. Any help would be greatly appreciated. Input: test_data_1 1 2 90% 4 3 91% 5 4 90% 6 5 90% 9 6 90% test_data_2 3 5 92% 5 4 92% 7 3 93% 9 2 92% 1 1 92% ... Output:... (7 Replies)
Discussion started by: justthisguy

2. Shell Programming and Scripting

Store table contents in csv file

I need to write a script to store the contents of a table in a csv file. I'm using Toad; it's an Oracle database. (5 Replies)
Discussion started by: ladyAnne

3. Shell Programming and Scripting

Execute stored procedure through script in sybase database and store the output in a .csv file

Hi, I have a Sybase stored procedure which takes two input parameters (start_date and end_date) and, when it gets executed, gives a few records as output. I want to write a unix script (ksh) which logs in to the Sybase database, then executes this stored procedure (takes the input parameter as... (8 Replies)
Discussion started by: amit.mathur08

4. Shell Programming and Scripting

need to store query output fields in variables edit them and update the same in tables.

Hi, I have a query like: select err_qty, drop_qty, unbld_qty, orig_qty from usage_data; I need to store the values of these fetched fields in variables, edit them, and update the new values in the table. Can anyone please help me in writing this piece of code? :( (1 Reply)
Discussion started by: Rajesh Putnala

5. Shell Programming and Scripting

Search and store value from .csv

Dear All, I am using the command below to find the value from a path: DYMV_STD_NAME=$( echo $file | sed 's#.*/*_\(*\).*#\1#' ) file = RRK11234_RKY5807_SRY000_HOME_20071010.zip It gives me the value DYMV_STD_NAME = RKY5807. Now I have a flat file as below. The content of the file would be as below. ... (1 Reply)
Discussion started by: yadavricky

6. Shell Programming and Scripting

Script to store the csv files into a particular folder

I want to write a unix shell script that stores the csv files into a particular folder. (2 Replies)
Discussion started by: RaghavendraT

7. Shell Programming and Scripting

Compare 2 files of csv file and match column data and create a new csv file of them

Hi, I am a newbie in shell scripting. I need your help to solve my problem. Firstly, I have 2 csv files and I want to compare their contents, then write the output to a new csv file. File1: SourceFile,DateTimeOriginal /home/intannf/foto/IMG_0713.JPG,2015:02:17 11:14:07... (8 Replies)
Discussion started by: refrain

8. Shell Programming and Scripting

Splitting csv into 3 tables in html file

I have the data in csv in 3 tables. How can I output the same into 3 tables in html? Also, how can I set the width? Tried multiple options; attached is the format. #!/bin/ksh awk 'BEGIN{ FS="," print "<HTML><BODY><TABLE border = '1' cellpadding=10 width=100>" print... (7 Replies)
Discussion started by: archana25

9. Shell Programming and Scripting

Read CSV file and delete hdfs, hive and hbase tables

I have a CSV file with hdfs directories, hive tables and hbase tables. 1. first column - hdfs directories 2. second column - hive tables 3. third column - hbase tables I have to check the csv file, look at the first column and delete the hdfs directory from the hdfs path, now... (2 Replies)
Discussion started by: shivamayam

10. UNIX for Beginners Questions & Answers

Export Oracle multiple tables to multiple csv files using UNIX shell scripting

Hello All, I just want to export multiple tables from Oracle SQL to csv files using a unix shell script, and the code below is exporting only the first table. Can you please suggest why, or any better idea? export FILE="/abc/autom/file/geo_JOB.csv" Export= `sqlplus -s dev01/password@dEV3... (16 Replies)
Discussion started by: Hope