Full Discussion: Convert sql file to csv file
Post 302913584 by blackrageous on Monday 18th of August 2014, 03:44:15 PM
Post a sample of the file.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

script to convert CSV to SQL

I am trying to write a script to convert csv files into SQL format. What I am missing is: 1. handling of the first row, turning it into the INSERT INTO column list; 2. better handling of every other row, turning it into a line of data. The while read line loop needs better handling; I thought of using awk. ... (1 Reply)
Discussion started by: saariko
1 Reply
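One possible awk starting point for that task, assuming the first line of input.csv is a header row, no field contains an embedded comma, and mytable is a placeholder table name:

Code:
#!/bin/sh
# Sketch: build one INSERT per data row from a simple CSV.
# Limitation: does not handle quoted fields that contain commas.
awk -F',' -v q="'" '
NR == 1 { cols = $0; next }          # remember the header row as the column list
{
    vals = ""
    for (i = 1; i <= NF; i++)
        vals = vals (i > 1 ? "," : "") q $i q
    printf "INSERT INTO mytable (%s) VALUES (%s);\n", cols, vals
}' input.csv > output.sql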

2. Shell Programming and Scripting

How to convert tab delimited file to .csv file

Hi, Can anyone please help me convert a tab delimited file into a .csv file? Records in my file are similar to those mentioned below: DET 001 0201 AC032508970 01478E1X8 DET 002 0202 AC032508971 01478E1X8 Could anyone please suggest which approach would be more suitable for this, or if... (5 Replies)
Discussion started by: dtidke
5 Replies
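For records shaped like the ones quoted above, either of these should work, assuming the data itself contains no commas or tabs:

Code:
# If the fields are separated by single tabs, just swap the delimiter:
tr '\t' ',' < input.txt > output.csv

# If the fields are padded with runs of spaces or tabs, let awk re-split them:
awk -v OFS=',' '{ $1 = $1; print }' input.txt > output.csv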

3. Shell Programming and Scripting

How to convert a excel file to a .csv file from unix script

Hi, I have an excel file on a unix machine and have to convert it into a .csv file. I have to do this from a unix script. How do we do this? Thanks Abhinav (3 Replies)
Discussion started by: akashtcs
3 Replies
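Plain shell cannot parse .xls, so any answer depends on an external converter being installed; both of the commands below are standard invocations of their packages, but the file names are placeholders:

Code:
# Gnumeric's command-line converter:
ssconvert input.xls input.csv

# Or LibreOffice in headless mode (writes input.csv into the given directory):
libreoffice --headless --convert-to csv --outdir . input.xls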

4. Shell Programming and Scripting

How to use sql data file in unix csv file as input to an sql query from shell

Hi, I used the below script to get sql data into a csv file using unix scripting. I am getting the output into an output file, but the output is not split into separate columns. #!/bin/ksh export FILE_PATH=/maav/home/xyz/abc/ rm $FILE_PATH/sample.csv sqlplus -s... (2 Replies)
Discussion started by: Nareshp
2 Replies
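A sketch of the usual SQL*Plus formatting settings that keep the spooled file close to CSV; the connect string, table, and columns are placeholders, and FILE_PATH reuses the path quoted in the post:

Code:
#!/bin/ksh
export FILE_PATH=/maav/home/xyz/abc
rm -f $FILE_PATH/sample.csv
sqlplus -s user/password@DB <<EOF
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SET COLSEP ','
SET LINESIZE 500
SPOOL $FILE_PATH/sample.csv
SELECT col1, col2, col3 FROM some_table;
SPOOL OFF
EOF

Note that COLSEP still pads each column to its display width; concatenating the columns in the SELECT itself (col1||','||col2) gives a tighter CSV.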

5. Shell Programming and Scripting

Convert sql output to csv file via bash tools

Hi, can anybody help me with converting the following structure into a csv file for Windows: BAT_ID ID_num CVS_LINE A_SEG SKILL_TO A_CUSTOMER_TYPE --------- ---------- --------------------------------- ---------- ------------------ ----------- 14-MAY-11 777752 ... (4 Replies)
Discussion started by: kvok
4 Replies
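For output shaped like the sample above (a header row, a dashed underline, then space-padded data rows), a small awk sketch; file names are placeholders, and the trailing carriage return is only there so Windows tools see CRLF line endings:

Code:
# Drop the dashed underline row, squeeze the space padding into single commas,
# and terminate each line with CR so the result is a DOS/Windows text file.
awk -v OFS=',' '/^-/ { next } { $1 = $1; print $0 "\r" }' query_output.txt > report.csv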

6. Shell Programming and Scripting

Convert CSV file (with double quoted strings) to pipe delimited file

Hi, could someone help me convert a CSV file (with double quoted strings) to a pipe delimited file? Here is some sample data: 1,Friends,"$3.99 per 1,000 listings",8158 (here "1,000 listings" should stay part of a single field). Thanks, Ram (8 Replies)
Discussion started by: Ram.Math
8 Replies
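One way to keep "1,000 listings" intact is GNU awk's FPAT, which describes what a field looks like rather than what separates fields; this sketch therefore needs gawk, and the file names are placeholders:

Code:
gawk -v FPAT='([^,]*)|("[^"]*")' -v OFS='|' '
{
    for (i = 1; i <= NF; i++) gsub(/^"|"$/, "", $i)   # drop the surrounding quotes
    $1 = $1                                           # rebuild the record with OFS
    print
}' input.csv > output.txt

With the sample line this prints 1|Friends|$3.99 per 1,000 listings|8158.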

7. Shell Programming and Scripting

Awk to convert a text file to CSV file with some string manipulation

Hi , I have a simple text file with contents as below: 12345678900 971,76 4234560890 22345678900 5971,72 5234560990 32345678900 71,12 6234560190 the new csv-file should be like: Column1;Column2;Column3;Column4;Column5 123456;78900;971,76;423456;0890... (9 Replies)
Discussion started by: FreddyDaKing
9 Replies
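Going by the sample rows quoted above, the first and third blank-separated fields are split after their first six characters; a short awk sketch of exactly that (file names are placeholders):

Code:
awk 'BEGIN { print "Column1;Column2;Column3;Column4;Column5" }
{ print substr($1, 1, 6) ";" substr($1, 7) ";" $2 ";" substr($3, 1, 6) ";" substr($3, 7) }
' input.txt > output.csv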

8. Shell Programming and Scripting

How to convert excel file to csv file or text file?

Hi all, I need to find a way to convert an excel file into a csv or a text file using linux commands. The reason is I have hundreds of files to convert. Another complication is that I need to delete the first 5 lines of the excel file before conversion. So for instance input.xls description of... (6 Replies)
Discussion started by: johnkim0806
6 Replies
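A bulk-conversion sketch, again assuming a converter such as Gnumeric's ssconvert is installed and that the first five spreadsheet rows come out as the first five CSV lines, so the trim can happen after conversion:

Code:
#!/bin/sh
for f in *.xls; do
    csv="${f%.xls}.csv"
    ssconvert "$f" "$csv"                           # xls -> csv (needs Gnumeric)
    sed '1,5d' "$csv" > "${csv%.csv}.trimmed.csv"   # drop the first 5 lines
done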

9. UNIX for Advanced & Expert Users

Convert CSV file to nested XML file using UNIX/PERL?

We have a CSV which I need to convert to XML using Perl or Unix shell scripting. I was able to build this XML in an oracle database; however, the SQL/XML query runs for a long time. Hence, I'm considering writing a Perl or shell script to generate this XML file. Basically I need to build this XML... (3 Replies)
Discussion started by: laknar
3 Replies
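The real nesting rules live in the poster's SQL/XML query, so the element names below (<records>, <record>, <fieldN>) are invented purely to show the shape of an awk-based generator; it also does not escape XML special characters:

Code:
awk -F',' 'BEGIN { print "<records>" }
{
    print "  <record>"
    for (i = 1; i <= NF; i++)
        printf "    <field%d>%s</field%d>\n", i, $i, i
    print "  </record>"
}
END { print "</records>" }' input.csv > output.xml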

10. Shell Programming and Scripting

Read csv file, convert the data and make one text file in UNIX shell scripting

I have input data that looks like this, which is part of a csv file: 7,1265,76548,"0102:04" 8,1266,76545,"0112:04" I need the output data to look like this, as part of a text file: 7|1265000 |7654899 |A| 8|12660000 |76545999 |B| The logic behind the... (6 Replies)
Discussion started by: RJG
6 Replies
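The transformation rules are cut off in the excerpt above, so this is only a skeleton: it strips the quotes from the last field and keeps the output pipe-delimited, leaving the real padding and flag logic to be filled in where each field is printed:

Code:
awk -F',' '
{
    gsub(/"/, "", $4)                  # remove the quotes around the last field
    # Replace each $n below with whatever reformatting the real rules require.
    print $1 "|" $2 "|" $3 "|" $4 "|"
}' input.csv > output.txt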
MYSQLDBIMPORT(1)						  MySQL Utilities						  MYSQLDBIMPORT(1)

NAME
    mysqldbimport - Import object definitions or data into a database

SYNOPSIS
    mysqldbimport [options] import_file ...

DESCRIPTION
    This utility imports metadata (object definitions) or data or both for one or more databases from one or more files. If an object
    exists on the destination server with the same name as an imported object, it is dropped first before importing the new object.

    To skip objects by type, use the --skip option with a list of the objects to skip. This enables you to extract a particular set of
    objects, say, for importing only events (by excluding all other types). Similarly, to skip creation of UPDATE statements for BLOB
    data, specify the --skip-blobs option.

    To specify the input format, use one of the following values with the --format option. These correspond to the output formats of the
    mysqldbexport utility:

    sql (default)
        Input consists of SQL statements. For definitions, this consists of the appropriate CREATE and GRANT statements. For data, this
        is an INSERT statement (or bulk insert if the --bulk-insert option is specified).

    grid
        Display output in grid or table format like that of the mysql monitor.

    csv
        Input is formatted in comma-separated values format.

    tab
        Input is formatted in tab-separated format.

    vertical
        Display output in single-column format like that of the \G command for the mysql monitor.

    To indicate that input in csv or tab format does not contain column headers, specify the --no-headers option.

    To turn off all feedback information, specify the --quiet option.

    By default, the utility creates each table on the destination server using the same storage engine as the original table. To override
    this and specify the storage engine to use for all tables created on the destination server, use the --new-storage-engine option. If
    the destination server supports the new engine, all tables use that engine.

    To specify the storage engine to use for tables for which the destination server does not support the original storage engine on the
    source server, use the --default-storage-engine option.

    The --new-storage-engine option takes precedence over --default-storage-engine if both are given. If the --new-storage-engine or
    --default-storage-engine option is given and the destination server does not support the specified storage engine, a warning is
    issued and the server's default storage engine setting is used instead.

    You must provide connection parameters (user, host, password, and so forth) for an account that has the appropriate privileges to
    access all objects in the operation. For details, see NOTES.

OPTIONS
    mysqldbimport accepts the following command-line options:

    --help
        Display a help message and exit.

    --bulk-insert, -b
        Use bulk insert statements for data.

    --default-storage-engine=<def_engine>
        The engine to use for tables if the destination server does not support the original storage engine on the source server.

    --drop-first, -d
        Drop each database to be imported if it exists before importing anything into it.

    --dryrun
        Import the files and generate the statements but do not execute them. This is useful for testing input file validity.

    --format=<format>, -f <format>
        Specify the input format. Permitted format values are sql, grid, tab, csv, and vertical. The default is sql.

    --import=<import_type>, -i <import_type>
        Specify the import type. Permitted values are definitions (import only the definitions, or metadata, for the objects in the
        database list), data (import only the table data for the tables in the database list), and both (import the definitions and the
        data). The default is definitions. If you attempt to import objects into an existing database, the result depends on the import
        type. If the type is definitions or both, an error occurs unless --drop-first is given. If the type is data, imported table data
        is added to existing table data.

    --new-storage-engine=<new_engine>
        The engine to use for all tables created on the destination server.

    --no-headers, -h
        Input does not contain column headers. This option applies only to csv and tab input.

    --quiet, -q
        Turn off all messages for quiet execution.

    --server=<server>
        Connection information for the server in <user>[:<passwd>]@<host>[:<port>][:<socket>] format.

    --skip=<skip_objects>
        Specify objects to skip in the operation as a comma-separated list (no spaces). Permitted values are CREATE_DB, DATA, EVENTS,
        FUNCTIONS, GRANTS, PROCEDURES, TABLES, TRIGGERS, and VIEWS.

    --skip-blobs
        Do not import BLOB data.

    --skip-rpl
        Do not execute replication commands.

    --verbose, -v
        Specify how much information to display. Use this option multiple times to increase the amount of information. For example,
        -v = verbose, -vv = more verbose, -vvv = debug.

    --version
        Display version information and exit.

NOTES
    The login user must have the appropriate permissions to create new objects, access (read) the mysql database, and grant privileges.
    If a database to be imported already exists, the user must have read permission for it, which is needed to check the existence of
    objects in the database.

    Actual privileges needed may differ from installation to installation depending on the security privileges present and whether the
    database contains certain objects such as views or events and whether binary logging is enabled.

    Some combinations of the options may result in errors during the operation. For example, excluding tables but not views may result
    in an error when a view is imported.

    The --new-storage-engine and --default-storage-engine options apply to all destination tables in the operation.

    For the --format and --import options, the permitted values are not case sensitive. In addition, values may be specified as any
    unambiguous prefix of a valid value. For example, --format=g specifies the grid format. An error occurs if a prefix matches more
    than one valid value.

EXAMPLES
    To import the metadata from the util_test database to the server on the local host using a file in CSV format, use this command:

        $ mysqldbimport --server=root@localhost --import=definitions --format=csv data.csv
        # Source on localhost: ... connected.
        # Importing definitions from data.csv.
        #...done.

    Similarly, to import the data from the util_test database to the server on the local host, importing the data using bulk insert
    statements, use this command:

        $ mysqldbimport --server=root@localhost --import=data --bulk-insert --format=csv data.csv
        # Source on localhost: ... connected.
        # Importing data from data.csv.
        #...done.

    To import both data and definitions from the util_test database, importing the data using bulk insert statements from a file that
    contains SQL statements, use this command:

        $ mysqldbimport --server=root@localhost --import=both --bulk-insert --format=sql data.sql
        # Source on localhost: ... connected.
        # Importing definitions and data from data.sql.
        #...done.

COPYRIGHT
    Copyright (c) 2010, 2012, Oracle and/or its affiliates. All rights reserved.

    This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as
    published by the Free Software Foundation; version 2 of the License.

    This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

    You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software
    Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA

AUTHOR
    MySQL Utilities Team