Using bash script: How to import data from a DSV file into multiple tables in MySQL
Post 303031062 by tera, 21 February 2019, 04:32 AM
Quote:
Originally Posted by steveo314
I use Perl and Bash every day. I have written a lot of scripts that I use over and over. Delimited files aren't hard to work with in Perl. Look at Perl's split routine to start with, and at file handles. Reach out to me if you need help with it.
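As a minimal sketch of the split() approach mentioned above, reading one '|'-delimited record looks roughly like this (the variable names simply mirror the columns described further down):
Code:
#!/usr/bin/perl
use strict;
use warnings;

open my $fh, '<', 'path/to/dsv' or die "Cannot open file: $!";
while (my $line = <$fh>) {
    chomp $line;
    # split on the literal '|' delimiter; the pipe must be escaped in the regex
    my ($record_number, $id_number, $first_name, $last_name,
        $msisdn, $network, $points, $card_number, $gender) = split /\|/, $line;
    # ... work with the individual fields here ...
}
close $fh;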
I have written an SQL query. My intention is to read the file and store the data in a tmp table, and after that to read from it and insert into two tables that are related through a many-to-many relationship. Is this the right approach?

If it is, how can I convert the following SQL statements into Perl scripting syntax?
Code:
CREATE TEMPORARY TABLE tmp_import (
    `record_number` INT(10) NOT NULL PRIMARY KEY,
    id_number       VARCHAR(50),
    `first name`    VARCHAR(255),
    `last name`     VARCHAR(255),
    msisdn          VARCHAR(50),
    network         VARCHAR(20),
    points          INT,
    `card number`   VARCHAR(100),
    gender          VARCHAR(1)
);

LOAD DATA LOCAL INFILE 'path/to/dsv' 
INTO TABLE tmp_import 
FIELDS TERMINATED BY '|' 
LINES TERMINATED BY '\n';

DELETE FROM tmp_import WHERE record_number = 0;

-- select * from tmp_import;

INSERT INTO tUSER (id_number ,first_names,last_name)
SELECT id_number, `first name`, `last name`
FROM   tmp_import;

INSERT INTO tTYPES (`type` ,cellphone , description)
SELECT network, `msisdn`, `gender`
FROM   tmp_import;
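One way to drive these statements from Perl is the DBI module with the mysql driver. The sketch below assumes that combination; the DSN, database name, user and password are placeholders only, and mysql_local_infile=1 is set so that LOAD DATA LOCAL INFILE is allowed from the client side:
Code:
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder connection details -- substitute your own database, host and credentials.
my $dbh = DBI->connect(
    'DBI:mysql:database=mydb;host=localhost;mysql_local_infile=1',
    'user', 'password',
    { RaiseError => 1, AutoCommit => 1 },
);

# Run the statements above one at a time; q{} keeps '\n' literal for MySQL.
$dbh->do(q{
    CREATE TEMPORARY TABLE tmp_import (
        `record_number` INT(10) NOT NULL PRIMARY KEY,
        id_number       VARCHAR(50),
        `first name`    VARCHAR(255),
        `last name`     VARCHAR(255),
        msisdn          VARCHAR(50),
        network         VARCHAR(20),
        points          INT,
        `card number`   VARCHAR(100),
        gender          VARCHAR(1)
    )
});

$dbh->do(q{
    LOAD DATA LOCAL INFILE 'path/to/dsv'
    INTO TABLE tmp_import
    FIELDS TERMINATED BY '|'
    LINES TERMINATED BY '\n'
});

$dbh->do(q{DELETE FROM tmp_import WHERE record_number = 0});

$dbh->do(q{
    INSERT INTO tUSER (id_number, first_names, last_name)
    SELECT id_number, `first name`, `last name`
    FROM tmp_import
});

$dbh->do(q{
    INSERT INTO tTYPES (`type`, cellphone, description)
    SELECT network, `msisdn`, `gender`
    FROM tmp_import
});

$dbh->disconnect;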

And how can I grab each id (PK) from both tables after inserting, and store them in an associative array so I can insert them later into the many-to-many bridge table? This could be totally off; your suggestions are welcome.
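For the key-capture part, one possible approach (continuing the sketch above) is to insert row by row instead of using the two set-based INSERT ... SELECT statements, recording $dbh->last_insert_id() in a hash keyed by id_number. This assumes the primary keys of tUSER and tTYPES are AUTO_INCREMENT columns, and the bridge table name and columns (tUSER_tTYPES, user_id, type_id) are hypothetical -- adjust them to the real schema:
Code:
# Hashes acting as the "associative arrays": id_number => generated PK.
# Note: this row-at-a-time pass replaces the two INSERT ... SELECT statements.
my (%user_id_for, %type_id_for);

# Pull the staged rows back out of the temporary table as hashrefs.
my $rows = $dbh->selectall_arrayref(
    q{SELECT id_number, `first name`, `last name`, network, msisdn, gender
      FROM tmp_import},
    { Slice => {} },
);

my $ins_user = $dbh->prepare(
    q{INSERT INTO tUSER (id_number, first_names, last_name) VALUES (?, ?, ?)});
my $ins_type = $dbh->prepare(
    q{INSERT INTO tTYPES (`type`, cellphone, description) VALUES (?, ?, ?)});

for my $row (@$rows) {
    $ins_user->execute(@{$row}{'id_number', 'first name', 'last name'});
    $user_id_for{ $row->{id_number} } =
        $dbh->last_insert_id(undef, undef, 'tUSER', undef);

    $ins_type->execute(@{$row}{'network', 'msisdn', 'gender'});
    $type_id_for{ $row->{id_number} } =
        $dbh->last_insert_id(undef, undef, 'tTYPES', undef);
}

# Pair the captured keys up in the bridge table (hypothetical name and columns).
my $ins_link = $dbh->prepare(
    q{INSERT INTO tUSER_tTYPES (user_id, type_id) VALUES (?, ?)});
$ins_link->execute($user_id_for{$_}, $type_id_for{$_}) for keys %user_id_for;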
 
