Using a bash script: how to import data from a DSV file into multiple tables in MySQL
Post 303031062 by tera, UNIX for Beginners Questions & Answers, Thursday 21 February 2019, 04:32 AM
Quote:
Originally Posted by steveo314
I use Perl and Bash every day and have written a lot of scripts that I use over and over. Delimited files aren't hard to work with in Perl. Start with Perl's split function, and look at filehandles. Reach out to me if you need help with it.
I have written an SQL query. My intention is to read the data into a temporary table, then read from that table and insert into two tables that are related through a many-to-many relationship. Is this the right approach?

If it is, how can I convert the following SQL statements into Perl?
Code:
CREATE TEMPORARY TABLE tmp_import (
    `record_number` INT(10) NOT NULL PRIMARY KEY,
    id_number       VARCHAR(50),
    `first name`    VARCHAR(255),
    `last name`     VARCHAR(255),
    msisdn          VARCHAR(50),
    network         VARCHAR(20),
    points          INT,
    `card number`   VARCHAR(100),
    gender          VARCHAR(1)
);

LOAD DATA LOCAL INFILE 'path/to/dsv'
INTO TABLE tmp_import
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';

-- discard rows whose record_number loaded as 0 (e.g. a non-numeric header line)
DELETE FROM tmp_import WHERE record_number = 0;

-- select * from tmp_import;

INSERT INTO tUSER (id_number, first_names, last_name)
SELECT id_number, `first name`, `last name`
FROM   tmp_import;

INSERT INTO tTYPES (`type`, cellphone, description)
SELECT network, msisdn, gender
FROM   tmp_import;

And how can I grab each id (PK) from both tables after inserting, and store them in an associative array so that I can later insert them into the many-to-many bridge table? This could be totally off; your suggestions are welcome.
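One way to sketch this approach in Perl is with the DBI module: run the staging statements with do(), then loop over the staged rows, inserting into both tables and capturing each generated key with last_insert_id() in a hash keyed by record_number. This is a minimal sketch, not a finished script: the DSN, the credentials, the bridge table tUSER_TYPES (user_id, type_id), and AUTO_INCREMENT primary keys on tUSER and tTYPES are all assumptions. Note that LOAD DATA LOCAL INFILE only works if the client enables mysql_local_infile.
Code:
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

# DSN and credentials are placeholders; mysql_local_infile=1 is needed
# for LOAD DATA LOCAL INFILE to be allowed from the client side.
my $dbh = DBI->connect(
    'DBI:mysql:database=mydb;mysql_local_infile=1',
    'user', 'password',
    { RaiseError => 1 }
);

# Stage the file into the temporary table (one statement per do() call).
$dbh->do(q{
    CREATE TEMPORARY TABLE tmp_import (
        `record_number` INT NOT NULL PRIMARY KEY,
        id_number VARCHAR(50), `first name` VARCHAR(255),
        `last name` VARCHAR(255), msisdn VARCHAR(50),
        network VARCHAR(20), points INT,
        `card number` VARCHAR(100), gender VARCHAR(1)
    )
});
$dbh->do(q{
    LOAD DATA LOCAL INFILE 'path/to/dsv'
    INTO TABLE tmp_import
    FIELDS TERMINATED BY '|'
    LINES TERMINATED BY '\n'
});
$dbh->do(q{DELETE FROM tmp_import WHERE record_number = 0});

# Insert row by row, remembering both generated keys per record.
my %bridge;    # record_number => [user_id, type_id]
my $rows = $dbh->selectall_arrayref(
    q{SELECT record_number, id_number, `first name`, `last name`,
             msisdn, network, gender
      FROM tmp_import},
    { Slice => {} }
);
my $ins_user = $dbh->prepare(
    'INSERT INTO tUSER (id_number, first_names, last_name) VALUES (?,?,?)');
my $ins_type = $dbh->prepare(
    'INSERT INTO tTYPES (`type`, cellphone, description) VALUES (?,?,?)');

for my $r (@$rows) {
    $ins_user->execute($r->{id_number}, $r->{'first name'}, $r->{'last name'});
    my $user_id = $dbh->last_insert_id(undef, undef, 'tUSER', undef);

    $ins_type->execute($r->{network}, $r->{msisdn}, $r->{gender});
    my $type_id = $dbh->last_insert_id(undef, undef, 'tTYPES', undef);

    $bridge{ $r->{record_number} } = [ $user_id, $type_id ];
}

# Fill the (assumed) many-to-many bridge table from the collected keys.
my $ins_bridge = $dbh->prepare(
    'INSERT INTO tUSER_TYPES (user_id, type_id) VALUES (?,?)');
$ins_bridge->execute(@$_) for values %bridge;

$dbh->disconnect;

If the primary keys are not AUTO_INCREMENT columns, last_insert_id() has nothing to report, and you would instead SELECT the keys back by a unique column (e.g. id_number) before filling the bridge table.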
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Import data from compressed file

Hi, I need to import data from a file that is in compressed format, but the system doesn't have enough space to uncompress the file. Is there any way I can import directly from the compressed file? (4 Replies)
Discussion started by: ap_gore79
4 Replies
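If the file is gzip-compressed (an assumption; swap in the matching decompressor otherwise), one common pattern is to decompress to a pipe so nothing is ever written to disk. A minimal Perl sketch:
Code:
#!/usr/bin/env perl
use strict;
use warnings;

# Stream the compressed file through gzip -dc instead of uncompressing
# it on disk; 'data.csv.gz' is a hypothetical file name.
open(my $fh, '-|', 'gzip', '-dc', 'data.csv.gz')
    or die "cannot spawn gzip: $!";

while (my $line = <$fh>) {
    chomp $line;
    my @fields = split /,/, $line;
    # ... feed @fields to the import (e.g. a DBI INSERT) here ...
}
close $fh or die "gzip exited with status $?";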

2. Shell Programming and Scripting

Converting tables of row data into columns of tables

I am trying to transpose tables from the row format shown below into columns. Any help would be greatly appreciated.
Input:
test_data_1
1 2 90%
4 3 91%
5 4 90%
6 5 90%
9 6 90%
test_data_2
3 5 92%
5 4 92%
7 3 93%
9 2 92%
1 1 92%
... Output:... (7 Replies)
Discussion started by: justthisguy
7 Replies

3. Shell Programming and Scripting

Data Import perl script

Hi, I have a requirement to create a Perl script that performs a data import process in an automated way. I am elaborating herewith: Section 1) use the following command line format: "./import.pl -h hostname -p port -f datafile.txt" Section 2) datafile.txt will... (3 Replies)
Discussion started by: scott_apc
3 Replies
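A minimal skeleton for that command-line format, using the core Getopt::Std module (the import logic itself is left as a placeholder):
Code:
#!/usr/bin/env perl
use strict;
use warnings;
use Getopt::Std;

# Accept: ./import.pl -h hostname -p port -f datafile.txt
my %opt;
getopts('h:p:f:', \%opt);
for my $o (qw(h p f)) {
    defined $opt{$o} or die "usage: $0 -h host -p port -f datafile\n";
}

open(my $fh, '<', $opt{f}) or die "cannot open $opt{f}: $!";
while (my $line = <$fh>) {
    chomp $line;
    # import logic goes here, connecting to $opt{h}:$opt{p}
}
close $fh;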

4. Shell Programming and Scripting

Reading data from multiple tables from Oracle DB

Hi, I want to read the data from 9 tables in an Oracle DB into 9 different files in the same connection instance (session). I am able to get data from one table into one file with the code below:
X=`sqlplus -s user/pwd@DB <<eof
select col1 from table1;
EXIT;
eof`
echo $X > myfile
Can anyone... (2 Replies)
Discussion started by: net
2 Replies
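Since the question asks for a single session, a Perl DBI loop is a natural fit: one connect(), then one SELECT-and-write pass per table. A sketch assuming DBD::Oracle, with a placeholder DSN and the generic table/column names from the post:
Code:
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

# One connection reused for all nine tables; DSN/credentials are placeholders.
my $dbh = DBI->connect('dbi:Oracle:MYDB', 'user', 'pwd', { RaiseError => 1 });

for my $table (map { "table$_" } 1 .. 9) {
    open(my $out, '>', "$table.out") or die "cannot write $table.out: $!";
    my $sth = $dbh->prepare("SELECT col1 FROM $table");
    $sth->execute;
    while (my ($col1) = $sth->fetchrow_array) {
        print {$out} "$col1\n";
    }
    close $out;
}
$dbh->disconnect;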

5. Shell Programming and Scripting

Shell snip to import CSV data into BASH array

I have been trying to write a simple snippet of bash shell code to import from 1 to 100 records into a BASH array. I have a CSV file that is structured like: record1,item1,item2,item3,item4,etc.,etc. .... (<= 100 items) record2,item1,item2,item3,item4,etc.,etc. .... (<= 100 items)... (5 Replies)
Discussion started by: dstrout
5 Replies

6. Web Development

mysql query for multiple columns from multiple tables in a DB

Say I have two tables like below.
status:
HId sName dName StartTime EndTime
1   E     E     9:10      10:10
2   E     F     9:15      10:15
3   G     H     9:17      10:00
logic:
Id devName capacity free Line
1  E       123      34   1
2  E       345      ... (3 Replies)
Discussion started by: ilan
3 Replies
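The usual way to pull columns from several tables in one query is a JOIN. A hedged Perl DBI sketch; the join condition (status.sName = logic.devName) and the database name are guesses about the schema:
Code:
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

# Hypothetical database name and credentials.
my $dbh = DBI->connect('DBI:mysql:database=mydb', 'user', 'password',
                       { RaiseError => 1 });

# Columns from both tables in a single query; the ON clause is an assumption.
my $sth = $dbh->prepare(q{
    SELECT s.HId, s.sName, s.StartTime, s.EndTime, l.capacity, l.free
    FROM   status s
    JOIN   logic  l ON l.devName = s.sName
});
$sth->execute;
while (my @row = $sth->fetchrow_array) {
    print join("\t", @row), "\n";
}
$dbh->disconnect;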

7. Shell Programming and Scripting

Bash script with python slicing on multiple data files

I have 2 files generated in Linux that have common output and were produced across multiple hosts with the same setup/configs. These files do some simple reporting on resource allocation and user sessions. So, essentially, say, 10 hosts, with the same two systems reporting in the files, so a... (0 Replies)
Discussion started by: jdubbz
0 Replies

8. Shell Programming and Scripting

Append data by looking up 2 tables for multiple files

I want to look up values from two different tables based on common columns and append them. The trick is that the column to be looked up is not fixed and varies, so it has to be detected from the header. How can I achieve this at once, for multiple data files, with the lookup tables fixed? The two lookup... (5 Replies)
Discussion started by: ritakadm
5 Replies
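The header-detection part can be sketched in Perl: load the lookup table into a hash, scan the header for the key column's position, then append the looked-up value to every data row. Whitespace-delimited files, the file names, and a header column literally named 'key' are all assumptions here:
Code:
#!/usr/bin/env perl
use strict;
use warnings;
use List::Util qw(first);

# Load the lookup table (one "key value" pair per line) into a hash.
my %lookup;
open(my $lt, '<', 'lookup.txt') or die "lookup.txt: $!";
while (<$lt>) {
    my ($k, $v) = split;
    $lookup{$k} = $v;
}
close $lt;

open(my $in, '<', 'data.txt') or die "data.txt: $!";
my @header = split ' ', scalar <$in>;

# Detect which column holds the key by scanning the header line.
my $idx = first { $header[$_] eq 'key' } 0 .. $#header;
die "no 'key' column found\n" unless defined $idx;

print join(' ', @header, 'appended'), "\n";
while (<$in>) {
    my @f = split;
    push @f, $lookup{ $f[$idx] } // 'NA';   # append the looked-up value
    print join(' ', @f), "\n";
}
close $in;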

9. Shell Programming and Scripting

In Perl script: need to read the data from one file and generate multiple files based on the data

We have data that looks like the example below in a log file. I want to generate files based on the string between the two hash (#) symbols, like below.
Source:
#ext1#test1.tale2 drop
#ext1#test11.tale21 drop
#ext1#test123.tale21 drop
#ext2#test1.tale21 drop
#ext2#test12.tale21 drop
#ext3#test11.tale21 drop... (5 Replies)
Discussion started by: Sanjeev G
5 Replies
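A minimal Perl sketch for this kind of split: capture the token between the first two '#' symbols and keep one output filehandle per token, created on first sight. The input and output file names are assumptions:
Code:
#!/usr/bin/env perl
use strict;
use warnings;

my %out;    # token => output filehandle
open(my $log, '<', 'source.log') or die "source.log: $!";
while (my $line = <$log>) {
    # e.g. '#ext1#test1.tale2 drop' => token 'ext1'
    next unless $line =~ /^#([^#]+)#/;
    my $key = $1;
    unless ($out{$key}) {
        open($out{$key}, '>', "$key.txt") or die "$key.txt: $!";
    }
    print { $out{$key} } $line;
}
close $log;
close $_ for values %out;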

10. Shell Programming and Scripting

Shell script automation using cron which queries MySQL tables

What I have: an input.sh script which basically connects to a MySQL DB and queries multiple tables, writing the output to an output1.out file in a directory. Note: I need to pass an integer (unique_id, anything between 1 and 1000) next to the script every time I run it, which generates... (3 Replies)
Discussion started by: kkpand
3 Replies
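For reference, a crontab entry that runs such a script hourly with a fixed unique_id might look like the line below; the paths and the argument value 42 are placeholders:
Code:
# min hour dom mon dow  command
0    *    *   *   *     /path/to/input.sh 42 >> /path/to/logs/input.log 2>&1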