bash mysql export to file
Posted by unclecameron on 05-25-2010

I'm trying to export a MySQL query result to a .csv file. Right now I'm running a query that works, like this:
Code:
us_id=`mysql -u "$USER_NAME" --password="$PASSWORD" -D "databasename" \
        -e "SELECT * \
        FROM databasename.table \
        WHERE somefield > 0 AND otherfield = '$ctry' \
        ORDER BY users DESC \
        LIMIT 0,100"`

but I want to export the result to a file instead, which I've tried like this:
Code:
us_id=`mysql -u "$USER_NAME" --password="$PASSWORD" -D "databasename" \
        -e "SELECT * \
        FROM databasename.table \
        WHERE somefield > 0 AND otherfield = '$ctry' \
        ORDER BY users DESC \
        LIMIT 0,100 INTO OUTFILE 'somefile.csv'"`

but then I get a db access denied error. I've also tried piping the output to sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g', which didn't work either. If I could get INTO OUTFILE to work I'd hopefully add this to format the fields:
Code:
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\' LINES TERMINATED BY '\n'

but I can't get the INTO OUTFILE clause to work. I could also use mysqldump, I guess, but I don't know how to combine the query with a mysqldump. My output file exists and I chmod'ed it to 777 (for testing purposes).
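
For reference, the server-side export clause is SELECT ... INTO OUTFILE; it needs the FILE privilege on the MySQL account, the file is written on the MySQL server host, and the file must not already exist, so a missing FILE grant (or, on newer servers, the secure_file_priv setting) is the usual cause of that access-denied error. A minimal sketch, assuming the same placeholder names as above, an account that has been granted FILE, and a hypothetical server-side path /tmp/somefile.csv:
Code:
mysql -u "$USER_NAME" --password="$PASSWORD" -D "databasename" \
        -e "SELECT * \
        FROM databasename.table \
        WHERE somefield > 0 AND otherfield = '$ctry' \
        ORDER BY users DESC \
        LIMIT 0,100 \
        INTO OUTFILE '/tmp/somefile.csv' \
        FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\\\\' \
        LINES TERMINATED BY '\n'"

(The doubled-up backslashes in ESCAPED BY are just shell quoting; mysql receives ESCAPED BY '\\'.)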

---------- Post updated at 12:27 PM ---------- Previous update was at 10:10 AM ----------

Got it. Well, I really never got INTO OUTFILE working, but this did:
Code:
echo "SELECT * \
        FROM somedb.sometable \
        WHERE field1 >0 AND field2 ='whatever' \
        ORDER BY field3 \
        DESC LIMIT 0,100 ;" > tmp
mysql -sN -u $USER_NAME --password=$PASSWORD -D "somedb" < tmp > temp_to_turn_into_csv_later.csv
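
For what it's worth, the temporary SQL file isn't strictly needed; passing the same query with -e should do it in one step (same placeholder names as above, just a sketch):
Code:
mysql -sN -u "$USER_NAME" --password="$PASSWORD" -D "somedb" \
        -e "SELECT * FROM somedb.sometable \
        WHERE field1 > 0 AND field2 = 'whatever' \
        ORDER BY field3 DESC LIMIT 0,100;" > temp_to_turn_into_csv_later.csv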

Now I have to parse the tab-separated temp file into a real csv using awk (see the sketch below); there's probably an easier way, though.
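
A rough sketch of that awk step, assuming the -sN output above is tab-separated, that embedded double quotes should be doubled (standard CSV quoting), and a hypothetical output name final.csv:
Code:
awk 'BEGIN { FS = "\t"; OFS = "," }
{
    for (i = 1; i <= NF; i++) {
        gsub(/"/, "\"\"", $i)    # double any embedded quotes
        $i = "\"" $i "\""        # wrap each field in double quotes
    }
    print                        # rebuilt record is comma-separated
}' temp_to_turn_into_csv_later.csv > final.csv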
 
