UNIX for Dummies Questions & Answers: split a single sql file into multiple files - Post 302236446 by rubin, Monday 15th of September 2008, 02:06:25 PM
I see... that's not an awk issue, but an OS one. I doubt, though, that the limit on files in one directory is only 10; it has to be way more than that.
I tested the code on Solaris, and it bombed out only after ~55,000 files had been created in one directory. They were all in good shape, and my test file had ~1,500,000 lines.

Well, in that case I'd suggest splitting the file into smaller chunks (see the split man page on your OS) at a size the code can handle, creating as many directories as there are chunks, moving each chunk into its own directory, and running the given code separately in each of those directories.

BTW if you're on Solaris use nawk.
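
To make that concrete, here is a rough sketch of the chunk-per-directory idea. The chunk size, the input file name, and split_statements.awk are all placeholders, since the actual code discussed earlier in the thread isn't quoted in this post:

Code:
# split the big file into 50000-line chunks (pick a size the code handles comfortably)
split -l 50000 bigfile.sql chunk_

i=0
for f in chunk_*
do
    i=`expr $i + 1`
    mkdir "dir$i"
    mv "$f" "dir$i/"
    # run the same awk/nawk code separately inside each directory
    ( cd "dir$i" && nawk -f ../split_statements.awk "$f" )
done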
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Executing Multiple .SQL Files from Single Shell Script file

Hi, please help me out. I have around 700 SQL files to execute in a defined order; how can I do it from a shell script? (3 Replies)
Discussion started by: anushilrai
3 Replies
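
For a request like this one, a minimal sketch is a loop over an ordered list of scripts; order.txt and the connect string user/pass@sid are placeholders:

Code:
# order.txt lists the .sql files, one per line, in the required order
while read f
do
    echo "Running $f ..."
    sqlplus -s user/pass@sid "@$f"
done < order.txt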

2. Shell Programming and Scripting

Split a file into multiple files

I have a file which has multiple create statements, such as: create abc 123 one two create xyz 456 four five create nnn 666 six four I want to separate each create statement into a separate file (3 Replies)
Discussion started by: glamo_2312
3 Replies
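
A hedged awk sketch for this kind of split: start a new output file on every line that begins with "create" (the input and output names are made up):

Code:
awk '/^create/ { if (out) close(out); n++; out = "create_" n ".txt" }
     out       { print > out }' input.txt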

3. Programming

Single sql query to spool to multiple files

Is there any way to spool my select statement into spool files of max 10000 records each? E.g. I have a select statement that will return 45000 records. A normal spool command will output the 45000 into just one spool file. How can I make sqlplus do this? 00001 - 10000 records --- spool... (3 Replies)
Discussion started by: Leion
3 Replies
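
As far as I know, SPOOL writes to a single file at a time, so one workaround from the shell side (rather than from inside sqlplus) is to spool everything and cut the result up afterwards; the file names are placeholders:

Code:
# after spooling all 45000 rows to all_rows.lst
split -l 10000 all_rows.lst chunk_     # chunk_aa, chunk_ab, ... with 10000 lines each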

4. Shell Programming and Scripting

Split single file into multiple files based on the number in the column

Dear All, I would like to split a file of the following format into multiple files based on the number in the 6th column (numbers 1, 2, 3...): ATOM 1 N GLY A 1 -3.198 27.537 -5.958 1.00 0.00 N ATOM 2 CA GLY A 1 -2.199 28.399 -6.617 1.00 0.00 ... (3 Replies)
Discussion started by: tomasl
3 Replies
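
For PDB-style records like these, a minimal awk sketch keyed on the 6th whitespace-separated field (output names are made up). Closing each file after the write keeps the number of open files low, which matters if that column takes very many distinct values:

Code:
awk '{ out = "part_" $6 ".pdb"; print >> out; close(out) }' input.pdb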

5. Shell Programming and Scripting

Split the single file lines into multiple files

Let's assume that I have a file named 'A' with 100 lines in it, and I would like to split these 100 lines into 4 files as specified below. INPUT: Input file name A 1 2 3 4 5 6 7 8 9 ........100 Output: 4 output files (x,y,z,w) File x should contain (Skip 4 lines)... (15 Replies)
Discussion started by: subbarao25
15 Replies
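
Reading "skip 4 lines" as a round-robin distribution (an assumption on my part), a small awk sketch with the four output names hard-coded:

Code:
awk 'BEGIN { split("x y z w", name, " ") }
     { print > name[(NR - 1) % 4 + 1] }' A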

6. Shell Programming and Scripting

Execute multiple SQL scripts from single SQL Plus connection

Hi! I would like to make a single connection to sqlplus and execute several queries. Currently I open one connection to the database for every query, i.e. echo 'select STATUS from v$instance; exit' > $SQL_FILE sqlplus user/pass@sid @$SQL_FILE > $SELECT_RESULT echo 'select VERSION from v$instance;... (6 Replies)
Discussion started by: guif
6 Replies
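
One common way to run several queries over a single connection is a here-document; user/pass@sid and the output file are placeholders:

Code:
# the quoted EOF keeps the shell from expanding $instance
sqlplus -s user/pass@sid <<'EOF' > results.txt
select STATUS  from v$instance;
select VERSION from v$instance;
exit
EOF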

7. UNIX for Dummies Questions & Answers

Split single file into n number of files

Hi, I am new to UNIX. We have a requirement here to split a single file into multiple files based on the number of people available for processing. So I tried my hand at writing some code as below. #!/bin/bash var1=`wc -l $filename` var2=$var1/$splitno split -l $var2 $1 Please help me... (6 Replies)
Discussion started by: quirkguy
6 Replies
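
A corrected sketch of what that script seems to be aiming for; taking the input file and the number of people as the first and second arguments is an assumption:

Code:
#!/bin/sh
filename=$1
splitno=$2
lines=`wc -l < "$filename"`                              # line count only, no file name attached
per_file=`expr \( $lines + $splitno - 1 \) / $splitno`   # integer division, rounded up
split -l "$per_file" "$filename" part_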

8. Shell Programming and Scripting

Split single file into multiple files using pattern matching

I have one single file shown below and I need to break each ST|850 & SE block into a separate file using a unix script. The example below should create 3 files. We can use ST & SE to filter, as these field names will remain the same. Please advise with the unix code. ST|850 BEG|PO|1234 LIN|1|23 SE|4 ST|850... (3 Replies)
Discussion started by: prasadm
3 Replies
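
A minimal awk sketch that starts a new output file at every ST|850 line; the input and output names are made up:

Code:
awk '/^ST\|850/ { if (out) close(out); n++; out = "msg_" n ".txt" }
     out        { print > out }' input.txt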

9. UNIX for Dummies Questions & Answers

Split files into smaller ones with 1000 hierarchies in a single file.

input file: AD,00,--,---,---,---,---,---,---,--,--,--- AM,000,---,---,---,---,---,--- AR, ,---,--,---,--- AA,---,---,---,--- AT,--- AU,---,---,--- AS,---,--- AP,---,---,--- AI,--- AD,00,---,---,---, ,---,---,---,---,---,--- AM,000,---,---,--- AR,... (6 Replies)
Discussion started by: kcdg859
6 Replies

10. Shell Programming and Scripting

Split a single file into multiple files based on a value.

Hi All, I have the sales_data.csv file in the directory as below. SDDCCR; SOM ; MD6546474777 ;05-JAN-16 ABC ; KIRAN ; CB789 ;04-JAN-16 ABC ; RAMANA; KS566767477747 ;06-JAN-16 ABC ; KAMESH; A33535335 ;04-JAN-16 SDDCCR; DINESH; GD6674474747 ;08-JAN-16... (4 Replies)
Discussion started by: ROCK_PLSQL
4 Replies
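
A sketch keyed on the first semicolon-separated field, with blanks trimmed so that "SDDCCR " and "SDDCCR" end up in the same file; the output naming is made up:

Code:
awk -F';' '{ key = $1; gsub(/[ \t]/, "", key)
             out = key ".csv"; print >> out; close(out) }' sales_data.csv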