Breaking large file into small files
Post 302937530 by emily on Friday 6th of March 2015 03:13:37 AM
Thanks all for the useful input, it works fine.

Greetings,

---------- Post updated at 03:13 AM ---------- Previous update was at 02:45 AM ----------

Hello,
I am not able to pass the external parameter ($3) through to awk when generating the desired output files. The problem is in this line:
Code:
awk '{FILENAME="$3_"int((NR-1)/200)".txt";print >> FILENAME}' $3
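The likely cause: inside single quotes the shell never expands $3, so awk receives the literal string "$3_" and writes every chunk to files literally named $3_0.txt, $3_1.txt, and so on. FILENAME is also an awk built-in holding the current input file's name, so assigning to it is best avoided. A minimal sketch of a fix, passing the shell value in with -v:

Code:
# pass the shell argument into awk as the variable 'prefix'
awk -v prefix="$3" '{ out = prefix "_" int((NR-1)/200) ".txt"; print >> out }' "$3"

For reference, the full script: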


Code:
#!/bin/bash
# Usage: ./copyTextFromCastor.sh <path> <grep-pattern> <output-file>

PATHNAME=$1
CONSTANT=rfio:   # note: defined but not used below
GREP=$2
OUTPUT=$3

echo "Copying fileName \"$1 | grep $2\" to $3"
# List the storage directory, keep the matching lines, and strip everything
# up to and including 'tier2' from each path (string/path are empty here but
# could carry a prefix such as the rfio: constant above).
srmls "$PATHNAME" --count 99999 --offset 2 | grep "$2" | awk -F'tier2' '{print string path $2}' string="" path="" > "$3"

echo "progressing ... please be patient..."

## split $3 into small files of 200 lines each
awk '{FILENAME="$3_"int((NR-1)/200)".txt";print >> FILENAME}' $3
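As an aside, GNU split can do the same 200-line chunking without awk; a sketch assuming GNU coreutils (the -d and --additional-suffix options are GNU-specific):

Code:
# split the output list into numbered 200-line chunks: <output>_00.txt, <output>_01.txt, ...
split -l 200 -d --additional-suffix=.txt "$3" "${3}_"

This writes ${3}_00.txt, ${3}_01.txt, and so on, 200 lines per file.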

 
