Full Discussion: awk - split file
Post 302364464 by Scott on Friday, 23 October 2009, 06:47 AM
Hi.

In awk:

Code:
awk -v LINES=8 'NR % LINES == 1 { FILE = FILENAME "_" ++C } { print > FILE }'  pippo.txt

I was trying to do it using csplit, but that's a complete nightmare!
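For reference, the same one-liner spelled out with comments (just a sketch; it assumes POSIX awk and reuses the sample file name pippo.txt from above):

Code:
awk -v LINES=8 '
    NR % LINES == 1 { FILE = FILENAME "_" ++C }   # every LINES input lines, start a new output file
    { print > FILE }                              # write the current line to the current chunk
' pippo.txt
# produces pippo.txt_1, pippo.txt_2, ... with 8 lines each (the last chunk may be shorter)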
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Split file using awk

I am trying to read a file and split it into multiple files. I need to create new files with different sets of lines from the original file; i.e., the first output file may contain 10 lines, the second 100 lines, and so on. The criterion is to get the lines between two lines starting with some... (8 Replies)
Discussion started by: pvar
8 Replies

2. UNIX for Dummies Questions & Answers

Split a file with no pattern -- Split, Csplit, Awk

I have gone through all the threads in the forum and tested out different things. I am trying to split a 3GB file into multiple files. Some files are even larger than this. For example: split -l 3000000 filename.txt This is very slow and it splits the file with 3 million records in each... (10 Replies)
Discussion started by: madhunk
10 Replies

3. Shell Programming and Scripting

split file with awk

I did a lot of searching on this forum about splitting files; I found a lot, but my requirement is a bit different, please guide. Master file: x:start:5 line1:23 line2:12 2:90 x:end:5 x:start:2 45:56 22:90 x:end:2 x:start:3 line1:23 line2:12 x:end:3 x:start:2 line5:23 (1 Reply)
Discussion started by: uwork72
1 Replies
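For a block split like the one above, a hedged sketch (it assumes each block is delimited by x:start:N and x:end:N lines, and it invents the output names block_1, block_2, ...):

Code:
awk -F: '
    $1 == "x" && $2 == "start" { out = "block_" ++n }    # open a new output name at each start marker
    out != "" { print > out }                            # copy lines (markers included) into the current block
    $1 == "x" && $2 == "end"   { close(out); out = "" }  # stop writing at the end marker
' masterfile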

4. Shell Programming and Scripting

split a file using awk

Hi, I just need to split a file, with the output files redirected to gzip files. Input file: A.gz. The content of A.gz is 100|sfdds|dffdds|200112|sdfdf 100|sfdds|dffdds|200112|sdfdf 100|sfdds|dffdds|200112|sdfdf 100|sfdds|dffdds|200212|sdfdf 100|sfdds|dffdds|200212|sdfdf... (3 Replies)
Discussion started by: mohan_xunil
3 Replies
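One possible sketch for that kind of split (assumptions: the 4th |-delimited field, e.g. 200112, selects the output file, and zcat/gzip are available):

Code:
zcat A.gz | awk -F'|' '{ print | ("gzip > " $4 ".gz") }'
# each distinct field-4 value gets its own gzip pipe, e.g. 200112.gz and 200212.gz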

5. Shell Programming and Scripting

How to split a file using AWK?

Hello, I have a file like the following: david,a,b,c,20,r thomas,a,b,c,30,r willaiam,a,b,c,80,r barbara,a,b,c,100,r I would like to split the file into other files using a condition on the contents of column 5. The condition should be: if the contents of column 5 are in a range... (4 Replies)
Discussion started by: keenboy100
4 Replies
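A minimal sketch for a column-5 range split like that (the ranges 0-50 and 51-up and the output names are invented; the input is comma-separated as in the sample):

Code:
awk -F, '$5 <= 50 { print > "range_0_50.txt" }
         $5 >  50 { print > "range_51_up.txt" }' input.txt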

6. Shell Programming and Scripting

AWK File Split

Hi All, Input.txt XYZONEABC CZXTWOJJJ KKKSIXOOO asdfhajlsdhfajs asdfasfasdf Output Files: ONE.txt XYZONEABC TWO.txt CZXTWOJJJ SIX.txt KKKSIXOOO I had a script (2 Replies)
Discussion started by: kmsekhar
2 Replies
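A sketch for a keyword-driven split like that one (it assumes the keywords ONE, TWO and SIX are known in advance and appear somewhere in each matching line):

Code:
awk 'BEGIN { n = split("ONE TWO SIX", key) }
     { for (i = 1; i <= n; i++)                               # check each known keyword
           if (index($0, key[i])) { print > (key[i] ".txt"); break } }' Input.txt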

7. Shell Programming and Scripting

Split a file with awk

Hi! I have a file like this: a,b,c,12,d,e a,b,c,13,d,e a,b,c,14,d,e a,b,c,15,d,e a,b,c,16,d,e a,b,c,17,d,e I need to split that file in two: If field 4 is equal or higher than 14 that row goes to one file and if it is equal or higher than 15 to another. Can anyone point me in the... (2 Replies)
Discussion started by: Tr0cken
2 Replies
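A sketch for the overlapping-threshold split described above (comma-separated input; the output names ge14.txt and ge15.txt are invented):

Code:
awk -F, '$4 >= 14 { print > "ge14.txt" }               # rows with field 4 >= 14
         $4 >= 15 { print > "ge15.txt" }' input.txt    # rows with field 4 >= 15 also go here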

8. Shell Programming and Scripting

Split File by Pattern with File Names in Source File... Awk?

Hi all, I'm pretty new to shell scripting and I need some help to split a source text file into multiple files. The source has a pattern row where the file needs to be split, and that pattern row also contains the destination file name for that specific piece. Here is an example: ... (2 Replies)
Discussion started by: cul8er
2 Replies
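The example rows are truncated here, but assuming a hypothetical marker line of the form FILE: name.txt that names the destination of the lines that follow, a sketch could look like:

Code:
awk '/^FILE: / { out = $2; next }      # hypothetical marker row naming the next destination file
     out != "" { print > out }         # everything after a marker goes into that file
' source.txt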

9. Shell Programming and Scripting

awk file split

Hi all, First of all I'd like to mention that I'm pretty new to unix scripting. :( I'm trying to split a large XML file with awk and name the output files based on the values of two attributes. Example XML <RECORD> <element1>11</element1> <element2>22</element2> <element3>33</element3>... (18 Replies)
Discussion started by: f0usk4s
18 Replies
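A rough sketch for the XML case above (it assumes each record sits between <RECORD> and </RECORD> lines and that the output name is built from element1 and element2; a real XML-aware tool is usually safer):

Code:
awk '
    /<RECORD>/   { rec = "" }                                            # start buffering a new record
                 { rec = rec $0 "\n" }                                   # accumulate the current line
    /<element1>/ { split($0, t, /<\/?element1>/); a = t[2] }             # grab the element1 value
    /<element2>/ { split($0, t, /<\/?element2>/); b = t[2] }             # grab the element2 value
    /<\/RECORD>/ { f = a "_" b ".xml"; printf "%s", rec > f; close(f) }  # write e.g. 11_22.xml
' big.xml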

10. UNIX for Beginners Questions & Answers

Split file using awk

I need to split the incoming source file into multiple files using awk. The split position is (6,13): 8 positions. All the records that are greater than 20170101 and less than or equal to 20181231 should go in a split file with file name as source... (11 Replies)
Discussion started by: rosebud123
11 Replies
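A sketch for a fixed-position date split like the one above (substr($0, 6, 8) covers positions 6-13; the output file name is invented since the original was truncated):

Code:
awk '{ d = substr($0, 6, 8) + 0 }                                     # numeric date at positions 6-13
     d > 20170101 && d <= 20181231 { print > "split_2017_2018.txt" }' sourcefile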