Shell Programming and Scripting: Split a large file in n records and skip a particular record
Post 302877287 by Akshay Hegde on Friday 29th of November 2013, 12:29:43 PM
Quote:
Originally Posted by Corona688
For starters you need to set the original value of 'f' so you won't be printing into a blank filename.

% can cause some precedence problems in C, I know; I would bracket your expressions more carefully.
But NR == 1 || ........ will set the filename, right? I still don't see what might be wrong.
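A minimal sketch of what Corona688 is suggesting (the chunk size n, the record number to skip, and the chunk_%04d.txt naming are assumptions for illustration, not code from this thread): set the output filename f on the very first record, bracket the % test explicitly, and drop the unwanted record with next.

Code:
awk -v n=1000 -v skip=5 '
NR == 1 || ((NR - 1) % n) == 0 {          # first record of a new chunk
    if (f != "") close(f)                 # close the previous chunk, if any
    f = sprintf("chunk_%04d.txt", ++c)    # set f before anything is printed
}
NR == skip { next }                       # skip the one unwanted record
{ print > f }                             # write the record to the current chunk
' largefile.txt

Note the explicit parentheses around (NR - 1) % n; that is the bracketing Corona688 is recommending.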
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Split A Large File

Hi, I have a large file (CSV format) that I need to split into 2 files. The file looks something like Original_file.txt: first name, family name, address; a, b, c; d, e, f; and so on for over 100,00 lines. I need to create two files from this one file. The condition is I need to ensure... (4 Replies)
Discussion started by: nbvcxzdz
4 Replies
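The condition in this thread is truncated above, so the test below is only a placeholder; the general shape of a two-way split in awk, assuming one record per line, would be:

Code:
awk -F',' '
$1 == "a" { print > "file_A.csv"; next }   # hypothetical condition on the first field
          { print > "file_B.csv" }         # everything else goes to the second file
' Original_file.txt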

2. Shell Programming and Scripting

Split Large File

Hi, I have to split a large file whose input looks like this: Input file name_file.txt 00001|AAAA|MAIL|DATEOFBIRTHT|....... 00001|AAAA|MAIL|DATEOFBIRTHT|....... 00002|BBBB|MAIL|DATEOFBIRTHT|....... 00002|BBBB|MAIL|DATEOFBIRTHT|....... 00003|CCCC|MAIL|DATEOFBIRTHT|.......... (1 Reply)
Discussion started by: AMARA
1 Replies
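Assuming the intent here is one output file per distinct key in the first pipe-delimited field, and that records with the same key are contiguous as in the sample, a sketch:

Code:
awk -F'|' '
$1 != prev {                     # new key in column 1: switch output files
    if (out != "") close(out)
    out = $1 ".txt"
    prev = $1
}
{ print > out }
' name_file.txt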

3. Shell Programming and Scripting

Split a large file

I have a 3 GB text file that I would like to split. How can I do this? It's a giant comma-separated list of numbers. I would like to make it into about 20 files of ~100 MB each, with a custom header and footer. The file can only be split on commas, but they're plentiful. Something like... (3 Replies)
Discussion started by: CRGreathouse
3 Replies
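GNU split can treat the comma as the record separator, and the header and footer can be wrapped on afterwards. A sketch, assuming GNU coreutils (for the -t option) and placeholder header/footer text and file names:

Code:
# split on commas into pieces of at most ~100 MB each (GNU split, -t needs coreutils >= 8.16)
split -t ',' -C 100m --numeric-suffixes=1 numbers.csv piece_

# wrap each piece with the custom header and footer (placeholders here)
for f in piece_*; do
    { echo "HEADER"; cat "$f"; echo "FOOTER"; } > "$f.final"
done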

4. Shell Programming and Scripting

Split a single record to multiple records & add folder name to each line

Hi Gurus, I need to cut a single record in the file (asdf) into multiple records based on the number of bytes (44 characters), so every record will have 44 characters. All the records should be in the same file. To each of these lines I need to add the folder (<date>) name. I have a dir. in which... (20 Replies)
Discussion started by: ram2581
20 Replies
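One possible sketch for this one (the folder name, file names, and the space separator are assumptions): fold the record into 44-character lines, then append the folder name to each line:

Code:
dir=20131129                                   # hypothetical <date> folder name
fold -w 44 "$dir/asdf" | awk -v d="$dir" '{ print $0, d }' > asdf.split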

5. Shell Programming and Scripting

How to delete 1 record in large file!

Hi All, I'm a newbie here. I'm just wondering how to delete a single record in a large file in Unix. Ex.: file1.txt is 1000 records: nikki1 nikki2 nikki3. What I want to do is delete the nikki2 record in file1.txt. Is it possible? Please advise. Thanks, (3 Replies)
Discussion started by: nikki1200
3 Replies
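Deleting one record from a large file is a single pass; a sketch, assuming each record is a whole line:

Code:
# keep every line except an exact match for nikki2
grep -v -x 'nikki2' file1.txt > file1.tmp && mv file1.tmp file1.txt

# or, with GNU sed, delete it in place
sed -i '/^nikki2$/d' file1.txt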

6. UNIX for Dummies Questions & Answers

Split single record to multiple records

Hi Friends, source: col1,col2,col3 a,b,1;2;3. Here the column delimiter is a comma (,). We don't know the max length of col3; now we have 1;2;3, next time I will receive 1;2;3;4;5; etc... Required output: col1,col2,col3 a,b,1 a,b,2 a,b,3. Please give me... (5 Replies)
Discussion started by: bab.galary
5 Replies
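A sketch that splits the semicolon-separated third column into one output row per value, so 1;2;3;4;5 would need no change (the input file name is a placeholder):

Code:
awk -F',' -v OFS=',' '
NR == 1 { print; next }            # pass the col1,col2,col3 header through
{
    n = split($3, v, ";")          # break the third column on semicolons
    for (i = 1; i <= n; i++)
        print $1, $2, v[i]         # one row per value
}' source.csv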

7. UNIX for Dummies Questions & Answers

Using awk to skip record in file

I need to amend the code below such that it reads a "blacklist" before the "print" statement; if "substr($1,1,6)" is found in the "blacklist" it will ignore that record and continue. The code is from an awk script that is being called from a shell script which passes the input values. BEGIN { "date... (5 Replies)
Discussion started by: bazel
5 Replies
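The usual awk idiom for this is to load the blacklist from a first file into an array and then test substr($1,1,6) against it while reading the data; a sketch with placeholder file names:

Code:
awk '
NR == FNR { blacklist[$1]; next }          # first file: remember each blacklisted key
!(substr($1, 1, 6) in blacklist)           # second file: print only non-blacklisted records
' blacklist.txt datafile.txt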

8. Shell Programming and Scripting

How to split one record to multiple records?

Hi, I have one tab-delimited file which has multiple store_ids in the first column, separated by pipes. I want to split the file on the basis of store_id (separating the 1st record into 2 records). I tried some options like the ones below using split, awk etc., but was not able to get proper output. Can... (1 Reply)
Discussion started by: jaggy
1 Replies
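A sketch that expands each pipe-separated store_id in column 1 into its own tab-delimited record (the file name is a placeholder):

Code:
awk -F'\t' -v OFS='\t' '
{
    n = split($1, ids, "[|]")      # store_ids are pipe-separated in the first column
    for (i = 1; i <= n; i++) {
        $1 = ids[i]                # rewrite column 1: one record per store_id
        print
    }
}' input.tsv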

9. UNIX for Advanced & Expert Users

How to split large file with different record delimiter?

Hi, I have received a file which is 20 GB. We would like to split the file into 4 equal parts and process it to avoid memory issues. If the record delimiter were a Unix newline, I could use the split command with either option -l or -b. The problem is that the line terminator is |##|. How to use... (5 Replies)
Discussion started by: Ravi.K
5 Replies
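GNU awk accepts a multi-character (regex) RS, so the |##| terminator can be used as the record separator. A sketch; the records-per-part count is a placeholder, since producing exactly 4 equal parts would first need the total record count:

Code:
gawk -v RS='[|]##[|]' -v ORS='|##|' -v per=5000000 '
(NR - 1) % per == 0 {                    # first record of a new part
    if (out != "") close(out)
    out = sprintf("part_%02d", ++c)
}
{ print > out }
' bigfile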

10. UNIX for Beginners Questions & Answers

Trying To Split a Large File

Trying to split a 35 GB file into 1000 MB parts. My research shows I should use this: split -b 1000m file.txt, and my return is "split: cannot open 'crunch1.txt' for reading: No such file or directory", so I tried split -b 1000m Documents/Wordlists/file.txt and I get nothing other than the cursor just... (3 Replies)
Discussion started by: sub terra
3 Replies
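The "No such file or directory" error just means split could not find the input file under the name it was given, and split prints nothing while it works, so on a 35 GB file it simply runs quietly for a while. A sketch, assuming the word list is the crunch1.txt named in the error message and lives under Documents/Wordlists:

Code:
# 1000 MB pieces named crunch1_part_aa, crunch1_part_ab, ... in the current directory
split -b 1000m ~/Documents/Wordlists/crunch1.txt crunch1_part_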