nawk -- separation of records on basis of number of fields
Posted by centurion_13 on Tuesday, 28 December 2010, 05:28 AM
Thanks, berei!

Could you explain each parameter of your awk command?
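
The command itself is not quoted in this post, but a minimal sketch of the kind of nawk one-liner the thread describes (splitting records by field count) might look like the following. The field separator (|), the expected field count (14), and the file names (input.dat, good.dat, bad.dat) are illustrative assumptions, not values taken from the original command:

    nawk -F'|' '
        # NF is the built-in variable holding the number of fields in the current record
        NF == 14 { print > "good.dat"; next }   # expected field count: write the record to good.dat and stop here
        { print > "bad.dat" }                   # any other field count: write the record to bad.dat
    ' input.dat

Reading it parameter by parameter: -F'|' sets | as the input field separator, NF == 14 is the pattern that selects well-formed records, print > "file" redirects the current record to the named file, next skips the remaining rules for that record, and input.dat is the file being read.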
 
