Shell Programming and Scripting: how to delete duplicate rows based on last column
Post 302347766 by reva on Wednesday, 26 August 2009, 11:06 AM
Yes, I have corrected my output; please check it again now.

Last edited by reva; 09-01-2009 at 12:35 AM.
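Since the thread title asks how to delete duplicate rows based on the last column, here is a minimal sketch of one common approach. It assumes whitespace-separated fields, and the file names are illustrative, not taken from the thread:

    # keep the first row seen for each value of the last field ($NF);
    # later rows that repeat an already-seen last column are dropped
    awk '!seen[$NF]++' input.txt > deduped.txt

The test !seen[$NF]++ is true only the first time a given last-column value appears, so the original row order is preserved and only later duplicates are removed.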
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

how to delete duplicate rows in a file

I have a file with content like the below. "0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","","" "0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","","" "0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","",""... (5 Replies)
Discussion started by: vamshikrishnab
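Because those records repeat verbatim, deduplication can key on the whole line; a minimal sketch, with an illustrative file name:

    # order-preserving: keep the first copy of every distinct line
    awk '!seen[$0]++' records.csv

sort -u records.csv would also collapse the duplicates, but it reorders the file.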

2. UNIX for Dummies Questions & Answers

Remove duplicate rows of a file based on a value of a column

Hi, I am processing a file and would like to delete duplicate records as indicated by one of its columns, e.g. COL1 COL2 COL3 A 1234 1234 B 3k32 2322 C Xk32 TTT A NEW XX22 B 3k32 ... (7 Replies)
Discussion started by: risk_sly
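For this column-keyed variant, a short sketch assuming whitespace-separated columns and that COL1 is the key; file names are placeholders:

    # keep the first record for each distinct value in column 1
    awk '!seen[$1]++' input > output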

3. UNIX for Dummies Questions & Answers

forming duplicate rows based on value of a key

If the key (A or B or others) has 4 in its 3rd column, the 1st A row has to form 4 duplicates along with all the values of A in the 4th column (2.9, 3.8, 4.2). I hope I have explained the question clearly. Cheers, Ruby. Input: "A" 1 4 2.9 "A" 2 5 ... (7 Replies)
Discussion started by: ruby_sgp
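That snippet is truncated, so the exact pairing between the count and the column-4 values is unclear; as a hedged illustration of just the replication step, each record can be printed as many times as its 3rd column says:

    # print every record N times, where N is the count in column 3
    awk '{ for (i = 0; i < $3; i++) print }' input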

4. Ubuntu

delete duplicate rows with awk files

Hi everybody, I have some text files with a lot of duplicate rows like this: 165.179.568.197 154.893.836.174 242.473.396.153 165.179.568.197 165.179.568.197 165.179.568.197 154.893.836.174 How can I delete the repeated rows? Thanks, Saeideh (2 Replies)
Discussion started by: sashtari
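For a plain list like that, either of these one-liners removes the repeats (the file name is illustrative):

    sort addresses.txt | uniq        # collapses duplicates, but sorts the file
    awk '!seen[$0]++' addresses.txt  # keeps the original line order instead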

5. UNIX for Dummies Questions & Answers

Remove duplicate rows when >10 based on single column value

Hello, I'm trying to delete duplicates when there are more than 10 of them, based on the value of the first column. e.g. a 1 a 2 a 3 b 1 c 1 gives b 1 c 1 but requires 11 duplicates before it deletes. Thanks for the help. (11 Replies)
Discussion started by: informaticist
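A two-pass awk sketch for the threshold question, under one reading of the truncated example: a key's rows are kept only if the key occurs at most 10 times in the whole file:

    # pass 1 counts each column-1 key; pass 2 prints only rows whose key
    # appeared 10 times or fewer in total (note the file is named twice)
    awk 'NR == FNR { count[$1]++; next } count[$1] <= 10' input input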

6. Shell Programming and Scripting

Delete duplicate rows

Hi, this is a follow-up to my earlier post: him mno klm 20 76 . + . klm_mango unix_00000001; alp fdc klm 123 456 . + . klm_mango unix_0000103; her tkr klm 415 439 . + . klm_mango unix_00001043; abc tvr klm 20 76 . + . klm_mango unix_00000001; abc def klm 83 84 . + . klm_mango... (5 Replies)
Discussion started by: jacobs.smith
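In those records the trailing unix_... token looks like the ID; assuming it is the 10th (last) whitespace-separated field, one record per ID could be kept with:

    # sort on the ID field and emit a single record per distinct ID
    sort -k10,10 -u input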

7. UNIX for Dummies Questions & Answers

merging rows into new file based on rows and first column

I have 2 files: file01 = 7 columns, rows unknown (but few); file02 = 7 columns, rows unknown (but many). Now I want to create an output keyed on the first field, which is shared in both of them, then subtract the rest of the fields and print the results, e.g. file01 James|0|50|25|10|50|30... (1 Reply)
Discussion started by: A-V
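A sketch of the two-file subtraction, assuming |-separated fields, that the shared name is field 1, and that file02's values are subtracted from file01's (the truncated post leaves the direction open):

    # load file01 into memory, then for each matching name in file02
    # print the name plus the field-wise differences of columns 2..7
    awk -F'|' -v OFS='|' '
        NR == FNR { for (i = 2; i <= NF; i++) ref[$1, i] = $i; next }
        ($1, 2) in ref { for (i = 2; i <= NF; i++) $i = ref[$1, i] - $i; print }
    ' file01 file02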

8. Shell Programming and Scripting

awk to sum a column based on duplicate strings in another column and show split totals

Hi, I have an input format like this: A_1 2 B_0 4 A_1 1 B_2 5 A_4 1 and I am looking to print it in this output format, with headers. Can you suggest awk? I ask about awk because I am already doing some pattern matching from a parent file to print column 1 of my input using awk. Thanks! letter number_of_letters... (5 Replies)
Discussion started by: prashob123
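For the per-key summing, a minimal sketch that totals column 2 for each distinct string in column 1 and prints a header; the header wording is guessed from the truncated output format:

    # accumulate column 2 per column-1 key, then dump key/total pairs
    awk 'BEGIN { print "letter", "total" }
         { sum[$1] += $2 }
         END { for (k in sum) print k, sum[k] }' input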

9. Shell Programming and Scripting

Remove duplicate rows based on one column

Dear members, I need to filter a file based on the 8th column (that is the id); the other columns do not matter, because I want just one line per id and to remove the duplicate lines based on this id (8th column). It does not matter which duplicate is removed. Example of my file... (3 Replies)
Discussion started by: clarissab
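Since any duplicate may be discarded, the first-seen idiom works here too, keyed on field 8; whitespace-separated fields and the file names are assumptions:

    # keep the first line for each id in column 8, drop later repeats
    awk '!seen[$8]++' input > output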

10. Shell Programming and Scripting

Extract and exclude rows based on duplicate values

Hello, I have a file like this: > cat examplefile ghi|NN603762|eee mno|NN607265|ttt pqr|NN613879|yyy stu|NN615002|uuu jkl|NN607265|rrr vwx|NN615002|iii yzA|NN618555|ooo def|NN190486|www BCD|NN628717|ppp abc|NN190486|qqq EFG|NN628717|aaa HIJ|NN628717|sss > I can sort the file by... (5 Replies)
Discussion started by: CHoggarth
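A two-pass sketch for splitting that |-separated file on its middle (ID) column; the file name comes from the sample, the reading of "extract and exclude" is an assumption:

    # pass 1 counts each ID (field 2); pass 2 prints rows whose ID repeats
    awk -F'|' 'NR == FNR { c[$2]++; next } c[$2] > 1' examplefile examplefile
    # flip the test to c[$2] == 1 to print the rows with unique IDs instead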
jfs_fsck(8)                     JFS utility - file system check                     jfs_fsck(8)

NAME
       jfs_fsck - initiate replay of the JFS transaction log, and check and repair a JFS formatted device

SYNOPSIS
       jfs_fsck [ -afnpvV ] [ -j journal_device ] [ --omit_journal_replay ] [ --replay_journal_only ] device

DESCRIPTION
       jfs_fsck is used to replay the JFS transaction log, check a JFS formatted device for errors, and fix any
       errors found.  device is the special file name corresponding to the actual device to be checked (e.g.
       /dev/hdb1).  jfs_fsck must be run as root.

WARNING
       jfs_fsck should only be used to check an unmounted file system or a file system that is mounted READ ONLY.
       Using jfs_fsck to check a file system mounted other than READ ONLY could seriously damage the file system!

OPTIONS
       If no options are selected, the default is -p.

       -a     Autocheck mode - Replay the transaction log.  Do not continue fsck processing unless the aggregate
              state is dirty or the log replay failed.  Functionally equivalent to -p.  Autocheck mode is
              typically the default mode used when jfs_fsck is called at boot time.

       -f     Replay the transaction log and force checking even if the file system appears clean.  Repair all
              problems automatically.

       -j journal_device
              Specify the journal device.

       -n     Open the file system read only.  Do not replay the transaction log.  Report errors, but do not
              repair them.

       --omit_journal_replay
              Omit the replay of the transaction log.  This option should not be used except as a last resort
              (i.e. the log has been severely corrupted and replaying it causes further problems).

       -p     Automatically repair ("preen") the file system.  Replay the transaction log.  Do not continue fsck
              processing unless the aggregate state is dirty or the log replay failed.  Functionally equivalent
              to -a.

       --replay_journal_only
              Only replay the transaction log.  Do not continue with a full file system check if the replay
              fails or if the file system is still dirty even after a journal replay.  In general, this option
              should only be used for debugging purposes as it could leave the file system in an unmountable
              state.  This option cannot be used with -f, -n, or --omit_journal_replay.

       -v     Verbose messaging - print details and debug statements to stdout.

       -V     Print version information and exit (regardless of any other chosen options).

EXAMPLES
       Check the 3rd partition on the 2nd hard disk, print extended information to stdout, replay the
       transaction log, force complete jfs_fsck checking, and give permission to repair all errors:

              jfs_fsck -v -f /dev/hdb3

       Check the 5th partition on the 1st hard disk, and report, but do not repair, any errors:

              jfs_fsck -n /dev/hda5

EXIT CODE
       The exit code returned by jfs_fsck represents one of the following conditions:

       0      No errors
       1      File system errors corrected and/or transaction log replayed successfully
       2      File system errors corrected, system should be rebooted if file system was mounted
       4      File system errors left uncorrected
       8      Operational error
       16     Usage or syntax error
       128    Shared library error

REPORTING BUGS
       If you find a bug in JFS or jfs_fsck, please report it via the bug tracking system ("Report Bugs"
       section) of the JFS project web site: http://jfs.sourceforge.net/  Please send as much pertinent
       information as possible, including the complete output of running jfs_fsck with the -v option on the
       JFS device.

SEE ALSO
       fsck(8), jfs_mkfs(8), jfs_fscklog(8), jfs_tune(8), jfs_logdump(8), jfs_debugfs(8)

AUTHORS
       Barry Arndt (barndt@us.ibm.com)
       William Braswell, Jr.

       jfs_fsck is maintained by IBM.  See the JFS project web site for more details: http://jfs.sourceforge.net/

October 29, 2002                                                                        jfs_fsck(8)
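A small, hypothetical wrapper showing how a script might act on the exit codes documented above; the device path is an example only:

    # run a read-only check (no repairs), then branch on the exit code
    jfs_fsck -n /dev/hdb1
    status=$?
    if [ "$status" -ge 4 ]; then
        # 4 and above mean uncorrected errors or an operational/usage/library failure
        echo "jfs_fsck failed with exit code $status" >&2
    fi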