Help with merge and remove duplicates
Post 302896854 by vgersh99, 04-09-2014 01:59 PM
I don't see how the 'expected' output differs from the 'produced' output.
Could you provide a better (more representative) set of sample files to explain what you're after?
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates

Hello Experts, I have two files named old and new. Below are my example files. I need to compare and print the records that only exist in my new file. I tried the below awk script; it works perfectly well if the records match exactly. The issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
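
A minimal sketch for this kind of comparison, assuming whole-line matching and input files named old and new (the partial-match wrinkle the poster mentions would need a key-based variant):

    # print lines of "new" that do not appear anywhere in "old"
    awk 'NR==FNR { seen[$0]; next } !($0 in seen)' old new

If only the first few fields identify a record, the lookup key can be built from those fields (e.g., seen[$1,$2]) instead of the whole line.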

2. Shell Programming and Scripting

Merge Two Tables with duplicates in first table

Hi..
File 1:
1 aa rep
1 dd rep
1 kk rep
2 bb sad
2 ss sad
3 ee dam
File 2:
1 apple fruit
2 mango tree
3 lilly flower
Output:
1 apple fruit aa,dd,kk rep
(7 Replies)
Discussion started by: empyrean
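
One way to produce that output, assuming whitespace-separated columns and that the sample files above are named file1 and file2:

    # collect file1's column 2 per key (comma-joined) plus its column 3,
    # then append both to each matching row of file2
    awk 'NR==FNR { a[$1] = ($1 in a) ? a[$1] "," $2 : $2; t[$1] = $3; next }
         { print $0, a[$1], t[$1] }' file1 file2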

3. Shell Programming and Scripting

bash - remove duplicates

I need to use a bash script to remove duplicate files from a download list, but I cannot use uniq because the URLs are different. I need to go from this:
http://***/fae78fe/file1.wmv
http://***/39du7si/file1.wmv
http://***/d8el2hd/file2.wmv
http://***/h893js3/file2.wmv
to this:
... (2 Replies)
Discussion started by: locoroco
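
Since the file name is the last /-separated component of each URL, a sketch that keeps only the first URL seen per file name (downloads.txt is a placeholder name, and the file name is assumed to be the dedup key):

    # split on "/" and dedupe on the final component (the file name)
    awk -F/ '!seen[$NF]++' downloads.txt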

4. Shell Programming and Scripting

Find duplicates in column 1 and merge their lines (awk?)

Hi, I have a file (sorted by sort) with 8 tab-delimited columns. The first column contains duplicated fields and I need to merge all these identical lines. My input file:
comp100002 aaa bbb ccc ddd eee fff ggg
comp100003 aba aba aba aba aba aba aba
comp100003 fff fff fff fff fff fff fff... (5 Replies)
Discussion started by: falcox
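
A sketch under the assumption that "merge" means appending columns 2-8 of each duplicate line onto the first line for that key (input already sorted on column 1, tab-delimited):

    awk -F'\t' '
        $1 != prev { if (NR > 1) print line; line = $0; prev = $1; next }
        { sub(/^[^\t]*\t/, ""); line = line "\t" $0 }   # drop the repeated key, append the rest
        END { if (NR) print line }
    ' input.txt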

5. Shell Programming and Scripting

Merge files without duplicates

Hi all, in a directory of many files, I need to merge only files which do not have identical lines, and the resultant merged file should not be more than 50000 lines. Basically I need to combine all the text files in that directory and turn them into merged .txt files of 50000 lines each... (2 Replies)
Discussion started by: pravfraz
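
The request is a bit ambiguous; one reading, deduplicating lines across all the .txt files and then splitting the result into 50000-line merge files, might look like this (the intermediate and output names are placeholders):

    # merge every .txt file, keep the first occurrence of each line,
    # then cut the result into chunks of at most 50000 lines
    cat *.txt | awk '!seen[$0]++' > merged_all.tmp
    split -l 50000 merged_all.tmp merged_part_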

6. UNIX for Dummies Questions & Answers

Remove duplicates from a file

Can you tell me how to remove duplicate records from a file? (11 Replies)
Discussion started by: saga20
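
The two usual one-liners, depending on whether the original line order matters:

    awk '!seen[$0]++' file    # keeps the first copy of each line, preserves order
    sort -u file              # sorts the file and drops duplicates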

7. Shell Programming and Scripting

Remove duplicates

I have a file with the following format, fields separated by "|":
title1|something class|long...content1|keys
title2|somhing class|log...content1|kes
title1|sothing class|lon...content1|kes
title3|shing cls|log...content1|ks
I want to remove all duplicates with the same "title" field (the... (3 Replies)
Discussion started by: dtdt
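
Since the title is the first |-separated field, a sketch that keeps the first record seen for each title:

    awk -F'|' '!seen[$1]++' file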

8. Shell Programming and Scripting

Remove top 3 duplicates

hello, I have a requirement with input in the below format:
abc 123 xyz
bcd 365 kii
abc 987 876
cdf 987 uii
abc 456 yuu
bcd 654 rrr
Expected output:
abc 456 yuu
bcd 654 rrr
cdf 987 uii
(1 Reply)
Discussion started by: Tomlight
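
The expected output looks like the last occurrence of each column-1 key, sorted; a sketch along those lines:

    # remember the last line seen for each key in column 1, then sort the survivors
    awk '{ last[$1] = $0 } END { for (k in last) print last[k] }' file | sort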

9. Shell Programming and Scripting

Sort and Remove duplicates

Here is my task: I need to sort two input files and remove duplicates in the output files.
Sort by 13 characters from 97, ascending
Sort by 1 character from 96, ascending
If duplicates are found, retain the first value in the file. The input files are variable length, convert... (4 Replies)
Discussion started by: ysvsr1
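
Assuming fixed-width records with no spaces or tabs before column 110, sort can express both keys by character position ("13 characters from 97" is positions 97-109):

    # primary key: chars 97-109; secondary key: char 96;
    # -u keeps the first line for each distinct key combination
    sort -k1.97,1.109 -k1.96,1.96 -u input > output

With default field splitting, field 1 ends at the first blank, which is why the no-embedded-blanks assumption matters; otherwise the keys would have to be defined differently.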

10. Shell Programming and Scripting

Remove duplicates

Hi, I have the below file structure:
200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,,
200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,,
300,20140223,0.001,0.001,0.001,0.001,0.001
300,20140224,0.001,0.001,0.001,0.001,0.001
300,20140225,0.001,0.001,0.001,0.001,0.001
300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele
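
The question is truncated, so the dedup key is a guess; if the 200-records are headers to keep and the 300-records should be deduped on their date field (field 2), a sketch might be:

    # pass non-300 lines through untouched; for 300-detail lines,
    # keep only the first record seen for each date in field 2 (guessed key)
    awk -F',' '$1 != 300 { print; next } !seen[$2]++' file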
sccshelp(1)                     General Commands Manual                     sccshelp(1)

NAME
       sccshelp - ask for help on SCCS commands

SYNOPSIS
       sccshelp [arg]...

DESCRIPTION
       The sccshelp command finds information to explain a message from an SCCS command, or to explain the use of an SCCS command. Zero or more arguments can be supplied. If no arguments are given, sccshelp prompts for one.

       The arguments can be either message numbers (which normally appear in parentheses following messages) or command names, of one of the following types:

       Type 1    Begins with nonnumerics and ends in numerics. The nonnumeric prefix is usually an abbreviation for the program or set of routines which produced the message.
       Type 2    Does not contain numerics (a command name).
       Type 3    Is all numeric (a bare message number).

       The response of the program is the explanatory information related to the argument, if there is any.

       You can use sccshelp to support other commands. To do this, create help files in the appropriate format and add their location to the file that lists help-file locations (see FILES).

EXTERNAL INFLUENCES
       Environment Variables: LC_CTYPE determines the interpretation of text as single- and/or multibyte characters. LANG determines the language in which messages are displayed. If LC_CTYPE is not specified or is null, it defaults to the value of LANG. If LANG is not specified or is null, it defaults to "C" (see lang(5)). If any internationalization variable contains an invalid setting, all internationalization variables default to "C". See environ(5).

       International Code Set Support: Single- and multibyte character code sets are supported.

DIAGNOSTICS
       Use sccshelp stuck if all else fails.

EXAMPLES
       Entering an SCCS command with missing or bad arguments produces an error message that ends with a message number in parentheses; passing that message number, or the command name itself, to sccshelp displays the corresponding explanation.

WARNINGS
       Only SCCS commands currently use sccshelp.

FILES
       Directory containing files of message text.
       List of commands supported by sccshelp.
       File containing the locations of help files that are not in the default directory.
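
For instance, a lookup session might look like this (the file name, error text, and message number co4 are illustrative placeholders, not taken from the manual page above):

    $ get s.prog.c
    ERROR [s.prog.c]: not an SCCS file (co4)   # hypothetical error; "co4" is the message number
    $ sccshelp co4                             # explain message co4
    $ sccshelp get                             # or ask for help on the get command itself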