Full Discussion: Duplicate lines in the file
Post 302073837 by ranj@chn on Thursday 18th of May 2006, 04:37:40 AM
awk solution

awk '{a[$0]++;} END {for (i in a) {if (a[i] > 1) print i;} }' filename
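This works by counting, in the array a, how many times each whole line occurs; the END block then prints every line whose count exceeds one, i.e. each duplicated line exactly once. Note that for (i in a) visits keys in an unspecified order, so the duplicates do not come out in input order. If ordered output matters, a minimal variation (filename is a placeholder) pipes the result through sort:

awk '{a[$0]++} END {for (i in a) if (a[i] > 1) print i}' filename | sort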
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove Duplicate Lines in File

I am writing a KSH script to remove duplicate lines in a file. Let's say the file has the format below. FileA 1253-6856 3101-4011 1827-1356 1822-1157 1822-1157 1000-1410 1000-1410 1822-1231 1822-1231 3101-4011 1822-1157 1822-1231 and I want to simplify it with no duplicate lines as file... (5 Replies)
Discussion started by: Teh Tiack Ein
5 Replies
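For requests like this one, where the first occurrence of each line should be kept and input order preserved, a minimal sketch is the classic awk idiom (FileA is the poster's example name, FileA.dedup a hypothetical output name):

awk '!seen[$0]++' FileA > FileA.dedup

seen[$0]++ evaluates to 0 (false) the first time a line appears, so the negation is true and the line prints; on every repeat it is non-zero and the line is suppressed.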

2. UNIX for Dummies Questions & Answers

How to redirect duplicate lines from a file?

Hi, I have a file which contains many duplicate lines, and I want to redirect these duplicate lines into another file. Suppose I have a file called file_dup.txt which contains some lines such as: file_dup.txt A100-R1 ACCOUNTING-CONTROL ACTONA-ACTASTOR ADMIN-AUTH-STATS ACTONA-ACTASTOR... (3 Replies)
Discussion started by: zing_foru
3 Replies
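One way to capture only the duplicated lines in a separate file is uniq -d, which prints one copy of each repeated line but requires sorted input. A sketch (duplicates.txt is a hypothetical output name):

sort file_dup.txt | uniq -d > duplicates.txt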

3. UNIX for Dummies Questions & Answers

Remove Duplicate lines from File

I have a log file "logreport" that contains several lines as seen below: 04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
18 Replies
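Because these log lines differ only in the leading timestamp, deduplicating on the whole line would keep them all. A sketch that keys on everything after the first whitespace-delimited field, assuming the timestamp is always that first field:

awk '{key = substr($0, index($0, " ") + 1)} !seen[key]++' logreport

This keeps the first occurrence of each distinct message together with its timestamp.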

4. Shell Programming and Scripting

Duplicate lines in a file

Hi All, I am trying to remove the duplicate entries in a file and print them just once. For example, if my input file has: 00:44,37,67,56,15,12 00:44,34,67,56,15,12 00:44,58,67,56,15,12 00:44,35,67,56,15,12 00:59,37,67,56,15,12 00:59,34,67,56,15,12 00:59,35,67,56,15,12... (7 Replies)
Discussion started by: faiz1985
7 Replies
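When input order does not need to be preserved, the shortest route to printing each distinct line once is sort's -u flag (the file name is a placeholder):

sort -u input.txt

If the original ordering matters, the awk '!seen[$0]++' idiom shown earlier is the usual alternative.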

5. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a huge file (4 GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. Sort, uniq, and awk '!x++' are not working, as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
16 Replies
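sort can usually cope with files larger than memory because it spills to temporary files; when it runs out of buffer space, pointing it at a roomy temp directory and capping its memory use often helps. A sketch assuming GNU sort (the -S size value and the paths are assumptions):

sort -T /var/tmp -S 512M bigfile | uniq > bigfile.uniq

uniq then needs almost no memory, since it only compares adjacent lines of the already-sorted stream.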

6. Shell Programming and Scripting

How do I remove the duplicate lines in this file?

Hey guys, I need some help to fix this script. I am trying to remove all the duplicate lines in this file. I wrote the following script, but it does not work. What is the problem? The output file should only contain five lines: Later! (5 Replies)
Discussion started by: Ernst
5 Replies
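A common reason scripts like this fail is that uniq only collapses adjacent duplicate lines, so unsorted input passes through almost unchanged. Sorting first fixes that; a sketch with placeholder file names:

sort input.txt | uniq > output.txt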

7. UNIX for Advanced & Expert Users

Inserting duplicate lines in a file

Hi, I copied the contents of a binary file into a .text file using the hd (hexdump) command. The data in the binary file is such that in many places I get output like the following 00000250 00 00 00 00 3f 2d 91 68 3f 69 fb e7 00 00 00 00 |....?-.h?i......| 00000260 00 00 00 00 00 00 00 00 00 00 00 00 00... (2 Replies)
Discussion started by: KidD312
2 Replies
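hd normally squeezes runs of identical output lines into a single line containing only '*'. If the goal here is to get those collapsed duplicate lines back, the -v option tells hexdump to display all input data instead of squeezing; a sketch with placeholder names:

hd -v binary.dat > dump.text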

8. Shell Programming and Scripting

bash keep only duplicate lines in file

Hello all, in my bash script I have a file, and I only want to keep the lines that appear twice in the file. Is there a way to do this? Thanks in advance! (4 Replies)
Discussion started by: vlm
4 Replies
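A small variation on the counting solution at the top of this thread handles "exactly twice": count every line, then print only those whose count equals 2 (the file name is a placeholder):

awk '{c[$0]++} END {for (l in c) if (c[l] == 2) print l}' file

For "two or more times", sort file | uniq -d does the same job.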

9. UNIX for Dummies Questions & Answers

Duplicate lines in a file

I have a file with the following data: A B C I would like to print it like this n times (for example, 5 times): A B C A B C A B C A B C A (7 Replies)
Discussion started by: nsuresh316
7 Replies
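A plain shell loop is enough here: concatenate the file once per iteration. A sketch for n = 5 (the file names are placeholders):

for i in 1 2 3 4 5; do cat data.txt; done > data_x5.txt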

10. Shell Programming and Scripting

Remove duplicate lines from a file

Hi, I have a CSV file which contains some millions of lines. The first line (header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onward (it should not remove the first line). I don't want to use any pattern from the header, as I have some... (7 Replies)
Discussion started by: sudhakar T
7 Replies
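Since the repeated header is identical to the first line, one sketch is to remember line 1, print it, and then drop any later line that matches it exactly (the file names are placeholders; this assumes the header repeats verbatim):

awk 'NR == 1 { header = $0; print; next } $0 != header' input.csv > output.csv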