Removing a block of duplicate lines from a file
Post 302743797 by Don Cragun on Thursday 13th of December 2012 10:16:59 AM
The requirements still aren't clear.

Are you always concerned only with matching the first six lines, or are you trying to find the first set of n lines that is duplicated later in the file?

If the chosen lines at the start of the file are duplicated multiple times, do you only want to remove the first set of duplicated lines or do you want to remove every set of duplicated lines?
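If the answer turns out to be the simplest reading (the first six lines form the block, and every later occurrence of that exact six-line sequence should be deleted), an awk sketch along these lines could work. The file name data.txt and the block size of 6 are placeholders, not something stated in the original question:

    awk -v n=6 '
        { line[NR] = $0 }                              # slurp the whole file
        NR <= n { blk = blk $0 RS }                    # remember the first n lines as one string
        END {
            i = 1
            while (i <= NR) {
                if (i > n && i + n - 1 <= NR) {        # enough lines left for a candidate block
                    cand = ""
                    for (j = i; j < i + n; j++) cand = cand line[j] RS
                    if (cand == blk) { i += n; continue }   # skip a repeated copy of the block
                }
                print line[i]; i++
            }
        }' data.txt

This holds the whole file in memory and removes every non-overlapping later repetition of the opening block; if only the first repetition should go, a flag could stop the skipping after the first match.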
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Removing duplicate lines ignore case

hi, I have the following input in file: abc ab a AB b c a C B. When I use uniq -u file, the output file is: abc ab AB c v B C (17 Replies)
Discussion started by: hellsd
17 Replies
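If the goal here is to keep the first occurrence of each line while ignoring case, one common approach is an awk filter (file is a placeholder name):

    # keep the first occurrence of each line, comparing case-insensitively
    awk '!seen[tolower($0)]++' file

If the output order does not matter, sort -f file | uniq -i gives a similar result.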

2. UNIX for Dummies Questions & Answers

removing duplicate lines from a file

Hi, I am trying to remove duplicate lines from a file. For example, the contents of example.txt is: this is a test 2342 this is a test 34343 this is a test 43434 and I want to remove the "this is a test" lines only and end up with the numbers in the file, that is, end up with: 2342... (4 Replies)
Discussion started by: ocelot
4 Replies
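Assuming the repeated phrase and the numbers sit on separate lines (one reading of the truncated preview), deleting the repeated text lines is enough; example.txt is the name used in the post:

    sed '/^this is a test$/d' example.txt

grep -v '^this is a test$' example.txt would do the same.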

3. Shell Programming and Scripting

removing duplicate blank lines

Hi, how do I remove the blank lines from the file only if we have more than one blank line? thanks rameez (8 Replies)
Discussion started by: rameezrajas
8 Replies
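If the intent is to collapse runs of two or more blank lines into a single blank line (one reading of the question), a portable awk sketch is:

    # keep at most one blank line in any run of blank lines
    awk 'NF { blank = 0 } !NF { blank++ } blank < 2' file

On GNU and BSD systems, cat -s file performs the same squeezing.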

4. Shell Programming and Scripting

removing the duplicate lines in a file

Hi, I need to concatenate three files into one destination file. If any duplicate data occurs, it should be deleted. eg: file1: ----- data1 value1 data2 value2 data3 value3 file2: ----- data1 value1 data4 value4 data5 value5 file3: ----- data1 value1 data4 value4 (3 Replies)
Discussion started by: Sharmila_P
3 Replies
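A hedged sketch for this kind of merge, keeping the first occurrence of every line in the order the files are read (the destination name is a placeholder):

    # concatenate the files and drop any line already seen
    awk '!seen[$0]++' file1 file2 file3 > destfile

sort -u file1 file2 file3 > destfile also removes duplicates, but it reorders the lines.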

5. Shell Programming and Scripting

Removing duplicates from string (not duplicate lines)

please help me with getting the following: Input Desired output x="foo" foo x="foo foo" foo x="foo foo" foo x="foo abc foo" foo abc x="foo foo1 foo2" foo foo1 foo2 I need to remove duplicates from the string... (8 Replies)
Discussion started by: vickylife
8 Replies
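A minimal sketch, assuming the goal is to keep the first occurrence of each word in the string and preserve the original word order:

    x="foo abc foo"
    printf '%s\n' "$x" | awk '{
        out = ""
        for (i = 1; i <= NF; i++)
            if (!seen[$i]++)                       # first time this word is seen
                out = out (out == "" ? "" : " ") $i
        print out
    }'

This prints foo abc for the example above.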

6. Shell Programming and Scripting

Removing Duplicate Lines per Section

Hello, I need to remove duplicate lines from within a file, per section. File: ABC1 012345 header ABC2 7890-000 ABC3 012345 Header Table ABC4 ABC5 593.0000 587.4800 ABC5 593.5000 587.6580 <= dup need to remove ABC5 593.5000 ... (5 Replies)
Discussion started by: petersf
5 Replies
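The exact section boundaries are not visible in the truncated preview; assuming a new section starts at every line containing "Header" (or "header"), a portable sketch would reset the seen-lines table at each section start:

    # clear the duplicate table whenever a new section header appears
    awk 'tolower($0) ~ /header/ { split("", seen) } !seen[$0]++' file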

7. Shell Programming and Scripting

removing duplicate lines while maintaining coherence with second file

So I have two files. The first file, file1.txt, has lines of numbers separated by commas. file1.txt 10,2,30,50 22,6,3,15,16,100 73,55 78,40,33,30,11 73,55 99,82,85 22,6,3,15,16,100 The second file, file2.txt, has sentences. file2.txt "the cat is fat" "I like eggs" "fish live in... (6 Replies)
Discussion started by: adrunknarwhal
6 Replies
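One way to keep the two files in step is to read file1.txt into memory first and then filter both files by line number; this assumes the files have the same number of lines, and file1.new / file2.new are placeholder output names:

    # pass 1: remember every line of file1.txt by its line number
    # pass 2: print each pair only if the file1 line has not been seen before
    awk 'NR == FNR { a[FNR] = $0; next }
         !seen[a[FNR]]++ { print a[FNR] > "file1.new"; print $0 > "file2.new" }
        ' file1.txt file2.txt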

8. UNIX for Dummies Questions & Answers

Removing a set of Duplicate lines from a file

Hi, how do I remove a set of duplicate lines from a file? My file contains the lines: abc def ghi abc def ghi jkl mno pqr jkl mno (1 Reply)
Discussion started by: raosr020
1 Reply

9. Homework & Coursework Questions

Script: Removing HTML tags and duplicate lines

Use and complete the template provided. The entire template must be completed. If you don't, your post may be deleted! 1. The problem statement, all variables and given/known data: You will write a script that will remove all HTML tags from an HTML document and remove any consecutive... (3 Replies)
Discussion started by: tburns517
3 Replies
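The preview cuts off the exact duplicate-line requirement; assuming it means consecutive duplicate lines (as the title suggests), a rough two-stage sketch would be:

    # strip anything that looks like a simple HTML tag, then drop adjacent duplicate lines
    sed 's/<[^>]*>//g' page.html | uniq

page.html is a placeholder, and this only handles tags that do not span multiple lines.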

10. Shell Programming and Scripting

Removing duplicate lines based on first column with pipe delimiter

Hi, I have tried to remove duplicate lines based on the first column with a pipe delimiter, but I am not able to get unique lines. Command: sort -t'|' -nuk1 file.txt Input: 38376KZ|09/25/15|1.057 38376KZ|09/25/15|1.057 02006YB|09/25/15|0.859 12593PS|09/25/15|2.803... (2 Replies)
Discussion started by: parithi06
2 Replies
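The key specification is the likely problem: -k1 makes the sort key run from field 1 to the end of the line, and -n does not suit alphanumeric IDs like 38376KZ. Restricting the key to the first field (or switching to awk) keeps one line per ID:

    # keep one line for each value in the first pipe-delimited field
    sort -t'|' -u -k1,1 file.txt

    # or, preserving the original order of first occurrences
    awk -F'|' '!seen[$1]++' file.txt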