Shell Programming and Scripting: Getting most repeated 3 lines. Post 302831063 by zaxxon on Wednesday 10th of July 2013, 06:43:59 AM
You posted nearly the same question here:

https://www.unix.com/shell-programmin...ed-column.html

Please do not open another thread just because of such an insignificant difference; otherwise you will get infractions for double posting. Keep that in mind, thanks.
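For reference, the question in both threads appears to be how to list the lines that occur most often in a file. A minimal sketch of one common approach, assuming the goal is the three most frequent lines (the file name data.txt is only illustrative):

Code:
# Count identical lines, sort by count in descending order, keep the top three.
sort data.txt | uniq -c | sort -rn | head -3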
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Print specific lines of a repeated set of data

I have a file that needs the 1st line, 2nd line, and 26th line printed from every chunk of data. Each chunk of data contains 26 lines (#line + %line + 24 data lines = 26 lines of data, repeated). Input file: # This is a data file used for blockA (chunk 1). % 10576 A 10 0 1 04 (data1) 03 (data2)... (2 Replies)
Discussion started by: morrbie
2 Replies
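One way to attack this, assuming the chunks really are fixed 26-line blocks as described (the file name is a placeholder):

Code:
# Print lines 1, 2 and 26 of every 26-line block.
# NR%26 is 1 and 2 for the first two lines of a block and 0 for the 26th.
awk 'NR % 26 == 1 || NR % 26 == 2 || NR % 26 == 0' datafile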

2. Shell Programming and Scripting

need to print lines between repeated pattern

Hi all, I have a file that looks like this: uid=bessemsj version: 1 dn: cn=Desk SpecialAdminDesk, ou=Desks, dc=DSS,c=nl,o=Vodafone dn: cn=DSS Advisors, ou=Groups, dc=DSS,c=nl,o=Vodafone dn: cn=DSS Dispatcher,ou=Groups,dc=DSS,c=nl,o=Vodafone dn: cn=Desk Retention Desk,ou=Desks,... (13 Replies)
Discussion started by: Eman_in_forum
13 Replies
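The excerpt does not show exactly which lines are wanted, but a common reading of "lines between a repeated pattern" is everything between the first and the second occurrence of the marker. A hedged sketch, with the pattern and file name as placeholders:

Code:
# Print the lines that sit between the first and the second "dn:" line
# (the marker lines themselves are skipped).
awk '/^dn:/ { count++; next } count == 1' file.ldif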

3. Shell Programming and Scripting

Deleting repeated lines by keeping only one.

Dear Buddies, Need your help once again. I have a flat file with around 20 million lines (it is a huge file). However, many of the lines are of no use, hence I want to remove them. To find and delete such lines we have certain codes written at the start of each line. Based on that we can delete the... (2 Replies)
Discussion started by: anushree.a
2 Replies
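A common idiom for this kind of de-duplication, assuming the whole line is the key and the first occurrence should survive; if only the leading code matters, key on $1 instead. Memory use grows with the number of distinct lines, which is worth keeping in mind for a 20-million-line file:

Code:
# Keep only the first occurrence of each line, preserving input order.
awk '!seen[$0]++' flatfile > flatfile.dedup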

4. Shell Programming and Scripting

Remove regularly repeated lines

How can I delete some regularly repeated lines in a file? Example: in_file EDGE 1 2 12 EDGE 2 3 23 EDGE 3 4 34 EDGE 5 6 56 EDGE 6 7 67 EDGE 7 8 78 EDGE 9 10 910 EDGE 10 11 1011 EDGE 11 12 1112 EDGE 13 14 1314 EDGE 14 15 1415 EDGE 15 16 1516 EDGE 17 18 1718 EDGE 18 19 1819 EDGE 19... (8 Replies)
Discussion started by: saeed.soltani
8 Replies
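The excerpt does not show the expected output, so it is not clear which lines count as "regularly repeated". If the file repeats in fixed groups of three EDGE lines and only one line per group should be kept, a modulus on the record number is one guess at a starting point:

Code:
# Keep the first line of every group of three; adjust the stride and
# the comparison once the real pattern is known.
awk 'NR % 3 == 1' in_file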

5. Shell Programming and Scripting

How to print the lines which are repeated 3 times in a file?

Hello All, I have a file which has repeated lines. I want to print the lines which are repeated three times. Please help. (3 Replies)
Discussion started by: ailnilanjan
3 Replies
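A straightforward approach, assuming the output order does not matter (counts are collected first, then printed):

Code:
# Print only the lines that occur exactly three times in the file.
awk '{ count[$0]++ } END { for (line in count) if (count[line] == 3) print line }' file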

6. Shell Programming and Scripting

Find repeated word and take sum of the second field to it ,for all the repeated words in awk

Hi, below is the input file. I need to find repeated words and sum up the values for each of them; the value is the second field after the repeated word. I'm trying but getting nowhere close to it. Kindly give me a hint on how to go about it. Input fruits,apple,20,fruits,mango,20,veg,carrot,12,veg,raddish,30... (11 Replies)
Discussion started by: 100bees
11 Replies
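Assuming each record is built from comma-separated (name, item, value) triples as the sample suggests, one sketch walks the fields three at a time and totals the value per name:

Code:
# Sum the third member of every (name, item, value) triple per name.
awk -F, '{ for (i = 1; i + 2 <= NF; i += 3) sum[$i] += $(i + 2) }
         END { for (k in sum) print k, sum[k] }' input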

7. Shell Programming and Scripting

Compare two files with repeated lines

Hi all, I've been trying to write a script to compare two files. This is what I want: file 1: a 1 2 b 5 9 c 4 7 file 2: a a c a b Output: a 1 2 a 1 2 (2 Replies)
Discussion started by: ernesto561
2 Replies
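Assuming the first field is the lookup key, as the sample suggests, a two-file awk handles the repeated keys in file 2:

Code:
# Load file1 into a table keyed on the first field, then print the stored
# line for every (possibly repeated) key found in file2.
awk 'NR == FNR { row[$1] = $0; next } $1 in row { print row[$1] }' file1 file2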

8. Shell Programming and Scripting

Repeated lines-case sensitive

Hi, the users file contains the names below. I have a requirement to keep only one case-sensitive user. For example, if the user name is "aaa" then only aaa should be there in the file and the other matching users (AAA, aaA) should be deleted. Tried multiple options but no luck; can you please help? aaa abc AAA... (2 Replies)
Discussion started by: Satyak
2 Replies
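A common idiom for case-insensitive de-duplication; note that it keeps the first spelling encountered, so sort the file first if a specific case (for example all-lowercase) must be the one that survives:

Code:
# Keep one entry per case-insensitive user name.
awk '!seen[tolower($0)]++' users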

9. UNIX for Beginners Questions & Answers

Export lines that have first entry repeated 5 times or above

Dears, I want to extract only the lines that have the first entry repeated 3 times or above. Example data: -bash-3.00$ cat INTCONT-IS.CSV M205-00-106_AMDRN:1-0-6-22,12-662-4833,intContact,2016-11-15 02:32:16,50 M205-00-106_AMDRN:1-0-23-17,12-616-0462,intContact,2016-11-15 02:32:23,50... (5 Replies)
Discussion started by: is2_egypt
5 Replies
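A two-pass awk is one way to do this, assuming the first comma-separated field is the entry to count (the threshold is 3 here, per the post body; the title mentions 5, so adjust as needed):

Code:
# First pass counts each first field, second pass prints the lines
# whose first field reaches the threshold.
awk -F, 'NR == FNR { cnt[$1]++; next } cnt[$1] >= 3' INTCONT-IS.CSV INTCONT-IS.CSV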

10. Shell Programming and Scripting

Remove duplicate lines which has been repeated 4 times

Remove duplicate lines which have been repeated 4 times. test.txt is attached; the command below was tried but does not give the expected output. for i in `cat test.txt | uniq` do num=`cat test.txt | grep $i | wc -l` echo $i $num done test.txt ... (17 Replies)
Discussion started by: Kalia
17 Replies
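Whether "repeated 4 times" means exactly four occurrences or four and more is not clear from the excerpt, so treat this two-pass sketch as a starting point and adjust the comparison:

Code:
# First pass counts each line, second pass drops the lines whose
# total count is exactly four.
awk 'NR == FNR { cnt[$0]++; next } cnt[$0] != 4' test.txt test.txt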