Shell Programming and Scripting: Deriving unique entries from multiple repeating patterns
Post 302719805 by ks_reddy on Tuesday, 23 October 2012, 06:49 AM
Thanks All...

Dear All,
Thank you for your valuable time.
The code given by Radoulov, shown below, works perfectly for my requirements.

Code:
awk 'NR>1&&!/^\./{if($1<p)c++; p=$1; $0=sprintf( "%c", 65+c) $0}1'
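
For reference, here is an expanded, commented sketch of my reading of the one-liner. It assumes the first field is a number that restarts at the beginning of each repeating block, that lines starting with "." are passed through untouched, and that input.txt is just a placeholder file name:

Code:
awk '
NR > 1 && !/^\./ {                    # skip the first line and any line starting with "."
    if ($1 < p) c++                   # first field dropped below the previous value: a new block begins
    p = $1                            # remember the current value for the next comparison
    $0 = sprintf("%c", 65 + c) $0     # prefix the line with A, B, C, ... (65 is ASCII "A")
}
1                                     # print every line, modified or not
' input.txt

The letter prefix makes each repetition of the pattern distinct, so entries from different blocks can then be told apart even when the remaining fields repeat.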

Regards
Sidda
 
