Top Forums > Shell Programming and Scripting > Add unique header to multiple lines
Post 302878622 by balajesuri on Saturday 7th of December 2013, 11:18:17 AM
Code:
awk '{print ">" $0 "\n" $0}' file
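
For illustration (not part of the original post), with a hypothetical two-line input file, the awk one-liner prints each line twice, the first copy prefixed with ">" as its header:

Code:
$ cat file
apple
banana
$ awk '{print ">" $0 "\n" $0}' file
>apple
apple
>banana
banana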

Tested the below on GNU sed:
Code:
sed 's/.*/>&\n&/' file
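
Interpreting \n in the replacement text is a GNU sed extension, so the line above may not produce the duplicated line on other sed implementations. As a hedged, non-authoritative alternative (not from the original post), a POSIX-portable version can use the hold space instead:

Code:
# h: save the line; s: prefix the working copy with ">"; G: append the saved original
sed 'h; s/^/>/; G' file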

 

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a huge file (4GB) which has duplicate lines. I want to delete the duplicate lines, leaving only the unique lines. Sort, uniq, and awk '!x++' are not working, as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies — see the sketch below)
Discussion started by: krishnix
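
Not an answer from that thread, but one common way around memory limits is to let GNU sort do its external merge sort on disk; a minimal sketch (the file names, buffer size and temp directory are hypothetical), with the caveat that the output comes out sorted rather than in the original order:

Code:
# external merge sort with a bounded in-memory buffer and a roomy temp directory,
# keeping only one copy of each duplicated line (GNU sort options)
sort -u -S 512M -T /var/tmp hugefile.txt > unique.txt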

2. Shell Programming and Scripting

search and replace, when found, delete multiple lines, add new set of lines?

hey guys, I tried searching but most 'search and replace' questions are related to one liners. Say I have a file to be replaced that has the following: $ cat testing.txt TESTING AAA BBB CCC DDD EEE FFF GGG HHH ENDTESTING This is the input file: (3 Replies)
Discussion started by: DeuceLee

3. Shell Programming and Scripting

Compare multiple files and print unique lines

Hi friends, I have multiple files. For now, let's say I have two of the following style cat 1.txt cat 2.txt output.txt Please note that my files are not sorted, and in the output file I need an extra column that says which file each line came from. I have more than 100... (19 Replies)
Discussion started by: jacobs.smith

4. Shell Programming and Scripting

Combine multiple unique lines from event log text file into one line, use PERL or AWK?

I can't decide if I should use AWK or PERL. After poring over these forums for hours today, I decided I'd post something and see if I couldn't get some advice. I've got a text file full of hundreds of events in this format: Record Number : 1 Records in Seq : ... (3 Replies)
Discussion started by: Mayday22

5. Shell Programming and Scripting

Transpose lines from individual blocks to unique lines

Hello to all, happy new year 2013! Maybe somebody could help me; it is about a problem very similar to the one I posted here, where the members rdrtx1 and bipinajith helped me a lot. https://www.unix.com/shell-programming-scripting/211147-map-values-blocks-single-line-2.html It is very... (3 Replies)
Discussion started by: Ophiuchus

6. Shell Programming and Scripting

Add column header and row header

Hi, I have an input like this 1 2 3 4 2 3 4 5 4 5 6 7 I would like to count the number of columns and print a header row with the prefix "Col". I would also like to number the rows and print each line number as a first column with the prefix "Row". So, my output would be ... (2 Replies — see the sketch below)
Discussion started by: jacobs.smith
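
Not from that thread either; a minimal awk sketch (the exact spacing and prefixes are assumptions) that builds a "Col" header row from the field count of the first line and prefixes every line with its "Row" number:

Code:
# on the first record, emit a header row Col1 Col2 ... based on the field count;
# on every record, print RowN followed by the original fields
awk 'NR == 1 { printf "    "; for (i = 1; i <= NF; i++) printf " Col%d", i; print "" }
     { printf "Row%d", NR; for (i = 1; i <= NF; i++) printf " %s", $i; print "" }' file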

7. UNIX for Dummies Questions & Answers

Print unique lines without sort or unique

I would like to print unique lines without sort or uniq. Unfortunately the server I am working on does not have sort or uniq, and I have not been able to contact the administrator of the server for several weeks to ask him to add them. (7 Replies — see the sketch below)
Discussion started by: cokedude
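
Not from the thread, but assuming awk itself is available, the classic idiom prints a line only the first time it is seen, preserving the original order:

Code:
# the array element is 0 (false) the first time a line appears, so the line is printed;
# the post-increment makes every later occurrence test true and be skipped
awk '!seen[$0]++' file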

8. Shell Programming and Scripting

Reading multiple values from multiple lines and columns and setting them to unique variables.

Hello, I would like to ask for help with a csh script. An example of an input .txt file is below; the number of lines varies from file to file, and I have 2 or 3 columns of values. I would like to read all the values (probably one by one) and set them to independent unique variables that... (7 Replies)
Discussion started by: FMMOLA

9. Shell Programming and Scripting

Merging two tables including multiple occurrences of column identifiers and unique lines

I would like to merge two tables based on column 1: File 1: 1 today 1 green 2 tomorrow 3 red File 2: 1 a lot 1 sometimes 2 at work 2 at home 2 sometimes 3 new 4 a lot 5 sometimes 6 at work (4 Replies)
Discussion started by: BSP

10. Shell Programming and Scripting

Find header in a text file and prepend it to all lines until another header is found

I've been struggling with this one for quite a while and cannot seem to find a solution for this find/replace scenario. Perhaps I'm getting rusty. I have a file that contains a number of metrics (exactly 3 fields per line) from a few appliances that are collected in parallel. To identify the... (3 Replies — see the sketch below)
Discussion started by: verdepollo
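
Again not from the thread: assuming, purely hypothetically, that header lines can be recognised by a pattern such as a leading "#", an awk sketch that remembers the most recent header and prepends it to every following line could look like this:

Code:
# remember the last header line seen, then prefix each subsequent data line with it
awk '/^#/ { header = $0; next } { print header, $0 }' file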
DISTRIB.PATS(5)                          File Formats Manual                          DISTRIB.PATS(5)

NAME
       distrib.pats - default values for Usenet Distribution header

DESCRIPTION
       The file /etc/news/distrib.pats is used to determine the default value of the Distribution header. It consists of a series of lines; blank lines and lines beginning with a number sign (``#'') are ignored. All other lines consist of three fields separated by a colon:

              weight:pattern:value

       The first field is the weight to assign to this match. If a newsgroup matches multiple lines, the line with the highest weight is used. This should be an arbitrary number greater than zero. Unlike other INN files, the order of lines in this file is not important.

       The second field is the name of the newsgroup or a wildmat(3)-style pattern to specify a set of newsgroups. Multiple patterns are not allowed.

       The third field is the value that should be used if this line is picked as the best match. It can be an empty string.

       Programs such as inews(1) that process a user's posting should consult this file, typically by using the DDxxx routines, documented in libinn(3). The intent is that all newsgroups to which an article is posted be used to index into this file, and the value with the highest weight should be used as the value of the Distribution header, if none was specified.

HISTORY
       Written by Rich $alz <rsalz@uunet.uu.net> for InterNetNews. This is revision 1.7, dated 1996/09/06.

SEE ALSO
       inews(1), libinn(3).
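
For illustration of the weight:pattern:value format described above (this example is not part of the manual page and the entries are hypothetical):

Code:
# hypothetical /etc/news/distrib.pats: local groups get "local",
# everything else falls through to an empty Distribution value
10:local.*:local
1:*: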