Joining lines in a file - help!
Post 302644897 by ygemici on Tuesday 22nd of May 2012 03:10:55 PM
Code:
# cat -e file      # -e marks each line end with "$", so the trailing blanks are visible
R|This is line 1$
R|This is   $
line 2$
R|This is line 3$
R|This is line 4$
R|This is  $
line 5$
R|This is $
line 6$
R|This is$
line 7$

Code:
# gawk '!/^ *line/ { s1 = gensub(/(.*is).*/, "\\1", 1); if (/line/) print; next }   # record line: save the prefix up to "is"; print it if already complete
         /^ *line/ { print s1 " " $0 }' file                                         # continuation line: append it to the saved prefix
R|This is line 1
R|This is line 2
R|This is line 3
R|This is line 4
R|This is line 5
R|This is line 6
R|This is line 7
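
gensub() is a gawk extension, so the one-liner above needs gawk. If only a POSIX awk (mawk or the old awk) is available, a minimal sketch along the same lines, which gives the same output for the sample file above, could be:

Code:
# awk '/^ *line/ { sub(/^ */, ""); print pre, $0; next }    # continuation line: glue it onto the saved prefix
       { sub(/ *$/, ""); pre = $0; if (/line/) print }' file

It simply remembers the last incomplete "R|..." record in pre (trailing blanks stripped) and appends each following "line N" continuation to it.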

regards
ygemici
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Joining lines in log file

Hi, I need to develop a script that joins each group of three lines in a log file into one line for processing with awk and grep. I looked at tr with no success. The first line contains the date/time information, the second line contains the error text, and the third line is blank. Thanks, Mike (3 Replies)
Discussion started by: bubba112557

2. Shell Programming and Scripting

Joining 2 lines in a file together

Hi guys, I've got a log file which has entries that look like this: ------------------------------------------------------------------------------- 06/08/04 07:57:57 AMQ9002: Channel program started. EXPLANATION: Channel program 'INSCCPQ1.HSMTSPQ1' started. ACTION: None. ... (3 Replies)
Discussion started by: m223464

3. UNIX for Dummies Questions & Answers

Joining lines of a text file using GAWK

Sir, I have a customer master data file containing some important fields as a set, one line after another. What I want is each set of these fields (rows) joined onto a single line, then the second set, and so on, until the last set is done. ... (0 Replies)
Discussion started by: KANNI786

4. UNIX for Dummies Questions & Answers

Joining multiple lines in a text file using GAWK

Sir, I have a customer master data file containing some important fields as a set, one line after another. What I want is each set of these fields (rows) joined onto a single line, then the second set, and so on, until the last set is done. I want the data... (0 Replies)
Discussion started by: KANNI786

5. Shell Programming and Scripting

Joining lines in a text file using AWK or SED

Hi All I'm struggling a bit here :( I need a way of joining lines contained in a text file. I've seen numerous SED and AWK examples and none of them seem to be working for me. The text file has 4 lines: DELL1427 DOC 30189342 79 Now bear with me on this one as I'm actually... (4 Replies)
Discussion started by: huskie69

6. Shell Programming and Scripting

bash - joining lines in a file

I'm writing a bash shell script and I want to join lines together where two variables on each line are the same, i.e. 12345variablestuff43212morevariablestuff 12345variablestuff43212morevariablestuff 34657variablestuff78945morevariablestuff 34657variablestuff78945morevariablestuff... (12 Replies)
Discussion started by: Cultcha

7. Shell Programming and Scripting

joining multi-line file into single lines

Hi, I have a file like the one below. For each specific id starting with >, I want to join the sequence spread over multiple lines into a single line. Is there a simple way in awk or sed to do this? >ENST00000558922 cdna:KNOWN TCCAGGATCCAGCCTCCCGATCACCGCGCTAGTCCTCGCCCTGCCTGGGCTTCCCCAGAG... (2 Replies)
Discussion started by: Diya123

8. Shell Programming and Scripting

Joining lines in TXT file based on first character

Hi, I have a pipe-delimited text file where records have been split over 2 lines and I need to join them back together. For example, the file I have is similar to the following: aaa|bbb |ccc ddd|eee fff|ggg |hhh I ideally need it looking like the following: aaa|bbb|ccc ddd|eee... (5 Replies)
Discussion started by: fuji_s

9. Shell Programming and Scripting

Issue in Concatenation/Joining of lines in a dynamically generated file

Hi, I have a file containing many records delimited by pipe (|). Each record should contain 17 columns/fields, but some records have fewer than 17, so I am extracting those records to a file using the command below: awk 'BEGIN {FS="|"} NF != 17 {print}' feedfile.txt... (8 Replies)
Discussion started by: TomG

10. Shell Programming and Scripting

Joining especific lines in "2n" lines file

Hi everybody. I have a file with "2n" lines. I would like to create a new file with only "n" lines, each line in the new file formed by an odd line of the old file joined with the following even line of the old file (separated by a space). I'd prefer using sed or bash. -example-... (5 Replies)
Discussion started by: felino
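
For the last thread above (joining each odd line of a "2n"-line file with the even line that follows it), a minimal sed/paste sketch, assuming the file really has an even number of lines and using the hypothetical names oldfile and newfile:

Code:
# sed 'N; s/\n/ /' oldfile > newfile        # N appends the next line, then the embedded newline is replaced by a space
# paste -d ' ' - - < oldfile > newfile      # same idea with paste: read two lines at a time, join them with a space

With an odd line count, the behaviour of N on the last line differs between sed implementations, so the even-count assumption matters.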