Full Discussion: Problem with AWK and OFS
Post 302566561 by vgersh99 in UNIX for Dummies Questions & Answers, Thursday 20th of October 2011, 03:27:36 PM
Is that a complete file or just one record? Could you post a more representative sample if you have one?
Code:
nawk '/^\^\^/ {$0=">" FILENAME}1' myFile
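
For context, here is a minimal sketch of what that one-liner does on a made-up two-record sample (only the file name myFile and the ^^ marker come from the post; the data lines are hypothetical). Every record that starts with a literal ^^ is replaced by > followed by the current input file name, and the trailing 1 is an always-true pattern that prints every record:
Code:
$ cat myFile
^^
first data line
^^
second data line

$ nawk '/^\^\^/ {$0=">" FILENAME}1' myFile
>myFile
first data line
>myFile
second data line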

 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

OFS in awk

Hi, I have the output field separator changed to "|" in my awk command, but it didn't give me the result. Can someone help me find out why? ======================================= /bin/awk 'BEGIN { OFS="|" } { print $0 }' list.tmp.$$ > listtmp.$$ =======================================... (1 Reply)
Discussion started by: whatsfordinner
1 Reply
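
For the symptom in item 1, the usual explanation (the excerpt is truncated, so this is an assumption about the cause) is that OFS only appears when awk rebuilds the record; printing $0 untouched just reprints the input line verbatim. A minimal sketch, reusing the file names from the excerpt:
Code:
# Touching a field (e.g. $1=$1) forces awk to rejoin the record with OFS.
# Note: with the default whitespace FS this also collapses runs of blanks.
/bin/awk 'BEGIN { OFS="|" } { $1=$1; print $0 }' list.tmp.$$ > listtmp.$$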

2. Shell Programming and Scripting

OFS in awk.

OFS is a built-in variable in awk. I have a file file.txt: abc : def : ghi jkl : mno: pqr stu : vwx :yzz The code I used: awk -F ":" 'BEGIN {OFS="|"} {print $1,$2}' file.txt Output: abc def jkl mno stu vwx But as I have used OFS="|", I am expecting output like: abc | def jkl... (4 Replies)
Discussion started by: salil2012
4 Replies
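
The truncated excerpt for item 2 does show a comma in print, so the exact cause is not recoverable, but the classic distinction behind this symptom is comma versus juxtaposition in print: only the comma emits OFS. A small sketch, assuming the same file.txt from the excerpt:
Code:
# The comma between output items emits OFS; plain juxtaposition concatenates with no separator
awk -F':' 'BEGIN{OFS="|"} {print $1, $2}' file.txt   # abc | def  (OFS inserted)
awk -F':' 'BEGIN{OFS="|"} {print $1 $2}' file.txt    # abc  def   (no OFS)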

3. Shell Programming and Scripting

AWK - OFS

Hi All, I have a comma-separated file with 10 columns. I need to convert it into a tab-separated file. awk -F"," '{print $1"\t"$2"\t"$3"\t"$4"\t"$5"\t"$6"\t"$7"\t"$8"\t"$9"\t"$10}' a.txt >> b.txt How can I use OFS to get the same output? I have tried by googling, but it... (5 Replies)
Discussion started by: Amit.Sagpariya
5 Replies
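
For item 3, a sketch of the OFS-based equivalent (the file names a.txt and b.txt are taken from the excerpt). Assigning a field forces awk to rebuild the record with the new separator, so the explicit "\t" chain is not needed and the command works for any column count:
Code:
# FS splits on commas, OFS rejoins with tabs; $1=$1 triggers the rebuild, 1 prints the record
awk -F',' 'BEGIN{OFS="\t"} {$1=$1} 1' a.txt >> b.txt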

4. Shell Programming and Scripting

Parsing XML in awk : OFS does not work as expected

Hi, I am trying to parse a regular XML file where I have to reduce the number of decimal places in some XML elements. I am using the following AWK command to achieve that: #!/bin/ksh EDITCMD='BEGIN { FS = ""; OFS=FS } { if ( $3 ~ "*\\.*" && length(substr($3,1+index($3,"."))) == 15 ) {... (4 Replies)
Discussion started by: martin.franek
4 Replies

5. Shell Programming and Scripting

Awk OFS issues

Hi, I'm trying to tidy up the output of a who command when it writes to a log; everything I've tried doesn't seem to work, though, and any help would be massively appreciated. I'm using the awk command to set the OFS to a tab. #!/bin/bash who >> /export/home/tjmoore/logusers awk -F 'BEGIN... (3 Replies)
Discussion started by: 02JayJay02
3 Replies
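
For item 5, one way to sketch it (the paths come from the excerpt; assuming the goal is simply tab-separated who output) is to feed who straight into awk and rebuild each record so the tab OFS is actually applied:
Code:
#!/bin/bash
# Rebuild each record ($1=$1) so the tab OFS replaces the default single-space separators
who | awk 'BEGIN{OFS="\t"} {$1=$1} 1' >> /export/home/tjmoore/logusers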

6. Shell Programming and Scripting

Awk OFS issues

Hi, Could anyone tell me what I'm doing wrong here? Any help will be much appreciated. #!/bin/bash ls -ltr /export/home/tjmoore > /export/home/tjmoore/log100 awk -F " " /export/home/tjmoore/log100 'BEGIN {OFS="\t";} {print $1,$2,$3,$4,$5, $6,$7,$8,$9;}' > /export/home/tjmoore/log1001 I... (9 Replies)
Discussion started by: 02JayJay02
9 Replies
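
In item 6 the likely problem (an assumption, since the excerpt is cut off) is argument order: the input file is listed before the program text, so awk tries to interpret the file name as its script. A corrected sketch using the paths from the excerpt:
Code:
#!/bin/bash
ls -ltr /export/home/tjmoore > /export/home/tjmoore/log100
# Program text first, input file last; the commas in print emit the tab OFS between fields
awk 'BEGIN{OFS="\t"} {print $1,$2,$3,$4,$5,$6,$7,$8,$9}' /export/home/tjmoore/log100 > /export/home/tjmoore/log1001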

7. UNIX for Dummies Questions & Answers

OFS in awk

Hello, I have an issue with adding commas as delimiters in this scenario: cat xtr3.rpl|head -5|awk 'BEGIN {OFS=","} {print $1,$2,$3,$4}' Produces this output: 00530083,0000000471,000000000000.00,000000000000.00 00530085,0000000471,000000000000.00,000000000000.00... (10 Replies)
Discussion started by: MIA651
10 Replies

8. Shell Programming and Scripting

OFS print awk

file: sasa|asasa|asasa|asas erer|Erer|rere|ererer Output needed: sasa:asasa:asasa:asas erer:Erer:rere:ererer I'm getting output when I use $1,$2: awk -F'|' 'BEGIN{OFS=":";} {print $1,$2; }' file Output: sasa:asasa erer:Erer But when I need the whole line, I... (5 Replies)
Discussion started by: Ramesh M
5 Replies
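
For item 8, printing the whole record with the new separator requires the record to be rebuilt first; a sketch using the same file and separators as the excerpt:
Code:
# $1=$1 makes awk rejoin all fields with OFS, so the entire line comes out colon-separated
awk -F'|' 'BEGIN{OFS=":"} {$1=$1} 1' file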

9. Shell Programming and Scripting

OFS does not apply to few records in awk

Hi, I am having a problem with my awk one-liner, which for some reason leaves the first two records unchanged. Input file: $ cat file1 A1:B1:C1:NoLimit M1:M2:M3:Limit A2:B2:C2,C3,C4,C5 A3:B3:C3,C4,C5,C6,C7 Desired output: A1,B1,C1,NoLimit M1,M2,M3,Limit A2,B2,C2 ,,,C3 ,,,C4 ,,,C5 A3,B3,C3... (5 Replies)
Discussion started by: chidori
5 Replies
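
For item 9, the separator half of the problem is the same record-rebuild rule: records whose fields are never reassigned keep their original ":" separators, which is why only some lines pick up the commas. A minimal sketch of that part only (it does not attempt the row-splitting shown in the desired output):
Code:
# Force every record to be rebuilt so OFS="," applies to all lines, not just the modified ones
awk -F':' 'BEGIN{OFS=","} {$1=$1} 1' file1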

10. Shell Programming and Scripting

awk - OFS printing duplicate. Why?

Why is the following code printing duplicate records? bash-4.1$ cat rm1 c1 c2 c3 l1 2 3 4 l2 2 3 2 bash-4.1$ awk '{print $0} OFS = "\n"' rm1 c1 c2 c3 c1 c2 c3 l1 2 3 4 l1 2 3... (4 Replies)
Discussion started by: quincyjones
4 Replies
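
For item 10, the duplication comes from where OFS = "\n" sits: outside the braces it acts as a pattern, the assignment evaluates to the non-empty string "\n", which is true for every record and so triggers a second, default print of $0. If the intent was one field per line (an assumption, since the excerpt is truncated), a sketch:
Code:
# Set OFS up front and rebuild the record; each field then prints on its own line, once
awk 'BEGIN{OFS="\n"} {$1=$1; print}' rm1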