Shell Programming and Scripting: Help Please. Need to add quotes to o field
Post 98730 by vgersh99 on Friday 10th of February 2006, 11:34:37 AM
Quote:
Originally Posted by tjbopp
Thanks. I have my fields surrounded by " (quotes) like I need. Now my problem is that the fields are separated by tabs; how do I change them so they are separated by a space?
Not sure how to specify a 'space' or a 'tab'.
borrowing from futurelet....
Code:
awk 'BEGIN { FS = "\t" ; q = "\"" ; OFS = q " " q }   # split input on tabs; join output fields with "<space>"
{ $1 = $1; print q $0 q }' oldfile >newfile           # $1 = $1 forces awk to rebuild $0 with the new OFS
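For example, with a made-up tab-separated oldfile, the one-liner produces double-quoted fields separated by a single space:
Code:
$ printf 'aaa\tbbb\tccc\n' > oldfile
$ awk 'BEGIN { FS = "\t"; q = "\""; OFS = q " " q } { $1 = $1; print q $0 q }' oldfile
"aaa" "bbb" "ccc"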

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

add increment field when first field changes

Hi I have a file that looks like the following: Name1 aaa bbbb Name1 ffd hhghg Name1 dffg ghhhh Name2 rtrt dfff Name2 rrtfr tgtt Name2 dsddf gfggf Name2 ffffg gfgf NAme3 fdff ghhgh Is it possible to format it so that a number... (2 Replies)
Discussion started by: azekry
2 Replies
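A minimal sketch for thread 1, assuming the goal is a counter that increases each time the value in the first column changes (file is a placeholder name):
Code:
awk '$1 != prev { n++; prev = $1 }   # start a new group number whenever column 1 changes
     { print n, $0 }                 # prefix every line with the current group number
' file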

2. UNIX for Dummies Questions & Answers

Add single quotes in string

Hi all, I love this site, it helps newbie people like me and I appreciate everyone's help! Here is my question: I am trying to concatenate a single quote onto a character/string from a text file for each line (let's say ABC should look like 'ABC'). I tried to use the awk print command to do... (1 Reply)
Discussion started by: mrjunsy
1 Replies
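For thread 2, one way to wrap every line in single quotes is to hand the quote to awk with -v, so it never collides with the shell's own quoting (input.txt is a placeholder):
Code:
awk -v q="'" '{ print q $0 q }' input.txt   # ABC becomes 'ABC'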

3. Shell Programming and Scripting

Add double quotes around the string

I have a line in multiple scripts: select into table /dir1/dir2/file.dat. dir1 and dir2 are the same, but file.dat differs from script to script. I need to enclose /dir1/dir2/file.dat in double quotes in each file of my directory: select into table "/dir1/dir2/file.dat" (13 Replies)
Discussion started by: surfer515
13 Replies
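For thread 3, a hedged sed sketch (the script name and the exact path pattern are assumptions; with GNU sed, adding -i would edit the files in place):
Code:
sed 's|select into table \(/dir1/dir2/[^ ]*\)|select into table "\1"|' one_script.sh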

4. Shell Programming and Scripting

awk field equal something, then add something to the field

Hi Everyone, a.txt: a b c 1 e e e e e a b c 2 e e e e e The output should be: a b c 1 e e e e e a 00b c 2 e e e e e When the 4th field = '2', add '00' to the front of the 2nd field's value. Thanks (9 Replies)
Discussion started by: jimmy_y
9 Replies
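A one-liner matching the sample in thread 4: prepend 00 to the second field whenever the fourth field is 2.
Code:
awk '$4 == 2 { $2 = "00" $2 } 1' a.txt   # the lone 1 makes awk print every (possibly modified) record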

5. Shell Programming and Scripting

Add field delimiter for the last field

I have a file with three fields and field delimiter '|' like: abc|12:13:45|123 xyz|12:87:32| qwe|54:21:09 In the file, the 1st line has proper data -> abc|12:13:45|123, the 2nd line doesn't have data for the 3rd field, which is okay, the 3rd line doesn't have data for the 3rd field as well as the... (5 Replies)
Discussion started by: mehimadri
5 Replies
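The snippet in thread 5 is cut off, but assuming the fix is to append the missing trailing '|' to records that lack the last delimiter, a sketch:
Code:
awk -F'|' 'NF == 2 { $0 = $0 "|" } 1' file   # qwe|54:21:09 becomes qwe|54:21:09|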

6. UNIX for Dummies Questions & Answers

Cut a field from a line having quotes as delimiter

Hi, I have to extract a single field from the 2nd row of a file whose format is as given below: "Field1","Field2","Field3",...,"Field17",... I need to cut Field17 as such (no quotes required). I give the command head -2 file_name | tail -1 | cut -d "," -f17 But the output is... (2 Replies)
Discussion started by: nivin_govindan
2 Replies
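For thread 6, an awk alternative to the cut pipeline: take row 2, strip the quotes from field 17, and print it (this assumes none of the fields contain embedded commas, as in the sample shown):
Code:
awk -F',' 'NR == 2 { gsub(/"/, "", $17); print $17 }' file_name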

7. Shell Programming and Scripting

awk, comma as field separator and text inside double quotes as a field.

Hi all, I need to get the fields in a line that are separated by commas; some of the fields are enclosed in double quotes, and they are supposed to be treated as a single field even if there are commas inside the quotes. Sample input: for this line, 5 fields are supposed to be extracted, they... (8 Replies)
Discussion started by: kevintse
8 Replies
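GNU awk 4.0 or newer covers the case in thread 7 with FPAT, which describes what a field looks like rather than what separates fields (a sketch; plain awk and mawk do not have FPAT, and input.csv is a placeholder):
Code:
gawk 'BEGIN { FPAT = "([^,]+)|(\"[^\"]+\")" }   # a field is either unquoted text or a "quoted string"
      { for (i = 1; i <= NF; i++) print i, $i }' input.csv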

8. Shell Programming and Scripting

awk - single quotes as field separator

How can I use single quotes as field separator in awk? (1 Reply)
Discussion started by: locoroco
1 Replies
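Thread 8 mostly comes down to shell quoting: double-quote the -F argument so the single quote reaches awk intact, e.g.:
Code:
awk -F"'" '{ print $2 }' file   # $2 is whatever sits between the first pair of single quotes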

9. Shell Programming and Scripting

Remove quotes and commas from field

In the attached file I am trying to remove all the "" and , (quotes and commas) from $2 and $3 and the "" (quotes) from $4. I tried the below as a start: awk -F"|" '{gsub(/\,/,X,$2)} 1' OFS="\t" enhancer.txt > comma.txt Thank you :). (6 Replies)
Discussion started by: cmccabe
6 Replies
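The attachment from thread 9 is not visible here, so this is only a sketch under the assumption that enhancer.txt is tab-separated: remove quotes and commas from fields 2 and 3, and quotes alone from field 4.
Code:
awk 'BEGIN { FS = OFS = "\t" }
     { gsub(/[",]/, "", $2); gsub(/[",]/, "", $3); gsub(/"/, "", $4) } 1' enhancer.txt > clean.txt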

10. Shell Programming and Scripting

awk to parse comma separated field and removing comma in between number and double quotes

Hi Experts, please help. I have the below data in a comma-separated file, but the 4th column contains a comma in between the numbers, because of which, when I tried to parse the file, the 6th column's value (5049641141) was dropped from the file and the value in column 5 (222.82) became the value of column 6. ... (3 Replies)
Discussion started by: as7951
3 Replies
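Thread 10 is the same quoted-CSV situation; building on the FPAT idea above, the quoted 4th column can be flattened once it is isolated (a sketch only: it assumes GNU awk, that only column 4 is quoted, and that its embedded comma should simply be removed; data.csv is a placeholder):
Code:
gawk 'BEGIN { FPAT = "([^,]+)|(\"[^\"]+\")"; OFS = "," }
      { gsub(/[",]/, "", $4); print }' data.csv   # e.g. "12,345.67" becomes 12345.67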
set_color(1)                                                           fish                                                           set_color(1)

NAME
       set_color - set the terminal color

Synopsis
       set_color [-v --version] [-h --help] [-b --background COLOR] [COLOR]

Description
       Change the foreground and/or background color of the terminal. COLOR is one of black, red, green, brown, yellow, blue,
       magenta, purple, cyan, white and normal.

       o -b, --background      Set the background color
       o -c, --print-colors    Prints a list of all valid color names
       o -h, --help            Display help message and exit
       o -o, --bold            Set bold or extra bright mode
       o -u, --underline       Set underlined mode
       o -v, --version         Display version and exit

       Calling set_color normal will set the terminal color to whatever is the default color of the terminal.

       Some terminals use the --bold escape sequence to switch to a brighter color set. On such terminals, set_color white will
       result in a grey font color, while set_color --bold white will result in a white font color.

       Not all terminal emulators support all these features. This is not a bug in set_color but a missing feature in the
       terminal emulator.

       set_color uses the terminfo database to look up how to change terminal colors on whatever terminal is in use. Some
       systems have old and incomplete terminfo databases, and may lack color information for terminals that support it.
       Download and install the latest version of ncurses and recompile fish against it in order to fix this issue.

Version 1.23.1                                                  Sun Jan 8 2012                                                  set_color(1)
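Based on the synopsis above, a brief usage sketch from an interactive fish session (the echoed messages are made up):
Code:
set_color red; echo "error: something went wrong"; set_color normal
set_color --bold white; echo "bright white (plain grey without --bold on some terminals)"; set_color normal
set_color -b blue yellow; echo "yellow text on a blue background"; set_color normal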