Handle special characters in awk -F
Post 302942301 by vibhor_agarwali, 04-27-2015 11:22 AM
Anything that doesn't require escaping the special characters would be nice.
Awk is not a must; any other utility will do.
Awk has given the best results so far, though.
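When the value given to -F is longer than one character it is generally treated as an extended regular expression, which is why metacharacters in the separator normally need escaping. One way to sidestep that entirely is to skip FS and split each record on the literal string with index() and substr(). A minimal sketch, not taken from the thread; the separator '|*|' and data.txt are placeholders:

Code:
awk -v sep='|*|' '
{
    # index() does a plain substring search, so nothing in sep needs escaping.
    rest = $0
    n = 0
    while ((pos = index(rest, sep)) > 0) {
        f[++n] = substr(rest, 1, pos - 1)
        rest = substr(rest, pos + length(sep))
    }
    f[++n] = rest
    print "fields:", n, "second field:", f[2]
}' data.txt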
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

awk/sed with special characters

i have this script that searches for a pattern. However it fails if the pattern includes some special characters. So far, it fails with the following strings: 1. -Cr 2. $Mj 3. H'412 would a sed or awk be more effective? i don't want the users to put the (\) during the search (they... (5 Replies)
Discussion started by: apalex
5 Replies
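The original script isn't shown in the teaser, but for a plain-string search the usual way to avoid escaping is grep -F, which never interprets the pattern as a regular expression; the -- keeps an argument such as -Cr from being read as options. A sketch:

Code:
#!/bin/sh
# Hypothetical wrapper: $1 is the string the user typed, $2 the file to search.
# -F = fixed-string match, -- = end of options (needed for patterns like -Cr).
grep -F -- "$1" "$2"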

2. Shell Programming and Scripting

Handling special characters using awk

Hi all, How do I extract a value without special characters? I need to extract the value of %Used from below and if its greater than 80, need to send a notification. I am doing this right now..Its giving 17%..Is there a way to extract the value and assign it to a variable in one step? df |grep... (3 Replies)
Discussion started by: sam_78_nyc
3 Replies
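One way to pull the number and test it in a single pass looks like the sketch below. The filesystem name, the column position of %Used (field 5 with df -P) and the mail command are assumptions, not details from the thread:

Code:
#!/bin/sh
# Grab the capacity of one filesystem, strip the trailing %, alert above 80.
used=$(df -P /myfs | awk 'NR == 2 { sub(/%/, "", $5); print $5 }')
if [ "$used" -gt 80 ]; then
    echo "/myfs is ${used}% used" | mail -s "Disk space alert" admin@example.com
fi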

3. Shell Programming and Scripting

awk search pattern with special characters passed from CL

I'm very new to awk and sed and I've been struggling with this for a while. I'm trying to search a file for a string with special characters and this string is a command line argument to a simple script. ./myscript "searchpattern" file #!/bin/sh awk "/$1/" $2 > dupelistfilter.txt sed... (6 Replies)
Discussion started by: cue
6 Replies
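The double-quoted awk program splices $1 straight into the regex, so any metacharacter in the argument changes the program itself. Passing the pattern with -v and matching it with index() keeps it literal. A sketch of the same script, keeping the original file name:

Code:
#!/bin/sh
# Literal (non-regex) version of the search: index() returns a position > 0
# on a plain substring match, so $1 needs no escaping.
# Note: -v still interprets backslash sequences, so patterns containing \
# need extra care.
awk -v pat="$1" 'index($0, pat)' "$2" > dupelistfilter.txt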

4. Shell Programming and Scripting

awk print $1 escape all special characters

I'm using awk '{print $1}' and it works most of the time to print the contents of a mysql query loop, but occationally I get a field with some special character in it, is there a way to tell awk to ignore all special characters between my FS? I have >186K records, so building a list of ALL special... (6 Replies)
Discussion started by: unclecameron
6 Replies
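If the intent is simply to keep $1 but drop anything outside an expected character set (an assumption, since the real field contents aren't shown), a gsub() with a whitelist avoids having to enumerate every possible special character:

Code:
# Hypothetical: mysql_query stands in for the real query command; keep only
# letters, digits, underscore, dot and dash in the first tab-separated column.
mysql_query | awk -F'\t' '{ gsub(/[^A-Za-z0-9_.-]/, "", $1); print $1 }'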

5. Shell Programming and Scripting

awk loop: display special characters

Hi everybody; I have a code and this fetches data from first.txt,modify it and outputs it to second.txt file. l awk 'NR>1 {print "l ./gcsw "$1" lt all lset Data="$2" Value "$3}' /home/gcsw/first.txt > /home/gcsw/second.txt this outputs as: l ./gcsw 123 lt all lset Data=456 Value 789 ... (1 Reply)
Discussion started by: gc_sw
1 Replies
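The question is cut off above, but if the missing part is about getting literal quotes (or similar characters) into the generated lines, they can be escaped inside the awk string. A guess at what was wanted:

Code:
# Same command with the Data value wrapped in literal double quotes
# (\" inside the awk string prints a real quote character).
awk 'NR > 1 { print "l ./gcsw " $1 " lt all lset Data=\"" $2 "\" Value " $3 }' /home/gcsw/first.txt > /home/gcsw/second.txt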

6. Shell Programming and Scripting

Sed or awk : pattern selection based on special characters

Hello All, I am here again scratching my head on pattern selection with special characters. I have a large file having around 200 entries and i have to select a single line based on a pattern. I am able to do that: Code: cat mytest.txt | awk -F: '/myregex/ { print $2}' ... (6 Replies)
Discussion started by: usha rao
6 Replies
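Since the teaser is truncated, the sketch below just shows the general pattern: pass the search string in as a variable and compare it literally with index(), so characters like (, $ or * in it are never treated as regex syntax. "mysearch" stands in for the real value from the thread:

Code:
awk -F: -v key='mysearch' 'index($0, key) { print $2 }' mytest.txt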

7. UNIX for Dummies Questions & Answers

awk for removing special characters and extra commas

Hi, I have a .csv file which as empty lines with comma and some special characters in 3rd column as below. Source data 1,2,3,4,%#,6 ,,,,,, 1,2,3,4,5,6 Target Data 1,2,3,4,5,6I need to remove blank lines and special charcters I am trying to get this using the below awk awk -F","... (2 Replies)
Discussion started by: shruthidwh
2 Replies
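A sketch matching the sample as far as it can be inferred: drop lines that are blank or contain only commas, then strip characters other than alphanumerics and the separating commas. Whether the %# field should instead be repaired to 5 isn't clear from the post, so this just removes the stray characters:

Code:
# Skip blank / comma-only records, then delete everything except letters,
# digits and commas from what remains.
awk '!/^,*$/ { gsub(/[^A-Za-z0-9,]/, ""); print }' source.csv > target.csv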

8. Shell Programming and Scripting

awk match shell variable that contains special characters?

How to match a shell variable that contains parenthesis (and other special characters like "!") file.txt contains: Charles Dickens Matthew Lewis (writer) name="Matthew Lewis (writer)"; awk -v na="$name" ' $0 ~ na' file.txt Ideally this would match $name in file.txt (in this... (3 Replies)
Discussion started by: Mid Ocean
3 Replies
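The ~ operator treats na as an extended regular expression, so ( and ) group rather than match literally. Comparing with index() matches the variable as a plain string; single quotes around the shell assignment also keep ! away from history expansion in interactive shells:

Code:
name='Matthew Lewis (writer)'
# index() is > 0 whenever the literal string occurs anywhere on the line.
awk -v na="$name" 'index($0, na)' file.txt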

9. Shell Programming and Scripting

awk conditions failing (special characters?)

This is really frustrating because I can't figure it out. I'm running a health check script. One of the items I'm checking is the amount of memory on a server. I use the free command, which outputs something like this (excerpt) Mem: 100 100 100 100 Swap: 100 100 100 100 In my debugging... (5 Replies)
Discussion started by: JustaDude
5 Replies
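The failing condition isn't shown above, but a common trap with free output is comparing a field as a string. Anchoring on the Mem: row and forcing numeric context with +0 usually sorts it out; a sketch, noting that column positions can differ between procps versions:

Code:
# Print total and free memory from the Mem: line; +0 forces numeric values
# even if a field carries stray whitespace or carriage returns.
free | awk '/^Mem:/ { print "total:", $2 + 0, "free:", $4 + 0 }'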

10. Shell Programming and Scripting

Awk: split column if special characters

Hi, I've data like these: Gene1,Gene2 snp1 Gene3 snp2 Gene4 snp3 I'd like to split line if comma and then print remaining information for the respective gene. My code: awk '{ if($1 ~ /,/){ n = split($0, t, ",") (7 Replies)
Discussion started by: genome
7 Replies
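A completed version of the posted fragment, with the output format as an assumption: split $1 on commas whether or not a comma is present, and print each gene with its snp:

Code:
# split() returns the number of pieces, so the loop also handles the
# single-gene lines (n == 1) without a separate branch.
awk '{
    n = split($1, genes, ",")
    for (i = 1; i <= n; i++)
        print genes[i], $2
}' input.txt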