Operating Systems > Linux: matching pattern and replacement
Post 302200998 by Franklin52 on Friday 30th of May 2008 04:03:10 PM
Don't use a "$" before variables within the awk command; in awk, "$" refers to fields, not variables. Note also that a line is matched against a variable with "$0 !~ tn" (a literal /tn/ would match the text "tn"), and strings are concatenated by writing them side by side rather than with a ".":

Code:
awk -v tn="$TN" -v ts="$target_schema" '/^GRANT/ && $0 !~ tn {sub(ts, ts "VW_")} 1' "${target_schema}.${TN}.${ecmdate}.sql" > "$tmpfile_cln"
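
As a small aside, here is a minimal sketch of the same rules with made-up names (the variable word and the file data.txt are only for illustration): the variable is referenced by its bare name, a line is tested against it with ~ rather than a literal /.../ pattern, and strings are concatenated simply by writing them next to each other:

Code:
# print lines containing the value of "word", prefixed with a tag built by concatenation
awk -v word="foo" '$0 ~ word { print "tagged_" word ": " $0 }' data.txt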

Regards
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Pattern Replacement

There is a requirement that I need to replace a pattern with another pattern in all the files in my entire file system. There are 1000s of files in the system. Say the pattern is "calcuta". I have to replace this pattern with "kolkata" in all those files which contain "calcuta". I am only able to... (12 Replies)
Discussion started by: palash2k
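
A sketch of one common approach to the request above, assuming GNU sed and a placeholder directory /path/to/files (a .bak copy of each changed file is kept):

Code:
# list files containing the pattern, then edit them in place with a backup
grep -rl 'calcuta' /path/to/files | while IFS= read -r f; do sed -i.bak 's/calcuta/kolkata/g' "$f"; done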

2. Shell Programming and Scripting

comment/delete a particular pattern starting from second line of the matching pattern

Hi, I have file 1.txt with the following entries as shown: 0152364|134444|10.20.30.40|015236433 0233654|122555|10.20.30.50|023365433 ** ** ** In file 2.txt I have the following entries as shown: 0152364|134444|10.20.30.40|015236433 0233654|122555|10.20.30.50|023365433... (4 Replies)
Discussion started by: imas

3. Shell Programming and Scripting

counting the lines matching a pattern, in between two patterns, and generating a table

Hi all, I'm looking for some help. I have a file (very long) that is organized like below: >Cluster 0 0 283nt, >01_FRYJ6ZM12HMXZS... at +/99% 1 279nt, >01_FRYJ6ZM12HN12A... at +/99% 2 281nt, >01_FRYJ6ZM12HM4TS... at +/99% 3 283nt, >01_FRYJ6ZM12HM946... at +/99% 4 279nt,... (4 Replies)
Discussion started by: d.chauliac
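For output shaped like the sample above, one possible awk sketch to tabulate the number of member lines per '>Cluster' block (the file name clusters.txt is hypothetical):

Code:
# print "cluster-id <tab> member-count" for each >Cluster block
awk '/^>Cluster/ { if (id != "") print id "\t" n; id = $2; n = 0; next } { n++ } END { if (id != "") print id "\t" n }' clusters.txt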

4. UNIX for Dummies Questions & Answers

awk pattern replacement

Hi, I'm a newbie in Unix and I'm having trouble creating a script. I want to search for a pattern '_good' and insert new lines that contain '_bad', '_med', '_fail', while also ensuring that the line containing _good is removed. Here is some of the data: UPDATE SCHOOL SET GRADE =... (1 Reply)
Discussion started by: sexyTrojan
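
If the intent is to turn every line containing '_good' into three copies carrying '_bad', '_med' and '_fail' and drop the original, one hedged awk sketch (script.sql is a placeholder file name) could be:

Code:
# for each _good line, print three substituted variants and skip the original; pass other lines through
awk 'BEGIN { split("_bad _med _fail", v) } /_good/ { for (i = 1; i <= 3; i++) { s = $0; sub(/_good/, v[i], s); print s } next } 1' script.sql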

5. Shell Programming and Scripting

Sed for selective pattern replacement

Hi, I have a code snippet: grant permission to all user sts|ln|uSe|PSG sajncht|se|Use|PPSPSG psg|ln|use|TSPSG sts_user.Me revoke I need to change all occurrences of use (uSe, Use, use) to USE. I am using the following sed command for this: sed 's//USE/g' s_sample.txt Output: (7 Replies)
Discussion started by: sudeep.id
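
The posted sed command has lost its search pattern; one portable sketch that catches the three spellings between the '|' delimiters in the sample data might be:

Code:
# replace |uSe|, |Use| or |use| (any casing) with |USE|
sed 's/|[uU][sS][eE]|/|USE|/g' s_sample.txt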

6. Shell Programming and Scripting

Sed: printing lines AFTER pattern matching EXCLUDING the line containing the pattern

Hi, I'm using the following code to extract the lines (and redirect them to a txt file) after the pattern match. But the output is inclusive of the line with the pattern match. Which option should be used to exclude the line containing the pattern? sed -n '/Conn.*User/,$p' > consumers.txt (11 Replies)
Discussion started by: essem
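
A common answer to the question above is to delete everything up to and including the matching line instead of printing from it; a hedged sketch (logfile is a placeholder for the real input, and this assumes the pattern does not occur on the first line):

Code:
# drop lines 1 through the first /Conn.*User/ match, keep the rest
sed '1,/Conn.*User/d' logfile > consumers.txt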

7. Shell Programming and Scripting

PHP - Regex for matching string containing pattern but without pattern itself

The sample file: dept1: user1,user2,user3 dept2: user4,user5,user6 dept3: user7,user8,user9 I want to match by '/^dept2.*/' but don't want to have the substring 'dept2:' in the output. How do I compose such a regex? (8 Replies)
Discussion started by: urello
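
The thread asks about PHP, but the underlying idea, sketched here in shell with sed for consistency (dept.txt is a made-up file name), is to print only the matching lines with the prefix stripped; in a PCRE-based language the same effect comes from a capturing group after the prefix:

Code:
# print only lines starting with "dept2:", with that prefix removed
sed -n 's/^dept2: *//p' dept.txt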

8. Shell Programming and Scripting

Pattern Matching and replacement

Hello everybody, I need help with the below pattern matching and replacement issue. I have a file: emp.txt 21356 suresh 12/12/2012 23511 ramesh 11/06/2011 31456 biswajit 09/08/2013 53134 archan 06/02/2009 The first field is the employee id, the 2nd field is the name and the third field is the date of joining ... (10 Replies)
Discussion started by: shellscripting

9. Shell Programming and Scripting

Help with Pattern Matching and replacement in Gz files

Hi Techies, I need help finding junk characters and removing them from a data file. We have a file with crores of records like the one below: SGSN_MCC_MNC=01150 But sometimes, due to an issue with the sending server, we get some junk characters in the middle of the data, like below ... (6 Replies)
Discussion started by: mahi_mayu069
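
The exact junk bytes are not shown in the preview, but a generic cleanup sketch for a gzipped data file (data.gz and data_clean.gz are placeholders) is to decompress, drop anything that is not a printable character, tab or newline, and recompress:

Code:
# strip non-printable bytes from a gzipped file
zcat data.gz | tr -cd '[:print:]\t\n' | gzip > data_clean.gz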

10. UNIX for Dummies Questions & Answers

Grep -v lines starting with pattern 1 and not matching pattern 2

Hi all! Thanks for taking the time to view this! I want to grep out all lines of a file that start with pattern 1 but also do not match the second pattern. Example: Drink a soda Eat a banana Eat multiple bananas Drink an apple juice Eat an apple Eat multiple apples I... (8 Replies)
Discussion started by: demmel
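
For the example above, assuming pattern 1 is lines starting with 'Eat' and pattern 2 is 'apple' (and food.txt is a made-up file name), a straightforward sketch is either two chained greps or a single awk:

Code:
# lines that start with "Eat" and do not contain "apple"
grep '^Eat' food.txt | grep -v 'apple'
awk '/^Eat/ && !/apple/' food.txt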