It needs to be the exact phrase, or a regular expression in quotes.
For example:
returns nothing, since no line consists of literally just the text Warning and nothing else. Since the phrase is never found, we get no output beyond the first line of the CSV, which defines the fields.
If I make that a regular expression, however:
it works fine. We need the wildcard * because otherwise we're looking only for lines that contain exactly the string we're searching for, not that string followed by zero or more other characters.
Lastly, we can of course use the full search phrase without wildcards and get the same output:
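To make the exact-phrase versus regex distinction concrete, here is a minimal sketch using plain grep (the original search tool and CSV are not shown, so the file name and contents below are invented for illustration). `grep -x` requires the whole line to match the pattern, which is the "exact phrase" behaviour; adding `.*` turns it into "the string followed by zero or more other characters":

```shell
# events.csv is an invented example; the original CSV is not shown.
cat > /tmp/events.csv <<'EOF'
message,time
Warning: disk nearly full,02:32:16
Info: backup done,02:33:02
EOF

# Whole-line exact match: no line is literally just "Warning", so no hits.
exact=$(grep -cx 'Warning' /tmp/events.csv || true)

# Whole-line regex: ".*" matches the rest of the line, so this hits.
regex=$(grep -cx 'Warning.*' /tmp/events.csv || true)

printf 'exact=%s regex=%s\n' "$exact" "$regex"
```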
Hmm, I don't understand that. It would make sense as a response if you had typed it with the left-hand quote missing (the shell would then interpret the ! symbol as a request to repeat a command called " from your Bash history), but with both quotes present that shouldn't happen.
It seems that when I escape the exclamation mark, it works fine. Anyhow, thanks a lot for your patience in writing all these scripts for me. You have definitely made things a lot easier for me.
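For reference, a small sketch of the `!` behaviour being discussed. History expansion is an interactive-shell feature, and in Bash it fires even inside double quotes; inside a script it is already disabled, which is why the lines below are safe to run:

```shell
# In an *interactive* bash session,  echo "Done!"  can fail with
# "event not found" because ! triggers history expansion even inside
# double quotes. Escaping with \! suppresses the expansion, but bash
# keeps the backslash inside double quotes, so "Done\!" prints Done\!
# Two clean alternatives:
echo 'Done!'       # single quotes always suppress history expansion
echo "Done"'!'     # keep the ! outside the double quotes

# In an interactive shell you can also disable history expansion with:
#   set +H
```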
Dear friends
I have a big file and I want to export the file with a new column for the lines that have the same duplicate value in the first column, e.g.:
-bash-3.00$ cat INTCONT-IS.CSV
M205-00-106_AMDRN:1-0-6-22,12-662-4833,intContact,2016-11-15 02:32:16,50... (9 Replies)
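A possible two-pass awk sketch for this, assuming the wanted new column is the number of times each first-column key occurs (so duplicated lines get a value greater than 1). The sample data below is a cut-down stand-in for the real INTCONT-IS.CSV:

```shell
# Small stand-in for INTCONT-IS.CSV; the real file is much larger.
cat > /tmp/INTCONT-IS.CSV <<'EOF'
M205-00-106,12-662-4833,intContact
M205-00-107,12-662-4834,intContact
M205-00-106,12-662-4999,intContact
EOF

# Pass 1 (NR==FNR) counts each first-column key; pass 2 appends the count
# as a new last column.
awk -F, 'NR==FNR {cnt[$1]++; next} {print $0 "," cnt[$1]}' \
    /tmp/INTCONT-IS.CSV /tmp/INTCONT-IS.CSV > /tmp/with_counts.csv
cat /tmp/with_counts.csv
```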
Hi All,
hope you all are doing well!
I kindly ask you for shell scripting help, here is the description:
I have a huge number of files, arranged date-wise as shown below, which contain different strings (numbers, you could say) including 505001 and 602001.
... (14 Replies)
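The post is truncated, so the goal is not fully stated; assuming it is a per-file count of the two codes, a loop over the date-wise files could look like this (file names invented for illustration):

```shell
# Invented stand-ins for the date-wise files mentioned in the post.
printf '505001\n602001\n505001\n' > /tmp/2016-11-14.log
printf '602001\n'                 > /tmp/2016-11-15.log

# Count occurrences of each code in every file.
for f in /tmp/2016-11-1[45].log; do
    a=$(grep -c '505001' "$f" || true)
    b=$(grep -c '602001' "$f" || true)
    printf '%s 505001=%s 602001=%s\n' "$f" "$a" "$b"
done
```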
Hi All,
I'm stuck trying to find a way to skip a delimiter that comes within double quotes, using awk or any other better option. Can someone please help me out?
Below are the details:
Delimiter: |
Sample data: 742433154|"SYN|THESIS MED CHEM PTY.... (2 Replies)
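One portable sketch (gawk's FPAT feature handles this more directly, if gawk is available): hide the pipes that sit inside double quotes behind an unused control character, split on the remaining pipes, then restore:

```shell
line='742433154|"SYN|THESIS MED CHEM PTY."|AU'

field2=$(printf '%s\n' "$line" |
awk -F'"' -v OFS='"' '{
    # Even-numbered "-separated chunks are the ones inside quotes.
    for (i = 2; i <= NF; i += 2) gsub(/\|/, "\036", $i)
    $1 = $1   # force $0 to be rebuilt with the hidden pipes
    print
}' |
awk -F'|' '{ f = $2; gsub("\036", "|", f); print f }')

printf '%s\n' "$field2"
```

The placeholder \036 (ASCII record separator) is just an assumption of a character that never appears in the data; any unused byte works.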
Hi All,
I need to extract duplicate rows from a file and write these bad records into another file, and I need a count of these bad records.
I have a command:
awk '
{s[$0]++}
END {
for(i in s) {
if(s[i]>1) {
print i
}
}
}' ${TMP_DUPE_RECS} >> ${TMP_BAD_DATA_DUPE_RECS}... (5 Replies)
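A runnable sketch of that approach, with the count of bad records added (the file paths here are stand-ins for the post's ${TMP_DUPE_RECS} variables; whether "count" should mean distinct duplicated records or total occurrences is an assumption, total occurrences is used below):

```shell
TMP_DUPE_RECS=/tmp/dupe_in.txt
TMP_BAD_DATA_DUPE_RECS=/tmp/dupe_out.txt
printf 'a\nb\na\nc\nb\na\n' > "$TMP_DUPE_RECS"

# Count each whole record; after reading, print every record seen more
# than once and total up how many duplicate occurrences there were.
awk '{ s[$0]++ }
     END {
       for (i in s)
         if (s[i] > 1) {
           print i
           bad += s[i]   # total occurrences of duplicated records
         }
       print "bad record count: " bad > "/dev/stderr"
     }' "$TMP_DUPE_RECS" >> "$TMP_BAD_DATA_DUPE_RECS"
cat "$TMP_BAD_DATA_DUPE_RECS"
```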
I am very new to shell scripting and am currently trying to sort a text file by paragraphs with a ksh script.
example:
File content:
A1100001 line 1 = "testing"
line 2 = something,
line 3 = 100
D1200003 line 1 = "testing"
line 2 = something,
line 3 = 100
B1200003 line 1 =... (3 Replies)
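One common sketch for sorting fixed-shape blocks like these (assuming each block is exactly a header line starting with a letter-plus-digits ID followed by its continuation lines): join each block onto one line with an unused separator, sort, then split back:

```shell
cat > /tmp/blocks.txt <<'EOF'
A1100001 line 1 = "testing"
line 2 = something,
line 3 = 100
D1200003 line 1 = "testing"
line 2 = something,
line 3 = 100
B1200003 line 1 = "testing"
line 2 = something,
line 3 = 100
EOF

# Lines starting with an ID begin a new block; glue continuation lines on
# with \001, sort the one-line blocks, then restore the newlines.
out=$(awk '/^[A-Z][0-9]+/ { if (buf) print buf; buf = $0; next }
           { buf = buf "\001" $0 }
           END { if (buf) print buf }' /tmp/blocks.txt |
      sort |
      tr '\001' '\n')
printf '%s\n' "$out"
```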
Hi all,
I want to extract some paragraphs out of a file under certain conditions.
- The paragraph must start with 'fmri'
- The paragraph must contain the string 'restarter svc:/system/svc/restarter:default'
My input is like that :
fmri svc:/system/vxpbx:default
state_time Wed... (4 Replies)
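awk's paragraph mode fits this well: with RS set to the empty string, each blank-line-separated block is one record, so both conditions apply to the whole paragraph. The input below is invented to match the post's shape:

```shell
cat > /tmp/svcs.txt <<'EOF'
fmri         svc:/system/vxpbx:default
state_time   Wed Nov 15 02:32:16 2016
restarter    svc:/system/svc/restarter:default

fmri         svc:/network/ssh:default
state_time   Wed Nov 15 02:33:02 2016
EOF

# RS="" = paragraph mode: print paragraphs that start with "fmri" AND
# contain the wanted restarter string.
out=$(awk -v RS= '/^fmri/ && /restarter +svc:\/system\/svc\/restarter:default/' /tmp/svcs.txt)
printf '%s\n' "$out"
```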
Hi,
I searched the whole forum, but I cannot find a solution to my problem. :(
I have multiple files (5000 files); inside there is this data:
FILE 1:
1195.921 -898.995 0.750312E-02-0.497526E-02 0.195382E-05 0.609417E-05
-2021.287 1305.479-0.819754E-02 0.107572E-01 0.313018E-05 0.885066E-05
... (15 Replies)
Hello everybody!
I am quite new here and hope you can help me.
Using an awk script I am trying to extract data from several files. The structure of the input files is as follows:
TimeStep parameter1 parameter2 parameter3 parameter4
e.g.
1 X Y Z L
1 D H Z I
1 H Y E W
2 D H G F
2 R... (2 Replies)
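Since the extraction goal is truncated, here is one hedged sketch: splitting the rows into one output file per TimeStep value (column 1), which is a common first step for this file shape. The data and file names are invented:

```shell
cat > /tmp/steps.dat <<'EOF'
1 X Y Z L
1 D H Z I
1 H Y E W
2 D H G F
2 R Q S T
EOF

# One output file per TimeStep value; awk keeps each file open and
# appends subsequent rows with the same first column.
awk '{ print > ("/tmp/step_" $1 ".dat") }' /tmp/steps.dat
cat /tmp/step_2.dat
```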
Hi,
I am trying to filter out those paragraphs that contain 'CONNECT' or 'alter system switch logfile'. Say, for example, the input file is:
-------------------------------------------------------
Wed Jun 7 00:32:31 2006
ACTION : 'CONNECT'
CLIENT USER: prdadm
CLIENT TERMINAL:
Wed Jun 7... (7 Replies)
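Because these log entries are separated by timestamp lines rather than blank lines, paragraph mode does not apply directly; one sketch is to buffer each entry and print it only if a wanted string was seen (this assumes "filter out" means "keep the matching paragraphs"; swap the logic to exclude them instead). The log content below is invented to match the post:

```shell
cat > /tmp/audit.log <<'EOF'
Wed Jun 7 00:32:31 2006
ACTION : 'CONNECT'
CLIENT USER: prdadm
CLIENT TERMINAL:
Wed Jun 7 00:40:00 2006
ACTION : 'alter system switch logfile'
Wed Jun 7 00:45:12 2006
ACTION : 'SELECT'
EOF

# A weekday-name line starts a new entry: flush the previous buffer if it
# matched, then accumulate lines and set the flag on a match.
out=$(awk '
  /^(Mon|Tue|Wed|Thu|Fri|Sat|Sun) / { if (keep) printf "%s", buf
                                      buf = ""; keep = 0 }
  { buf = buf $0 "\n" }
  /CONNECT|alter system switch logfile/ { keep = 1 }
  END { if (keep) printf "%s", buf }' /tmp/audit.log)
printf '%s\n' "$out"
```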
Hi all,
I have been in confusion for the last 2 days :(
I posted a thread yesterday and some friends did help, but I still couldn't get a solution to my problem.
Let me be very clear:
I have a long log file from an Alcatel switch, and I have to separate the minor, major and critical alarms, shown by !, !! and !!!... (6 Replies)
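A sketch of routing alarm lines to three files by their leading exclamation marks (the log format and output file names are invented; the key point is that the most specific pattern must be tested first, since /^!/ would also match !! and !!! lines):

```shell
cat > /tmp/alarms.log <<'EOF'
!   ALM minor fan speed
!!  ALM major link down
!!! ALM critical power fail
!   ALM minor temp high
EOF

# Test !!! before !! before ! -- next skips the remaining patterns.
awk '/^!!!/ { print > "/tmp/critical.log"; next }
     /^!!/  { print > "/tmp/major.log";    next }
     /^!/   { print > "/tmp/minor.log" }' /tmp/alarms.log
wc -l /tmp/minor.log /tmp/major.log /tmp/critical.log
```

Note that inside a script the ! characters need no escaping; the "event not found" problem discussed earlier only affects interactive shells.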