Search Results

Search: Posts Made By: neelmani
2,149
Posted By neelmani
@scrutinizer thanks a lot. suppose we have...
@scrutinizer
thanks a lot.
Suppose I have to perform my search on a folder, like:

sed -n '/(/{/(.*select/!N;//p;}' path_name/*

I am getting all the matching lines without the file name. Could...
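
One way I can think of to get the file name in front of each match (my own sketch, not from the thread; it keeps the same sed expression and assumes a POSIX shell) is to loop over the files and prefix the output:

for f in path_name/*; do
    # same sed as above, then prepend the current file name to every printed line
    # (simple prefix; assumes the file name has no sed-special characters)
    sed -n '/(/{/(.*select/!N;//p;}' "$f" | sed "s|^|$f: |"
done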
2,149
Posted By neelmani
@Scrutinizer: thanks for your reply.. could you...
@Scrutinizer: thanks for your reply.. could you please explain the command below (especially the part that was highlighted in red)?

sed -n '/(/{/(.*select/!N;//p;}' infile
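
For reference, my own reading of that one-liner (an interpretation, not Scrutinizer's reply):

# sed -n '/(/{/(.*select/!N;//p;}' infile
#   /(/            act only on lines containing "("
#   {              start a command group for those lines
#   /(.*select/!N  if "(" and "select" are NOT both on this line already,
#                  append the next input line to the pattern space (N)
#   //p            the empty // reuses the last regex, i.e. "(.*select";
#                  if the (possibly two-line) pattern space matches, print it
#   }              end of the group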

@sinari
Hey thanks for...
2,149
Posted By neelmani
@scrutinizer + sinari.. thanks for your reply. The...
@scrutinizer + sinari.. thanks for your reply. The sed command given by you is working for b.txt. But it will not work in all cases to catch a two-word keyword (here the words are "(" and "select")...
2,149
Posted By neelmani
@spacebar Thanks for your reply, but I don't want...
@spacebar Thanks for your reply, but I don't want everything to be listed out. I want to catch a two-word keyword (here the words are "(" and "select") which may be separated by a newline (maybe separated by ...
2,149
Posted By neelmani
How to catch a two-word keyword which may contain a newline (possibly with spaces or tabs) in it?
How to catch a two-word keyword which may contain a newline (possibly with spaces or tabs) in it?
For example, there is a file a.txt:

$more a.txt
create view
as
(select from
............
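
A possible awk sketch for this (my own attempt; it assumes "(" and "select" are at most one line apart, as in a.txt):

awk '/\(.*select/ { print; next }                   # both words on one line
     prev ~ /\(/ && /select/ { print prev; print }  # split across two lines
     { prev = $0 }' a.txt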
3,241
Posted By neelmani
How to ignore multiline comments in a file while reading it
Hi friends, I want to ignore single-line and multiline comments (enclosed by "/* */") in a file while reading it. I am using the code below.


nawk '/\/\*/{f=1} /\*\//{f=0;next} !f' proc.txt | while...
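
For readability, the same nawk filter spelled out with comments (just a restatement of the command above; it assumes /* and */ never share a line with code that has to be kept):

nawk '
    /\/\*/ { f = 1 }        # saw "/*": start skipping (this line included)
    /\*\// { f = 0; next }  # saw "*/": stop skipping and drop this line too
    !f                      # print lines only while not inside a comment
' proc.txt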
19,955
Posted By neelmani
i checked with the below command, its working: ...
I checked with the command below; it's working:

> dos2unix file.sql file.sql


Previously I was not giving the target filename with the dos2unix command.

thanks for your time
19,955
Posted By neelmani
hey sorry its working but what if i want to keep...
Hey, sorry, it's working, but what if I want to keep the same file name?


@: > tr -d '\015' < y > y




It's generating an empty y file.
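
That is expected: the shell truncates y for the "> y" redirection before tr ever reads it, so tr sees an empty file. The usual workaround (a sketch) goes through a temporary file:

tr -d '\015' < y > y.tmp && mv y.tmp y   # y.tmp is just a scratch name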
19,955
Posted By neelmani
when i am using the above cmd i am getiing the...
When I am using the above command I am getting the following message:

tr -d '\015' y.sql x.sql
Usage: tr [ -cds ] [ String1 [ String2 ] ]


:(
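
tr reads standard input and writes standard output only, so the file names have to go through redirection rather than being passed as arguments. A sketch with the same names:

tr -d '\015' < y.sql > x.sql   # read y.sql, write the cleaned copy to x.sql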
19,955
Posted By neelmani
Problem with the dos2unix command
Hi friends, I am using the dos2unix command to remove the ^M characters at the end of each line, but I am getting the following message:


> dos2unix file.sql
could not open /dev/kbd to get...
4,802
Posted By neelmani
thanks. one issue insted of hardcoding it, if i...
Thanks. One issue: instead of hardcoding it, suppose I am storing the "as" keyword in a variable and searching with it.
For example:
word="as"


awk '/^\$word$/{print p}{p=$0}' file


not getting anything...
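
Inside the single quotes the shell never expands $word, so awk is looking for the literal text. One common fix (a sketch; the variable name w is my own) is to pass the value in with -v:

word="as"
awk -v w="$word" '$0 == w { print p } { p = $0 }' file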
4,802
Posted By neelmani
It's working!! Could you please explain the "p=$0"...
It's working!! Could you please explain the "p=$0" part? Thanks.
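
My reading of that one-liner (assuming the working version was the hardcoded /^as$/ form):

# awk '/^as$/{print p}{p=$0}' file
#   /^as$/ { print p }   when the current line is exactly "as", print p,
#                        which still holds the line read just before it
#   { p = $0 }           then, for every line, save the current line in p
#                        so it is available on the next pass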
4,802
Posted By neelmani
issue with grep -B option
Hi friends,
I have a file where every word is on its own line. For example:
more file1:


I want to fetch the previous line wherever I find "as" as a keyword.
I tried at home the...
2,557
Posted By neelmani
@scrutinizer : thanks its working. :) may i know ...
@scrutinizer: thanks, it's working. :) May I know the reason behind writing the full path?
2,557
Posted By neelmani
but that option does not seems to be working in...
But that option does not seem to be working in ksh88 :(

> echo "sachin#tendulkar" | grep -x "sachin"
grep: illegal option -- x
Usage: grep -hblcnsviw pattern file . . .


I don't know what...
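
Where grep has no -x, anchoring the pattern by hand usually does the same job (a sketch):

echo "sachin#tendulkar" | grep '^sachin$'   # no output: the whole line is not exactly "sachin"
echo "sachin" | grep '^sachin$'             # prints sachin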
2,557
Posted By neelmani
Exact match and #
Hi friends,
I am using the following grep command for an exact word match:

>echo "sachin#tendulkar" | grep -iw "sachin"
output: sachin#tendulkar


As we can see in the above example, it's...
959
Posted By neelmani
thanks for your reply bartus. but i am not...
Thanks for your reply, bartus, but I am not familiar with the perl command. Will it work in ksh88?? I would really appreciate it if anyone could give me an awk or sed or similar command. Thank you.
959
Posted By neelmani
How to get a word which falls after a keyword, separated by a comma
Hi friends,
I have a file which contains all the words (including commas) on separate lines.
For example:
more file.txt

select
column1
from
table1
,
table2
join
table3
0n
condition

i...
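
If the goal is just the word that follows a lone comma (my guess at the intent), the remember-the-previous-line idea from the other threads works (a sketch):

awk 'prev == "," { print }    # the current line follows a line holding only ","
     { prev = $0 }' file.txt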
15,179
Posted By neelmani
Problem with tr command
Hi friends,
Today I found one strange behaviour of the tr command.
I used the following command:

echo "NEE"|tr [A-Z] [a-z]

Sometimes it was giving "nee" as output; sometimes it was giving...
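
A likely culprit, given that it only fails sometimes (my guess): the unquoted [A-Z] and [a-z] are shell globs, so if a single-character file name in the current directory happens to match, the shell substitutes the file name and tr receives the wrong arguments. Quoting removes the ambiguity:

echo "NEE" | tr '[A-Z]' '[a-z]'   # quoted, so the shell leaves the ranges alone
echo "NEE" | tr 'A-Z' 'a-z'       # the brackets are not needed at all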
2,425
Posted By neelmani
Problem with my wrapper script
Hi friends,
I am working in ksh88. I am running the following wrapper script in the background to run two jobs in parallel (e.g. nohup wrapper.ksh &):

wrapper.ksh...
1,231
Posted By neelmani
Thanks a lot :)
Thanks a lot :)
1,231
Posted By neelmani
@corona thanks for your reply. i have two more...
@corona thanks for your reply. I have two more questions: 1> For the above solution, do all four scripts need to be in the same folder? 2> Will it work in the Korn shell?
1,231
Posted By neelmani
How to run a script in the background within a script
Hi friends,
I have two scripts (call them bg1.ksh and bg2.ksh) which need to be run in parallel in the background from a script (call it test1.ksh). Then I have to wait till these two background...
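
The usual shape in ksh88 is to start both scripts with & and then wait (a sketch using the names from the post; the ./ prefix is only a placeholder, any path to the scripts works):

#!/bin/ksh
# test1.ksh - sketch
./bg1.ksh &   # first job in the background
./bg2.ksh &   # second job in the background
wait          # block until both background jobs have finished
echo "both background jobs done"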
2,188
Posted By neelmani
@glev2005 not working i am getting same error as...
@glev2005 not working, I am getting the same error: fgrep: illegal option -- R.
Actually the -R option is not there in fgrep.. any other way??
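
Where fgrep/grep have no -R, find can supply the recursion (a sketch; /some/dir is a placeholder and procnamelist is the keyword file from the original post below, one keyword per line):

find /some/dir -type f -exec fgrep -f procnamelist /dev/null {} \;
# the extra /dev/null makes fgrep print the matching file's name as well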
2,188
Posted By neelmani
How to recursively search for a list of keywords in a given directory?
Hi all,
how to recursively search for a list of keywords in a given directory??

for example:
Suppose I have kept all the keywords in a file called "procnamelist" (one per line),
and I have...
Showing results 1 to 25 of 48

 