Search Results

Search: Posts Made By: chetanojha
1,082
Posted By anbu23
file_path=${output_dir}
files_list=`ls ${output_dir}/*MIGRATE*${city_name}*.out`
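A quoting-safe sketch of the same listing, using $(...) instead of backticks (the variables are the ones from the post):

file_path=${output_dir}
# quote the variable parts so whitespace in the expansions survives
files_list=$(ls "${output_dir}"/*MIGRATE*"${city_name}"*.out)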
2,462
Posted By RudiC
Wouldn't it have been nice to see your attempt with awk?

Howsoever, try
awk '
NR == FNR {TR[$1] = $2
SQ[NR] = $1
next
}
FNR == 1 ...
2,462
Posted By RudiC
Try
awk '
NR == FNR {TR[$1] = $2
SQ[NR] = $1
next
}
FNR == 1 {CITY = $3
for (n=split(PLINES, TMP); n; n--)...
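The NR == FNR line above is the standard two-file lookup idiom: while awk reads the first file, the global record counter NR equals the per-file counter FNR, so those lines populate the TR table; the later file is then translated against it. A minimal self-contained sketch (map.txt and data.txt are hypothetical names):

# build key->value pairs from map.txt, then replace field 2 in data.txt
awk 'NR == FNR {TR[$1] = $2; next} $1 in TR {$2 = TR[$1]} 1' map.txt data.txt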
2,462
Posted By Aia
$ awk 'FNR == NR{v[$1]=$2; next} FNR==1 {n=$2=v[$3]+1} FNR==2 || $3=="All_Total" {$2=n}1' FS=: file1 FS=\| OFS=\| file2
Sale|11|London|3|28022018
Approval|11
Agent1|12|abc|abc|
Agent2|12|abc|abc|...
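Note the trick in this one-liner: awk processes var=value operands when it reaches them, so FS=: applies while reading file1 and FS=\| while reading file2. The same mechanism, stripped down (a.txt and b.txt are hypothetical names):

# a.txt is colon-separated, b.txt is pipe-separated
awk '{print $1}' FS=: a.txt FS='|' b.txt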
2,462
Posted By RudiC
If matching "Final" for the last line is OK, try
awk '
NR == FNR {TR[$1] = $2
next
}
FNR == 1 {CITY = $3
for (n=split(PLINES,...
2,462
Posted By Aia
Would it be possible for you to post an example representing your actual input and output file?

It would help, as well, if you could pin-point what you have tried.
2,192
Posted By Don Cragun
Actually, you can do this in awk without a temporary file because this is a special case:
awk '{sub(/\r/,""); print $NF > FILENAME; exit}' Input_file
The special case works only when awk has...
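A quick demonstration of why the special case is safe for a one-line file: awk has already read the line into memory before print > FILENAME truncates the file (a sketch; the Input_file name is the one from the post):

printf 'one two three\r\n' > Input_file
awk '{sub(/\r/,""); print $NF > FILENAME; exit}' Input_file
cat Input_file   # the file now holds just: three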
2,192
Posted By RavinderSingh13
Hello chetanojha,

The command you ran is again putting the output into the same Input_file; note that awk has no built-in in-place edit facility. You could try the following instead.

awk...
2,273
Posted By durden_tyler
Yup, my bad. The blank line was the issue.
An alternative solution using Perl invoked from a Bash shell script is as follows:


$
$ cat -n dataset
1 ...
2,273
Posted By vgersh99
Modifying Don's approach with the new data file input format:

#!/usr/bin/bash

printf '%s\n' "$@" |
awk -F, '
NR == FNR {
    keys[$0]
    next
}
$2 ~ "^[0-9][0-9]*$" {inv = $2; next}
{
...
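The bare keys[$0] above is the set-membership idiom: merely referencing an array element creates it, so every line of the first input becomes a lookup key. Reduced to its core (wanted.txt and data.csv are hypothetical names):

# print data.csv rows whose first field appeared as a line in wanted.txt
awk -F, 'NR == FNR {keys[$0]; next} $1 in keys' wanted.txt data.csv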
2,273
Posted By RudiC
As always, it pays off to do your posting VERY carefully! You can see the difficulties that arise from just dropping an empty line!

Adapting Don Cragun's proposal to your new structures, try
awk -F, -v...
2,273
Posted By durden_tyler
I agree with RudiC that the End-Of-Line (EOL) characters in your data file are, most likely, non-standard. If you are on Unix/Linux, they should be "\n".
Can you show the EOL characters in your...
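One portable way to make the EOL bytes visible is od; a DOS-terminated file shows \r \n at each line end (datafile is a placeholder name):

od -c datafile | head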
2,273
Posted By RudiC
This is what I get:
990001 MICHELIN
990002 PIRELLI
990002 FORD
I suspect your input file has a non-standard structure, like DOS line terminators (<CR>, \r, 0x0D) for instance, which leads to the...
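If DOS terminators are confirmed, stripping them takes one line (the file names are placeholders):

tr -d '\r' < dosfile > unixfile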
2,273
Posted By RudiC
Please consider the space between "-" and "dataset" in Don Cragun's proposal. "-" designates the stdin file descriptor.

1. specify several file names in lieu of just "dataset". Please be aware...
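With - among the operands, awk reads stdin at that position, so the search strings can be piped in ahead of the data file. A sketch, assuming the proposal is saved as script.awk (a hypothetical name):

printf '%s\n' 990002 | awk -f script.awk - dataset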
2,273
Posted By Don Cragun
If you want to search for multiple strings on one invocation of your script or if you want to print the 2nd field from C1 lines even if a given search string is also found on a C1 (as well as on a...
1,000
Posted By Scrutinizer
The single quotes prevent the shell variables from being expanded. A better method is to use awk variables and pass shell variables to them using the -v option:

Try something like this:
awk -v...
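For illustration, a minimal sketch of the -v technique against the dataset shown above (pattern and pat are hypothetical names):

pattern="MICHELIN"
awk -v pat="$pattern" '$2 == pat' dataset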
21,337
Posted By RudiC
Try
awk ' {gsub (/[|~]/,",")
gsub (/“|”/, "\047")
}
NR==1 {gsub (/\047/,"")
TMP = "insert into EMPLOYEE (" $0 ") values ("
...
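The \047 seen here is the octal escape for a single quote, which sidesteps quoting problems inside a single-quoted awk program:

awk 'BEGIN{print "\047quoted\047"}'   # prints: 'quoted'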
21,337
Posted By Akshay Hegde
Or try this.

if sep is ~
awk -vFS="\\\~" -vtable="Employee" -f import.awk infile

else if sep is |
awk -vFS="\\\|" -vtable="Employee" -f import.awk infile
21,337
Posted By Akshay Hegde
OK, use this

function clean(x){ gsub(/[”“\047\042]/,x); gsub(FS,",") }
FNR==1{ clean(); h = $0 ; next}
{
clean("\047");
print "insert into",table,"("h")","values","("$0");"
}

...
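Assuming the block above is saved as import.awk, it would be invoked like the earlier suggestion in this thread:

awk -vFS="\\\|" -vtable="Employee" -f import.awk infile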
21,337
Posted By pravin27
Modified code
#!/bin/sh

_fileName=$1
_tableName=$(basename ${_fileName})
_SQLFile=${_tableName}".sql"

tr "|" "," < ${_fileName} | awk -F"," -v tableNM=$_tableName -v sqlfn=${_SQLFile}...
21,337
Posted By RudiC
This is what I get from above:
insert into Employee (Name,Dept,Empno,Salary,DOB,DOJ) values ('''Bell''' ,'''12''','''400''','''$2000''','''31/01/1965''','''01/10/1999''');
Don't forget, those ”“ are...
21,337
Posted By Akshay Hegde
This might help you

akshay@Aix:/tmp$ cat infile
“Name”|”Dept”|”Empno”|”Salary”|”DOB”|”DOJ”
“Alexander”|”10”|”200”|”$1000”|”25/05/1977”|”01/01/2015”...
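Those fields carry curly quotes (“ ”), not ASCII quotes, which is why a plain gsub(/"/,"") misses them. A sketch that normalises them first, following the gsub(/“|”/, ...) pattern shown earlier (a UTF-8 locale is needed for the multibyte match):

awk '{gsub(/“|”/,"\042")} 1' infile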
21,337
Posted By RudiC
Sorry, I can't read your post. Did you try "Go Advanced" and, from there, "Preview Post"?

However, put it into the general action before the NR==1 pattern, like
awk ' {gsub (/\|/,",")
...
21,337
Posted By RudiC
For that, try
awk ' {gsub (/\|/,",")
}
NR==1 {gsub (/"/,"")
TMP = "insert into EMPLOYEE (" $0 ") values ("
next
...
21,337
Posted By pravin27
Could this help you?

#!/bin/sh

_fileName=$1
_tableName=$(basename ${_fileName})
_SQLFile=${_tableName}".sql"

tr "|" "," < ${_fileName} | awk -F"," -v tableNM=$_tableName -v...
Showing results 1 to 25 of 26

 