Search Results

Search: Posts Made By: nikhil jain
5,608
Posted By nikhil jain
Thanks Corona for your reply. Can the problem...
Thanks Corona for your reply.
Can the problem statement be addressed differently, without using expect?
5,608
Posted By nikhil jain
Help with Expect Utility
Hi all,

I have a requirement wherein I need to read a file in a while loop and run a command on each line, as shown in the code below.


while read -r line              # -r stops read from mangling backslashes
do
    command $line               # left unquoted on purpose: each line is split into arguments
done < list.txt


But after every command it...
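The preview cuts off here, but the reply above asks whether expect can be avoided, which suggests the command prompts for input after every run. A minimal sketch of both routes, assuming a yes/no confirmation prompt; the prompt text "Are you sure" and the "yes" answer are assumptions, while list.txt and the command placeholder come from the post:

while read -r line
do
    expect -c "
        spawn command $line
        expect \"Are you sure\"     ;# assumed prompt text
        send \"yes\r\"
        expect eof
    "
done < list.txt

# Without expect, the same answer can often just be piped in,
# provided the command reads its confirmation from stdin:
# while read -r line; do printf 'yes\n' | command $line; done < list.txt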
1,242
Posted By nikhil jain
File Operations
Hi Folks,

Below is an example of the input data. Based on columns 2, 3 & 4, I want the data in the first column to be collated as shown in the output section.



a,ac,tc,ic...
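The sample data and the output section are truncated, so this is only a sketch of the collation as described: group the comma-separated rows on columns 2, 3 & 4 and join the first-column values of each group. The file name input.csv and the space used to join the collated values are assumptions.

awk -F, '
{
    key = $2 FS $3 FS $4                    # group on columns 2, 3 & 4
    out[key] = (key in out) ? out[key] " " $1 : $1
}
END {
    for (k in out) print out[k] "," k       # collated first column, then the key
}' input.csv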
9,435
Posted By nikhil jain
Hi Yoda, I used the code you shared again...
Hi Yoda,

I used the code you shared in the link again, with no modifications except for the file names and file paths. It worked swiftly.

Thanks a lot.
9,435
Posted By nikhil jain
Hi Yoda, I just referred to the code shared in...
Hi Yoda,

I just referred to the code shared in the link and made the changes I assumed were needed, but I'm still not getting the attachment, and this time even the HTML format took a hit in the...
9,435
Posted By nikhil jain
Help with attachment in sendmail
Hi Folks,

I have the code below, which successfully sends the content of the output file as the HTML body, but I tried uuencode & mailx -a for sending attachments, to no avail.
...
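The code itself is cut off above, so here is only a sketch of one common way to get an HTML body plus an attachment through sendmail: hand-build a multipart/mixed MIME message and pipe it to sendmail -t. The recipient, subject and file names are placeholders, and base64 can be replaced with uuencode -m where it is not available.

TO="user@example.com"; SUBJECT="Report"
BODY=report.html; ATTACH=data.csv           # placeholders for the real files
BOUNDARY="====$(date +%s)===="

{
  printf 'To: %s\nSubject: %s\nMIME-Version: 1.0\n' "$TO" "$SUBJECT"
  printf 'Content-Type: multipart/mixed; boundary="%s"\n\n' "$BOUNDARY"

  printf -- '--%s\nContent-Type: text/html\n\n' "$BOUNDARY"
  cat "$BODY"; printf '\n'

  printf -- '--%s\n' "$BOUNDARY"
  printf 'Content-Type: application/octet-stream; name="%s"\n' "$ATTACH"
  printf 'Content-Transfer-Encoding: base64\n'
  printf 'Content-Disposition: attachment; filename="%s"\n\n' "$ATTACH"
  base64 "$ATTACH"

  printf -- '--%s--\n' "$BOUNDARY"
} | /usr/sbin/sendmail -t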
1,117
Posted By nikhil jain
Hi all, how do I get that with the extra...
Hi all,


How do I get that with an extra row holding the sum of all the numeric columns?

Oilv name Total Count ACKED NOT_ACKED
AWX3 3 0 3
...
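Assuming the summary above is space-separated with the three counts in columns 2-4, a sketch of adding the extra totals row could look like this (summary.txt is a placeholder name):

awk 'NR == 1 { print; next }                          # header row stays as-is
     { total += $2; acked += $3; nacked += $4; print }
     END { print "TOTAL", total, acked, nacked }' summary.txt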
1,117
Posted By nikhil jain
Trying to find the Count of Other Column
Content of my file below:


Name,Direport,Management chain,Owner,Entity,Oilv name,Oilv policy class,Oilv policy type,Oilv type severity,Entity status,Acked,Resolution,Plan to fix by,Id...
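The header names are visible but the field positions and the values stored in the Acked column are not, so this is only a sketch: it locates the "Oilv name" and "Acked" columns from the header and counts acked versus not-acked rows per name, assuming Acked holds Yes/No values and file.csv stands in for the real file.

awk -F, '
NR == 1 {
    for (i = 1; i <= NF; i++) {                 # find the columns by header name
        if ($i == "Oilv name") oc = i
        if ($i == "Acked")     ac = i
    }
    next
}
{
    total[$oc]++
    if ($ac == "Yes") acked[$oc]++; else notacked[$oc]++
}
END {
    print "Oilv name", "Total Count", "ACKED", "NOT_ACKED"
    for (n in total) print n, total[n], acked[n] + 0, notacked[n] + 0
}' file.csv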
1,967
Posted By nikhil jain
Hi Rudi, I'm able to do it through sed, was just...
Hi Rudi,

I'm able to do it through sed; I was just a bit curious whether it was possible with the same awk script you shared. Anyway, thanks. My solution is below.

content.txt is my data file and the process.sh...
1,967
Posted By nikhil jain
Hi Rudi, Thanks for the solution, but the...
Hi Rudi,

Thanks for the solution, but the output I'm getting is a bit different.

I don't want the comma (,) between the src urls; I want it only after the directive url, as there are only 2 columns.
...
1,967
Posted By nikhil jain
Hi Don, I tried writing the code below, it was...
Hi Don,

I tried writing the code below. It was not working, hence I turned to the forum for help.

paste -d, -s norm.txt |awk -F "directive url is : " '{print $2 $3 $4}' | awk -F ",,,Src urls are...
1,967
Posted By nikhil jain
Data Processing
I have the data below:
***************************************************
********************BEGINNING-1********************

directive url is :...
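The block layout is mostly cut off, so this is a guess pieced together from the replies higher up in this listing: one "directive url is : ..." line and one "Src urls are ..." line per block, with the desired output being the directive url, a single comma, then the src urls. The exact marker text and content.txt are assumptions taken from those replies.

awk '
/directive url is :/ { sub(/.*directive url is : */, ""); durl = $0 }
/Src urls are/       { sub(/.*Src urls are *:? */, ""); print durl "," $0 }
' content.txt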
842
Posted By nikhil jain
Hi Ravinder, thanks a lot :) It worked...
Hi Ravinder,

Thanks a lot :) It worked.

---------- Post updated 03-29-17 at 01:02 PM ---------- Previous update was 03-28-17 at 05:04 PM ----------

awk...
842
Posted By nikhil jain
File Processing
Hi all,

I have the file below, generated as output after my Java code runs:

Student Id 22552530 Below is the size
(22ms, 835 bytes
Student Id 2124592 Below is the size
(55ms, 703 bytes
Student Id...
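The expected output is cut off, so this only sketches the obvious pairing: each "Student Id" line is followed by a "(NNms, NNN bytes" line, and the id is printed next to its byte count (output.txt stands in for the Java program's output file).

awk '
/^Student Id/ { id = $3 }                                  # "Student Id 22552530 Below is the size"
/bytes/       { gsub(/[(,]/, ""); print id, $2, "bytes" }  # "(22ms, 835 bytes"
' output.txt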
1,131
Posted By nikhil jain
Zaxxon, I'll try implementing it if you give the...
Zaxxon,

I'll try implementing it if you give the solution for the small file as well.
Please help.
1,131
Posted By nikhil jain
Rudi, it is a huge file of some 8 GB; the...
Rudi,

It is a huge file of some 8 GB; the problem is we have a space constraint, hence can't try...
1,131
Posted By nikhil jain
[sdp@blr-qe101 .nikhil]$ sh filler.sh c10.txt ...
[sdp@blr-qe101 .nikhil]$ sh filler.sh c10.txt
unique_bank_transaction_id|merchant name_GT|MERCHANT_NAME_TDE|output
100|100|100|100
[sdp@blr-qe101 .nikhil]$ sh filler.sh 10.txt ...
1,131
Posted By nikhil jain
Proper Column wise matching
My code below works fine if none of the columns contains a pipe within its content; if any content does contain a pipe, the value shifts into the next column.

I wanted my code to work fine even...
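The original script is not shown, so this is just one generic way to cope with a delimiter that also appears inside a field: when the total number of columns is fixed (4 here, matching the output shown earlier in this thread), anchor the columns from both ends and glue anything extra back into the field assumed to carry the free text (column 2 in this sketch, with data.txt as a placeholder file name).

awk -F'|' -v OFS='|' 'NF >= 4 {
    middle = $2
    for (i = 3; i <= NF - 2; i++) middle = middle "|" $i   # re-join embedded pipes
    print $1, middle, $(NF-1), $NF
}' data.txt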
2,223
Posted By nikhil jain
Yep, got to know through Google, any other solution?...
Yep, got to know through Google. Is there any other solution that would help boost performance?
2,866
Posted By nikhil jain
Don, to tell you frankly even I was not aware...
Don,

To tell you frankly, even I was not aware until I ran your script that it won't work on that part of the data.
Can you please check and help with this?

My file is of huge size and it...
2,866
Posted By nikhil jain
Don, I apologise for that, I just gave the...
Don,

I apologise for that; I just gave part of my file as a sample. My whole file is around 2 GB. This was quite unusual, I didn't expect it.
Can you please help with this?
2,223
Posted By nikhil jain
Stomp, Thanks a lot for that, but this thing...
Stomp,

Thanks a lot for that, but this does not ignore case or do strict whole-word matching even with the "i" and "w" options used.
Maybe it is something to do with the "F" option; it does overwrite...
2,223
Posted By nikhil jain
Rudi, Thanks for that, it works fine for...
Rudi,

Thanks for that, it works fine for a smaller number of files, but with huge files of 5-6 GB, performance dips gradually.
Is there any alternative approach?

MadeinGermany --...
2,866
Posted By nikhil jain
Don, I tried on my whole file after...
Don,

I tried your script on my whole file; it is not able to replace the true positives properly in every place. One small example is shown below.
After executing your script:
...
2,223
Posted By nikhil jain
Processing too slow with loop
I have 2 files.

File 1 contains:
ALINE
ALINE BANG
B ON A
B.B.V.A.
BANG AMER CORG
BANG ON MORENA
BANG ON MORENAIC
BANG ON MORENAICA
BANG ON MORENAICA CORP
BANG ON MORENAICA N.A



...
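The second file and the original loop are not shown, so this only sketches the usual fix discussed in the replies above: load every fixed string from file 1 once and scan the big file in a single pass instead of running one grep per line (file1 and file2 stand in for the real file names).

# Single pass with grep: -F fixed strings, -i ignore case, -w whole words,
# -f read all the patterns from file 1
grep -Fiwf file1 file2 > matches.txt

# awk alternative for exact, case-insensitive whole-line matches:
awk 'NR == FNR { want[toupper($0)]; next }        # load file 1 into a lookup table
     toupper($0) in want' file1 file2 > matches.txt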
Showing results 1 to 25 of 176

 