Search Results

Search: Posts Made By: mystition
Posted By mystition
Dear Both;

Many thanks for the response. Unfortunately, these commands are not working as expected.

Input File 1:

$ cat test2.txt
1-1001JRL,Recurring
1-1001W5O,One-Time...
Posted By mystition
Issue with awk when joining two files where a field contains a '-' hyphen
Dear Community;

I need to join two files but I am facing issues.

The 1st file has multiple columns. The primary (1st) column has unique values. There are other columns, some of which have...
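The original commands are truncated above, but a hyphenated key like `1-1001JRL` is an ordinary string to awk, so a plain hash join on field 1 needs no special handling. A minimal sketch under that assumption (file names and column layout are illustrative, not from the thread):

```shell
# Hypothetical files modelled on the samples in this thread
cat > big.txt <<'EOF'
1-1001JRL,foo,10
1-1001W5O,bar,20
EOF
cat > types.txt <<'EOF'
1-1001JRL,Recurring
1-1001W5O,One-Time
EOF

# Load the lookup file keyed on field 1, then append the matching value
# to each big.txt line. The hyphen is not special in the array subscript.
awk -F, 'NR==FNR {t[$1]=$2; next} $1 in t {print $0 FS t[$1]}' types.txt big.txt
```

If this join fails, the usual culprits are trailing carriage returns (`\r`) from DOS-format files or stray whitespace around the key, not the hyphen itself.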
Posted By mystition
Many Thanks Rudi; This worked fine.

Can you please suggest what I was doing wrong?

BR//
Posted By mystition
awk date too many open files
Dear Community;

I have a csv file with the msb and lsb in fields $3 and $5, which provide the epoch time (factor 65536). Further, I need to convert the epoch time to a readable datetime.
But I am getting an...
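The error message is truncated above, but "too many open files" in awk usually means an external command (here `date`) is opened once per line and never closed. A sketch under that assumption (file name, field layout, and the msb × 65536 + lsb combination are illustrative; `date -d` is GNU date):

```shell
# Hypothetical input: msb in $3, lsb in $5
printf 'a,b,21900,x,12345\n' > input.csv

awk -F, '{
    epoch = $3 * 65536 + $5              # combine msb/lsb (factor 65536)
    cmd = "date -u -d @" epoch " +%F_%T" # GNU date; -u for a fixed TZ
    cmd | getline ts
    close(cmd)   # without close(), every input line leaks a descriptor
    print $0 FS ts
}' input.csv
```

With GNU awk, `strftime(epoch)` avoids the external `date` call entirely.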
Posted By mystition
While the other awk solutions gave slight errors (when printing the 'id' in the 'description' they replaced the existing text), which could be tweaked, your code worked perfectly. Many...
Posted By mystition
sed with pattern using variable
Dear Community;

I have a long xml file (100k+ lines) with patterns like below:

<OfferDefinition Id="123">
<Type>Timer</Type>
<Description>Test Text1</Description>
...
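The snippet is truncated, but the usual stumbling block in this situation is quoting: a shell variable only expands inside a sed program when the program is double-quoted. A hedged sketch (the closing `</OfferDefinition>` tag, the id value, and the replacement text are assumptions):

```shell
# Hypothetical file based on the sample above
cat > offers.xml <<'EOF'
<OfferDefinition Id="123">
<Type>Timer</Type>
<Description>Test Text1</Description>
</OfferDefinition>
<OfferDefinition Id="456">
<Type>Timer</Type>
<Description>Test Text2</Description>
</OfferDefinition>
EOF

# Double quotes let the shell expand $id and $newdesc inside the sed
# program; single quotes would pass them to sed literally. The range
# address limits the substitution to the block with the matching Id.
id=123
newdesc="New Text"
sed "/OfferDefinition Id=\"$id\"/,/<\/Description>/s|<Description>[^<]*</Description>|<Description>$newdesc</Description>|" offers.xml
```

Using `|` as the s-command delimiter avoids having to escape the `/` in the closing tag inside the replacement.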
Posted By mystition
Worked like a charm... Thanks once again Rudy!
Posted By mystition
Previous and Post lines in a pattern
Dear Community;

I am posting this after looking at several solutions that were not fully relevant to the issue that I am facing.

I have a large xml file, 100k+ lines which have patterns like...
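The XML sample is truncated, but "the lines before and after a pattern" has two common shapes. A hedged sketch (the pattern, file name, and sample data are assumptions; the `-B`/`-A` context flags need GNU grep):

```shell
printf 'a\nb\nMATCH\nc\nd\n' > ctx.txt

# GNU grep: one line of context before (-B1) and after (-A1) each match
grep -B1 -A1 'MATCH' ctx.txt

# Portable awk equivalent: remember the previous line, flag the next one
awk '/MATCH/ {print prev ORS $0; after=1; next}
     after   {print; after=0}
             {prev=$0}' ctx.txt
```

One caveat of the awk version: if the match is on the first line, `prev` is empty and an empty "previous" line is printed.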
Posted By mystition
Perfect! Thanks Rudy!

Can you please explain the second part of the code:
{delete A[$1]} END {for (a in A) print A[a] > "COND2"}'

Actually for Condition 3 - I also need to print one more...
Posted By mystition
Comparing multiple columns using awk
Hello All;

I have two files with below conditions:

1. Entries in file A are missing from file B (primary key is field 1)
2. Entries in file B are missing from file A (primary key is field 1)
3. Field 1 is...
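The third condition is truncated, but the first two are the classic delete-from-array pattern quoted in the follow-up reply: load file A into an array keyed on field 1, delete every key seen in file B, and whatever survives the END block is unique to file A; file B lines whose key was never in the array are unique to file B. A minimal sketch (file names and sample rows are assumptions):

```shell
cat > fileA <<'EOF'
abc,1
def,2
xyz,3
EOF
cat > fileB <<'EOF'
abc,9
mno,8
xyz,7
EOF

# A[] starts holding all of fileA; shared keys are deleted while reading
# fileB, so the END loop prints only fileA-exclusive entries.
awk -F, 'NR==FNR {A[$1]=$0; next}
         !($1 in A) {print $0 > "only_in_B"}
         ($1 in A)  {delete A[$1]}
         END {for (a in A) print A[a] > "only_in_A"}' fileA fileB
cat only_in_A    # def,2
cat only_in_B    # mno,8
```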
Posted By mystition
Works like magic. I only edited the range and it worked smoothly for me.

Can you please explain the code?

----------------------------

My understanding is that you are dividing the Field1...
Posted By mystition
I am trying to understand your code - it's a really good piece of work.

But I am getting an error.


awk -F, 'FNR==NR{a[$3]=$1 FS $2;next}{for(i in a){c=substr($1,length($1)-1);split(a[i],d);...
Posted By mystition
Sorry for being abstract. I have the ranges with different lengths to imply that there can be 'n' number of ranges - which I can specify manually. I need help with the logic to do so.

----------...
Posted By mystition
[SOLVED] awk and substr
Hello All;

I have an input file 'abc.txt' with below text:

512345977,213458,100021
512345978,213454,100031
512345979,213452,100051
512345980,213455,100061
512345981,213456,100071...
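The exact transformation the thread needed is truncated, so this only illustrates the `substr()` call on the sample data: with two arguments, `substr(s, start)` runs to the end of the string, so `substr($1, length($1)-1)` yields the last two characters of field 1 (an illustrative choice, not necessarily what the thread asked for):

```shell
cat > abc.txt <<'EOF'
512345977,213458,100021
512345978,213454,100031
EOF

# Print the last two characters of field 1, followed by field 2.
awk -F, '{print substr($1, length($1)-1), $2}' abc.txt
```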
Posted By mystition
Thanks RudiC;

Can you please explain the code:

I was able to understand the first part, where we are setting the fields in file 1 in an array with a FS before them.
awk 'NR==FNR {T[$1]=FS $1;...
Posted By mystition
Thanks
I just moved the field separator and it worked. Many thanks dear!


awk -F"," 'FNR==NR{A[$1]=$1;next} ($1 in A){print $0 FS A[$1]} !($1 in A){print $0}' file1 file2
Posted By mystition
Join two files using awk
Hello All;

I have two files:

File1:

abc
def
pqr


File2:

abc,123
mno,456
def,989
pqr,787
ghj,678
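The requirement is truncated, but with these samples the simplest hash join — keep the File2 rows whose key appears in File1 — can be sketched as follows (the assumption about the desired output is mine; the thread's accepted one-liner instead keeps all File2 rows and appends the match):

```shell
cat > file1 <<'EOF'
abc
def
pqr
EOF
cat > file2 <<'EOF'
abc,123
mno,456
def,989
pqr,787
ghj,678
EOF

# Load File1 keys, then print only File2 lines whose first field matches.
# A bare pattern with no action defaults to printing the whole line.
awk -F, 'NR==FNR {A[$1]; next} $1 in A' file1 file2
```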
Posted By mystition
Need Help with awk
Dear Community;

I have the following file:

596688171,2,2014-05-27,,06:01:20+0300,,2,,,
582187651,2,2014-04-29,,04:27:28+0300,,2,,,
...
Forum: Solaris 04-21-2013
Posted By mystition
Thanks
This is again one of the many times that you have helped!!!
Many Thanks DukeNuke:)
Forum: Solaris 04-21-2013
Posted By mystition
Info on commands
Hey Community!

I need to use the following commands on a Solaris 5.10 machine:

xxd
sha256sum

I have them on my RHEL machine, but I am unable to find a package that would let me use them on my...
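For the checksum half of the question, Solaris 10 ships `digest(1)`, which covers the `sha256sum` use case without any extra package: `digest -a sha256 file` prints the hash. A portable wrapper sketch (the function name is mine, and it assumes at least one of the two tools is installed):

```shell
# Prefer sha256sum where it exists (RHEL); fall back to Solaris digest(1).
sha256() {
    if command -v sha256sum >/dev/null 2>&1; then
        sha256sum "$1" | awk '{print $1}'
    else
        digest -a sha256 "$1"
    fi
}

printf 'hello\n' > sample.txt
sha256 sample.txt
```

`xxd` has no such bundled equivalent; `od -x` is the closest stock Solaris tool, though its output format differs.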
Forum: Solaris 09-23-2012
Posted By mystition
Many Thanks jlliagre,

Your post was very helpful. Only installgrub /boot/grub/stage1 /boot/grub/stage2 c0t5000CCA02533FC85d0s0 was not working for me. But the pools are created now as per...
Forum: Solaris 09-20-2012
Posted By mystition
Zpool query
Hi,

I have an x86 PC running Solaris 10 with ZFS. It has 8 similar disks.
I need help creating some zpools and changing the mount point of a slice.

Currently, the zpool in my system is...
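The existing layout is truncated above, so the following is only a hypothetical administration fragment showing the usual shape of the answer (pool name, device names, and mount point are all assumptions; run `format` to find the real device names):

```shell
# Mirrored pool from two of the eight disks
zpool create datapool mirror c0t1d0 c0t2d0

# ZFS mount points are a dataset property, not vfstab entries
zfs set mountpoint=/export/data datapool
```

Note that changing the mount point of a ZFS dataset is done with `zfs set mountpoint=...` rather than by editing /etc/vfstab as with UFS slices.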
Forum: Solaris 08-14-2012
Posted By mystition
Hi,


I am not sure how I can check this. :p
Please suggest.

Yes, all the four disks have the same geometry.

No, these disks are all new disks and have not been used by any other system....
Forum: Solaris 08-13-2012
Posted By mystition
fmthard error
Hello,

I have an x86 server (Solaris 10) with 4 similar disks. One of these 4 disks is the root disk.
I need to create a mirror disk to the root disk and two hot spares using the remaining 3...
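The error itself is truncated, but the step where `fmthard` usually appears in this procedure is copying the root disk's VTOC to each target disk before attaching mirrors or hot spares. A hypothetical fragment (device names are assumptions; both disks must share the same geometry, which a later post in this thread confirms):

```shell
# Copy the partition table of the root disk (c0t0d0) to a target disk.
# fmthard reads the prtvtoc output from stdin ("-"); slice 2 is the
# whole-disk slice on Solaris.
prtvtoc /dev/rdsk/c0t0d0s2 | fmthard -s - /dev/rdsk/c0t1d0s2
```

A common cause of fmthard errors here is a target disk that has never been labeled; labeling it once in `format` first usually clears that.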
Forum: Solaris 07-15-2012
Posted By mystition
Solaris Re-installation failure
Hello Community.

I had installed Solaris 5.10 on an x86 server using a custom JumpStart installation program, and it completed successfully.

Due to some issues, I had to go for...
Showing results 1 to 25 of 41

Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.