Search Results

Search: Posts Made By: uuuunnnn
Posted By uuuunnnn
The field separator is #|#
Posted By uuuunnnn
To @vgersh99,

Our field separator is '#|#'

Col1#|#Col2#|#Col3
1#|#11#|#123
2#|#12#|#241

----------------------------------------

To @jim mcnamara,

We have the piece of code which...
Posted By uuuunnnn
Failure means the code is not getting executed.
Please look only at the awk command and the split command

"$awkcmd" -F"#|#" -v c="${hashTableFields[index]}" 'NR==1{
for (i=1; i<=NF; i++)
...
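Worth noting (an inference about the likely cause, not something stated in the thread): awk treats a multi-character FS as an extended regular expression, so in -F"#|#" the unescaped | means alternation and the separator degenerates to a single #. Putting the pipe in a bracket expression restores the intended three-character separator:

```shell
# With -F"#|#" the ERE means "# or #", so fields split on every '#'
# and $2 on the header line would be '|'. Bracketing the pipe fixes it:
printf 'Col1#|#Col2#|#Col3\n1#|#11#|#123\n2#|#12#|#241\n' |
awk -F'#[|]#' '{ print $2 }'
```

The same escaping applies to the split() call's separator argument.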
Posted By uuuunnnn
AWK commands, in different OS
I have developed a bash shell script which works fine wherever bash is available, except on SunOS.

There are two commands that are not compatible.

#!/usr/bin/env bash

Problem
1. The...
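On SunOS the default /usr/bin/awk is the legacy awk, which lacks many features (including regex field separators) that modern scripts rely on. A common workaround — a sketch, not the poster's code; the candidate paths are assumptions — is to probe once for a capable awk and keep it in a variable like the thread's $awkcmd:

```shell
# Prefer an XPG4 or new awk when one is installed; fall back to plain awk.
awkcmd=awk
for cand in /usr/xpg4/bin/awk nawk gawk mawk; do
    if command -v "$cand" >/dev/null 2>&1; then
        awkcmd=$cand
        break
    fi
done
"$awkcmd" 'BEGIN { print "awk is usable" }'
</dev/null
```

Every later awk invocation then uses "$awkcmd" instead of a hard-coded path.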
Posted By uuuunnnn
To all of you who replied to this thread at once.

Yoda
----
So it sounds like script1 and script2 are connecting to 2 different DB instances.

My Reply: No, they are running on the same instance....
Posted By uuuunnnn
Not a possible solution because

1. SPOOL will write a file but I cannot read the file in sqlscript2. Do you know any way to read a file in a SQL script? Please note I do not have PL/SQL context. I...
Posted By uuuunnnn
UNIX Solution - Calling SQL*Plus scripts
I have a requirement where a bash shell script calls SQL*Plus scripts.

Shell Script

sqlplus <user>/<pwd> @sqlscript1 parameters
sqlplus <user>/<pwd> @sqlscript2 parameters

Now I need the values...
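Passing a value from the first script to the second is usually done with command substitution: run the first client invocation so it prints only the value, capture it in a shell variable, and hand it to the second call as a positional parameter. A minimal sketch, with a shell function standing in for the sqlplus ... @sqlscript1 call (the function and the value 42 are hypothetical):

```shell
# Stand-in for: sqlplus -s <user>/<pwd> @sqlscript1 parameters
# (run with headings/feedback suppressed so only the value reaches stdout).
run_sqlscript1() {
    echo "42"
}

val=$(run_sqlscript1)    # capture the value printed by the first script
echo "would run: sqlplus @sqlscript2 with parameter $val"
```

Inside sqlscript2 the captured value is then available as the substitution variable &1.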
Posted By uuuunnnn
Thanks everyone.

Yes, the issue was that I was copying the values from Windows to Unix and carriage returns were introduced. As suggested, I entered the values in Unix and used

sed '/^$/d'...
Posted By uuuunnnn
I did that, but it completely deleted all lines

[oratest@uswclora15 ~]$ tr -d '\r' < temp.hash.txt > temp_Input_file && mv temp_Input_file temp.hash.txt
[oratest@uswclora15 ~]$ vi...
Posted By uuuunnnn
This is the result

[oratest@uswclora15 ~]$ od -tx1c temp.hash.txt
0000000 39 30 0d 0a 0d 0a 30 0d 0a 0d 0a 38 39 2e 35 36
9 0 \r \n \r \n 0 \r \n \r \n 8 9 . 5 6...
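The od dump above is the giveaway (an inference, but consistent with the later posts): the file has Windows \r\n line endings, so a "blank" line actually contains a \r and is never matched by /^$/. Stripping the carriage returns first makes the blank-line delete work:

```shell
# Build a small CRLF file like the one the dump shows, then clean it:
# remove every \r, then drop the lines that are now truly empty.
printf '90\r\n\r\n0\r\n\r\n89.56\r\n' > temp.hash.txt
tr -d '\r' < temp.hash.txt | sed '/^$/d' > temp_clean.txt
cat temp_clean.txt
```

An equivalent one-pass form is sed 's/\r$//; /^$/d' where the sed supports \r.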
Posted By uuuunnnn
Tried many options but unable to delete blank lines from text file
Hi,

I tried the following options but was unable to delete blank lines from file

Input file = temp.hash.txt
temp.hash.txt content
90

0

89.56
0
0


57575.4544
56.89
Posted By uuuunnnn
Thanks everyone for the reply. durden_tyler's comments helped me. Thanks.

The issue was in the temp.hash.txt, we encountered null values and the sum was not getting computed and resulted in...
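For reference, a sum that survives null values — a sketch, assuming the command summed the first column of temp.hash.txt: adding 0 forces awk to treat an empty field as numeric zero instead of letting blanks disturb the total.

```shell
# Empty lines contribute 0 instead of breaking the sum.
printf '90\n\n0\n89.56\n' |
awk '{ sum += $1 + 0 } END { printf "%.2f\n", sum }'
```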
Posted By uuuunnnn
Sum working but getting syntax error - awk
Hi,

I am using the following command to get the sum, and it works correctly, but I am getting a syntax error as well.

# ------------------------------------------------------------------------...
Posted By uuuunnnn
Exactly. I saw there were extra spaces before the command started. When I removed the spaces, the commands worked!
Posted By uuuunnnn
Error in Shell Script - Can anyone help Pls
Please find my shell script below

-------------------------------------

#!/usr/bin/ksh
ORAUSER=$1
P_REQUEST_ID=$4
current_time=`date +%m%d%y.%H%M%S`
echo "Process started at `date...
Posted By uuuunnnn
Using Cursor in Unix - How to do it (Need Help)
Hi,

I have a table in which I have the following data


JOB_NO FILE_ID FILE_NAME
1546148 1378788 PDF Sample -1.pdf
1546148 1378789 PDF Sample -2.pdf
1546149 1378790 PDF Sample -3.pdf
...
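A cursor-style loop in shell is usually written as a while read over the rows the database client prints, one per line. A sketch, with printf standing in for the actual query output (an assumption; the column handling relies on FILE_NAME being the last field, so read's final variable absorbs the rest of the line and embedded spaces survive):

```shell
# Stand-in for the query result; one row per line, FILE_NAME last.
printf '1546148 1378788 PDF Sample -1.pdf\n1546149 1378790 PDF Sample -3.pdf\n' |
while read -r job_no file_id file_name; do
    # process each "cursor" row here
    echo "job=$job_no id=$file_id name=$file_name"
done
```

Grouping by JOB_NO (e.g. attaching all files for one job) can then be done by comparing $job_no against the previous iteration's value.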
Showing results 1 to 16 of 16

 
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.