Search Results

Search: Posts Made By: Gangadhar Reddy
1,114
Posted By Gangadhar Reddy
Help with listing file names containing particular text and counting lines with 10 characters.
Hi,
I have 2 queries.

I need to list files whose content does not contain a particular text. For example, I need to list files that do not contain the string "abc" among all files ending...
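A minimal sketch of both requests, assuming the files end in .csv (the actual extension is truncated in the excerpt) and that file.csv stands in for any one file:

# list files whose content does not contain the string "abc"
grep -L "abc" *.csv

# count the lines that are exactly 10 characters long in one file
awk 'length($0) == 10 { n++ } END { print n + 0 }' file.csv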
7,273
Posted By Gangadhar Reddy
Yup. I used the max depth option and it worked. ...
Yup. I used the max depth option and it worked.

find ./ -maxdepth 1 -type f -name "*.csv" -mtime +6 -exec rm -f {} \;


Thanks everyone for all your help.
7,273
Posted By Gangadhar Reddy
I'm new to unix. I would like to ignore the...
I'm new to UNIX. I would like to ignore the error. That is probably better, as in other environments there will be other subdirectories.

So you mean to say I have to do it this way. I hope it will ignore...
7,273
Posted By Gangadhar Reddy
[Solved] Issue with deleting files through find
Hi,
I have a script similar to this

#!/bin/ksh
cd /orcl/bir/eod_badfiles
find ./ -type f -name "*.csv" -mtime +6 -exec rm -f {} \;
find ./ -type f -name "*.bad" -mtime +6 -exec rm -f {} \;
cd...
1,686
Posted By Gangadhar Reddy
Need help deleting files one week old
Hi,
I need to delete *.bad files which are one week old. How can I achieve that? I tried doing it through the script below, but it deletes all the files.


find ./ -mtime +7 -exec rm *.bad {} \;


The...
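The glob in rm *.bad is expanded by the shell before find runs, so rm receives every *.bad file in the current directory plus every file find matched, regardless of age. One common correction (a sketch, not the solution quoted in these excerpts) is to let find do the name matching:

find ./ -type f -name "*.bad" -mtime +7 -exec rm -f {} \;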
874
Posted By Gangadhar Reddy
That's simply awesome. It worked as needed....
That's simply awesome. It worked as needed. Thanks much for the help.
874
Posted By Gangadhar Reddy
Merging and differentiating 2 files
I have 2 csv files, say file1 and file2. Based on 2 columns, I want to check whether the values of file1 are present in file2 or not. If they are not present, then it should create a file with the values which...
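A hedged awk sketch of one way this comparison is often done, assuming comma-separated files and that the two key columns are fields 1 and 2 (the real column positions are not given in the excerpt):

# remember the "col1,col2" keys seen in file2, then print the records
# of file1 whose key is not among them
awk -F',' 'NR == FNR { seen[$1 "," $2]; next } !(($1 "," $2) in seen)' file2 file1 > only_in_file1.csv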
2,286
Posted By Gangadhar Reddy
We connect after spool. There are statements...
We connect after spool. There are statements which have to run with different logins, and hence we keep on switching users .. that's when the password is spooled into the log file.
2,286
Posted By Gangadhar Reddy
Yes, this is Oracle SQL*Plus. The connect is...
Yes, this is Oracle SQL*Plus. The connect is after spool on, and it is accessed via a UNIX login. One thing that we could do is set echo off, which will not display the SQL statements which are running....
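A hedged ksh sketch of that "set echo off before spooling" idea; the wrapper script, spool file name, and the contents of connection.sql are assumptions, not taken from the thread:

#!/bin/ksh
# keep the credentials file readable only by its owner
chmod 600 connection.sql

sqlplus -s /nolog <<'EOF'
SET ECHO OFF
SPOOL run.log
@connection.sql
-- ... the statements that switch users and do the work ...
SPOOL OFF
EXIT
EOF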
2,286
Posted By Gangadhar Reddy
Password Obscuring Technique
Hi,
We have a UNIX shell script which tries to log in to the database. The username and password to connect to the database are stored in a file connection.sql.

Now connection.sql has contents

def...
2,817
Posted By Gangadhar Reddy
Thanks everyone for showing so much interest....
Thanks everyone for showing so much interest. First of all, I'm not setting $HOME. It is already set, and I'm not sure how. I was also of the belief that .profile is invoked first and sets the home...
2,817
Posted By Gangadhar Reddy
Need to navigate to HOME directory when I log in
Hi,
Currently I'm logging in as a user, say atgdev. When I log in it takes me to the directory /.
I see the home directory set as /home/atgdev/

I want that when I log in it should go directly to my home...
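The resolution is not shown in these excerpts, but a common approach (a sketch, assuming a Bourne/ksh-style login shell that reads ~/.profile) is to end the profile with an explicit change of directory:

# last line of ~/.profile
cd "$HOME"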
1,452
Posted By Gangadhar Reddy
Thank you dear .. It worked :)
Thank you dear .. It worked :)
1,452
Posted By Gangadhar Reddy
[Solved] Need help changing a field from MM/DD/YY to DD/MM/YY format
Hi,
I need help changing a field from MM/DD/YY to DD/MM/YY format. Suppose a file a.csv. The record is


"11/16/09","ABC"," 1","EU","520892414","1","600","31351000","1234567","ANR BANK CO....
1,131
Posted By Gangadhar Reddy
By the way, I don't need the sequence appended...
By the way, I don't need the sequence appended (ignore that) .. just the Proc Date : 19/11/10 & PROC UNIT : 273.

In the report, there might be multiple records for a particular date, or maybe...
1,131
Posted By Gangadhar Reddy
[Solved] Need help formatting a file
I have a report similar to the below:

^L"0.1","Run Date : 19/11/10 Navneet Bank, N.A. PAGE NO : 1"
"0.2",Proc Date...
1,139
Posted By Gangadhar Reddy
Merging lines
Thanks, it worked for me. I have one more question on top of that. We had a few records which were split into 2 lines instead of one. Now I have identified those lines. The file is too big to open via vi...
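A hedged sed sketch for joining one known broken record with the line after it; both the line number 1234 and the file name bigfile.csv are placeholders standing in for the ones identified in the thread:

# append the next line to line 1234 and delete the embedded newline
sed '1234 { N; s/\n//; }' bigfile.csv > bigfile_fixed.csv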
19,150
Posted By Gangadhar Reddy
Thanks so much for your help, guys. Also, I want to...
Thanks so much for your help, guys. Also, I want to print the line numbers of those lines too. How should I achieve that?
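A hedged one-liner for this follow-up, reusing the under-50-characters test from the question quoted below and adding the line number via NR:

# print line number and content for every line shorter than 50 characters
awk 'length($0) < 50 { print NR ": " $0 }' abc.csv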
19,150
Posted By Gangadhar Reddy
Need to find lines where the length is less than 50 characters
Hi,
I have a big file, say abc.csv, and in that file I need to find lines whose length is less than 50 characters. How can this be achieved? Thanks in advance.

Thanks
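A minimal sketch of the basic check, assuming a standard grep with BRE interval support:

# lines with fewer than 50 characters (everything not matching a 50-or-more run)
grep -v '.\{50,\}' abc.csv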
1,201
Posted By Gangadhar Reddy
Ravi, Thanks a lot for your help. It worked....
Ravi,
Thanks a lot for your help. It worked. But I'm also trying some other option, for example storing the sourcefile & targetfile names in a file and then reading them. This will help me if I want to add...
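A hedged sketch of that control-file idea; filelist.txt and its two-column layout (source name and target name per line) are assumptions, not from the thread:

# read a source/target file-name pair per line and act on each one
while read src tgt
do
    echo "would transfer $src to $tgt"    # replace with the real ftp/copy step
done < filelist.txt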
1,201
Posted By Gangadhar Reddy
Yes, I do have Oracle installed. How can I...
Yes, I do have Oracle installed.

How can I achieve it?
1,201
Posted By Gangadhar Reddy
Need code to ftp files in loop
I have files like this beginning from 082008 (MMYYYY) to 112010 (MMYYYY)


I need to fetch these files through FTP in a loop. How can I achieve it?

I tried with the following code. But I'm not...
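A hedged ksh sketch of the month loop, since the original code and exact file names are truncated in the excerpt; the host, login, password, and the file_MMYYYY.csv naming are placeholders:

#!/bin/ksh
HOST=ftp.example.com      # placeholder host
USER=myuser               # placeholder login
PASS=mypass               # placeholder password

yyyy=2008; mm=8           # start at 082008
while [ $yyyy -lt 2010 ] || { [ $yyyy -eq 2010 ] && [ $mm -le 11 ]; }   # stop after 112010
do
    period=$(printf "%02d%04d" $mm $yyyy)      # e.g. 082008
    ftp -n $HOST <<EOF
user $USER $PASS
binary
get file_${period}.csv
bye
EOF
    mm=$((mm + 1))
    if [ $mm -gt 12 ]; then mm=1; yyyy=$((yyyy + 1)); fi
done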
2,756
Posted By Gangadhar Reddy
Thanks everyone for the explanations. It helped...
Thanks everyone for the explanations. It helped and I got it :)

Regards,
Gangadhar
2,756
Posted By Gangadhar Reddy
What does this export command do?
Hi,
I know that the export command is used to set an environment variable.

Can you please explain what ":--100" in the below command is doing?


export ROWSTOLOAD=${ROWSTOLOAD:--100}


Thanks,...
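The answer is not quoted in these excerpts, but ${VAR:-word} is standard shell parameter expansion: it yields $VAR if VAR is set and non-empty, otherwise the word, so the extra dash simply makes the default value -100. A small sketch:

# ${ROWSTOLOAD:--100}: use $ROWSTOLOAD if set and non-empty, else -100
unset ROWSTOLOAD
export ROWSTOLOAD=${ROWSTOLOAD:--100}
echo $ROWSTOLOAD      # prints -100

ROWSTOLOAD=500
export ROWSTOLOAD=${ROWSTOLOAD:--100}
echo $ROWSTOLOAD      # prints 500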
6,769
Posted By Gangadhar Reddy
Cgkmal, Check the attachment. Can you write a...
Cgkmal,
Check the attachment. Can you write a script to format the rows into columns?

Thanks,
Gangadhar
Showing results 1 to 25 of 39

 