Operating Systems :: AIX - Want to find the last access time for large group of files at one go. Post 302584054 by dbanrb on Thursday 22nd of December 2011, 04:20:27 AM
Want to find the last access time for a large group of files in one go

Dear All,

I'm working as a DBA and don't have much knowledge of OS-level commands. We have a requirement to find, from a large set of files, those last accessed on or after Apr 2010 and also those with an access date on or before Apr 2010. I know of some commands like istat and ls -u, but can anyone provide the exact command using find? Awaiting your response, and thanking you in advance.

Regards,
Vandana
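One hedged approach for the question above, using only POSIX `find` (so it should also apply on AIX): `-atime` counts in 24-hour periods, so the April 2010 cutoff becomes roughly 600 days when measured from December 2011. The directory and the day count below are placeholders; the demo sets up its own scratch file so the commands can be seen working.

```shell
# Sketch only: replace DIR with the real directory to scan (e.g. a
# database filesystem) and adjust the day count for the actual date.
DIR=/tmp/atime-demo
mkdir -p "$DIR"
: > "$DIR/example.txt"
# Give the demo file an old access time (touch -a -t sets atime only):
touch -a -t 201003310000 "$DIR/example.txt"

# Files last accessed within the past 600 days (on or after ~Apr 2010,
# as seen from Dec 2011):
find "$DIR" -type f -atime -600 -print

# Files NOT accessed in the past 600 days (before ~Apr 2010):
find "$DIR" -type f -atime +600 -print
```

On implementations that support it, an access-time comparison against a reference file created with `touch -t 201004010000 ref` (GNU find's `-anewer ref`) gives an exact date cutoff instead of an approximate day count.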
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Need to find large files

I have found the following code on this forum ls -lh | awk '{print $5,$9}' | sort -n Its purpose is to show a list of files in a dir sorted by file size. I need to make it recursive ls -lhR | awk '{print $5,$9}' | sort -n The problem is that there are lots of files on the... (3 Replies)
Discussion started by: jadionne
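For the recursive variant asked about in the thread above, one sketch that avoids parsing `ls -lhR` output is to let `find` enumerate the files and sort on the byte-size column of `ls -l`. The paths here are illustrative; the demo builds its own scratch tree.

```shell
# Demo tree; replace /tmp/size-demo with the real directory.
mkdir -p /tmp/size-demo/sub
printf '0123456789' > /tmp/size-demo/small.txt          # 10 bytes
dd if=/dev/zero of=/tmp/size-demo/sub/big.txt bs=500 count=1 2>/dev/null

# Recursive listing sorted by file size (column 5 of ls -l, in bytes):
find /tmp/size-demo -type f -exec ls -l {} + | sort -k5,5n | awk '{print $5, $NF}'
```

Sorting on the numeric byte column works here because `ls -l` always prints plain bytes; the human-readable `-h` sizes are what break a plain `sort -n`.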

2. UNIX for Dummies Questions & Answers

Problem using find with prune on large number of files

Hi all; I'm having a problem when I want to list a large number of files in the current directory using find together with the prune option. First I used this command, but it lists all the files, including those in subdirectories: find . -name "*.dat" | xargs ls -ltr Then I modified the command... (2 Replies)
Discussion started by: ashikin_8119
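A sketch for the prune question above, keeping `find` out of subdirectories. GNU find's `-maxdepth 1` is the easy route; the `-prune` idiom is the portable one. The demo tree is illustrative.

```shell
# Demo tree:
mkdir -p /tmp/prune-demo/sub
: > /tmp/prune-demo/a.dat
: > /tmp/prune-demo/sub/b.dat
cd /tmp/prune-demo

# GNU find (not POSIX):
find . -maxdepth 1 -name '*.dat' -print

# Portable idiom: prune (refuse to descend into) every entry other
# than "." itself, while still matching the entry against -name:
find . ! -name . -prune -name '*.dat' -print
```

Both print only ./a.dat; ./sub/b.dat is never visited because ./sub is pruned before descent.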

3. Shell Programming and Scripting

Find all files with group read OR group write OR user write permission

I need to find all the files that have group Read or Write permission or files that have user write permission. This is what I have so far: find . -exec ls -l {} \; | awk '/-...rw..w./ {print $1 " " $3 " " $4 " " $9}' It shows me all files where group read = true, group write = true... (5 Replies)
Discussion started by: shunter63
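Instead of post-filtering `ls -l` with awk as in the thread above, `find` can test the permission bits directly: `-perm -NNN` checks that all of the given octal bits are set, so ORing three single-bit tests (group read 040, group write 020, user write 200) expresses the condition. A sketch with illustrative files:

```shell
mkdir -p /tmp/perm-demo && cd /tmp/perm-demo
: > gr.txt;   chmod 0444 gr.txt     # group read set
: > uw.txt;   chmod 0600 uw.txt     # user write set
: > none.txt; chmod 0000 none.txt   # matches nothing

# Files with group-read OR group-write OR user-write permission:
find . -type f \( -perm -040 -o -perm -020 -o -perm -200 \) -print
```

This lists ./gr.txt and ./uw.txt but not ./none.txt, with no `ls` parsing required.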

4. UNIX for Advanced & Expert Users

Find common Strings in two large files

Hi , I have a text file in the format DB2: DB2: WB: WB: WB: WB: and a second text file of the format Time=00:00:00.473 Time=00:00:00.436 Time=00:00:00.016 Time=00:00:00.027 Time=00:00:00.471 Time=00:00:00.436 the last string in both the text files is of the... (4 Replies)
Discussion started by: kanthrajgowda

5. Shell Programming and Scripting

Using find in a directory containing large number of files

Hi All, I have searched this forum for related posts but could not find one that fits mine. I have a shell script which removes all the XML tags including the text inside the tags from some 4 million XML files. The shell script looks like this (MODIFIED): find . "*.xml" -print | while read... (6 Replies)
Discussion started by: shoaibjameel123
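For a bulk edit like the one above, the dominant cost is spawning one process per file; `-exec ... {} +` batches many files into each invocation instead. A sketch using GNU `sed -i` (in-place editing is a GNU extension) with a naive tag-stripping regex that, as an assumption worth stating, does not handle tags spanning multiple lines:

```shell
mkdir -p /tmp/xml-demo
printf '<a><b>hello</b> world</a>\n' > /tmp/xml-demo/one.xml

# Strip <...> tags in place, batching files into few sed processes:
find /tmp/xml-demo -name '*.xml' -exec sed -i 's/<[^>]*>//g' {} +

cat /tmp/xml-demo/one.xml
```

With millions of files this reduces the process count from millions to a handful, which is usually where the `find | while read` loop loses its time.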

6. UNIX for Dummies Questions & Answers

find large files in root filesystem and exclude others

I am wondering if there is a way to search for top 10 files in size in root filesystem but exclude all other mounts including nfs mounts . For example excluded /var /boot /app /app1 /u01/ (1 Reply)
Discussion started by: gubbu
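For the exclusion question above, `-xdev` stops `find` from crossing mount points, so /var, /boot, NFS mounts and anything else on its own filesystem is skipped automatically, with no explicit exclude list. A sketch on a scratch directory standing in for /:

```shell
mkdir -p /tmp/top-demo
dd if=/dev/zero of=/tmp/top-demo/big.bin bs=1024 count=200 2>/dev/null
: > /tmp/top-demo/tiny.txt

# Top 10 files by size on this one filesystem only:
find /tmp/top-demo -xdev -type f -exec du -k {} + | sort -rn | head -10
```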

7. Red Hat

Access time of files and directories

My query, please. Here is what I have seen about how access times of files and directories work. 1) For a file, the access time is the time I first access it after the file's last modification, i.e., if the file is modified at 10 AM and I then access it at 11 AM. After that, whenever I access without... (7 Replies)
Discussion started by: ravisingh

8. UNIX for Dummies Questions & Answers

Find common numbers from two very large files using awk or the like

I've got two files that each contain a 16-digit number in positions 1-16. The first file has 63,120 entries all sorted numerically. The second file has 142,479 entries, also sorted numerically. I want to read through each file and output the entries that appear in both. So far I've had no... (13 Replies)
Discussion started by: Scottie1954
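For the intersection problem above, `comm -12` prints only the lines common to two sorted files, with no awk needed. Fixed-width 16-digit numbers sort the same way lexically and numerically, which is exactly the ordering comm requires. A sketch with made-up data:

```shell
printf '1111111111111111\n2222222222222222\n3333333333333333\n' > /tmp/a.txt
printf '2222222222222222\n4444444444444444\n' > /tmp/b.txt

# Lines present in both sorted files:
comm -12 /tmp/a.txt /tmp/b.txt
```

This streams both files once, so 63,000 and 142,000 lines finish essentially instantly.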

9. Shell Programming and Scripting

Find large files

All, I'm running a simple find for large files in a bash shell that works fine: find <dir> -xdev -ls | awk '{print $7,$8,$9,$10,$11}' | sort -nr | head It gives me everything I want: file size, time stamp, and file name. However, I'd like to have the file size in human readable form. ... (4 Replies)
Discussion started by: hburnswell
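For the human-readable-size wish above, GNU coreutils has a matched pair: `du -h` prints K/M/G sizes and `sort -h` (human-numeric sort) orders them correctly, which a plain `sort -n` cannot. This is a GNU-specific sketch on a demo directory:

```shell
mkdir -p /tmp/hr-demo
dd if=/dev/zero of=/tmp/hr-demo/big.bin bs=1024 count=2048 2>/dev/null
: > /tmp/hr-demo/tiny.txt

# Largest files first, sizes human readable (GNU sort -h understands
# the K/M/G suffixes that du -h emits):
find /tmp/hr-demo -xdev -type f -exec du -h {} + | sort -rh | head
```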

10. Shell Programming and Scripting

Find Large Files Recursively From Specific Directory

Hi. I found many scripts on the web for achieving this, but I like to use this one: find /EDWH-DMT03 -xdev -size +10000 -exec ls -la {} \;|sort -n -k 5 > LARGE.rst But the problem is, why does it still list files of 89 bytes in the output? Is there anything wrong with the command? My... (7 Replies)
Discussion started by: aimy
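Two likely explanations for the small files in the thread above: `-size +10000` with no suffix means "more than 10000 512-byte blocks" (about 5 MB), and without `-type f` the test is also applied to directories, so when a directory matches, `ls -la` on it lists every entry it contains, including 89-byte files. A hedged sketch of the fix on demo data (the /EDWH-DMT03 path in the thread is stood in for by a scratch directory):

```shell
mkdir -p /tmp/size-fix
dd if=/dev/zero of=/tmp/size-fix/huge.bin bs=512 count=10020 2>/dev/null
: > /tmp/size-fix/small.txt   # would sneak in if a directory were listed

# -type f restricts both the size test and the listing to regular files;
# {} + batches them into few ls invocations:
find /tmp/size-fix -xdev -type f -size +10000 -exec ls -la {} + | sort -n -k 5
```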
RADDEPEND(1)                 General Commands Manual                RADDEPEND(1)

NAME
       raddepend - find RADIANCE scene dependencies

SYNOPSIS
       raddepend file ..

DESCRIPTION
       Raddepend uses getbbox(1) to expand scene file arguments and find file
       dependencies for make(1) or rad(1).  Raddepend looks only in the current
       directory, so dependencies hidden elsewhere in the filesystem will not
       be found or named.  The output is the name of files, one per line, that
       were accessed during the expansion of the input file arguments.  The
       file arguments are excluded from the list.  If no input files are
       given, the standard input is read.

AUTHOR
       Greg Ward

BUGS
       On some older NFS systems, the file access dates are not updated
       promptly.  As a result, raddepend may not be 100% reliable on these
       systems.  If the output seems to be missing essential files, this is no
       doubt why.  The only fix is to put in a longer sleep time between the
       getbbox call and the final ls(1).

SEE ALSO
       make(1), oconv(1), rad(1), xform(1)

RADIANCE                             4/15/94                       RADDEPEND(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.