Want to find the last access time for large group of files at one go


 
# 1  
Old 12-22-2011

Dear All,

I'm working as a DBA and don't have much knowledge of OS-level commands. We have a requirement to find, among a large set of files, the ones whose last access date is >= Apr 2010 and also <= Apr 2010 (i.e., accessed during April 2010). I know of some commands like istat and ls -u, but can anyone provide the exact command using find? Awaiting your response, and thank you in advance.

Regards,
Vandana
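As a quick spot check before running any bulk find, the ls -u flag mentioned above makes a long listing show access time instead of modification time (the paths below are only examples):

```shell
# Show access time (atime) rather than modification time in the listing.
ls -lu /etc/hosts

# Sort a directory's contents by access time, newest first.
ls -ltu /etc | head
```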
# 2  
Old 12-22-2011
Code:
 
# access date <= Apr 2010
find /path/to/search -type f -mtime +631 -print
# last accessed >= Apr 2010
find /path/to/search -type f -mtime -631 -print

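For reference, the 631 figure above appears to be roughly the number of days between the posting date (22 Dec 2011) and April 2010, since -mtime/-atime count in days from now. A sketch of that arithmetic with GNU date (note: the stock AIX date command has no -d option, so this assumes GNU coreutils is available):

```shell
# Days elapsed since 1 Apr 2010 (GNU date syntax; not portable to AIX's
# native date). The result is the day count to pass to -mtime or -atime.
target=$(date -d '2010-04-01' +%s)   # epoch seconds at local midnight
now=$(date +%s)
echo $(( (now - target) / 86400 ))
```

Run against the thread's posting date, this comes out near 630, which matches the ~631 used above.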
# 3  
Old 12-22-2011
Code:
perl -MFile::Find '-MTime::Local qw/timelocal/' -le'
  find {
    wanted => sub {
      return unless -f;
      $atime = (stat)[8];
      print $File::Find::name, " ", scalar localtime $atime
        if timelocal(0, 0, 0, 1, 3, 2010 - 1900) <= $atime
          and $atime <= timelocal(59, 59, 23, 30, 3, 2010 - 1900);
    }
  }, shift
' .

The argument to pass to the script (a dot . in this example) is the name of the directory to search.
# 4  
Old 12-22-2011
Hi,

I don't understand the code given above. Can you please explain it?

Regards
Vandana

---------- Post updated at 04:13 PM ---------- Previous update was at 04:12 PM ----------

Hi itkamaraj,

I need to find files by access time and not by modification time.
# 5  
Old 12-22-2011
See man find and search for the -atime option.
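To select by access time within a fixed window rather than counting days by hand, one common technique is to bracket the range with two reference files and use -anewer, which matches files whose access time is newer than a reference file's modification time. This is a sketch assuming your find supports -anewer (GNU and most BSD versions do; check the AIX man page before relying on it):

```shell
# Reference timestamps bracketing April 2010.
touch -t 201003312359 /tmp/before_apr   # just before 1 Apr 2010
touch -t 201004302359 /tmp/end_apr      # end of 30 Apr 2010

# Files accessed during April 2010: atime newer than the first mark,
# but not newer than the second.
find /path/to/search -type f -anewer /tmp/before_apr ! -anewer /tmp/end_apr -print
```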