Shell Programming and Scripting: Searching for files over 30 days old in current directory
Post 42585 by Ygor, Friday 31st of October 2003, 05:49:47 AM
Try....
Code:
find $PWD -type f -mtime +30 -print|grep -Ev "$PWD/.*/"
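
If your version of find supports -maxdepth (GNU find), or if you need a portable variant that stops find from descending at all (e.g. on Solaris), either of these sketches does the same job:
Code:
# GNU find: restrict the search to the current directory itself
find . -maxdepth 1 -type f -mtime +30 -print

# portable alternative: prune every entry one level down so nothing below . is visited
find . ! -name . -prune -type f -mtime +30 -print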

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

ls the latest 4 days (or a specified number of days) of files in the directory

Hi, I would like to list the latest 2 days', 3 days' or 4 days' worth of files in a directory... how? Is it done using ls? (3 Replies)
Discussion started by: happyv
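
A minimal sketch of that kind of listing, assuming GNU find; -mtime -4 matches files modified within the last 4 days, and the number is the only thing to change:
Code:
# files in the current directory modified in the last 4 days, listed newest first
find . -maxdepth 1 -type f -mtime -4 -exec ls -lt {} +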

2. Shell Programming and Scripting

searching files through all subdirectories beneath the current directory

I want to make a bash script that searches for a specific pattern in files through all subdirectories beneath the current directory, without using the command grep -R, only the command grep... e.g. for i in * do grep "pattern" $i ... done. Using the character (*), the script... (5 Replies)
Discussion started by: milagros
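
A minimal sketch that recurses with find and uses only plain grep, as the post asks; "pattern" is a placeholder:
Code:
# list every file beneath the current directory that contains the pattern
find . -type f -exec grep -l "pattern" {} \;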

3. Shell Programming and Scripting

searching content of files in the current and sub directories

Hi, I was wondering why command 2 below doesn't work like command 1. 1. find . -exec grep "test" '{}' \; -print 2. ls -R | grep "test" I am trying to search for "test" in all the files in the current and sub directories. What's wrong with my command 2? Thanks in advance for your help. (4 Replies)
Discussion started by: tiger99
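
For context, command 2 behaves differently because grep only searches the file names that ls -R prints, never the contents of those files. A sketch of a content search, assuming a grep that supports -r (GNU grep); otherwise the find/-exec form from command 1 is the way to go:
Code:
# search file contents recursively and print the names of matching files
grep -rl "test" .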

4. Shell Programming and Scripting

Finding files in the current directory when it contains 100,000s of files

Hi all, I was wondering what the most efficient way is to find files in the current directory (which may contain hundreds of thousands of files) that meet a certain specified file type and are of a certain age. I have experimented with the find command in Unix, but it also searches all sub directories. I have... (2 Replies)
Discussion started by: kewong007
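
A sketch of a non-recursive find, which matters with that many entries because -prune stops find from ever descending into subdirectories; the name pattern and age are placeholders:
Code:
# test only the entries directly under the current directory
find . ! -name . -prune -type f -name '*.dat' -mtime +7 -print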

5. Shell Programming and Scripting

mget * (obtain files from current directory but not the files from sub-directories)

Hello, using the mget instruction (within ftp) and with "Interactive mode off", I want to get all files from a directory (DirAA), but not the files in sub-directories. The file names don't follow any defined rule, so they can be just letters without a (.) period. Directory structure example: ... (0 Replies)
Discussion started by: Peter321
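
A minimal batch sketch, assuming a classic command-line ftp client; the host, credentials and local directory are placeholders. With -i, per-file prompting is already off, and an attempt to mget a subdirectory name simply fails while the regular files in DirAA are transferred:
Code:
ftp -inv ftp.example.com <<'EOF'
user myuser mypassword
lcd /local/target/dir
cd DirAA
binary
mget *
bye
EOF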

6. Shell Programming and Scripting

How to strip ^M at the end of each line for all files found in the current directory

I am trying to use a loop to strip off the funny character ^M at the end of all lines in each file found in the current directory, and I have used the following in a script: find . -type f -name '*.txt' | while read file do echo "stripping ^M from ..." ex - "$file" > $tempfile %s/^M//g wq! # mv... (4 Replies)
Discussion started by: bisip99
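
A sketch that avoids the ex/here-document dance entirely, assuming GNU sed for in-place editing; the '*.txt' pattern mirrors the post:
Code:
# delete trailing carriage returns (^M) from every .txt file found by find
find . -type f -name '*.txt' -exec sed -i 's/\r$//' {} +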

7. UNIX for Dummies Questions & Answers

Need Help in reading N days files from a Directory & combining the files

Hi all, I request your expertise in tackling one requirement in my project (I don't have much expertise in shell scripting). The requirement is as below: 1) We store the last run date of a process in a file. When the batch runs the next time, it should read this file, get the last run date from... (1 Reply)
Discussion started by: dsfreddie
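
A heavily hedged sketch of the general shape, since the requirement is cut off above; the file name last_run.txt, the YYYYMMDD date format and the directory are all assumptions:
Code:
last_run=$(cat last_run.txt)        # e.g. 20191118
ref=$(mktemp)
touch -t "${last_run}0000" "$ref"   # reference file stamped 00:00 on the last run date
find /path/to/incoming -type f -newer "$ref" -exec cat {} + > combined.out
rm -f "$ref"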

8. Shell Programming and Scripting

How to clean files older than 3 days from a directory?

I have a Solaris system and I am using the bash shell. I want to prepare a script which can do the below. There are a few directories I need to clean. In those directories, I need to delete files which are older than 3 days. The directories are as follows.... (7 Replies)
Discussion started by: Saidul
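
A minimal sketch for Solaris with bash; the directory list is a placeholder, and -mtime +3 matches files last modified more than 3 days ago:
Code:
for dir in /path/one /path/two /path/three
do
    find "$dir" -type f -mtime +3 -exec rm -f {} \;
done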

9. UNIX for Advanced & Expert Users

Find all files in the current directory excluding hidden files and directories

Find all files in the current directory only, excluding hidden directories and files. With the command below, though it's not deleting hidden files, it is traversing through the hidden directories and listing normal files, which should be avoided. `find . \( ! -name ".*" -prune \) -mtime +${n_days}... (7 Replies)
Discussion started by: ksailesh1
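
The quoted command traverses hidden directories because ! -name ".*" is true for the non-hidden entries, so those are the ones that get pruned, while the hidden directories are left to be descended. A sketch that keeps the search to the current directory and skips hidden names; n_days is taken from the post:
Code:
# prune everything one level down, keep non-hidden regular files older than n_days
find . ! -name . -prune -type f ! -name '.*' -mtime +${n_days} -print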

10. UNIX for Beginners Questions & Answers

Searching for files based on current date directory

Hi all, I've been trying to do some recursive searching but have not been very successful. Can someone please help? Scenario: I have the directory structure /dir1/dir2/dir3/2019/11/ with day directories 17, 18, 19 and 20, so what I want to do is run a script, and as it's 2019/11/18 today, it would go and only search... (3 Replies)
Discussion started by: israr75
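
A sketch that builds today's path from the date, assuming the day directories are zero-padded the way date +%d prints them; the base path mirrors the example in the post:
Code:
today_dir="/dir1/dir2/dir3/$(date +%Y/%m/%d)"
if [ -d "$today_dir" ]; then
    find "$today_dir" -type f -print    # only today's directory is searched
fi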
PWD(1)								   User Commands							    PWD(1)

NAME
       pwd - print name of current/working directory

SYNOPSIS
       pwd [OPTION]...

DESCRIPTION
       Print the full filename of the current working directory.

       -L, --logical
              use PWD from environment, even if it contains symlinks

       -P, --physical
              avoid all symlinks

       --help display this help and exit

       --version
              output version information and exit

       NOTE: your shell may have its own version of pwd, which usually supersedes the version described here. Please refer to your shell's
       documentation for details about the options it supports.

AUTHOR
       Written by Jim Meyering.

REPORTING BUGS
       Report pwd bugs to bug-coreutils@gnu.org
       GNU coreutils home page: <http://www.gnu.org/software/coreutils/>
       General help using GNU software: <http://www.gnu.org/gethelp/>
       Report pwd translation bugs to <http://translationproject.org/team/>

COPYRIGHT
       Copyright (C) 2010 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>.
       This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law.

SEE ALSO
       getcwd(3)

       The full documentation for pwd is maintained as a Texinfo manual. If the info and pwd programs are properly installed at your site, the
       command

              info coreutils 'pwd invocation'

       should give you access to the complete manual.

GNU coreutils 8.5                              February 2011                              PWD(1)