08-02-2007
Listing files with full path
Hi,
I need to store a list of all the files under a directory in a text file, with each file's full path.
The example below shows what I mean:
./File1.txt
./File2.txt
./Folder1/File11.txt
./Folder1/File12.txt
./Folder1/Folder11/File111.txt
./Folder2/file21.txt
:
:
The ls -R1 command doesn't give the result I want. Please help.
Regards,
Sethu.
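A minimal sketch of one way to get exactly that listing, assuming a POSIX-compatible find (the output file name is illustrative):
$ find . -type f > filelist.txt
$ cat filelist.txt
./File1.txt
./File2.txt
./Folder1/File11.txt
...
Dropping -type f would include the directories themselves in the listing.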
10 More Discussions You Might Find Interesting
1. UNIX for Advanced & Expert Users
How can I list every single file on a Sun Solaris 2.8 server, starting from '/', with the full path included?
example.
/
...
...
...
/etc/inetd.conf
/etc/passwd
/etc/shadow
...
...
...
/var/adm/messages
/var/adm/messages.0
/var/adm/messages.1
...
...
...... (4 Replies)
Discussion started by: Sowser
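For the Solaris question above, a sketch of one approach using the stock find (the output file name is illustrative):
$ find / -print > /tmp/all_files.txt
Adding ! -type d before -print restricts the listing to entries that are not directories.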
2. Shell Programming and Scripting
Hi all,
How do I save the full file name (including its path) to a file? I tried the following but don't know how to include the path name.
$ ls -l | awk '{print $9}' > outputfile.dat
$ cat outputfile.dat
fifth.txt
first.txt
fourth.txt
second.txt
third.txt
The result I want is, e.g.:
... (3 Replies)
Discussion started by: mr_bold
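One possible sketch for that question, assuming the files sit in the current directory (output file name taken from the post):
$ ls -d "$PWD"/* > outputfile.dat
Prefixing the glob with $PWD turns the bare names that ls would print into absolute paths.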
3. Red Hat
How can I use ls, or another command, to list the full paths of files?
I looked through the man page for ls with no luck.
$ cd /tmp/
$ ls -l
total 6
drwx------ 2 root root 4096 Nov 7 2008 keyring-7b5rMv
drwx------ 2 bcr bcr 4096 Dec 7 2007 keyring-cGhir8
$
I'd be looking for... (1 Reply)
Discussion started by: brendan76
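A sketch of one way to get full paths from the /tmp example above (path taken from the post):
$ ls -ld /tmp/*
Because the argument is an absolute path, ls echoes each entry with its full path.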
4. Shell Programming and Scripting
Hello everyone,
I'm a novice in the field of Linux, so please help me out with this problem.
A text file with the following contents is given:
file1
file2
file3
file4
file5
A script is to be written to list all the file names and tar the files with the filename... (3 Replies)
Discussion started by: Amruthesh C
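A rough sketch for that one, assuming the list is saved as filelist.txt (name illustrative) and a GNU tar that accepts a file list via -T:
$ tar -cvf archive.tar -T filelist.txt
With a tar that lacks -T, something like tar -cvf archive.tar $(cat filelist.txt) works as long as the file names contain no spaces.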
5. Windows & DOS: Issues & Discussions
Hi,
(Apologies, I'm sure I'm not the first person to raise this question, but so far my searches haven't found a good answer.)
I would like to output a listing per line of filename (including full path) and 'last updated' timestamp. e.g:
Z:\dir1\file1.txt 01/02/2010 10:43... (5 Replies)
Discussion started by: GM_AIX
6. Shell Programming and Scripting
/Path/snowbird9/nrfCompMgrRave1230100920.log.gz:09/20/2010 06:14:51 ERROR Error Message.
/Path/snowbird6/nrfCompMgrRave1220100920.log.gz:09/20/2010 06:14:51 ERROR Error Message.
/Path/snowbird14/nrfCompMgrRave920100920.log.gz:09/20/2010 06:14:51 ERROR Error Message.... (0 Replies)
Discussion started by: Shirisha
7. Shell Programming and Scripting
Hi all,
I want to check the list of all directories and links in a particular directory. I have the list of the directories/links which I need to print on screen.
I used the command below to check the directories/links:
cd path1 ; ls -ltd `cat dir_links_list`
But here, I don't want to... (3 Replies)
Discussion started by: raghu.iv85
8. Shell Programming and Scripting
How do I list all subdirectories and files in a parent directory with their full paths? (1 Reply)
Discussion started by: johnveslin
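A minimal sketch, with the parent directory name as a placeholder:
$ find /path/to/parent -print
find walks the tree and prints every file and subdirectory under the given directory, each with its full path.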
9. Shell Programming and Scripting
Hi,
I need to do a find and replace, but the pattern is not fully known.
for example,
my file has /proj/app-d1/sun or /data/site-d1/conf
Here app-d1 and site-d1 are not constant; they may be different in different files. The common parts are /proj/xx/sun and /data/xxx/conf.
I want to find wherever... (6 Replies)
Discussion started by: rbalaj16
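A sketch of the kind of pattern that copes with the unknown middle component, assuming sed (the replacement directory newdir and the file name myfile are illustrative):
$ sed -e 's|/proj/[^/]*/sun|/proj/newdir/sun|g' -e 's|/data/[^/]*/conf|/data/newdir/conf|g' myfile
[^/]* matches whatever sits between the fixed /proj/ (or /data/) prefix and the /sun (or /conf) suffix.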
10. UNIX for Beginners Questions & Answers
My requirement is to list files older than 30 days along with their size and full path (files should be listed in descending order by size).
output:
12345 /app/testing/file1
12341 /app/testing/file2 (5 Replies)
Discussion started by: Rajesh123
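A sketch for that requirement, assuming GNU find with -printf (the directory name is taken from the sample output):
$ find /app/testing -type f -mtime +30 -printf '%s %p\n' | sort -rn
-printf '%s %p\n' prints the size in bytes followed by the full path, and sort -rn orders the lines by size, largest first.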
LEARN ABOUT REDHAT
WWW::RobotRules
WWW::RobotRules(3) User Contributed Perl Documentation WWW::RobotRules(3)
NAME
WWW::RobotRules - Parse robots.txt files
SYNOPSIS
    require WWW::RobotRules;
    my $robotsrules = new WWW::RobotRules 'MOMspider/1.0';

    use LWP::Simple qw(get);

    $url = "http://some.place/robots.txt";
    my $robots_txt = get $url;
    $robotsrules->parse($url, $robots_txt);

    $url = "http://some.other.place/robots.txt";
    $robots_txt = get $url;   # reuse the variable; a second "my" would mask the earlier declaration
    $robotsrules->parse($url, $robots_txt);

    # Now we are able to check if a URL is valid for those servers that
    # we have obtained and parsed "robots.txt" files for.
    if ($robotsrules->allowed($url)) {
        $c = get $url;
        ...
    }
DESCRIPTION
This module parses a /robots.txt file as specified in "A Standard for Robot Exclusion", described in
<http://info.webcrawler.com/mak/projects/robots/norobots.html>. Webmasters can use the /robots.txt file to disallow conforming robots access
to parts of their web site.
The parsed file is kept in the WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited.
The same WWW::RobotRules object can parse multiple /robots.txt files.
The following methods are provided:
$rules = WWW::RobotRules->new($robot_name)
This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of the robot.
$rules->parse($robot_txt_url, $content, $fresh_until)
The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the contents of the file.
$rules->allowed($uri)
Returns TRUE if this robot is allowed to retrieve this URL.
$rules->agent([$name])
Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times out of the cache.
ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
<http://info.webcrawler.com/mak/projects/robots/norobots.html>):
The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form
<field-name>: <value>
The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The
following <field-names> can be used:
User-Agent
The value of this field is the name of the robot the record is describing access policy for. If more than one User-Agent field is
present, the record describes an identical access policy for more than one robot. At least one field needs to be present per record. If
the value is '*', the record describes the default access policy for any robot that has not matched any of the other records.
Disallow
The value of this field specifies a partial URL that is not to be visited. This can be a full path or a partial path; any URL that
starts with this value will not be retrieved.
ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called
"cybermapper":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:
This example indicates that no robots should visit this site further:
# go away
User-agent: *
Disallow: /
SEE ALSO
LWP::RobotUA, WWW::RobotRules::AnyDBM_File
libwww-perl-5.65 2001-04-20 WWW::RobotRules(3)