Full Discussion: Rename files
Post 302894150 by anbu23 on Monday 24th of March 2014 06:37:54 AM
What have you tried?
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

rename files

Hey all, I have files named in the formats ABCD20061101 and ABCDEF20061101 in one directory. I would like to change every ABCD20061101 to ABCDEF20061101, but the problem is that if I do a simple pattern match on ABCD, the ABCDEF20061101 files would also... (2 Replies)
Discussion started by: mpang_
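A minimal sketch in bash, assuming the names to change really are ABCD followed immediately by digits: anchoring the glob on a digit keeps the ABCDEF files from matching at all.

    # ABCD[0-9]* matches ABCD20061101 but not ABCDEF20061101
    for f in ABCD[0-9]*; do
        mv "$f" "ABCDEF${f#ABCD}"   # keep the date part, swap the prefix
    done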

2. UNIX for Dummies Questions & Answers

rename files help

Hi, I've already searched for this issue and found several file-renaming scripts, but I don't know how to customize them for my needs. Here's what I want to do: I have a lot of files inside many directories, like this: /aa/01.txt /aa/02.txt /ab/01.txt /ab/02.txt I want all those files... (2 Replies)
Discussion started by: piltrafa
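The request is cut off, but a common version of this task is flattening the files into one place while keeping them distinct by folding the directory name into the file name; a sketch under that assumption (the aa_01.txt target scheme is hypothetical):

    # /aa/01.txt -> aa_01.txt, /ab/01.txt -> ab_01.txt, in the current directory
    for f in */*.txt; do
        dir=${f%%/*}                 # leading directory, e.g. aa
        mv "$f" "${dir}_${f##*/}"    # prefix it to the base name
    done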

3. Shell Programming and Scripting

Rename files

Hello, I have a list of files like this: img_001 img_22 img_44. I would like to rename them all in this form: photo_0001 photo_0002 photo_0003 photo_0004. Suggestions? Thanks to all. (2 Replies)
Discussion started by: cv313x
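One way to do this in bash, assuming the existing img_* files sort in the order the new numbering should follow: walk them with a counter and zero-pad it with printf.

    n=1
    for f in img_*; do
        mv "$f" "$(printf 'photo_%04d' "$n")"   # img_22 -> photo_0002, etc.
        n=$((n + 1))
    done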

4. Shell Programming and Scripting

Rename many files

Hi all, I have files in the following format: 01_anote1.pdf 01_bnote1.pdf 01_control1.pdf 01_ethics1.pdf 01_invoice1.pdf 01_invoice_21.pdf 20_quote_l1.pdf I need to rename them to 01_anote.pdf 01_bnote.pdf 01_control.pdf 01_ethics.pdf 01_invoice.pdf (9 Replies)
Discussion started by: lmatlebyane
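Every sample name shown ends in a literal 1 just before .pdf, so stripping that one character is enough; a sketch assuming the pattern holds for all the files:

    for f in *1.pdf; do
        mv "$f" "${f%1.pdf}.pdf"    # 01_anote1.pdf -> 01_anote.pdf
    done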

5. Shell Programming and Scripting

Rename files

Hi, I want to rename a bunch of files, changing ":" to "-", i.e., rename a file named file1:file1 to file1-file1. Any ideas? (2 Replies)
Discussion started by: linuxaddict7
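In bash this is one substitution per name; a minimal sketch using the ${var//pattern/replacement} expansion:

    for f in *:*; do
        mv "$f" "${f//:/-}"    # file1:file1 -> file1-file1
    done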

6. Shell Programming and Scripting

rename files Ax based on strings found in files Bx

Hi, I'm not very experienced in shell scripting, and that's probably why I ran into the following problem: I have several hundred pairs of text files (PF00x.spl and PF00x.shd) where the first file (PF00x.spl) needs to be renamed according to a string that is included in the second file... (12 Replies)
Discussion started by: inCH
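The marker that precedes the string inside the .shd files isn't shown, so the NAME= pattern below is purely a placeholder; the shape of the loop is the point.

    for shd in PF*.shd; do
        # extract the new name; replace NAME= with the real marker
        new=$(sed -n 's/^NAME=//p' "$shd" | head -n 1)
        [ -n "$new" ] && mv "${shd%.shd}.spl" "$new.spl"
    done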

7. Shell Programming and Scripting

How to rename files

Hi guys, I have to rename about 180 files in different folders on Linux. For example: abc_110117.txt eff_110117.txt zzz_110117.txt After renaming, the files should look like abc.txt eff.txt zzz.txt I created a small script to rename the files like ls... (2 Replies)
Discussion started by: naveed
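A sketch assuming the date is always six digits after the last underscore: strip everything from that underscore onward and put the extension back.

    for f in *_[0-9][0-9][0-9][0-9][0-9][0-9].txt; do
        mv "$f" "${f%_*}.txt"    # abc_110117.txt -> abc.txt
    done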

8. UNIX for Dummies Questions & Answers

Rename all .sh files to .pl

I have various .sh and .pl files in one directory. I want to rename all the .sh files to .pl, i.e. testscript.sh --> testscript.pl. I am trying to use mv *.sh *.pl. It doesn't work though!! (3 Replies)
Discussion started by: chrisjones
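mv *.sh *.pl fails because the shell expands both globs before mv ever runs, so mv just sees a flat list of existing names; mv can rename one file or move many into a directory, but it has no pattern-rewrite mode. The usual fix is a loop:

    for f in *.sh; do
        mv "$f" "${f%.sh}.pl"    # testscript.sh -> testscript.pl
    done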

9. Shell Programming and Scripting

Script to unzip files and Rename the Output-files

Hi all, I have many folders with zipped files in them. The zipped files are txt files from different folders, and the txt files have the same names. If I try find . -type f -name "*.zip" -exec cp -R {} /myhome/ZIP \; it fails, since the ZIP files from different folders have the same names and... (2 Replies)
Discussion started by: pmkenya
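The collisions come from identical base names, so one fix is to encode each file's source directory into the target name before copying; a sketch assuming /myhome/ZIP from the post is the destination:

    find . -type f -name '*.zip' | while IFS= read -r z; do
        d=$(dirname "$z"); d=${d#./}; d=${d//\//_}      # a/b -> a_b
        cp "$z" "/myhome/ZIP/${d}_$(basename "$z")"     # unique per source dir
    done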

10. Shell Programming and Scripting

SBATCH trinity for multiple files and rename/move the output files

Hey guys, I have written the following script to apply a module named "trinity" to my files (it takes two input files and emits a trinity.fasta as output). #!/bin/bash -l #SBATCH -p node #SBATCH -A <projectID> #SBATCH -n 16 #SBATCH -t 7-00:00:00 #SBATCH --mem=128GB #SBATCH --mail-type=ALL... (1 Reply)
Discussion started by: @man
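Only the SBATCH header survives the truncation, so this is a guess at the missing shape: a driver loop that submits one job per read pair and passes the sample name along so the job script can rename Trinity's output. The _R1/_R2 naming, run_trinity.sh, and the output path are all hypothetical.

    # submit one sbatch job per sample pair
    for r1 in *_R1.fq; do
        s=${r1%_R1.fq}
        sbatch --job-name="$s" run_trinity.sh "$r1" "${s}_R2.fq" "$s"
    done
    # inside run_trinity.sh, after Trinity finishes ($3 is the sample name):
    #   mv "$3"_trinity/Trinity.fasta "$3".Trinity.fasta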
WWW::RobotRules(3)          User Contributed Perl Documentation          WWW::RobotRules(3)

NAME
    WWW::RobotRules - Parse robots.txt files

SYNOPSIS
    require WWW::RobotRules;
    my $robotsrules = new WWW::RobotRules 'MOMspider/1.0';

    use LWP::Simple qw(get);

    $url = "http://some.place/robots.txt";
    my $robots_txt = get $url;
    $robotsrules->parse($url, $robots_txt);

    $url = "http://some.other.place/robots.txt";
    $robots_txt = get $url;
    $robotsrules->parse($url, $robots_txt);

    # Now we are able to check if a URL is valid for those servers that
    # we have obtained and parsed "robots.txt" files for.
    if ($robotsrules->allowed($url)) {
        $c = get $url;
        ...
    }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for
    Robot Exclusion", described in
    <http://info.webcrawler.com/mak/projects/robots/norobots.html>.
    Webmasters can use the /robots.txt file to deny conforming robots
    access to parts of their web site.

    The parsed file is kept in the WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited. The
    same WWW::RobotRules object can parse multiple /robots.txt files.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first
        argument given to new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to
        retrieve the /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear
        the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows
    (this is an edited abstract of
    <http://info.webcrawler.com/mak/projects/robots/norobots.html>):

    The file consists of one or more records separated by one or more
    blank lines. Each record contains lines of the form

        <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a
    line is ignored during parsing; this is used for comments. The
    following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is
        describing access policy for. If more than one User-Agent field is
        present, the record describes an identical access policy for more
        than one robot. At least one field needs to be present per record.
        If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other
        records.

    Disallow
        The value of this field specifies a partial URL that is not to be
        visited. This can be a full path or a partial path; any URL that
        starts with this value will not be retrieved.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots
    should visit any URL starting with "/cyberworld/map/" or "/tmp/":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space
        Disallow: /tmp/            # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit
    any URL starting with "/cyberworld/map/", except the robot called
    "cybermapper":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space

        # Cybermapper knows where to go.
        User-agent: cybermapper
        Disallow:

    This example indicates that no robots should visit this site further:

        # go away
        User-agent: *
        Disallow: /

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

libwww-perl-5.65                   2001-04-20                   WWW::RobotRules(3)