Shell Programming and Scripting

Sending Multiple files as attachments in directory

Post 303032381 by Corona688 on Friday 15th of March 2019 05:53:41 PM
Bumping up posts or double posting is not permitted in these forums.

Please read the rules, which you agreed to when you registered, if you have not already done so.

You may receive an infraction for this. If so, don't worry; just try to follow the rules more carefully. The infraction will expire in the near future.

Thank You.

The UNIX and Linux Forums.
 

8 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

sending files as attachments

How do I send a file as an attachment on a Unix system?

Discussion started by: SmartJuniorUnix (9 Replies)
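The classic answer on systems without a MIME-aware mailer is to uuencode the file into the message body. A minimal sketch, assuming uuencode and mailx are available; the file name and recipient are hypothetical:

    # uuencode writes an encoded copy of report.csv into the mail body
    # under the same name; uudecode-aware readers recover it as a file.
    uuencode report.csv report.csv | mailx -s "Daily report" user@example.com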

2. Shell Programming and Scripting

Sending an email with more than one file

Hi, I would like to send an email with more than one attachment. I am using uuencode and would like to achieve this with uuencode. Please also let me know of other ways. -Thambi

Discussion started by: thambi (7 Replies)
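With uuencode, multiple attachments can be sent by concatenating one encoded block per file into a single message body. A minimal sketch, not taken from the thread; the file names and recipient are hypothetical:

    # Each uuencode block becomes one attachment for uudecode-aware readers.
    ( cat body.txt
      uuencode report1.csv report1.csv
      uuencode report2.csv report2.csv
    ) | mailx -s "Reports" user@example.com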

3. Shell Programming and Scripting

Script to monitor files in a directory and send an alert

Hi All, we have important config files in a directory that is accessible by all: /auto/config/Testbed/>ls config1.intial config2.intial config3.inital. We often find that some lines are missing from the config files, and we suspect someone is removing them. I would like to write... 

Discussion started by: shellscripter (0 Replies)
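The thread has no replies, but one common approach is to record a checksum baseline and compare against it from cron, mailing a report when anything differs. A minimal sketch, assuming GNU md5sum and mailx; the baseline path and address are hypothetical:

    #!/bin/bash
    # First run records a baseline; later runs compare against it.
    WATCH_DIR=/auto/config/Testbed
    BASELINE=/var/tmp/config.md5

    if [ ! -f "$BASELINE" ]; then
        md5sum "$WATCH_DIR"/config* > "$BASELINE"
        exit 0
    fi

    # md5sum -c exits non-zero if any file fails the check.
    if ! md5sum -c "$BASELINE" > /var/tmp/config.report 2>&1; then
        mailx -s "Config files changed in $WATCH_DIR" admin@example.com \
            < /var/tmp/config.report
    fi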

4. Shell Programming and Scripting

Grepping file names, comparing them to a directory of files, and moving them into a new directory

got it figured out :)

Discussion started by: sHockz (1 Reply)
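The original poster solved this without posting the answer, but the title describes a common pattern: collect file names with grep, then move every match into a new directory. A minimal sketch; the pattern, log file, and directories are all hypothetical:

    # Build a unique list of names, then move any that exist in /src.
    grep -o 'backup_[0-9]*\.tar' source.log | sort -u |
    while IFS= read -r name; do
        [ -e "/src/$name" ] && mv -- "/src/$name" /dest/
    done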

5. Programming

GFORTRAN sending .o files to a separate directory

I am using gfortran and want to send the .o files to a separate directory. Currently I have the following in a bash script. The .mod files are handled OK, but the .o files are still created in the same directory as the source code: fsrc="newunit.f08 math.f08 tString.f08 prout.f08 prary.f08... 

Discussion started by: kristinu (1 Reply)
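gfortran's -J option only redirects .mod files; the object file lands wherever -o points, so each compile needs an explicit object path. A minimal sketch, assuming hypothetical obj/ and mod/ directories:

    #!/bin/bash
    # -J redirects .mod files; -o places each object in obj/.
    mkdir -p obj mod
    for f in newunit.f08 math.f08 tString.f08; do
        gfortran -c "$f" -J mod -o "obj/${f%.f08}.o"
    done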

6. Shell Programming and Scripting

List files with date, create directory, move to the created directory

Hi all, I have a folder with tons of files like the following in /my/folder/jobs/: some_name_2016-01-17-22-38-58_some name_0_0.zip.done some_name_2016-01-17-22-40-30_some name_0_0.zip.done some_name_2016-01-17-22-48-50_some name_0_0.zip.done, and there can be lots of similar files,... 

Discussion started by: charli1 (6 Replies)
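One way to handle names like those is to pull the date stamp out of each file name with bash's regex matching, create a directory for it, and move the file in. A minimal sketch using the path from the post:

    #!/bin/bash
    # Extract YYYY-MM-DD from each name and file the entry under it.
    cd /my/folder/jobs || exit 1
    for f in *.zip.done; do
        if [[ $f =~ _([0-9]{4}-[0-9]{2}-[0-9]{2})- ]]; then
            d=${BASH_REMATCH[1]}
            mkdir -p "$d" && mv -- "$f" "$d/"
        fi
    done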

7. Shell Programming and Scripting

Directory containing files: print the names of the files in the directory that have exactly the same content

Given a directory containing, say, a few thousand files, please output a list of all the names of the files in the directory that are exactly the same, i.e. have the same contents. func(a_directory_name) output -> {"matches": , ... ]} e.g. func("/home/my/files") where the directory... 

Discussion started by: anuragpgtgerman (7 Replies)
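A shell take on this is to checksum every file and keep only the digests that occur more than once. A minimal sketch, assuming GNU md5sum and GNU uniq (-w32 compares only the 32-character digest); the directory comes from the post:

    # Groups of identical files are printed together, blank-line separated.
    find /home/my/files -maxdepth 1 -type f -exec md5sum {} + |
        sort | uniq -w32 --all-repeated=separate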

8. Shell Programming and Scripting

Xargs and rsync not sending all files

Hi all, I have a script issue I can't seem to work out. In a directory I have several folders, and I want to send just the sapdata1 to sapdata14 folders and their contents, but not sapdataXX/.snapshot. The script is: #!/bin/bash # SETUP OPTIONS export SRCDIR="/scratch/doug/test/sapdata*"... 

Discussion started by: DougyC (5 Replies)
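rsync can do the filtering itself, which avoids the xargs step entirely: an --exclude pattern drops every .snapshot directory at any depth. A minimal sketch using the source path from the post; the destination host and path are hypothetical:

    #!/bin/bash
    # --exclude matches .snapshot at any level below each sapdata dir.
    for d in /scratch/doug/test/sapdata*; do
        rsync -a --exclude='.snapshot/' "$d" user@desthost:/target/
    done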
WWW::RobotRules(3)					User Contributed Perl Documentation					WWW::RobotRules(3)

NAME
       WWW::RobotRules - Parse robots.txt files

SYNOPSIS
       require WWW::RobotRules;
       my $robotsrules = new WWW::RobotRules 'MOMspider/1.0';

       use LWP::Simple qw(get);

       $url = "http://some.place/robots.txt";
       my $robots_txt = get $url;
       $robotsrules->parse($url, $robots_txt);

       $url = "http://some.other.place/robots.txt";
       my $robots_txt = get $url;
       $robotsrules->parse($url, $robots_txt);

       # Now we are able to check if a URL is valid for those servers that
       # we have obtained and parsed "robots.txt" files for.
       if ($robotsrules->allowed($url)) {
           $c = get $url;
           ...
       }

DESCRIPTION
       This module parses a /robots.txt file as specified in "A Standard for
       Robot Exclusion", described in
       <http://info.webcrawler.com/mak/projects/robots/norobots.html>.
       Webmasters can use the /robots.txt file to disallow conforming robots
       access to parts of their web site.

       The parsed file is kept in the WWW::RobotRules object, and this object
       provides methods to check if access to a given URL is prohibited. The
       same WWW::RobotRules object can parse multiple /robots.txt files.

       The following methods are provided:

       $rules = WWW::RobotRules->new($robot_name)
           This is the constructor for WWW::RobotRules objects. The first
           argument given to new() is the name of the robot.

       $rules->parse($robot_txt_url, $content, $fresh_until)
           The parse() method takes as arguments the URL that was used to
           retrieve the /robots.txt file, and the contents of the file.

       $rules->allowed($uri)
           Returns TRUE if this robot is allowed to retrieve this URL.

       $rules->agent([$name])
           Get/set the agent name. NOTE: Changing the agent name will clear
           the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
       The format and semantics of the "/robots.txt" file are as follows
       (this is an edited abstract of
       <http://info.webcrawler.com/mak/projects/robots/norobots.html>):

       The file consists of one or more records separated by one or more
       blank lines. Each record contains lines of the form

           <field-name>: <value>

       The field name is case insensitive. Text after the '#' character on a
       line is ignored during parsing. This is used for comments. The
       following <field-names> can be used:

       User-Agent
           The value of this field is the name of the robot the record is
           describing access policy for. If more than one User-Agent field
           is present the record describes an identical access policy for
           more than one robot. At least one field needs to be present per
           record. If the value is '*', the record describes the default
           access policy for any robot that has not matched any of the
           other records.

       Disallow
           The value of this field specifies a partial URL that is not to be
           visited. This can be a full path, or a partial path; any URL that
           starts with this value will not be retrieved.

ROBOTS.TXT EXAMPLES
       The following example "/robots.txt" file specifies that no robots
       should visit any URL starting with "/cyberworld/map/" or "/tmp/":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space
           Disallow: /tmp/            # these will soon disappear

       This example "/robots.txt" file specifies that no robots should visit
       any URL starting with "/cyberworld/map/", except the robot called
       "cybermapper":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space

           # Cybermapper knows where to go.
           User-agent: cybermapper
           Disallow:

       This example indicates that no robots should visit this site further:

           # go away
           User-agent: *
           Disallow: /

SEE ALSO
       LWP::RobotUA, WWW::RobotRules::AnyDBM_File

libwww-perl-5.65                  2001-04-20                WWW::RobotRules(3)