Full Discussion: File Exists
Post 302397078 by shady4u in Shell Programming and Scripting, Saturday 20 February 2010, 06:23:59 PM
Let me explain the full scenario. For example, I have three files created by an Informatica process and stored on a Unix server:

1. a.txt
2. b.txt
3. c.txt

Before I start my next process, I need to make sure these files exist in the folder and that b.txt has records in it; in other words, there should be some records inside the file and it should not be empty.

If b.txt is empty, I need to run my Informatica process again and make sure b.txt has data in it.

Once b.txt has data, I move on to the next process.

Overall, I am checking two things here: first, that the files were loaded today (today only, not yesterday), and second, that b.txt has data in it.
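
A minimal sketch of such a check, assuming the files live in a directory called /data/incoming (a purely illustrative path) and leaving the actual Informatica rerun to the surrounding job control:

Code:
#!/bin/sh
# Hypothetical directory holding the Informatica output files
DIR=/data/incoming
REF=/tmp/midnight.ref

# 1. All three files must exist
for f in a.txt b.txt c.txt
do
    if [ ! -f "$DIR/$f" ]; then
        echo "$f is missing" >&2
        exit 1
    fi
done

# 2. b.txt must not be empty (-s is true only for a non-empty file)
if [ ! -s "$DIR/b.txt" ]; then
    echo "b.txt is empty - rerun the Informatica load" >&2
    exit 2
fi

# 3. All files must have been loaded today: compare each one against
#    a reference file stamped at 00:00 today
touch -t "$(date +%Y%m%d)0000" "$REF"
for f in a.txt b.txt c.txt
do
    if [ -z "$(find "$DIR/$f" -newer "$REF" -print)" ]; then
        echo "$f was not loaded today" >&2
        exit 3
    fi
done

echo "All checks passed - continuing to the next process"

The -s test covers the "not empty" requirement, and comparing each file against a reference file touched at midnight covers the "loaded today" requirement; the exit codes are arbitrary and only there so a calling job can decide whether to rerun the load.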
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Check File Exists and compare to previous day file script

We have data files that are ftp'd every morning to a SUN server. The file names are exactly the same except that each has the date included in its name. I have to write a script to do 2 things: STEP 1) Verify that the file arrived in the morning. STEP 2) Compare the file size of the current... (3 Replies)
Discussion started by: rbknisely
3 Replies
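
A rough sketch of both steps, assuming a hypothetical naming pattern of data_YYYYMMDD.txt in a hypothetical directory /data/in:

Code:
#!/bin/sh
DIR=/data/in
TODAY="$DIR/data_$(date +%Y%m%d).txt"
# GNU date shown for "yesterday"; Solaris needs a different method
YESTERDAY="$DIR/data_$(date -d yesterday +%Y%m%d).txt"

# Step 1: did today's file arrive?
if [ ! -f "$TODAY" ]; then
    echo "Today's file has not arrived" >&2
    exit 1
fi

# Step 2: compare sizes (wc -c prints the byte count)
size_today=$(wc -c < "$TODAY")
size_yesterday=$(wc -c < "$YESTERDAY")
echo "today: $size_today bytes, yesterday: $size_yesterday bytes"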

2. Shell Programming and Scripting

Need to write a script in UNIX to find a file if another file exists

So I have a lot of Java applications on my servers, each having its own folder under the applications subdirectory. Now, I need to do the following. Search all the applications subdirectories for message.jar. If the message.jar file exists, I need to search the application directory for... (1 Reply)
Discussion started by: mmdawg
1 Reply
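
The second half of the question is cut off, but the first half (find every application directory containing message.jar) can be sketched with find, assuming a hypothetical applications root of /opt/applications:

Code:
#!/bin/sh
APPS=/opt/applications    # hypothetical root, one folder per application

# Print each application directory that contains a message.jar
find "$APPS" -type f -name message.jar |
while read -r jar
do
    appdir=$(dirname "$jar")
    echo "message.jar found in $appdir"
    # any further search inside $appdir would go here
done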

3. Shell Programming and Scripting

file exists

Hello, I want to know if a file exists, but do not know how: contador=0 while (file$contador exists) do bla bla bla contador=$(($contador+1)) done How can I do it? Thanks (4 Replies)
Discussion started by: uri_crack
4 Replies
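
The pseudocode maps almost directly onto a test -f inside a while loop; a minimal sketch:

Code:
#!/bin/sh
# Loop for as long as file0, file1, file2, ... exist
contador=0
while [ -f "file$contador" ]
do
    echo "file$contador exists"      # the real work goes here
    contador=$(($contador + 1))
done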

4. Shell Programming and Scripting

Grep pattern from different file and display if it exists in the required file

Hi, I have two files, say xxx.txt and yyy.txt. xxx.txt has a list of patterns within double quotes. E.g. "this is the line1" "this is the line2" yyy.txt has a lot of lines, e.g.: "This is a test message which contains rubbish information just to fill the page which is of no use. this is... (3 Replies)
Discussion started by: abinash
3 Replies
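
A sketch, assuming each line of xxx.txt holds one double-quoted pattern and the quotes themselves should not be part of the match:

Code:
#!/bin/sh
# Strip the surrounding double quotes, then match the lines as fixed strings
sed 's/^"//; s/"$//' xxx.txt > /tmp/patterns.$$
grep -F -f /tmp/patterns.$$ yyy.txt
rm -f /tmp/patterns.$$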

5. Shell Programming and Scripting

Newbie.. Find if a file exists and open, if not create the desired file..

Hey all, I'm brand new to script writing, I'm wanting to make a script that will ask for a file and then retrieve that file if it exists, and if it doesn't exist, create the file with the desired name, and I'm completely stuck.. so far.. #! bin/bash echo "Enter desired file" read "$file" if ... (5 Replies)
Discussion started by: Byrang
5 Replies
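
The quoted attempt is close: the shebang needs a leading slash and read takes a variable name without the $. A corrected sketch:

Code:
#!/bin/bash
echo "Enter desired file"
read file                      # no $ when assigning via read
if [ -f "$file" ]; then
    cat "$file"                # "open" it - here it is simply displayed
else
    touch "$file"              # create an empty file with that name
    echo "Created $file"
fi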

6. Shell Programming and Scripting

Script to check for the file existence, if file exists it should echo the no of modified days

Hi, I am looking for a shell script with the following. 1. It should check whether a particular file exists in a location #!/bin/sh if ; then echo "xxx.txt File Exists" else echo "File Not Found" fi 2. If file exists, it should check for the modified date and run a command... (2 Replies)
Discussion started by: karthikeyan_mac
2 Replies
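
The test condition was evidently stripped when the post was rendered; a sketch that restores it and also reports the file's age in days (GNU date shown, so treat the age calculation as an assumption on other systems):

Code:
#!/bin/sh
FILE=xxx.txt

if [ -f "$FILE" ]; then
    echo "xxx.txt File Exists"
    # Whole days since last modification (GNU date; adjust elsewhere)
    mtime=$(date -r "$FILE" +%s)
    now=$(date +%s)
    echo "Modified $(( (now - mtime) / 86400 )) day(s) ago"
else
    echo "File Not Found"
fi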

7. Windows & DOS: Issues & Discussions

Script that, if file exists in Samba share, moves file to Unix server

I'm looking to do pretty much what the title says. I want a script that runs, it can run on Unix or Windows, doesn't matter, and searches a Samba share for a .txt file. If the file exists, the script will move (or possibly copy) the file from the Samba share into a directory on our Unix... (3 Replies)
Discussion started by: twcostello
3 Replies
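
If the Samba share is already mounted on the Unix side (the mount point /mnt/share and destination /data/incoming below are hypothetical), the check-and-move reduces to a plain shell test:

Code:
#!/bin/sh
SHARE=/mnt/share          # hypothetical mount point of the Samba share
DEST=/data/incoming       # hypothetical destination on the Unix server

for f in "$SHARE"/*.txt
do
    [ -f "$f" ] || continue      # no match: the glob stays literal, skip it
    mv "$f" "$DEST"/
    echo "Moved $(basename "$f") to $DEST"
done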

8. Shell Programming and Scripting

File exists, but cannot be opened. How to check whether it can be opened for reading when it exists

Hi #Testing for file existence if ; then echo 'SCHOOL data is available for processing' else echo 'SCHOOL DATA IS NOT AVAILABLE FOR PROCESSING' : i wrote a script, where it begins by checking if file exists or not. If it exists, it truncates the database... (2 Replies)
Discussion started by: rxg
2 Replies
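
Readability is a separate test from existence, so the two can be checked in turn; the file name SCHOOL.dat below is only a stand-in:

Code:
#!/bin/sh
FILE=SCHOOL.dat           # stand-in name

if [ ! -f "$FILE" ]; then
    echo "SCHOOL DATA IS NOT AVAILABLE FOR PROCESSING"
elif [ ! -r "$FILE" ]; then
    echo "SCHOOL data exists but cannot be opened for reading (check permissions)"
else
    echo "SCHOOL data is available for processing"
fi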

9. Shell Programming and Scripting

Search if file exists for a file pattern stored in array

Hi experts, I have two arrays: one has the file paths to be searched in, and the other has the files to be searched. For example, searchfile.dat will have abc303 xyz123, and I have to search for files that could be abc303*.dat or, for that matter, any extension, e.g. abc303*.dat.gz. The following code... (2 Replies)
Discussion started by: 100bees
2 Replies
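
A bash sketch, with both arrays filled with hypothetical values:

Code:
#!/bin/bash
dirs=(/data/a /data/b)        # paths to search in (hypothetical)
names=(abc303 xyz123)         # file name prefixes to look for

for d in "${dirs[@]}"
do
    for n in "${names[@]}"
    do
        # prefix followed by anything, with any extension
        for f in "$d/$n"*.*
        do
            [ -e "$f" ] && echo "found: $f"
        done
    done
done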

10. UNIX for Beginners Questions & Answers

File exists or not

Dear All, I am facing a small issue in my code wherein I am searching for a file in a directory and, if found, it sends me a message along with the time and filename. However, if a file is not found based on the search pattern, the result which I am getting is all the files present in that... (7 Replies)
Discussion started by: grvk101
7 Replies
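
It is hard to diagnose without seeing the code, but one defensive way to report only genuine matches (PATTERN and DIR below are hypothetical) is to loop over the glob and skip it when it fails to expand:

Code:
#!/bin/sh
PATTERN='report_*.csv'        # hypothetical search pattern
DIR=/data/in                  # hypothetical directory

found=0
for f in "$DIR"/$PATTERN
do
    [ -e "$f" ] || continue   # unmatched glob stays literal; skip it
    echo "$(date '+%H:%M:%S') found $f"
    found=1
done
if [ "$found" -eq 0 ]; then
    echo "No file matching $PATTERN in $DIR"
fi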