Shell Programming and Scripting: How to check missing sequence? Post 302770955 by RudiC on Tuesday 19th of February 2013, 07:00:13 AM
a) The filenames seem to be created in 10-minute increments, not hourly.
b) The file name format specifier you give does not fit the file names; at least one "m" is missing.
c) A thorough search of these fora would have put you in a position to solve your problem on your own, or at least given you a starting point, e.g. this link.
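For what it's worth, a minimal sketch of the timestamp check, assuming GNU date and names like f01_YYYYMMDDHHMM.DAT at 10-minute increments; the f01_ prefix and the starting day are placeholders, not taken from the original post:

    #!/bin/bash
    # Walk one day in 10-minute steps and report every expected file
    # that is not present. Assumes GNU date; prefix/day are placeholders.
    prefix="f01_"
    t=$(date -d "2013-04-28 00:00" +%s)   # first slot of the day, epoch seconds
    end=$(( t + 24*60*60 ))               # scan one full day
    while [ "$t" -lt "$end" ]; do
        f="${prefix}$(date -d "@$t" +%Y%m%d%H%M).DAT"
        [ -e "$f" ] || echo "missing: $f"
        t=$(( t + 600 ))                  # 10 minutes = 600 seconds
    done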
 

10 More Discussions You Might Find Interesting

1. Programming

find the missing sequence in hash perl

Dear Perl users, could anyone help me solve my problem? I have data with the details below.
TTY NAME SEQUENCES
U-0 UNIX 0
U-1 UNIX 1
U-2 UNIX 2 <-- From 2 jump to 5
U-5 UNIX 5
U-6 UNIX 6 <-- From 6 jump to 20
U-20 ... (2 Replies)
Discussion started by: askari
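The gap itself can be found without a Perl hash; a rough awk sketch, assuming the sequence number is the third field of each line as in the sample above (the file name data.txt is a placeholder):

    awk 'NR > 1 && $3 > prev + 1 {
             for (i = prev + 1; i < $3; i++) print "missing sequence:", i
         }
         { prev = $3 + 0 }' data.txt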

2. Shell Programming and Scripting

print out missing files in a sequence

Hello all, I have several directories with a sequence of files like this: IM-0001-0001.dcm IM-0001-0002.dcm IM-0001-0003.dcm IM-0001-0004.dcm IM-0001-0005.dcm. I would like to print out the name of the file that is missing. I currently have the following inefficient way to do this... (4 Replies)
Discussion started by: avatar_007

3. Shell Programming and Scripting

How to take the missing sequence Number?

Am using UNIX AIX ksh... I have the files called MMRR0106.DAT MMRR0206.DAT MMRR0406.DAT MMRR0506.DAT MMRR0806.DAT .... ... MMRR3006.DAT MMRR0207.DAT. These files are in one directory, /venky. I want the output like this: Missing files are: MMRR0306.DAT MMRR0606.DAT... (7 Replies)
Discussion started by: Venkatesh1

4. UNIX for Advanced & Expert Users

Checking missing data's sequence (shell script | UNIX command)

Dear all members, i have some trouble here and want to ask for your help. The case is: I have some data like: -ABCD1234 -ABCD1235 -ABCD1237 -BCDE1111 -BCDE1112 -BCDE1114. Some of the data's sequence is missing (the format is: ABCD = name, 1234 = sequence). I want to print the... (2 Replies)
Discussion started by: septian.tri
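A hedged awk sketch for this grouped case, assuming each line looks exactly like -ABCD1234 (a dash, a 4-letter name, a 4-digit sequence) and the lines are sorted within each name; data.txt is a placeholder:

    awk '{
        name = substr($0, 2, 4)            # e.g. ABCD
        n    = substr($0, 6) + 0           # e.g. 1234
        if (name in prev)
            for (i = prev[name] + 1; i < n; i++)
                printf "-%s%04d\n", name, i
        prev[name] = n
    }' data.txt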

5. Shell Programming and Scripting

Case script to get missing sequence among files

I want to use a case statement to find the range of missing sequences in my directory, which has a few (dat & DAT) files. My directory /home/arm/my_folder/20130428 contains: f01_201304280000.DAT f01_201304280001.DAT f01_201304280003.DAT f02_201304280000.dat f02_201304280002.dat... (2 Replies)
Discussion started by: arm

6. Shell Programming and Scripting

Find missing sequence

Hi, I need to find out the missing sequence from a list. However, there is no fixed start and end; it depends on the generation of files. For example, it might start with 4000 and end with 9000. Based on this, I need a script which greps the start and end sequence from the... (3 Replies)
Discussion started by: danish0909

7. Shell Programming and Scripting

Identifying Missing File Sequence

Hi, I have a file which contains a few columns, and the first column has the file names. I would like to identify the missing file sequence numbers from the file and copy them to another file. My file has data in the below format: APKRISPSIN320131231201319_0983,1,54,125,... (5 Replies)
Discussion started by: rramkrishnas

8. Shell Programming and Scripting

Find the missing sequence

Dear all, i am having a file with a maximum of 24 entries. i want to find which sequence is missing. The file list is like this: df00231587.dat df01231587.dat df03231587.dat df05231587.dat . . . df23231587.dat. The changing seq is 00-23, so i would like to find out which seq is missing, like in the above... (13 Replies)
Discussion started by: sagar_1986
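For a fixed, zero-padded 00-23 range like this one, a short sketch (seq -w is GNU coreutils; on a system without it, the same loop can be driven by printf):

    for n in $(seq -w 0 23); do            # seq -w pads: 00 01 ... 23
        f="df${n}231587.dat"
        [ -e "$f" ] || echo "missing: $f"
    done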

9. Shell Programming and Scripting

How to find a missing file sequence using shell scripting?

Hey guys, I want the below files to be processed with the help of BASH so that i will be able to find the missing file names: PP01674520141228X.gz PP01674620141228X.gz PP01674820141228X.gz PP01674920141228X.gz PP01675420141228X.gz PP01675520141228X.gz PP01676020141228X.gz . . . .... (4 Replies)
Discussion started by: TANUJ
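A rough sketch for this layout, assuming the 6-digit counter always sits in characters 3-8 of the name (PP + counter + date + X.gz):

    ls PP*X.gz | cut -c3-8 | sort -n | awk '
        NR > 1 && $1 > prev + 1 {
            for (i = prev + 1; i < $1; i++)
                printf "missing counter: %06d\n", i
        }
        { prev = $1 + 0 }'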

10. Shell Programming and Scripting

To check the missing file based on sequence number.

Hi All, I have a requirement that i need to list only the missing sequences with a unix script. For Example: Input: FILE_001.txt FILE_002.txt FILE_005.txt FILE_006.txt FILE_008.txt FILE_009.txt FILE_010.txt FILE_014.txt Output: FILE_003.txt FILE_004.txt FILE_007.txt FILE_011.txt... (5 Replies)
Discussion started by: Arun1992
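One generic sketch for the FILE_NNN.txt case, assuming bash (for process substitution) and GNU seq; the range is taken from the lowest and highest numbers actually present, so no fixed start or end is needed:

    lo=$(ls FILE_*.txt | tr -dc '0-9\n' | sort -n | head -1)   # lowest number present
    hi=$(ls FILE_*.txt | tr -dc '0-9\n' | sort -n | tail -1)   # highest number present
    # comm -13 prints lines only in the generated list, i.e. the missing files
    comm -13 <(ls FILE_*.txt) <(seq -f 'FILE_%03g.txt' "$lo" "$hi")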