Shell Programming and Scripting: command to list .txt and .TXT file
Post 302272634 by pludi on Wednesday 31st of December 2008 08:37:35 AM
Answered here. Read the answers to your questions first before double-/cross-posting, please.
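
For readers who land here, a minimal sketch of the usual answer (an assumption on my part, since the actual replies live in the linked thread): shell globs are case sensitive, so either name both patterns or match case-insensitively with find. Note that -iname and -maxdepth are GNU/BSD extensions, not strict POSIX.

    # list regular files ending in .txt in any letter case, this directory only
    find . -maxdepth 1 -type f -iname '*.txt'

    # or name both spellings with globs (misses mixed cases such as .Txt)
    ls -- *.txt *.TXT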
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

echo "ABC" > file1.txt file2.txt file3.txt

Hi gurus, I need to create 3 files with the contents "ABC" using a single command. I am using: echo "ABC" > file1.txt file2.txt file3.txt, but the above command is not working. Please help me... With regards / Ganapati (4 Replies; one approach is sketched below)
Discussion started by: ganapati
4 Replies
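
A minimal sketch of the usual fix: a single redirection targets only one file, but tee fans its standard input out to several files at once.

    # write "ABC" to three files in one command; tee also copies the text
    # to stdout, which is discarded here
    echo "ABC" | tee file1.txt file2.txt file3.txt > /dev/null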

2. UNIX for Dummies Questions & Answers

Binary txt file received when i use uuencode to send txt file as attachment

Hi, I have already read a lot of posts on sending attachments in UNIX... but none of them helped with my problem... so here goes: I want to attach a text file and send it to a mail ID. I used the following code: uuencode "$File1" "$File1" | mail -s "$Mail_sub" abc@abc.com and it works... (2 Replies; a MIME-based alternative is sketched below)
Discussion started by: ash22
2 Replies
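
The usual cause of the symptom in this thread is that uuencode predates MIME, so many modern mail clients display the raw encoded text instead of an attachment. A sketch of a MIME-aware alternative, assuming mutt is available and reusing the thread's variable names:

    # mutt builds a proper MIME attachment; the redirected /dev/null
    # gives the message an empty body
    mutt -s "$Mail_sub" -a "$File1" -- abc@abc.com < /dev/null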

3. Shell Programming and Scripting

AWK CSV to TXT format, TXT file not in a correct column format

Hi guys, I have created a script to read one column in a CSV file and then place it in a text file. However, when I checked the text file, it is not in a column format... Example: the CSV file contains the lines "name,age", "aa,11", "bb,22", "cc,33". After using awk to get the first column, the TXT file... (1 Reply; see the sketch below)
Discussion started by: mdap
1 Replies
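
A minimal sketch, assuming the goal is simply the first comma-separated field, one value per line:

    # -F, sets the field separator to a comma; $1 is the first field
    awk -F, '{ print $1 }' input.csv > column1.txt

If the result still looks misaligned, check the CSV for Windows CRLF line endings; a stray carriage return at the end of each record is a common culprit.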

4. Solaris

list files .txt and .TXT in one command

Dear experts, in a directory I have both *.TXT and *.txt files. I have a script: for file in `ls *.txt`; do mv $file /tmp/$file. How do I list both *.txt and *.TXT files in one command, so that the script will move whichever of .txt or .TXT it finds? br//purple (4 Replies; a sketch follows below)
Discussion started by: thepurple
4 Replies
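
A minimal sketch: hand the loop both globs directly, since parsing ls output is unnecessary and breaks on names with whitespace.

    # move every .txt and .TXT file; the quotes keep spaced names intact
    for file in *.txt *.TXT; do
        [ -e "$file" ] || continue   # skip the literal pattern when nothing matches
        mv -- "$file" /tmp/"$file"
    done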

5. Shell Programming and Scripting

awk append fileA.txt to growing file B.txt

This is appending a column. My question is fairly simple. I have a program generating data in a form like so (one "index value" pair per line): 1 20, 2 22, 3 23, 4 12, 5 43. For every iteration I'm generating this data. I have the basic idea with cut -f 2 fileA.txt | paste -d >> FileB.txt ???? I want FileB.txt to grow, and... (4 Replies; a working sketch follows)
Discussion started by: theawknewbie
4 Replies
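
A minimal sketch, assuming tab-delimited fields: paste merges files line by line, with - standing for standard input. Since paste cannot write in place, the result goes through a temporary file.

    # append column 2 of fileA.txt as a new rightmost column of FileB.txt
    cut -f 2 fileA.txt | paste FileB.txt - > FileB.tmp && mv FileB.tmp FileB.txt

On the very first pass, copy the column into FileB.txt directly; merging into an empty file leaves a leading tab on every line.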

6. Shell Programming and Scripting

Need to append the date | abcddate.txt to the first line of my txt file

I want to add/append info in the following format as the first line of my.txt file: 20130702|abcd20130702.txt (the file's fields are FN|SN|DOB). I tried the script below but it throws some exceptions: #!/bin/sh dt = date '+%y%m%d'members; echo $dt+|+members+$dt; /usr/bin/awk -f BEGIN { FS="|"; OFS="|"; } { print... (6 Replies; a corrected sketch follows)
Discussion started by: harik1982
6 Replies
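
A minimal sketch, assuming the goal is to prepend the dated line. Three fixes relative to the quoted attempt: no spaces around = in an assignment, $(...) to capture a command's output, and %Y for a four-digit year.

    dt=$(date '+%Y%m%d')    # e.g. 20130702
    { printf '%s|abcd%s.txt\n' "$dt" "$dt"; cat my.txt; } > my.tmp && mv my.tmp my.txt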

7. Windows & DOS: Issues & Discussions

2 Questions: replace text in txt file, add text to end of txt file

So... let's assume I have a text file that contains multiple "#" symbols. I want to replace all those "#"s with a STRING using DOS/Batch, and I want to add a certain TEXT to the end of each line. How can I do this WITHOUT the aid of sed, grep, or anything Linux-related? (1 Reply)
Discussion started by: pasc
1 Replies

8. Shell Programming and Scripting

Desired output.txt for reading txt file using awk?

Dear all, I have a huge txt file (DATA.txt) with the following content. From this txt file, I want the following output using some shell script. Any help is greatly appreciated. Greetings, emily. DATA.txt (snippet of the huge text file): 407202849... (2 Replies)
Discussion started by: emily
2 Replies

9. UNIX for Dummies Questions & Answers

Split Every Line In Txt Into Separate Txt File, Named Same As The Line

Hi all, is there a way to export every line into a new txt file, whereby the title of each output txt is the same as the line? I have a txt file containing names: Kandra Vanhooser, Rhona Menefee, Reynaldo Hutt, Houston Rafferty, Charmaine Lord, Albertine Poucher, Juana Maes, Mitch Lobel... (2 Replies; see the sketch below)
Discussion started by: Nexeu
2 Replies
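
A minimal sketch, assuming one name per line and that no name contains a slash (which the filesystem would treat as a directory separator):

    # creates "Kandra Vanhooser.txt" containing "Kandra Vanhooser", and so on
    while IFS= read -r name; do
        [ -n "$name" ] || continue          # skip blank lines
        printf '%s\n' "$name" > "$name.txt"
    done < names.txt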

10. Programming

[Python] replicating "sha256 -C checksum_file.txt file.txt"

Hello everyone, since my Python knowledge is limited, I've challenged myself to learn as much as possible to help me with my career. I'm currently trying to convert a shell script to Python, just to give myself a task. There is one section of the script that I'm having issues converting and... (2 Replies; a note on the command's behavior follows)
Discussion started by: da1
2 Replies
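
For anyone porting the same script: as I read it, the BSD-style command being replicated verifies files against a stored checksum list. Assuming that reading is right, the GNU coreutils equivalent below shows the behavior a Python port has to match.

    # verify each file listed in checksum_file.txt against its stored
    # SHA-256 digest (lines look like: <64-hex-digest>  <filename>)
    sha256sum -c checksum_file.txt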
WWW::RobotRules(3)					User Contributed Perl Documentation					WWW::RobotRules(3)

NAME
    WWW::RobotRules - database of robots.txt-derived permissions

SYNOPSIS
        use WWW::RobotRules;
        my $rules = WWW::RobotRules->new('MOMspider/1.0');

        use LWP::Simple qw(get);

        {
            my $url = "http://some.place/robots.txt";
            my $robots_txt = get $url;
            $rules->parse($url, $robots_txt) if defined $robots_txt;
        }

        {
            my $url = "http://some.other.place/robots.txt";
            my $robots_txt = get $url;
            $rules->parse($url, $robots_txt) if defined $robots_txt;
        }

        # Now we can check if a URL is valid for those servers
        # whose "robots.txt" files we've gotten and parsed:
        if ($rules->allowed($url)) {
            $c = get $url;
            ...
        }

DESCRIPTION
    This module parses /robots.txt files as specified in "A Standard for
    Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
    Webmasters can use the /robots.txt file to forbid conforming robots
    from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited. The
    same WWW::RobotRules object can be used for one or more parsed
    /robots.txt files on any number of hosts.

    The following methods are provided:

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first
        argument given to new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to
        retrieve the /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear
        the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
    The format and semantics of the "/robots.txt" file are as follows
    (this is an edited abstract of
    <http://www.robotstxt.org/wc/norobots.html>):

    The file consists of one or more records separated by one or more
    blank lines. Each record contains lines of the form

        <field-name>: <value>

    The field name is case insensitive. Text after the '#' character on a
    line is ignored during parsing. This is used for comments. The
    following <field-names> can be used:

    User-Agent
        The value of this field is the name of the robot the record is
        describing access policy for. If more than one User-Agent field is
        present the record describes an identical access policy for more
        than one robot. At least one field needs to be present per record.
        If the value is '*', the record describes the default access
        policy for any robot that has not matched any of the other
        records.

        The User-Agent fields must occur before the Disallow fields. If a
        record contains a User-Agent field after a Disallow field, that
        constitutes a malformed record. This parser will assume that a
        blank line should have been placed before that User-Agent field,
        and will break the record into two. All the fields before the
        User-Agent field will constitute a record, and the User-Agent
        field will be the first field in a new record.

    Disallow
        The value of this field specifies a partial URL that is not to be
        visited. This can be a full path, or a partial path; any URL that
        starts with this value will not be retrieved.

    Unrecognized records are ignored.

ROBOTS.TXT EXAMPLES
    The following example "/robots.txt" file specifies that no robots
    should visit any URL starting with "/cyberworld/map/" or "/tmp/":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space
        Disallow: /tmp/ # these will soon disappear

    This example "/robots.txt" file specifies that no robots should visit
    any URL starting with "/cyberworld/map/", except the robot called
    "cybermapper":

        User-agent: *
        Disallow: /cyberworld/map/ # This is an infinite virtual URL space

        # Cybermapper knows where to go.
        User-agent: cybermapper
        Disallow:

    This example indicates that no robots should visit this site further:

        # go away
        User-agent: *
        Disallow: /

    This is an example of a malformed robots.txt file.

        # robots.txt for ancientcastle.example.com
        # I've locked myself away.
        User-agent: *
        Disallow: /
        # The castle is your home now, so you can go anywhere you like.
        User-agent: Belle
        Disallow: /west-wing/ # except the west wing!
        # It's good to be the Prince...
        User-agent: Beast
        Disallow:

    This file is missing the required blank lines between records.
    However, the intention is clear.

SEE ALSO
    LWP::RobotUA, WWW::RobotRules::AnyDBM_File

perl v5.12.1                      2009-10-03              WWW::RobotRules(3)