Adding a date as a first column
Post 302197297 by fabtagon on Tuesday 20th of May 2008 06:36:30 PM
Old 05-20-2008
( echo -n "$(date '+%Y-%m-%d') "; curl -s http://www.example.com/temperatures.txt | grep mylocation ) >> mytemperatures.txt
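
If the remote file can contain more than one line for the location, the same idea can be written with awk so that every matching record gets the date prepended as its first column. A minimal sketch, assuming the URL, the mylocation pattern and the %Y-%m-%d date format are all placeholders to adjust:

    #!/bin/sh
    # Fetch the temperature file, keep only the lines for our location,
    # and prepend today's date as the first column of each record.
    url='http://www.example.com/temperatures.txt'
    today=$(date '+%Y-%m-%d')
    curl -s "$url" | grep 'mylocation' | awk -v d="$today" '{ print d, $0 }' >> mytemperatures.txt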
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Delete a row from a file if one column containing a date is greater than the current system date

Hello gurus, I am hoping someone can help me with the required code/script to make this work. I have the following file with records starting at line 4: NETW~US60~000000000013220694~002~~IT~USD~2.24~20110201~99991231~01~01~20101104~... (4 Replies)
Discussion started by: chumsky
4 Replies
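
For a layout like the one above, a YYYYMMDD field can be compared numerically against today's date in awk. A rough sketch, assuming '~' as the field separator and the date to test in field 9 (adjust both to the real record layout; infile and outfile are placeholder names):

    today=$(date '+%Y%m%d')
    # keep only rows whose field 9 is not later than today; future-dated rows are dropped
    awk -F'~' -v today="$today" '($9 + 0) <= (today + 0)' infile > outfile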

2. Shell Programming and Scripting

Rename a header column by adding another column entry to the header column name URGENT!!

Hi All, I have a file example.csv which looks like this GrpID,TargetID,Signal,Avg_Num CSCH74_1_1,2007,61,256 CSCH74_1_1,212007,647,679 CSCH74_1_1,12007,3,32 CSCH74_1_1,207,299,777 I want the output as GrpID,TragetID,Signal-CSCH74_1_1,Avg_Num CSCH74_1_1,2007,61,256... (4 Replies)
Discussion started by: Vavad
4 Replies
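
One way to fold the group ID from the first data row into the Signal header is a two-pass approach: read the ID first, then rewrite line 1. A hedged sketch, assuming the ID is always field 1 of row 2 and the column to rename is field 3 (renamed.csv is a placeholder):

    grp=$(awk -F',' 'NR == 2 { print $1; exit }' example.csv)
    awk -F',' -v OFS=',' -v g="$grp" 'NR == 1 { $3 = $3 "-" g } { print }' example.csv > renamed.csv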

3. UNIX for Dummies Questions & Answers

Rename a header column by adding another column entry to the header column name

Hi All, I have a file example.csv which looks like this GrpID,TargetID,Signal,Avg_Num CSCH74_1_1,2007,61,256 CSCH74_1_1,212007,647,679 CSCH74_1_1,12007,3,32 CSCH74_1_1,207,299,777 I want the output as GrpID,TragetID,Signal-CSCH74_1_1,Avg_Num CSCH74_1_1,2007,61,256... (1 Reply)
Discussion started by: Vavad
1 Replies

4. Shell Programming and Scripting

Adding days to system date then compare to a date

Hi! I am trying to read a file and every line has a specific date as one of its fields. I want to take that date and compare it to the date today plus 6 days. while read line do date=substr($line, $datepos, 8) #date is expected to be YYYYMMDD if ; then ...proceed commands ... (1 Reply)
Discussion started by: kokoro
1 Replies
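
With GNU date, "today plus 6 days" can be produced directly in the same YYYYMMDD form and then compared numerically. A sketch under that assumption (datafile and the datepos offset are placeholders; on systems without GNU date the limit has to be computed another way):

    limit=$(date -d '+6 days' '+%Y%m%d')     # today plus 6 days, as YYYYMMDD
    while read -r line; do
        d=${line:datepos:8}                  # bash substring; datepos = offset of the date field
        if [ "$d" -le "$limit" ]; then
            :                                # ...proceed with the commands for dates within range
        fi
    done < datafile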

5. UNIX for Dummies Questions & Answers

Adding hours and minutes to current date (Only to date not to time)

Hi, I want to add some hours and minutes to the current date. For example, if the current date is "July 16, 2012 15:20", I want to add 5 hours 30 minutes to "July 16, 2012 00:00", not to "July 16, 2012 15:20". Please help. Thanks! (4 Replies)
Discussion started by: manojgarg
4 Replies
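
Starting from midnight rather than from the current time can be done by feeding today's date (with no time part) back into GNU date together with the offset. A sketch, assuming GNU date and that 5 hours 30 minutes is the offset wanted:

    # midnight today plus 5 hours 30 minutes, printed e.g. as "July 16, 2012 05:30"
    date -d "$(date '+%Y-%m-%d') + 5 hours 30 minutes" '+%B %d, %Y %H:%M'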

6. Shell Programming and Scripting

Need to add a date column (today's date) in file

Hi, I have a file with number, status, date1 and date2 fields, and want to add a column with today's date between columns date1 and date2. file1.txt number status date1 date2 ===== ==== === ===== 34567 open 27/06/13 28/06/13 45678 open 27/06/13 28/06/13 43567 open 27/06/13 28/06/13 ... (1 Reply)
Discussion started by: vijay_rajni
1 Replies
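
Passing today's date into awk and printing it between the third and fourth columns does the insertion, as sketched below (assumes whitespace-separated columns in the order shown, with the two header lines left untouched; file1.new is a placeholder):

    today=$(date '+%d/%m/%y')            # same dd/mm/yy style as the existing date columns
    awk -v d="$today" 'NR <= 2 { print; next } { print $1, $2, $3, d, $4 }' file1.txt > file1.new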

7. Shell Programming and Scripting

Adding values of a column based on another column

Hello, I have data such as this: ENSGALG00000000189 329 G A 4 2 0 ENSGALG00000000189 518 T C 5 1 0 ENSGALG00000000189 1104 G A 5 1 0 ENSGALG00000000187 3687 G T 5 1 0 ENSGALG00000000187 4533 A T 4 2 0 ENSGALG00000000233 5811 T C 4 2 0 ENSGALG00000000233 5998 C A 5 1 0 I want to... (3 Replies)
Discussion started by: Homa
3 Replies
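
The usual awk idiom for this kind of task is to accumulate a running sum keyed on the grouping column. A sketch that totals column 5 for every ID in column 1 (which column to sum is an assumption, since the post is truncated; data.txt is a placeholder):

    awk '{ sum[$1] += $5 } END { for (id in sum) print id, sum[id] }' data.txt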

8. Shell Programming and Scripting

How to awk or grep the last column in a file when the data in the column contains spaces?

Hi, I have a large spreadsheet which has 4 columns: APM00111803814 server_2 96085 Corp IT Desktop and Apps APM00111803814 server_2 96085 Corp IT Desktop and Apps APM00111803814 server_2 96034 Storage Mgmt Team APM00111803814 server_2 96152 GWP... (6 Replies)
Discussion started by: kieranfoley
6 Replies
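
When only the trailing description contains spaces, one option is to blank out the fixed leading fields and keep whatever is left. A sketch, assuming the first three columns never contain spaces themselves (spreadsheet.txt is a placeholder):

    # drop the first three whitespace-separated fields, keep the rest of each line
    awk '{ $1 = $2 = $3 = ""; sub(/^[ \t]+/, ""); print }' spreadsheet.txt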

9. Shell Programming and Scripting

Get maximum per column from CSV file, based on date column

Hello everyone, I am using ksh on Solaris 10 and I'm gathering data in a CSV file that looks like this: 20170628-23:25:01,1,0,0,1,1,1,1,55,55,1 20170628-23:30:01,1,0,0,1,1,1,1,56,56,1 20170628-23:35:00,1,0,0,1,1,2,1,57,57,2 20170628-23:40:00,1,0,0,1,1,1,1,58,58,2... (6 Replies)
Discussion started by: ejianu
6 Replies
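
Keeping a running maximum per (day, column) pair in awk handles this, grouping on the YYYYMMDD part of the first field. A sketch, assuming the layout shown above (comma-separated, timestamp first, the remaining fields numeric; data.csv is a placeholder, and on Solaris 10 this would need nawk or /usr/xpg4/bin/awk rather than the old /usr/bin/awk):

    awk -F',' '
    {
        day = substr($1, 1, 8)                       # YYYYMMDD part of the timestamp
        if (!(day in seen)) { seen[day] = 1; order[++ndays] = day }
        if (NF > cols) cols = NF
        for (i = 2; i <= NF; i++) {
            k = day SUBSEP i
            if (!(k in max) || $i + 0 > max[k] + 0) max[k] = $i + 0
        }
    }
    END {
        for (j = 1; j <= ndays; j++) {
            d = order[j]; line = d
            for (i = 2; i <= cols; i++) line = line "," max[d SUBSEP i]
            print line
        }
    }' data.csv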

10. Shell Programming and Scripting

Adding an extra date column in UNIX file

Hi All, I have a file with only one column of data (without delimiter). For example: cat temp.txt 22055 21088 93840 30990 50990 50950 I want to insert an additional column with the current date as its value. I have used the below command but didn't get the result as expected. Could anyone over... (5 Replies)
Discussion started by: Suresh
5 Replies
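
For a single-column file, awk can bolt the current date on as an extra column. A sketch, assuming temp.txt as shown and a dd-mm-yyyy format (swap the two print arguments if the date should come first; temp_with_date.txt is a placeholder):

    today=$(date '+%d-%m-%Y')
    awk -v d="$today" '{ print $1, d }' temp.txt > temp_with_date.txt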
WWW::RobotRules(3)              User Contributed Perl Documentation             WWW::RobotRules(3)

NAME
       WWW::RobotRules - Parse robots.txt files

SYNOPSIS
       require WWW::RobotRules;
       my $robotsrules = new WWW::RobotRules 'MOMspider/1.0';

       use LWP::Simple qw(get);

       $url = "http://some.place/robots.txt";
       my $robots_txt = get $url;
       $robotsrules->parse($url, $robots_txt);

       $url = "http://some.other.place/robots.txt";
       my $robots_txt = get $url;
       $robotsrules->parse($url, $robots_txt);

       # Now we are able to check if a URL is valid for those servers that
       # we have obtained and parsed "robots.txt" files for.
       if($robotsrules->allowed($url)) {
           $c = get $url;
           ...
       }

DESCRIPTION
       This module parses a /robots.txt file as specified in "A Standard for Robot Exclusion", described in
       <http://info.webcrawler.com/mak/projects/robots/norobots.html>. Webmasters can use the /robots.txt file to
       disallow conforming robots access to parts of their web site.

       The parsed file is kept in the WWW::RobotRules object, and this object provides methods to check if access
       to a given URL is prohibited. The same WWW::RobotRules object can parse multiple /robots.txt files.

       The following methods are provided:

       $rules = WWW::RobotRules->new($robot_name)
           This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of
           the robot.

       $rules->parse($robot_txt_url, $content, $fresh_until)
           The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the
           contents of the file.

       $rules->allowed($uri)
           Returns TRUE if this robot is allowed to retrieve this URL.

       $rules->agent([$name])
           Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times
           out of the cache.

ROBOTS.TXT
       The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
       <http://info.webcrawler.com/mak/projects/robots/norobots.html>):

       The file consists of one or more records separated by one or more blank lines. Each record contains lines
       of the form

           <field-name>: <value>

       The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This
       is used for comments. The following <field-names> can be used:

       User-Agent
           The value of this field is the name of the robot the record is describing access policy for. If more
           than one User-Agent field is present the record describes an identical access policy for more than one
           robot. At least one field needs to be present per record. If the value is '*', the record describes
           the default access policy for any robot that has not matched any of the other records.

       Disallow
           The value of this field specifies a partial URL that is not to be visited. This can be a full path, or
           a partial path; any URL that starts with this value will not be retrieved.

ROBOTS.TXT EXAMPLES
       The following example "/robots.txt" file specifies that no robots should visit any URL starting with
       "/cyberworld/map/" or "/tmp/":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space
           Disallow: /tmp/            # these will soon disappear

       This example "/robots.txt" file specifies that no robots should visit any URL starting with
       "/cyberworld/map/", except the robot called "cybermapper":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space

           # Cybermapper knows where to go.
           User-agent: cybermapper
           Disallow:

       This example indicates that no robots should visit this site further:

           # go away
           User-agent: *
           Disallow: /

SEE ALSO
       LWP::RobotUA, WWW::RobotRules::AnyDBM_File

libwww-perl-5.65                                    2001-04-20                              WWW::RobotRules(3)