Script to move the first line of a file to the end

Post 302272054 by cfajohnson, Monday 29 December 2008, 01:36 PM
Quote:
Originally Posted by matrixmadhan
This is a UUOC (useless use of cat); cat is not needed here

just sed '<operation>' filename would do

Nor is head required:

Code:
IFS= read -r line < list.txt            # read only the first line of list.txt
printf "%s\n" "$line" >> new_list.txt   # append that line to new_list.txt
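Putting the pieces together, a minimal sketch of the whole task in this thread (file names as above; tail -n +2 is POSIX), moving the first line of list.txt to the end:

Code:

#!/bin/sh
IFS= read -r first < list.txt      # remember the first line
{
    tail -n +2 list.txt            # everything after line 1
    printf '%s\n' "$first"         # the old first line, now last
} > new_list.txt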

 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Help on shell script : syntax error at line 62: `end of file' unexpected

Hi All, I have written a Korn shell script (code pasted below). It fails with "new.sh: syntax error at line 62: `end of file' unexpected". I have rewritten the whole script in vi and tried everything related to this error that I could find on this forum. Somehow, I could... (7 Replies)
Discussion started by: schandrakar1
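That message almost always means a block opened earlier in the script was never closed, so the shell reaches the end of the file still waiting for a closing keyword or quote. A minimal sketch of correctly paired blocks in ksh (the path /tmp/data is hypothetical):

Code:

#!/bin/ksh
# Every 'if' needs a matching 'fi', every 'do' a matching 'done', and
# every quote its mate; omitting one yields "end of file unexpected".
if [ -f /tmp/data ]; then
    while read -r line; do
        print "$line"
    done < /tmp/data    # closes the while
fi                      # closes the if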

2. Shell Programming and Scripting

script to move two lines to the end of a file

My input file is a multiline file, and I am writing a script to search for a pattern and move the line containing the pattern, together with the line after it, to the end of the file. Since I am trying to learn awk, I thought I would try it. My input looks like the following: D #testpoint 1 510.0 D #testpoint2 ... (5 Replies)
Discussion started by: banjo25
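A sketch in awk, assuming "testpoint" is the pattern and the file names are placeholders: matching lines, and the line following each match, are buffered and printed after everything else.

Code:

awk '/testpoint/ { buf = buf $0 ORS; grab = 1; next }
     grab        { buf = buf $0 ORS; grab = 0; next }
                 { print }
     END         { printf "%s", buf }' input.txt > output.txt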

3. Shell Programming and Scripting

Move a line to end of file

Can somebody help me with a script? Read the file /etc/inittab, find the line starting with rcml, and move it entirely to the end of the file: rcml:2:once:/usr/sni/aix52/rc.ml > /dev/console 2>&1 I basically want to change the startup sequence. (2 Replies)
Discussion started by: imanuk2007
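A sketch: hold the rcml line back and print it after everything else. It writes to a temporary file so the result can be inspected before /etc/inittab is replaced.

Code:

awk '/^rcml:/ { saved = $0; next }
              { print }
     END      { if (saved != "") print saved }' /etc/inittab > /tmp/inittab.new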

4. Shell Programming and Scripting

awk script to move a line after the matched pattern line

I have a file in the following format, which lists the question first, then 5 choices, then the explanation, and finally the answer. 1. The amount of time it takes for most of a worker's occupational knowledge and skills to become obsolete has been declining because of the... (2 Replies)
Discussion started by: nanchil_guy
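A generic sketch for this kind of reordering; the patterns '^Explanation:' and '^Answer:' and the file name questions.txt are illustrative assumptions. Any line matching move is held and reprinted directly after the next line matching after.

Code:

awk -v move='^Explanation:' -v after='^Answer:' '
    $0 ~ move          { held = $0; next }
                       { print }
    $0 ~ after && held { print held; held = "" }
' questions.txt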

5. Shell Programming and Scripting

help needed with shell script to append to the end of a specific line in a file on multiple servers

Hi Folks, I was given a task to append three IPs to the end of a specific (and unique) line within a file on multiple servers. I was not able to do it with a script. All I could manage was: for i in server1 server2 server3 server4 do ssh $i done I know 'sed' could be used to... (5 Replies)
Discussion started by: momin
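One way to finish that loop, as a sketch: the marker ^allowed_hosts, the file /etc/example.conf, and the addresses are all hypothetical, and sed -i assumes GNU sed on the remote hosts.

Code:

for i in server1 server2 server3 server4
do
    ssh "$i" "sed -i '/^allowed_hosts/ s/\$/ 10.0.0.1 10.0.0.2 10.0.0.3/' /etc/example.conf"
done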

6. Shell Programming and Scripting

how to move a word to the end of file

Hey guys, I want to take a specific word from the middle of the text and move it to the end of the file; that is, the word should be deleted from its line and appended as the last line of the file. I know how to use sed to add a word at the end of a file, but I don't know how to move words. Thanks. (2 Replies)
Discussion started by: Johanni
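A sketch assuming GNU sed and that the word is PENDING (hypothetical): delete the word wherever it occurs, then append it as a new last line.

Code:

sed -i -e 's/ *\<PENDING\>//g' -e '$a PENDING' file.txt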

7. Shell Programming and Scripting

sed: how to move matched pattern to end of previous line

Hello, I'm new to this forum. I've been doing a lot of sed work lately and have found many useful tips on this forum. I've hit a roadblock in a project, though, and could really use some help. I have a text file with many lines like the following, i.e., some lines begin with a single word... (3 Replies)
Discussion started by: paroikoi
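A sed sketch under the assumption that a "single word" line contains only letters (GNU sed, for \+): keep joining such a line onto the end of the previous one.

Code:

sed -e ':a' -e '$!N' -e 's/\n\([A-Za-z]\+\)$/ \1/' -e 'ta' -e 'P' -e 'D' file.txt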

8. Shell Programming and Scripting

Cannot execute/finish script because of last line syntax error: unexpected end of file/token `done'

First of all, I thought a closing 'done' is required by every script that contains 'do' statements, and my script has one. However, I still don't completely understand why I am receiving an error; I tried adding another 'done' statement, but it didn't do any good. I appreciate... (3 Replies)
Discussion started by: wolf@=NK
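As a sketch, the pairing the shell expects: each 'do' needs exactly one matching 'done', and nesting multiplies them.

Code:

#!/bin/bash
# A missing 'done' gives "unexpected end of file"; an extra one gives
# "syntax error near unexpected token `done'".
for f in *.txt; do
    while IFS= read -r line; do
        echo "$f: $line"
    done < "$f"   # closes the while
done              # closes the for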

9. Shell Programming and Scripting

With script bash, read file line per line starting at the end

Hello, I am working on an Ubuntu server. My goal: I would like to read a file line by line, but starting at the end of the file. Currently, I use: while read line; do COMMAND done < /var/log/apache2/access.log But that starts with the first line, which I don't want. The file is long... (5 Replies)
Discussion started by: Fuziion
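A sketch using tac (GNU coreutils, standard on Ubuntu), which reverses the file so the loop sees the last line first:

Code:

tac /var/log/apache2/access.log | while IFS= read -r line
do
    printf '%s\n' "$line"   # replace with the real COMMAND
done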

10. Shell Programming and Scripting

Printing string from last field of the nth line of file to start (or end) of each line (awk I think)

My file (the output of an experiment) starts off looking like this: _____________________________________________________________ Subjects incorporated to date: 001 Data file started on machine PKSHS260-05CP ********************************************************************** Subject 1,... (9 Replies)
Discussion started by: samonl
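A two-pass sketch, assuming the wanted string is the last field of line 3 of the file (the line number and the name datafile are placeholders): capture it first, then prefix every line with it.

Code:

tag=$(awk 'NR == 3 { print $NF; exit }' datafile)
awk -v t="$tag" '{ print t, $0 }' datafile > datafile.tagged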
WWW::RobotRules(3)					User Contributed Perl Documentation					WWW::RobotRules(3)

NAME
       WWW::RobotRules - Parse robots.txt files

SYNOPSIS
       require WWW::RobotRules;
       my $robotsrules = new WWW::RobotRules 'MOMspider/1.0';

       use LWP::Simple qw(get);

       $url = "http://some.place/robots.txt";
       my $robots_txt = get $url;
       $robotsrules->parse($url, $robots_txt);

       $url = "http://some.other.place/robots.txt";
       my $robots_txt = get $url;
       $robotsrules->parse($url, $robots_txt);

       # Now we are able to check if a URL is valid for those servers that
       # we have obtained and parsed "robots.txt" files for.
       if ($robotsrules->allowed($url)) {
           $c = get $url;
           ...
       }

DESCRIPTION
       This module parses a /robots.txt file as specified in "A Standard for
       Robot Exclusion", described in
       <http://info.webcrawler.com/mak/projects/robots/norobots.html>.
       Webmasters can use the /robots.txt file to disallow conforming robots
       access to parts of their web site.

       The parsed file is kept in the WWW::RobotRules object, and this object
       provides methods to check if access to a given URL is prohibited. The
       same WWW::RobotRules object can parse multiple /robots.txt files.

       The following methods are provided:

       $rules = WWW::RobotRules->new($robot_name)
           This is the constructor for WWW::RobotRules objects. The first
           argument given to new() is the name of the robot.

       $rules->parse($robot_txt_url, $content, $fresh_until)
           The parse() method takes as arguments the URL that was used to
           retrieve the /robots.txt file, and the contents of the file.

       $rules->allowed($uri)
           Returns TRUE if this robot is allowed to retrieve this URL.

       $rules->agent([$name])
           Get/set the agent name. NOTE: Changing the agent name will clear
           the robots.txt rules and expire times out of the cache.

ROBOTS.TXT
       The format and semantics of the "/robots.txt" file are as follows
       (this is an edited abstract of
       <http://info.webcrawler.com/mak/projects/robots/norobots.html>):

       The file consists of one or more records separated by one or more
       blank lines. Each record contains lines of the form

           <field-name>: <value>

       The field name is case insensitive. Text after the '#' character on a
       line is ignored during parsing. This is used for comments. The
       following <field-names> can be used:

       User-Agent
           The value of this field is the name of the robot the record is
           describing access policy for. If more than one User-Agent field is
           present, the record describes an identical access policy for more
           than one robot. At least one field needs to be present per record.
           If the value is '*', the record describes the default access
           policy for any robot that has not matched any of the other
           records.

       Disallow
           The value of this field specifies a partial URL that is not to be
           visited. This can be a full path or a partial path; any URL that
           starts with this value will not be retrieved.

ROBOTS.TXT EXAMPLES
       The following example "/robots.txt" file specifies that no robots
       should visit any URL starting with "/cyberworld/map/" or "/tmp/":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space
           Disallow: /tmp/ # these will soon disappear

       This example "/robots.txt" file specifies that no robots should visit
       any URL starting with "/cyberworld/map/", except the robot called
       "cybermapper":

           User-agent: *
           Disallow: /cyberworld/map/ # This is an infinite virtual URL space

           # Cybermapper knows where to go.
           User-agent: cybermapper
           Disallow:

       This example indicates that no robots should visit this site further:

           # go away
           User-agent: *
           Disallow: /

SEE ALSO
       LWP::RobotUA, WWW::RobotRules::AnyDBM_File

libwww-perl-5.65                  2001-04-20                WWW::RobotRules(3)