Full Discussion: Question on awk source files
Shell Programming and Scripting, Post 302979534 by JSKOBS on Tuesday 16th of August 2016 at 03:44:52 AM
Hi Ravinder, I think I confused things. We need the record count per file, like below, not a count for each line:
Code:
2342343|file.txt
762834623|asdf.txt  
23|lkjh.txt  
9098|bvcx.txt
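
A minimal sketch of one way to produce that layout with plain awk (the file names are just the ones from the sample above; completely empty files would be skipped silently):

Code:
# Print "<record count>|<file name>" once per input file.
# FNR restarts at 1 for every new file, so at that point we flush the
# count remembered for the previous file; END flushes the last one.
awk 'FNR == 1 && NR > 1 { print cnt "|" fn }
     { cnt = FNR; fn = FILENAME }
     END { if (fn != "") print cnt "|" fn }' file.txt asdf.txt lkjh.txt bvcx.txt

GNU awk (4.0 or later) also has an ENDFILE rule that covers empty files as well, e.g. gawk 'ENDFILE { print FNR "|" FILENAME }' file.txt asdf.txt; running wc -l on the files and reformatting its output with awk is another option.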


Moderator's Comments:
Please use CODE (not PHP) tags as required by forum rules!

Last edited by RudiC; 08-16-2016 at 05:48 AM. Reason: Added CODE tags.
 

10 More Discussions You Might Find Interesting

1. OS X (Apple)

open-source driver question

Hi, I'm a linux guy and have used netbsd, openbsd, freebsd etc in the past but never tangled with the kernel or drivers outside of Linux. My mother has fried her ethernet port on her iMac (G4 I think); I recently sent her a silicom USB U2E (usb 2 ethernet) dongle which is evidently not... (2 Replies)
Discussion started by: sjalex
2 Replies

2. Shell Programming and Scripting

noob question - is awk the tool to clean dirty text files?

Hi, never mind. I think I've found the answer. It appears I was looking for index, match, sub, and gsub. I want to write a shell script that will clean the HTML out of a bunch of files and format the data for import into Excel. Awk seems like a powerful tool, but it seems oriented to... (1 Reply)
Discussion started by: yogert909
1 Replies
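
A minimal illustration of the gsub idea mentioned above, assuming the tags never span a line break (input.html and cleaned.txt are made-up names):

Code:
# Strip anything that looks like an HTML tag, then squeeze runs of
# whitespace so the remaining fields are easier to import elsewhere.
awk '{ gsub(/<[^>]*>/, ""); gsub(/[ \t]+/, " "); print }' input.html > cleaned.txt

HTML entities such as &amp; would survive this and need a further sub() or gsub() pass.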

3. Programming

Vi question for writing source code

Hi guys, I'm modifying an old f77 code in vi, working remotely from my Windows machine through Xming. I'm using tabs to move past the first 6 columns of the code and to keep my loops and if statements neat, but when I hit the tab key, vi displays a big red block which is... (7 Replies)
Discussion started by: rks171
7 Replies

4. Shell Programming and Scripting

Awk help with source and previous line loop

Hello, I've written a ksh/awk script to ping multiple servers and write the results to a file. That part is working OK. I then want to extract the names of only the servers which are available. This is indicated by '1 packets received'. The server name actually appears above that line, so I found... (4 Replies)
Discussion started by: Grueben
4 Replies
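
The usual awk idiom for this is to remember the previous line; a sketch, assuming the server name sits on the line directly above the summary and that ping_results.txt is a made-up name for the output file:

Code:
# Keep the last line seen; when the summary reports a received packet,
# the saved line is the name of the server that answered.
awk '/1 packets received/ { print prev } { prev = $0 }' ping_results.txt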

5. UNIX for Dummies Questions & Answers

Source all files in a directory

Hi everyone, Is there an efficient way to source all of the files contained in a directory? Theoretically I could create a FOR loop and successively source each file, but I just wanted to check if there was a cleaner method. Thanks! Mike (3 Replies)
Discussion started by: msb65
3 Replies
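
The loop is usually considered clean enough for this; a sketch, assuming a POSIX shell and a made-up directory path:

Code:
# Source every regular file in the directory into the current shell.
for f in /path/to/dir/*; do
    [ -f "$f" ] && . "$f"
done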

6. Solaris

Question about SunOS version, how to include in C source

Sorry if this is the wrong place to put this question, but for college we develop small programs in C using Solaris. Most of the time it is OK for us not to document anything, until now: every time the program is executed it must print the OS name. Does Solaris have some predefined macros which I can include... (3 Replies)
Discussion started by: solaris_user
3 Replies

7. Shell Programming and Scripting

Need a shell script to extract the files from a source file and check whether those files exist on server

Hi, I am new to shell scripting. Please help me on this. I am using Solaris 10, and the shell I am using is sh (# echo $0 gives -sh). My requirement is: I have a source file, say a makefile. I need to extract the files with extensions (.c | .cxx | .h | .hxx | .sc) from the makefile. After doing so I need to check whether... (13 Replies)
Discussion started by: muraliinfy04
13 Replies
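
A sketch of the extraction plus existence check, assuming the names of interest appear as whitespace-separated tokens in the makefile and that a POSIX shell with nawk/awk is available:

Code:
# List every token that ends in one of the wanted extensions, then
# check whether each named file exists relative to the current directory.
awk '{ for (i = 1; i <= NF; i++)
           if ($i ~ /\.(c|cxx|h|hxx|sc)$/) print $i }' makefile |
while read -r f; do
    if [ -f "$f" ]; then echo "found:   $f"; else echo "missing: $f"; fi
done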

8. Shell Programming and Scripting

Awk command with two source files

Hello, I have two source files: sourcefile1.dat: 12345 xxx yyy zzz 23456 qqq ttt rrr 34567 ppp jjj ggg 45678 fff ddd sss 56789 nnn mmm ccc sourcefile2.dat: 12345.gif 34567.gif I want to obtain a simple awk one-liner to obtain the following: xxx yyy zzz 12345.gif qqq ttt rrr... (15 Replies)
Discussion started by: palex
15 Replies
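
A sketch of the standard two-file awk idiom, assuming the goal is to print the three text columns followed by the matching .gif name for every ID listed in sourcefile2.dat:

Code:
# First pass (NR == FNR) loads the IDs from sourcefile2.dat with the
# .gif suffix stripped; the second pass prints matching rows from
# sourcefile1.dat with the image name appended.
awk 'NR == FNR { id = $1; sub(/\.gif$/, "", id); want[id]; next }
     $1 in want { print $2, $3, $4, $1 ".gif" }' sourcefile2.dat sourcefile1.dat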

9. UNIX for Dummies Questions & Answers

Concatenate files and delete source files. Also have to add a comment.

I need to concatenate 3 files which have the same characters at the beginning, then remove those files and add a comment at the end. Example: cat REJ_FILE_ABC.txt REJ_FILE_XYZ.txt REJ_FILE_PQR.txt >... (0 Replies)
Discussion started by: eskay
0 Replies
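
A sketch under stated assumptions: the combined file name (REJ_FILE_ALL.txt) and the trailing comment text are made up, since the actual target name is truncated above:

Code:
# Concatenate the three reject files; only if that succeeds, remove the
# originals and append a closing comment line.
cat REJ_FILE_ABC.txt REJ_FILE_XYZ.txt REJ_FILE_PQR.txt > REJ_FILE_ALL.txt &&
    rm -f REJ_FILE_ABC.txt REJ_FILE_XYZ.txt REJ_FILE_PQR.txt &&
    echo "# end of rejected records" >> REJ_FILE_ALL.txt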

10. Shell Programming and Scripting

Count error files while copying files from source to destination location, as well as count successful files

Hi all, can anyone answer my requirement? I have a source location src_dir="/home/oracle/arun/IRMS-CM" and a target location dest_dir="/home/oracle/arun/LiveLink/IRMS-CM/$dc/$pc/$ct". My source text files follow the example below; the text file content is $fn "\t" $dc "\t" $pc "\t" ... (3 Replies)
Discussion started by: sravanreddy
3 Replies
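
A sketch of the counting part, assuming src_dir and dest_dir are set as in the excerpt (the $dc/$pc/$ct components are left as they are) and that the destination directory already exists:

Code:
# Copy each text file and keep separate success / failure counters.
ok=0 fail=0
for f in "$src_dir"/*.txt; do
    if cp "$f" "$dest_dir"/; then
        ok=$((ok + 1))
    else
        fail=$((fail + 1))
    fi
done
echo "copied: $ok  failed: $fail"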