Problem to read archive
Post 302368135 by rdcwayx on Wednesday 4th of November 2009 07:15:27 AM
Code:
$ awk '{print $3}' file.txt | sort | uniq -c | sort -k2 | awk '{print $2"_file.txt with", $1, "records"}'
02_file.txt with 3 records
06_file.txt with 2 records
08_file.txt with 1 records
09_file.txt with 1 records
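For reference, the same pipeline spread over several lines with a comment per stage (purely a restatement of the one-liner above, using the same sample file.txt):
Code:
awk '{print $3}' file.txt |                          # pull out the third field (the 2-digit key)
  sort |                                             # uniq -c only counts adjacent duplicates, so sort first
  uniq -c |                                          # prefix each distinct key with its count
  sort -k2 |                                         # re-order by the key (field 2) rather than by the count
  awk '{print $2"_file.txt with", $1, "records"}'    # print "<key>_file.txt with <count> records"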

Code:
$ awk '{a[$3]++} END {for (i in a) print i "_file.txt with", a[i], "records"}' file.txt | sort
02_file.txt with 3 records
06_file.txt with 2 records
08_file.txt with 1 records
09_file.txt with 1 records
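If GNU awk is available, the trailing sort can be folded into the script itself with asorti() (a sketch only; asorti() is a gawk extension and is not in POSIX awk). The output is the same as above.
Code:
awk '{count[$3]++}                                   # count records per value of field 3
     END {
         n = asorti(count, keys)                     # gawk-only: keys[1..n] = sorted keys of count
         for (i = 1; i <= n; i++)
             print keys[i] "_file.txt with", count[keys[i]], "records"
     }' file.txt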

