Getting file size from memory

Discussion started by: SkySmart, 12-13-2012

I want to avoid writing a file to disk; I'd like to do this in memory.

I have a situation where I'm running:
Code:
head -n 2024 file.txt > /tmp/data.txt

Now I check the size of data.txt by running:
Code:
du -sh /tmp/data.txt

How can I get the size of the "head -n 2024" output WITHOUT having to write to /tmp/data.txt?

Basically, I would like to avoid the "> /tmp/data.txt" step.
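One way to do this, assuming a byte count is what you are after: pipe the output of head straight into wc instead of redirecting it to a file, so nothing is written to disk.

Code:
# exact byte count of the first 2024 lines, no temp file needed
head -n 2024 file.txt | wc -c

# or capture the number in a shell variable for later tests
size=$(head -n 2024 file.txt | wc -c)
echo "head output is $size bytes"

Note that du -sh reports allocated disk blocks, rounded up to the filesystem block size, while wc -c reports the exact byte count, so the two figures will normally differ.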


10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to get memory size in HP and not as user root?

Hi folks, I'm trying to get the memory size in HP-UX as user oracle. The command to do it is: /usr/sbin/dmesg | grep "Physical" | awk '{print $2}' The problem is that only user root can run this command, and I need to run it as user oracle. Do you know another way to get the memory size in HP... (2 Replies)
Discussion started by: nir_s
2 Replies
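A possible non-root alternative for the HP-UX thread above, assuming an 11i v2 or later Itanium system where machinfo is available (path and output format vary by release):

Code:
# machinfo prints a hardware summary including a Memory line,
# and does not normally require root
/usr/contrib/bin/machinfo | grep -i memory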

2. UNIX for Advanced & Expert Users

memory size under AIX

Hi, how do I find the size of physical memory under AIX? Many thanks. PS: man -k memory returns: 0703-310 man file not found. uname -a: AIX server1 1 5 005202DF4C00 (3 Replies)
Discussion started by: big123456
3 Replies
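For the AIX question above, a sketch using two stock commands that report physical memory and do not require root:

Code:
# usable physical memory, reported in kilobytes
lsattr -El sys0 -a realmem

# prtconf prints a "Memory Size:" line among its hardware details
prtconf | grep "Memory Size"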

3. Solaris

Command to check memory size

Hi, I am looking for a command on HP whereby I can see the CPU usage increasing for a given process... I know I can see this from top/prstat, but those show all processes. I want something like ps that I can call from a shell script a few times and check whether it has increased... (0 Replies)
Discussion started by: nano2
0 Replies
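For polling a single process from a script, as asked above, ps can be limited to one PID; a sketch where 1234 is a placeholder PID (on HP-UX the -o option requires the UNIX95 environment variable to be set):

Code:
# %CPU and virtual size for one PID; run repeatedly and compare
UNIX95=1 ps -o pcpu=,vsz= -p 1234

# e.g. sample every 5 seconds
while sleep 5; do UNIX95=1 ps -o pcpu=,vsz= -p 1234; done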

4. Solaris

Finding out the memory size via the iLOM

I would like to know if it is possible to find out how much memory is in a machine from the iLOM prompt on an x86 box. I have retrieved the MAC address details from the iLOM prompt before, using show /SYS/MB/NETx, and I am wondering if I can do the same for the memory, although I can't seem to find anything... (4 Replies)
Discussion started by: Chains
4 Replies
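For the iLOM question, a heavily hedged sketch: the service processor exposes hardware as CLI targets under /SYS, and show accepts a -level option, so dumping the whole tree and looking for DIMM targets may work, though target names vary by platform and firmware revision:

Code:
-> show -level all /SYS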

5. Solaris

How to know the size of the program currently executing in memory

Hey everybody, I am currently working on the Solaris 10 OS on an M5000 server. My problem is that when I want the exact size of a program in execution, I am unable to get it. Earlier I considered the RSS field of prstat, but its value is too large to be the program's size. pmap -x shows some output, but it includes... (2 Replies)
Discussion started by: aryansheikh
2 Replies
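For the Solaris sizing question above, the summary line of pmap gives per-process totals; a sketch that inspects the current shell via $$ for illustration:

Code:
# the last line of pmap -x is the total mapped/resident Kb for the process
pmap -x $$ | tail -1

# ps can report the same pair of numbers, in kilobytes
ps -o vsz=,rss= -p $$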

6. IP Networking

IPMI - Get physical memory size

Hi, does anybody know how to get the RAM size of a powered-off server (OS off) with a network hardware-management protocol like IPMI? Thanks (0 Replies)
Discussion started by: sncr24
0 Replies
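For the IPMI question, the BMC answers over the LAN even when the host OS is down, and on many boards the FRU inventory includes the DIMMs; a sketch with placeholder host and credentials:

Code:
# query the service processor directly; host power state does not matter
ipmitool -I lanplus -H bmc-host -U admin -P secret fru print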

7. UNIX for Advanced & Expert Users

Out of Memory error when free memory size is large

I was running a program and it stopped and showed "Out of Memory!". At that time, the RAM used by the process was around 4 GB and the free memory on the machine was around 30 GB. Does anybody know what the reason may be? The program is written in Perl. The OS of the machine is Solaris U8. And I... (1 Reply)
Discussion started by: lilili07
1 Replies
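A common culprit in the situation above is a per-process limit rather than exhausted RAM, for example a shell ulimit or a 32-bit perl that cannot address more than 4 GB; two quick checks:

Code:
# look for a data/vmem limit far below the machine's free memory
ulimit -a

# a 32-bit binary is capped regardless of installed RAM
file "$(which perl)"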

8. UNIX for Dummies Questions & Answers

swap memory and original size of HD

A few questions: a. Where can I find the RAM size of a server? I'm about to install Red Hat on a server (reformat) and need to know, because it will be my basis for the swap size. I saw something like "3048MB detected" upon boot; is this the memory? b. What is the command in Linux to check the original size of... (2 Replies)
Discussion started by: lhareigh890
2 Replies
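For question (a) above, installed RAM is easy to confirm from the running system, and for (b) the kernel's partition table lists raw device sizes; a Linux sketch:

Code:
# total physical memory
grep MemTotal /proc/meminfo
free -m

# raw sizes of disks and partitions, in 1 KB blocks
cat /proc/partitions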

9. Solaris

Memory or CPU size

Is there a command or file I can look at that tells me how much real memory a machine has? A little background: in my shop we run a bunch of Java programs, and some of these jobs have config definitions that call for 2 GB. I would like to know how many I can run before I exhaust resources. Any... (12 Replies)
Discussion started by: Harleyrci
12 Replies
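For the capacity-planning question above, Solaris reports installed memory through prtconf; a sketch:

Code:
# prints a line such as "Memory size: 32768 Megabytes"
prtconf | grep "Memory size"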

10. Programming

Size of memory used by a program

Hello, here is a portion of my code: a = (int *) malloc(dim*dim*sizeof(int)); b = (int *) malloc(dim*dim*sizeof(int)); c = (int *) malloc(dim*dim*sizeof(int)); for(i=0;i<dim;i++) for(j=0;j<dim;j++) c[i*dim+j] = rand(); for(i=0;i<dim;i++) for(j=0;j<dim;j++) b[i*dim+j] = rand(); ... (6 Replies)
Discussion started by: chercheur111
6 Replies
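Assuming the flat [i*dim+j] indexing shown above (the original subscripts did not survive the forum software), the expected heap usage is simply 3 * dim * dim * sizeof(int), roughly 12 MB for dim=1000; the live numbers can be checked from a shell while the program runs, where 12345 is a placeholder PID:

Code:
# virtual and resident sizes in kilobytes; compare with 3*dim*dim*4 bytes
ps -o vsz=,rss= -p 12345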