05-19-2009
Schedule a cron job that dumps the ps output to a file, like this:
ps -whatever >> psoutput.txt
After you have collected all the needed data in psoutput.txt, the file can be easily processed by a suitable awk program.
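A minimal sketch of that pipeline, assuming a 5-minute sampling interval, the log path /tmp/psoutput.txt, and a specific ps field list (all illustrative), with an awk program that averages %CPU per command:

```shell
# crontab -e entry (shown commented out): sample ps every 5 minutes,
# appending pid, %CPU, %MEM and command name to the log file.
# */5 * * * * ps -eo pid,pcpu,pmem,comm >> /tmp/psoutput.txt

# Once enough samples are collected, summarize them -- here, the
# average %CPU per command name across all samples:
awk '{ cpu[$4] += $2; n[$4]++ }
     END { for (c in cpu) printf "%-15s %.2f\n", c, cpu[c]/n[c] }' /tmp/psoutput.txt
```

The exact ps field list varies between systems, so check your ps(1) before scheduling the job.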
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi,
I have a huge file of bibliographic records in some standard format. I need a script to perform some repeatable tasks as follows:
1. Create a folder for each string starting with "item_*" in the input file
2. Create a file "contents" in each folder having "license.txt(tab... (5 Replies)
Discussion started by: srsahu75
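A hedged sketch of one way to attack discussion 1, assuming the records live in a file called records.txt and that the folder names are the item_* tokens themselves; the "contents" requirement is truncated above, so the file is simply created empty here:

```shell
# Pull out every token that starts with "item_", de-duplicate, and
# create one directory per token with an empty "contents" file inside.
grep -o 'item_[A-Za-z0-9_]*' records.txt | sort -u |
while read -r d; do
    mkdir -p "$d"
    : > "$d/contents"    # the real contents are truncated in the question
done
```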
2. Shell Programming and Scripting
Input file data contents:
>seq_1
MSNQSPPQSQRPGHSHSHSHSHAGLASSTSSHSNPSANASYNLNGPRTGGDQRYRASVDA
>seq_2
AGAAGRGWGRDVTAAASPNPRNGGGRPASDLLSVGNAGGQASFASPETIDRWFEDLQHYE
>seq_3
ATLEEMAAASLDANFKEELSAIEQWFRVLSEAERTAALYSLLQSSTQVQMRFFVTVLQQM
ARADPITALLSPANPGQASMEAQMDAKLAAMGLKSPASPAVRQYARQSLSGDTYLSPHSA... (7 Replies)
Discussion started by: patrick87
3. UNIX for Dummies Questions & Answers
Hi All,
HP-UX dev4 B.11.11 U 9000/800 3251073457
I need to copy a large amount of text from a Windows file into the vi editor. When I tried to copy it, the formatting was not preserved and the text appeared scattered through vi, something like given below. Please let me know how I can correct this?
... (18 Replies)
Discussion started by: alok.behria
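Two things usually fix scattered pastes like the one in discussion 3: turning off auto-indent in vi while pasting, and stripping DOS line endings. A sketch, with illustrative filenames:

```shell
# Inside vim, before pasting from the Windows clipboard:
#   :set paste      <- disables auto-indent while pasting
#   :set nopaste    <- restore normal behaviour afterwards
# If the file arrived with CR/LF line endings, remove the carriage
# returns on the UNIX side:
tr -d '\r' < windows.txt > unix.txt
```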
4. Shell Programming and Scripting
Hi,
I have to search for a number in a very long list of files; the total size of the files I have to search is 10 terabytes.
How can I search for a number in such a huge amount of data effectively? I used fgrep but it is taking many hours to search. Is there any other feasible solution to... (3 Replies)
Discussion started by: vsachan
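For discussion 4, one common speed-up (a sketch, not a guarantee at 10 TB) is to run several fixed-string greps in parallel and only list matching files; /data and the number searched for are placeholders:

```shell
# Batch the files 16 at a time across 4 parallel grep -F processes;
# -l makes each grep stop at the first match per file.
find /data -type f -print0 |
    xargs -0 -n 16 -P 4 grep -F -l '1234567890' 2>/dev/null
```

Setting LC_ALL=C can also speed grep up considerably on systems that default to a UTF-8 locale.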
5. Red Hat
Hi All,
My disk usage shows 100%. When I check with "df -kh" it shows my root partition is full, but when I run "du -skh /" it shows only 7 GB used.
Filesystem Size Used Avail Use% Mounted on
/dev/sda1 30G 28G 260M 100% /
How can I identify what is using the remaining 20 GB of disk space?
Os: Centos... (10 Replies)
Discussion started by: kalpeer
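A df/du gap like the one in discussion 5 is classically caused by deleted files that a process still holds open: df counts the blocks, but du can no longer see a name for them. A diagnostic sketch (lsof must be installed):

```shell
# Files that are deleted but still held open (link count 0):
lsof +L1 2>/dev/null | grep -i deleted

# Independently, find the biggest directories on this filesystem only
# (-x keeps du from crossing mount points):
du -xk / 2>/dev/null | sort -rn | head -20
```

Restarting (or signalling) the process that holds the deleted file open releases the space.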
6. Shell Programming and Scripting
Hi Friends,
I have a file with sample amount data as follows:
-89990.3456
8788798.990000128
55109787.20
-12455558989.90876
I need to exclude the '-' symbol in order to treat all values as absolute ones, and then I need to sum them up. The record count is around 1 million.
How... (8 Replies)
Discussion started by: Ravichander
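For discussion 6, awk can take the absolute value inline while accumulating, and printf keeps a large total out of scientific notation. A sketch, assuming the amounts sit one per line in a file named amounts.txt:

```shell
# Negate negative values, accumulate, and print the total with
# fixed-point formatting.
awk '{ x = ($1 < 0) ? -$1 : $1; sum += x }
     END { printf "%.2f\n", sum }' amounts.txt
```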
7. Shell Programming and Scripting
Dear all ,
I found that if we work with thousands of lines of data, awk does not work perfectly. It cuts the file down to hundreds of lines (the others are deleted) and works only on the remaining data.
I used this command: awk '$1==1{$1="Si"}{print>FILENAME}' coba.xyz to change the value of the first column where that value is 1... (4 Replies)
Discussion started by: ariesto
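The "disappearing lines" in discussion 7 are most likely because print > FILENAME writes back into the very file awk is still reading: once the output buffer flushes, the unread remainder is lost. Writing to a temporary file and renaming avoids it:

```shell
# Never overwrite the input while awk is still reading it; go through
# a temporary file and rename only on success.
awk '$1 == 1 { $1 = "Si" } { print }' coba.xyz > coba.xyz.tmp &&
    mv coba.xyz.tmp coba.xyz
```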
8. Solaris
Dear Experts,
I would like to know the best method for copying around 3 million files (spread over a hundred folders, each file around 1 KB) between 2 servers?
I already tried using the rsync and tar commands, but these commands take too long.
Please advice.
Thanks
Edy (11 Replies)
Discussion started by: edydsuranta
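For discussion 8, millions of 1 KB files pay a per-file round trip with file-by-file tools; streaming a single tar archive through ssh usually wins. Host name and paths below are placeholders:

```shell
# Pack everything on the fly and unpack on the remote end: one TCP
# stream instead of millions of per-file operations.
tar -C /source/dir -cf - . | ssh user@remotehost 'tar -C /dest/dir -xf -'
```

Adding compression (tar -z, or ssh -C) can help further on slow links.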
9. Shell Programming and Scripting
HI Guys,
I have Big XML file with Below Format :-
Input :-
<pokl>MKL=1,FN=1,GBNo=B10C</pokl>
<d>192</d>
<d>315</d>
<d>35</d>
<d>0,7,8</d>
<pokl>MKL=1,dFN=1,GBNo=B11C</pokl>
<d>162</d>
<d>315</d>
<d>35</d>
<d>0,5,6</d>
<pokl>MKL=1,dFN=1,GBNo=B12C</pokl>
<d>188</d> (4 Replies)
Discussion started by: pareshkp
10. UNIX for Advanced & Expert Users
Optimizing a shell/awk script to aggregate (sum) all the columns of a huge data file
File delimiter: "|"
Need the sum of all columns, printed with the column number: an aggregation (summation) for each column
The file has no header.
Like below -
Column 1 : Total
Column 2 : Total
...
...... (2 Replies)
Discussion started by: kartikirans
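Discussion 10 can be done in a single awk pass; this sketch assumes the input is named datafile and that rows may have differing field counts:

```shell
# Accumulate every field of every row; remember the widest row so
# shorter rows do not hide trailing columns.
awk -F'|' '{ for (i = 1; i <= NF; i++) sum[i] += $i
             if (NF > maxnf) maxnf = NF }
           END { for (i = 1; i <= maxnf; i++)
                     printf "Column %d : %s\n", i, sum[i]+0 }' datafile
```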
LEARN ABOUT DEBIAN
sc_warts2text
SC_WARTS2TEXT(1) BSD General Commands Manual SC_WARTS2TEXT(1)
NAME
sc_warts2text -- simple dump of information contained in a warts file.
SYNOPSIS
sc_warts2text [-d ip2descr-file] [file ...]
DESCRIPTION
The sc_warts2text utility provides a simple dump of information contained in a sequence of warts files. The output is the same as that which
would have been provided by scamper if the text output option had been chosen instead of the warts output option when the data was collected.
The options are as follows:
-d ip2descr-file
specifies the name of a file with IP-address, description mappings, one mapping per line. See the examples section for further
information.
While the output of sc_warts2text is structured and suitable for initial analyses of results, the format of the output is not suitable for
automated parsing and analysis as the output of sc_warts2text will change over time with no regard to backwards compatibility. Analyses of
the contents of a warts file should be made using specialised programs which link against the scamper file API.
EXAMPLES
The command:
sc_warts2text file1.warts file2.warts
will decode and print the contents of file1.warts, followed by the contents of file2.warts.
The command:
gzcat file1.warts.gz | sc_warts2text
will print the contents of the uncompressed file supplied on stdin.
Given a set of IP-address, description pairs in a file named mappings.txt:
192.0.2.1 "foo"
192.0.2.2 "bar"
then the command gzcat file1.warts.gz | sc_warts2text -d mappings.txt will print the description associated with a given destination address
before each result is presented.
SEE ALSO
scamper(1), sc_wartsdump(1)
AUTHORS
sc_warts2text is written by Matthew Luckie <mjl@luckie.org.nz>.
BSD                              October 15, 2010                              BSD