UNIX for Advanced & Expert Users: A variable and sum of its value in a huge data. Post 302317436 by varungupta on Tuesday 19th of May 2009 02:49:04 AM
Quote:
Originally Posted by ghostdog74
You can format your ps output using the -o option, e.g. ps -eo args,pid ... check the man page for more info
Suppose I get the two columns I highlighted above: the variable (process) names in bold and their values in red. How can I get, for each of those variables, the sum of its value over an hour, sampled with a gap of 10 seconds?
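A minimal sketch of one way to do this, assuming the two columns are the command name and its RSS (any ps -o columns can be substituted); the sample file name is a placeholder:

Code:
#!/bin/sh
# Sample ps output every 10 seconds for an hour (360 samples), then report
# the accumulated value per command name.
OUT=ps_samples.txt
: > "$OUT"

i=0
while [ "$i" -lt 360 ]; do
    ps -eo comm=,rss= >> "$OUT"     # "=" suppresses the column headers
    i=$((i + 1))
    sleep 10
done

awk '{ sum[$1] += $2 }
     END { for (c in sum) printf "%s %.0f\n", c, sum[c] }' "$OUT"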
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to extract data from a huge file?

Hi, I have a huge file of bibliographic records in some standard format. I need a script to do some repeatable tasks as follows: 1. Create a folder for each string starting with "item_*" in the input file 2. Create a file "contents" in each folder containing "license.txt(tab... (5 Replies)
Discussion started by: srsahu75
5 Replies
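A minimal sketch of the folder-creation part (the input file name, and what goes into "contents", are assumptions, since the post is truncated):

Code:
#!/bin/sh
# For every token starting with "item_" in records.txt, create a directory of
# that name and an (empty) "contents" file inside it.
tr -s ' \t' '\n' < records.txt | grep '^item_' | sort -u |
while IFS= read -r item; do
    mkdir -p "$item"
    : > "$item/contents"    # the real contents are cut off in the post
done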

2. Shell Programming and Scripting

Split huge data into a few different files?!

Input file data contents: >seq_1 MSNQSPPQSQRPGHSHSHSHSHAGLASSTSSHSNPSANASYNLNGPRTGGDQRYRASVDA >seq_2 AGAAGRGWGRDVTAAASPNPRNGGGRPASDLLSVGNAGGQASFASPETIDRWFEDLQHYE >seq_3 ATLEEMAAASLDANFKEELSAIEQWFRVLSEAERTAALYSLLQSSTQVQMRFFVTVLQQM ARADPITALLSPANPGQASMEAQMDAKLAAMGLKSPASPAVRQYARQSLSGDTYLSPHSA... (7 Replies)
Discussion started by: patrick87
7 Replies
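One common way to split FASTA-style input like this into separate files (a sketch; the numbered output names and input file name are assumptions):

Code:
awk '/^>/      { if (out != "") close(out); out = sprintf("seq_%d.fa", ++n) }
     out != "" { print > out }' input.fasta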

3. UNIX for Dummies Questions & Answers

Copy huge data into vi editor

Hi All, HP-UX dev4 B.11.11 U 9000/800 3251073457 I need to copy huge data from a Windows text file into the vi editor. When I tried to copy the data, its format was not preserved and it appeared scattered throughout vi, something like the sample given below. Please let me know how I can correct this? ... (18 Replies)
Discussion started by: alok.behria
18 Replies
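Two common fixes for pasted text getting scattered in vi (a sketch, not an answer taken from that thread; the file names are placeholders):

Code:
# Inside vi, switch autoindent off before pasting, and back on afterwards:
#   :set noautoindent
#   :set autoindent
# Or skip the paste entirely: transfer the Windows file to the host, strip the
# carriage returns, and open the result in vi.
tr -d '\r' < windows_file.txt > unix_file.txt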

4. Shell Programming and Scripting

Search a number in a very, very huge amount of data

Hi, I have to search for a number in a very long list of files. The total size of the files I have to search is 10 terabytes. How can I search for a number in such a huge amount of data effectively? I used fgrep but it is taking many hours. Is there any other feasible solution to... (3 Replies)
Discussion started by: vsachan
3 Replies
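A sketch of a faster search, assuming GNU find/xargs are available (the path, search string, and degree of parallelism are placeholders): a fixed-string match in the C locale, run by several grep processes in parallel.

Code:
find /data -type f -print0 |
    xargs -0 -n 64 -P 8 env LC_ALL=C grep -F -l -- '1234567890'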

5. Red Hat

Disk is Full but really does not contain huge data

Hi All, My disk usage shows 100%. When I check with "df -kh" it shows my root partition is full, but when I run "du -skh /" it shows only 7 GB used. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 28G 260MB 100% / How can I identify what is using the remaining 20 GB of disk space? OS: CentOS... (10 Replies)
Discussion started by: kalpeer
10 Replies
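Two usual suspects when df and du disagree like this (a sketch; the mount point is a placeholder):

Code:
# 1. Files deleted while a process still holds them open: their blocks are not
#    freed until the process exits. lsof +L1 lists open files with link count 0.
lsof +L1

# 2. Data written into a directory before another filesystem was mounted on top
#    of it: du on the live tree cannot see it. Bind-mount / elsewhere and re-check.
mkdir -p /mnt/rootonly
mount --bind / /mnt/rootonly
du -skh /mnt/rootonly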

6. Shell Programming and Scripting

Aggregation of huge data

Hi Friends, I have a file with sample amount data as follows: -89990.3456 8788798.990000128 55109787.20 -12455558989.90876 I need to strip the '-' sign so that all values are treated as absolute values, and then sum them up. The record count is around 1 million. How... (8 Replies)
Discussion started by: Ravichander
8 Replies
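A minimal awk sketch, assuming one amount per line as in the sample (the file name is a placeholder); awk's double precision is normally enough for a running total over a million such values:

Code:
awk '{ s += ($1 < 0 ? -$1 : $1) } END { printf "%.6f\n", s }' amounts.txt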

7. Shell Programming and Scripting

awk does not work well with huge data?

Dear all, I found that when working with thousands of lines of data, awk does not behave as expected: it keeps only a few hundred lines (the rest are deleted) and works only on the remaining data. I used this command: awk '$1==1{$1="Si"}{print>FILENAME}' coba.xyz to change the value of the first column whose value is 1... (4 Replies)
Discussion started by: ariesto
4 Replies
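The truncation comes from print > FILENAME, which writes back into the file awk is still reading. A safer pattern (a sketch, using the file name from the post) writes to a temporary file and renames it afterwards:

Code:
awk '$1 == 1 { $1 = "Si" } { print }' coba.xyz > coba.xyz.tmp &&
    mv coba.xyz.tmp coba.xyz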

8. Solaris

The fastest way to copy huge data

Dear Experts, I would like to know the best method for copying around 3 million files (spread across a hundred folders, each file around 1 KB) between 2 servers. I have already tried rsync and tar, but both take too long. Please advise. Thanks Edy (11 Replies)
Discussion started by: edydsuranta
11 Replies
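For millions of tiny files, a single streamed archive usually beats per-file copying. A sketch (the host and paths are placeholders):

Code:
cd /source/dir &&
    tar cf - . | ssh user@desthost 'cd /dest/dir && tar xf -'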

9. Shell Programming and Scripting

Parse XML with Huge Data

Hi Guys, I have a big XML file with the format below: Input: <pokl>MKL=1,FN=1,GBNo=B10C</pokl> <d>192</d> <d>315</d> <d>35</d> <d>0,7,8</d> <pokl>MKL=1,dFN=1,GBNo=B11C</pokl> <d>162</d> <d>315</d> <d>35</d> <d>0,5,6</d> <pokl>MKL=1,dFN=1,GBNo=B12C</pokl> <d>188</d> (4 Replies)
Discussion started by: pareshkp
4 Replies
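A minimal awk sketch for this tag-per-line layout (printing each GBNo followed by its <d> values on one line is an assumption about the desired output; a real XML parser is preferable if the layout ever changes):

Code:
awk '
  /<pokl>/ { if (key != "") print key, vals
             key = $0; sub(/.*GBNo=/, "", key); sub(/<\/pokl>.*/, "", key)
             vals = "" }
  /<d>/    { v = $0; sub(/.*<d>/, "", v); sub(/<\/d>.*/, "", v)
             vals = vals " " v }
  END      { if (key != "") print key, vals }
' input.xml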

10. UNIX for Advanced & Expert Users

Need an optimized shell/awk script to aggregate (sum) all the columns of a huge data file

Need an optimized shell/awk script to aggregate (sum) all the columns of a huge data file. The file delimiter is "|" and the file has no header. I need the sum of every column, reported with its column number, like below: Column 1 : "Total" Column 2 : "Total" ... ...... (2 Replies)
Discussion started by: kartikirans
2 Replies
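A minimal sketch for a pipe-delimited file with no header (the file name is a placeholder): accumulate a running total per column in a single awk pass.

Code:
awk -F'|' '
  { for (i = 1; i <= NF; i++) sum[i] += $i
    if (NF > maxnf) maxnf = NF }
  END { for (i = 1; i <= maxnf; i++) printf "Column %d : %.10g\n", i, sum[i] + 0 }
' hugefile.dat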