Shell Programming and Scripting: Removing duplicate records in a file based on single column
Post 302548986 by jgt, Saturday 20th of August 2011, 07:44:44 AM
Quote:
Originally Posted by G.K.K
I am allowed to use the awk/sed command alone. Can someone give a suggestion on how exactly I can code it in a single command line?
Who makes up these rules, and why????
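For anyone finding this thread later, a minimal sketch of the usual single-command awk answer, assuming a whitespace-delimited file keyed on the first column (the file names and the -F separator below are placeholders, not from the original post):

Code:
# print a line only the first time its column-1 value is seen
awk '!seen[$1]++' input.txt > deduped.txt

# same idea for a comma-separated file
awk -F',' '!seen[$1]++' input.csv > deduped.csv

Unlike sort -u, this keeps the surviving lines in their original order, because awk makes a single pass and only remembers which keys it has already printed.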
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Filtering records of a file based on a value of a column

Hi all, I would like to extract records of a file based on a condition. The file contains 47 fields, and I would like to extract only those records that match a certain value in one of the columns, e.g. COL1 COL2 COL3 ............... COL47 1 XX 45 ... (4 Replies)
Discussion started by: risk_sly
4 Replies
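A hedged sketch of the kind of one-liner that thread is after, assuming a whitespace-separated file and a test on one column against a literal value (column numbers, values and the file name are illustrative only):

Code:
# keep only records whose third field equals 45 (numeric comparison)
awk '$3 == 45' data.txt

# exact string match on a text column instead
awk '$2 == "XX"' data.txt

With 47 fields, the only thing that changes is the field number in the condition.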

2. Shell Programming and Scripting

Find Duplicate records in first Column in File

Hi, I need to find duplicate records on the first column: ANU4501710430989 0000000W20389390 ANU4501710430989 0000000W67065483 ANU4501130050520 0000000W80838713 ANU4501210170685 0000000W69246611... (3 Replies)
Discussion started by: Murugesh
3 Replies
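A common way to report only the records whose first field occurs more than once is a two-pass awk; a sketch, with the file name as a placeholder:

Code:
# pass 1 (NR==FNR) counts each key; pass 2 prints lines whose key appeared more than once
awk 'NR==FNR {count[$1]++; next} count[$1] > 1' file file

Passing the same file twice lets the second pass see the final counts while preserving the original line order.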

3. Shell Programming and Scripting

Removing duplicate records from 2 files

Can anyone help me with removing duplicate records from 2 separate files in UNIX? Please find sample records for both files: cat Monday.dat 3FAHP0JA1AR319226MOHMED ATEK 966504453742 SAU2010DE 3LNHL2GC6AR636361HEA DEUK CHOI 821057314531 KOR2010LE 3MEHM0JG7AR652083MUTLAB NAL-NAFISAH... (4 Replies)
Discussion started by: zooby
4 Replies
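The requirement can be read more than one way; a sketch under the assumption that the goal is to print records of the second file that do not already appear in the first, comparing whole lines (the second file name is invented for illustration):

Code:
# remember every line of Monday.dat, then print only Tuesday.dat lines not seen there
awk 'NR==FNR {seen[$0]; next} !($0 in seen)' Monday.dat Tuesday.dat

Swap the file order, or key on a specific field such as $1, if the duplicates are defined differently.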

4. Shell Programming and Scripting

duplicate row based on single column

I am a newbie to shell scripting. I have a .csv file with about 1000 rows and 7 columns, but before I insert this data into a table I have to parse it and clean it based on the value of the first column, which is a string of phone-number type... example below.. column 1 ... (2 Replies)
Discussion started by: mitr
2 Replies

5. Linux

Need awk script for removing duplicate records

I have a log file containing traffic lines: 2011-05-21 15:11:50.356599 TCP (6), length: 52) 10.10.10.1.3020 > 10.10.10.254.50404: 2011-05-21 15:11:50.652739 TCP (6), length: 52) 10.10.10.254.50404 > 10.10.10.1.3020: 2011-05-21 15:11:50.652558 TCP (6), length: 89) 10.10.10.1.3020 >... (1 Reply)
Discussion started by: Rastamed
1 Replies
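Every line in that log carries a unique timestamp, so deduplicating on the whole line would keep everything; a hedged sketch that instead keys on the source and destination fields, assuming they land in whitespace-separated positions 7 and 9 as in the sample lines (the file name is a placeholder):

Code:
# keep the first line seen for each source > destination pair
awk '!seen[$7, $9]++' traffic.log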

6. UNIX for Dummies Questions & Answers

Remove duplicate rows when >10 based on single column value

Hello, I'm trying to delete duplicates when there are more than 10 duplicates, based on the value of the first column. e.g. a 1 a 2 a 3 b 1 c 1 gives b 1 c 1, but it requires 11 duplicates before it deletes. Thanks for the help... (11 Replies)
Discussion started by: informaticist
11 Replies
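One plausible reading is to drop every line whose first-column value occurs 11 or more times and keep everything else; a two-pass awk sketch under that assumption (threshold and file name are placeholders):

Code:
# pass 1 counts each key; pass 2 keeps lines whose key occurs 10 times or fewer
awk 'NR==FNR {count[$1]++; next} count[$1] <= 10' file file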

7. Shell Programming and Scripting

Removing duplicate records in a file based on single column explanation

I was reading this thread. It looks like a simpler way to say this is to keep only unique lines based on field or column 1. https://www.unix.com/shell-programming-scripting/165717-removing-duplicate-records-file-based-single-column.html Can someone explain this command, please? How are there no... (5 Replies)
Discussion started by: cokedude
5 Replies
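For readers with the same question, a commented breakdown of the usual form of that idiom (the array name is arbitrary); this is the standard explanation of awk's associative-array trick:

Code:
# x[$1]   : element of the associative array x, keyed by the first field
# x[$1]++ : evaluates to the current count (0 the first time), then increments it
# !       : negates it, so the expression is true only on a key's first occurrence
# a true pattern with no action block makes awk print the current line
awk '!x[$1]++' file

There are no duplicates in the output simply because every later occurrence of a key finds a non-zero counter, the negation is false, and the line is skipped.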

8. Shell Programming and Scripting

Removing duplicate lines on first column based with pipe delimiter

Hi, I have tried to remove duplicate lines based on the first column with a pipe delimiter, but I am not able to get unique lines. Command: sort -t'|' -nuk1 file.txt Input: 38376KZ|09/25/15|1.057 38376KZ|09/25/15|1.057 02006YB|09/25/15|0.859 12593PS|09/25/15|2.803... (2 Replies)
Discussion started by: parithi06
2 Replies
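The likely culprit is -n, which compares the keys numerically and so mishandles alphanumeric IDs such as 38376KZ, combined with -k1, which extends the key to the end of the line; a sketch of two alternatives, assuming the goal is one line per value of the first pipe-separated field:

Code:
# sort-based: restrict the key to field 1 only and drop the numeric flag
sort -t'|' -u -k1,1 file.txt

# awk-based: same result while keeping the original line order
awk -F'|' '!seen[$1]++' file.txt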

9. Shell Programming and Scripting

Filter duplicate records from csv file with condition on one column

I have a csv file with 30-40 columns; pasting just three columns for the problem description. I want to filter records: if column 1 matches CN or DN, then check the values in column 2; if column 2 contains 1235, 1235 then the column 3 values must be the sequence 2345, 2345, and if column 2 contains 6789, 6789... (5 Replies)
Discussion started by: as7951
5 Replies

10. Shell Programming and Scripting

CSV File:Filter duplicate records from column1 & another column having unique record

Hi Experts, I have a csv file with 30-40 columns; pasting just 2 columns for the problem description. I need to print an error if the below combination is not present in the file: check column 1 (DocumentNumber) and filter columns where the value in the DocumentNumber field is the same. For all such rows, the field... (7 Replies)
Discussion started by: as7951
7 Replies