help to identify duplicate columns adjacent value
Post 302512659 by tene, 04-11-2011 07:33 AM
Can you post some lines of the input file, the command you executed, and the output you got?
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Identify duplicate words in a line using command

Hi, Let me explain the problem clearly: Let the entries in my file be: lion,tiger,bear apple,mango,orange,apple,grape unix,windows,solaris,windows,linux red,blue,green,yellow orange,maroon,pink,violet,orange,pink Can we detect the lines in which one of the words (separated by field... (8 Replies)
Discussion started by: srinivasan_85
8 Replies
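
A minimal awk sketch for that check, assuming comma-separated fields and a hypothetical input file named words.txt:

    # print every line in which at least one field occurs more than once
    awk -F',' '{
        split("", seen)                  # clear the per-line lookup table
        for (i = 1; i <= NF; i++)
            if (seen[$i]++) { print; next }
    }' words.txt

On the sample above this prints the apple, windows, and orange/pink lines and skips the other two.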

2. Shell Programming and Scripting

how to identify duplicate columns in a row

Hi, How to identify duplicate columns in a row? Input data: may have 30 columns 9211480750 LK 120070417 920091030 9211480893 AZ 120070607 9205323621 O7 120090914 120090914 1420090914 2020090914 2020090914 9211479568 AZ 120070327 320090730 9211479571 MM 120070326 9211480892 MM 120070324... (3 Replies)
Discussion started by: suresh3566
3 Replies
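
A similar hedged sketch for whitespace-separated rows (data.txt is a placeholder name), reporting which column holds the repeat:

    # report each repeated value and the column where the repeat occurs
    awk '{
        split("", seen)
        for (i = 1; i <= NF; i++)
            if (seen[$i]++)
                printf "line %d: column %d repeats %s\n", NR, i, $i
    }' data.txt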

3. UNIX for Dummies Questions & Answers

Duplicate columns and lines

Hi all, I have a tab-delimited file and want to remove identical lines, i.e. all of line 1,2,4 because the columns are the same as the columns in other lines. Any input is appreciated. abc gi4597 9997 cgcgtgcg $%^&*()()* abc gi4597 9997 cgcgtgcg $%^&*()()* ttt ... (1 Reply)
Discussion started by: dr_sabz
1 Replies
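
If the goal is to drop every line that occurs more than once (not just the extra copies), one two-pass awk sketch, assuming the file is named input.tsv:

    # pass 1 counts each full line, pass 2 prints only lines seen exactly once
    awk 'NR == FNR { count[$0]++; next } count[$0] == 1' input.tsv input.tsv

Where input order does not matter, `sort input.tsv | uniq -u` does the same job.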

4. Shell Programming and Scripting

How to calculate the difference between two adjacent columns?

Dear All, I need to find the difference between two adjacent columns. The file has 'i' columns and I need to find the difference between each pair of adjacent columns (like $1 difference $2; $2 difference $3; .... and $(i-1) difference $i). I have used the following code awk '{ for (i=1; i<NF;... (7 Replies)
Discussion started by: Fredrick
7 Replies
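
Completing that loop, a sketch that prints the difference between each pair of adjacent columns (here as the next column minus the current one; swap the operands for the other direction), with numbers.txt as a placeholder file name:

    # print NF-1 adjacent-column differences per input line
    awk '{
        for (i = 1; i < NF; i++)
            printf "%s%s", $(i + 1) - $i, (i < NF - 1 ? OFS : ORS)
    }' numbers.txt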

5. Shell Programming and Scripting

Remove Duplicate by considering multiple columns

hi friends, my input chr1 exon 35204 35266 gene_id "GOLGB1"; transcript_id "GOLGB1"; chr1 exon 42357 42473 gene_id "GOLGB1"; transcript_id "GOLGB1"; chr1 exon 45261 45404 gene_id "GOLGB1"; transcript_id "GOLGB1"; chr1 exon 50701 50778 gene_id "GOLGB1"; transcript_id "GOLGB1";... (2 Replies)
Discussion started by: jacobs.smith
2 Replies
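
One hedged reading of that request is to keep the first line for each chromosome/start/end combination (columns 1, 3, and 4 in GTF-style input); adjust the key if duplicates are defined differently:

    # keep only the first occurrence of each $1/$3/$4 key
    awk '!seen[$1 FS $3 FS $4]++' input.gtf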

6. Shell Programming and Scripting

Check to identify duplicate values at first column in csv file

Hello experts, I have a requirement where I have to implement two checks on a csv file: 1. Check to see if the value in the first column is duplicated; if any value is duplicated the script should exit. 2. Check to verify that the value in the second column is either "yes" or "no"; if it is anything else... (4 Replies)
Discussion started by: avikaljain
4 Replies
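
A sketch of both checks in one awk pass (input.csv is a placeholder name); the trailing `|| exit 1` stops the calling script on the first failure:

    # abort on a duplicate key in column 1 or a flag other than yes/no in column 2
    awk -F',' '
        seen[$1]++                { print "line " NR ": duplicate key " $1; exit 1 }
        $2 != "yes" && $2 != "no" { print "line " NR ": flag must be yes or no"; exit 1 }
    ' input.csv || exit 1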

7. Shell Programming and Scripting

Identify max value in diff columns for same row

Hi, I have a file with 1M records ABC 200 400 2.4 5.6 ABC 410 299 12 1.5 XYZ 4 5 6 7 MNO 22 40 30 70 MNO 47 55 80 150 What I want: wherever rows share the same first column, take the per-column max value output ABC 410 400 12 5.6 XYZ 4 5 6 7 MNO 47 55 80 150 How can i... (6 Replies)
Discussion started by: Diya123
6 Replies
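
A sketch that keeps the per-column maximum for every repeated key in column 1, preserving first-seen key order (input.txt is a placeholder name):

    # track the maximum of each numeric column per key, then print one row per key
    awk '{
        if (!($1 in nf)) { order[++n] = $1; nf[$1] = NF }
        for (i = 2; i <= NF; i++)
            if (!(($1, i) in max) || $i + 0 > max[$1, i] + 0)
                max[$1, i] = $i
    }
    END {
        for (j = 1; j <= n; j++) {
            line = order[j]
            for (i = 2; i <= nf[order[j]]; i++)
                line = line OFS max[order[j], i]
            print line
        }
    }' input.txt

On the sample this yields ABC 410 400 12 5.6, XYZ 4 5 6 7, MNO 47 55 80 150.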

8. Shell Programming and Scripting

Count duplicate lines ignoring certain columns

I have this structure: col1 col2 col3 col4 col5 27 xxx 38 aaa ttt 2 xxx 38 aaa yyy 1 xxx 38 aaa yyy I need to collapse duplicate lines ignoring column 1 and add values of duplicate lines (col1) so it will look like this: col1 col2 col3 col4 col5 27 xxx 38 aaa ttt ... (3 Replies)
Discussion started by: coppuca
3 Replies
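
A sketch that sums column 1 over lines matching on columns 2-5, assuming the first line is a header (input.txt is a placeholder name):

    # collapse lines that agree on columns 2-5 and add up their column-1 values
    awk 'NR == 1 { print; next }                 # pass the header through
    {
        key = $2 FS $3 FS $4 FS $5
        if (!(key in sum)) order[++n] = key
        sum[key] += $1
    }
    END { for (i = 1; i <= n; i++) print sum[order[i]], order[i] }' input.txt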

9. Shell Programming and Scripting

Remove columns with duplicate entries

I have a 13gb file. It has the following columns: The 3rd column is basically correlation values. I want to delete those rows which are repeated between the columns: A B 0.04 B C 0.56 B B 1 A A 1 C D 1 C C 1 Desired Output: (preferably in a .csv format A,B,0.04 B,C,0.56 C,D,1... (3 Replies)
Discussion started by: Sanchari
3 Replies
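
A streaming sketch that drops self-pairs, keeps one copy of each unordered pair, and emits CSV; memory grows with the number of unique pairs rather than the 13gb file size:

    # skip rows where both labels match, print each unordered pair once as CSV
    awk '$1 != $2 {
        key = ($1 < $2) ? $1 FS $2 : $2 FS $1    # order-independent pair key
        if (!seen[key]++)
            print $1 "," $2 "," $3
    }' input.txt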

10. Shell Programming and Scripting

Identify duplicate values at first column in csv file

Input 1,ABCD,no 2,system,yes 3,ABCD,yes 4,XYZ,no 5,XYZ,yes 6,pc,no Code used to find duplicates with regard to the 2nd column: awk 'NR == 1 {p=$2; next} p == $2 { print "Line" NR "$2 is duplicated"} {p=$2}' FS="," ./input.csv Now is there a wise way to de-duplicate the entire line (remove... (4 Replies)
Discussion started by: deadyetagain
4 Replies
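
The excerpt is truncated, so two hedged readings: keep only the first line per column-2 value, or drop every line whose column-2 value repeats anywhere in the file:

    # keep the first occurrence of each value in column 2
    awk -F',' '!seen[$2]++' input.csv

    # or: two passes, keeping only lines whose column-2 value occurs exactly once
    awk -F',' 'NR == FNR { count[$2]++; next } count[$2] == 1' input.csv input.csv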
DB2_SPECIAL_COLUMNS(3)                                                   DB2_SPECIAL_COLUMNS(3)

db2_special_columns - Returns a result set listing the unique row identifier columns for a table

SYNOPSIS
       resource db2_special_columns (resource $connection, string $qualifier, string $schema, string $table_name, int $scope)

DESCRIPTION
       Returns a result set listing the unique row identifier columns for a table.

PARAMETERS
       $connection  - A valid connection to an IBM DB2, Cloudscape, or Apache Derby database.
       $qualifier   - A qualifier for DB2 databases running on OS/390 or z/OS servers. For other databases, pass NULL or an empty string.
       $schema      - The schema which contains the tables.
       $table_name  - The name of the table.
       $scope       - Integer value representing the minimum duration for which the unique row identifier is valid. This can be one of the following values:

                      0  SQL_SCOPE_CURROW       Row identifier is valid only while the cursor is positioned on the row.
                      1  SQL_SCOPE_TRANSACTION  Row identifier is valid for the duration of the transaction.
                      2  SQL_SCOPE_SESSION      Row identifier is valid for the duration of the connection.

RETURN VALUES
       Returns a statement resource with a result set containing rows with unique row identifier information for a table. The rows are composed of the following columns:

       SCOPE           One of the $scope values listed above (0, 1, or 2).
       COLUMN_NAME     Name of the unique column.
       DATA_TYPE       SQL data type for the column.
       TYPE_NAME       Character string representation of the SQL data type for the column.
       COLUMN_SIZE     An integer value representing the size of the column.
       BUFFER_LENGTH   Maximum number of bytes necessary to store data from this column.
       DECIMAL_DIGITS  The scale of the column, or NULL where scale is not applicable.
       NUM_PREC_RADIX  An integer value of either 10 (representing an exact numeric data type), 2 (representing an approximate numeric data type), or NULL (representing a data type for which radix is not applicable).
       PSEUDO_COLUMN   Always returns 1.

SEE ALSO
       db2_column_privileges(3), db2_columns(3), db2_foreign_keys(3), db2_primary_keys(3), db2_procedure_columns(3), db2_procedures(3), db2_statistics(3), db2_table_privileges(3), db2_tables(3).

PHP Documentation Group                                                  DB2_SPECIAL_COLUMNS(3)
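
A hedged PHP usage sketch for the function documented above; the connection credentials, schema, and table name are placeholders:

    <?php
    // connect and list the unique row identifier columns for one table;
    // scope 1 = SQL_SCOPE_TRANSACTION per the table above
    $conn = db2_connect('SAMPLE', 'db2inst1', 'password');
    if ($conn) {
        $res = db2_special_columns($conn, NULL, 'DB2INST1', 'EMPLOYEE', 1);
        while ($row = db2_fetch_assoc($res)) {
            echo $row['COLUMN_NAME'], ' (', $row['TYPE_NAME'], ")\n";
        }
        db2_close($conn);
    }
    ?>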