Hi, I have a file that is a list of people and their credentials, which I receive frequently. The issue is that when I cat this list, duplicate entries exist and are NOT CONSECUTIVE (i.e. uniq may not work here).
I'm trying to write a script that will remove duplicate entries.
The script can... (5 Replies)
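A minimal sketch of one common approach, assuming the list is a plain text file (people.txt is a placeholder name): awk keeps the first occurrence of every line and drops later repeats, with no need for the duplicates to be consecutive or for the file to be sorted.
awk '!seen[$0]++' people.txt > people.dedup.txt
The seen array records each full line on first sight, so later identical lines fail the test and are skipped; the trade-off is that every distinct line is held in memory.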
Hi,
I have a text file with 2000 rows and 2000 columns (the number of columns may vary from row to row), and comma is the delimiter.
In every row there may be a few duplicates, and we need to remove those duplicates and "shift left" the subsequent values.
ex:
111 222 111 555
444 999 666... (6 Replies)
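One possible sketch for the per-row case, assuming the delimiter really is a comma and a POSIX awk (input.csv is a placeholder name): print each field only the first time it appears in its row, so the remaining values shift left on their own.
awk -F, '{ split("", seen); out = ""
           for (i = 1; i <= NF; i++)
               if (!seen[$i]++) out = out (out == "" ? "" : ",") $i
           print out }' input.csv
split("", seen) empties the per-row array portably; GNU awk also accepts delete seen.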
I have an input file like the one below.
I00789524 0213 5212
D00789524 0213 5212
I00778787 2154 5412
The first two records are the same (duplicates) except for the I and D in the first character. I want only the non-duplicates (i.e. the 3rd line) in the output. How can we get this? Can you help? Is there any single AWK or SED... (3 Replies)
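One possible single-awk sketch (input.txt is a placeholder name): key each record on everything after the first character, count the keys, and print only the records whose key occurs exactly once.
awk '{ key = substr($0, 2); count[key]++; line[key] = $0 }
     END { for (k in count) if (count[k] == 1) print line[k] }' input.txt
The END loop does not guarantee output order; pipe the result through sort, or track line numbers, if order matters.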
Hi,
I have this large file, and sometimes there are duplicates; I want to find them and figure out how many there are.
So I have a file with multiple columns, and the last column (9) has the duplicates.
e.g.
yan
tar
tar
man
ban
tan
tub
tub
tub
Basically what I want to... (6 Replies)
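A minimal sketch, assuming whitespace-separated columns and a file called data.txt (a placeholder): pull out column 9, then let uniq -c do the counting.
awk '{ print $9 }' data.txt | sort | uniq -c | awk '$1 > 1'
uniq -c prefixes every distinct value with its occurrence count, and the final filter keeps only the values that appear more than once; use $NF instead of $9 if the column position varies.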
Hi,
How do I eliminate duplicate values in Unix? I have an Excel file which contains duplicate values.
I need to use this in a script.
Thanks in advance. (3 Replies)
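One common sketch, assuming the spreadsheet is first saved as CSV (data.csv is a placeholder name): sort -u keeps exactly one copy of each distinct line.
sort -u data.csv > deduped.csv
If the original row order must be preserved, awk '!seen[$0]++' data.csv > deduped.csv does the same without sorting.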
Hi All
In Unix we have a file, and we have to remove the duplicates based on one specific column.
Can anybody tell me the command?
ex:
file1
id,name
1,ww
2,qwq
2,asas
3,asa
4,asas
4,asas
o/p:
1,ww
2,qwq
3,asa (7 Replies)
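One possible sketch, keyed on the first comma-separated column and keeping the first occurrence of each id:
awk -F, '!seen[$1]++' file1
This yields 1,ww / 2,qwq / 3,asa plus a single 4,asas; the posted o/p drops id 4 entirely, so if ids whose rows are exact duplicates should be removed outright, the rule needs adjusting.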
I want to remove all the duplicates in a file. I don't want to keep even a single entry.
For the input data:
12345|12|34
12345|13|23
3456|12|90
15670|12|13
12345|10|14
3456|12|13
I need the below data in one file
15670|12|13
and the below data in another file (9 Replies)
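A two-pass awk sketch, assuming the first pipe-separated field is the key and that unique.txt and dups.txt are placeholder output names: the first pass counts each key, and the second pass routes every line by its final count.
awk -F'|' 'NR==FNR { c[$1]++; next }
           { print > (c[$1] == 1 ? "unique.txt" : "dups.txt") }' file file
Passing the same file twice (the NR==FNR idiom) is what lets the second pass see the completed counts.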
Hello,
I'm moving some disks from the rootvg on AIX 5.3.
# replacepv hdiskOLD hdiskNEW
I have, for example, hdisk12 and hdisk13 with the hd5 (boot) LV, and I want to move hdisk13. So first I exclude it from the bootlist:
# bootlist -om normal hdisk12
then
# replacepv hdisk13... (7 Replies)
Hi guys,
I am trying to identify the number of duplicate entries in a string entered by the user. Here is the command I use:
$ user_input="M T T"
$ echo "${user_input}" | awk '{ for (i = 1; i <= NF; i++) print $i }' | sort | uniq -d
The above works fine for strings with multiple letters. The problem is... (2 Replies)
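A shorter sketch of the same idea: let the shell split the unquoted variable into words, print one per line, and have uniq both count and filter the duplicates.
user_input="M T T"
printf '%s\n' $user_input | sort | uniq -cd
Leaving $user_input unquoted is deliberate here, so each word becomes a separate printf argument; uniq -cd then prints only the duplicated tokens, each with its occurrence count.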
Mac OS 10.9
Let me preface this by saying this is not for marketing or spamming purposes.
I have a script that scans all the email messages in a directory (~/Library/Mail/Mailboxes) and outputs a single-column list of email addresses. This will run multiple times a day and append the output... (3 Replies)
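Since the output is appended on every run, one possible sketch is to deduplicate the accumulated list in place (addresses.txt is a placeholder name):
sort -u -o addresses.txt addresses.txt
sort reads all of its input before -o opens the output, so naming the same file as both input and output is safe here.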
LEARN ABOUT PHP
collator_sort
COLLATOR_SORT(3)                                              COLLATOR_SORT(3)
Collator::sort - Sort array using specified collator
SYNOPSIS
Object oriented style
public bool Collator::sort (array &$arr, [int $sort_flag])
Procedural style
bool collator_sort (Collator $coll, array &$arr, [int $sort_flag])
DESCRIPTION
This function sorts an array according to current locale rules.
Equivalent to standard PHP sort(3).
PARAMETERS
o $coll
- Collator object.
o $arr
- Array of strings to sort.
o $sort_flag
- Optional sorting type, one of the following:
o Collator::SORT_REGULAR - compare items normally (don't change types)
o Collator::SORT_NUMERIC - compare items numerically
o Collator::SORT_STRING - compare items as strings
Default sorting type is Collator::SORT_REGULAR. It is also used if an invalid $sort_flag value has been specified.
RETURN VALUES
Returns TRUE on success or FALSE on failure.
EXAMPLES
Example #1
collator_sort(3) example
<?php
$coll = collator_create( 'en_US' );
$arr = array( 'at', 'as', 'as' );
var_export( $arr );
collator_sort( $coll, $arr );
var_export( $arr );
?>
The above example will output:
array (
0 => 'at',
1 => 'as',
2 => 'as',
)array (
0 => 'as',
1 => 'as',
2 => 'at',
)
SEE ALSO
Collator constants, collator_asort(3), collator_sort_with_sort_keys(3).
PHP Documentation Group COLLATOR_SORT(3)