Full Discussion: Remove duplicate email
Post 303041521 by vgersh99 on Tuesday 26th of November 2019, 03:59:48 PM
how about (for starters):
Code:
awk -v str='x*.com' '$1~str && !a[$1]++' myFile
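A note on how this works: $1~str matches the first whitespace-separated field against the regex held in str, and !a[$1]++ is true only the first time a given value of $1 is seen, so each matching address prints once. One caveat: as a regex, 'x*.com' means "zero or more x characters, then any single character, then com"; if a literal dot before com is intended, it should be escaped (e.g. '\.com'). A hypothetical run, with made-up addresses:
Code:
# sample data, made up for illustration; myFile is the name used above
printf '%s\n' 'xyz@abc.com first' 'xyz@abc.com second' 'abc@test.org third' > myFile
awk -v str='x*.com' '$1~str && !a[$1]++' myFile
# -> xyz@abc.com first   (the repeat is suppressed; abc@test.org does not match)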


10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

remove duplicate

I have a text file containing many records, but written in one line. I want to remove the duplicate records from that line; note the records have a fixed width, ex: width = 4. Input file test.txt = abc cdf abc abc cdf fgh fgh abc abc. I want the output file = abc cdf fgh, only those records. Can anyone help... (4 Replies)
Discussion started by: kazanoova2
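A minimal sketch for this kind of task, assuming the records are whitespace-separated words on a single line in test.txt (file name from the post):
Code:
# print each word only the first time it is seen, preserving order
awk '{ out = ""
       for (i = 1; i <= NF; i++)
           if (!seen[$i]++) out = out (out ? " " : "") $i
       print out }' test.txt
# "abc cdf abc abc cdf fgh fgh abc abc"  ->  "abc cdf fgh"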

2. Shell Programming and Scripting

Remove duplicate ???

Hi all, I have an out.log file: CARR|02/26/2006 10:58:30.107|CDxAcct=1405157051 CARR|02/26/2006 11:11:30.107|CDxAcct=1405157051 CARR|02/26/2006 11:18:30.107|CDxAcct=7659579782 CARR|02/26/2006 11:28:30.107|CDxAcct=9534922327 CARR|02/26/2006 11:38:30.107|CDxAcct=9534922327 CARR|02/26/2006... (3 Replies)
Discussion started by: sabercats
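A sketch, under the assumption that the account is the last pipe-separated field and the first line per account should be kept:
Code:
# keep the first line for each CDxAcct value
awk -F'|' '!seen[$NF]++' out.log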

3. Shell Programming and Scripting

Remove duplicate

Hi all, I have a text file fileA.txt: DXRV|02/28/2006 11:36:49.049|SAC||||CDxAcct=2420991350 DXRV|02/28/2006 11:37:06.404|SAC||||CDxAcct=6070970034 DXRV|02/28/2006 11:37:25.740|SAC||||CDxAcct=2420991350 DXRV|02/28/2006 11:38:32.633|SAC||||CDxAcct=6070970034 DXRV|02/28/2006... (2 Replies)
Discussion started by: sabercats
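Here the CDxAcct value sits in the seventh pipe-separated field, so sort can deduplicate on that key (note: unlike the awk idiom above, this does not preserve the original line order):
Code:
sort -t'|' -k7,7 -u fileA.txt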

4. Shell Programming and Scripting

Remove duplicate text

Hello, I have a log file generated by a script, which looks like this: userid: 7 starttime: Sat May 24 23:24:13 CEST 2008 endtime: Sat May 24 23:26:57 CEST 2008 total time spent: 2.73072 minutes / 163.843 seconds date: Sat Jun 7 16:09:03 CEST 2008 userid: 8 starttime: Sun May... (7 Replies)
Discussion started by: dejavu88
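A sketch assuming the log consists of blank-line-separated blocks and exact duplicate blocks should go; awk's paragraph mode (RS="") treats each block as one record:
Code:
# print each userid/starttime/endtime block only once
# "logfile" is a placeholder; the post does not name the file
awk 'BEGIN { RS = ""; ORS = "\n\n" } !seen[$0]++' logfile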

5. UNIX for Dummies Questions & Answers

Remove duplicate in array

Hi, I have a list of numbers stored in an array as below. 5 7 10 30 30 40 50 Please advise how I could remove the duplicate values in the array? Thanks in advance. (5 Replies)
Discussion started by: Rock
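A bash sketch (requires bash 4+ for associative arrays); the values are taken from the post:
Code:
arr=(5 7 10 30 30 40 50)
declare -A seen
uniq=()
for v in "${arr[@]}"; do
    # append only values not seen before
    [[ -z ${seen[$v]} ]] && { uniq+=("$v"); seen[$v]=1; }
done
echo "${uniq[@]}"    # -> 5 7 10 30 40 50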

6. Shell Programming and Scripting

remove duplicate

Hi, I am trying to use shell or perl to remove duplicate characters. For example, if I have "I love google" it will become "I love ggle", or even "I loveggle" if also removing duplicate white space. Thanks, CC (6 Replies)
Discussion started by: ccp
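One reading that reproduces the example output is deleting adjacent duplicate pairs; a sed sketch follows (the exact requirement in the post is ambiguous, so treat this as a starting point):
Code:
echo 'I love google' | sed 's/\(.\)\1//g'   # -> I love ggle
# to merely squeeze runs of repeated characters instead (google -> gogle):
echo 'I love google' | tr -s 'a-z'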

7. Shell Programming and Scripting

How to remove duplicate ID's?

Hi, I have a file containing 1000s of duplicate IDs with upper- and lowercase first characters, as below. i/p: a411532 A411532 a508661 A508661 c411532 C411532 Requirement: I need to ignore the lowercase IDs and keep only the IDs below. o/p: A411532 A508661 C411532 (9 Replies)
Discussion started by: buzzme
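Assuming one ID per line (the sample above was flattened in the preview), a sketch that keeps only IDs starting with an uppercase letter, each once:
Code:
# ids.txt is a placeholder file name
awk '/^[A-Z]/ && !seen[$0]++' ids.txt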

8. Shell Programming and Scripting

Remove duplicate

Hi, I have a pipe-separated file repo.psv where I need to remove duplicates based on the 1st column only. Can anyone help with a Unix script? Input: 15277105||Common Stick|ESHR||Common Stock|CYRO AB 15277105||Common Stick|ESHR||Common Stock|CYRO AB 16111278||Common Stick|ESHR||Common... (12 Replies)
Discussion started by: samrat dutta
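The standard awk idiom, keyed on the first pipe-separated field of repo.psv (file name from the post); the first line per key wins:
Code:
awk -F'|' '!seen[$1]++' repo.psv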

9. UNIX for Dummies Questions & Answers

Remove duplicate

Hi, how can I replace || with a space and then remove duplicates from the following text? T111||T222||T444||T222||T555 Thanks in advance. (10 Replies)
Discussion started by: tinku981
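A sketch: squeeze the runs of | into newlines, drop duplicates, and rejoin with spaces:
Code:
echo 'T111||T222||T444||T222||T555' | tr -s '|' '\n' | awk '!seen[$0]++' | paste -sd' ' -
# -> T111 T222 T444 T555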

10. Shell Programming and Scripting

Remove duplicate records

Hi, I am working on a script that would remove duplicate records or lines in a flat file. The only difference between the lines is the "NOT NULL" keyword. Please see the example input file below. INPUT FILE:> CREATE a ( TRIAL_CLIENT NOT NULL VARCHAR2(60), TRIAL_FUND NOT NULL... (3 Replies)
Discussion started by: reignangel2003
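The post is truncated, but one reading is that lines count as duplicates when they differ only by the NOT NULL keyword; a hedged awk sketch that keeps the first variant of each line:
Code:
# ddl.sql is a placeholder; normalize each line by stripping "NOT NULL",
# then keep only the first line per normalized key
awk '{ key = $0; gsub(/ *NOT NULL */, " ", key) } !seen[key]++' ddl.sql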