"Duplicate lines in a file"

Post #302661401 by itkamaraj on Monday 25th of June 2012 08:50:24 AM

Full Discussion: Duplicate lines in a file
In bash:

Code:
for i in {1..5}; do cat filename.txt; done
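
The loop above writes filename.txt to standard output five times; brace expansion like {1..5} is a bash feature, not plain POSIX sh. To keep the repeated lines, redirect the loop's output into a new file (the output name here is just a placeholder):

Code:
for i in {1..5}; do cat filename.txt; done > repeated.txt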

 
Test Your Knowledge in Computers #906
Difficulty: Easy
A Unix shell repeatedly prints a prompt, waits for a command line on stdin, and then carries out some action, as directed by the contents of the command line.
True or False?

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

Duplicate lines in the file

Hi, I have a file with duplicate lines in it. I want to keep only the duplicate lines and delete the non-duplicates. Can someone please help me? Regards Narayana Gupta (3 Replies)
Discussion started by: guptan
3 Replies
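
A minimal sketch for the thread above, assuming the order of lines does not matter and one copy of each duplicated line is enough (the file name is a placeholder):

Code:
sort file.txt | uniq -d

Here uniq -d prints only the lines that occur more than once in the sorted input.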

2. UNIX for Dummies Questions & Answers

removing duplicate lines from a file

Hi, I am trying to remove duplicate lines from a file. For example, the contents of example.txt are: this is a test 2342 this is a test 34343 this is a test 43434 and I want to remove the "this is a test" lines only and end up with the numbers in the file, that is, end up with: 2342... (4 Replies)
Discussion started by: ocelot
4 Replies
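
A sketch for the thread above, assuming the "this is a test" text and the numbers sit on separate lines as in the example:

Code:
grep -v 'this is a test' example.txt

grep -v keeps only the lines that do not contain the pattern, leaving the numeric lines.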

3. UNIX for Dummies Questions & Answers

How to redirect duplicate lines from a file????

Hi, I have a file that contains many duplicate lines. I want to redirect these duplicate lines into another file. Suppose I have a file called file_dup.txt which contains some lines such as file_dup.txt A100-R1 ACCOUNTING-CONTROL ACTONA-ACTASTOR ADMIN-AUTH-STATS ACTONA-ACTASTOR... (3 Replies)
Discussion started by: zing_foru
3 Replies
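
One way to approach the thread above, assuming sorting the input is acceptable; the output file name is a placeholder:

Code:
sort file_dup.txt | uniq -d > duplicates.txt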

4. UNIX for Dummies Questions & Answers

Remove Duplicate lines from File

I have a log file "logreport" that contains several lines as seen below: 04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
18 Replies
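
A sketch for the thread above, assuming the timestamp is the first whitespace-separated field and the first occurrence of each message should be kept:

Code:
awk '{key = $0; sub(/^[^ ]+ +/, "", key)} !seen[key]++' logreport

The key is the line with its leading timestamp stripped, so lines that differ only in the time are treated as duplicates.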

5. Shell Programming and Scripting

Duplicate lines in a file

Hi All, I am trying to remove the duplicate entries in a file and print them just once. For example, if my input file has: 00:44,37,67,56,15,12 00:44,34,67,56,15,12 00:44,58,67,56,15,12 00:44,35,67,56,15,12 00:59,37,67,56,15,12 00:59,34,67,56,15,12 00:59,35,67,56,15,12... (7 Replies)
Discussion started by: faiz1985
7 Replies
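
A common order-preserving one-liner for the thread above, assuming exact duplicate lines should be printed only on their first occurrence (the file name is a placeholder):

Code:
awk '!seen[$0]++' input.txt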

6. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a very huge file (4GB) which has duplicate lines. I want to delete the duplicate lines, leaving only the unique lines. Sort, uniq, and awk '!x++' are not working as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
16 Replies
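
For the thread above, GNU sort spills to temporary files on disk, so it can deduplicate files larger than memory; -u drops duplicates, while -S and -T (GNU extensions) set the in-memory buffer size and the temporary directory. The names and sizes below are placeholders, and note that the output comes back sorted rather than in the original order:

Code:
sort -u -S 512M -T /path/to/bigtmp hugefile.txt > unique.txt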

7. Shell Programming and Scripting

How do I remove the duplicate lines in this file?

Hey guys, I need some help to fix this script. I am trying to remove all the duplicate lines in this file. I wrote the following script, but it does not work. What is the problem? The output file should only contain five lines: Later! (5 Replies)
Discussion started by: Ernst
5 Replies
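
A sketch for the thread above, assuming the final order of the lines does not matter (otherwise the awk '!seen[$0]++' idiom shown earlier preserves order); file names are placeholders:

Code:
sort -u input.txt > output.txt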

8. UNIX for Advanced & Expert Users

Inserting duplicate lines in a file

Hi, I copied the contents of a binary file into a .text file using the hd (hexdump) command. The data in the binary file is such that in many places I get output like the following: 00000250 00 00 00 00 3f 2d 91 68 3f 69 fb e7 00 00 00 00 |....?-.h?i......| 00000260 00 00 00 00 00 00 00 00 00 00 00 00 00... (2 Replies)
Discussion started by: KidD312
2 Replies
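
If the goal in the thread above is to keep the repeated lines that hd normally collapses into a single *, the -v option of hd/hexdump prints every output line instead of compressing identical runs (file names are placeholders, and this assumes the installed hd supports -v):

Code:
hd -v binaryfile > dump.txt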

9. Shell Programming and Scripting

bash keep only duplicate lines in file

Hello all, in my bash script I have a file and I only want to keep the lines that appear twice in the file. Is there a way to do this? Thanks in advance! (4 Replies)
Discussion started by: vlm
4 Replies
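
A two-pass awk sketch for the thread above, assuming "appear twice" means exactly two occurrences and that one copy of each such line is wanted (change == 2 to > 1 to catch any repeated line; the file name is a placeholder):

Code:
awk 'NR==FNR {count[$0]++; next} count[$0] == 2 && !printed[$0]++' file.txt file.txt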

10. Shell Programming and Scripting

Remove duplicate lines from a file

Hi, I have a csv file which contains some millions of lines. The first line (header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onward (it should not remove the first line). I don't want to use any pattern from the header as I have some... (7 Replies)
Discussion started by: sudhakar T
7 Replies
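
A sketch for the thread above that does not rely on any pattern inside the header: remember the first line and drop any later line identical to it (file names are placeholders):

Code:
awk 'NR == 1 {header = $0; print; next} $0 != header' big.csv > clean.csv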