Full Discussion: Duplicate lines in a file
Post 302661377 by nsuresh316 in UNIX for Dummies Questions & Answers, 06-25-2012, 08:24 AM
Duplicate lines in a file

I have a file with the following data:

A
B
C

I would like to print it like this n times (for example, 5 times):
A
B
C
A
B
C
A
B
C
A
B
C
A
B
C
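One way to do this from the shell is a simple loop that concatenates the file onto itself; the names file.txt and out.txt below are placeholders for the actual input and output files:

    # print the contents of file.txt 5 times into out.txt
    n=5
    i=0
    while [ "$i" -lt "$n" ]; do
        cat file.txt
        i=$((i + 1))
    done > out.txt

The same effect can be had with a single awk invocation that buffers the file and replays it n times, but for small files the plain loop above is the simplest to read.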
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove Duplicate Lines in File

I am writing a KSH script to remove duplicate lines in a file. Let's say the file has the format below. FileA 1253-6856 3101-4011 1827-1356 1822-1157 1822-1157 1000-1410 1000-1410 1822-1231 1822-1231 3101-4011 1822-1157 1822-1231 and I want to simplify it with no duplicate lines as file... (5 Replies)
Discussion started by: Teh Tiack Ein
5 Replies
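For a question like this one, two commonly suggested approaches (not necessarily the one the thread settled on) are sort -u, or an awk one-liner that keeps only the first occurrence of each line while preserving the original order; FileA is the file name from the example and FileA.nodup is a placeholder:

    # keep the first occurrence of every line, preserving the original order
    awk '!seen[$0]++' FileA > FileA.nodup

    # or, if the output order does not matter
    sort -u FileA > FileA.nodup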

2. UNIX for Advanced & Expert Users

Duplicate lines in the file

Hi, I have a file with duplicate lines in it. I want to keep only the duplicate lines and delete the non-duplicates. Can someone please help me? Regards Narayana Gupta (3 Replies)
Discussion started by: guptan
3 Replies
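A typical answer to this kind of question is uniq -d on a sorted copy of the file, which prints only the lines that occur more than once; the file name below is a placeholder:

    # print one copy of each line that appears more than once
    sort file | uniq -d

    # with GNU uniq, -D prints every occurrence of the duplicated lines
    sort file | uniq -D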

3. UNIX for Dummies Questions & Answers

How to redirect duplicate lines from a file????

Hi, I have a file which contains many duplicate lines. I want to redirect these duplicate lines into another file. Suppose I have a file called file_dup.txt which contains some lines, as in file_dup.txt A100-R1 ACCOUNTING-CONTROL ACTONA-ACTASTOR ADMIN-AUTH-STATS ACTONA-ACTASTOR... (3 Replies)
Discussion started by: zing_foru
3 Replies

4. UNIX for Dummies Questions & Answers

Remove Duplicate lines from File

I have a log file "logreport" that contains several lines as seen below: 04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
18 Replies

5. Shell Programming and Scripting

Duplicate lines in a file

Hi All, I am trying to remove the duplicate entries in a file and print them just once. For example, if my input file has: 00:44,37,67,56,15,12 00:44,34,67,56,15,12 00:44,58,67,56,15,12 00:44,35,67,56,15,12 00:59,37,67,56,15,12 00:59,34,67,56,15,12 00:59,35,67,56,15,12... (7 Replies)
Discussion started by: faiz1985
7 Replies

6. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a very huge file (4GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. sort, uniq and awk '!x++' are not working, as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
16 Replies
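When the file is too large for in-memory approaches like awk '!x++', an external sort that spills to disk is one possible route. The sketch below assumes GNU sort; the buffer size, temporary directory and file names are placeholders:

    # external merge sort: uses roughly 512 MB of RAM and writes
    # intermediate runs to /var/tmp instead of holding the 4 GB file in memory
    sort -u -S 512M -T /var/tmp hugefile > hugefile.uniq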

7. Shell Programming and Scripting

How do I remove the duplicate lines in this file?

Hey guys, I need some help to fix this script. I am trying to remove all the duplicate lines in this file. I wrote the following script, but it does not work. What is the problem? The output file should only contain five lines: Later! (5 Replies)
Discussion started by: Ernst
5 Replies

8. UNIX for Advanced & Expert Users

Inserting duplicate lines in a file

Hi, I copied the contents of a binary file into a .text file using the hd (hexdump) command. The data in the binary file is such that in many places I get output like the following: 00000250 00 00 00 00 3f 2d 91 68 3f 69 fb e7 00 00 00 00 |....?-.h?i......| 00000260 00 00 00 00 00 00 00 00 00 00 00 00 00... (2 Replies)
Discussion started by: KidD312
2 Replies

9. Shell Programming and Scripting

bash keep only duplicate lines in file

Hello all, in my bash script I have a file and I only want to keep the lines that appear twice in the file. Is there a way to do this? Thanks in advance! (4 Replies)
Discussion started by: vlm
4 Replies
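If "appear twice" means exactly two occurrences, one way is a two-pass awk over the same file: the first pass counts each line, the second prints only the lines whose count is exactly 2. The file name is a placeholder; if "twice" really means "more than once", sort file | uniq -d works as well:

    # pass 1 (NR==FNR) counts every line; pass 2 keeps the lines seen exactly twice
    # (each kept line is printed at both of its occurrences)
    awk 'NR == FNR { count[$0]++; next } count[$0] == 2' file file

    # same idea, but print each qualifying line only once
    awk 'NR == FNR { count[$0]++; next } count[$0] == 2 && !printed[$0]++' file file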

10. Shell Programming and Scripting

Remove duplicate lines from a file

Hi, I have a CSV file which contains some millions of lines. The first line (the header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onward (it should not remove the first line). I don't want to use any pattern from the header as I have some... (7 Replies)
Discussion started by: sudhakar T
7 Replies
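Since the repeated line is an exact copy of the first line, one pattern-free approach is to remember line 1 and drop any later line that matches it exactly; input.csv and output.csv below are placeholder names:

    # keep line 1, then print only the lines that differ from it
    awk 'NR == 1 { header = $0; print; next } $0 != header' input.csv > output.csv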
dappprof(1m)                         USER COMMANDS                        dappprof(1m)

NAME
       dappprof - profile user and library function usage. Uses DTrace.

SYNOPSIS
       dappprof [-acehoTU] [-u lib] { -p PID | command }

DESCRIPTION
       dappprof prints details on user and library call times for processes as a
       summary-style aggregation. By default the user functions are traced; options
       can be used to trace library activity. Output can include function counts,
       elapsed times and on-CPU times. The elapsed times are interesting, to help
       identify functions that take some time to complete (during which the process
       may have slept). CPU time helps identify calls that are consuming CPU cycles
       to run. Since this uses DTrace, only users with root privileges can run this
       command.

OPTIONS
       -a        print all data
       -c        print function counts
       -e        print elapsed times, ns
       -o        print CPU times, ns
       -T        print totals
       -p PID    examine this PID
       -u lib    trace this library instead
       -U        trace all library and user functions

EXAMPLES
       run and examine the "df -h" command,
              # dappprof df -h
       print elapsed times, on-cpu times and counts for "df -h",
              # dappprof -ceo df -h
       print elapsed times for PID 1871,
              # dappprof -p 1871
       print all data for PID 1871,
              # dappprof -ap 1871

FIELDS
       CALL      Function call name
       ELAPSED   Total elapsed time, nanoseconds
       CPU       Total on-cpu time, nanoseconds
       COUNT     Number of occurrences

DOCUMENTATION
       See the DTraceToolkit for further documentation under the Docs directory.
       The DTraceToolkit docs may include full worked examples with verbose
       descriptions explaining the output.

EXIT
       dappprof will sample until Ctrl-C is hit.

AUTHOR
       Brendan Gregg  [Sydney, Australia]

SEE ALSO
       dapptrace(1M), dtrace(1M), apptrace(1)

version 1.10                         May 14, 2005                         dappprof(1m)