Shell Programming and Scripting: checking duplicate entry in file
Post 302551937 by sulti on 09-01-2011 at 06:05 AM
Code:
# sort so identical lines become adjacent, count each run, then order by count
sort inputfile | uniq -c | sort -n -k1,1
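If only the duplicated lines matter, uniq -d reports each repeated line once; a minimal sketch, assuming the same inputfile:

Code:
# print each line that occurs more than once (once per duplicate)
sort inputfile | uniq -d

# or flag the repeats in their original order with awk
awk 'seen[$0]++' inputfile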

 

10 More Discussions You Might Find Interesting

1. HP-UX

Hazardous Duplicate Cron Entry?

Hi All, how can I prevent starting processes that have duplicate entries in the cron file? I have written a shell script that validates with a "ps | grep" command before starting the process, but when the same process is started at the same time, it may not detect the existing process. Sample... (3 Replies)
Discussion started by: nag_sundaram
3 Replies
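A ps | grep check is racy: two cron firings can both pass the test before either process appears in the process table. An atomic lock avoids this; here is a minimal sketch using mkdir (portable, including to HP-UX), where /tmp/myjob.lock and /path/to/myjob.sh are hypothetical names:

Code:
#!/bin/sh
# mkdir either creates the lock directory or fails, atomically,
# so two simultaneous cron starts cannot both proceed
LOCKDIR=/tmp/myjob.lock
if ! mkdir "$LOCKDIR" 2>/dev/null; then
    echo "myjob is already running, exiting" >&2
    exit 1
fi
trap 'rmdir "$LOCKDIR"' EXIT    # release the lock on any exit
/path/to/myjob.sh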

2. UNIX for Dummies Questions & Answers

Remove duplicate entry in one line

Can anyone help me print only the unique entries in a line? Input: MI_AP MI_AP MI_CM MI_MF RC_NAP MBS_AP SF_RAN MBS_AP NT_CAR should output each entry only once per line: MI_AP MI_CM MI_MF RC_NAP MBS_AP SF_RAN NT_CAR. I can't find the same situation on the knowledge... (5 Replies)
Discussion started by: kharen11
5 Replies
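One way to drop repeated words within a line while keeping the first occurrence of each, in order, is a per-line awk array; a sketch (split("", seen) clears the array portably):

Code:
# print each whitespace-separated field only once per line, in order
awk '{ split("", seen); sep = ""
       for (i = 1; i <= NF; i++)
           if (!seen[$i]++) { printf "%s%s", sep, $i; sep = " " }
       print "" }' file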

3. Shell Programming and Scripting

Print Only second Duplicate entry in the file

I have a file that contains 2 columns, and values in the first column are repeated more than once. I want to take the unique record in the first column and the corresponding second-column value. Below is an example of the file: 8244100320012955|000b063471a4... (4 Replies)
Discussion started by: ravi_rn
4 Replies
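For a '|'-delimited file keyed on column 1, awk can keep either the first record per key or, as the title suggests, only the second; a sketch:

Code:
# keep only the first record for each distinct value in column 1
awk -F'|' '!seen[$1]++' file

# or print only the second occurrence of each key
awk -F'|' 'seen[$1]++ == 1' file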

4. Shell Programming and Scripting

Need to delete duplicate lease entry

Hi *, I need to delete duplicate lease entries in a file according to MAC/IP. I have a tempfile that contains many lease records and need to keep exactly one entry for each IP (not more than that); if it contains more than one entry for the same set, the extra entries need to be deleted... EX: lease... (4 Replies)
Discussion started by: SMNK
4 Replies
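If each lease is a blank-line-separated block that begins with "lease <IP> {" (an assumption about the file layout), awk's paragraph mode can keep just the last block seen per IP; a sketch:

Code:
# RS= makes each blank-line-separated block one record; $2 is the IP
# on the "lease <IP> {" line, and the last block per IP wins
# (note: output order is not guaranteed)
awk -v RS= -v ORS="\n\n" '{ last[$2] = $0 }
     END { for (ip in last) print last[ip] }' tempfile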

5. UNIX for Dummies Questions & Answers

Nested for loops for checking duplicate files

I am very new to bash scripting and this is my first script. I am trying to write a script that takes an argument d as the directory. It looks through the files to find duplicates and deletes them. Here's some sort-of pseudocode, but I am unsure how to implement it: #! /bin/bash #get... (1 Reply)
Discussion started by: shubham92
1 Replies
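Rather than nested loops comparing every pair of files, hashing each file and flagging repeated checksums needs only one pass; a minimal sketch (bash 4+ for associative arrays) that reports rather than deletes, so results can be verified first:

Code:
#!/bin/bash
# report files in directory $1 whose content hash was already seen
d=$1
declare -A seen
for f in "$d"/*; do
    [ -f "$f" ] || continue
    sum=$(md5sum < "$f")
    sum=${sum%% *}              # keep just the hash field
    if [ -n "${seen[$sum]}" ]; then
        echo "duplicate: $f (same content as ${seen[$sum]})"
    else
        seen[$sum]=$f
    fi
done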

6. Shell Programming and Scripting

Checking for duplicate code

I have a short line of code that performs a very rudimentary check for duplicate code: sort myfile.cpp | uniq -c | grep -v "^.*1 " | grep -v "}" It sorts the file, counts occurrences of each line, removes single occurrences and removes the ubiquitous closing brace. The language is C++, but is easily... (3 Replies)
Discussion started by: figaro
3 Replies
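Note that grep -v "^.*1 " also discards counts such as 11 or 21; awk states the intent directly. A sketch of the same check:

Code:
# count identical lines, keep those seen more than once, skip bare braces
sort myfile.cpp | uniq -c | awk '$1 > 1 && $2 != "}"'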

7. Shell Programming and Scripting

Remove duplicate in a row after checking the first similar name

Hi all, I have a big file like this in rows and columns. From the 2nd column onwards, each column is followed by a description of the previous one: the 3rd column is a description of the 2nd, and the 5th column is a description of the 4th. All columns are separated by commas ... (1 Reply)
Discussion started by: manigrover
1 Replies
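This is the comma-delimited variant of the field-dedup sketch shown earlier; changing the field separator is the only difference:

Code:
# drop repeated fields within each comma-separated row, keeping the first
awk -F',' '{ split("", seen); sep = ""
             for (i = 1; i <= NF; i++)
                 if (!seen[$i]++) { printf "%s%s", sep, $i; sep = "," }
             print "" }' file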

8. Shell Programming and Scripting

Checking crontab job entry in 3 different hosts

Hi Gurus, I am trying to connect to a remote host from the current host to check crontab entries. I have started like this: ssh -n -l db2psp 205.191.156.17 ". ~/.profile >/dev/null 2>/dev/null; cd log ;ls | wc -l" I got this error: ssh: connect to host 205.191.156.17 port 22:... (1 Reply)
Discussion started by: rocking77
1 Replies
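Once ssh connectivity works (the port 22 error suggests sshd is unreachable on that host), a simple loop can collect the crontab from each machine; a sketch, where host1 host2 host3 are hypothetical names and db2psp is the user from the post:

Code:
# list crontab entries for user db2psp on each remote host
for h in host1 host2 host3; do
    echo "== $h =="
    ssh -n db2psp@"$h" 'crontab -l'
done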

9. Shell Programming and Scripting

Deleting duplicate glosses in a dictionary entry

I am working on an Urdu to Hindi dictionary and I have created the following file structure: Headword=Gloss1,Gloss2,Gloss3 i.e. glosses delimited by a comma. It so happens that in some cases (around 6,000+ in a file of over 200,000+ entries) the glosses are duplicated. Since this may be a... (3 Replies)
Discussion started by: gimley
3 Replies
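Splitting each line on '=' and then the gloss list on ',' lets awk drop repeated glosses while preserving their order; a sketch, with dict.txt standing in for the dictionary file:

Code:
# print Headword= followed by its glosses with duplicates removed, in order
awk -F'=' '{ n = split($2, g, ","); split("", seen); sep = ""
             printf "%s=", $1
             for (i = 1; i <= n; i++)
                 if (!seen[g[i]]++) { printf "%s%s", sep, g[i]; sep = "," }
             print "" }' dict.txt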

10. UNIX for Beginners Questions & Answers

Iterate through a list - checking for a duplicate then report it out

I have a job that produces a file of barcodes, and the file gets added to every time the job runs. I want to check the list to see if a barcode is already in it and report it out if it is. (3 Replies)
Discussion started by: worky
3 Replies
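Two simple checks, assuming one barcode per line in a hypothetical barcodes.txt: report what is already duplicated, or test a new value before appending it:

Code:
# report each barcode that appears more than once in the list
sort barcodes.txt | uniq -d

# or, before appending $barcode, flag an exact whole-line match
grep -qxF "$barcode" barcodes.txt && echo "duplicate barcode: $barcode" >&2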