Linux and UNIX Man Pages


lp-set-dup(1) [debian man page]

lp-set-dup(1)						      General Commands Manual						     lp-set-dup(1)

NAME
       lp-set-dup - mark one or more bugs as duplicates of another bug

SYNOPSIS
       lp-set-dup [-f] <main bug> <duplicate bug> [<duplicate bug> ...]
       lp-set-dup --help

DESCRIPTION
       lp-set-dup allows you to easily mark one or more bugs as duplicates of another bug. It checks for permission to operate on a
       given bug first, then performs the required tasks on Launchpad.

OPTIONS
       Listed below are the command line options for lp-set-dup:

       -h, --help
              Display a help message and exit.

       -f     Skip confirmation prompt.

       -l INSTANCE, --lpinstance=INSTANCE
              Use the specified instance of Launchpad (e.g. "staging"), instead of the default of "production".

       --no-conf
              Do not read any configuration files, or configuration from environment variables.

ENVIRONMENT
       All of the CONFIGURATION VARIABLES below are also supported as environment variables. Variables in the environment take
       precedence over those in configuration files.

CONFIGURATION VARIABLES
       The following variables can be set in the environment or in ubuntu-dev-tools(5) configuration files. In each case, the
       script-specific variable takes precedence over the package-wide variable.

       LP_SET_DUP_LPINSTANCE, UBUNTUTOOLS_LPINSTANCE
              The default value for --lpinstance.

SEE ALSO
       ubuntu-dev-tools(5)

AUTHORS
       lp-set-dup was written by Loic Minier <lool@dooz.org>, and this manual page was written by Luca Falavigna <dktrkranz@debian.org>.
       Both are released under the terms of the GNU General Public License, version 2.

lptools                                                      March 6 2010                                                  lp-set-dup(1)
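
A short usage sketch based on the SYNOPSIS and OPTIONS above; the bug numbers are invented for illustration:

    # Mark bugs 222222 and 333333 as duplicates of bug 111111, answering
    # the confirmation prompt interactively
    lp-set-dup 111111 222222 333333

    # The same, skipping the confirmation prompt and targeting the
    # "staging" instance of Launchpad instead of the default "production"
    lp-set-dup -f --lpinstance=staging 111111 222222 333333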

Check Out this Related Man Page

GRAB-ATTACHMENTS(1)					      General Commands Manual					       GRAB-ATTACHMENTS(1)

NAME
       grab-attachments - downloads attachments from a Launchpad bug

SYNOPSIS
       grab-attachments [options] bug-number...
       grab-attachments -h

DESCRIPTION
       grab-attachments is a script to download all attachments from a Launchpad bug report, or from bug reports with a source package
       task, into a directory named after the bug, e.g. bug-1.

OPTIONS
       Listed below are the command line options for grab-attachments:

       bug-number
              Specifies the Launchpad bug number that the script should download attachments from.

       -h, --help
              Display a help message and exit.

       -l INSTANCE, --lpinstance=INSTANCE
              Use the specified instance of Launchpad (e.g. "staging"), instead of the default of "production".

       --no-conf
              Do not read any configuration files, or configuration from environment variables.

       -d, --duplicates
              Download attachments from duplicates too.

       -p SRCPACKAGE, --package=SRCPACKAGE
              Download attachments from all bugs with a task for this source package.

AUTHOR
       lp-grab-attachments was written by Daniel Holbach and this manual page was written by Jonathan Patrick Davies. Both are released
       under the GNU General Public License, version 3.

lptools                                                    10 August 2008                                                GRAB-ATTACHMENTS(1)
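
A usage sketch following the SYNOPSIS above; the bug number and the source package name are placeholders:

    # Download every attachment on bug 1 into a directory named bug-1,
    # including attachments found on its duplicates
    grab-attachments -d 1

    # Download attachments from all bugs with a task for the
    # (hypothetical) source package "foo"
    grab-attachments -p foo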

15 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

removing line and duplicate line

Hi, I have 3 lines in a text file that is similar to this (as a result of a diff between 2 files): 35,36d34 < DATA.EVENT.EVENT_ID.s = "3661208" < DATA.EVENT.EVENT_ID.s = "3661208" I am trying to get it down to just this: DATA.EVENT.EVENT_ID.s = "3661208" How can I do this?... (11 Replies)
Discussion started by: ocelot
11 Replies
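
One way to approach the request above is to strip the diff markers and then drop repeated lines; a minimal sketch, assuming the diff output is stored in diff.txt:

    # Keep only the changed lines, strip the leading "< " or "> " marker,
    # then print each remaining line only the first time it is seen
    grep '^[<>]' diff.txt | sed 's/^[<>] //' | awk '!seen[$0]++'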

2. UNIX for Dummies Questions & Answers

How to redirect duplicate lines from a file????

Hi, I am having a file which contains many duplicate lines. I wanted to redirect these duplicate lines into another file. Suppose I have a file called file_dup.txt which contains some line as file_dup.txt A100-R1 ACCOUNTING-CONTROL ACTONA-ACTASTOR ADMIN-AUTH-STATS ACTONA-ACTASTOR... (3 Replies)
Discussion started by: zing_foru
3 Replies
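
For the question above, the lines that occur more than once can be written to a separate file; a sketch assuming the input is file_dup.txt and the duplicates should go to dups.txt:

    # uniq -d prints one copy of every line that appears more than once;
    # sorting first is needed because uniq only compares adjacent lines
    sort file_dup.txt | uniq -d > dups.txt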

3. Shell Programming and Scripting

Check host file for duplicate entries

I need a KSH script that will check a host file for duplicate IP's and/or host names and report out the errors. Anyone out there have one they would like to share? Something like: Hostname blahblah appears X times IP Address xxx.xxx.xxx.xxx appears X times TIA (4 Replies)
Discussion started by: ThreeDot
4 Replies
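
A rough sketch of the check asked for above, counting IP addresses and hostnames in /etc/hosts with awk and reporting anything seen more than once (the column layout and the comment handling are assumptions):

    #!/bin/ksh
    # Field 1 is taken as the IP address and fields 2..NF as hostnames on
    # every non-comment, non-empty line; report whatever occurs twice or more
    awk '!/^#/ && NF {
             ip[$1]++
             for (i = 2; i <= NF; i++) host[$i]++
         }
         END {
             for (h in host) if (host[h] > 1) print "Hostname " h " appears " host[h] " times"
             for (a in ip)   if (ip[a] > 1)   print "IP Address " a " appears " ip[a] " times"
         }' /etc/hosts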

4. Programming

Shell Implementation not working correctly

//save in/out int tmpin = dup(0); int tmpout = dup(1); //set initial input int fdin; if(_inputFile) { fdin = open(_inputFile, O_RDONLY | O_CREAT, S_IREAD | S_IWRITE); } else { //use default input fdin = dup(tmpin); } int ret; int fdout; for(int i = 0; i... (14 Replies)
Discussion started by: AirBronto
14 Replies

5. Shell Programming and Scripting

Finding Duplicate files

How do you find and delete duplicate files? (1 Reply)
Discussion started by: Jicom4
1 Reply

6. Shell Programming and Scripting

Find duplicate files

What utility do you recommend for simply finding all duplicate files among all files? (4 Replies)
Discussion started by: kiasas
4 Replies
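
For the two "find duplicate files" threads above, a common approach is to checksum every file and report the checksums that occur more than once; a sketch assuming GNU coreutils and the current directory as the search root:

    # Hash every regular file, sort by checksum, and print all lines whose
    # first 32 characters (the MD5 digest) are repeated
    find . -type f -exec md5sum {} + | sort | uniq -w32 -D

The listing only shows which files share content; deciding which copy to delete from each group is still a manual step.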

7. UNIX for Dummies Questions & Answers

Duplicates

Hi, How to eliminate the duplicate values in unix? I have a excel file which contains duplicate values. Need to use this in a script. Thanks in advance. (3 Replies)
Discussion started by: venkatesht
3 Replies

8. Shell Programming and Scripting

perl/shell need help to remove duplicate lines from files

Dear All, I have multiple files having number of records, consist of more than 10 columns some column values are duplicate and i want to remove these duplicate values from these files. Duplicate values may come in different files.... all files laying in single directory.. Need help to... (3 Replies)
Discussion started by: arvindng
3 Replies
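
For the multi-file case above, awk can remember every line it has already printed across all input files; a sketch assuming the files sit in one directory and end in .txt (both assumptions):

    # Print each line only the first time it is seen, across every input file
    awk '!seen[$0]++' *.txt > combined_unique.txt

If a duplicate is defined by particular columns rather than the whole line, the key inside seen[] can be restricted to those fields, e.g. seen[$1 FS $3].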

9. UNIX for Dummies Questions & Answers

A duplicate file script Question

Hello, my main goal is to find duplicate files, I dug up a simple code on the internet and my only question is will it work on large files and/or directories with numerous files. #!/bin/bash DIR="/pwd" for file1 in ${DIR}; do for file2 in ${DIR}; do if ; then ... (3 Replies)
Discussion started by: BionicMonk
3 Replies

10. Shell Programming and Scripting

Remove duplicate lines from a 50 MB file size

Hi, please help me to write a command to delete duplicate lines from a file. The size of the file is 50 MB. How can I remove duplicate lines from such a big file? (6 Replies)
Discussion started by: vsachan
6 Replies

11. Shell Programming and Scripting

Find duplicate based on 'n' fields and mark the duplicate as 'D'

Hi, In a file, I have to mark duplicate records as 'D' and the latest record alone as 'C'. In the below file, I have to identify if duplicate records are there or not based on Man_ID, Man_DT, Ship_ID and I have to mark the record with latest Ship_DT as "C" and other as "D" (I have to create... (7 Replies)
Discussion started by: machomaddy
7 Replies
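
A two-pass awk sketch of the marking described above; the field positions (1-3 as the key, 4 as Ship_DT), a lexically sortable date format such as YYYYMMDD, and the file name data.txt are all assumptions, and records tied on the latest date would all be marked "C":

    # Pass 1: remember the latest Ship_DT per (Man_ID, Man_DT, Ship_ID) key.
    # Pass 2: append "C" to the record holding that date and "D" to the rest.
    awk 'NR == FNR { key = $1 FS $2 FS $3
                     if ($4 > latest[key]) latest[key] = $4
                     next }
         { key = $1 FS $2 FS $3
           print $0, ($4 == latest[key] ? "C" : "D") }' data.txt data.txt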

12. Shell Programming and Scripting

Honey, I broke awk! (duplicate line removal in 30M line 3.7GB csv file)

I have a script that builds a database ~30 million lines, ~3.7 GB .csv file. After multiple optimizations it takes about 62 min to bring in and parse all the files and used to take 10 min to remove duplicates until I was requested to add another column. I am using the highly optimized awk code: awk... (34 Replies)
Discussion started by: Michael Stora
34 Replies
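
If the added column is what broke the duplicate detection, one option is to key the awk filter on the columns that actually define a duplicate instead of the whole line; a sketch with columns 1 and 2 as an assumed key and big.csv as a placeholder file name:

    # Deduplicate a CSV on selected columns so that extra columns do not
    # change which records count as duplicates
    awk -F, '!seen[$1 FS $2]++' big.csv > big_unique.csv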

13. Shell Programming and Scripting

Remove duplicate values with condition

Hi Gents, please can you help me to get the desired output. In the first column I have some duplicate records; the requirement is to reject the duplicates, keeping only the last occurrence. But there is a condition: if the last occurrence is equal to value 14 or 98 in column 3 and... (2 Replies)
Discussion started by: jiam912
2 Replies

14. UNIX and Linux Applications

Deja-dup makes my / full, so I cannot restore my backup

The problematic directory is the following: /root/.cache/deja-dup This directory grows until my "/" is full and then the restoring activity fails. I already tried to create a symbolic link with origin another partition where I have more space. However during the restoring activity ... (4 Replies)
Discussion started by: puertas12
4 Replies

15. UNIX for Beginners Questions & Answers

Remove duplicate email

cat path/to/dir/file.html | grep -i 'x*.com' > path/to/dir/file.txt Before xyz.com xyz.com After cat path/to/dir/file.html | grep -i 'x*.com' | sed '$!s/$/,/' | tr -d '\n'> path/to/dir/file.txt Result --> Preferred output: xyz.com, xyz.com The preferred is the exact output I... (9 Replies)
Discussion started by: lpoolfc
9 Replies
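
A sketch of one way to get the output asked for above: add a deduplication step to the pipeline from the post before the lines are joined with commas (sort -u is the only addition; the pattern and paths are taken from the post):

    # Keep one copy of each match, then join the survivors with commas
    grep -i 'x*.com' path/to/dir/file.html | sort -u | sed '$!s/$/,/' | tr -d '\n' \
        > path/to/dir/file.txt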