Shell Programming and Scripting: Write only changes to file - avoid duplicates
Post 303028014 by bakunin, Saturday 29th of December 2018, 05:45:00 PM
Quote:
Originally Posted by RudiC
Wouldn't a periodical sort -u of the final result file solve those problems in one go at minimal cost? I'd guess searching the entire file for each new IP could become resource hungry.
That depends. In principle: yes, that would suffice - IF the period between runs of sort -u is small enough to fit the thread-O/P's needs. What exactly "his needs" are, and what therefore counts as "sufficiently small", is anybody's guess. IF (and again, who knows whether this is a moot point or not) there must never be any duplicates in the final file, THEN my approach (or something along similar lines) will be necessary. Also, if the output file is appended to without strict regularity, a run of sort -u might collide with new entries and overwrite them - concurrent access can have that effect. In that case some sort of inter-process coordination (locking, for instance) would be needed.
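To make both variants concrete, here is a minimal sketch in plain shell. It assumes one IP address per line in the result file and that flock(1) (util-linux) is available for the locking mentioned above; the file names, the lock file and the function names are made up for illustration and do not come from the thread.

Code:
#!/bin/sh
# Sketch only - all names are hypothetical.
RESULT=/var/tmp/seen_ips.txt     # the growing result file, one IP per line
LOCK=/var/tmp/seen_ips.lock      # lock file used to serialise all writers

# Append an IP only if it is not already in the file (the "no duplicates
# ever" variant). The grep scan is what can get expensive on a large file.
add_ip() {
    ip=$1
    (
        flock -x 9
        grep -qxF "$ip" "$RESULT" 2>/dev/null || printf '%s\n' "$ip" >> "$RESULT"
    ) 9>"$LOCK"
}

# Periodic compaction (the sort -u variant, e.g. run from cron). Taking the
# same lock prevents the collision with concurrent appends described above.
compact() {
    (
        flock -x 9
        sort -u "$RESULT" > "$RESULT.tmp" && mv "$RESULT.tmp" "$RESULT"
    ) 9>"$LOCK"
}

With the lock in place the two approaches can even be combined: cheap appends most of the time plus an occasional compaction run. Without it, entries appended between the sort and the mv in compact() could silently be lost.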

But all that is conjecture. Thread-O/P will have to state his problem more clearly to select the right approach.

I hope this helps.

bakunin
 
