Shell Programming and Scripting: Help with merge and remove duplicates
Post 302896856 by roy121 on Wednesday 9th of April 2014 02:03:13 PM
For the value 4300,

the produced value is:
4300 233445569

but the expected value is:
4300 234569
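Without the rest of the thread it reads as though the per-key values are being concatenated and the repeats should be dropped. A minimal, order-preserving sketch of that de-duplication step, assuming the merged value sits in column 2 (the file name merged.txt is a placeholder):

    awk '{
        out = ""
        split("", seen)                     # reset the lookup table for each line
        for (i = 1; i <= length($2); i++) {
            c = substr($2, i, 1)
            if (!seen[c]++) out = out c     # keep only the first occurrence
        }
        print $1, out
    }' merged.txt

Fed the line 4300 233445569 this prints 4300 234569, matching the expected value above.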
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates

Hello Experts, I have two files named old and new. Below are my example files. I need to compare them and print the records that exist only in my new file. I tried the awk script below; it works perfectly well when the records match exactly, but the issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
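Two hedged sketches: one for the exact-match case, and one guessing that the old records carry extra fields, so only a key column is compared (the file names old and new come from the post; the choice of column 1 as the key is an assumption):

    # Print lines of new that do not appear verbatim in old
    grep -Fxvf old new

    # If old lines carry extra trailing fields, compare on the first field only
    awk 'NR==FNR { seen[$1]; next } !($1 in seen)' old new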

2. Shell Programming and Scripting

Merge Two Tables with duplicates in first table

Hi..
File 1:
1 aa rep
1 dd rep
1 kk rep
2 bb sad
2 ss sad
3 ee dam
File 2:
1 apple fruit
2 mango tree
3 lilly flower
output:
1 apple fruit aa,dd,kk rep (7 Replies)
Discussion started by: empyrean
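One way to sketch the join, assuming both files are whitespace-delimited and keyed on column 1 (file1 and file2 are placeholder names):

    # First pass: collect File 1's middle column per key, comma-separated,
    # and remember its tag. Second pass: append both to each File 2 line.
    awk 'NR==FNR { val[$1] = (val[$1] ? val[$1] "," : "") $2; tag[$1] = $3; next }
         { print $0, val[$1], tag[$1] }' file1 file2

For key 1 this prints 1 apple fruit aa,dd,kk rep, matching the sample output.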

3. Shell Programming and Scripting

bash - remove duplicates

I need to use a bash script to remove duplicate files from a download list, but I cannot use uniq because the URLs are different. I need to go from this:
http://***/fae78fe/file1.wmv
http://***/39du7si/file1.wmv
http://***/d8el2hd/file2.wmv
http://***/h893js3/file2.wmv
to this: ... (2 Replies)
Discussion started by: locoroco
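A minimal sketch that keeps only the first URL seen for each file name, assuming the file name is always the last /-separated path component (list.txt is a placeholder):

    # Split on "/" and de-duplicate on the last field, i.e. the file name
    awk -F/ '!seen[$NF]++' list.txt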

4. Shell Programming and Scripting

Find duplicates in column 1 and merge their lines (awk?)

Hi, I have a file (sorted by sort) with 8 tab-delimited columns. The first column contains duplicated fields and I need to merge all lines that share it. My input file:
comp100002 aaa bbb ccc ddd eee fff ggg
comp100003 aba aba aba aba aba aba aba
comp100003 fff fff fff fff fff fff fff... (5 Replies)
Discussion started by: falcox
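A sketch that merges consecutive lines sharing column 1, appending fields 2 onward of every duplicate line; it relies on the file being sorted on column 1, as the post says (input.txt is a placeholder):

    awk -F'\t' '
        $1 != key { if (key != "") print out            # flush the previous group
                    key = $1; out = $0; next }
        { for (i = 2; i <= NF; i++) out = out "\t" $i } # same key: append fields
        END { if (key != "") print out }
    ' input.txt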

5. Shell Programming and Scripting

Merge files without duplicates

Hi all, in a directory of many files I need to merge only lines which are not identical, and each resultant merge file should not be more than 50000 lines. Basically I need to cover all text files in that directory and turn them into merge files of 50000 lines each ... (2 Replies)
Discussion started by: pravfraz
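Under the assumption that "not identical" means the merged output should simply be de-duplicated before being split into 50000-line chunks, a sketch (run in that directory; the merged_ output prefix is a placeholder):

    # De-duplicate all .txt files, keeping original order, then split the
    # stream into files of at most 50000 lines: merged_aa, merged_ab, ...
    awk '!seen[$0]++' *.txt | split -l 50000 - merged_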

6. UNIX for Dummies Questions & Answers

Remove duplicates from a file

Can you tell me how to remove duplicate records from a file? (11 Replies)
Discussion started by: saga20
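The two classic one-liners, depending on whether the original line order matters (file is a placeholder name):

    sort -u file             # duplicates removed, output sorted
    awk '!seen[$0]++' file   # duplicates removed, original order kept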

7. Shell Programming and Scripting

Remove duplicates

I have a file with the following format, fields separated by "|":
title1|something class|long...content1|keys
title2|somhing class|log...content1|kes
title1|sothing class|lon...content1|kes
title3|shing cls|log...content1|ks
I want to remove all duplicates with the same "title field" (the... (3 Replies)
Discussion started by: dtdt
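A sketch keyed on the first |-separated field, keeping the first line seen for each title (file is a placeholder):

    # seen[$1]++ is 0 (false) only on the first occurrence of a title
    awk -F'|' '!seen[$1]++' file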

8. Shell Programming and Scripting

Remove top 3 duplicates

Hello, I have a requirement with input in the below format:
abc 123 xyz
bcd 365 kii
abc 987 876
cdf 987 uii
abc 456 yuu
bcd 654 rrr
Expected output:
abc 456 yuu
bcd 654 rrr
cdf 987 uii (1 Reply)
Discussion started by: Tomlight
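Reading the expected output as "keep only the last occurrence of each column-1 key, then sort by key" (the sorting is an assumption drawn from the sample):

    # Remember the last line seen per key, print one line per key, then sort
    awk '{ last[$1] = $0 } END { for (k in last) print last[k] }' file | sort

On the sample input this prints abc 456 yuu, bcd 654 rrr and cdf 987 uii.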

9. Shell Programming and Scripting

Sort and Remove duplicates

Here is my task: I need to sort two input files and remove duplicates in the output files:
Sort by 13 characters from position 97, ascending
Sort by 1 character from position 96, ascending
If duplicates are found, retain the first value in the file. The input files are variable length, convert... (4 Replies)
Discussion started by: ysvsr1
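A decorate-sort-undecorate sketch: build an explicit key from the stated positions, sort on it stably so the first record of each key survives, de-duplicate, and strip the key again. Treating "duplicate" as "same 14-character key" is an assumption, infile/outfile are placeholders, and the records are assumed to contain no tab characters:

    tab=$(printf '\t')
    awk '{ print substr($0, 97, 13) substr($0, 96, 1) "\t" $0 }' infile |
        sort -t"$tab" -k1,1 -s |    # -s: stable, so the first record per key stays first
        awk -F'\t' '!seen[$1]++' |  # keep only that first record
        cut -f2- > outfile          # strip the decoration key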

10. Shell Programming and Scripting

Remove duplicates

Hi, I have the below file structure:
200,1245,E1,1,E1,,7611068,KWH,30,
,,,,,,,,
200,1245,E1,1,E1,,7611070,KWH,30,
,,,,,,,,
300,20140223,0.001,0.001,0.001,0.001,0.001
300,20140224,0.001,0.001,0.001,0.001,0.001
300,20140225,0.001,0.001,0.001,0.001,0.001
300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele