Full Discussion: Duplicate values merge
Post 302768858 by jiam912, Sunday 10 February 2013, 02:27 PM
Thanks to everybody for your great job.


Gents,

Please, I have one more question.

Code:
1050  22057485    219    223
1050  21897425    278    279    287 
1051  20497465    602    603    605 
1051  20517500    677    681     
1052  20577555    775    778     
1052  20357560    778    780     
1053  23717535    794    805     
1053  23657530    797    798    799

How can I count the total number of values from the 4th column to the end of each line?

In this case the total is 11.

How can I get this value?

Thanks for your help
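
For reference, here is one possible way to do this count with awk. This is a minimal sketch, not taken from the thread, and the input file name data.txt is assumed:

Code:
# For each line with more than 3 fields, add the number of
# fields beyond the 3rd; print the running total at the end.
awk 'NF > 3 { total += NF - 3 } END { print total + 0 }' data.txt

For the sample data above, this prints 11 (one value on lines with 4 fields, two on lines with 5 fields).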
 
