Duplication | awk | result
Post 303035763 by Aurimas, UNIX for Beginners Questions & Answers, Sunday 2nd of June 2019, 11:00 AM
Thank you, it works. However, the output is now 1 for every case, which is incorrect. I need to count the AAA (here ALA) instances in each of 28 complexes: the 1st complex should give 8, because ALA occurs there 8 times (40 lines in total, since each ALA is listed 5 times with its specific residue number, e.g. 56 as in our case); the 2nd should give 8 as well, the 3rd 8, the 4th 9, and so on. The count starts at 2 instances of ALA (10 lines, i.e. 2 x 5) and can take any value from 2 to 37 when ALA is the chosen AAA. Should I use cat $i | and some grep filtering before giving the input to awk?
The output I get now is:
Quote:
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
1
How can I solve this so that I get:
Quote:
8
8
8
9
18
15
9
14
19
7
14
11
8
11
18
11
10
19
34
5
2
12
7
16
7
4
29
3
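
For reference, is something along these lines the right direction? A minimal sketch, assuming the complexes are files named complex*.pdb with the residue name in column 4 and the residue number in column 6 (the file names and both column positions are my assumptions, to be adjusted to the real format):

Code:
# For each complex file, count the distinct ALA residues.
# Every residue occupies 5 lines sharing one residue number ($6),
# so counting unique residue numbers gives the instance count.
for i in complex*.pdb; do
    awk '$4 == "ALA" && !seen[$6]++ { n++ } END { print n+0 }' "$i"
done

awk reads the file on its own, so cat $i | is not needed, and neither is grep: the $4 == "ALA" test already filters the lines. Since each residue is listed exactly 5 times, awk '$4 == "ALA" { n++ } END { print n/5 }' "$i" should give the same result; and if residue numbers can repeat between chains, the array could be keyed on the chain column as well, e.g. seen[$5,$6].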
 
