10 More Discussions You Might Find Interesting
1. UNIX for Beginners Questions & Answers
How can I get the duplicate rows from a file using UNIX? For example, I have data like
a,1
b,2
c,3
d,4
a,1
c,3
e,5
I want the output to be like
a,1
c,3 (4 Replies)
Discussion started by: ggupta
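A common way to print the rows that occur more than once (each printed a single time) is to sort the file so duplicates become adjacent and then apply `uniq -d`. A minimal sketch, using a hypothetical file name for the data shown in the post:

```shell
# Sample data from the post (the file name is an assumption)
printf 'a,1\nb,2\nc,3\nd,4\na,1\nc,3\ne,5\n' > /tmp/dups_input.txt

# sort groups identical lines together; uniq -d keeps only the repeated
# lines, printing each once
dups=$(sort /tmp/dups_input.txt | uniq -d)
echo "$dups"
```

which prints `a,1` and `c,3`, matching the requested output.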
2. Shell Programming and Scripting
Hi everybody,
I have a .txt file that contains some assembly code; to optimize it I need to remove some replicated parts.
For example I have:
e_li r0,-1
e_li r25,-1
e_lis r25,0000
add r31, r31 ,r0
e_li r28,-1
e_lis r28,0000
add r31, r31 ,r0
e_li r28,-1 ... (3 Replies)
Discussion started by: Behrouzx77
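For exact single-line repeats, an order-preserving awk filter works; note that if the replicated parts are multi-line blocks, the lines would first have to be grouped into records. A sketch, with a hypothetical file name for the listing above:

```shell
# Hypothetical file holding the assembly listing from the post
printf 'e_li r25,-1\ne_lis r25,0000\nadd r31, r31 ,r0\ne_li r28,-1\ne_lis r28,0000\nadd r31, r31 ,r0\n' > /tmp/asm.txt

# !seen[$0]++ is true only the first time a given line appears,
# so later exact repeats are dropped while the original order is kept
awk '!seen[$0]++' /tmp/asm.txt > /tmp/asm_dedup.txt
```

Here the second `add r31, r31 ,r0` is removed, leaving five lines.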
3. UNIX for Dummies Questions & Answers
Gurus,
From a file I need to remove duplicate rows based on the first column, but we also need to consider a date column (the 13th) and keep the row with the latest date.
Ex:
Input File:
Output File:
I know how to take out the duplicates but I couldn't figure out... (5 Replies)
Discussion started by: shash
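One approach is to sort so that, within each first-column key, the newest date comes first, then keep the first row per key with awk. A sketch assuming comma-separated data with the key in field 1 and a sortable YYYY-MM-DD date in field 13 (both assumptions, since the post does not show the real layout):

```shell
# Hypothetical 13-column rows; only fields 1 and 13 matter here
printf 'k1,b,c,d,e,f,g,h,i,j,k,l,2023-01-05\nk1,b,c,d,e,f,g,h,i,j,k,l,2023-03-01\nk2,b,c,d,e,f,g,h,i,j,k,l,2022-12-31\n' > /tmp/dated.txt

# -k13,13r sorts the date descending within each key;
# !seen[$1]++ then keeps only the first (newest) row per key
sort -t, -k1,1 -k13,13r /tmp/dated.txt | awk -F, '!seen[$1]++' > /tmp/latest.txt
```

If the date is not in a sortable text format, it would have to be normalized first.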
4. Shell Programming and Scripting
Notes: I am using Cygwin and Notepad++ only for checking this, and my OS is XP.
#!/bin/bash
typeset -i totalvalue=$(wc -w < /cygdrive/c/cygwinfiles/database.txt)
typeset -i totallines=$(wc -l < /cygdrive/c/cygwinfiles/database.txt)
typeset -i columnlines=$((totalvalue / totallines))
awk -F' ' -v... (5 Replies)
Discussion started by: whitecross
5. HP-UX
Hi all,
I have written one shell script. The output file of this script is having sql output.
In that file, I want to extract the rows which have multiple entries (duplicate rows).
For example, the output file will be like the following way.
... (7 Replies)
Discussion started by: raghu.iv85
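To extract every occurrence of the duplicated rows (not just one copy of each), a two-pass awk read of the same file works: the first pass counts, the second prints the rows whose count is above one. A sketch with made-up rows standing in for the sql output:

```shell
# Stand-in for the sql output file from the post
printf 'row1\nrow2\nrow1\nrow3\nrow1\n' > /tmp/sqlout.txt

# Pass 1 (NR==FNR) counts each line; pass 2 prints lines counted more than once
dup_rows=$(awk 'NR==FNR {count[$0]++; next} count[$0] > 1' /tmp/sqlout.txt /tmp/sqlout.txt)
echo "$dup_rows"
```

All three copies of `row1` appear in the output, in their original positions.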
6. UNIX for Dummies Questions & Answers
Hi,
I am processing a file and would like to delete duplicate records as indicated by one of its columns, e.g.
COL1 COL2 COL3
A 1234 1234
B 3k32 2322
C Xk32 TTT
A NEW XX22
B 3k32 ... (7 Replies)
Discussion started by: risk_sly
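Keeping only the first record per value of the key column is a one-liner in awk; here field 1 (COL1) is assumed to be the deciding column:

```shell
# Sample rows from the post; COL1 repeats for A (and, in the full post, B)
printf 'A 1234 1234\nB 3k32 2322\nC Xk32 TTT\nA NEW XX22\n' > /tmp/cols.txt

# Keep a row only the first time its first field is seen
kept=$(awk '!seen[$1]++' /tmp/cols.txt)
echo "$kept"
```

The second `A` row (`A NEW XX22`) is dropped; change `$1` to the relevant field number for a different key column.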
7. Shell Programming and Scripting
Hi,
I need to concatenate three files into one destination file. If any duplicate data occurs, it should be deleted.
eg:
file1:
-----
data1 value1
data2 value2
data3 value3
file2:
-----
data1 value1
data4 value4
data5 value5
file3:
-----
data1 value1
data4 value4 (3 Replies)
Discussion started by: Sharmila_P
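If the order of the merged file does not matter, `sort -u` concatenates and de-duplicates in one step. A sketch using the sample data (file paths are an assumption):

```shell
printf 'data1 value1\ndata2 value2\ndata3 value3\n' > /tmp/file1
printf 'data1 value1\ndata4 value4\ndata5 value5\n' > /tmp/file2
printf 'data1 value1\ndata4 value4\n' > /tmp/file3

# sort -u reads all inputs and emits each distinct line exactly once
sort -u /tmp/file1 /tmp/file2 /tmp/file3 > /tmp/merged
```

To preserve the original order instead, `awk '!seen[$0]++' file1 file2 file3` gives the same set of lines, first occurrence first.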
8. Shell Programming and Scripting
I have a file content like below.
"0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","",""
"0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","",""
"0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","",""... (5 Replies)
Discussion started by: vamshikrishnab
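Since the repeated lines shown sit next to each other, plain `uniq` is enough; if duplicates may be scattered through the file, sort first. A sketch:

```shell
# Three identical adjacent lines, as in the post
line='"0000000","ABLNCYI","BOTH",1049,2058,"XYZ","5711002","","Y","","","","","","","",""'
printf '%s\n%s\n%s\n' "$line" "$line" "$line" > /tmp/quoted.txt

# uniq collapses each run of identical adjacent lines to a single line
uniq /tmp/quoted.txt > /tmp/quoted_uniq.txt
```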
9. Shell Programming and Scripting
hi all
Can anyone please let me know if there is a way to find duplicate rows in a file? I have a file that has hundreds of numbers (one per row).
I want to find the numbers that are repeated in the file.
eg.
123434
534
5575
4746767
347624
5575
I want 5575
Please help (3 Replies)
Discussion started by: infyanurag
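`uniq -c` after a sort gives an occurrence count per number, and awk can then filter for counts above one:

```shell
# Numbers from the post, one per row
printf '123434\n534\n5575\n4746767\n347624\n5575\n' > /tmp/numbers.txt

# uniq -c prefixes each distinct line with its count;
# awk prints only the values whose count exceeds 1
repeated=$(sort /tmp/numbers.txt | uniq -c | awk '$1 > 1 {print $2}')
echo "$repeated"
```

which prints `5575`, the only number appearing more than once.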
10. UNIX for Dummies Questions & Answers
Hi,
I am trying to remove duplicate lines from a file. For example, the contents of example.txt are:
this is a test
2342
this is a test
34343
this is a test
43434
and I want to remove the "this is a test" lines only and end up with just the numbers in the file, that is, end up with:
2342... (4 Replies)
Discussion started by: ocelot
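Because every copy of the repeated line should go (not just the extras), `uniq -u`, which keeps only lines occurring exactly once, fits; note the `sort` step reorders the file:

```shell
# Contents of example.txt from the post
printf 'this is a test\n2342\nthis is a test\n34343\nthis is a test\n43434\n' > /tmp/example.txt

# uniq -u emits only the lines that appear exactly once in the sorted input
only_once=$(sort /tmp/example.txt | uniq -u)
echo "$only_once"
```

All three "this is a test" lines are discarded, leaving only the numbers.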
xlhtml(1) General Commands Manual xlhtml(1)
NAME
xlhtml - a program for converting Microsoft Excel (.xls) files
SYNOPSIS
xlhtml [-a] [-asc] [-csv] [-xml] [-bcNNNNNN] [-bi/path] [-c] [-dp] [-v] [-fw] [-m] [-nc] [-nh] [-tcNNNNNN] [-te] [-xc:N-N] [-xp:N] [-xr:N-
N] FILE
DESCRIPTION
This manual page explains the xlhtml program. The program xlhtml is used to convert Microsoft Excel spreadsheet files into either HTML or
tab-delimited ASCII. The program can be interfaced with helper scripts for viewing email attachments. Most use of this program is through
the helper scripts; one would rarely resort to the command-line interface.
OPTIONS
-a Aggressively optimize HTML by removing </TR>, </TD>, or VALIGN="bottom". Some older browsers may not display properly in this mode.
-asc ASCII output of -dp and extraction data (-xc, -xp, -xr)
-csv Comma-separated-values output of -dp and extraction data (-xc, -xp, -xr)
-xml XML output of -dp and extraction data (-xc, -xp, -xr)
-bc Override the background color. e.g. -bc808080 for gray
-bi Use background image. e.g. -bi/home/httpd/icon/tar.gif
-c Centers the tables horizontally
-dp Dump page count and max columns and rows per page
-v Prints program version
-fw Suppress formula warnings about accuracy
-m No encoding for multibyte
-nc Do not colorize the output
-nh Suppress header and body tags in html output
-tc Override the text color. e.g. -tcFF0000 for red
-te Trims empty rows & columns at the edges of a worksheet
-xc Columns (separated by a dash) for extraction (zero based)
-xp Page for extraction (zero based), one page only
-xr Rows (separated by a dash) to be extracted (zero based)
An example of the extraction command line is: xlhtml -fw -asc -xp:0 -xr:2-6 -xc:0-1 Test.xls
The extraction output is: Formatted output of cells by column left to right, columns separated by a tab, end of row is: 0x0A, end of file:
AUTHOR
Steve Grubb, Charles N Wyble
xlhtml May 15, 2002 xlhtml(1)