Top Forums > Shell Programming and Scripting > Compare two sample files and find common
Post 302726721 by Priyanka Chopra on Monday, 5 November 2012, 05:59:12 AM
I checked; basically, it seems there are irregular spaces between the columns in the second file. That seems to be the main issue.
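One quick way to rule that out is to normalise the whitespace before comparing. A minimal sketch, assuming the two inputs are named file1 and file2 (hypothetical names) and that whole lines are being compared:

    awk '{$1=$1}1' file1 > f1.norm     # reassigning $1 makes awk rebuild the record with single spaces
    awk '{$1=$1}1' file2 > f2.norm
    grep -Fxf f1.norm f2.norm          # print the normalised lines common to both files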
 

9 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

To find all common lines from 'n' no. of files

Hi, I have a situation: I have some 6-7 files in one directory and I have to extract all the lines which exist in all of these files, meaning I need to extract all common lines from all these files and put them in a separate file. Please help. I know it could be done with the help of... (11 Replies)
Discussion started by: The Observer
11 Replies
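A common awk pattern for this task, sketched here under the assumption that whole lines are being compared and that the file names are placeholders:

    awk '{ if (!seen[FILENAME, $0]++) cnt[$0]++ }
         END { for (line in cnt) if (cnt[line] == ARGC-1) print line }' file1 file2 file3

The seen guard counts each line at most once per file, and ARGC-1 equals the number of file arguments, so only lines present in every file are printed.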

2. Shell Programming and Scripting

Files common in two sets ??? How to find ??

Suppose we have 2 sets of files:
set 1    set 2
------   ------
abc      hgb
def      ppp
mgh      vvv
nmk      sdf
hgb      ... (1 Reply)
Discussion started by: skyineyes
1 Replies
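If each set is stored one file name per line, comm(1) answers this directly. A sketch, assuming bash and hypothetical file names set1.txt and set2.txt:

    comm -12 <(sort set1.txt) <(sort set2.txt)    # -12 suppresses names unique to either set, leaving only the common ones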

3. UNIX for Dummies Questions & Answers

find common lines using just one column to compare and result with all columns

Hi. If we have this file:
A B C
7 8 9
1 2 10
and this other file:
A C D F
7 9 2 3
9 2 3 4
The result I'm looking for is the intersection with columns A B C D F, so the answer here will be (10 Replies)
Discussion started by: alcalina
10 Replies
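This is essentially a relational join on the first column. A sketch with join(1), assuming the key is in column 1 of both files, the header rows are handled separately, and the file names are placeholders:

    join -1 1 -2 1 <(sort -k1,1 fileA) <(sort -k1,1 fileB)

join prints the key once, followed by the remaining columns of both files, which gives the combined A B C D F layout described above.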

4. UNIX for Dummies Questions & Answers

compare two files based on common field in unix

I have two files in UNIX. The 1st file is Entity and the 2nd file is References. The 1st file has only one column, Entity ID, and the 2nd file has two columns, Entity ID | Person ID. I want to produce an output file where the entity IDs match in both files. Entity File: 624197 624252 624264... (4 Replies)
Discussion started by: PRS
4 Replies
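A typical two-pass awk sketch, assuming the entity ID is the first field in both files and that entity.txt and references.txt are placeholder names:

    awk 'NR==FNR { want[$1]; next } $1 in want' entity.txt references.txt

The first pass stores every entity ID from entity.txt; the second pass prints only the lines of references.txt whose first field was stored.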

5. Shell Programming and Scripting

Compare a common field in two files and append a column from File 1 in File2

Hi friends, I am new to shell scripting and need your help with the situation below.
- I have two files (File 1 and File 2); the contents of the files are mentioned below.
- "Application handle" is the common field in both files. (NOTE: please refer to the attachment "Compare files... (2 Replies)
Discussion started by: Santoshbn
2 Replies
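One way to do this in awk, sketched under the assumption that the common field ("Application handle") is column 1 and the value to carry over is column 2 of File 1; the real layouts are in the attachment, which is not shown here:

    awk 'NR==FNR { extra[$1] = $2; next }      # File 1: remember column 2 per application handle
         $1 in extra { print $0, extra[$1] }   # File 2: append the remembered value
    ' File1 File2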

6. Shell Programming and Scripting

Compare multiple files, and extract items that are common to ALL files only

I have this code: awk 'NR==FNR{a[$1]=$1;next} a[$1]' file1 file2, which does what I need it to do, but only for two files. I want to make it so that I can have multiple files (for example 30) and the code will return only the items that are in every single one of those files and ignore the ones... (7 Replies)
Discussion started by: castrojc
7 Replies
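A sketch of one way to generalise that idiom to any number of files, assuming the items being compared are in column 1 and the file names are placeholders:

    awk '{ if (!seen[FILENAME, $1]++) cnt[$1]++ }
         END { for (item in cnt) if (cnt[item] == ARGC-1) print item }' file1 file2 file3    # list all of the files here

Counting each item at most once per file, and requiring the count to equal the number of file arguments (ARGC-1), keeps only the items that appear in every file.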

7. Shell Programming and Scripting

Compare multiple files, identify common records and combine unique values into one file

Good morning all, I have a problem that is one step beyond a standard awk compare. I would like to compare three files, each with several thousand records, against a fourth file. All of them have a value in each row that is identical, and one value in each of those rows which may be duplicated... (1 Reply)
Discussion started by: nashton
1 Replies
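The full layout is cut off above, so this is only a rough sketch under assumed positions: the shared key in column 1, the possibly duplicated value in column 2, and the fourth file listed last on the command line:

    awk 'FILENAME != ARGV[ARGC-1] {                   # first three files: collect unique values per key
             if (!dup[$1, $2]++) vals[$1] = vals[$1] " " $2
             next
         }
         $1 in vals { print $0 vals[$1] }             # fourth file: append the collected values
    ' file1 file2 file3 file4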

8. Shell Programming and Scripting

Find common files between two directories

I have two directories:
Dir 1: /home/sid/release1
Dir 2: /home/sid/release2
I want to find the common files between the two directories. Dir 1 files:
/home/sid/release1> ls -lrt
total 16
-rw-r--r-- 1 sid cool 0 Jun 19 12:53 File123
-rw-r--r-- 1 sid cool 0 Jun 19 12:53... (5 Replies)
Discussion started by: sidnow
5 Replies
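For plain file names in two flat directories, comm(1) on the sorted listings is enough. A sketch using the paths from the post, assuming bash:

    comm -12 <(ls /home/sid/release1 | sort) <(ls /home/sid/release2 | sort)

The -12 flags drop the names unique to either directory, leaving only the file names present in both.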

9. UNIX for Beginners Questions & Answers

Compare two files and print based on common variable value.

Hi all, I have the two files below.
FILE:
NAME="/dev/sda" TYPE="disk" SIZE="60G" OWNER="root" GROUP="disk" MODE="brw-rw----" PKNAME="" MOUNTPOINT=""
NAME="/dev/sda1" TYPE="part" SIZE="500M" OWNER="root" GROUP="disk" MODE="brw-rw----" PKNAME="/dev/sda" MOUNTPOINT="/boot"
NAME="/dev/sda2"... (3 Replies)
Discussion started by: balu1234
3 Replies
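The second file is cut off above, so this is only a guess at the layout: assuming it is a plain list of device paths, one per line (hypothetical name devices.txt), and the first file is the NAME="..." dump shown (hypothetical name blockinfo.txt):

    awk -F'"' 'NR==FNR { dev[$0]; next }    # devices.txt: remember each device path
               $2 in dev                    # blockinfo.txt: print rows whose NAME value matches
    ' devices.txt blockinfo.txt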