Shell Programming and Scripting: Merging few files into one, duplicates are removed
Post 302754087 by itkamaraj, 01-10-2013, 03:13 AM
Code:
 
cat *.txt | sort -u > output.txt
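This concatenates every .txt file in the working directory, sorts the combined lines, and writes only the unique ones to output.txt. Because sort -u reorders the lines, an order-preserving variant is sometimes wanted; a minimal sketch using the awk first-occurrence idiom (merged.out is a placeholder name, deliberately chosen not to match the *.txt glob so a re-run does not sweep the result back into the merge):

Code:
# keep only the first occurrence of each line, preserving input order
awk '!seen[$0]++' *.txt > merged.out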

 

8 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Bring back removed files

Dear people, I have removed some of my files and directories (using the rm and rmdir commands) by mistake. I wish to bring them back. How is this possible? (I am using Solaris 2.6.) Best regards, Reza Nazarian. (2 Replies)
Discussion started by: Reza Nazarian
2 Replies

2. UNIX for Dummies Questions & Answers

Will Old Files Be Removed

I have Windows XP installed, and decided to install Sun Solaris 10. The hard disk was previously partitioned into 5 partitions: C: = Win98, D: = WinXP, and E, F, G, H are applications and so on. When installing Solaris, will all the drives be removed, or will I specify where to install it? Thanks... (5 Replies)
Discussion started by: sunsation
5 Replies

3. UNIX for Dummies Questions & Answers

recovering files removed with rm

Hello, I was reading the manual on rm and it states that when you use 'rm' the files are usually recoverable; how is this done? Does it assume that a backup system is in place? Cheers, Jack (4 Replies)
Discussion started by: jack1981
4 Replies

4. Shell Programming and Scripting

Find duplicates from multuple files with 2 diff types of files

I need to compare 2 different types of files and find the duplicates after comparing each type of file: the type 1 file name is like file1.abc (the extension abc could be any 3 characters, but I can narrow it down or hardcode 10/15 combinations). The other file is file1.bcd01abc (the extension... (2 Replies)
Discussion started by: ricky007
2 Replies

5. Shell Programming and Scripting

Duplicates to be removed

Hi, I have a text file with 2000 rows and 2000 columns (the number of columns might vary from row to row) and "comma" is the delimiter. In every row there may be a few duplicates, and we need to remove those duplicates and "shift left" the subsequent values. ex: 111 222 111 555 444 999 666... (6 Replies)
Discussion started by: prvnrk
6 Replies
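For the within-row de-duplication described in discussion 5 above, a minimal awk sketch (assuming whitespace-separated fields as in the example; infile and outfile are placeholder names, and FS/OFS would need adjusting for a real comma-delimited file):

Code:
# drop repeated fields in each row, shifting the remaining values left
awk '{
    split("", seen); out = ""
    for (i = 1; i <= NF; i++)
        if (!seen[$i]++) out = (out == "" ? $i : out OFS $i)
    print out
}' infile > outfile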

6. UNIX for Advanced & Expert Users

can I view removed/deleted files

I have a script that has deleted some files, and I need to know which files were deleted. Is there a log file in Linux that shows deleted files? I use Ubuntu. (2 Replies)
Discussion started by: locoroco
2 Replies

7. UNIX for Advanced & Expert Users

How to find duplicates contents in a files by comparing other files?

Hi guys, we have one directory; in that directory, files are placed each day. The files must have a header, contents, and a footer. I want to compare the header, contents, and footer; if they are the same, display an error message like 'files contents same'. (7 Replies)
Discussion started by: Venkatesh1
7 Replies
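For the whole-file comparison asked about in discussion 7 above, one simple approach is to checksum every file and report any pair with identical sums; a sketch (incoming/ is a placeholder directory name, and this compares entire files rather than header, contents, and footer separately):

Code:
# report files whose complete contents are identical, by MD5 sum
md5sum incoming/* | sort | awk '
    $1 == prev { printf "files contents same: %s and %s\n", prevfile, $2 }
    { prev = $1; prevfile = $2 }
'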

8. Red Hat

Can all files under /tmp be safely removed

I wanted to know whether all files under /tmp can be safely removed. I guess that /tmp may also hold temporary files for applications currently being worked on, so at most those applications may just shut down. I hope that my question is clear: whether all files under /tmp can be safely... (5 Replies)
Discussion started by: RHCE
5 Replies
h5jam(1)                                General Commands Manual                                h5jam(1)

NAME
       h5jam - Add a user block to an HDF5 file

SYNOPSIS
       h5jam -u user_block -i in_file.h5 [-o out_file.h5] [--clobber]

DESCRIPTION
       h5jam concatenates a user_block file and an HDF5 file to create an HDF5 file with a user block. The user block can be either binary or text. The output file is padded so that the HDF5 header begins on byte 512, 1024, etc. (See the HDF5 File Format.)

       If out_file.h5 is given, a new file is created with the user_block followed by the contents of in_file.h5. In this case, in_file.h5 is unchanged.

       If out_file.h5 is not specified, the user_block is added to in_file.h5.

       If in_file.h5 already has a user block, the contents of user_block will be added to the end of the existing user block, and the file shifted to the next boundary. If --clobber is set, any existing user block will be overwritten.

EXAMPLE USAGE
       Create a new file, newfile.h5, with the text in file mytext.txt as the user block for the HDF5 file file.h5:

              h5jam -u mytext.txt -i file.h5 -o newfile.h5

       Add the text in file mytext.txt to the front of the HDF5 file file.h5:

              h5jam -u mytext.txt -i file.h5

       Overwrite the user block (if any) in file.h5 with the contents of mytext.txt:

              h5jam -u mytext.txt -i file.h5 --clobber

RETURN VALUE
       h5jam returns the size of the output file, or -1 if an error occurs.

CAVEATS
       This tool copies all the data (sequentially) in the file(s) to new offsets. For a large file, this copy will take a long time. The most efficient way to create a user block is to create the file with a user block (see H5Pset_user_block) and write the user block data into that space from a program.

       The user block is completely opaque to the HDF5 library and to the h5jam and h5unjam tools. It is simply read or written as a string of bytes, which could be text or any kind of binary data; it is up to the user to know what the contents of the user block mean and how to process them. When the user block is extracted, all the data is written to the output, including any padding or unwritten data.

       This tool moves the HDF5 file through byte copies, i.e., it does not read or interpret the HDF5 objects.

SEE ALSO
       h5dump(1), h5ls(1), h5diff(1), h5import(1), gif2h5(1), h52gif(1), h5perf(1), h5unjam(1).

                                                                                               h5jam(1)
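As a quick illustration of the man page above: because the user block sits in front of the HDF5 header (which h5jam pads out to byte 512, 1024, and so on), a text user block can be inspected with ordinary byte-oriented tools. A sketch reusing the man page's example names (mytext.txt, file.h5, newfile.h5), assuming file.h5 already exists:

Code:
# write a small text user block and prepend it to an existing HDF5 file
echo "example user block text" > mytext.txt
h5jam -u mytext.txt -i file.h5 -o newfile.h5

# the first bytes of newfile.h5 are now the user block,
# so they show up in a plain byte dump
head -c 64 newfile.h5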