The FASTEST copy method?


 
# 1  
03-02-2011

Hi Experts,
I've been asked if there is a fast way to duplicate a file (10 GB) and zip it at the same time. The zipped copy would then be FTP'd; management is asking for this. Maybe there is a better method altogether? Any ideas? cp will not cut it.

Thanks in advance
Harley
Harleyrci
# 2  
03-03-2011
You can avoid reading the file twice by piping it into tee:
Code:
tee copy < original | gzip > compressed.gz

This will write the original uncompressed data into copy, while at the same time writing it to stdout where a compressor like gzip can handle it.
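
If you want to reassure yourself that nothing is lost along the way, a quick sanity check is to compare checksums once the pipeline has finished; a minimal sketch, assuming the command above has already produced copy and compressed.gz:
Code:
cksum original copy              # both lines should report the same checksum and byte count
gzip -dc compressed.gz | cksum   # the decompressed stream should match as well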

If at all possible, avoid storing copy and compressed.gz on the same disk you're reading from: performance improves when reads don't have to wait for writes to finish. Ideally, keep copy on a different disk from compressed.gz as well, so a single disk's throughput isn't split between two write streams.
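
For example, with the original on one filesystem and two other disks mounted separately (the paths below are only placeholders for whatever your system actually has):
Code:
# /data holds the original; /disk1 and /disk2 are separate disks
tee /disk1/copy < /data/original | gzip > /disk2/compressed.gz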

If you really mean .zip and not just some sort of compressor,
Code:
tee copy < original | zip compressed -

...will create a 'compressed.zip' containing a file named '-'.
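
On the receiving end that odd entry name is easy to work around; a small sketch, assuming Info-ZIP's unzip is available there:
Code:
unzip -l compressed.zip            # list the archive; the single entry shows up as '-'
unzip -p compressed.zip > original # -p extracts to stdout, so you can give the output any name you like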