Strange difference in file size when copying LARGE file..


 
# 1  06-03-2006
Strange difference in file size when copying LARGE file..

Hi, I'm trying to take a database backup. One of the files is 26 GB. I am using cp -pr to create a backup copy of the database. After the copy completes, if I run du -hrs on the two folders, I see a difference of 2 GB.

The weird part is that the BACKUP folder is 2 GB larger than the original one!

When I ran du -hrs * inside each of the two folders and compared the output, I found the culprit: a file temp.dbf that is 26 GB in the source (original) folder and 28 GB in the BACKUP folder. I tried the copy procedure twice and got the same result.
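
For reference, here is a minimal shell sketch of that comparison (the /u01/oradata and /backup/oradata paths are hypothetical placeholders, not from the original post):

Code:
#!/bin/sh
SRC=/u01/oradata        # hypothetical: source database folder
DST=/backup/oradata     # hypothetical: backup copy folder

# Per-file disk usage (1 KB blocks) in each tree, diffed side by side
( cd "$SRC" && du -ks ./* | sort -k 2 ) > /tmp/src.du
( cd "$DST" && du -ks ./* | sort -k 2 ) > /tmp/dst.du
diff /tmp/src.du /tmp/dst.du

# du reports blocks actually allocated on disk; ls -l reports the logical
# file length. Comparing both for the suspect file shows whether the source
# file is sparse (contains holes) while the copy is fully allocated.
ls -l "$SRC"/temp.dbf "$DST"/temp.dbf
du -k "$SRC"/temp.dbf "$DST"/temp.dbf

If ls -l reports the same logical length for both copies, the data itself transferred intact and only the on-disk allocation differs.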

Does anyone have an explanation for this? Please let me know; I want to make sure the copy is correct.

PS: Solaris SPARC, 32-bit.

Thanks
 