Full Discussion: Copying of large files fail
Post 302418032 by methyl, Sunday 2nd of May 2010, 03:18 PM
Precision is everything in computing. This post is so vague that it is hard to believe it comes from a Systems Administrator dealing with a real problem.

Please post the precise versions of both Operating Systems.
Please state all filesystem settings: how were the filesystems created, and with what parameters?
Please provide exact directory listings of all files involved.
Please post the precise commands typed, or provide the crontab details.
Please state any error messages verbatim.
Please provide details of an example copy that works.
Please post the output from the Unix command "ulimit -a" for the user concerned. Are you "root"?
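A sketch of how to gather the diagnostics requested above in one pass (the commented paths are placeholders, not paths from the original post):

```shell
# Collect the basic diagnostics for a failing large-file copy.
uname -a                # exact OS name and version
ulimit -a               # per-user resource limits; check the "file size" line
df -k .                 # free space and filesystem for the current directory
# ls -l /src/bigfile    # exact byte size of the file that fails (placeholder path)
```

A "file size" limit in the `ulimit -a` output smaller than the file being copied would explain a copy failing at a fixed size.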


I have seen many examples of problems with files above 2 GB, but never at 4 GB. Does the copy work for, say, 3 GB?
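One way to bisect the size threshold: create sparse test files of exact sizes with dd (writing a single byte at the final offset) and try copying each. `/dest` is a placeholder for the destination filesystem where the copy fails:

```shell
# Create sparse files of exactly 1..4 GB and test-copy each one.
# /dest is a placeholder for the failing destination filesystem.
for gb in 1 2 3 4; do
  dd if=/dev/zero of="test_${gb}g" bs=1 count=1 \
     seek=$((gb * 1024 * 1024 * 1024 - 1)) 2>/dev/null
  cp "test_${gb}g" /dest/ && echo "${gb} GB: OK" || echo "${gb} GB: failed"
done
```

If 2 GB and 3 GB succeed but 4 GB fails, that points at a 32-bit size limit in the filesystem, the tool, or a ulimit rather than at disk space.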
 

DH_COMPRESS(1)							     Debhelper							    DH_COMPRESS(1)

NAME
       dh_compress - compress files and fix symlinks in package build directories

SYNOPSIS
       dh_compress [debhelperoptions] [-Xitem] [-A] [file...]

DESCRIPTION
       dh_compress is a debhelper program that is responsible for compressing the files in package build directories, and makes sure that any
       symlinks that pointed to the files before they were compressed are updated to point to the new files.

       By default, dh_compress compresses files that Debian policy mandates should be compressed: all files in usr/share/info and
       usr/share/man, files in usr/share/doc that are larger than 4k in size (except the copyright file, .html and other web files, image
       files, and files that appear to be already compressed based on their extensions), all changelog files, and PCF fonts underneath
       usr/share/fonts/X11/.

FILES
       debian/package.compress
	   These files are deprecated.

	   If this file exists, the default files are not compressed. Instead, the file is run as a shell script, and every filename the
	   script prints to standard output will be compressed. The script is run from inside the package build directory. Note, though,
	   that using -X is a much better idea in general; use a debian/package.compress file only if you really need to.
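For illustration only, a minimal such script might look like the following. The package name and the find expressions are assumptions for the sketch, not the policy defaults verbatim:

```shell
#!/bin/sh
# Hypothetical debian/mypackage.compress: run from inside the package
# build directory; its stdout is the list of files to compress.
find usr/share/man -type f
find usr/share/doc -type f -size +4k ! -name copyright ! -name '*.gz'
```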

OPTIONS
       -Xitem, --exclude=item
	   Exclude files that contain item anywhere in their filename from being compressed. For example, -X.tiff will exclude TIFF files from
	   compression.  You may use this option multiple times to build up a list of things to exclude.

       -A, --all
	   Compress all files specified by command line parameters in ALL packages acted on.

       file ...
	   Add these files to the list of files to compress.

CONFORMS TO
       Debian policy, version 3.0

SEE ALSO
       debhelper(7)

       This program is a part of debhelper.

AUTHOR
       Joey Hess <joeyh@debian.org>

11.1.6ubuntu2							    2018-05-10							    DH_COMPRESS(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.