Copying of large files fails


 
# 1  
Old 05-01-2010
Copying of large files fails

Hi,
I have a process which duplicates files for different environments. As the files arrive, my script (Korn shell) makes copies of them (giving each copy a unique name) and then renames the original file so that my process won't get triggered again.
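For illustration, here is a minimal Korn shell sketch of the duplicate-and-rename flow described above; the landing directory, file pattern, and environment names are all hypothetical:

    #!/usr/bin/ksh
    # Hypothetical sketch: duplicate each arriving file once per target
    # environment, then rename the original so the trigger won't fire again.
    INCOMING=/nas/incoming             # assumed landing directory on the NAS

    for f in "$INCOMING"/*.dat; do     # assumed file pattern
        [[ -f $f ]] || continue
        base=${f##*/}
        for env in dev qa prod; do     # assumed target environments
            cp "$f" "$INCOMING/${env}_${base}" || print -u2 "cp failed: $f ($env)"
        done
        mv "$f" "$f.done"              # renamed original won't re-trigger
    done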

I don't like it either, but it's what we were told to do.

The problem is that some of these files are basically DB refresh files. Some of these refresh files have grown very large (over 4GB) and are now failing to be duplicated.

My script simply uses the UNIX cp command to make duplicate copies of the files. The files in question sit on a NAS (network-attached storage) device, while my script resides on our DB UNIX server.

The NAS is fully capable of handling files that exceed 4GB, but for some reason the cp command is failing.
I have no control over the NAS, but I can communicate with the team who manages it.
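A hard cutoff right at 4GB (2^32 bytes) usually points to a 32-bit file-size limit somewhere in the path, such as the NAS share being mounted over NFSv2 or a cp binary built without large-file support, rather than the NAS hardware itself. A few checks worth trying from the server side (the mount point and file names below are placeholders):

    # Maximum file size the mount point can address (POSIX getconf);
    # a value of 32 here would explain a hard 4GB ceiling.
    getconf FILESIZEBITS /nas/incoming

    # How is the share mounted? NFSv2 cannot address files over 4GB,
    # so look for vers=2 in the mount options (nfsstat -m on many systems).
    mount | grep nas
    nfsstat -m

    # Sanity test with a large-file-aware copy instead of cp:
    dd if=/nas/incoming/bigfile of=/nas/incoming/bigfile.copy bs=1048576

If FILESIZEBITS comes back as 32, or the mount shows vers=2, the fix is on the mount side (remounting with NFSv3 or later), which would be something to raise with the team managing the NAS.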
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Copying large files in a bash script stops execution

Hello, I'm new to this forum and would like to first of all say hello to everyone. I've got a really annoying problem at the moment. I'm trying to rsync some files (about 200MB, with one file of 120MB) from a Raspberry Pi running Raspbian to a Debian server via rsync. This procedure is stored in a... (3 Replies)
Discussion started by: wex_storm

2. Shell Programming and Scripting

Copying numbers by looking up a large file

Hi All, I have a big file which looks like this: abc 34.32, cdf 343.45, computer 1.34, ladder 2.3422. I have some 100,000 .TXT files which look like this: computer, cdf, align. I have to open each of the text files and read the words from the text files. Then I have to look into that... (2 Replies)
Discussion started by: shoaibjameel123

3. SCO

Need advice: Copying large CSV report files off SCO system

I have a SCO Unix server from 1999 running SCO 5.0.5 and some ancient accounting software called Real World. A report writer program on the system is used to generate CSV files from accounting, which we write with DOSCOPY commands to 3.5" floppies. In the next 60 days we will be decommissioning... (11 Replies)
Discussion started by: magnetman

4. Shell Programming and Scripting

Start copying a large file while it's still being restored from tape

Hello, I need to copy a 700GB tape-image file over a network. I want to start the copy process before the tape image has finished being restored from the tape. The tape restore speed is about 78 Mbps and the file transfer speed over the network is about 45 Mbps. I don't want to use a pipe, since... (7 Replies)
Discussion started by: swamik

5. Solaris

How to safely copy full filesystems with large files (10GB files)

Hello everyone. I need some help copying a filesystem. The situation is this: I have an Oracle DB mounted on /u01 and need to copy it to /u02. /u01 is 500GB and /u02 is 300GB. The space used on /u01 is 187GB. This is running on Solaris 9 and both filesystems are UFS. I have tried to do it using:... (14 Replies)
Discussion started by: dragonov7

6. UNIX for Dummies Questions & Answers

Copying a Large File

I have a large file that I append entries to the end of every few seconds. It's grown to >150MB. It's basically a log file that a Perl script is writing to. I need to make a copy of it to a new directory. I realize the latest entries occurring while the copy is taking place will not be recorded... (1 Reply)
Discussion started by: lforum

7. UNIX for Advanced & Expert Users

copying of files by userB, dir & files owned by userA

I am userB and have a dir /temp1. This dir is owned by me. How do I recursively copy files from another user's dir, userA? I need to preserve the original user who created the files, the original group information, the original create date, mod date, etc. I tried cp -pr /home/userA/* . ... (2 Replies)
Discussion started by: Hangman2

8. UNIX for Dummies Questions & Answers

Copying large file problem on SVR4 Unix

We have 3 Unix servers all running SVR4 Unix 1.4. I have no problems copying files to and from 2 of the servers using either the rcp command or ftp, but when I come to transfer large files to the third server the copy gives up part way through and crashes that server. Copying smaller files using RCP... (7 Replies)
Discussion started by: coatesd

9. Filesystems, Disks and Memory

Strange difference in file size when copying LARGE file..

Hi, I'm trying to take a database backup. One of the files is 26GB. I am using cp -pr to create a backup copy of the database. After the copying is complete, if I do du -hrs on the folders I see a difference of 2GB. The weird fact is that the BACKUP folder was 2GB more than the original one! ... (1 Reply)
Discussion started by: 0ktalmagik

10. UNIX for Dummies Questions & Answers

copying a large filesystem

Hi there. In my organisation we have a Solaris network with /home being automounted from /export/home on a central file server (usual stuff); however, the guy who originally set this up only allocated 3GB to /export/home and now we are really struggling for space. I have a new 18GB disk installed... (3 Replies)
Discussion started by: hcclnoodles