Hi,
I have a process that duplicates files for different environments. As the files arrive, my Korn shell script makes copies of them (giving each copy a unique name) and then renames the original file so that my process won't be triggered again.
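Roughly, the duplicate-and-rename step looks like this (paths, suffixes, and the naming scheme below are simplified placeholders, not my actual script):

```shell
#!/bin/ksh
# Simplified sketch of the copy/rename flow (real paths and names differ).
INDIR=/tmp/demo_in                       # placeholder for the NAS drop directory
mkdir -p "$INDIR"
echo "refresh data" > "$INDIR/refresh.dat"   # stand-in for an arriving file

for f in "$INDIR"/*.dat; do
    ts=$(date +%Y%m%d%H%M%S)
    cp "$f" "${f}.${ts}.copy"            # duplicate with a unique name
    mv "$f" "${f}.done"                  # rename original so it isn't picked up again
done
ls "$INDIR"
```

The rename is what keeps the watcher from reprocessing the same file, since the glob only matches the original extension.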
I don't like it either, but it's what we were told to do.
The problem is that some of these files are DB refresh files, and some of them have grown very large (over 4 GB) and are now failing to be duplicated.
My script simply uses the UNIX cp command to make duplicate copies of the files. The files in question sit on a NAS (network attached storage device), while my script runs on our UNIX DB server. The NAS is fully capable of handling files that exceed 4 GB, but for some reason the cp command is failing.
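In case it helps to reproduce this, a file just past the 4 GB mark can be created quickly as a sparse file (the path below is just an example; in practice it would point at the NAS mount):

```shell
# Create a sparse file one byte past 4 GiB by seeking before writing.
# A cp built without large-file support typically chokes somewhere
# around the 2 GiB / 4 GiB boundary on files like this.
BIG=/tmp/bigtest.dat
dd if=/dev/zero of="$BIG" bs=1 count=1 seek=4294967296 2>/dev/null
ls -l "$BIG"     # reported size should be 4294967297 bytes
```

Copying a file like this to and from the NAS mount should show whether the failure is in cp itself or on the NAS side.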
I have no control over the NAS, but I can communicate with the team who manages it.