Shell Programming and Scripting: Start copying large file while it's still being restored from tape
Posted by swamik on 07-15-2011, 02:46 AM
Thanks - yes I was also thinking of this. Much simpler!
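The suggestion swamik is agreeing with isn't quoted above, but a common way to start copying a file that is still being written is to stream it from the beginning and stop once the writer finishes. A minimal sketch, assuming GNU tail and pgrep are available and that the restore process is literally named "restore" (all of these are assumptions):

    # Stream /restore/bigfile from byte 1 and keep following new data.
    tail -c +1 -f /restore/bigfile > /target/bigfile &
    tailpid=$!

    # Wait until the tape restore process has exited.
    while pgrep -x restore >/dev/null; do
        sleep 60
    done

    sleep 60        # give tail time to drain the final blocks
    kill "$tailpid"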
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Copying a large filesystem

Hi there. In my organisation we have a Solaris network with /home being automounted from /export/home on a central file server (usual stuff). However, the guy who originally set this up only allocated 3GB to /export/home and now we are really struggling for space. I have a new 18GB disk installed... (3 Replies)
Discussion started by: hcclnoodles
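For a migration like this, one common approach is a tar pipe, which preserves ownership and permissions. A sketch, assuming the new 18GB disk is already mounted at a hypothetical /mnt/newhome:

    cd /export/home
    tar cf - . | (cd /mnt/newhome && tar xpf -)

After verifying the copy, the automount map can be pointed at the new location.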

2. Filesystems, Disks and Memory

Strange difference in file size when copying LARGE file..

Hi, I'm trying to take a database backup. One of the files is 26GB. I am using cp -pr to create a backup copy of the database. After the copying is complete, if I do du -hrs on the folders I see a difference of 2GB. The weird fact is that the BACKUP folder was 2GB more than the original one! ... (1 Reply)
Discussion started by: 0ktalmagik
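A likely explanation is sparse files: cp fills in the holes, and du reports allocated blocks rather than logical size, so the copy can occupy more disk even when the byte counts match. One way to check, with hypothetical paths:

    ls -l /db/datafile /backup/datafile    # logical sizes: should be identical
    du -k /db/datafile /backup/datafile    # allocated blocks: differ if holes were filled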

3. UNIX for Dummies Questions & Answers

Writing large files to tape

I have a zipped file that is ~ 10GB. I tried tarring it off to a tape, but I receive: tar: <filename> too large to archive. Use E function modifier. The file is stored on a UFS mount, so I was unable to use ufsdump. What other options do I have? (I don't have a local file system large... (3 Replies)
Discussion started by: FredSmith
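The E function modifier the error message mentions switches Solaris tar to its extended format, which lifts the 8GB per-file limit of the traditional format. A minimal example, assuming the tape drive is /dev/rmt/0:

    tar cEvf /dev/rmt/0 bigfile.zip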

4. AIX

Copying to tape drive throws error

Hi All, I am trying to copy files from a partition on server 2 that is mounted on server 1, since the tape drive is connected to server 1. I ran the below command to copy files within a partition: svr01:root:/sunfileserver> tar -cvf * a <foldername>/<filename>/<filename> a... (4 Replies)
Discussion started by: vathsan
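One thing to note about that command: the unquoted * expands before tar runs, so the first matching name in the directory becomes the archive file, which is rarely what was intended. If the goal is to write to the tape drive on an AIX box, the shape is more like this (device name assumed, placeholders kept from the post):

    tar -cvf /dev/rmt0 <foldername>/<filename>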

5. UNIX for Dummies Questions & Answers

Copying large file problem on SVR4 Unix

We have 3 Unix servers all running SVR4 Unix 1.4. I have no problems copying files to and from 2 of the servers using either the rcp command or ftp, but when I come to transfer large files to the third server the copy gives up part way through and crashes that server. Copying smaller files using RCP... (7 Replies)
Discussion started by: coatesd
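When a single large transfer reliably takes a host down, one workaround while the real cause is investigated is to split the file, copy the pieces, and reassemble them on the far side. A sketch with hypothetical names, assuming a split that accepts -b with an m suffix:

    split -b 100m bigfile bigfile.part.
    for p in bigfile.part.*; do
        rcp "$p" svr3:/restore/
    done
    # then, on svr3:
    cat bigfile.part.* > bigfile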

6. Shell Programming and Scripting

Copying of large files fails

Hi, I have a process which duplicates files for different environments. As the files arrive, my script (korn shell) makes copies of them (giving a unique name) and then renames the original file so that my process won't get triggered again. I don't like it either, but it's what we were told to... (4 Replies)
Discussion started by: GoldenEye4ever
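A common guard against grabbing a file that is still arriving is to wait until its size stops changing before copying it. A sketch in the Korn shell the poster mentions, with hypothetical paths:

    f=/inbound/datafile
    s1=$(wc -c < "$f")
    sleep 10
    s2=$(wc -c < "$f")
    if [ "$s1" -eq "$s2" ]; then
        # size is stable: copy under a unique name, then rename the original
        cp "$f" /work/copy.$$ && mv "$f" "$f.done"
    fi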

7. UNIX for Dummies Questions & Answers

Copying a Large File

I have a large file that I append entries to the end of every few seconds. It's grown to >150MB. It's basically a log file but a perl script is writing to it. I need to make a copy of it to a new directory. I realize the latest entries occurring while the copy is taking place will not be recorded... (1 Reply)
Discussion started by: lforum
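If it matters that the copy ends at a known point, one approach is to capture the size first and copy exactly that many bytes, so anything appended afterwards is cleanly excluded. A sketch, assuming a head that accepts -c (GNU or BSD) and hypothetical paths:

    n=$(wc -c < /var/log/app.log)
    head -c "$n" /var/log/app.log > /archive/app.log.snapshot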

8. Shell Programming and Scripting

Copying numbers by looking them up in a large file

Hi All, I have a big file which looks like this:

abc 34.32
cdf 343.45
computer 1.34
ladder 2.3422

I have some 100000 .TXT files which look like this:

computer
cdf
align

I have to open each of the text files and read the words from the text files. Then I have to look into that... (2 Replies)
Discussion started by: shoaibjameel123
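For that many lookups, loading the big file into memory once and then streaming the word files through is much faster than re-reading it per file. An awk sketch, filenames hypothetical:

    # First file: build the word -> number table. Remaining files: print matches.
    awk 'NR == FNR { val[$1] = $2; next }
         $1 in val { print FILENAME, $1, val[$1] }' bigfile words/*.TXT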

9. UNIX for Dummies Questions & Answers

Copying tape-to-tape on UNIX

I am using a 4mm tape to backup my Unix system. However, I wanted to make a copy of all of the files and archive headers (or just the archive headers if that's possible) created on one of my tapes to another 4mm tape. I only have one tape drive. Is there a command that will complete such a task? ... (1 Reply)
Discussion started by: acoco
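With a single drive, the usual trick is to stage the tape to disk, swap tapes, and write the image back out. A sketch for a tape holding a single archive, with device name and block size assumed:

    dd if=/dev/rmt/0 of=/tmp/tape.img bs=32k    # read the source tape to disk
    # swap tapes, then:
    dd if=/tmp/tape.img of=/dev/rmt/0 bs=32k    # write the copy

A multi-file tape would need one dd per tape file against the non-rewinding device.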

10. Shell Programming and Scripting

Copying large files in a bash script stops execution

Hello, I'm new to this forum and would like to first of all say hello to everyone. I've got a really annoying problem at the moment. I'm trying to rsync some files (about 200MB, with one file of 120MB) from a Raspberry Pi running Raspbian to a Debian server. This procedure is stored in a... (3 Replies)
Discussion started by: wex_storm
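When one long transfer can stall an unattended script, a bounded retry loop using rsync's own timeout keeps the rest of the script moving. A sketch, host and paths hypothetical:

    for try in 1 2 3; do
        rsync -a --partial --timeout=300 /data/ user@server:/backup/ && break
        echo "rsync attempt $try failed, retrying" >&2
        sleep 30
    done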
Perl::Critic::Policy::Subroutines::ProhibitExcessComplexity(3)    User Contributed Perl Documentation

NAME
    Perl::Critic::Policy::Subroutines::ProhibitExcessComplexity - Minimize complexity by factoring code into smaller subroutines.

AFFILIATION
    This Policy is part of the core Perl::Critic distribution.

DESCRIPTION
    All else being equal, complicated code is more error-prone and more expensive to maintain than simpler code. The first step towards managing complexity is to establish formal complexity metrics. One such metric is the McCabe score, which describes the number of possible paths through a subroutine. This Policy approximates the McCabe score by summing the number of conditional statements and operators within a subroutine. Research has shown that a McCabe score higher than 20 is a sign of high-risk, potentially untestable code. See <http://en.wikipedia.org/wiki/Cyclomatic_complexity> for some discussion of the McCabe number and other complexity metrics.

    The usual prescription for reducing complexity is to refactor code into smaller subroutines. Mark Dominus's book "Higher Order Perl" also describes callbacks, recursion, memoization, iterators, and other techniques that help create simple and extensible Perl code.

CONFIGURATION
    The maximum acceptable McCabe score can be set with the "max_mccabe" configuration item. Any subroutine with a McCabe score higher than this number will generate a policy violation. The default is 20. An example section for a .perlcriticrc:

        [Subroutines::ProhibitExcessComplexity]
        max_mccabe = 30

NOTES
    "Everything should be made as simple as possible, but no simpler." -- Albert Einstein

    Complexity is subjective, but formal complexity metrics are still incredibly valuable. Every problem has an inherent level of complexity, so it is not necessarily optimal to minimize the McCabe number. So don't get offended if your code triggers this Policy. Just consider whether there might be a simpler way to get the job done.

AUTHOR
    Jeffrey Ryan Thalhammer <jeff@imaginative-software.com>

COPYRIGHT
    Copyright (c) 2005-2011 Imaginative Software Systems. All rights reserved.

    This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself. The full text of this license can be found in the LICENSE file included with this module.