Operating Systems > Linux > File size limitation in Linux
Post 302930286 by jim mcnamara on 01-01-2015 at 05:24 PM
Consider: sparse files.

A sparse file that shows 10 MB of used space may take up vastly more space in the new file or tar archive when it is copied or tar-ed, because its holes get written out as real, zero-filled blocks. Those things are the bane of backups.

I am not saying this applies here, but it should be considered whenever a backup of nn GB will not fit in a backup file of nn GB plus a tiny amount.

Sparse files have "holes"; this article has nice diagrams:

Sparse file - Wikipedia, the free encyclopedia
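
A quick way to see the effect, as a minimal sketch assuming GNU dd, du, and tar (the file names here are made up):

    # Make a file with a 1 GiB hole: large apparent size, almost no blocks.
    dd if=/dev/zero of=sparse.img bs=1 count=0 seek=1G

    ls -lh sparse.img      # apparent size: 1.0G
    du -h sparse.img       # blocks actually allocated: ~0

    # Plain tar writes the hole out as real zero-filled blocks, so the
    # archive balloons to the full apparent size:
    tar -cf big.tar sparse.img
    du -h big.tar          # roughly 1.0G

    # GNU tar's -S/--sparse option detects the holes and stays small:
    tar -cSf small.tar sparse.img
    du -h small.tar        # tiny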

Also note that you can mount file systems in such a way as to "obscure" an underlying group of files. For example, suppose a directory tree /path/to/confusion contains 2 GB of files in total. If you then mount a file system on /path/to/confusion, all of the original files are still on disk, but they are no longer visible to some tools; du and df disagreeing is one symptom.

This may produce the same weird results being discussed. Again, you may want to consider it. Look in /etc/fstab (and the output of mount) for confirmation.
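
One way to check, sketched with hypothetical paths and device names:

    # Is anything mounted over the suspect directory?
    mount | grep /path/to/confusion
    grep confusion /etc/fstab

    # du only sees the file system currently mounted on top:
    du -sh /path/to/confusion

    # Bind-mount the parent file system elsewhere to peek underneath
    # the mount point; the hidden files reappear there:
    mkdir -p /mnt/peek
    mount --bind / /mnt/peek
    du -sh /mnt/peek/path/to/confusion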
 

10 More Discussions You Might Find Interesting

1. UNIX Desktop Questions & Answers

Size Limitation for a user directory

Hi all, I want to set a size limitation for some users in the system; for example, each user has only 5 MB of free space. A user cannot use more than 5 MB of space. Is it possible to do this? Thanks! (1 Reply)
Discussion started by: felix_koo
1 Replies

2. HP-UX

HP-UX 11i - File Size Limitation And Number Of Folders Limitation

Hi All, Can anyone please clarify the following questions for me: 1. Is there any file size limitation in HP-UX 11i, such that I can only create a file up to a certain size (say 2 GB) and no more than that? 2. At most, how many files can we keep inside a folder? 3. How many... (2 Replies)
Discussion started by: sundeep_mohanty
2 Replies

3. Shell Programming and Scripting

File size limitation of unix sort command.

Hi, I am trying to sort millions of delimited records, and I am unable to use the sort command beyond 60 million records. If I try to do so, I get a message stating "File size limit exceeded". Is there any file size limit for using the sort command? How can I solve this problem? Thanks ... (7 Replies)
Discussion started by: cskumar
7 Replies

4. Linux

File size limitation for rcp

Hi, I am trying to rcp a file from a Solaris box to Linux. When the file size is 2,205,255,047 bytes, the rcp fails with the message: Jan 10 01:11:53 hqsas167 rsh: pam_authenticate: error Authentication failed. However, when I rcp a file with a smaller size - 9,434,477 bytes - the rcp completes with... (2 Replies)
Discussion started by: schoubal
2 Replies

5. Shell Programming and Scripting

Size limitation in Tar command

Hi everybody, I am new to this forum and this is my first post. I am a new user of Unix; is there any size limitation on files when creating a tar file? Thanks in advance (4 Replies)
Discussion started by: Manvar Khan
4 Replies

6. Shell Programming and Scripting

fetchmail - log file size limitation

Hi, I am using fetchmail in my application to download mails from the mail server to the localhost where the application is hosted. Fetchmail is configured to run as a daemon, polling mails at an interval of 1 sec. My concern here is that during each 2 sec it is writing two... (10 Replies)
Discussion started by: DILEEP410
10 Replies

7. UNIX for Advanced & Expert Users

Find command -size option limitation ?

Hi All, I ran code in a test environment to find files larger than 1 TB; given below is a snippet from the code:

    FILE_SYSTEM=/home/arun
    MAX_FILE_LIMIT=1099511627776
    find $FILE_SYSTEM -type f -size +"$MAX_FILE_LIMIT"c -ls -xdev 2>/dev/null |
    while read fname
    do
        echo "File larger than... (3 Replies)
Discussion started by: Arunprasad
3 Replies

8. Solaris

How to extend 2 GB file size limitation

Hello All, I am using a SunOS machine. My application creates output files for the downstream systems. However, output files are restricted to 2 GB in size on SunOS, which forces me to create multiple files, and that is not supported by the downstream systems due to some limitations. Is... (5 Replies)
Discussion started by: pasupuleti81
5 Replies

9. UNIX for Advanced & Expert Users

size for sum variable limitation on awk

Hello. First, truth be told, I'm not even close to being an advanced user. I'm posting here because maybe my question is complicated enough to need your expert help. I need to use awk (or nawk - I don't have gawk) to validate some files by computing the total sum of a large numeric variable. It... (1 Reply)
Discussion started by: cwitarsa
1 Replies

10. Linux

File size limitation in the EST 2012 x86_64 GNU/Linux

Hello Friends, I tried to take a tar backup on my server, but it ended with an error:

    /home/back/pallava_backup/fbackup_backup/stape_config /home/back/romam_new.tar.gz
    tar: /home/backup/back.tar.gz: Cannot write: No space left on device
    tar: Error is not recoverable: exiting... (10 Replies)
Discussion started by: siva3492
10 Replies