What is the medium usually used to backup large trees?
Post 302750201 by jim mcnamara, 12-31-2012 06:12 AM (Operating Systems / Linux / Slackware)
If you mean directory trees, tape. Not disk.
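For large directory trees, tar (or cpio/pax) writing straight to the tape device is the usual mechanism. A minimal sketch, assuming a Linux SCSI tape drive at /dev/st0 and a placeholder source path (device names and mt syntax vary by platform):

    # write the tree to tape
    $ tar -cvf /dev/st0 /export/projects
    # rewind, then list the archive to confirm it reads back
    $ mt -f /dev/st0 rewind
    $ tar -tvf /dev/st0 | tail
    # restore later into the current directory
    $ mt -f /dev/st0 rewind
    $ tar -xvf /dev/st0

For regular unattended runs, a scheduler such as Amanda (see the amfetchdump man page further down) or plain cron in front of tar is the more common setup than running tar by hand.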
 

6 More Discussions You Might Find Interesting

1. AIX

Backup single large file

Hi, I have a single large file (11 GB) that I need to copy/backup to tape, then restore on another system. I tried tar, but it complained that the file was too large. Does anyone have suggestions for how I can do this with AIX 5.2? Much appreciated. (3 Replies)
Discussion started by: Alvescot
3 Replies
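One workaround for the classic tar file-size limit on older AIX, sketched here on the assumption that only this one file needs to go to tape (the tape device is typically /dev/rmt0 on AIX; the file path below is a placeholder): skip the archiver and stream the file with dd.

    # write the 11 GB file directly to tape
    $ dd if=/data/bigfile.dmp of=/dev/rmt0 bs=256k
    # on the target system, read it back the same way
    $ dd if=/dev/rmt0 of=/data/bigfile.dmp bs=256k

GNU tar, which handles files larger than the old ustar limit, is an alternative if file name, ownership, and permissions need to travel with the data.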

2. Shell Programming and Scripting

CGI, Perl and Trees

I have been trying to get this for weeks now, but maybe someone knows or has a snippet of code to display a collapsible tree view, something like this: +Yahoo! -/site.html -/site2.html +Google -/site.php -/site2.php (1 Reply)
Discussion started by: Dabheeruz
1 Replies

3. Programming

2-4 trees in C

I am trying to write a program in order to learn how to work with trees, especially 2-4 trees. The general idea is that each node is represented by 4 cells and 5 pointers (maybe 2 arrays, then?). Let's suppose that we simply insert int numbers into all cells. First we initialize the root... (2 Replies)
Discussion started by: bashuser2
2 Replies

4. Solaris

Medium Changer not detected.

Hello Gurus, we are in the process of configuring SAN-based backup for our DB hosted on Solaris 10 (SPARC and x86) servers, but the robotic arm (HP medium changer) is not being detected on the server. We need the experts' help in checking this from the host (Solaris server) end. Thank you. (0 Replies)
Discussion started by: EmbedUX
0 Replies
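A first pass from the Solaris host side, sketched with stock commands (controller numbers and output will differ on your system): rebuild the device tree, then look for the changer among the attachment points.

    # clean up stale entries and rebuild /dev and /devices
    $ devfsadm -Cv
    # list attachment points; a detected changer normally shows a media-changer type
    $ cfgadm -al

If the robotic arm is zoned and cabled correctly but still absent, the sgen driver configuration (sgen.conf) is a common suspect, since Solaris relies on it to create the /dev/scsi/changer device nodes.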

5. Shell Programming and Scripting

rsync backup mode (--backup): are there any options to remove backup folders on successful deployment?

Hi everyone, we are running rsync in --backup mode. Are there any rsync options to remove the backup folders after a successful deployment? Thanks in advance. (0 Replies)
Discussion started by: MVEERA
0 Replies
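As far as I know, rsync has no option that deletes its own backup directory afterwards, so the usual pattern is to point --backup-dir at a known location and remove it only when rsync exits with status 0. A sketch, with the paths below as placeholders:

    $ rsync -a --backup --backup-dir=/backup/app-prev build/ /srv/app/ \
          && rm -rf /backup/app-prev

Whether the backups should really be discarded immediately is a policy call; many deployments keep the previous --backup-dir around until the next successful run.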

6. Shell Programming and Scripting

How to copy very large directory trees

I have constant trouble with XCOPY /S for multi-gigabyte transfers. I need a utility like XCOPY /S that remembers where it left off if I reboot. Is there such a utility? How about a free utility (free as in free beer)? How about an md5sum sanity check too? I posted the above query in another... (3 Replies)
Discussion started by: siegfried
3 Replies
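On the Unix side, rsync already behaves like a restartable XCOPY /S: re-running the same command after a crash or reboot transfers only what is missing or changed. A sketch, with placeholder source and destination paths:

    # copy the tree; safe to re-run after an interruption
    $ rsync -a --partial --progress /data/src/ /mnt/dst/
    # md5sum sanity check over both trees
    $ (cd /data/src && find . -type f -exec md5sum {} + | sort -k 2) > /tmp/src.md5
    $ (cd /mnt/dst  && find . -type f -exec md5sum {} + | sort -k 2) > /tmp/dst.md5
    $ diff /tmp/src.md5 /tmp/dst.md5 && echo "trees match"

Running rsync again with -c (checksum mode) and --dry-run is another way to verify, at the cost of reading every file on both ends.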
AMFETCHDUMP(8)              System Administration Commands              AMFETCHDUMP(8)

NAME
       amfetchdump - extract backup images from multiple Amanda tapes

SYNOPSIS
       amfetchdump [-c|-C|-l] [-p|-n] [-a] [-O directory] [-d device] [-h]
       [--header-file filename] [--header-fd fd] [-o configoption...] config
       hostname [disk [ date [ level [ hostname [...] ] ] ]]

DESCRIPTION
       Amfetchdump pulls one or more matching dumps from tape or from the holding
       disk, handling the reassembly of multi-tape split dump files as well as any
       tape autochanger operations.

       It will automatically use the Amanda catalog to locate available dumps on
       tape, in the same way that the find feature of amadmin(8) lists available
       dumps. The hostname, diskname, datestamp, and level dump specifications are
       further described in amanda-match(7). Note that at minimum a hostname must
       be specified.

       Unless -p is used, backup images are extracted to files in the current
       directory named:

              hostname.diskname.datestamp.dumplevel

       If a changer error occurs, or the -d option is given, then amfetchdump
       prompts for each required volume.

OPTIONS
       -p     Pipe exactly one complete dump file to stdout, instead of writing
              the file to disk. This will restore only the first matching dumpfile
              (where "first" is determined by the dump log search facility).

       -h     Output the amanda header as a 32K block to the same output as the
              image.

       --header-fd fd
              Output the amanda header to the numbered file descriptor.

       --header-file filename
              Output the amanda header to the filename.

       -d device_or_changer
              Restore from this device or changer instead of the default,
              prompting for each volume.

       -O directory
              Output restored files to this directory, instead of to the current
              working directory.

       -c     Compress output, fastest method available.

       -C     Compress output, smallest file size method available.

       -l     Leave dumps in the compressed/uncompressed state in which they were
              found on tape. By default, amfetchdump will automatically uncompress
              when restoring.

       -a     Assume that all tapes are already available, via tape changer or
              otherwise, instead of prompting the operator to ensure that all
              tapes are loaded.

       -n     Do not reassemble split dump files at all, just restore each piece
              as an individual file.

       -o configoption
              See the "CONFIGURATION OVERRIDE" section in amanda(8).

EXAMPLES
       All the examples here assume your configuration is called SetA.

       Here's a simple case, restoring all known dumps of the host vanya to the
       current working directory.

              $ amfetchdump SetA vanya

       A more likely scenario involves restoring a particular dump from a
       particular date. We'll pipe this one to GNU-tar as well, to automatically
       extract the dump.

              $ amfetchdump -p SetA vanya /home 20051020 | gtar -xvpf -

CAVEATS
       Amfetchdump is dependent on accessing your server's config, tape changer,
       and (normally) dump logs. As such, it's not necessarily the most useful
       tool when those have all been wiped out and you desperately need to pull
       things from your tape. Pains have been taken to make it as capable as
       possible, but for seriously minimalist restores, look to amrestore(8) or
       dd(8) instead.

SEE ALSO
       amanda(8), amanda-match(7), amadmin(8), amrestore(8)

       The Amanda Wiki: http://wiki.zmanda.com/

AUTHORS
       John Stange <building@nap.edu>
           National Academies Press

       Ian Turner <ian@zmanda.com>
           Zmanda, Inc. (http://www.zmanda.com)

Amanda 3.3.1                         02/21/2012                         AMFETCHDUMP(8)
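Tying the man page back to the original question: once a tree has been dumped with Amanda, the documented -O and -d options pull it back off tape into a chosen directory. A sketch only; the configuration SetA, disk /home, and date come from the examples above, while the tape device is a placeholder:

    # restore one disk's dump into /restore, reading from an explicitly named device
    $ amfetchdump -O /restore -d /dev/nst0 SetA vanya /home 20051020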