Operating Systems / AIX: Tar extraction error in AIX 5.3
Post 302520275 by bakunin, Friday 6th of May 2011, 09:42:27 AM
Have you googled for the error message? In case you haven't:

from this link:
Quote:
Solution: The issue was resolved by using the slibclean command. According to the manual,

Quote:
The slibclean command unloads all object files with load and use counts of 0. It can also be used to remove object files that are no longer required from the shared library and kernel text regions.
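
In practice that usually boils down to running slibclean as root and then checking whether the old copy is still loaded. A rough sketch (the grep pattern "thefile" is just a placeholder, use your own file name):

Code:
# unload all shared objects with load and use counts of 0 (needs root)
slibclean
# genkld should list the object files still loaded in the shared library segment
genkld | grep thefile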

...or this message from comp.unix.aix:

Quote:
The error is caused by the fact that an instance of /xlt/thefile
is running. You should be able to see it with ps aux. Since the
operating system uses demand paging, not all of the pages of a program
are necessarily mapped into memory when it starts. This means that
the process may request more pages of that program to be mapped into
memory at any time. So the operating system doesn't let you remove
the file.
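
So before you delete or overwrite the file you could check whether some process still has it loaded; something along these lines should work (the file name is the one from the quoted message, adjust it to yours):

Code:
# any process still executing the file?
ps aux | grep '[t]hefile'
# fuser prints the PIDs that currently have the file open or mapped
fuser /xlt/thefile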
You may also test the tar file by issuing:

Code:
tar -tvf APP.tar

and see if this works or produces the same error. You might also compare its output to the files you have on your system. If you have tarred the files with absolute paths, they will be untarred to those absolute paths too, not into whatever directory happens to be your $PWD.
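
If the listing indeed shows member names with a leading "/" and you want them restored relative to your current directory instead, one way is to let pax rewrite the names on extraction. This is only a sketch, reusing the APP.tar name from above:

Code:
# members starting with "/" will be restored to absolute paths by tar
tar -tf APP.tar | head
# pax can rewrite member names on the fly; here the leading "/" is stripped
pax -r -v -s ',^/,,' -f APP.tar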

TIP: if you want the output paged, pipe it to either "more" or "pg".
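
For the listing above that would be, for example:

Code:
tar -tvf APP.tar | pg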

I hope this helps.

bakunin
 
