06-14-2006
May be useful to Oracle DBAs
---------------- Export Section
# Create new named pipes. (The FIFO type letter 'p' follows the path;
# mkfifo <path> is an equivalent, more portable form.)
mknod /dev/split_pipe p
mknod /dev/compress_pipe p # You can reuse an existing named pipe
                           # instead of creating a new one.
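A minimal sketch of the pipe-creation step, assuming a POSIX system. It uses `mkfifo` and paths under /tmp (creating files under /dev normally requires root; the /tmp paths here are illustrative only):

```shell
# Create the two FIFOs and confirm they really are pipes.
split_pipe=/tmp/split_pipe
compress_pipe=/tmp/compress_pipe

rm -f "$split_pipe" "$compress_pipe"
mkfifo "$split_pipe" "$compress_pipe"

# ls -l marks a named pipe with a leading 'p' in the mode column
ls -l "$split_pipe" "$compress_pipe"
```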
======================================================================
Create a shell script in a file named split_export.sh
======================================================================
# -b 1000m tells the split command to cut its input into 1000 MB pieces.
# As it splits, split appends the suffixes aa, ab, ac ... up to zz to the
# output file name prefix given as its second argument.
# The export file name prefix is expfile.
nohup split -b 1000m /dev/split_pipe /DumpDir/expfile &
nohup compress < /dev/compress_pipe > /dev/split_pipe &
exp username/password full=y file=/dev/compress_pipe # plus any other export parameters
=======================================================================
After saving the above three commands in split_export.sh, execute the following.
=======================================================================
chmod a+x split_export.sh
nohup ./split_export.sh > /tmp/split_export.log 2>&1 &
=======================================================================
After a few minutes you should see files in the export dump directory.
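The three-stage pipeline above can be rehearsed without a database. In this sketch, dd supplies random bytes in place of exp's dump stream, gzip stands in for compress (which is absent on many Linux systems), the chunk size is shrunk to 1 MB, and all paths are illustrative:

```shell
# Dry run of the export pipeline: data source -> compressor -> splitter.
dir=/tmp/pipe_demo
rm -rf "$dir"; mkdir -p "$dir"
mkfifo "$dir/split_pipe" "$dir/compress_pipe"

# 2 MB of random data stands in for the database export stream
dd if=/dev/urandom of="$dir/source.dat" bs=1024 count=2048 2>/dev/null

# stage 3: cut the compressed stream into 1 MB pieces expfileaa, expfileab, ...
split -b 1m "$dir/split_pipe" "$dir/expfile" &
# stage 2: compress whatever arrives on compress_pipe into split_pipe
gzip < "$dir/compress_pipe" > "$dir/split_pipe" &
# stage 1: the "exporter" writes its dump into the first pipe
cat "$dir/source.dat" > "$dir/compress_pipe"
wait

ls "$dir"/expfile*
```

Because random data is incompressible, the compressed stream is slightly over 2 MB and split produces at least expfileaa and expfileab, mirroring what exp produces at full scale.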
=======================================================================
------------------ IMPORT Section
======================================================================
Create a shell script with the following commands under the file name split_import.sh.
After creating it, give the script execute permission as follows:
======================================================================
chmod a+x split_import.sh
# This import example assumes the above export script created 2 split files
# called expfileaa and expfileab. The order of the files given to cat is very important.
nohup cat /DumpDir/expfileaa /DumpDir/expfileab > /dev/split_pipe &
# Sleep 3 seconds.
sleep 3
nohup uncompress < /dev/split_pipe > /dev/compress_pipe &
# The sleep at this point is very important, as some time is needed to uncompress
# the file and send it to the pipe.
sleep 60
imp username/password file=/dev/compress_pipe # plus any other import parameters
nohup ./split_import.sh > /tmp/split_import.log 2>&1 &
=======================================================================
Wait for the import to finish.
=======================================================================
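The import path can be rehearsed the same way. This sketch fabricates two compressed pieces, rejoins them with cat in aa, ab order (the ordering point noted above), and restores the original bytes. gunzip stands in for uncompress, a plain cat for imp, and all paths are illustrative:

```shell
# Dry run of the import pipeline: cat pieces -> uncompressor -> consumer.
dir=/tmp/imp_demo
rm -rf "$dir"; mkdir -p "$dir"
mkfifo "$dir/split_pipe" "$dir/compress_pipe"

# fabricate two pieces playing the role of expfileaa and expfileab
dd if=/dev/urandom of="$dir/source.dat" bs=1024 count=1536 2>/dev/null
gzip < "$dir/source.dat" | split -b 1m - "$dir/expfile"

# the glob expfile?? expands in aa, ab, ... order -- the order split wrote them
cat "$dir"/expfile?? > "$dir/split_pipe" &
gunzip < "$dir/split_pipe" > "$dir/compress_pipe" &
# the "importer" drains the second pipe; a real run would invoke imp here
cat "$dir/compress_pipe" > "$dir/restored.dat"
wait

cmp "$dir/source.dat" "$dir/restored.dat" && echo "round trip OK"
```

Note that no sleep calls are needed in this sketch: opening a FIFO blocks until both a reader and a writer are present, which synchronizes the three processes on its own.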