I am trying to back up databases in my company, which are running Oracle 10g on AIX. From the start they have not used archive logs to back up the DB. Instead they use the expdp utility to export the data into a dump file, rename the file extension to .tmp, and FTP it to another machine to compress the file. (Don't ask why they change the extension to .tmp.)
I am trying to shorten the process by compressing the file before sending it to storage (NAS). But when I compress large database dump files (Oracle dumps, using the zip and gzip commands) I get the error below. The file size is more than 20 GB.
zip error: Entry too big to split, read, or write (file exceeds Zip's 4GB uncompressed size limit)
I have read that increasing the ulimit will help with this issue, but my machine's ulimit values are:
Could you direct the export/dump to a named pipe instead? Create one with:-
After this exists, start the following (in the background):-
You may well be able to substitute compress with gzip or other tools. You will want the flags that read and write standard input/output. I think it's -c for gzip, but I haven't got it here, so can't be sure.
After this is running, start the export/dump/whatever.
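As a rough sketch of the above, with dd standing in for the actual export (the real exp flags depend on your setup, so everything below is illustrative) and gzip -c as the compressor:

```shell
# Create the named pipe the export will write into
mkfifo /tmp/exp_pipe.dmp

# Start the compressor in the background, reading the pipe
gzip -c < /tmp/exp_pipe.dmp > /tmp/exp.dmp.gz &

# In real use this is where you run exp/expdp with file=/tmp/exp_pipe.dmp;
# here dd writes 64 KB of zeros into the pipe as a stand-in
dd if=/dev/zero of=/tmp/exp_pipe.dmp bs=1024 count=64 2>/dev/null

wait                       # let the background gzip drain the pipe and exit
rm /tmp/exp_pipe.dmp       # the pipe is no longer needed
```

The dump never touches disk uncompressed, so the 4 GB zip limit and the intermediate-file space problem both disappear.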
Thanks for the reply. Since the setup is very old, I don't want to change much, as I'm not sure what the AIX configuration is or what patches (workaround scripts) they have applied. That's why I don't want to use a pipe. It would be great if I could use gzip or zip instead.
I'm not sure why patch levels would affect this. We've been running it on AIX 4.3.3 (oh, the shame) for ages and have migrated databases across the network from AIX 5.1 to RHEL 6.5 (export to a pipe, dd across the network to a pipe, import from a pipe). It's just standard stuff.
The process also removes the need to store a huge intermediate file and then have the space to compress it into, along with the time saving of doing it all in one pass. On the migration to RHEL, we saved 23 hours over a process that had been taking about 40.
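That migration can be simulated locally with two pipes and a dd between them (in the real thing the middle dd ran over the network, and exp/imp were the data source and sink; all names below are stand-ins):

```shell
# Two named pipes: one on the "source", one on the "target"
mkfifo /tmp/src_pipe /tmp/dst_pipe

# "Import" side: read the target pipe into a file (stands in for imp)
cat /tmp/dst_pipe > /tmp/received.dat &

# "Network" leg: dd from the source pipe to the target pipe
# (in the real migration this dd ran across the network)
dd if=/tmp/src_pipe of=/tmp/dst_pipe bs=64k 2>/dev/null &

# "Export" side: write data into the source pipe (stands in for exp)
dd if=/dev/urandom of=/tmp/data.dat bs=1024 count=128 2>/dev/null
cat /tmp/data.dat > /tmp/src_pipe

wait
cmp /tmp/data.dat /tmp/received.dat && echo "transfer OK"
rm /tmp/src_pipe /tmp/dst_pipe
```

Nothing is staged on disk between the two ends; the data streams straight through.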
If you really want to stick with a process similar to what you have and can't even upgrade gzip, perhaps you could split the file into smaller chunks, compress each, then concatenate them on the other side. On the source server:-
Then transfer all the exp.dat.split*zip files to the target server and:-
..... and load your data in the normal way.
You may need to experiment with the values of -a and -b for split so you get a manageable number of files of a suitable size to still allow gzip to work. Have a read of the split manual pages for more information.
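A rough end-to-end sketch of that approach, using a small generated file in place of the real dump and gzip for the per-chunk compression (a zip-based variant would work the same way; sizes and suffix lengths are just examples):

```shell
cd /tmp
# Stand-in for the real export dump (256 KB of random data)
dd if=/dev/urandom of=exp.dat bs=1024 count=256 2>/dev/null

# Split into 100 KB chunks with 3-character suffixes, then compress each
split -b 100k -a 3 exp.dat exp.dat.split
for f in exp.dat.split*; do gzip "$f"; done

# --- transfer the exp.dat.split*.gz files to the target server, then: ---
for f in exp.dat.split*.gz; do gunzip "$f"; done
cat exp.dat.split* > exp.dat.rebuilt   # suffixes sort in the right order
```

Each chunk stays well under any compressor's size limit, and cat reassembles them because split's suffixes sort lexicographically.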
Thanks, Robin and Bakunin, for helping me out. I will upgrade zip and gzip and see how it goes.
Robin, as the overall setup is not stable, if some issue flares up they will think it's the new setup (trying to save myself while fixing things). Are there any known issues with splitting and then concatenating on another machine? This sounds interesting as well.