10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi Friends,
I have a file with sample amount data as follows:
-89990.3456
8788798.990000128
55109787.20
-12455558989.90876
I need to strip the '-' sign so that all values are treated as absolute values, and then I need to sum them up. The record count is around 1 million.
How... (8 Replies)
Discussion started by: Ravichander
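A minimal awk sketch for this kind of sum, assuming one numeric value per line in a hypothetical file named amounts.txt (awk works in double precision, so a very large total may lose a few trailing digits):

    awk '{ s += ($1 < 0 ? -$1 : $1) } END { printf "%.6f\n", s }' amounts.txt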
2. UNIX for Dummies Questions & Answers
Hi all,
I hope you are well. I am very happy to see your contributions and eager to become part of the community.
I have the following question. I have two huge files to compare (almost 3GB each). The files are simulation outputs. The format of the files is as below.
For a clear picture, please see... (9 Replies)
Discussion started by: kaaliakahn
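For files this big, a quick identity check with cmp avoids loading anything into memory; a sketch, with fileA and fileB standing in for the actual simulation output names:

    cmp -s fileA fileB && echo "files are identical" || echo "files differ"

If a line-by-line difference is needed, sorting both files and comparing them with comm is usually gentler on memory than diff at this size.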
3. Shell Programming and Scripting
I have a DB folder which is about 60 GB in size. It contains logs ranging from 500 MB to 1 GB. I have an installation which will update the DB. I need to back up this DB folder, just in case my installation FAILS. But I do not need the logs in my backup. How do I exclude them during compression (tar)?
... (2 Replies)
Discussion started by: DevendraG
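With GNU tar, --exclude patterns can skip the logs during compression; a sketch, where the path and the *.log pattern are assumptions about how the logs are named:

    tar -czf db_backup.tar.gz --exclude='*.log' /path/to/dbfolder

Some vendor tars lack --exclude but accept an exclude-list file (often via -X); check the local tar man page.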
4. AIX
Dear Guys,
Using the dd command or any other robust command, I'd like to copy huge data from one file system to another.
Source file system: /sfsapp
File system has 250 GB of data
Target File system: /tgtapp
I’d like to copy all these files and directories from /sfsapp to /tgtapp as... (28 Replies)
Discussion started by: Mr.AIX
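dd copies raw devices rather than a tree of files, so for a populated file system a tar pipe (or cp -Rp) is usually a better fit; a sketch using the paths from the post, preserving permissions on extraction:

    cd /sfsapp && tar -cf - . | ( cd /tgtapp && tar -xpf - )

It is worth confirming the target has at least 250 GB free first (e.g. df -g /tgtapp on AIX).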
5. Shell Programming and Scripting
Hi, all:
I've got two folders, say, "folder1" and "folder2".
Under each, there are thousands of files.
It's quite obvious that there are some files missing in each. I would just like to find them. I believe this can be done with the "diff" command.
However, if I change the above question a... (1 Reply)
Discussion started by: jiapei100
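diff can do this directly on directories; a sketch, using the folder names from the post, that lists only the files present on one side and not the other:

    diff -rq folder1 folder2 | grep '^Only in'

Dropping the grep also reports files that exist in both folders but differ in content.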
6. Shell Programming and Scripting
Hi,
I receive about 5,000 files per day on my system. Each of them looks like:
cat ABC.april24.dat
ABH00001990 01993 409009092 0909 INI iop 9033
AAB0000237893784 8430900 898383 AUS 34349089008 849843 9474822
AAA00003849893498098394 84834 348348439 -438939 IN
AAA00004438493893849384... (2 Replies)
Discussion started by: Prateek007
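The rest of the question is cut off above, but batch work over thousands of fixed-format files like these usually takes the shape of a shell loop with awk inside; a generic sketch, where the file pattern and the 11-character record-key width are pure assumptions drawn from the sample lines:

    for f in *.april24.dat; do
        awk '{ print FILENAME, substr($0, 1, 11) }' "$f"
    done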
7. High Performance Computing
We have one file (11 million lines) that is being matched against another file (10 billion lines).
The proof of concept we are trying is to join them on Unix.
All files are delimited and they have composite keys.
Could Unix be faster than Oracle in this regard?
Please advise. (1 Reply)
Discussion started by: magedfawzy
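On sorted input, join does this kind of key match in a single streaming merge pass, which is why it can compete with a database at this scale; a sketch, where the '|' delimiter, the file names, and a single pre-built key column are all assumptions (a composite key would first need to be concatenated into one field):

    sort -t'|' -k1,1 small.txt > small.sorted
    sort -t'|' -k1,1 huge.txt  > huge.sorted    # needs plenty of temp space; see sort -T
    join -t'|' -1 1 -2 1 small.sorted huge.sorted > matched.txt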
8. UNIX for Advanced & Expert Users
Hi, I need a fast way to delete duplicate entries from very huge files (>2 GB); these files are plain text.
I tried all the usual methods (awk / sort / uniq / sed / grep ...) but it always ended with the same result (a memory core dump).
I'm using large HP-UX servers.
Any advice will... (8 Replies)
Discussion started by: Klashxx
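sort performs an external, disk-backed merge sort, so pointing it at a large scratch directory usually gets around the memory limit; a sketch, where /bigtmp is an assumed temporary directory with enough free space:

    sort -T /bigtmp -u huge.txt > huge.dedup.txt

This reorders the lines. If the original order must be preserved, awk '!seen[$0]++' huge.txt keeps it, but it holds every unique line in memory, which is likely what caused the core dumps here.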
9. UNIX for Dummies Questions & Answers
Hi,
As per my requirement, I need to take the difference between two big files (around 6.5 GB) and write the difference to an output file without any line numbers or '<' or '>' in front of each new line.
As the diff command won't work for big files, I tried to use bdiff instead.
I am getting incorrect... (13 Replies)
Discussion started by: pyaranoid
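If the goal is just the differing lines with no markers, comm on sorted copies of the files is one route; a sketch, with fileA and fileB as stand-ins, that writes the lines present only in fileA (swap -23 for -13 to get the other direction):

    sort fileA > fileA.sorted
    sort fileB > fileB.sorted
    comm -23 fileA.sorted fileB.sorted > only_in_fileA.txt

Note that sorting discards the original line order and needs several GB of temporary space for files this size.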
10. Shell Programming and Scripting
Hi,
I have two files, File A and File B. File A is an error file and File B is the source file. In the error file, the first line is the actual error and the second line gives information about the record (client ID) that throws the error. I need to compare the first field (which doesn't start with '//') of... (11 Replies)
Discussion started by: kmkbuddy_1983
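The question is truncated above, but the usual awk idiom for comparing a field across two files is the NR==FNR two-pass pattern; a hypothetical sketch, where the field positions, the file roles, and the '//' comment filter are all assumptions:

    # load the client IDs from the source file, then print error-file records whose first field matches
    awk 'NR == FNR { ids[$1]; next } $1 !~ /^\/\// && ($1 in ids)' fileB fileA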