a problem with large files


 
# 15  
Old 07-13-2010
Yeah, I'm on Solaris. I will check and get back to you. Thanks a lot, but could you please explain the command to me?
# 16  
Old 07-13-2010
Code:
awk 'NR==1{getline x<f}NR==x{print;getline x<f}' f=file file1 > file2

- We are reading all lines in "file1"
- If we are reading line 1 of "file1", then read the variable x from f (which is set to "file")
- If we are on line x in "file1", then print the line and read the next value of x from "file"
- Repeat until we reach the end of "file1"
- Output to file2
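For illustration, here is a small sample run; the file contents below are just made up for the example, assuming "file" holds ascending line numbers (2 and 4) and "file1" holds four data lines:
Code:
$ cat file        # line numbers to print, in ascending order
2
4
$ cat file1       # the data file
alpha
beta
gamma
delta
$ awk 'NR==1{getline x<f}NR==x{print;getline x<f}' f=file file1
beta
delta

Note that this one-liner relies on the line numbers in "file" being sorted in ascending order; a number smaller than the current line is never matched again, so the corresponding lines are skipped.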
# 17  
Old 07-14-2010
It worked, but it only printed the first line number from "file", which is line 16 in file1. Why is that? As far as I know, nawk must go through the whole file!

---------- Post updated at 02:43 PM ---------- Previous update was at 09:20 AM ----------

Quote:
Originally Posted by binlib
Code:
awk 'FNR==NR{n[$0];next}FNR in n'  file file1 > file2


It worked with nawk. I can't describe how thankful I am. Thanks a billion!
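For later readers, here is roughly how that one-liner works (same command, just annotated with shell comments):
Code:
# FNR resets to 1 for each input file, while NR keeps counting across files,
# so FNR==NR is true only while awk is reading the first file ("file").
#   n[$0]    -> store each wanted line number as a key of array n
#   next     -> skip the rest of the script while still reading "file"
#   FNR in n -> for "file1", the default action (print) fires for every line
#               whose line number within the file is one of the stored keys
awk 'FNR==NR{n[$0];next}FNR in n' file file1 > file2

Unlike the getline approach, this one does not need the line numbers in "file" to be sorted, since each line number is simply looked up in the array.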

---------- Post updated at 02:44 PM ---------- Previous update was at 02:43 PM ----------

Quote:
Originally Posted by Scrutinizer
Code:
awk 'NR==1{getline x<f}NR==x{print;getline x<f}' f=file file1 > file2

- We are reading all lines in "file1"
- If we are reading line 1 of "file1", then read the variable x from f (which is set to "file")
- If we are on line x in "file1", then print the line and read the next value of x from "file"
- Repeat until we reach the end of "file1"
- Output to file2

Really, thanks a lot for your help!