Shell Programming and Scripting: Want to extract certain lines from big file
Post 302965126 by mad man on Sunday 24th of January 2016 04:55:15 AM
Hi Don,

Sorry for the inconvenience.

The code you posted last is not working for me. Here is how I used it:

Code:
 
big_file='/tmp/remedixz.20160120_085021_41222370_1'
trannum="/tmp/transnum"
file_new="${big_file}_23962395676"
awk -F '~' '
FNR == NR {
	t[$1]
	tc = FNR
	next
}
{
	l[++lc] = $0
}
$1 == "%%YEDTRN" && $2 in t {
	remove t[transnum = $2]
	tc--
}
$1 == "0000EOT" {

	if(transnum) {
		for(i = 1; i <= lc; i++)
			print l[i] > ("$file_new:" transnum)
		close("$file_new:" transnum)
		printf("Transaction #%s extracted to file $file_new:%s\n", transnum,
		    transnum)
	}
	if(tc) {
		lc = 0
		transnum = ""
	} else {
		exit
	}
}' $trannum $big_file
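
For reference, two things in the paste above would keep it from running as posted: awk has no "remove" statement (array entries are dropped with "delete"), and $file_new sits inside the single-quoted awk program, where the shell never expands it, so any output would go to files literally named "$file_new:<number>". Below is a minimal corrected sketch of the same logic, passing the shell variable in with -v; the "file:transaction" output naming simply follows the paste above and is an assumption about the intended names.

Code:
 
big_file='/tmp/remedixz.20160120_085021_41222370_1'
trannum='/tmp/transnum'
file_new="${big_file}_23962395676"

awk -F '~' -v file_new="$file_new" '
FNR == NR {                     # 1st file: list of wanted transaction numbers
	t[$1]
	tc = FNR
	next
}
{	l[++lc] = $0 }          # buffer every line of the current transaction
$1 == "%%YEDTRN" && $2 in t {   # header line of a wanted transaction
	delete t[transnum = $2] # "delete", not "remove"
	tc--
}
$1 == "0000EOT" {               # end-of-transaction marker
	if(transnum) {
		for(i = 1; i <= lc; i++)
			print l[i] > (file_new ":" transnum)
		close(file_new ":" transnum)
		printf("Transaction #%s extracted to file %s:%s\n",
		    transnum, file_new, transnum)
	}
	if(tc) {                # more wanted transactions remain: reset buffer
		lc = 0
		transnum = ""
	} else {                # everything wanted has been written
		exit
	}
}' "$trannum" "$big_file"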

---------- Post updated at 03:25 PM ---------- Previous update was at 03:13 PM ----------

Hi Don,

The sed you modified and posted did not throw any error like it did last time, but the output file is empty.

Thanks
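
Since the modified sed itself isn't quoted here, one general check when a script finishes cleanly but writes nothing is to confirm that the transaction numbers in /tmp/transnum actually occur in the big file, and that neither file carries DOS carriage returns, which silently break exact matches. For example (file names taken from the post above; adjust as needed):

Code:
 
# Does the transnum file end each line with \r\n (DOS) instead of \n?
head -1 /tmp/transnum | od -c

# Are the header lines and the wanted transaction numbers really present?
grep -c '%%YEDTRN' /tmp/remedixz.20160120_085021_41222370_1
grep -f /tmp/transnum /tmp/remedixz.20160120_085021_41222370_1 | head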
 
