Full Discussion: how to read next records
Top Forums > Shell Programming and Scripting > how to read next records — Post 302339515 by ubeejani on Thursday 30th of July 2009, 05:19:41 PM
Question how to read next records

Hello friends,
I am a newbie in programming and am facing some problems in awk. Please help me.
I have a file with many data sets. Each data set is separated by an empty line. For example:
Col1 Col2 Col3 Col4 Col5
0.85 0.07 Fre 42:86 25
0.73 0.03 frp 21:10 28
0.64 0.04 Fre 42:86 63
0.47 0.08 nie 25:76 32
0.37 0.01 veb 00:71 26
0.63 0.48 Fre 42:86 65
0.65 0.32 frp 21:10 19
0.47 0.08 nie 25:76 43

0.53 0.56 nie 25:76 52
0.32 0.43 veb 00:71 18
0.85 0.07 Fre 43:86 40
0.65 0.32 frp 21:10 30
0.85 0.07 Fre 43:86 50
0.53 0.56 nie 25:76 20
0.65 0.32 frp 21:10 40
0.85 0.07 Fre 43:86 50

0.85 0.07 Fre 43:86 60
0.32 0.43 veb 01:71 30
0.32 0.43 veb 01:71 80
0.65 0.32 frp 21:10 40
0.85 0.07 Fre 43:86 70
0.53 0.56 nie 25:76 60
0.65 0.32 frp 21:10 50

In this data, Col4 is the primary key. I want to check whether, within the first data set, the Col5 value is increasing for a specific record. If it is true for some records (as for "Fre 42:86 25 / Fre 42:86 63 / Fre 42:86 65" and "nie 25:76 32 / nie 25:76 43"), then read the next data set and check whether the Col5 values for those records (Fre 42:86 and nie 25:76) are all greater than 30. If true, just count it. Here all the Col5 values for "Fre 42:86" are greater than 30, but not for "nie 25:76", because the record "nie 25:76 20" has a Col5 value less than 30. Therefore I get a count of 1 for the first data set.
Similarly, I should then check whether the Col5 value is increasing for a specific record in the second data set. If true, read the next (third) data set and check whether the Col5 values for those records are greater than 30; if true, just count it, and so on.
Could anyone please guide or help me in solving this problem?
Thanks in advance.
Regards,
Ubee
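[Editor's note] A minimal sketch of the logic described above, using awk's paragraph mode (RS="") so that each blank-line-separated data set is read as one record. Three assumptions, none stated explicitly in the post: the key is Col3 plus Col4 (e.g. "Fre 42:86"), "increasing" means strictly increasing across a key's occurrences within one data set, and the "43:86" rows in the sample are typos for "42:86" (the expected count of 1 for the first data set only works out under that reading). The path /tmp/datasets.txt is illustrative only.

```shell
# Sample data (with the "43:86" rows written as "42:86" -- see assumption above)
cat > /tmp/datasets.txt <<'EOF'
Col1 Col2 Col3 Col4 Col5
0.85 0.07 Fre 42:86 25
0.73 0.03 frp 21:10 28
0.64 0.04 Fre 42:86 63
0.47 0.08 nie 25:76 32
0.37 0.01 veb 00:71 26
0.63 0.48 Fre 42:86 65
0.65 0.32 frp 21:10 19
0.47 0.08 nie 25:76 43

0.53 0.56 nie 25:76 52
0.32 0.43 veb 00:71 18
0.85 0.07 Fre 42:86 40
0.65 0.32 frp 21:10 30
0.85 0.07 Fre 42:86 50
0.53 0.56 nie 25:76 20
0.65 0.32 frp 21:10 40
0.85 0.07 Fre 42:86 50

0.85 0.07 Fre 42:86 60
0.32 0.43 veb 01:71 30
0.32 0.43 veb 01:71 80
0.65 0.32 frp 21:10 40
0.85 0.07 Fre 42:86 70
0.53 0.56 nie 25:76 60
0.65 0.32 frp 21:10 50
EOF

counts=$(awk '
BEGIN { RS = ""; FS = "\n" }     # paragraph mode: one data set per record
{
    set++
    # reset per-set state; split("", a) empties an array portably
    split("", prev); split("", seen); split("", notinc); split("", allgt30)
    for (i = 1; i <= NF; i++) {
        n = split($i, f, " ")
        if (n < 5 || f[1] !~ /^[0-9]/) continue   # skip the header line
        key = f[3] " " f[4]; val = f[5] + 0
        if ((key in prev) && val <= prev[key]) notinc[key] = 1
        seen[key]++; prev[key] = val
        if (!(key in allgt30)) allgt30[key] = 1
        if (val <= 30) allgt30[key] = 0           # any value <= 30 disqualifies
    }
    # compare the previous set qualifying keys against this set
    if (set > 1) {
        c = 0
        for (key in qual)
            if ((key in seen) && allgt30[key]) c++
        printf "data set %d: %d\n", set - 1, c
    }
    # keys seen at least twice with strictly increasing Col5 qualify
    split("", qual)
    for (key in seen)
        if (seen[key] >= 2 && !(key in notinc)) qual[key] = 1
}
' /tmp/datasets.txt)

printf '%s\n' "$counts"
# prints:
#   data set 1: 1
#   data set 2: 1
```

For the sample above this reports 1 for the first data set (only "Fre 42:86" qualifies, since "nie 25:76" drops to 20 in set two), matching the count described in the post, and 1 for the second data set ("frp 21:10" rises 30 to 40, and stays above 30 in set three).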
 

Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.