Hi Guys,
I need help modifying a large text file containing more than 1-2 lakh (100,000-200,000) rows of data using Unix commands. I am quite new to Unix.
The text file contains data in a pipe-delimited format:
sdfsdfs
sdfsdfsd
START_ROW
sdfsd|sdfsdfsd|sdfsdfasdf|sdfsadf|sdfasdf... (9 Replies)
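The full requirement is cut off above, but a common shape for this task is: leave the header lines alone and edit only the pipe-delimited rows that follow the START_ROW marker. A minimal awk sketch of that pattern, assuming a per-field edit (the field number 3, the value NEWVAL, and the file names are placeholders for the real requirement); awk streams the file, so 1-2 lakh rows are no problem:

# Assumed: edit field 3 of every pipe-delimited row after START_ROW.
awk -F'|' 'BEGIN { OFS = "|" }
/^START_ROW$/ { body = 1; print; next }   # marker found: start editing
body          { $3 = "NEWVAL" }           # placeholder edit on data rows
              { print }
' infile > outfile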
Hi, please help me sort out this problem.
rm PAB113_011.out
rm: PAB113_011.out: override protection 644 (yes/no)? n
If I answer y, it removes the file.
But when I add the rm command to a ksh script and run it, the file is not removed and the same prompt appears as... (7 Replies)
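rm prompts like this when the file is not writable by the invoking user, and inside a script there is no terminal to answer y, so the file stays. The usual fix is to force the removal:

# -f suppresses the "override protection" prompt and also ignores a
# missing file, so the ksh script never waits for input
rm -f PAB113_011.out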
I have the following space-delimited input:
1 11.785710 117.857100
1 15 150
1 20 200
1 25 250
3 2.142855 21.428550
3 25 250
22 1.071435 10.714350
The first field is the ID number, the second field is the percentage of the total points that the person has, and the third column is the number... (3 Replies)
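The rest of the request is cut off, but a frequent operation on data shaped like this is totalling fields 2 and 3 per ID. A sketch of that pattern, in case that is the goal (which fields to sum is an assumption):

# Sum the percentage (field 2) and the count (field 3) for each ID.
awk '{ pct[$1] += $2; pts[$1] += $3 }
END  { for (id in pct) print id, pct[id], pts[id] }' infile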
I have 2 files:
file01 = 7 columns, row count unknown (but few)
file02 = 7 columns, row count unknown (but many)
Now I want to create an output keyed on the first field, which is shared by both files, subtract the remaining fields of one file from the other, and print the results.
e.g.
file 01
James|0|50|25|10|50|30... (1 Reply)
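A minimal awk sketch for this, assuming the direction is file01's values minus file02's (swap a[i] and $i if it is the other way round):

# First pass loads file01 keyed on field 1; second pass prints, for
# every key that also appears in file02, the field-by-field differences.
awk -F'|' 'BEGIN { OFS = "|" }
NR == FNR { ref[$1] = $0; next }          # remember file01 rows
$1 in ref {
    split(ref[$1], a, "|")
    out = $1
    for (i = 2; i <= NF; i++) out = out OFS (a[i] - $i)
    print out
}' file01 file02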
hello all,
I have files that follow a specific naming convention for the first column:
the names are made of five parts in a pattern of 3,
Y = (not case sensitive),
so the files are named $Y-$Y-$Y or $X-$Y-$Z, depending on how you look at it;
they only exist if the pattern exists.
Now I want to create a file from them that... (9 Replies)
Hi..
My requirement is simple but I am unable to get it working.
File 1:
3 415 A G
4 421 G .
39 421 G A
2 421 G A,C
41 427 A .
4 427 A C
42 436 G .
3 436 G C
43 445 C .
2 445 C T
41 447 A .
Output (4 Replies)
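The expected output is cut off above, but one common transformation for rows like these is to pull every row sharing a position (field 2) onto a single line. A sketch of that reading (the grouping key is an assumption):

# Collect rows by position (field 2), preserving first-seen order.
awk '!seen[$2]++ { order[++n] = $2 }
     { rows[$2] = ($2 in rows) ? rows[$2] "  " $0 : $0 }
END  { for (i = 1; i <= n; i++) print rows[order[i]] }' file1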
Hello,
I have a data format as follows:
Ind1 0 1 2
Ind1 0 2 1
Ind2 1 1 0
Ind2 2 2 0
I want to use awk to produce this output:
Ind1 00 12 21
Ind2 12 12 00
That is, I want to merge each pair of rows that have the same row name.
Thank you very much in advance for your help. (8 Replies)
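A minimal awk sketch for this (infile is a placeholder name): it remembers the first row for each name and, when the second row with that name arrives, glues the fields together column by column:

# Merge each pair of rows sharing the name in column 1 by
# concatenating the corresponding fields of the two rows.
awk '$1 in first {
         n = split(first[$1], a)
         printf "%s", $1
         for (i = 2; i <= NF; i++) printf " %s%s", a[i], $i
         print ""
         delete first[$1]
         next
     }
     { first[$1] = $0 }' infile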
Hi, I would like to move the first 1000 rows of my file into one output file and the last 1000 rows into another output file.
Any help would be great
Thanks (6 Replies)
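head and tail do exactly this, though they copy rather than move the rows; if the originals must also disappear from the source file, delete them afterwards (file names here are placeholders):

head -n 1000 infile > first_1000.out   # first 1000 rows
tail -n 1000 infile > last_1000.out    # last 1000 rows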
Hi All,
I have the file below, where I want the lines merged based on a pattern.
AFTER
CMMILAOJ
CMMILAAJ
AFTER
CMDROPEJ
CMMIMVIJ
CMMIRNTJ
CMMIRNRJ
CMMIRNWJ
CMMIRNAJ
CMMIRNDJ
AFTER
CMMIRNTJ
CMMIRNRJ
CMMIRNWJ (4 Replies)
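The target layout is not shown, but a common reading is to fold each block onto the AFTER line that starts it. A sketch of that interpretation:

# Join every run of lines onto the preceding AFTER marker line.
awk '/^AFTER$/ { if (out != "") print out; out = $0; next }
     { out = out " " $0 }
     END { if (out != "") print out }' infile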
Hello,
I have a tab-delimited file with 3 columns:
BINPACKER.13259.1.p2 SSF48239
BINPACKER.13259.1.p2 PF13243
BINPACKER.13259.1.p2 G3DSA:1.50.10.20
BINPACKER.13259.2.p2 SSF48239
BINPACKER.13259.2.p2 PF13243
BINPACKER.13259.2.p2 G3DSA:1.50.10.20... (7 Replies)
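The request is cut off, but with identifiers repeated like this a frequent goal is one line per identifier with its annotations joined. A sketch of that guess (the comma separator is an assumption; note that for-in output order is unspecified in awk):

# Collapse the second column onto one line per identifier in column 1.
awk -F'\t' '{ v[$1] = ($1 in v) ? v[$1] "," $2 : $2 }
END { for (k in v) print k "\t" v[k] }' infile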
LEARN ABOUT PHP
ingres_fetch_row
INGRES_FETCH_ROW(3)                                        INGRES_FETCH_ROW(3)
ingres_fetch_row - Fetch a row of result into an enumerated array
SYNOPSIS
array ingres_fetch_row (resource $result)
DESCRIPTION ingres_fetch_row(3) returns an array that corresponds to the fetched row, or FALSE if there are no more rows. Each result column is stored
in an array offset, starting at offset 1.
Subsequent calls to ingres_fetch_row(3) return the next row in the result set, or FALSE if there are no more rows.
By default, arrays created by ingres_fetch_row(3) start from position 1 and not 0 as with other DBMS extensions. The starting position can
be adjusted to 0 using the configuration parameter ingres.array_index_start.
Note
Related Configurations
See also the ingres.array_index_start, ingres.fetch_buffer_size and ingres.utf8 directives in Runtime Configuration.
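For instance, to make the rows 0-based like other DBMS extensions, the directive can be set in php.ini (a sketch; the file's location varies by install):

; php.ini
ingres.array_index_start = 0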
PARAMETERS
o $result
- The query result identifier
RETURN VALUES
Returns an array that corresponds to the fetched row, or FALSE if there are no more rows
EXAMPLES
Example #1
Fetch a row of result into an enumerated array
<?php
$link = ingres_connect($database, $user, $password);
$result = ingres_query($link, "select * from table");
while ($row = ingres_fetch_row($result)) {
    echo $row[1];   /* column offsets start at 1 by default */
    echo $row[2];
}
?>
SEE ALSO ingres_num_fields(3), ingres_query(3), ingres_fetch_array(3), ingres_fetch_assoc(3), ingres_fetch_object(3).
PHP Documentation Group INGRES_FETCH_ROW(3)