I need to split a file into n separate files of about the same size. The file will be split at every nth row: starting with the first row, each row is cut and copied to its corresponding new file so that each file has unique records. Any 'leftovers' will go into the last file. e.g.
'Run command' with n=4
Desired output:
# sample4.txt picked up the "leftovers".
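Since the example puts the leftovers in the last file, a minimal awk sketch could look like the following; the input name input.txt, the output names sampleN.txt, and the ten-row demo input are all assumptions:

```shell
# demo input: ten numbered rows (file name input.txt is an assumption)
awk 'BEGIN { for (i = 1; i <= 10; i++) print "row" i }' > input.txt

n=4
total=$(wc -l < input.txt)
size=$((total / n))                     # base chunk size: 2 rows per file here
awk -v n="$n" -v size="$size" '{
    i = int((NR - 1) / size) + 1
    if (i > n) i = n                    # leftovers land in the last file
    print > ("sample" i ".txt")
}' input.txt
```

With 10 rows and n=4, sample1.txt..sample3.txt get 2 rows each and sample4.txt picks up the remaining 4.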
Hi,
I want to write a shell script which increments a particular column in a row from a text file and then adds another row below the current row with the incremented value.
For example,
if the input file has a row:
abc xyz lmn 89 lm nk o p
I would like the script to create something like... (9 Replies)
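A short awk sketch for this, assuming the 4th whitespace-delimited field is the one to increment (the file name in.txt is an assumption; the demo row is the one from the example):

```shell
# demo input (file name in.txt is an assumption)
printf '%s\n' 'abc xyz lmn 89 lm nk o p' > in.txt

# print each row as-is, then a copy of it with field 4 incremented
awk '{ print; $4 = $4 + 1; print }' in.txt
```

Assigning to $4 makes awk rebuild the record with single spaces, so the duplicated row comes out as `abc xyz lmn 90 lm nk o p`.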
Hi Friends,
I have a single column data like below.
1
2
3
4
5
I need the output like below.
0
1
2
3
4
where the first row's value is subtracted from each row (including the first row itself), and the results are printed as shown in the output file.
Thanks
Sid (11 Replies)
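One way to sketch this in awk (the file name data.txt is an assumption; the demo input is the column from the post):

```shell
printf '%s\n' 1 2 3 4 5 > data.txt     # demo input (name is an assumption)

# remember the first value, then subtract it from every row
awk 'NR == 1 { first = $1 } { print $1 - first }' data.txt
```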
Issue: I am able to split the source file into multiple files of 10 rows each, but I am unable to get the required output file names. Please advise.
Details:
input = A.txt having 44 rows
required output = A_001.txt, A_002.txt and so on. Can the awk below be modified to give the required result?
current... (19 Replies)
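A sketch of the zero-padded naming (A.txt with 44 numbered demo rows; names A_001.txt onward as required):

```shell
# demo input: 44 numbered rows in A.txt
awk 'BEGIN { for (i = 1; i <= 44; i++) print i }' > A.txt

# start a new output file every 10 rows; %03d yields A_001.txt, A_002.txt, ...
awk 'NR % 10 == 1 { if (f) close(f); f = sprintf("A_%03d.txt", ++n) } { print > f }' A.txt
```

With 44 rows this produces A_001.txt..A_004.txt with 10 rows each and A_005.txt with the remaining 4.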
I have a source file that contains multiple XML files concatenated in it. The separator string between files is <?xml version="1.0" encoding="utf-8"?>. I want to split it into multiple files with the mentioned names. I had used an awk code earlier to split the file by number of lines, i.e.
awk... (10 Replies)
I have the below script, which does the splitting based on a different criterion. Can it be amended to produce the required result?
SrcFileName=XML_DUMP
awk '/<\?xml version="1\.0" encoding="utf-8"\?>/{n++}
n{f="'"${SrcFileName}_"'" sprintf("%04d",n) ".txt"
print >> f
close(f)}' $SrcFileName.txt
My... (3 Replies)
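Since the required amendment is cut off, here is only a cleaned-up sketch of the same split that passes the prefix in with -v and keeps each part file open until the next header, instead of reopening it per line (the two-document demo dump is an assumption):

```shell
# demo dump: two small XML documents concatenated (XML_DUMP.txt as in the post)
{
    printf '%s\n' '<?xml version="1.0" encoding="utf-8"?>' '<a>1</a>'
    printf '%s\n' '<?xml version="1.0" encoding="utf-8"?>' '<a>2</a>'
} > XML_DUMP.txt

SrcFileName=XML_DUMP
awk -v src="$SrcFileName" '
    /<\?xml version="1\.0" encoding="utf-8"\?>/ {
        if (f) close(f)                          # finish the previous part
        f = sprintf("%s_%04d.txt", src, ++n)
    }
    f { print > f }                              # ">" keeps the file open until close()
' "$SrcFileName.txt"
```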
Hi. How can I read a row number from one file and print the record at that row in another file?
eg
file1
1
3
5
7
9
file2
11111
22222
33333
44444
55555
66666
77777
88888
99999 (3 Replies)
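A two-pass awk sketch using the files from the example:

```shell
# demo files from the post
printf '%s\n' 1 3 5 7 9 > file1
printf '%s\n' 11111 22222 33333 44444 55555 66666 77777 88888 99999 > file2

# first pass records the wanted row numbers; second pass prints those rows
awk 'NR == FNR { want[$1]; next } FNR in want' file1 file2
```

NR equals FNR only while the first file is being read, so the row numbers are collected before file2 is scanned.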
This is the requirement:
list.txt
table1
table2
table3
testfile.txt
name#place#data#select * from table1
name2#place2#data2#select * from table 10 innerjoin table3
name2#place2#data2#select * from table 10
output
name place table1
name2 place table3
I tried using awk. (7 Replies)
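One hedged awk sketch: load the names from list.txt, then do plain substring matching against the 4th #-delimited field (note this would also match table1 inside a longer name like table10):

```shell
# demo files from the post
printf '%s\n' table1 table2 table3 > list.txt
printf '%s\n' \
    'name#place#data#select * from table1' \
    'name2#place2#data2#select * from table 10 innerjoin table3' \
    'name2#place2#data2#select * from table 10' > testfile.txt

# print name, place and whichever listed table appears in the SQL text
awk -F'#' '
    NR == FNR { tables[$1]; next }
    { for (t in tables) if (index($4, t)) print $1, $2, t }
' list.txt testfile.txt
```

The third demo row prints nothing because "table 10" (with a space) matches none of the listed names.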
Using Awk, how can I achieve the following?
I have a set of record numbers for which I have to replace the nth field with some value, say spaces.
Eg:
Set of Records : 4,9,10,55,89,etc
I have to change the 8th field of all the above set of records to spaces (10 spaces).
It's a delimited... (1 Reply)
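Since the delimiter is cut off, a sketch assuming a pipe-delimited file, with a shortened record list (4,9,10) run against a ten-row demo; the full list from the post would go in recs:

```shell
# demo: ten pipe-delimited rows of eight fields (delimiter "|" is an assumption)
awk 'BEGIN { for (r = 1; r <= 10; r++) print r "|b|c|d|e|f|g|h" }' > data.txt

# blank field 8 (ten spaces) on the listed record numbers, keep the rest as-is
awk -F'|' -v OFS='|' -v recs='4,9,10' '
    BEGIN { split(recs, a, ","); for (i in a) want[a[i]] }
    NR in want { $8 = "          " }
    { print }
' data.txt > fixed.txt
```

Setting -v OFS='|' keeps the output delimiter the same as the input when the record is rebuilt.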
Dear All,
We have input like this:
161 57 1378 176 1392 262 1444 441 1548 538 1611 670 1684
241 57 1378 208 1393 269 1447 444 1549 538 1610 677 1700
321 ... (4 Replies)
Discussion started by: attila
LEARN ABOUT PHP
fbsql_data_seek
FBSQL_DATA_SEEK(3)
fbsql_data_seek - Move internal result pointer
SYNOPSIS
bool fbsql_data_seek (resource $result, int $row_number)
DESCRIPTION
Moves the internal row pointer of the FrontBase result associated with the specified result identifier to point to the specified row number.
The next call to fbsql_fetch_row(3) would return that row.
PARAMETERS
o $result - A result identifier returned by fbsql_query(3) or fbsql_db_query(3).
o $row_number - The row number. Starts at 0.
RETURN VALUES
Returns TRUE on success or FALSE on failure.
EXAMPLES
Example #1
fbsql_data_seek(3) example
<?php
$link = fbsql_pconnect("localhost", "_SYSTEM", "secret")
or die("Could not connect");
fbsql_select_db("samp_db")
or die("Could not select database");
$query = "SELECT last_name, first_name FROM friends;";
$result = fbsql_query($query)
or die("Query failed");
// fetch rows in reverse order
for ($i = fbsql_num_rows($result) - 1; $i >=0; $i--) {
if (!fbsql_data_seek($result, $i)) {
printf("Cannot seek to row %d\n", $i);
continue;
}
if (!($row = fbsql_fetch_object($result)))
continue;
echo $row->last_name . $row->first_name . "<br />\n";
}
fbsql_free_result($result);
?>
PHP Documentation Group FBSQL_DATA_SEEK(3)