OK, my biggest problem is that I want to be able to use this on a number of different queries, and most of my queries / CSVs have 10-25 columns. I'm not about to write out script lines for each and every column for every particular query; I want to write this as a nice, easy function.
You can loop the printf statement if needed.
something like:
Sorry, I don't have the time to test that out for you right now, but I'm sure someone else can whip it up, or that may get you going.
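A rough, untested sketch of that idea in awk; the "|" output delimiter and the file name query_output.csv are just placeholders, and it assumes a comma-separated file with no quoted commas:
awk -F',' '{
    # loop printf over every field instead of writing one line per column
    for (i = 1; i <= NF; i++)
        printf "%s%s", $i, (i < NF ? "|" : "\n")
}' query_output.csv
Because the loop runs to NF, the same script works no matter how many columns a given query produces.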
I have a CSV file which contains a number series as one of its fields. Some records in that field look like:
079661/3
I have to convert the above series into
079661
079662
079663
and store them as 3 different records.
Looking for help on how to achieve this. I am a newbie at shell... (10 Replies)
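One possible awk sketch, assuming the series sits in field 1, "/" separates the start value from the count, and the leading zeros must be kept (the rest of each record is ignored here):
awk -F',' '{
    # split "079661/3" into the start value and the count
    n = split($1, p, "/")
    if (n == 2) {
        w = length(p[1])                       # remember the width so leading zeros survive
        for (i = 0; i < p[2]; i++)
            printf("%0" w "d\n", p[1] + i)     # 079661, 079662, 079663
    } else {
        print $1                               # plain values pass through unchanged
    }
}' input.csv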
I am trying to parse a CSV file in the below 'name-value pair' format and then use the values corresponding to each name.
Type:G,Instance:instance1,FunctionalID:funcid,Env:dev,AppName:appname... (6 Replies)
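A sketch in awk, assuming one record per line, commas between pairs and a colon between each name and its value; the names used below are taken from the sample:
awk -F',' '{
    split("", nv)                      # clear the name-value array for each record
    for (i = 1; i <= NF; i++) {
        split($i, pair, ":")           # break "Name:Value" apart
        nv[pair[1]] = pair[2]
    }
    print nv["Type"], nv["Instance"], nv["Env"]
}' file.csv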
I have a CSV file that needs to go through two separate processes (in the end there will be 2 files, Dload.unl and Tload.unl, and we'll say the input file name is mass.csv). I have a processfile() function that will call the Dload function. In Dload I want to read mass.csv into Dload and then... (1 Reply)
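A minimal, hypothetical bash sketch of that flow; the per-record work inside Dload is just a placeholder, and Tload would be handled the same way to produce Tload.unl:
#!/bin/bash
Dload() {
    # real record handling goes here; this placeholder just pipe-delimits each line
    while IFS=',' read -r -a f; do
        (IFS='|'; printf '%s\n' "${f[*]}")
    done < "$1" > Dload.unl
}
processfile() {
    Dload "$1"        # call Tload "$1" here as well once it exists
}
processfile mass.csv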
Hey guys,
I'm in the process of learning PHP and BASH scripting. I'm getting there, slowly ;)
I would like some help with parsing a CSV file. This file contains a list of hostnames, dates, and either Valid, Expired, or Expired Soon in the last column.
Basically, I want to parse the file,... (12 Replies)
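A short awk sketch, assuming three comma-separated columns (hostname, date, status) with the status in the last column:
awk -F',' '$NF == "Expired" {
    # print the hostname and date for every entry whose last column is Expired
    print $1, $2
}' hosts.csv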
Yes, there is a great doc out there that discusses parsing CSV files with sed, and this topic has been covered before, but not enough to answer my question (unix.com forums).
I'm trying to parse a CSV file that has optional quotes like the following:
"Apple","Apples, are fun",3.60,4.4,"I... (3 Replies)
Hello list,
I am working on a CSV file which contains two fields per record that contain IP addresses. What I am trying to do is find records which have identical fields (IP addresses) which occur 4 (four) times, and if they do, delete all records with that specific identical field (IP address).
... (4 Replies)
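A two-pass awk sketch; it assumes the IP address of interest is in field 1 of a comma-separated file, with the count of 4 taken from the post:
awk -F',' '
    NR == FNR { count[$1]++; next }   # first pass: count how often each IP appears
    count[$1] != 4                    # second pass: keep records unless their IP occurred exactly 4 times
' file.csv file.csv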
Hi Members, I am stuck with the following problem. Request your kind help.
I have a CSV file which contains 1 header record, data records, and 1 footer record. A sample is below:
Contents of cm_update_file_101010.csv
--------------------------------------------------
... (6 Replies)
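A minimal sed sketch for separating the parts, assuming the header is always the first line and the footer the last; the output file names are just placeholders:
sed -n '1p'     cm_update_file_101010.csv > header.txt    # header record
sed -n '$p'     cm_update_file_101010.csv > footer.txt    # footer record
sed    '1d;$d'  cm_update_file_101010.csv > data.csv      # the data records in between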
Hi,
Newbie here and I need some help to parse a CSV file that contains fields separated by ",". What I need to achieve here is: read the one-line file, extract 240 fields and pass them to a variable, then read the next 240 fields and pass them to a variable, over and over. If anyone can assist with that... (4 Replies)
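An awk sketch that walks a single long line 240 fields at a time; what happens with each chunk is just a placeholder here:
awk -F',' '{
    for (start = 1; start <= NF; start += 240) {
        chunk = ""
        for (i = start; i < start + 240 && i <= NF; i++)
            chunk = (chunk == "" ? $i : chunk FS $i)
        # hand each 240-field chunk to whatever needs it; here it is simply printed
        print chunk
    }
}' oneline.csv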
Hi,
I have basic knowledge of Unix shell scripting (not an expert).
My requirement is to read the CSV file using the schema defined in the configuration file; if the condition is not matched, move the unmatched record to an error file and the matched good records into another file.
In brief: ... (43 Replies)
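A very simplified sketch of that split, assuming the "schema" check is just the expected number of fields stored in a config file (all file names here are placeholders):
expected=$(cat schema.cfg)        # e.g. the file contains the number 12
awk -F',' -v n="$expected" '
    NF == n { print > "good_records.csv"; next }   # records matching the schema
            { print > "error_records.csv" }        # everything else
' input.csv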
Hello All,
I have an input CSV file like the one below, where the first-row data can be in a different position after every run of the tool, i.e. pzTest in the example below is in column 1, but it can also be in column 3, and the same goes for all the headers in the first row.
pzTest, pzExtract, pxUpdate, pzInfo... (1 Reply)
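A sketch that reads the header row and looks columns up by name, so the position no longer matters; it assumes pzTest is the column of interest and that headers may have spaces after the commas:
awk -F',' '
    NR == 1 {
        # remember which column each header name landed in on this run
        for (i = 1; i <= NF; i++) { gsub(/^ +| +$/, "", $i); col[$i] = i }
        next
    }
    { print $(col["pzTest"]) }     # fetch the pzTest value wherever its column ended up
' input.csv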
LEARN ABOUT PHP
ifx_query
IFX_QUERY(3)
ifx_query - Send Informix query
SYNOPSIS
resource ifx_query (string $query, resource $link_identifier, [int $cursor_type], [mixed $blobidarray])
DESCRIPTION
Sends a $query to the currently active database on the server that's associated with the specified link identifier.
For "select-type" queries a cursor is declared and opened. Non-select queries are "execute immediate".
For either query type the number of (estimated or real) affected rows is saved for retrieval by ifx_affected_rows(3).
If the contents of the TEXT (or BYTE) column allow it, you can also use ifx_textasvarchar(1) and ifx_byteasvarchar(1). This allows you to
treat TEXT (or BYTE) columns just as if they were ordinary (but long) VARCHAR columns for select queries, and you don't need to bother with
blob id's.
With ifx_textasvarchar(0) or ifx_byteasvarchar(0) (the default situation), select queries will return BLOB columns as blob id's (integer
value). You can get the value of the blob as a string or file with the blob functions (see below).
PARAMETERS
o $query
- The query string.
o $link_identifier
- The link identifier.
o $cursor_type
- This optional parameter allows you to make this a scroll and/or hold cursor. It's a bitmask and can be IFX_SCROLL,
IFX_HOLD, or both or'ed together. If you omit this parameter the cursor is a normal sequential cursor.
o $blobidarray
- If you have BLOB (BYTE or TEXT) columns in the query, you can add a $blobidarray parameter containing the corresponding "blob
ids", and you should replace those columns with a "?" in the query text.
RETURN VALUES
Returns a valid Informix result identifier on success, or FALSE on errors.
EXAMPLES
Example #1
Show all rows of the "orders" table as an HTML table
<?php
ifx_textasvarchar(1); // use "text mode" for blobs
$res_id = ifx_query("select * from orders", $conn_id);
if (! $res_id) {
printf("Can't select orders : %s
<br />%s<br />
", ifx_error(), ifx_errormsg());
die;
}
ifx_htmltbl_result($res_id, 'border="1"');
ifx_free_result($res_id);
?>
Example #2
Insert some values into the "catalog" table
<?php
// create blob id's for a byte and text column
$textid = ifx_create_blob(0, 0, "Text column in memory");
$byteid = ifx_create_blob(1, 0, "Byte column in memory");
// store blob id's in a blobid array
$blobidarray[] = $textid;
$blobidarray[] = $byteid;
// launch query
$query = "insert into catalog (stock_num, manu_code, " .
"cat_descr,cat_picture) values(1,'HRO',?,?)";
$res_id = ifx_query($query, $conn_id, $blobidarray);
if (! $res_id) {
/* ... error ... */
}
// free result id
ifx_free_result($res_id);
?>
SEE ALSO ifx_connect(3).
PHP Documentation Group IFX_QUERY(3)