C++ help in large data set

Hi All,

We are replacing a third-party component, and we don't know how it handled the reader part. The query below returns 197 columns × 2,038,017 rows from the table.

In the code below we build the query, execute it against the DB, and then fetch and read the records. That is where it fails: the while loop is not entered until executeSql returns (the function returns an integer that we don't check; see the guard sketch after the snippet), so the entire result set ends up in memory first.

Any help is appreciated.


Code:
    // Build the SELECT: the full column list, every row, ordered by date
    CosBuf << "SELECT " << m_DefnVec.getColNameList() << " "
           << "FROM " << OBTTable.c_str() << " "
           << "ORDER BY " << DATE_COL;
    strSQLStmt = CosBuf.str();

    // executeSql() does not return until the whole result set has been
    // fetched; its integer return value is not checked here
    m_ptrDBConnect->executeSql(strSQLStmt.c_str(), reader);

    // by the time this loop starts, every record is already in memory
    while (reader.read())
    {
        if (reader.isValid())
        {
            ...
            ..
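As mentioned above, executeSql returns an integer that we never look at. A minimal guard for the call in the snippet above, assuming our wrapper follows a non-zero-means-failure convention (an assumption; the real error-reporting API may differ):

Code:
    int rc = m_ptrDBConnect->executeSql(strSQLStmt.c_str(), reader);
    if (rc != 0)   // assumed convention: non-zero = execute/fetch failed
    {
        // log the error and bail out instead of looping over an empty reader
        return;
    }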

DB part:

Code:
void DSDbReader::AddRecord(int nSize, COLUMN_DATA* data)
{
    std::shared_ptr<DSRecord> pRec = std::make_shared<DSRecord>();

    for (int k = 0; k < nSize; k++)
    {
        // deep-copy every column value into freshly allocated memory
        pRec->AddColumnData();
        COLUMN_DATA* pColumnData = pRec->GetColumnData(k);
        pColumnData->AllocateValue(data[k].m_nValueLen);
        memcpy(pColumnData->m_psValue, data[k].m_psValue, data[k].m_nValueLen);
        pColumnData->m_nValueLen = data[k].m_nValueLen;
    }

    // every record is appended here, so the entire result set
    // (197 columns x 2,038,017 rows) accumulates in RAM
    m_oRecords.push_back(pRec);
}
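What we really want is for the DB layer to hand each row to a consumer as it is fetched, instead of push_back-ing every record into m_oRecords. A minimal sketch of that idea, assuming we can change the reader interface (RowCallback and consumeRow are hypothetical names, and COLUMN_DATA here is a stand-in for the real struct):

Code:
#include <cstddef>
#include <functional>
#include <string>

// stand-in for the real COLUMN_DATA from the post
struct COLUMN_DATA
{
    char* m_psValue;
    int   m_nValueLen;
};

// hypothetical callback type: invoked once per fetched row; returning
// false would tell the driver to stop fetching early
using RowCallback = std::function<bool(int nSize, const COLUMN_DATA* data)>;

// example consumer: each row is processed and dropped immediately, so
// memory use stays flat no matter how many rows the query returns
bool consumeRow(int nSize, const COLUMN_DATA* data)
{
    for (int k = 0; k < nSize; ++k)
    {
        std::string value(data[k].m_psValue,
                          static_cast<std::size_t>(data[k].m_nValueLen));
        // ... per-record logic from the while (reader.read()) loop ...
    }
    return true;
}

With something like this, AddRecord() would invoke the callback instead of appending to m_oRecords, and the body of the while (reader.read()) loop would move into the consumer.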

If I limit the query to 1 or 2 rows, or to fewer columns (say 50), it works fine. So it looks like the per-column allocations for the full result set simply will not fit in memory.
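A rough back-of-envelope check supports that (the 32 bytes per value is an assumption covering the AllocateValue() buffer plus heap and COLUMN_DATA bookkeeping; real column widths may differ):

Code:
#include <cstdio>

int main()
{
    const long long rows = 2038017LL;  // rows returned by the query
    const long long cols = 197LL;      // columns in the select list
    // assumed average footprint per column value (buffer + overhead)
    const long long bytesPerValue = 32LL;

    const long long values = rows * cols;
    const double gib = static_cast<double>(values * bytesPerValue)
                       / (1024.0 * 1024.0 * 1024.0);

    std::printf("%lld column values, roughly %.1f GiB\n", values, gib);
    return 0;
}

That is about 400 million heap allocations and roughly 12 GiB before counting the std::shared_ptr<DSRecord> and vector overhead, which would explain why a couple of rows or 50 columns works while the full table fails.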
