Stream Database Error with Postgres 9
Hello,
I am facing an issue when trying to stream a very large table in RapidMiner Studio 7 from Postgres 9.
It appears that RapidMiner is creating the secondary index mapping table, but it always errors out with the following message:
Process failed: Database error occurred: ERROR: syntax error at or near "IDENTITY"
Position: 76
The table does have a compound primary key like the following, perhaps that is causing issues?
CONSTRAINT pk_nc9_fullfile PRIMARY KEY (field1, field2)
The table that I am trying to query is very large, too large to load the entire thing into memory using the Read Database or Retrieve data operators. Has anyone seen this behavior before?
Thanks
Answers
Stream Database needs a special index structure. It can't work with your table as it is now.
The error message comes from the operator's failed attempt to create a new index column in your table, but it uses syntax for that (the IDENTITY keyword, used by some other databases) which PostgreSQL 9 does not accept.
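If you want to do that part manually, PostgreSQL 9 can add an auto-incrementing column via a SERIAL/BIGSERIAL type instead. A rough, untested sketch (the table name nc9_fullfile is only guessed from your constraint name, and rm_index is just an example column name, not necessarily what the operator expects):
-- Sketch only: add a sequential index column in a way PostgreSQL 9 accepts
ALTER TABLE nc9_fullfile ADD COLUMN rm_index BIGSERIAL;
-- Optionally index it so ranged/batched reads on it stay fast
CREATE INDEX idx_nc9_fullfile_rm_index ON nc9_fullfile (rm_index);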
Try to find value combinations from your compound key, or other criteria, that let you do the batching yourself. Then you could use the Loop Values operator to execute the query once per batch, roughly as sketched below.
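For example (field and table names are taken or guessed from your constraint, and %{loop_value} is just the default macro name Loop Values sets, so treat this as a sketch rather than a finished process): the first query feeds Loop Values with the batch keys, and the second goes into a Read Database inside the loop, where the macro is replaced with the current key.
-- Outer Read Database: list of batch keys for Loop Values to iterate over
SELECT DISTINCT field1 FROM nc9_fullfile;
-- Inner Read Database, inside Loop Values: one batch per iteration
-- %{loop_value} is the macro set by Loop Values by default
SELECT * FROM nc9_fullfile WHERE field1 = '%{loop_value}';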
It would appear this operator does not work well with Postgres.
I still don't understand why I have to do this, since it defeats the purpose of having RapidMiner create the key mapping table or add the index itself, but it's a workable solution for now.
Thanks for looking.