Error: Cannot fetch folder content from repository
yelena_kro
Member Posts: 6 Contributor I
Hi everyone,
I am getting a server error when I try to open one of the folders in the server Repository.
Could you advise what could be causing it?
Answers
Hi,
as the error states, the server.log should contain more information. The error itself I have never seen before, so congratulations for that, I suppose? :smileywink:
Regards,
Marco
Hi Marco,
below is the server log fragment. Could you take a look?
Also, I think there is one data file that is causing the issue: the file is about 1.7 GB, and when I try to open it in the server web interface, it cannot return the metadata and gives a 500 error. But I can open that file on my local machine without any issues. I copied the file to other directories on the server and now get the same error for those directories as well. Is there a server parameter that limits the maximum file size?
Hi,
thank you for the log message. The root cause seems to be the PostgreSQL error message ("invalid memory alloc request size").
I'm a PostgreSQL user myself, but I have never seen this message. Some googling indicates two possible reasons:
1. If you have an older PostgreSQL version, the memory settings need to be updated.
2. If that's not a problem (recent PostgreSQL like 9.4 or later), a table might be corrupted. Please do a pg_dump of the RapidMiner Server database. This will read all tables and create a copy of them (a good idea anyway). If it gives a similar error, then you know that there's a problem with a table.
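If it helps, here is a rough sketch of how that check could be scripted: just pg_dump wrapped in Python so a failure is easy to spot. The database name, user, host and output file below are placeholders (I don't know your setup), and you may need to supply the password via PGPASSWORD or a .pgpass file:

```python
# Minimal sketch (not an official RapidMiner tool): run pg_dump against the
# Server database and report whether every table could be read.
# Database name, user and host are assumptions -- adjust them to your setup.
import subprocess

def dump_server_db(dbname="rapidminer_server", user="rmserver",
                   host="localhost", outfile="rm_server_dump.sql"):
    result = subprocess.run(
        ["pg_dump", "-h", host, "-U", user, "-d", dbname, "-f", outfile],
        capture_output=True, text=True,
    )
    if result.returncode == 0:
        print(f"Dump finished without errors -> {outfile}")
    else:
        # Errors such as "invalid memory alloc request size" show up on stderr
        # and point to the table that could not be read.
        print("pg_dump failed:")
        print(result.stderr)

if __name__ == "__main__":
    dump_server_db()
```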
Some hints on how to solve this:
https://confluence.atlassian.com/jirakb/invalid-memory-alloc-request-size-440107132.html
But it might be easier to just delete the example set from the repository so the table gets dropped.
I don't know of a limit in Server, but 1.7 GB sounds like a lot.
Regards,
Balázs
@BalazsBarany, thank you for your reply. Our PostgreSQL version is 9.6. We also did a pg_dump and it finished successfully with no errors, so it seems it is not a table issue... Also, unfortunately, when I try to delete that dataset, it gives me the same error... Any other suggestions?