
Radoop Hive connection test fail on 'Create permanent UDFs'

ericqi Member Posts: 2 Learner I
Hi,

I'm trying to set up a Radoop connection to a secured Cloudera cluster with Sentry enabled.
Our Hadoop admin has manually created the UDFs in my test database, but I'm still getting errors:
[Sep 27, 2019 3:56:09 PM] WARNING: The version of the rapidminer_libs jar on the HADOOP_CLASSPATH (9.0) is for another version of RapidMiner Radoop. Current version: 9.3. Please notify your Hadoop administrator about this issue.
[Sep 27, 2019 3:56:09 PM] WARNING: The current version (9.3) of the rapidminer_libs jar could not be found on the HADOOP_CLASSPATH. Please notify your Hadoop administrator about this issue.
[Sep 27, 2019 3:56:10 PM] SEVERE: Test failed: Create permanent UDFs
[Sep 27, 2019 3:56:10 PM]: Cleaning after test: Create permanent UDFs
[Sep 27, 2019 3:56:10 PM]: Cleaning after test: UDF jar upload
[Sep 27, 2019 3:56:10 PM]: Cleaning after test: Fetch dynamic settings
[Sep 27, 2019 3:56:10 PM]: Cleaning after test: Hive connection
[Sep 27, 2019 3:56:10 PM]: Total time: 23.705s
[Sep 27, 2019 3:56:10 PM]: com.rapidminer.operator.OperatorException: UDFs are claimed to be installed manually, but the following functions are not available in database 'test_db': [test_db.r3_is_eq, test_db.r3_greatest, test_db.r3_gaussian_rand, test_db.r3_which, test_db.r3_least, test_db.r3_max_index, test_db.r3_pivot_collect_count, test_db.r3_nth, test_db.r3_correlation_matrix, test_db.r3_pivot_collect_avg, test_db.r3_pivot_collect_min, test_db.r3_sum_collect, test_db.r3_add_file, test_db.r3_pivot_createtable, test_db.r3_score_naive_bayes, test_db.r3_pivot_collect_sum, test_db.r3_pivot_collect_max, test_db.r3_sleep, test_db.r3_esc]
 at eu.radoop.datahandler.hive.UDFHandler.ensureRadoopUDFs(UDFHandler.java:839)
 at eu.radoop.connections.service.test.connection.TestCreatePermanentUDFs.call(TestCreatePermanentUDFs.java:46)
 at eu.radoop.connections.service.test.connection.TestCreatePermanentUDFs.call(TestCreatePermanentUDFs.java:26)
 at eu.radoop.connections.service.test.RadoopTestContext.lambda$runTest$1(RadoopTestContext.java:280)
 at java.util.concurrent.FutureTask.run(FutureTask.java:266)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:748)

I can confirm that all required UDFs have already been created by running: hive -e 'use test_db; show functions;'

test_db.r3_add_file
test_db.r3_apply_model
test_db.r3_correlation_matrix
test_db.r3_esc
test_db.r3_gaussian_rand
test_db.r3_greatest
test_db.r3_is_eq
test_db.r3_least
test_db.r3_max_index
test_db.r3_nth
test_db.r3_pivot_collect_avg
test_db.r3_pivot_collect_count
test_db.r3_pivot_collect_max
test_db.r3_pivot_collect_min
test_db.r3_pivot_collect_sum
test_db.r3_pivot_createtable
test_db.r3_score_naive_bayes
test_db.r3_sleep
test_db.r3_sum_collect
test_db.r3_which
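
One thing worth noting: the `hive` CLI talks to the Hive metastore directly, while Radoop connects through HiveServer2, where Sentry authorization applies. A function visible to the CLI is not necessarily visible through a HiveServer2 session. A quick check through the same path Radoop uses (e.g. via beeline; the connection URL is a placeholder for your cluster's endpoint) might look like:

```sql
-- Run via beeline, not the hive CLI, e.g.:
--   beeline -u "jdbc:hive2://<hiveserver2-host>:10000/test_db;principal=<hive-principal>"
-- (host and principal above are placeholders for your cluster's values)
USE test_db;
SHOW FUNCTIONS;
-- If a function is listed but calls still fail, check that its backing jar resolves:
DESCRIBE FUNCTION EXTENDED test_db.r3_is_eq;
```

If the functions do not appear in this session, the problem is on the HiveServer2/Sentry side rather than in the metastore.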

What should I do to fix this? Any suggestions would be much appreciated.

Thanks

Answers

  • sgenzer Administrator, Moderator, Employee-RapidMiner, RapidMiner Certified Analyst, Community Manager, Member, University Professor, PM Moderator Posts: 2,959 Community Manager
    cc @asimon
  • asimon Administrator, Moderator, Employee-RapidMiner, Community Manager, RapidMiner Certified Master, Member Posts: 8 RM Engineering
    Have you already followed Cloudera's steps for creating permanent Hive UDFs?

    Full doc is available here: https://docs.cloudera.com/documentation/enterprise/5-16-x/topics/cm_mc_hive_udf.html

    I would also recommend updating the RapidMiner Radoop parcel so that the version-mismatch warnings go away.
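
    As a sketch, registering a permanent UDF per that guide looks like the statement below. The implementing class name and the HDFS jar path are placeholders, not the real Radoop values; the actual ones depend on your Radoop version and where your admin uploaded the rapidminer_libs jar. It must also be run by a user with the Sentry privileges required for CREATE FUNCTION.

    ```sql
    -- Hypothetical example: '<udf.implementing.Class>' and the jar path are
    -- placeholders, not the real Radoop class/jar names.
    USE test_db;
    CREATE FUNCTION test_db.r3_is_eq AS '<udf.implementing.Class>'
      USING JAR 'hdfs:///user/hive/warehouse/rapidminer_libs-9.3.jar';
    ```

    If the functions were instead created as TEMPORARY FUNCTIONs, they only live for that session, which would also explain why Radoop's test cannot see them.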
  • ericqi Member Posts: 2 Learner I
    Hi asimon, sgenzer,

    Thanks for answering my question. 
    I'll get back to my Hadoop admin and see if it helps.