default.r3_add_file_v4
I am connecting Radoop with Cloudera. While running Random Forest we get a HiveQL problem (org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'default.r3_add_file_v4').
The details of the error:
- Exception: com.rapidminer.operator.OperatorException
- Message: HiveQL problem (org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'default.r3_add_file_v4')
- Stack trace:
- eu.radoop.datahandler.hive.HiveHandler.runFastScriptPrivilegedAction(HiveHandler.java:906)
- eu.radoop.datahandler.hive.HiveHandler.runFastScript(HiveHandler.java:840)
- eu.radoop.datahandler.hive.HiveHandler.runFastScriptsNoParams(HiveHandler.java:795)
- eu.radoop.datahandler.hive.HiveHandler.runFastScripts(HiveHandler.java:763)
- eu.radoop.modeling.HiveModelApplier.apply(HiveModelApplier.java:101)
- eu.radoop.modeling.RadoopModelApplier.doWork(RadoopModelApplier.java:350)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.operator.execution.SimpleUnitExecutor.execute(SimpleUnitExecutor.java:77)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:812)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:807)
- java.security.AccessController.doPrivileged(Native Method)
- com.rapidminer.operator.ExecutionUnit.execute(ExecutionUnit.java:807)
- eu.radoop.operator.meta.RadoopValidationChain.executeEvaluator(RadoopValidationChain.java:204)
- eu.radoop.operator.meta.RadoopValidationChain.evaluate(RadoopValidationChain.java:349)
- eu.radoop.operator.meta.SplitValidationChain.estimatePerformance(SplitValidationChain.java:66)
- eu.radoop.operator.meta.RadoopValidationChain.doWork(RadoopValidationChain.java:299)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.operator.execution.SimpleUnitExecutor.execute(SimpleUnitExecutor.java:77)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:812)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:807)
- java.security.AccessController.doPrivileged(Native Method)
- com.rapidminer.operator.ExecutionUnit.execute(ExecutionUnit.java:807)
- com.rapidminer.operator.OperatorChain.doWork(OperatorChain.java:428)
- eu.radoop.RadoopNest.doWork(RadoopNest.java:662)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.operator.execution.SimpleUnitExecutor.execute(SimpleUnitExecutor.java:77)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:812)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:807)
- java.security.AccessController.doPrivileged(Native Method)
- com.rapidminer.operator.ExecutionUnit.execute(ExecutionUnit.java:807)
- com.rapidminer.operator.OperatorChain.doWork(OperatorChain.java:428)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.Process.execute(Process.java:1315)
- com.rapidminer.Process.run(Process.java:1290)
- com.rapidminer.Process.run(Process.java:1181)
- com.rapidminer.Process.run(Process.java:1134)
- com.rapidminer.Process.run(Process.java:1129)
- com.rapidminer.Process.run(Process.java:1119)
- com.rapidminer.gui.ProcessThread.run(ProcessThread.java:65)
- Cause
- Exception: org.apache.hive.service.cli.HiveSQLException
- Message: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'default.r3_add_file_v4'
- Stack trace:
- org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:256)
- org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:242)
- org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:254)
- eu.radoop.datahandler.hive.PooledStatement.execute(PooledStatement.java:365)
- eu.radoop.datahandler.hive.HiveHandler.executeStatement(HiveHandler.java:935)
- eu.radoop.datahandler.hive.HiveHandler.runFastScriptPrivilegedAction(HiveHandler.java:896)
- eu.radoop.datahandler.hive.HiveHandler.runFastScript(HiveHandler.java:840)
- eu.radoop.datahandler.hive.HiveHandler.runFastScriptsNoParams(HiveHandler.java:795)
- eu.radoop.datahandler.hive.HiveHandler.runFastScripts(HiveHandler.java:763)
- eu.radoop.modeling.HiveModelApplier.apply(HiveModelApplier.java:101)
- eu.radoop.modeling.RadoopModelApplier.doWork(RadoopModelApplier.java:350)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.operator.execution.SimpleUnitExecutor.execute(SimpleUnitExecutor.java:77)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:812)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:807)
- java.security.AccessController.doPrivileged(Native Method)
- com.rapidminer.operator.ExecutionUnit.execute(ExecutionUnit.java:807)
- eu.radoop.operator.meta.RadoopValidationChain.executeEvaluator(RadoopValidationChain.java:204)
- eu.radoop.operator.meta.RadoopValidationChain.evaluate(RadoopValidationChain.java:349)
- eu.radoop.operator.meta.SplitValidationChain.estimatePerformance(SplitValidationChain.java:66)
- eu.radoop.operator.meta.RadoopValidationChain.doWork(RadoopValidationChain.java:299)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.operator.execution.SimpleUnitExecutor.execute(SimpleUnitExecutor.java:77)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:812)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:807)
- java.security.AccessController.doPrivileged(Native Method)
- com.rapidminer.operator.ExecutionUnit.execute(ExecutionUnit.java:807)
- com.rapidminer.operator.OperatorChain.doWork(OperatorChain.java:428)
- eu.radoop.RadoopNest.doWork(RadoopNest.java:662)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.operator.execution.SimpleUnitExecutor.execute(SimpleUnitExecutor.java:77)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:812)
- com.rapidminer.operator.ExecutionUnit$3.run(ExecutionUnit.java:807)
- java.security.AccessController.doPrivileged(Native Method)
- com.rapidminer.operator.ExecutionUnit.execute(ExecutionUnit.java:807)
- com.rapidminer.operator.OperatorChain.doWork(OperatorChain.java:428)
- com.rapidminer.operator.Operator.execute(Operator.java:1004)
- com.rapidminer.Process.execute(Process.java:1315)
- com.rapidminer.Process.run(Process.java:1290)
- com.rapidminer.Process.run(Process.java:1181)
- com.rapidminer.Process.run(Process.java:1134)
- com.rapidminer.Process.run(Process.java:1129)
- com.rapidminer.Process.run(Process.java:1119)
- com.rapidminer.gui.ProcessThread.run(ProcessThread.java:65)
- Cause
- Exception: org.apache.hive.service.cli.HiveSQLException
- Message: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'default.r3_add_file_v4'
- Stack trace:
- org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:400)
- org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:188)
- org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:267)
- org.apache.hive.service.cli.operation.Operation.run(Operation.java:337)
- org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:439)
- org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:416)
- sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
- sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- java.lang.reflect.Method.invoke(Method.java:498)
- org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
- org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
- org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
- java.security.AccessController.doPrivileged(Native Method)
- javax.security.auth.Subject.doAs(Subject.java:422)
- org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1796)
- org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
- com.sun.proxy.$Proxy20.executeStatementAsync(Unknown Source)
- org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:282)
- org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:501)
- org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
- org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
- org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
- org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
- org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
- org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
- java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
- java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
- java.lang.Thread.run(Thread.java:748)
- Cause
- Exception: java.lang.RuntimeException
- Message: org.apache.hadoop.hive.ql.parse.SemanticException:Line 0:-1 Invalid function 'default.r3_add_file_v4'
- Stack trace:
- org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:836)
- org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1176)
- org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
- org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
- org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
- org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:132)
- org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
- org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:193)
- org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:146)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10428)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10384)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3777)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genLateralViewPlan(SemanticAnalyzer.java:9814)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genLateralViewPlans(SemanticAnalyzer.java:9758)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9613)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9538)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9565)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9551)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10024)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10035)
- org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9915)
- org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:223)
- org.apache.hadoop.hive.ql.Driver.compile(Driver.java:490)
- org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1276)
- org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1263)
- org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:186)
- org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:267)
- org.apache.hive.service.cli.operation.Operation.run(Operation.java:337)
- org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:439)
- org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:416)
- sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
- sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
- java.lang.reflect.Method.invoke(Method.java:498)
- org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
- org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
- org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
- java.security.AccessController.doPrivileged(Native Method)
- javax.security.auth.Subject.doAs(Subject.java:422)
- org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1796)
- org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
- com.sun.proxy.$Proxy20.executeStatementAsync(Unknown Source)
- org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:282)
- org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:501)
- org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
- org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
- org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
- org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
- org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
- org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
- java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
- java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
- java.lang.Thread.run(Thread.java:748)
Can anybody tell me how to solve this problem?
Answers
Hello,
I suspect this was caused by missing UDFs (user-defined functions) in Hive's default database.
These UDFs help Radoop operators do their job on the cluster side.
If you are sure they are already installed on the cluster (they haven't changed for a while, so another user of the same cluster might have installed them before), then check which database contains them, since default apparently doesn't. Once you find that database, you can configure Radoop to use it instead of default in the Connection Settings dialog.
If the UDFs are not on the cluster at all, I would suggest following this guide:
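To check which database holds the Radoop UDFs, you can run a quick sketch like the following in beeline or the Hive CLI (the database name `analytics` is just an example; exact `SHOW FUNCTIONS` pattern syntax can vary slightly between Hive versions):

```sql
-- List any registered function whose name matches the missing one.
-- Permanent functions are listed with their database prefix.
SHOW FUNCTIONS LIKE '*r3_add_file*';

-- If nothing shows up under default, try the other candidate databases:
USE analytics;   -- example database name, substitute your own
SHOW FUNCTIONS LIKE '*r3_add_file*';
```

If the function appears under some other database, point Radoop at that database in the Connection Settings dialog; if it appears nowhere, the UDFs need to be (re)installed per the guide below.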
https://docs.rapidminer.com/latest/radoop/installation/operation-and-maintenance.html#drop-create-functions