RapidMiner Hadoop Configuration for Apache Distribution
Hi All,
I am new to RapidMiner. I have successfully installed and configured RapidMiner 7.3 and it works fine. The problem is that when I try to configure Hadoop, it gives the error below:
[Nov 23, 2016 12:39:42 PM] SEVERE: java.util.concurrent.TimeoutException
[Nov 23, 2016 12:39:42 PM] SEVERE: Hive server 2 connection test timed out. Please check that the server/daemon runs and is accessible on the address and port you specified.
[Nov 23, 2016 12:39:42 PM] SEVERE: Test failed: Hive connection
[Nov 23, 2016 12:39:42 PM] SEVERE: Connection test for 'Hadoop' failed.
My Hadoop distribution is Apache,
my Hive version is 1.2.1,
and all my ports are the defaults. If anybody knows, please help me.
Thanks in advance,
Praveen G
Answers
Hi Praveen,
it is difficult to figure out the problem from this information alone, but there are a couple of hints:
The error simply states that there was no response from the HiveServer2 instance (specified by either the Master Address or the Hive Server Address field, plus the Hive Port) within the given time.
I would first check that HiveServer2 is actually running and accessible on that host and port from the machine where Studio runs, as in the sketch below.
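To illustrate what that check could look like from the Studio machine (a minimal sketch only; the host and port values are placeholders for the Hive Server Address and Hive Port in your connection), a plain socket test with a timeout tells you whether anything is listening there at all:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class HiveServer2PortCheck {
    public static void main(String[] args) {
        String host = "your-hiveserver2-host"; // placeholder: the Hive Server Address from the connection
        int port = 10000;                      // placeholder: the Hive Port (10000 is the usual default)
        try (Socket socket = new Socket()) {
            // use an explicit timeout, similar to the connection test that timed out
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("Something is listening on " + host + ":" + port);
        } catch (IOException e) {
            System.out.println("Could not reach " + host + ":" + port + ": " + e.getMessage());
        }
    }
}

If nothing answers there, the problem is on the network or daemon side rather than in the Radoop connection entry itself.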
Best,
Peter
Hi Peter,
Thanks for the reply.
I have verified the Hive cluster through Beeline and it is working fine. Now when I try to connect, I am getting the error below.
I have attached screenshots of Hive working via Beeline and of the Hadoop connection in RapidMiner.
If you know, please help me out. Anyway, thanks for your reply.
Regards,
Praveen
[Nov 25, 2016 9:43:25 AM] SEVERE: java.lang.RuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/MetaException
[Nov 25, 2016 9:43:25 AM] SEVERE: Hive server 2 connection test failed. Please check that the server/daemon runs and is accessible on the address and port you specified.
[Nov 25, 2016 9:43:25 AM] SEVERE: Test failed: Hive connection
[Nov 25, 2016 9:43:25 AM] SEVERE: Connection test for 'Hadoop' failed.
Hi Praveen,
The screenshots help.
The first thing is that the JDBC URL Postfix field is only there for an additional, custom postfix (the URL is constructed automatically), so it should be empty in your case. That alone could solve the connection problem.
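For reference, a complete HiveServer2 JDBC URL already has the form jdbc:hive2://<host>:<port>/<database>, so anything typed into the postfix field gets appended on top of that. A minimal smoke test outside of Studio with the standard Hive JDBC driver (a sketch only: the driver class, URL and empty credentials below are the generic Hive defaults, not necessarily what Radoop builds internally) would look roughly like this:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // standard HiveServer2 driver; hive-jdbc and its dependencies must be on the classpath
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // a complete URL already contains host, port and database
        String url = "jdbc:hive2://localhost:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}

If this works from the same machine where Studio runs, the address and port are fine and the problem is on the Studio/Radoop side.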
However, the java.lang.NoClassDefFoundError indicates that Studio's temporary files or folders (usually under /tmp/ on Linux) may have been deleted since the software was started. Is that possible? If you keep getting this error, a Studio restart should help.
The address in the connection is localhost; does that mean you are running Studio on the master node? (Beeline, of course, runs on the Hadoop node, but Studio may not.)
I would also make sure that 54310 is the port to use for the NameNode. If you navigate to localhost:50070, is that the port that the Overview page shows?
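To read the NameNode address and port directly from the cluster configuration instead of guessing, here is a small sketch with the Hadoop client API (the path to core-site.xml is a placeholder, and hadoop-common has to be on the classpath):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;

public class PrintDefaultFs {
    public static void main(String[] args) {
        // do not load the bundled defaults, only the cluster's own core-site.xml
        Configuration conf = new Configuration(false);
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml")); // placeholder path
        // fs.defaultFS (older clusters use fs.default.name) holds something like hdfs://localhost:54310
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS", conf.get("fs.default.name")));
    }
}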
Best,
Peter
Hello,
I have the same problem when I configure RapidMiner Radoop.
This is what I have in the logs:
[Aug 11, 2018 10:28:23 AM]: Integration test for 'hadoop' started.
[Aug 11, 2018 10:28:23 AM]: Removed test Kerberos beacuse Security is not enabled
[Aug 11, 2018 10:28:23 AM]: Using Radoop version 9.0.0.
[Aug 11, 2018 10:28:23 AM]: Running 25 tests: [Hive connection, Fetch dynamic settings, Java version, NameNode networking, DataNode networking, YARN services networking, HDFS, MapReduce, Radoop temporary directory, MapReduce staging directory, Spark staging directory, Spark assembly jar existence, UDF jar upload, Create permanent UDFs, HDFS upload, Radoop jar upload, Hive load data, Import job, SQL query (aggregation), Spark job, Hive export, UDF query, Distributed Cache, UDAF query, Job kill]
[Aug 11, 2018 10:28:23 AM]: Running test 1/25: Hive connection
[Aug 11, 2018 10:28:23 AM]: Hive server 2 connection (localhost:10000) test started.
[Aug 11, 2018 10:28:24 AM] SEVERE: Test failed: Hive connection
[Aug 11, 2018 10:28:24 AM]: Cleaning after test: Hive connection
[Aug 11, 2018 10:28:24 AM]: Total time: 1.093s
[Aug 11, 2018 10:28:24 AM]: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/MetaException
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
at com.rapidminer.extension.jdbc.tools.jdbc.DriverAdapter.connect(DriverAdapter.java:40)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at eu.radoop.datahandler.hive.PooledStatement.lambda$connect$0(PooledStatement.java:214)
at eu.radoop.tools.ExceptionTools.check(ExceptionTools.java:301)
at eu.radoop.datahandler.hive.PooledStatement.connect(PooledStatement.java:189)
at eu.radoop.datahandler.hive.StatementPool.getStatement(StatementPool.java:314)
at eu.radoop.datahandler.hive.StatementPool.getStatement(StatementPool.java:181)
at eu.radoop.datahandler.hive.HiveHandler.runFastQueryPrivilegedAction(HiveHandler.java:1010)
at eu.radoop.datahandler.hive.HiveHandler.runFastQuery(HiveHandler.java:986)
at eu.radoop.datahandler.hive.HiveHandler.getSimpleListFast(HiveHandler.java:2575)
at eu.radoop.datahandler.hive.HiveHandler.getTableList(HiveHandler.java:2507)
at eu.radoop.connections.service.test.connection.TestHiveConnection.call(TestHiveConnection.java:51)
at eu.radoop.connections.service.test.connection.TestHiveConnection.call(TestHiveConnection.java:25)
at eu.radoop.connections.service.test.RadoopTestContext.lambda$runTest$1(RadoopTestContext.java:279)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.metastore.api.MetaException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 20 more
[Aug 11, 2018 10:28:24 AM] SEVERE: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/metastore/api/MetaException
[Aug 11, 2018 10:28:24 AM] SEVERE: Hive server 2 connection test failed. Please check that the server/daemon runs and is accessible on the address and port you specified.
[Aug 11, 2018 10:28:24 AM] SEVERE: Test failed: Hive connection
[Aug 11, 2018 10:28:24 AM] SEVERE: Integration test for 'hadoop' failed.
Is there any solution?
When I run jps, I have this:
1472 NameNode
3856 DataNode
12756 Jps
14164 SparkSubmit
14228 NodeManager
7544 GUILauncher
10860 RunJar
12524 ResourceManager
Thanks for the help.