I am trying to install Oozie 4.0.1, following http://www.thecloudavenue.com/2013/10/installation-and-configuration-of.html
Hadoop version: 2.4.0
Maven: 3.0.4
Sqoop: 1.4.4
..........
[INFO] Apache Oozie HCatalog Libs ........................ SUCCESS [0.399s]
[INFO] Apache Oozie Core ................................. FAILURE [7.819s]
[INFO] Apache Oozie Docs ................................. SKIPPED
.........
[ERROR] Failed to execute goal on project oozie-core: Could not resolve dependencies for project org.apache.oozie:oozie-core:jar:4.0.0: The following artifacts could not be resolved: org.apache.oozie:oozie-hadoop-test:jar:2.4.0.oozie-4.0.0, org.apache.oozie:oozie-hadoop:jar:2.4.0.oozie-4.0.0, org.apache.oozie:oozie-sharelib-oozie:jar:4.0.0-cdh5.0.2, org.apache.oozie:oozie
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :oozie-core
Has anyone tried Oozie 4.0.1 with Hadoop 2.4.0? How can I fix this problem?
I am facing the same problem.
Try these installation steps; they worked for me. Change the versions in the steps below depending on which versions you need.
STEP 1 : Extract the tar file using tar -xvf oozie-4.0.1.tar.gz
STEP 2 : Rename oozie-4.0.1 to oozie using the command below.
mv oozie-4.0.1 oozie
STEP 3 : Move to the oozie/bin directory using cd oozie/bin, then build Oozie for
Hadoop 2.x using the command below.
./mkdistro.sh -DskipTests -Dhadoopversion=2
Before building Oozie we must change the versions for Java, Hive, Pig,
and Sqoop in the pom.xml file.
Java - 1.7
Hive - 0.13.0
Pig - 0.12.1
Sqoop - 1.4.3
Eg : <javaVersion>1.7</javaVersion>
<targetJavaVersion>1.7</targetJavaVersion>
<hive.version>0.13.0</hive.version>
<pig.version>0.12.1</pig.version>
<pig.classifier></pig.classifier>
<sqoop.version>1.4.3</sqoop.version>
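The version bumps above can be scripted. Here is a minimal sketch, assuming GNU sed, demonstrated on a stand-in fragment (in a real build, point sed at the top-level pom.xml of the extracted Oozie source instead):

```shell
# Demo fragment standing in for the real pom.xml (assumption: in practice
# you would edit oozie/pom.xml directly).
cat > pom-fragment.xml <<'EOF'
<hive.version>0.10.0</hive.version>
<pig.version>0.9.0</pig.version>
<sqoop.version>1.4.1</sqoop.version>
EOF

# Bump each version property in place (GNU sed).
sed -i 's|<hive.version>.*</hive.version>|<hive.version>0.13.0</hive.version>|' pom-fragment.xml
sed -i 's|<pig.version>.*</pig.version>|<pig.version>0.12.1</pig.version>|' pom-fragment.xml
sed -i 's|<sqoop.version>.*</sqoop.version>|<sqoop.version>1.4.3</sqoop.version>|' pom-fragment.xml

cat pom-fragment.xml
```

This is safer than hand-editing when you rebuild repeatedly with different ecosystem versions.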
If the build succeeds you will get a message like
Oozie distro created, DATE[2014.01.05-18:55:14GMT] VC-REV[unavailable], available at
[/home/labuser/oozie/distro/target]
Now use the expanded Oozie located in
/home/labuser/oozie/distro/target/oozie-4.0.1-distro/oozie-4.0.1
STEP 4 : Create a libext directory in the expanded Oozie and copy the
Hadoop 2.2.0 jar files and the extjs zip file to the libext directory.
STEP 5 : Set these properties in the Hadoop core-site.xml file.
Eg : <property>
<name>hadoop.proxyuser.labuser.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.labuser.groups</name>
<value>*</value>
</property>
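Note that "labuser" in the properties above is the Unix user the Oozie server runs as. A small sketch to generate the snippet for whichever user you are (the file name proxyuser-snippet.xml is just an illustration; paste its contents into core-site.xml):

```shell
# Substitute the current Unix user into the proxyuser property names.
U=$(whoami)
cat > proxyuser-snippet.xml <<EOF
<property>
  <name>hadoop.proxyuser.$U.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.$U.groups</name>
  <value>*</value>
</property>
EOF
# Paste the generated block into Hadoop's core-site.xml.
cat proxyuser-snippet.xml
```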
Set this property in the oozie-site.xml file located in the conf directory:
<property>
<name>oozie.service.JPAService.create.db.schema</name>
<value>true</value>
</property>
By default it is false; change it to true.
Step 6 : Now prepare the Oozie war file. Move to the expanded oozie/bin
directory and run the command below.
./oozie-setup.sh prepare-war
If you get an error like zip: command not found, install
zip using the command sudo apt-get install zip.
Then run the prepare-war command again. If the
war file is created successfully you will get a message like
INFO: Oozie is ready to be started
Step 7 : Upload the sharelib folder from the expanded Oozie to HDFS using the
command below.
./oozie-setup.sh sharelib create -fs hdfs://localhost:8020
Step 8 : Create a database for Oozie using the command
./oozie-setup.sh db create -run
If database created then you will get the message like
setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
Validate DB Connection
DONE
Check DB schema does not exist
DONE
Check OOZIE_SYS table does not exist
DONE
Create SQL schema
DONE
Create OOZIE_SYS table
DONE
Oozie DB has been created for Oozie version '4.0.0'
Step 9 : Start the oozie using ./oozied.sh start
Step 10 : Check the status of Oozie using the command below.
./oozie admin -oozie http://localhost:11000/oozie -status
You will get a message like System mode: NORMAL
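As an alternative to the CLI, the same status can be queried over Oozie's REST API (this assumes the default port 11000 and a running server):

```shell
# Query the admin status endpoint; a healthy server returns JSON
# like {"systemMode":"NORMAL"}
curl http://localhost:11000/oozie/v1/admin/status
```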
Issues Faced with this installation
1. While building the hive-0.13.0 share library of Oozie, there is an unresolvable dependency 'hive-builtins'.
Cause: The hive-builtins jar is required by hive-0.10.0, but hive-0.13.0 ships no hive-builtins.jar.
Solution: Removed the hive-builtins dependency.
2. While building Oozie, we faced a java.lang.OutOfMemoryError.
Cause: This error signals that the JVM running Maven has run out of memory; it is caused by the maven-compiler-plugin.
Solution: Edited a maven-compiler-plugin property:
<fork>true</fork>
Fork allows running the compiler in a separate process. If false, it uses the built-in compiler; if true, it uses an external executable.
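For reference, a sketch of what the edited plugin section in pom.xml might look like (the memory values here are assumptions; tune them for your machine):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Compile in a separate JVM process with its own heap. -->
    <fork>true</fork>
    <!-- meminitial/maxmem are only honored when fork is true. -->
    <meminitial>128m</meminitial>
    <maxmem>1024m</maxmem>
  </configuration>
</plugin>
```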
Finally, we made an Oozie build with the above versions of the Hadoop ecosystem components.