OpenSource Name: linkedin/linkedin-gradle-plugin-for-apache-hadoop
OpenSource URL: https://github.com/linkedin/linkedin-gradle-plugin-for-apache-hadoop
OpenSource Language: Groovy 96.3%

# LinkedIn Gradle Plugin for Apache Hadoop

The LinkedIn Gradle Plugin for Apache Hadoop (which we shall refer to as simply the "Hadoop Plugin" for brevity) will help you more effectively build, test, and deploy Hadoop applications. In particular, the Plugin will help you easily work with Hadoop applications like Apache Pig and build workflows for Hadoop workflow schedulers such as Azkaban and Apache Oozie.

The Plugin includes the LinkedIn Gradle DSL for Apache Hadoop (which we shall refer to as simply the "Hadoop DSL" for brevity), a language for specifying jobs and workflows for Azkaban.

## Hadoop Plugin User Guide

The Hadoop Plugin User Guide is available at User Guide.

## Hadoop DSL Language Reference

The Hadoop DSL Language Reference is available at Hadoop DSL Language Reference.

## Getting the Hadoop Plugin

The Hadoop Plugin is now published at plugins.gradle.org.
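Applying a plugin from plugins.gradle.org generally follows the Gradle plugins-DSL shape sketched below. The plugin id and version here are placeholders, not the Plugin's actual coordinates; the real snippet is shown on its plugins.gradle.org page.

```groovy
// Sketch only: applying a Gradle plugin published at plugins.gradle.org.
// The id and version below are PLACEHOLDERS -- copy the exact snippet
// from the Plugin's page on plugins.gradle.org instead.
plugins {
  id 'com.example.hadoop-plugin' version '1.0.0'
}
```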
Click on the link for a short snippet to add to your build script.

## Can I Benefit from the Hadoop Plugin and Hadoop DSL?

You must use Gradle as your build system to use the Hadoop Plugin.

If you are using Azkaban, you should start using the Hadoop Plugin immediately, and you should use the Hadoop DSL to develop all of your Azkaban workflows.

If you are using Apache Pig, the Plugin includes features that statically validate your Pig scripts, saving you time by finding errors at build time instead of when you run your Pig script.

If you run Apache Pig or Apache Spark on a Hadoop cluster through a gateway node, the Plugin includes tasks that automate the process of launching your Pig or Spark jobs on the gateway, without you having to manually download your code and dependencies there first.

If you are using Gradle and you feel that you might benefit from any of the above features, consider using the Hadoop Plugin and the Hadoop DSL.

## Example Project

We have added an Example Project that uses the Hadoop Plugin and DSL to build an example Azkaban workflow consisting of Apache Pig, Apache Hive, and Java Map-Reduce jobs.

## Apache Oozie Status

The Hadoop Plugin includes Gradle tasks for Apache Oozie, including the ability to upload versioned directories to HDFS, as well as Gradle tasks for issuing Oozie commands. If you are using Gradle as your build system and Apache Oozie as your Hadoop workflow scheduler, you might find the Hadoop Plugin useful. However, since we are no longer actively using Oozie at LinkedIn, it is possible that the Oozie tasks may fall into a non-working state. Although we started on a Hadoop DSL compiler for Oozie, we did not complete it, and it is currently not in a usable form. We are not currently working on it, and it is unlikely to be completed.

## Recent News
## Project Structure

The project structure is set up as follows:
## Building and Running Test Cases

To build the Plugin and run the test cases, run the build.

### Unit tests

Unit tests are invoked by running the test task.

### Integration tests

Integration tests are invoked by running the integration test task.
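The exact commands were lost from this copy of the README. Assuming the project uses the standard Gradle wrapper, the invocations would typically look like the following sketch; the task names are assumptions, so check the project's build scripts for the real ones.

```shell
# Hypothetical commands, assuming a standard Gradle wrapper setup.
# Task names below (especially the integration-test task) are assumptions;
# consult the repository's build.gradle files for the actual tasks.
./gradlew build       # build the Plugin and run checks
./gradlew test        # run the unit tests
./gradlew integTest   # run the integration tests (task name assumed)
```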