Common Hadoop questions, errors and their answers

Many common questions and queries related to Hadoop are resolved here.


AutoSys for Hadoop

AutoSys Workload Automation enhances visibility and control of complex workloads across platforms, ERP systems, and the cloud. It helps reduce the cost and complexity of managing mission-critical business processes, ensuring consistent and reliable service delivery. The video below shows how application developers and workload administration teams can automate and orchestrate jobs across hundreds of enterprise systems such as SAP and Oracle alongside your Hadoop jobs, using a single console. CA Workload Automation Advanced Integration for Hadoop gives you a single console that lets you run multiple workflows and manage parallel jobs across your Hadoop jobs and your traditional jobs.
Helpful content: https://www.youtube.com/watch?v=mjvLlYxvKb4


etc/hadoop/slaves: No such file or directory
Download a stable version of Hadoop.
The configuration files are looked up relative to the $HADOOP_HOME path, so make sure that $HADOOP_HOME is set up properly in ~/.bashrc.
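A minimal sketch of the relevant ~/.bashrc entries (the install path here is only an example; adjust it to your own installation):

    # Example ~/.bashrc entries (the install path is an assumption; use your own)
    export HADOOP_HOME=/usr/local/hadoop
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

After editing, run source ~/.bashrc or open a new shell so the variables take effect.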


How to run Hadoop commands in Java?
How to run a Hadoop Java program?
How to run a Hadoop Java program in Eclipse?
You can use the FileSystem API in your Java code to interact with HDFS. Below is sample code that checks whether a particular directory exists in HDFS and, if it does, deletes it.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Load the Hadoop configuration from the classpath (core-site.xml, hdfs-site.xml)
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Delete the directory recursively if it already exists
    Path outputPath = new Path("/user/cloudera/hdfsPath");
    if (fs.exists(outputPath)) {
        fs.delete(outputPath, true);
    }

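To run a Hadoop Java program like the one above from the command line (rather than from Eclipse), one common approach is to compile it against the Hadoop classpath, package it into a jar, and launch it with hadoop jar. A minimal sketch, assuming the snippet is wrapped in a class named HdfsConnect with a main method (the file, jar, and class names are placeholders):

    # Compile against the Hadoop classpath, package into a jar, and run it
    mkdir -p classes
    javac -classpath "$(hadoop classpath)" -d classes HdfsConnect.java
    jar cf hdfsconnect.jar -C classes .
    hadoop jar hdfsconnect.jar HdfsConnect

From Eclipse, the same code can be run as a plain Java application as long as the Hadoop client jars and configuration files are on the project's build path.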