HOW TO RUN MAPREDUCE PROGRAMS USING ECLIPSE
Hadoop provides an Eclipse plugin that lets us connect a Hadoop cluster to Eclipse. We can then run MapReduce jobs and browse HDFS from within Eclipse itself. But a few things need to be done first. Normally, it is said that we just have to copy hadoop-eclipse-plugin-*.jar to the eclipse/plugins directory to get things going. Unfortunately, that did not work for me. When I tried to connect Eclipse to my Hadoop cluster, it threw this error:

An internal error occurred during: "Map/Reduce location status updater".
org/codehaus/jackson/map/JsonMappingException

You may face a different error, but it will probably look similar. This happens because some required jars are missing from the plugin that ships with Hadoop. I tried a few things and they worked out, so I thought I would share them in case anybody else is facing the same issue. Just try the steps outlined below.
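For reference, the "normal" procedure mentioned above looks roughly like this. This is only a sketch: ECLIPSE_HOME, HADOOP_HOME, and the contrib path are assumptions that vary by installation and Hadoop version, so adjust them for your setup.

```shell
# Hypothetical paths -- change these to match your installation.
ECLIPSE_HOME=/opt/eclipse
HADOOP_HOME=/usr/local/hadoop

# The commonly suggested step: copy the plugin jar into Eclipse's
# plugins directory. (The jar's location inside the Hadoop tree
# differs between Hadoop versions; contrib/eclipse-plugin is one
# common place in older releases.)
cp "$HADOOP_HOME"/contrib/eclipse-plugin/hadoop-eclipse-plugin-*.jar \
   "$ECLIPSE_HOME"/plugins/

# Restart Eclipse with -clean so it rescans its plugin directory.
"$ECLIPSE_HOME"/eclipse -clean
```

As described above, this alone was not enough in my case because the plugin jar was missing some dependencies, which is what the remaining steps address.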