Tagged: hadoop

How a newline can ruin your Hive

Source: http://marcel.is/how-newline-can-ruin-your-hive/ If you do not fully understand how Hive/Impala stores your data, it might cost you badly. Symptom #1: Weird values in ingested Hive table You double-checked with select distinct(gender) from customers that the gender column in your source RDBMS really contains only values male, female and NULL....
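The failure mode the post describes can be sketched offline. This is not code from the post: it simulates Hive's default TEXTFILE layout (\x01 field separator, \n row terminator, \N for NULL), which is why a newline embedded in a source value silently splits one record into two malformed ones.

```python
# Sketch: why an embedded newline corrupts a Hive TEXTFILE table.
# Hive's default text format uses \x01 as the field separator and \n as
# the row terminator, so a newline *inside* a value splits the record.

rows = [
    ("1", "male"),
    ("2", "fem\nale"),   # value with an embedded newline from the source RDBMS
    ("3", None),
]

def to_textfile(rows):
    """Serialize rows the way Hive's default TEXTFILE format would."""
    NULL = "\\N"  # Hive's default null marker
    return "".join(
        "\x01".join(NULL if f is None else f for f in row) + "\n"
        for row in rows
    )

data = to_textfile(rows)
parsed = [line.split("\x01") for line in data.split("\n") if line != ""]

print(len(rows))    # 3 records written
print(len(parsed))  # 4 "records" read back -- the newline split row 2
```

Reading the file back yields an extra, malformed row ("ale" with no gender column), which is exactly the kind of weird distinct value the post warns about.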

Hadoop hdfs tips and tricks

Finding the active namenode in a cluster # lookup active nn nn_list=`hdfs getconf -namenodes` echo Namenodes found: $nn_list active_node='' #for nn in $( hdfs getconf -namenodes ); do for nn in $nn_list ; do echo...
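The selection logic the shell loop implements can be sketched offline. In a real cluster each state would typically come from `hdfs haadmin -getServiceState` per namenode; here the states are passed in as a dict (with hypothetical host names) so the logic is testable without a cluster.

```python
# Sketch of the active-namenode selection loop, decoupled from hdfs
# commands: given each namenode's reported HA state, pick the active one.

def pick_active_namenode(nn_states):
    """Return the first namenode whose reported HA state is 'active'."""
    for nn, state in nn_states.items():
        if state.strip().lower() == "active":
            return nn
    return None  # no active namenode (cluster may be mid-failover)

# Hypothetical host names for illustration only.
states = {"nn1.example.com": "standby", "nn2.example.com": "active"}
print(pick_active_namenode(states))  # nn2.example.com
```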

Fastest way of compressing file(s) in Hadoop

Compressing files in Hadoop. Okay, well.. it may or may not be the fastest. Email me if you find a better alternative 😉 Short background: the technique uses a simple Pig script and makes Pig use the Tez engine (set the queue name...

Ambari REST Api

Ambari configuration over the REST API. You need to log in to Ambari, then access the URL http://ambari-host:8080/api/v1/services/AMBARI/components/AMBARI_SERVER   Related posts: Adding compression codec to Hortonworks data platform Permanently add jars to hadoop HDFS disk consumption – Find what...
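The login step can be sketched without a live server: Ambari's REST API uses HTTP Basic authentication, so a client just needs the URL plus an Authorization header built from user:password (the host and credentials below are placeholders, and no request is actually sent).

```python
# Sketch: construct the URL and auth headers for an Ambari REST call.
# Ambari uses HTTP Basic auth; write operations also expect an
# X-Requested-By header.
import base64

def ambari_request(host, user, password, path, port=8080):
    url = "http://{}:{}/api/v1{}".format(host, port, path)
    token = base64.b64encode("{}:{}".format(user, password).encode()).decode()
    headers = {"Authorization": "Basic " + token, "X-Requested-By": "ambari"}
    return url, headers

url, headers = ambari_request("ambari-host", "admin", "admin",
                              "/services/AMBARI/components/AMBARI_SERVER")
print(url)
```

From here the pair can be handed to any HTTP client (curl, `requests`, etc.) to make the actual call.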

Computing memory parameters for Namenode

Source: https://discuss.pivotal.io/hc/en-us/articles/203272527-Namenode-failed-while-loading-fsimage-with-GC-overhead-limit-exceeded Namenode failed while loading fsimage with GC overhead limit exceeded Problem During startup namenode failed to load fsimage into memory 2014-05-14 17:36:56,806 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loading image file /data/hadoop/nn/dfs/name/current/fsimage_0000000000252211550 using no compression 2014-05-14 17:36:56,806 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Number of files =...
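A back-of-the-envelope check helps here. This sketch is not from the linked article; it uses the common rule of thumb of roughly 1 GB of namenode heap per million filesystem objects (files + blocks), since each inode/block costs on the order of 150–200 bytes of JVM memory plus GC headroom.

```python
# Rough namenode heap sizing (rule-of-thumb sketch): if the fsimage no
# longer fits in the configured heap, loading it dies with GC overhead
# errors like the one above.

def estimate_nn_heap_gb(num_files, num_blocks, gb_per_million=1.0):
    """Estimate namenode heap in GB from filesystem object counts."""
    objects = num_files + num_blocks
    return (objects / 1_000_000.0) * gb_per_million

# e.g. 60M files averaging ~1.5 blocks each
print(round(estimate_nn_heap_gb(60_000_000, 90_000_000), 1))  # 150.0
```

If the estimate exceeds the `-Xmx` currently given to the namenode, raising the heap (or reducing small-file count) is the usual fix.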

Public datasets

Source: https://github.com/caesar0301/awesome-public-datasets Awesome Public Datasets This list of public data sources is collected and tidied from blogs, answers, and user responses. Most of the data sets listed below are free; however, some are not. Other amazingly awesome lists can be found...

Parsing sqoop logs for stats analysis

The Python code below will help you extract statistics from a set of Sqoop log files for transfer analysis:   #!/usr/bin/env python import fnmatch import os import datetime def find_files(directory, pattern): for root, dirs, files in os.walk(directory): for basename in files: if...
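The stats-extraction step can be sketched like this. The summary-line format is assumed from typical Sqoop 1.4.x import output (e.g. `INFO mapreduce.ImportJobBase: Transferred 9.41 KB in 31.1 seconds (309.8 bytes/sec)`); a regex pulls out the size, unit, and duration.

```python
# Sketch: parse a Sqoop transfer-summary log line into numbers suitable
# for aggregation across many log files.
import re

SUMMARY_RE = re.compile(
    r"Transferred\s+([\d.]+)\s+(\w+)\s+in\s+([\d.]+)\s+seconds"
)

def parse_transfer_stats(line):
    """Return {'size', 'unit', 'seconds'} from a Sqoop summary line, or None."""
    m = SUMMARY_RE.search(line)
    if not m:
        return None
    size, unit, secs = m.groups()
    return {"size": float(size), "unit": unit, "seconds": float(secs)}

line = "INFO mapreduce.ImportJobBase: Transferred 9.41 KB in 31.1 seconds (309.8 bytes/sec)"
print(parse_transfer_stats(line))
```

Feeding every line of each log file found by `find_files` through this parser yields per-job transfer sizes and durations for the analysis.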

Moving a host component from one host to another

NOTE – It is not safe to move components like JournalNode or ZooKeeper. This method is to be used only for stateless components like Storm, Kafka, Falcon or Flume. The following steps will help in moving components from one node...
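The general shape of the move, via Ambari's host_components REST endpoints, can be sketched as a sequence of calls (cluster, component, and host names below are placeholders): delete the component from the old host, register it on the new host, then install and start it there.

```python
# Sketch: the REST call sequence for moving a stateless component between
# hosts, using Ambari's host_components endpoints. Returned as
# (method, path) pairs; executing them is left to an HTTP client.

def move_component_calls(cluster, comp, old_host, new_host):
    base = "/api/v1/clusters/{}/hosts/{{}}/host_components/{}".format(cluster, comp)
    return [
        ("DELETE", base.format(old_host)),  # remove from the old host
        ("POST",   base.format(new_host)),  # register on the new host
        ("PUT",    base.format(new_host)),  # body sets state INSTALLED, then STARTED
    ]

for method, path in move_component_calls("mycluster", "FLUME_HANDLER",
                                         "old.example.com", "new.example.com"):
    print(method, path)
```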

Moving JournalNode service from one machine to another

In case you would like to move the JournalNode service to another host, here are the steps to do so: Put HDFS in safemode: su - hdfs -c 'hdfs dfsadmin -fs hdfs://<active node>:8020 -safemode enter' Execute a save namespace of the...