APACHE HUE: Hue is an open-source web UI for the Hadoop ecosystem. It connects to Hadoop ecosystem tools such as HDFS, Hive, Pig, Impala, and Spark through an interactive web interface.
Hue depends on the following packages:
gcc, g++, libxml2-dev, libxslt-dev, libsasl2-dev, libsasl2-modules-gssapi-mit,
libmysqlclient-dev, python-dev, python-setuptools, libsqlite3-dev, ant,
libkrb5-dev, libtidy-0.99-0, libldap2-dev, libssl-dev, libgmp3-dev.
Installing the packages: to install all of these packages, run the commands given below.
sudo apt-get update
sudo apt-get install gcc g++ libxml2-dev libxslt-dev libsasl2-dev libsasl2-modules-gssapi-mit libmysqlclient-dev python-dev python-setuptools libsqlite3-dev ant libkrb5-dev libtidy-0.99-0 libldap2-dev libssl-dev libgmp3-dev
Installation and Configuration
Perform the installation as the hadoop user (if you have a dedicated Hadoop admin user, use that account).
su - hduser
Download Hue from gethue.com (the link below is an example taken from the Hue website):
wget https://dl.dropboxusercontent.com/u/730827/hue/releases/4.1.0/hue-4.1.0.tgz
Extract the downloaded tarball
tar -xvf hue-4.1.0.tgz
Execute the install commands:
cd hue-4.1.0
make install
Note: if you receive an error like "c/_cffi_backend.c:15:17: fatal error: ffi.h: No such file or directory", run the commands below to resolve it:
sudo apt-get update
sudo apt-get install libffi-dev
Once the above process is completed, update your ~/.bashrc file:
export HUE_HOME=/home/hadoop/hue
export PATH=$PATH:$HUE_HOME/build/env/bin
Source the file after adding the entries:
source ~/.bashrc
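As a quick sanity check, the two exports can be tried directly in the current shell. The path below is an assumption (it supposes Hue was extracted under your home directory as hue-4.1.0); point HUE_HOME at your actual install location:

```shell
# Assumed location; change to wherever you extracted and installed Hue.
export HUE_HOME="$HOME/hue-4.1.0"
export PATH="$PATH:$HUE_HOME/build/env/bin"

# Confirm the variable is set and the bin directory is on PATH.
echo "HUE_HOME=$HUE_HOME"
case ":$PATH:" in
  *":$HUE_HOME/build/env/bin:"*) echo "PATH ok" ;;
  *) echo "PATH missing" ;;
esac
```

If "PATH missing" is printed, re-check the lines you added to ~/.bashrc and source it again.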
Configure Hue (hue.ini, then the Hadoop conf files)
- Make these changes in hue.ini. Open the conf directory first:
cd $HUE_HOME/desktop/conf
[desktop]
server_user=hduser
server_group=hduser
default_user=hduser
default_hdfs_superuser=hduser
app_blacklist=impala,security
fs_defaultfs=hdfs://localhost:50070 (set this to the fs.defaultFS value from your core-site.xml; 50070 is the NameNode HTTP/WebHDFS port, while many setups use hdfs://localhost:9000 here)
hadoop_conf_dir=$HADOOP_CONF_DIR
resourcemanager_host=localhost
resourcemanager_api_url=http://localhost:8088
hive_server_host=localhost
hive_server_port=10000
hive_conf_dir=/usr/local/hive2.2/conf
proxy_api_url=http://localhost:8088
history_server_api_url=http://localhost:19888
hbase_clusters=(Cluster|localhost:9090)
oozie_url=http://localhost:11000/oozie
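Note that these keys do not all live under [desktop]; hue.ini groups them into sections. A sketch of where the values above belong (section names are the standard hue.ini ones; the paths and hosts are the example values from this guide):

```ini
[desktop]
  server_user=hduser
  server_group=hduser
  default_user=hduser
  default_hdfs_superuser=hduser
  app_blacklist=impala,security

[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      fs_defaultfs=hdfs://localhost:50070
      hadoop_conf_dir=$HADOOP_CONF_DIR
  [[yarn_clusters]]
    [[[default]]]
      resourcemanager_host=localhost
      resourcemanager_api_url=http://localhost:8088
      proxy_api_url=http://localhost:8088
      history_server_api_url=http://localhost:19888

[beeswax]
  hive_server_host=localhost
  hive_server_port=10000
  hive_conf_dir=/usr/local/hive2.2/conf

[hbase]
  hbase_clusters=(Cluster|localhost:9090)

[liboozie]
  oozie_url=http://localhost:11000/oozie
```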
Make changes in the Hadoop conf files:
Open the conf directory and edit the files given below.
cd $HADOOP_CONF_DIR
- core-site.xml
Note: replace hduser with your Hadoop username.
<property>
<name>hadoop.proxyuser.hduser.hosts</name>
<value>*</value>
</property>
<property>
<name>hadoop.proxyuser.hduser.groups</name>
<value>*</value>
</property>
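A quick way to confirm both proxyuser entries made it into the file is to grep for them. The sketch below runs against a throwaway copy of the snippet so it can be tried anywhere; on a real cluster, point grep at $HADOOP_CONF_DIR/core-site.xml instead:

```shell
# Self-contained sketch: write the proxyuser snippet to a temp file and
# count the entries (hduser is a placeholder for your Hadoop username).
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
<property>
  <name>hadoop.proxyuser.hduser.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hduser.groups</name>
  <value>*</value>
</property>
EOF
count=$(grep -c "hadoop.proxyuser" "$tmp")
echo "proxyuser entries found: $count"
rm -f "$tmp"
```

Both the hosts and groups properties must be present; with only one of the two, Hue's file browser will fail with an impersonation error.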
- hdfs-site.xml
Note: enabling webUI for Hadoop so that Hue can get access.
<property>
<name>dfs.webhdfs.enabled</name>
<value>true</value>
</property>
- httpfs-site.xml
Note: here we grant HDFS access to the Hue user we created earlier.
<property>
<name>httpfs.proxyuser.hduser.hosts</name>
<value>*</value>
<description>This property allows Hue to access HDFS.</description>
</property>
<property>
<name>httpfs.proxyuser.hduser.groups</name>
<value>*</value>
<description>This property allows Hue to access HDFS.</description>
</property>
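A malformed XML edit is a common reason Hadoop daemons refuse to start after these changes. A small sketch of a well-formedness check (it uses python3's standard library and a temp copy of the httpfs fragment so it is runnable standalone; on a real cluster, parse the actual file under $HADOOP_CONF_DIR):

```shell
# Sketch: verify the edited XML is well-formed before restarting Hadoop.
# httpfs-site.xml wraps its properties in a <configuration> root element.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
<configuration>
  <property>
    <name>httpfs.proxyuser.hduser.hosts</name>
    <value>*</value>
  </property>
</configuration>
EOF
result=$(python3 -c "import sys, xml.etree.ElementTree as ET; ET.parse(sys.argv[1]); print('well-formed')" "$tmp")
echo "$result"
rm -f "$tmp"
```

If the parse fails, python3 prints a ParseError with the line and column of the problem, which is usually faster to act on than the daemon's startup log.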
Starting Hadoop and HiveServer2 for Hue:
Start Hadoop:
start-all.sh
Start Hive server:
$HIVE_HOME/bin/hiveserver2
Start Hue
nohup supervisor &
or
$HUE_HOME/build/env/bin/hue runserver
Log in to the Hue web interface: http://localhost:8888
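The 8888 port and bind address come from hue.ini. If you need Hue on a different port, or reachable from other hosts, these are the standard [desktop] keys (the values shown are the defaults):

```ini
[desktop]
  # Address and port the Hue web server listens on.
  http_host=0.0.0.0
  http_port=8888
```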