java-hadoop-hive-user

Subject: RE: not able to access Hive Web Interface
Date: Wed, 18 Jul 2012 06:41:16 +0000
Hi all :-),
I am trying to access the Hive Web Interface but it fails.
I have made these changes in hive-site.xml:
************************************************************************************************
<configuration>
    <property>
        <name>hive.hwi.listen.host</name>
        <value>0.0.0.0</value>
        <description>This is the host address the Hive Web Interface will 
listen on</description>
    </property>
    <property>
        <name>hive.hwi.listen.port</name>
        <value>9999</value>
        <description>This is the port the Hive Web Interface will listen 
on</description>
    </property>
    <property>
        <name>hive.hwi.war.file</name>
        <value>/HADOOP/hive/lib/hive-hwi-0.8.1.war</value> <!-- /HADOOP/hive is the Hive directory -->
        <description>This is the WAR file with the jsp content for Hive Web 
Interface</description>
    </property>
</configuration>
***********************************************************************************************
I also exported the ANT lib:
export ANT_LIB=/Yogesh/ant-1.8.4/lib
export PATH=$PATH:$ANT_LIB
Now when I run the command
hive --service hwi
it results in:
12/07/17 18:03:02 INFO hwi.HWIServer: HWI is starting up
12/07/17 18:03:02 WARN conf.HiveConf: DEPRECATED: Ignoring hive-default.xml 
found on the CLASSPATH at /HADOOP/hive/conf/hive-default.xml
12/07/17 18:03:02 FATAL hwi.HWIServer: HWI WAR file not found at 
/HADOOP/hive/lib/hive-hwi-0.8.1.war
And if I run
hive --service hwi --help
it results in:
Usage ANT_LIB=XXXX hive --service hwi
Although when I go to the /HADOOP/hive/lib directory, I find that both files are present:
1) hive-hwi-0.8.1.war
2) hive-hwi-0.8.1.jar
What am I doing wrong :-( ?
Please help and suggest.
Greetings
Yogesh Kumar
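A hedged aside on the FATAL above: in the stock hive-default.xml, hive.hwi.war.file defaults to the relative path lib/hive-hwi-0.8.1.war and is described as relative to ${HIVE_HOME}, so HWI joins the two paths before checking for the file. An absolute value can therefore fail the existence check even though the WAR is on disk. A small shell sketch of the difference (the temp directory simulates the layout; /HADOOP/hive stands in for the real install root):

```shell
# HWI effectively checks ${HIVE_HOME}/<hive.hwi.war.file> (Java's
# `new File(home, war)`), so an absolute value gets re-rooted under HIVE_HOME.
HIVE_HOME=$(mktemp -d)                     # stand-in for /HADOOP/hive
mkdir -p "$HIVE_HOME/lib"
touch "$HIVE_HOME/lib/hive-hwi-0.8.1.war"  # simulate the installed WAR

ABS=/HADOOP/hive/lib/hive-hwi-0.8.1.war    # value from the hive-site.xml above
REL=lib/hive-hwi-0.8.1.war                 # relative form used by hive-default.xml

[ -f "$HIVE_HOME/$ABS" ] && echo "absolute: found" || echo "absolute: not found"
[ -f "$HIVE_HOME/$REL" ] && echo "relative: found" || echo "relative: not found"
```

If that is the cause, changing the value in hive-site.xml to lib/hive-hwi-0.8.1.war should let HWI find the WAR.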
________________________________
From: Gesli, Nicole [[email protected]]
Sent: Wednesday, July 18, 2012 12:50 AM
To: [email protected]; 
bejoy_ks-/[email protected]
Subject: Re: DATA UPLOADTION
For the Hive query approach, check the string functions (https://cwiki.apache.org/confluence/display/Hive/LanguageManual+UDF#LanguageManualUDF-StringFunctions) or write your own (UDF), if needed. It depends on what you are trying to get.
Example:
SELECT TRIM(SUBSTR(data,
                   LOCATE('this', LOWER(data)),
                   LOCATE('that', LOWER(data)) + 4 - LOCATE('this', LOWER(data)))) AS my_string
FROM   log_table
WHERE  LOWER(data) LIKE '%this%and%that%'
From: Bejoy KS <bejoy_ks-/[email protected]>
Reply-To: [email protected], bejoy_ks-/[email protected]
Date: Monday, July 16, 2012 11:39 PM
To: [email protected]
Subject: Re: DATA UPLOADTION
Hi Yogesh
You can connect reporting tools like Tableau, MicroStrategy, etc. directly with Hive.
If you are looking for static reports based on aggregate data, you can process the data in Hive, move the resultant data into some RDBMS, and use common reporting tools over the same. I know quite a few projects following this model.
Regards
Bejoy KS
Sent from handheld, please excuse typos.
________________________________
From: <[email protected]>
Date: Tue, 17 Jul 2012 06:33:43 +0000
To: <[email protected]>; <bejoy_ks-/[email protected]>
Reply-To: [email protected]
Subject: RE: DATA UPLOADTION
Thanks Gesli and Bejoy,
I have created tables in Hive and uploaded data into them. I can perform queries on them; please suggest how I can generate reports from those tables.
Mr. Gesli,
If I create a table with a single string column, like ( create table Log_table( Data STRING); ), then how can I perform condition-based queries over the data in Log_table?
Thanks & Regards :-)
Yogesh Kumar
________________________________
From: Gesli, Nicole [[email protected]]
Sent: Monday, July 16, 2012 11:30 PM
To: [email protected]; bejoy_ks-/[email protected]
Cc: user-50Pas4EWwPEyzMRdD/[email protected]
Subject: Re: DATA UPLOADTION
If you are just trying to find certain text in the data files, and you just want to do a bulk process to create reports once a day or so, and prefer to use Hive: you can create a table with a single string column. You need to pre-process your data to replace the default column delimiter in your data, or you can define a column delimiter that your data does not contain. That makes sure the entire line of data is assigned to the column and is not cut where the column delimiter occurs. If your query will be different for each file type (flat files, logs, xls, …) you can create different partitions for each file type. Dump your files into the table (or table partition) folder(s), or create external table(s) if your data is already in HDFS. You can then do a "like" (faster) or "rlike" search on the table.
-Nicole
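A minimal HiveQL sketch of the single-column approach described above; the table name, partition values, and HDFS path are illustrative, not taken from the thread:

```sql
-- One STRING column per line; pick a field delimiter the data never
-- contains ('\001', Hive's Ctrl-A default, usually works for text logs)
CREATE TABLE log_table (data STRING)
PARTITIONED BY (file_type STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001';

-- Move files already in HDFS into a per-type partition
-- (or define an EXTERNAL TABLE over the existing directories instead)
LOAD DATA INPATH '/user/yogesh/raw/app.log'
INTO TABLE log_table PARTITION (file_type = 'log');

-- LIKE does plain substring matching; RLIKE takes a Java regex
SELECT data
FROM   log_table
WHERE  file_type = 'log'
  AND  LOWER(data) LIKE '%error%';
```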
From: Bejoy KS <bejoy_ks-/[email protected]>
Reply-To: [email protected], bejoy_ks-/[email protected]
Date: Monday, July 16, 2012 12:50 AM
To: [email protected]
Cc: user-50Pas4EWwPEyzMRdD/[email protected]
Subject: Re: DATA UPLOADTION
Hi Yogesh
If you are looking at an indexing-and-search kind of operation, you can take a look at Lucene.
Whether you are using Hive or HBase, you cannot do any operation without having a table structure defined for the data. So you need to create tables for each dataset; only then can you go ahead and issue queries and generate reports on that data.
Regards
Bejoy KS
Sent from handheld, please excuse typos.
________________________________
From: <[email protected]>
Date: Mon, 16 Jul 2012 06:21:15 +0000
To: <[email protected]>
Reply-To: [email protected]
Cc: <user-50Pas4EWwPEyzMRdD/[email protected]>
Subject: RE: DATA UPLOADTION
Hello Debarshi,
Please suggest what tool I should use for these operations over Hadoop DFS.
Regards
Yogesh Kumar
________________________________
From: Debarshi Basak [debarshi.basak-/[email protected]]
Sent: Monday, July 16, 2012 11:25 AM
To: [email protected]
Cc: [email protected]; user-50Pas4EWwPEyzMRdD/[email protected]
Subject: Re: DATA UPLOADTION
Hive is not the right way to go about it if you are planning to do search-kind operations.
Debarshi Basak
Tata Consultancy Services
Mailto: debarshi.basak-/[email protected]
Website: http://www.tcs.com
____________________________________________
Experience certainty. IT Services
Business Solutions
Outsourcing
____________________________________________
----- wrote: -----
To: <[email protected]>
From: <[email protected]>
Date: 07/16/2012 09:11AM
Cc: <user-50Pas4EWwPEyzMRdD/[email protected]>
Subject: DATA UPLOADTION
Hi all,
I have data in flat files, log files, images, and .xls files, amounting to many GB.
I need to run operations like searching and querying over that raw data, and generate reports.
It is impossible to create tables manually for all of them. Is there any other way out, or how can I manage them using Hive or HBase?
Please suggest how I can perform these operations over them. I want to use HADOOP DFS, and the files have been uploaded to HDFS (single user).
Thanks & Regards
Yogesh Kumar
Please do not print this email unless it is absolutely necessary.
The information contained in this electronic message and any attachments to 
this message are intended for the exclusive use of the addressee(s) and may 
contain proprietary, confidential or privileged information. If you are not the 
intended recipient, you should not disseminate, distribute or copy this e-mail. 
Please notify the sender immediately and destroy all copies of this message and 
any attachments.
WARNING: Computer viruses can be transmitted via email. The recipient should 
check this email and any attachments for the presence of viruses. The company 
accepts no liability for any damage caused by any virus transmitted by this 
email.
www.wipro.com
=====-----=====-----=====
Notice: The information contained in this e-mail
message and/or attachments to it may contain
confidential or privileged information. If you are
not the intended recipient, any dissemination, use,
review, distribution, printing or copying of the
information contained in this e-mail message
and/or attachments to it are strictly prohibited. If
you have received this communication in error,
please notify us by reply e-mail or telephone and
immediately and permanently delete the message
and any attachments. Thank you