Hitachi Vantara Pentaho Community Forums

Thread: Importing data in Hive through sqoop import in PDI

  1. #1
    Join Date
    May 2014
    Posts
    3

    Default Importing data in Hive through sqoop import in PDI

    I want to import a database table from MySQL into Hive through PDI.

    Following is my Sqoop job:

    [Attachment: Screenshot-2.jpg — Sqoop job configuration]

    When I run the job, the table is successfully imported into HDFS, but not into Hive. I can view the contents of the table in HDFS; the job throws an error when copying the contents from HDFS to Hive. Here is the error message I get:

    [Attachment: Screenshot-3.jpg — error message]

    Thanks in advance..
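    For reference, the PDI Sqoop Import job entry builds a command line roughly equivalent to the following standalone invocation. The table name `append` and the NameNode host come from the log later in the thread; the database name and username are placeholders. Running this from a shell is a quick way to test the import outside of Spoon:

    ```shell
    # Hypothetical standalone equivalent of the PDI Sqoop Import entry.
    # "mydb" and "myuser" are placeholders; adjust to your MySQL setup.
    # --hive-import tells Sqoop to load the imported HDFS files into Hive
    # after the MapReduce job completes.
    sqoop import \
      --connect jdbc:mysql://172.18.2.62/mydb \
      --username myuser -P \
      --table append \
      --hive-import
    ```

    Note that `-P` prompts for the password interactively instead of passing it on the command line, which avoids the "Setting your password on the command-line is insecure" warning that appears in the log.
    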

  2. #2
    Join Date
    Apr 2008
    Posts
    1,771

    Default

    Don't post screenshots; they are unreadable.
    Copy and paste the error message instead.
    -- Mick --

  3. #3
    Join Date
    May 2014
    Posts
    3

    Default

    Hi Mick,
    Here is my error message:

    Spoon - Logging goes to file:///tmp/spoon_73daf1d2-e24b-11e3-aecf-ad8daa6585fb.log
    2014/05/23 12:55:45 - class org.pentaho.agilebi.platform.JettyServer - WebServer.Log.CreateListener localhost:10002
    2014/05/23 12:55:49 - Version checker - OK
    2014/05/23 12:56:06 - RepositoriesMeta - Reading repositories XML file: /root/.kettle/repositories.xml
    2014/05/23 12:56:24 - Spoon - Spoon
    2014/05/23 12:58:11 - Spoon - Spoon
    2014/05/23 13:00:16 - Spoon - Starting job...
    2014/05/23 13:00:16 - Test_Sqoop_Import - Start of job execution
    2014/05/23 13:00:16 - Test_Sqoop_Import - Starting entry [Sqoop Import]
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - Setting your password on the command-line is insecure. Consider using -P instead.
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - Using Hive-specific delimiters for output. You can override
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - delimiters with --fields-terminated-by, etc.
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - Preparing to use a MySQL streaming resultset.
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - Beginning code generation
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - Executing SQL statement: SELECT t.* FROM `append` AS t LIMIT 1
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - Executing SQL statement: SELECT t.* FROM `append` AS t LIMIT 1
    2014/05/23 13:00:16 - Sqoop Import - 2014/05/23 13:00:16 - $HADOOP_HOME is not set
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - Writing jar file: /tmp/sqoop-root/compile/87277e132ab6e54f985d199dd95158e6/append.jar
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - It looks like you are importing from mysql.
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - This transfer can be faster! Use the --direct
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - option to exercise a MySQL-specific fast path.
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - Setting zero DATETIME behavior to convertToNull (mysql)
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - Beginning import of append
    2014/05/23 13:00:17 - Sqoop Import - 2014/05/23 13:00:17 - SQOOP_HOME is unset. May not be able to find all job dependencies.
    2014/05/23 13:00:18 - Sqoop Import - 2014/05/23 13:00:18 - BoundingValsQuery: SELECT MIN(`ID`), MAX(`ID`) FROM `append`
    2014/05/23 13:00:18 - Sqoop Import - 2014/05/23 13:00:18 - Running job: job_201404211156_0136
    2014/05/23 13:00:19 - Sqoop Import - 2014/05/23 13:00:19 - map 0% reduce 0%
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - map 100% reduce 0%
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Job complete: job_201404211156_0136
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Counters: 18
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Job Counters
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - SLOTS_MILLIS_MAPS=11191
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Total time spent by all reduces waiting after reserving slots (ms)=0
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Total time spent by all maps waiting after reserving slots (ms)=0
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Launched map tasks=4
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - SLOTS_MILLIS_REDUCES=0
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - File Output Format Counters
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Bytes Written=210
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - FileSystemCounters
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - HDFS_BYTES_READ=396
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - FILE_BYTES_WRITTEN=233279
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - HDFS_BYTES_WRITTEN=210
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - File Input Format Counters
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Bytes Read=0
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Map-Reduce Framework
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Map input records=13
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Physical memory (bytes) snapshot=250331136
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Spilled Records=0
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - CPU time spent (ms)=2280
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Total committed heap usage (bytes)=141361152
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Virtual memory (bytes) snapshot=1544785920
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Map output records=13
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - SPLIT_RAW_BYTES=396
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Transferred 210 bytes in 7.0113 seconds (29.9516 bytes/sec)
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Retrieved 13 records.
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Executing SQL statement: SELECT t.* FROM `append` AS t LIMIT 1
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Removing temporary files from import process: hdfs://172.18.2.62:9000/user/root/append/_logs
    2014/05/23 13:00:24 - Sqoop Import - 2014/05/23 13:00:24 - Loading uploaded data into Hive
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : 2014/05/23 13:00:24 - Encountered IOException running import job: java.io.IOException: Cannot run program "hive": error=2, No such file or directory
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) :
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.Runtime.exec(Runtime.java:617)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.Runtime.exec(Runtime.java:528)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.util.Executor.exec(Executor.java:76)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:361)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:314)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:226)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:49)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.hadoop.shim.common.CommonSqoopShim.runTool(CommonSqoopShim.java:44)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.hadoop.shim.hdp12.ClassPathModifyingSqoopShim.access$001(ClassPathModifyingSqoopShim.java:41)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.hadoop.shim.hdp12.ClassPathModifyingSqoopShim$1.call(ClassPathModifyingSqoopShim.java:79)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.hadoop.shim.hdp12.ClassPathModifyingSqoopShim$1.call(ClassPathModifyingSqoopShim.java:76)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.hadoop.shim.hdp12.ClassPathModifyingSqoopShim.runWithModifiedClassPathProperty(ClassPathModifyingSqoopShim.java:63)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.hadoop.shim.hdp12.ClassPathModifyingSqoopShim.runTool(ClassPathModifyingSqoopShim.java:76)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.di.job.entries.sqoop.AbstractSqoopJobEntry.executeSqoop(AbstractSqoopJobEntry.java:259)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at org.pentaho.di.job.entries.sqoop.AbstractSqoopJobEntry$1.run(AbstractSqoopJobEntry.java:231)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.Thread.run(Thread.java:724)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : Caused by: java.io.IOException: error=2, No such file or directory
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.UNIXProcess.forkAndExec(Native Method)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.UNIXProcess.<init>(UNIXProcess.java:135)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.ProcessImpl.start(ProcessImpl.java:130)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
    2014/05/23 13:00:24 - Sqoop Import - ERROR (version 4.4.2, build 1 from 2013-12-09 16.07.28 by buildguy) : ... 22 more
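    The key line in the trace above is `Cannot run program "hive": error=2, No such file or directory`. After the MapReduce import finishes, Sqoop's Hive load step shells out to the `hive` CLI on the machine running Spoon, and errno 2 (ENOENT) means that binary is not on the PATH of the Spoon process. A typical fix is to export the Hive (and Hadoop) locations before launching Spoon; the install paths below are hypothetical, so adjust them to where Hive actually lives on your node:

    ```shell
    # Hypothetical install locations; adjust to your machine.
    export HIVE_HOME=/usr/local/hive
    export HADOOP_HOME=/usr/local/hadoop   # the log also warns $HADOOP_HOME is not set
    export PATH="$HIVE_HOME/bin:$HADOOP_HOME/bin:$PATH"

    # Launch Spoon from this same shell so the Sqoop job entry
    # inherits the updated PATH and can fork the hive process:
    sh spoon.sh
    ```

    If Hive is only installed on the cluster and not on the workstation running Spoon, the `hive` client must be installed locally (or the job run on a node that has it) before the `--hive-import` step can succeed.
    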


Copyright © 2005 - 2019 Hitachi Vantara Corporation. All Rights Reserved.