
Apache Sqoop Online Assessment Test

Topics Covered

  • Sqoop-Based Connector
  • Disk I/O and Network I/O
  • MySQL Databases
  • Database Tables
  • Sqoop Import and Export Commands

Useful for hiring

  • Hadoop Developer - Sqoop
  • Big Data Developer - Sqoop

Start hiring job-fit candidates using this assessment


"Finding quality talent is a challenge more so when you consider the sheer number of resumes we get each year. To top this, the time we spent on our recruitment process was humongous. Interview Mocha helped us to cut down on our candidate filtration time by 40%, making it our preferred assessment tool."

Pedro Furtado, Altran,
Capacity Manager

When running a sqoop command with --hive-import, Sqoop cannot figure out the column names and we hit a NullPointerException. How can we fix this?

11/09/21 17:18:49 INFO manager.OracleManager: Time zone has been set to GMT
11/09/21 17:18:49 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
11/09/21 17:18:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM addlabel_pris t WHERE 1=0
11/09/21 17:18:49 DEBUG manager.OracleManager$ConnCache: Caching released connection for jdbc:oracle:thin:
11/09/21 17:18:49 ERROR sqoop.Sqoop: Got exception running Sqoop:
at com.cloudera.sqoop.hive.TableDefWriter.getCreateTableStmt(
at com.cloudera.sqoop.hive.HiveImport.importTable(
at com.cloudera.sqoop.tool.ImportTool.importTable(
at com.cloudera.sqoop.Sqoop.runSqoop(
at com.cloudera.sqoop.Sqoop.runTool(
at com.cloudera.sqoop.Sqoop.runTool(
at com.cloudera.sqoop.Sqoop.main(

    • Specify the user name that Sqoop is connecting as in upper case.

    • Specify the table name you are working with in upper case.

    • This happens because the client used the wrong Sqoop command.

    • None of the mentioned
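
As a hedged illustration of the upper-case fix above, a minimal Oracle import might look like the following; the host, SID, user, and table names are placeholders, not values from the log:

```shell
# Oracle stores unquoted identifiers in upper case, so passing the
# username and table name in upper case lets Sqoop resolve the column
# metadata and avoid the NullPointerException shown above.
sqoop import \
  --connect jdbc:oracle:thin:@db.example.com:1521:ORCL \
  --username SCOTT \
  --password-file /user/scott/.oracle.pwd \
  --table ADDLABEL_PRIS \
  --hive-import
```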


When we import data into Hive using --hive-home <dir>, which setting will take the highest priority?

    • All of the mentioned

    • Sqoop will use its own Hive location

    • The current environment variable will take the highest priority

    • <dir> will take the highest priority


What is the purpose of the --direct-split-size argument when we import data from an RDBMS to Hadoop?

    • Split the input stream every n bytes when importing in direct mode

    • Split the input stream every n rows when importing in direct mode

    • Split the input stream every n mappers when importing in direct mode

    • Split the input stream every n times when importing in direct mode
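
A hedged example of a direct-mode import using this argument; the database, credentials, and table are placeholders:

```shell
# --direct uses the database's native dump utility instead of JDBC;
# --direct-split-size splits the resulting input stream every n bytes,
# here every 64 MB (67108864 bytes).
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username reporter \
  --table orders \
  --direct \
  --direct-split-size 67108864
```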

Good News! You can customize this test as per your requirement

  • Choose and add questions from Interview Mocha question libraries

  • Add your own set of questions

  • Customize test settings like duration, number of questions, passing score, web proctoring and much more

  • Mail us at to create a custom test

Looking for a tailor-made test, to suit your assessment needs?

Get in touch

Global companies using Mocha assessments

  • altran
  • credit suisse
  • sephora
  • nielsen
  • capgemini
  • akamai

How Mocha assessments are brewed

Speed up your IT recruitment with Interview Mocha

  • 500+ IT skill tests
  • 20+ coding languages
  • 50+ domain knowledge tests
  • 100+ digital 2.0 skill tests
  • Customer focus aptitude tests
  • Enterprise ready features & more...

Why hiring managers and recruiters across the globe love Interview Mocha

Take your first step to hire job-fit candidates