CCA175 Exam Questions 2019

If you want to prepare with the latest CCA175 exam dumps for 2019 and pass the CCA Spark and Hadoop Developer Exam - Performance Based Scenarios exam on the first attempt, then my recommendation is to prepare with the CertsMarket.com CCA175 exam dumps.

Best Preparation Material for CCA175 Exam at CertsMarket

You can get 100% verified CCA175 answers and updated CCA175 exam prep material that will boost your preparation for the CCA175 exam. CertsMarket provides the best preparation material, along with CCA175 practice test questions and their solutions, that will empower you to pass your CCA Spark and Hadoop Developer Exam - Performance Based Scenarios exam easily.

Passing the CCA175 certification exam in 2019 is not a piece of cake. Most Cloudera Certified Associate (CCA) CCA175 exam candidates want to pass the exam with minimum effort, but it requires hard work and firm determination to succeed. You need some skills and a great deal of practice with CCA175 sample questions, solving them against the verified answers provided by CertsMarket.com.

Cloudera CCA175 Exam

CCA Spark and Hadoop Developer Exam

Questions & Answers Demo

Question: 1

Problem Scenario 1:

You have been given MySQL DB with following details.

user=retail_dba

password=cloudera

database=retail_db

table=retail_db.categories

jdbc URL = jdbc:mysql://quickstart:3306/retail_db

Please accomplish the following activities.

1. Connect to the MySQL DB and check the content of the tables.

2. Copy the "retail_db.categories" table to HDFS, without specifying a directory name.

3. Copy the "retail_db.categories" table to HDFS, in a directory named "categories_target".

4. Copy the "retail_db.categories" table to HDFS, in a warehouse directory named "categories_warehouse".

Answer:

Solution :

Step 1 : Connect to the existing MySQL database: mysql --user=retail_dba --password=cloudera retail_db

Step 2 : Show all the available tables: show tables;

Step 3 : View/count data from a table in MySQL: select count(1) from categories;

Step 4 : Check the currently available data in the HDFS directory: hdfs dfs -ls

Step 5 : Import a single table (without specifying a directory).

sqoop import --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --table=categories

Note : Please check that you do not have a space before or after the '=' sign. Sqoop uses the MapReduce framework to copy data from the RDBMS to HDFS.
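Because the copy runs as a map-only MapReduce job, the number of mappers controls both the parallelism and how many part-m-* output files you get. A minimal sketch, assuming category_id is the table's primary key and can serve as the split column:

sqoop import --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --table=categories --split-by category_id -m 4

This would produce four output files, part-m-00000 through part-m-00003, in the categories directory.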

Step 6 : Read the data from one of the partition files created by the above command: hdfs dfs -cat categories/part-m-00000

Step 7 : Specify the target directory in the import command (we are using number of mappers = 1; you can change it accordingly): sqoop import --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --table=categories --target-dir=categories_target -m 1

Step 8 : Check the content in one of the partition files.

hdfs dfs -cat categories_target/part-m-00000

Step 9 : Specify a parent directory so that you can copy more than one table into a specified target directory. Command to specify the warehouse directory:

sqoop import --connect jdbc:mysql://quickstart:3306/retail_db --username=retail_dba --password=cloudera --table=categories --warehouse-dir=categories_warehouse -m 1
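Note : The practical difference between the two options is where the files land. With --target-dir=categories_target the part files are written directly into categories_target/, while --warehouse-dir=categories_warehouse creates a subdirectory per table, so the same data ends up in categories_warehouse/categories/. You can confirm with hdfs dfs -ls categories_warehouse/categories.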


Question: 2

Problem Scenario 2 :

There is a parent organization called "ABC Group Inc", which has two child companies named Tech Inc and MPTech.

Both companies' employee information is given in two separate text files, as below. Please perform the following activities on the employee details.

Tech Inc.txt

1,Alok,Hyderabad

2,Krish,Hongkong

3,Jyoti,Mumbai

4,Atul,Banglore

5,Ishan,Gurgaon

MPTech.txt

6,John,Newyork

7,alp2004,California

8,tellme,Mumbai

9,Gagan21,Pune

10,Mukesh,Chennai

1. Which command will you use to check all the available command line options on HDFS, and how will you get help for an individual command?

2. Create a new empty directory named Employee using the command line, and also create an empty file named Techinc.txt in it.

3. Load both companies' employee data into the Employee directory (how to override an existing file in HDFS).

4. Merge both the employees' data into a single file called MergedEmployee.txt; the merged file should have a newline character at the end of each file's content.

5. Upload the merged file to HDFS and change the file permissions on the HDFS merged file, so that the owner and group members can read and write, and other users can read the file.

6. Write a command to export an individual file as well as the entire directory from HDFS to the local file system.

Answer:

Solution :

Step 1 : Check all the available commands: hdfs dfs

Step 2 : Get help on an individual command: hdfs dfs -help get

Step 3 : Create a directory named Employee in HDFS and create a dummy file in it, e.g. Techinc.txt: hdfs dfs -mkdir Employee

Now create an empty file in the Employee directory using Hue.
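If you prefer to stay on the command line instead of using Hue, a quick alternative is touchz, which creates a zero-length file: hdfs dfs -touchz Employee/Techinc.txt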

Step 4 : Create a directory on the local file system and then create the two files with the data given in the problem.

Step 5 : We now have an existing directory with content in it. Using the HDFS command line, override this existing Employee directory while copying these files from the local file system to HDFS:

cd /home/cloudera/Desktop/

hdfs dfs -put -f Employee

Step 6 : Check that all files in the directory were copied successfully: hdfs dfs -ls Employee

Step 7 : Now merge all the files in the Employee directory: hdfs dfs -getmerge -nl Employee MergedEmployee.txt
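The -nl option appends a newline character after each file's content in the merged output, which is exactly what requirement 4 asks for.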

Step 8 : Check the content of the file. cat MergedEmployee.txt

Step 9 : Copy the merged file into the Employee directory from the local file system to HDFS: hdfs dfs -put MergedEmployee.txt Employee/

Step 10 : Check whether the file was copied or not: hdfs dfs -ls Employee

Step 11 : Change the permissions of the merged file on HDFS: hdfs dfs -chmod 664 Employee/MergedEmployee.txt
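Mode 664 translates to rw-rw-r--: read and write for the owner and group, read-only for other users, which matches requirement 5. You can verify the new permissions with hdfs dfs -ls Employee/MergedEmployee.txt.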

Step 12 : Get the file from HDFS to the local file system: hdfs dfs -get Employee Employee_hdfs
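The same command also exports an individual file; a minimal sketch, assuming you want the merged file in the current local directory: hdfs dfs -get Employee/MergedEmployee.txt .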


Question: 3

Problem Scenario 5 : You have been given the following MySQL database details.

user=retail_dba

password=cloudera

database=retail_db

jdbc URL = jdbc:mysql://quickstart:3306/retail_db

Please accomplish the following activities.

1. List all the tables from retail_db using a sqoop command.

2. Write a simple sqoop eval command to check whether you have permission to read the database tables or not.

3. Import all the tables as avro files in /user/hive/warehouse/retail_stage.db

4. Import the departments table as a text file in /user/cloudera/departments.

Answer:

Solution:

Step 1 : List tables using sqoop.

sqoop list-tables --connect jdbc:mysql://quickstart:3306/retail_db --username retail_dba --password cloudera

Step 2 : Eval command; just run a count query on one of the tables.

sqoop eval \

--connect jdbc:mysql://quickstart:3306/retail_db \

--username retail_dba \

--password cloudera \

--query "select count(1) from order_items"
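If the count prints successfully, you have read access to the table; a missing grant would instead surface as a JDBC error from MySQL, which makes eval a quick permission check before running a full import.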

Step 3 : Import all the tables as avro files.

sqoop import-all-tables \

--connect jdbc:mysql://quickstart:3306/retail_db \

--username=retail_dba \

--password=cloudera \

--as-avrodatafile \

--warehouse-dir=/user/hive/warehouse/retail_stage.db \

-m 1
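Note : With --warehouse-dir, each imported table gets its own subdirectory (for example /user/hive/warehouse/retail_stage.db/categories). When importing as Avro, Sqoop also leaves the generated .avsc schema files in your current local working directory; these come in handy if you later define Hive tables over the Avro data.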

Step 4 : Import the departments table as a text file in /user/cloudera/departments.

sqoop import \

--connect jdbc:mysql://quickstart:3306/retail_db \

--username=retail_dba \

--password=cloudera \

--table departments \

--as-textfile \

--target-dir=/user/cloudera/departments

Step 5 : Verify the imported data.

hdfs dfs -ls /user/cloudera/departments

hdfs dfs -ls /user/hive/warehouse/retail_stage.db

hdfs dfs -ls /user/hive/warehouse/retail_stage.db/products

To get access to all questions, please click ....