
Nagarjuna Damarla

Phone: +91-9941618664 e-mail: [email protected]

Professional Summary:

3+ years of overall IT experience in application development using Big Data (Hadoop), Java and BI technologies.

Proficient working experience with Hadoop components such as HDFS, MapReduce, Hive, Pig, HBase, Sqoop and Flume.

Good design skills in writing MapReduce programs; involved in writing Pig and Hive scripts to reduce job execution time.

Able to handle exporting and importing data to and from other databases through Sqoop.

Developed an API to interact with MySQL data using Java Swing.

Good communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.

Interested in learning new concepts to keep up to date with technology trends. Smart-working and enthusiastic.

Knowledge of Flume and NoSQL databases such as MongoDB.

Received appreciation from clients and the Q1 Quarterly Award (2015) for my contribution to the project.

Professional Experience: Programmer Analyst at Cognizant Technology Solutions, Hyderabad, India, since 2013.

Education: Master of Computer Applications, Osmania University.

Technical Skills:

Big Data               : MapReduce, Pig, Hive, HBase, Sqoop
Java/J2EE Technologies : Swing, JDBC, OJDBC
Frameworks             : Hadoop
Java IDEs              : Eclipse and NetBeans
Databases              : SQL, MySQL, MongoDB
Operating Systems      : Windows XP, 2000, 2003, Unix and Linux

Project Details:

Project #1
Title       : Target Re-hosting of WebIntelligence Project
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, Unix, PHP, MySQL
Role        : Hadoop Developer
Hardware    : Virtual Machines, UNIX
Duration    : March 2015 – Present


Description:

The purpose of the project is to store terabytes of log information generated by the e-commerce website and extract meaningful information from it. The solution is based on the open-source Big Data software Hadoop. The data is stored in the Hadoop file system and processed using MapReduce jobs, which in turn include getting the raw HTML data from the websites, processing the HTML to obtain product and pricing information, extracting various reports from the product pricing information, and exporting the information for further processing.

This project is mainly a re-platforming of the existing system, which runs on WebHarvest (a third-party JAR) against a MySQL database, onto Hadoop, a technology that can process large data sets (i.e. terabytes and petabytes of data), in order to meet the client's requirements in the face of increasing competition from other retailers.
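
As a rough illustration of the extraction step described above, below is a minimal, map-only MapReduce sketch in Java. The input layout (retailer|productId|rawHtml), the price pattern and the class names are assumptions made for this example, not the project's actual code or schema.

    import java.io.IOException;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class ProductPriceExtractor {

        public static class ExtractMapper extends Mapper<LongWritable, Text, Text, Text> {
            // Assumed price pattern, e.g. "$29.99" embedded in the raw HTML.
            private static final Pattern PRICE = Pattern.compile("\\$([0-9]+(\\.[0-9]{2})?)");

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Assumed input layout: retailer|productId|rawHtml (one record per line).
                String[] parts = value.toString().split("\\|", 3);
                if (parts.length < 3) {
                    return; // skip malformed records
                }
                Matcher m = PRICE.matcher(parts[2]);
                if (m.find()) {
                    // key: retailer,productId ; value: first price found in the HTML
                    context.write(new Text(parts[0] + "," + parts[1]), new Text(m.group(1)));
                }
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "product-price-extract");
            job.setJarByClass(ProductPriceExtractor.class);
            job.setMapperClass(ExtractMapper.class);
            job.setNumReduceTasks(0); // map-only: one parsed record per crawl line
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Running with zero reducers keeps the job embarrassingly parallel, since each crawl record can be parsed independently.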

Contributions:

1. Moved all crawl-data flat files generated from various retailers to HDFS for further processing.
2. Wrote Apache Pig scripts to process the HDFS data.
3. Created Hive tables to store the processed results in tabular format.
4. Developed Sqoop scripts to enable the interaction between Pig and the MySQL database.
5. Involved in requirement gathering, design, development and testing.
6. Wrote CLI commands using HDFS.
7. Fully involved in the Hadoop, Hive, Pig and MySQL installation and setup.
8. Analyzed log files to understand user behavior.
9. Unit tested the MapReduce and Pig scripts.

Project #2
Title       : Device Fault Prediction
Environment : Hadoop, MapReduce, Hive, Sqoop, Pig
Role        : Hadoop Developer
Hardware    : Virtual Machines, UNIX
Duration    : Jan 2014 – Feb 2015

Description:

Cisco's support team deals on a day-to-day basis with huge volumes of issues related to their network products, such as routers and switches. The support teams have been operating on a reactive model, i.e. acting on the customer tickets/queries being raised. Hence, to improve customer satisfaction, Cisco would like the system to predict network faults based on the logs generated by the various network devices, i.e. by loading the logs into a Hadoop cluster and analyzing them using machine learning algorithms implemented in Apache Mahout or custom built.

Responsibilities:

1. Moved all log files generated by various network devices into an HDFS location.
2. Wrote MapReduce code that takes the log files as input, parses the logs and structures them in tabular format to facilitate effective querying of the log data (a simplified sketch of this step follows the list).
3. Created an external Hive table on top of the parsed data.
4. Developed Sqoop scripts to enable the interaction between Pig and the MySQL database.


5. Involved in requirement gathering, design, development and testing.
6. Analyzed log files to understand user behavior.
7. Unit tested the MapReduce and Pig scripts.
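
The tabular structuring mentioned in item 2 can be sketched with a minimal mapper. The syslog-style line layout, the field positions and the class name below are assumptions for the example, not the actual device log schema.

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class DeviceLogParseMapper extends Mapper<LongWritable, Text, NullWritable, Text> {

        private final Text out = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Assumed raw line layout: "<timestamp> <deviceId> <severity> <message ...>"
            String[] fields = value.toString().split("\\s+", 4);
            if (fields.length < 4) {
                return; // drop lines that do not match the assumed layout
            }
            // Emit tab-separated columns so an external Hive table can sit on top.
            out.set(fields[0] + "\t" + fields[1] + "\t" + fields[2] + "\t" + fields[3]);
            context.write(NullWritable.get(), out);
        }
    }

Declaring a Hive external table with ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' over the job's output directory then makes the parsed logs directly queryable.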

Project #3

Title       : Endeca iPlus 3.1
Environment : Endeca 3.1, SQL Developer
Role        : Endeca 3.1 Developer
Hardware    : Virtual Machines, UNIX
Duration    : Mar 2013 – Dec 2013

Description:

To deliver the best incentive system, meet the needs of customers and dealers, and provide a state-of-the-art system that allows the business to be more efficient and flexible and to capture increased sales, market share and profit, the existing SIMS R2.2 business processes have been modified, and new functionalities and new reports have been added. Business Intelligence has been introduced to provide better reporting solutions through Endeca. The following are the key changes that have been implemented in ISYS.

Responsibilities:

1. Created Endeca pages with their respective components, such as charts, results tables and crosstabs.

2. Fine-tuned queries for better performance.

3. Configured the instances according to the business rules and filter conditions.

4. Involved in requirement gathering and analysis.

5. Validated the Endeca 3.1 reports against the 2.2.1 reports.

6. Prepared unit test cases for the reports.