Posted on 14-Apr-2017
Ankita Nahata Contact Email: ankitanahata.1988@gmail.com
Objective
Develop large-scale data processing systems and data warehousing solutions to solve big data problems
Work experience at Zaloni Inc.
Senior Software Engineer, Oct 2011 – Sept 2015
➢ 4 years' experience leading software development for complex business problems involving large-scale data warehousing, real-time analytics and reporting
➢ Excellent problem-solving, collaboration and communication skills
➢ Conducted training sessions on various technologies and provided mentoring support
Technical Skills
Programming Languages: Java, SQL, Bash, C++
Big Data Technologies: Spark, Hadoop, Hive, HAWQ, HBase, Teradata, MySQL, PostgreSQL, Oracle
Misc: Informatica, Talend, Hibernate, Sqoop, Log4j, Dozer, JUnit, Mockito, Maven
Projects
Project: Sparkify Bedrock Workflow Actions [Client: Zaloni Inc.]
➢ Designed and implemented a system to convert MapReduce actions into Spark actions, allowing customers to choose between execution engines
➢ Performed performance testing of Sparkified actions
➢ Point person in the team for managing QA deliverables and documentation
Project: NBC Universal [Client: NBC Universal]
➢ Designed and implemented a system for transformation and validation of viewership data in Bedrock, Talend and Informatica BDE
➢ Prepared a detailed report comparing Bedrock, Talend and Informatica BDE
Project: Cablevision Hadoop ETL Offload [Client: Cablevision]
➢ Designed and implemented a Hadoop-based replacement for existing ETL jobs
➢ Developed a framework to import data from Netezza into the HAWQ environment and export data back into the destination Netezza environment
➢ Point person in the team for managing QA deliverables and technical documentation
Project: NetApp ASUP ETL [Client: NetApp]
➢ Designed and implemented the metadata to process log files generated by NetApp storage systems
➢ Developed MapReduce jobs to process the data from log files
Project: Bedrock [Client: Zaloni Inc.]
➢ Developed services to create, validate and populate metadata to organize and manage data in the data management platform ‘Bedrock’
➢ Worked on utility scripts to load and export data to and from Teradata and Netezza
Project: Verizon Enterprise Solutions Hadoop ETL [Client: Verizon]
➢ Developed a tool to convert Teradata SQL queries to Hive queries
➢ Developed a tool to import data from Teradata into the Hadoop environment and export data back into Teradata
Education
2009–2011 MCA [Sikkim Manipal University (University topper)]
2006–2009 B.Sc. in Computer Science [Gauhati University (University 2nd topper)]