Ravikumar Profile NBA 1
Transcript of Ravikumar Profile NBA 1
Sensitivity: Internal & Restricted
BIO-DATA
1. Name : Dr. Ravikumar G.K.
2. Date of Birth : 18.07.1972
3. Highest Qualifications : B.E., M.Tech. in SACA, Ph.D. in Information Technology
4. Address & Contact No. : Dr. Ravikumar G.K.
Prof. & Director (IT, IPR), R&D Head, Dept. of CSE
Adichunchanagiri University, BGSIT, BG Nagara,
Nagamangala Taluk, Mandya District,
Karnataka, India - 571448
[email protected], [email protected]
+91-9972094249
www.bgsit.ac.in / www.acu.edu.in
5. Academic Performance :
Qualification | University | Institute and place where studied | Year of passing | Class & % marks
Ph.D. (IT) | Dr. MGR University, Chennai | MGRU, Chennai, India | 2012 | NA
M.Tech. (SACA) | Mangalore University | NITK Surathkal, Mangalore | 2000 | FCD / 72
B.E. (Civil) | Bangalore University | SIT, Tumakuru | 1996 | FCD / 74
Experience Details
Designation | Institution / Organization / Industry | Duration
Director (IT, IPR), Research Head (CSE) & Prof. in CSE | BGSIT-ACU, BG Nagar | Feb 2014 – till date (date of joining: 01-02-2014)
Professor and PG Coordinator, Dept. of CSE | Acharya Institute of Technology, Bangalore | Jul 2012 – Jan 2014
Lead Consultant | Wipro Technologies, Bangalore | Sept 2010 – Jul 2012
Technical Lead | IGATE Global Solutions, Bangalore | Dec 2006 – June 2010
Professor & HOD, Dept. of CSE and ISE | SJBIT, Bangalore | Oct 2001 – Dec 2006
Software Engineer | Virgosys Software | July 2000 – Aug 2001
Software Engineer | Virgosys Software | July 1996 – Sep 1998
Nature of Association : Regular
ORCID: 0000-0001-6033-3331 & 0000-0002-0171-4609
Chapter Publications in Springer
• Advances in Artificial Intelligence and Data Engineering - ISBN 978-981-15-3514-7
• Secure and Energy-Efficient Data Transmission - pages 1311-1322
• Intelligent Data Analysis for COVID-19 Pandemic, pp. 347-364 - Tracking and Analysis of Corona Disease Using Intelligent Data Analysis: Online ISBN 978-981-16-1574-0 & Print ISBN 978-981-16-1573-3
• Advances in Artificial Intelligence, Software and Systems Engineering (Proceedings of the AHFE 2021 Virtual Conferences on Human Factors in Software and Systems Engineering, Artificial Intelligence and Social Computing, and Energy, July 25–29, 2021, USA), ISSN 2367-3370, ISSN 2367-3389 (electronic), Lecture Notes in Networks and Systems, ISBN 978-3-030-80623-1, ISBN 978-3-030-80624-8 (eBook), https://doi.org/10.1007/978-3-030-80624-8
VISA STATUS (USA)
• Valid B1/B2 Visa till Jan 29, 2030
• Valid I-797 Petition for H1-B till Jan 31, 2021
Membership of Professional Body/Organization - National and International Level
• Life Member of Computer Society of India - CSI (F-8003014)
• Life Member (Fellowship) of The Institution of Engineers (India) (F-1262900)
• Life Member of Indian Society for Technical Education - ISTE (LM-41034)
• Senior Member of Institute of Electrical and Electronics Engineers - IEEE (96471611)
• Life Member of Indian Science Congress Association (L-38411)
• National Skills Registry - NSR (ITPIN: 801047329461)
� Profile URL: https://vidwan.inflibnet.ac.in/profile/43375
Subjects Taught: Data Structures in C, C++, CCP, UNIX, C/S Computing, Data Warehousing, Data Mining (PG), System Simulation and Modeling, DBMS, ADBMS (PG), Software Testing, Big Data, Analytics & Cloud Computing
Achievements:
Patents | Guest lectures | No. of publications | Industrial awards | Conferences/Seminars | Extracurricular activities | Admin.
23 | 15+ | 70+ | 3 | 08+ | 14+ | 2
Awards & Recognition:
1. Awarded the "National Educational Excellence Award 2020" for "Outstanding Contribution in Educational and Research Domain".
2. Awarded the "Karnataka Educational Award 2021" for "Best Professor in Information Technology".
3. Awarded Best Lateral Hire for the year 2010 at Wipro Technologies.
4. Qualified GATE-98 with AIR 490 (All India Rank).
5. Awarded the "Wipro Tenets Award".
6. Awarded the "SIT Gold Medal" for the 1996 batch.
Area of Specialization:
• Database and Data Warehouse Systems
• Data Mining Applications
• Big Data & Analytics
• Cloud Computing
• Data Masking & Data Aggregation in WSN, Test Data Management
• MDM, Data Migration, Machine Learning & Artificial Intelligence
Reviewer for International Journals:
• International Journal of Big Data, USA
• International Journal of Computer Science and Information Security (IJCSIS), USA
• International Journal of Computer Science and Network Security
• International Journal of Computer Science and Information Technologies
• International Journal on Soft Computing, Artificial Intelligence and Applications (IJSCAI)
• International Journal on Web Service Computing (IJWSC)
• Global Journal of Science & Engineering (Knowvel Journals), London, United Kingdom
Visited Abroad:
• Deputed to USA (Los Angeles) to work with Union Bank of California, USA, for 6 months.
• Deputed to USA (Los Angeles) to work with Union Bank of California, USA, for 3 months.
• Deputed to USA (Dallas) to work on the CVS TDM project, USA, for 1.5 months.
• Deputed to USA (Dallas) to work on an onsite analytics project for 20 months.
• Deputed to USA (San Francisco) to work on an onsite analytics project for 10 months.
• Deputed to USA (Atlanta) to work on the CVS Data Masking project for 1.5 months.
• Visited Thailand (Bangkok) for a customer (K-Bank) presentation for one week.
Patent Details: 20 patents filed; 3 in drafting stage (yet to file)
International (USA): 8
National (India): 12
Idea No. | National/International | Status | Receipt No.
1 | National | Patent filed & receipt copy received | 201941021627
2 | International (USA) | Patent filed & receipt copy received | 16508510
3 | National | Patent filed & receipt copy received | 201941035499
4 | International (USA) | Patent filed & receipt copy received | 16513813
5 | National | Patent filed & receipt copy received | 201941035229
6 | International (USA) | Patent filed & receipt copy received | 16550288
7 | National | Patent filed & receipt copy received | 201941042125
8 | International (USA) | Patent filed & receipt copy received | 16599129
9 | National | Patent filed & receipt copy received | 201941051093
10 | National | Patent filed & receipt copy received | 201941053042
11 | International (USA) | Patent filed & receipt copy received | 17015161
12 | National | Patent filed & receipt copy received | 202041038834
13 | National | Patent filed & receipt copy received | 202041038832
14 | International (USA) | Patent filed & receipt copy received | 17015232
15 | National | Patent filed & receipt copy received | 202041038831
Citation Details: Research Credentials (Google Scholar)
Citations: 214
h-index: 8
i10-index: 5
Research Guiding (Ph.D. in Computer Science and Engineering)
1. Visvesvaraya Technological University (VTU) (State Public Govt. University, India)
Ph.D. Thesis Titles (5 Awarded):
A. "A novel approach for personalized big data security system using soft computing methods"
B. "Optimization of frequent itemset mining for big data streams on retail domain"
C. "A secure data aggregation technique in wireless sensor network"
D. "Performance evaluation of big data cluster using multipath TCP"
E. "Performance analysis of data masking techniques for healthcare domain"
Sl No | Name of the Scholar | USN | University | Research Area | Status
1 | Bindiya M. K. | 4BW14PEM02 | VTU | DW & BI - Big Data | Degree Awarded
2 | Pavitra Bai S. | 4BW14PEM03 | VTU | Big Data & Analytics | Degree Awarded
3 | Chaitra H. V. | 4BW14PEM01 | VTU | Big Data & WSN | Degree Awarded
4 | Shashikala S. V. | 4BW15PEJ02 | VTU | Big Data & N/W | Degree Awarded
5 | Siddartha B. K. | 4BW15PEJ01 | VTU | Data Mining & Masking | Degree Awarded
6 | Govardhanameti | 4BW18PCS01 | VTU | DW & BI - Big Data | Course Work in Progress
7 | Shreevyas H. M. | 7BW19PCS01 | VTU | AI & ML (Blockchain) | Thesis Submitted
8 | Savitha G. | 1BW17PCA01 | VTU | Big Data & Analytics | Thesis Submitted
9 | Shashikala S. | 4BW17PEA03 | VTU | Big Data & Analytics | Comprehensive Viva Completed
2. Adichunchanagiri University (ACU)
Sl No | Name of the Scholar | USN | University | Research Area | Status
1 | Kavitha H. M. | 19PCS001 | ACU | Data Analytics | Course Work Completed
2 | Manu Y. M. | 19PCS002 | ACU | Video Analytics | Course Work Completed
3 | Sindushree H. N. | 19PCS003 | ACU | Big Data & Data Mining | Course Work in Progress
4 | Vivek Veeraiah | 19PCS004 | ACU | Big Data & IoT | Course Work Completed
Patent Details (contd.):
16 | International (USA) | Patent filed & receipt copy received | 17015220
17 | National | Patent filed & receipt copy received | 202041038830
18 | National | Patent filed & receipt copy received | 202041038828
19 | National | Patent filed & receipt copy received | 202041038827
20 | International (USA) | Patent filed & receipt copy received | 17015251
Publication Details:
INTERNATIONAL JOURNALS & CONFERENCES:
“Evaluation of Varying Noise Levels with Geometric Aspects of Roads at Locations of Residential and
Industrial Zones in Bengaluru Metropolitan City” Paper published in International Journal of
Engineering Research & Technology (IJERT) http://www.ijert.org ISSN: 2278-0181 IJERTV10IS090002
(This work is licensed under a Creative Commons Attribution 4.0 International License.) Published by :
www.ijert.org Vol. 10 Issue 09, September-2021
Application of AI in Diagnosing and Drug Repurposing in COVID 19 : Advances in Artificial Intelligence,
Software and Systems Engineering(Proceedings of the AHFE 2021 Virtual Conferences on Human Factors
in Software and Systems Engineering, Artificial Intelligence and Social Computing, and Energy, July 25–
29, 2021, USA) ISSN 2367-3370 ISSN 2367-3389 (electronic) Lecture Notes in Networks and Systems
ISBN 978-3-030-80623-1 ISBN 978-3-030-80624-8 (eBook) https://doi.org/10.1007/978-3-030-80624-8
Internet of Things Applications Enablement, its Challenges and Trends - Paper presented at the International Conference on "Emerging Trends in Circuit Branch Technologies and Applications" (ETCTA-2021), April 3rd–4th, 2021.
The Study of Noise Levels in Commercial and Sensitive Zones of Bengaluru Metropolitan City - Published in Journal of Huazhong University of Science and Technology, Volume 50, Issue 07, 2021, Paper ID: HST-0721-22, ISSN 1671-4512.
Human Characterization using Fingerprint with Deep Learning Technique - Paper accepted for presentation at the 24th World Multi-Conference on Systemics, Cybernetics and Informatics (WMSCI 2020), held July 12–15, 2020, in Orlando, Florida, USA.
Investigation of Network Anomaly Detection Techniques for Distributed Denial of Service Attacks - Published in International Journal of Scientific Research in Engineering and Management, Vol. 04, Issue 04, April 2020. ISSN: 2582-3930.
Reduction of False Positives in Network Intrusion Detection using a Hybrid Classification Approach - Accepted at the International Conference on Smart Technologies in Computing, Electrical and Electronics (ICSTCEE 2020), organized by IEEE.
Framework Design of Deep Detector: A Novel Network Anomaly Detector using Deep Learning
Technique for Detecting DDoS Attacks. Accepted in the IAPR International Conference on Computer
Vision & Image Processing (CVIP2020). Indian Institute of Information Technology Allahabad.
Android Application for Critical Patient Monitoring System - Published in International Journal for Research in Applied Science and Engineering Technology, DOI: 10.22214, Vol. 8, Issue I, Paper ID: IJRASET26498, Jan 2020.
A Survey on Rainfall Analysis Using Big Data Analytics - Accepted at the IAPR International Conference on Computer Vision & Image Processing (CVIP 2020), Indian Institute of Information Technology Allahabad.
Human Characterization using Fingerprint with Deep Learning Technique - Paper accepted for presentation at the International Conference on Human Interaction & Emerging Technologies (IHIET 3: Artificial Intelligence and Computing), Lausanne, Switzerland, Nov 2019.
Survey on Machine Learning Based Video Analytics Techniques - Accepted and galley proof generated in Journal of Computational and Theoretical Nanoscience, Vol. 17, 1–7, 2020, doi:10.1166/jctn.2020.9000.
Fraud detection using Support vector machine published in International Journal of Management,
Technology And Engineering, IJMTE Journal, Vol-9, Issue-3, March 2019.
Energy efficient clustering method for wireless sensor network Published in Indonesian Journal of
Electrical Engineering and Computer Science Vol. 14, No. 2, ISSN: 2502-4752 , May 2019
Secure and Energy Efficient Data Transmission Model for Wireless Sensor Network - Presented at AIDE-19: International Conference on Artificial Intelligence and Data Engineering, May 23–24, 2019.
Secure And Efficient Cluster Based Routing Model For Wireless Sensor Network published in
International Journal of Innovative Technology and Exploring Engineering (IJITEE) ISSN: 2278-3075,
Volume-8 Issue-11, Sept 2019.
Design and Development of Efficient Multipath TCP for Data Center in Public Cloud Published in
International Journal of Recent Technology and Engineering (IJRTE) ISSN: 2277-3878, Volume-8 Issue-4,
November 2019
Compressing Closed Frequent Itemsets with Controlled Information Loss - Presented at the 8th IEEE International Conference on Cloud Computing in Emerging Markets (IEEE CCEM 2019), 19th & 20th September 2019, Bengaluru, India.
A Novel Data Masking Method for Securing Medical Images - Presented at the 2nd IEEE International Conference on Smart Systems and Inventive Technology 2019 (ICSSIT 2019), organized by Francis Xavier Engineering College, Tirunelveli, Nov 2019.
A Novel Data Masking Approach for Preserving Format and Size - Published in IJER, eISSN 2320-592X, Volume 2, Issue 1, Feb 2019.
An Efficient Data Masking for Securing Medical Data using DNA Encoding and Chaotic System - International Journal of Electrical and Computer Engineering (IJECE, ISSN: 2088-8708).
Analysis of Masking Techniques to Find out Security and Other Efficiency Issues in Healthcare Domain - 3rd International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC 2019), SCAD Institute of Technology, Palladam, Tamil Nadu, India.
Cyberbullying Detection Based on Semantic-Enhanced Marginalized Denoising Auto-Encoder - Published in International Journal of Research in Electronics and Computer Engineering (IJRECE), July 2019, Volume 7, Issue 2. Also presented at the First IEEE International Conference on Advances in Information Technology (ICAIT-2019).
Survey on Machine Learning Based Video Analytics Techniques - Presented at the Second International Conference on Recent Innovative Trends in Computer Science and Applications.
An Approach for Big Data Security Based on Hadoop Distributed File System - Presented at the Second International Conference on Emerging Trends in Science & Technologies for Engineering Systems.
Mining Prominent Closed Frequent Itemsets from Data Streams using Dynamic and Adaptive
Minimum Support Threshold presented in 3rd International Conference on Computational Systems
and Information Technology for Sustainable Solutions (CSITSS - 2018) IEEE ISBN: 978-1-5386-6078-2
Survey on MPTCP Approach on Big Data Clusters - Presented and published in an IEEE conference, ISBN: 978-1-5386-5657-0, Aug 2018.
Monitoring the appliances used in health care using Medical Big Data. Presented in International
Conference on Inventive Computation Technologies 2018.
Visualization of Working of DBSCAN and OPTICS Algorithm Using Data Mining Tools published in IJCSIS
VOL 16, No 12 ISSN 1947-5500 – Dec 2018.
A Brief Survey on Data Masking Techniques and Their Limitations. Published in IEEE conference
ICEECCOT-IEEE ISBN:978-1-5386-5130-8/18 – Dec 2018.
Deriving Attribute Relationship for Big Data Security - International Journal of Advanced Research in Computer and Communication Engineering, Vol. 6, Issue 10, ISSN 2278-1021, Oct 2017.
Improving Degree of Accuracy in Mining Significant Closed Frequent Itemsets using Subset Significance
Threshold presented in international IEEE CSITSS 2018 conference Nov -2018.
Subset Significance Threshold: An Effective Constraint Variable for Mining Significant Closed Frequent Itemsets - Presented at the IEMIS-18 International Conference, Kolkata. Advances in Intelligent Systems and Computing (AISC) series, ISSN: 2194-5357; indexed by Scopus, ISI, DBLP, SpringerLink, Meta Press.
A secure and energy efficient cluster optimization by using hierarchical clustering technique: Third
International Conference on Devices, Circuits and Systems (ICDCS'16): 978-1-5090-2309-
7/16/$31.00©2016 IEEE.
Efficient Incremental Itemset Tree for Approximate Frequent Itemset Mining On Data Stream: 978-1-
5090-2399-8/16/$31.00_c 2016 IEEE.2nd International Conference on Applied and Theoretical
Computing and Communication Technology (iCATccT)
An Energy Efficient Cluster Selection Optimization using Evolutionary Imperialist Competitive
Algorithm:International Journal of Computer Applications (0975 – 8887) Volume 127 – No.1, October
2015
Clustering of Dissimilar perception phase constructed for similarity Measures using K means
algorithm:IEEE International Conference on Applied and Theoretical Computing and Communication
Technology. July 2016
Emerging Trends in Real-Time Big Data Analytics - International Journal of Applied Engineering Research, ISSN 0973-4562, Vol. 10, No. 86 (2015). © Research India Publications; http://www.ripublication.com/ijaer.htm
A Survey on Closed Frequent Itemset Mining on Data Streams 978-1-5090-5256-1/16/$31.00_c 2016
IEEE. 2016 2nd International Conference on Contemporary Computing and Informatics (ic3i)
Securing Big Data over Network using MD5 Algorithm Technique:International Journal of Computer
Applications (0975 – 8887) Volume 123 – No.15, August 2015
Smart Web Search Engine Framework with Personalization and Privacy Preservation with
Hadoop:International Advanced Research Journal in Science, Engineering and Technology Vol. 3, Issue 5,
May 2016: ISSN (Online) 2393-8021 ISSN (Print) 2394-1588
Parallel data mining on cloud computing: Institute for Studies on Recent Advance in Science and
Engineering (ISRASE) ISBN: 978-93-84935-26-99
Reviewing Potential Encryption Techniques for Ensuring Image Security: Institute for Studies on Recent
Advance in Science and Engineering (ISRASE) ISBN: 978-93-84935-26-99
An Efficient and Secured Framework for Image Authentication using Invariant Hash Function:
International Journal of Applied Engineering Research (IJAER) ISSN: 0973-4562 Vol.10
A Survey on Multimedia Data Mining and Its Relevance Today - IJCSNS International Journal of
Computer Science and Network Security, VOL.10 No.11, November 2010. 165-170. (IC=5)
Design and Analysis of DWH and BI in Education Domain- IJCSI International Journal of Computer
Science Issues, Vol. 8, Issue 2, March 2011 ISSN (Online): 1694-0814.545-551.(impact factor=0.242)
Analysis of Data Quality Aspects in Data Warehouse Systems. (IJCSIT) International Journal of
Computer Science and Information Technologies, Vol. 2 (1), 2011, 477-485.(IC=5)
Automated Data Validation for Data Migration Security, International Journal of Computer Applications
(0975 – 8887), Volume 30– No.6, September 2011. (Impact factor=0.88)
Cross Industry Survey on Data mining Applications, International Journal of Computer Science and
Information Technologies, Vol. 30 , 2011, 624-628.(IC=5)
A Survey on Recent trends, Process and Development in Data Masking for Testing- IJCSI International
Journal of Computer Science Issues, Vol. 8, Issue 2, March 2011 ISSN (Online): 1694-0814 535-
544.(impact factor=0.242)
A Study on design and analysis of web mart mining and its relevance today-IJEST, ISSN: 0975-5462,
Vol.3 No. 4 Apr 2011. (IC=5)
Design of Data Masking Architecture and Analysis of Data Masking Techniques for Testing - IJEST, ISSN:
0975-5462, Vol. 3 No. 6 June 2011. (IC=5)
Experimental Study of Various Data Masking Techniques with Random Replacement using data
volume, IJCSIS ISSN 1947-5500, Vol. 9 No. 8, August 2011.(impact factor=0.5)
A Study on Dynamic Data Masking with its Trends and Implications. International Journal of Computer
Applications 38(6):19-24, January 2012. Published by Foundation of Computer Science, New York,
USA.(impact factor=0.88)
Realistic Analysis of Data Warehousing and Data Mining Application in Education Domain Paper is
accepted in 2011 3rd International Conference on Machine Learning and Computing (ICMLC 2011)
Singapore, February 26-28, 2011(ICMLC 2011 will be published in the conference proceeding by IEEE,
which will be included in IEEE Xplore, and indexed by INSPEC, EiCompendex and ISI Proceedings; IEEE
Catalog Number: CFP1127J-PRT ISBN: 978-1-4244-9252-7 Paper ID : VIP – ICMLC-C00989-001)
NATIONAL-LEVEL CONFERENCES:
GK Ravikumar, "Overview of Artificial Intelligent Chatbot", presented at NCCC'19 held at Presidency University.
GK Ravikumar, "Intelligent Agriculture using Sensing Technologies and IoT", presented at NCCC'19 held at Presidency University.
GK Ravikumar, "Multipoint Cooperative Transmission for Virtual Reality in 5G New Radio", presented at NCCC'19 held at Presidency University.
GK Ravikumar, “Forming Green Clouds using Green chips.” 27th State level ISTE convention at EWIT in
2013
G K Ravikumar, A Survey of Parallel Data Mining on Cloud Computing: NCETCSE-2015, Dept. of CS & E,
BGSIT, Karnataka, India
GK Ravikumar , A Survey on Big Data Architecture National Conference on Emerging Research Trends in
technology of modern Computing, Applications Dec 2013
GK Ravikumar, An Overview of Emerging Software Defined Networking and Security on Big Data Analytics, National Conference on Emerging Trends in Computer Science and Engineering, April 2015.
G K Ravikumar, "An Architecture of Big Data", "A State Level Technical Symposium", Acharya Institute
of Technology, Bangalore-90.Date:18th April 2013
G K Ravikumar, "Implementation of framework of Big Data", "A State Level Technical Symposium",
Acharya Institute of Technology, Bangalore-90.Date:18th April 2013
Dr. Ravikumar G.K., "12-Lead Compatible Low Power High Performance Electrocardiogram Machine", CSI Convention at Reva Institute of Technology, 2013.
Dr. Ravikumar G.K., "Design Issues of 12-Lead Compatible Low Power High Performance Electrocardiogram", SVIT, Bangalore, 2013.
Invited Talks and Consultancy:
• Research Advisory Committee member for Presidency University, Bangalore
• Research Doctoral Committee member for SJBIT- Visvesvaraya Technological University
• Research Doctoral Committee member for ACIT- Visvesvaraya Technological University
• Research Doctoral Committee member for Dr.AIT- Visvesvaraya Technological University
• Research Doctoral Committee member for M.S.Ramaiah University Bangalore
• Research Doctoral Committee member for BMSIT- Visvesvaraya Technological University:
• Involved in Research Thesis evaluation (Ph.D.) for Visvesvaraya Technological University
• Recognized Ph.D. guide for Computer Science and engineering in Visvesvaraya Technological
University
• Recognized Ph.D. guide for Computer Science and Engineering in Adichunchanagiri University
• Worked as Guest Lecturer for Dr MGR University Chennai Bangalore center for PG Classes.
• Session chair for Data mining papers in national conference held in Acharya institute of
technology, Bangalore.
• Worked as corporate trainer for the new joiners for Tech Mahindra from CMC Bangalore.
• Active member for frame development for architect roles (4.2) in Wipro technologies.
• Conducted Cross Skill Programs to align the skill set for data warehouse in Wipro technologies
• Worked as Project coordinator for 8th semester students in SJBIT, Bangalore.
• Involved in theory evaluation and practical examination work from SJBIT, VTU, Karnataka.
• Guided an M.Phil. student to complete his thesis and findings, and was involved in the validation work.
• Worked as HOD of CSE and ISE for more than 5 years and was involved in developing the department.
• Involved in setting up all the labs for CSE and ISE from 1st semester to 8th semester at SJBIT.
Administration:
1. Working as IT Head & IP Director for Adichunchanagiri University
2. Working as R&D Head (CSE) for Adichunchanagiri University
3. Worked as Prof. & Head of the Department of CSE and ISE from Oct 2001 to Dec 2006 at SJBIT, Bangalore.
ResearchGate : Profile
Google Scholar Profile:
Professional Summary:
• Over 25 years of experience in both IT and research
• Currently trained and working on RPA (Robotic Process Automation)
• Over 12 years of total IT experience in analytics, development, and testing
• Over 12 years of total experience in research and teaching research students
• Strong understanding and experience with Data – data structures, databases & file systems,
relationships, data quality, transformations.
• Experience with Opportunity mining, Pre-sales solutioning, leading proposal teams, Getting
estimations developed, working with multiple contributing teams, bringing innovation, developing
thought-leading content and articulating clear value proposition to client
• Proposal defense, consulting, and delivery for analytics and data centric based proposals
• Account management
• Solid Retail Ecommerce Test management experience in HLS, Retail and BFSI verticals
• Experienced Solution consultant in Test Data Management, DW Testing, BI Testing, Analytics , Data
Migration, Big Data and Master Data Management
• Worked as a Pre-Sales Solutions Architect at Wipro technologies for DW/BI testing projects
(TDM,DW/BI testing, DM testing etc)
• Involved in customer presentation (K- bank) at Bangkok for TDM Project.
• Involved in due diligence activities for TDM and Data Masking projects at onsite (for CVS, USA)
• Involved in Implementation of Data masking project for CVS- USA and Onshore.
• Worked as a Test Lead/Test Manager for BI/OLAP Stream for US Groceries for Wal-Mart, USA
• Worked as Test Lead/Test Manager, responsible for project delivery and project management for Union Bank of California, USA, from Sept 2008 to June 2010.
• Worked as Principal Consultant, responsible for project delivery and project management for CVS Health, USA, from Dec 2015 to June 2017.
• Worked as Data Architect, responsible for project delivery and project management for Kohl's, USA, from Jun 2017 to May 2018.
• Worked as Test Analyst/CSM/TM for an onsite project for Union Bank of California, USA, through IGATE Global Solutions from Dec 2008 to Jul 2009.
• Worked as Test Lead, responsible for project delivery and project management for Swiss Re, Switzerland, from Dec 2006 to Aug 2008.
• Exposure to ETL (Informatica 5.x, 6.x, 7.x & 8.x) / OLAP and BI reporting tools like Ab Initio, Teradata, DataStage, Cognos, OBIEE, and querying via SQL and UNIX
• Conceptual knowledge on OLAP cubes.
• Involved in E2E SIT lifecycle for Clients like Swiss re, UBOC with thorough knowledge of OSS/BSS in
Provisioning of a customer, Customer care & Order management via CRM and Billing the customers.
• Responsible for preparation of Test Strategy, Test plan, Test Summary Reports and other artifacts
• Successfully prepared and delivered the Test Objective Matrix & preparation and review of Test
cases.
• Sets standards and identifies, recommends and implements changes to enhance the effectiveness of
quality assurance strategies
• Involved in different testing phases like Sanity, Functionality, Regression, Front End GUI,
Database/Data warehouse testing, System testing, Integration testing, UAT, E2E telecom testing
• Effectively implemented and utilized test method tools (Eg: OA) in a timely manner consistent with
project deadlines.
• Excellent communication, analytical, problem solving and technical skills
Experienced and familiar with tool, process and domain like:
• Informatica ILM
• IBM Optim
• Trained in Big Data technology
• Expertise in databases like Oracle, MS SQL Server and DB2
• Expert in test management tools like Quality Center 10.0 and QC ALM
• Strong Project Management skills. Experienced in Microsoft Project Management 2010
• Python Scripting
• Excellent Microsoft power point presentation skills
Experience Summary:
Aug 2018 – May 2019
Role: Solutions Consultant for Data Warehouse, Test Data Management and Big Data
Description: Managing solutions team of Wipro’s Data Assurance Practice
Responsible for:
• Automation using Python programming
• Responsible for Bigdata, Hadoop and analytics solutions
• Responsible for solution architecture of the Wipro’s DCT practice
• Solution Strategy of Data Warehouse, BI, Analytics, Test Data Management testing strategy
• Respond to RFPs in the BI space
• Provide leadership and vision to the practice in the BI testing space
• Pre-sales consultant
• Analyzing the RFP/RFI/Pro-active proposals for the scope of DW/BI, MDM, TDM
and Data Migration projects
• Proposal writing in Agile methodology, Waterfall model and V-model
• Estimation for DW/BI, MDM, TDM and Data Migration projects
• Resource Loading by maintaining Onsite/offshore ratio and bulge ratio
• Customer interaction and presentation for the proposals
• Due Diligence for TDM and DW/BI Projects.
• Project Management
• Managed a team of 20 members and handled several end to end deliveries.
• Played Quality Controller role for GBS IDS project.
• Initiated a Project Review Team to assess the quality of deliverables.
• Developed Project Management Reports and Metrics Reports on a monthly
basis.
• Played an active role in mentoring the Graduate Trainees.
Delivery and project:
• Involved in the requirement gathering phase for major development and testing activities.
• Strong in analysis and technical architecture - ETL (Informatica), design/analysis of data warehouses, data integration, data migration
• Involved in data model design for the Banking, Finance and Healthcare domains.
• Designed ETL applications using Informatica in the Banking, Finance and Healthcare domains.
• Developed mappings to load Dimension & Fact tables as per STAR schema
Domain knowledge:
• Retail, Insurance, Banking and Finance, Manufacturing, Health Care, Energy and Utility.
PRE-SALES HIGHLIGHTS AT WIPRO
Designation: Sr. Pre-sales Consultant for Test Data Management, Data Migration, Data Warehouse, BI Reports, Master Data Management.
Verticals: Energy and Utility, BFSI (Banking, Finance, Security and Insurance), Health Care, Communication and Media, RCTG (Retail, Transport, Insurance), Manufacturing and Hi-Tech
Description:
• Solution for the RFPs
• Identifying the dependencies, risk and Mitigation plans
• Customer visits and Value propositions
• Value propositions to each sub vertical – in alignment to field ,vertical and Solution Definition
• Involving in RFP meetings with internal and external customers, Proposal Writing.
• Estimation for the scope of DCT testing and providing WBS structure
• Estimation and Resource Loading
• Due Diligence
• Customer presentations for the proposal.
• Consultant for DCT related support for internal and external customers.
Responsibilities include
Business pursuit
1. Ensure timely delivery of quality proposal/RFI responses globally
2. Manage/oversee strategic and complex proposals/presentations
3. Exhibit consultative skills to help the team with response stands on tricky and complex response sections/topics in proposal/deck context
4. Crisis/recovery management - getting proposals/work pieces back on track at short notice; execution of work pieces in a short time window
5. Review and guide on pricing models, estimations and solution/differentiator positioning
Participation in client presentations at offshore and onsite
Team management
1. Mentoring, ensuring growth and sustenance of interest at work
2. Training sessions – conduct and arrange
Practice perspective
1. Continuous refinement of the DCT knowledge repository
2. Maintenance and reporting of the practice pipeline
3. Refinement/update of existing frameworks/concepts
4. Crafting of new frameworks/concepts
DELIVERY HIGHLIGHTS
ASSIGNMENT 1:
Customer : Charles Schwab Corporation USA
Industry : Banking
Project Name : IDW
Project Type : Quality Assurance(Testing)
Skills : Big Data, Teradata, Hadoop
Environment : Unix/Windows
Role : Test manager at offshore/onsite
A central repository, IDW, is built over all trading engines to provide a warehouse for global reporting. IDW is
designed as a platform for management reporting and decision-support applications. The project uses the tools
Talend, Informatica, Teradata and the HDFS file system.
Responsibility:
• Work in an Agile development life cycle and follow test-driven development
• Work on projects related to ETL (extract, transform and load), Big Data, Hadoop and data warehousing
• Understand, implement and update the Agile development and test strategy
• Attend daily stand-ups and update the scrum team on completed tasks and the next tasks planned
• Work with tools such as Talend Studio, Informatica PowerCenter, SQL Server, Teradata, mainframe and
automation tools like Informatica DVO
• Interact with business users and source teams to gather requirements and understand business and data
requirements
• Convert business requirements into technical specifications and ensure the technical specifications meet the
business requirements
• Develop automation scripts using tools like Informatica DVO (ETL tool), shell scripts, SQL scripts and BTEQs
as per the technical specifications, best practices and customer standards
• Attend sprint-planning sessions: prioritize the stories (work), estimate stories for preparing scripts, use
cases, test execution and product-owner review
• Build ETL extracts, support Schwab during acceptance testing and production deployment, participate in
release-management meetings and adhere to the process for production installation and metrics reporting.
• Work with source systems to set up data transfer mechanisms and procedures and confirm SLAs for
source data; define test strategies, test plans and test cycles and carry out detailed data-level validation using
SQL and the Informatica DVO tool.
• Perform history data loading and testing
• Understand the customer’s framework, products, product technology and work practices
• Prepare complex SQL and HQL scripts for data validation in Hive, Teradata and other databases
• Provide root-cause analysis and recovery methods for major/critical system issues.
• Develop new functions/methods to the reusable components and accelerate automated test
development process by leveraging existing automation code assets.
ASSIGNMENT 2:
Customer : Kohl’s Department Stores USA
Industry : Retail
Project Name : Kohl’s DWH
Project Type : Quality Assurance(Testing)
Skills : Big Data, Teradata, Hadoop
Environment : Unix/Windows
Role : Test manager at Onsite
The scope of work involves establishing a Quality Center of Excellence (QCoE) to cater to the QA
requirements of the Analytics and Personalization domain in Kohl’s Technology. The project involves testing
of Kohl's Data Platform, an Enterprise Data Warehouse built on Teradata; legacy DB systems such as
Oracle Exadata (PL/SQL); the Customer Insight appliance warehouse built on IBM Netezza; and the Customer
Master Data Management warehouse built on Oracle. All the data in these systems is integrated into
Google Cloud Platform using Kohl’s own Hadoop cluster running on the Google Dataproc
engine. The project involves development of a Big Data cloud data hub and the migration of Big Data
(Hadoop) to Google Cloud Platform (GCP). Kohl’s data is visualized via MicroStrategy and Tableau.
Key Projects Handled at Kohl’s
o Omni Channel Projects like Buy Online Pickup from Store, Marketplace companion, Loyalty and
Sales Hub.
▪ This project spans all the client’s lines of business, where the EDW (Teradata) is the end
point of all data coming from various source systems. The ETL tool used is Informatica
PowerCenter, with automation through Informatica DVO.
o Informatica Upgrade v9.1 to v10.1
▪ Informatica PowerCenter was upgraded from v9.1 to v10.1, which involved around
3,000 SLA-bound jobs to be tested and deployed with enhanced security by
upgrading service accounts, Protegrity users, etc.
o Teradata upgrade and Expansion project.
▪ The Enterprise Data Warehouse hosted on the Teradata server was expanded and
upgraded to create a disaster-recovery server to aid current production during every
year’s peak season.
▪ Supported the production-support team with development activity, fixing and
enhancing existing reports’ metrics and attributes pointed to Teradata SLVs.
o Microstrategy and Tableau Cloud Migration and Upgrade
▪ Business Intelligence services using MicroStrategy and Tableau were migrated to
the cloud and upgraded to newer versions.
▪ Supported the production-support team with development activity, fixing and
enhancing existing reports’ metrics and attributes.
o Kohl’s Digital Big Data foundation and cloud transformation from the EDW and CIA systems
▪ Data in traditional warehouse systems like Teradata, Netezza and Oracle is brought
into the Google Cloud Hadoop cluster to build the future integrated Kohl’s Data
Platform.
o Kohl’s Enterprise 360° view of customer data in Kohl’s Cloud Data Platform
▪ Building and maintaining customer datasets to assist Kohl’s consumer marketing
campaigns using the foundational data built on Google Cloud Platform.
Responsibilities
● Gather client business requirements, understand the functional specification document for application
development, maintenance or enhancements, and prepare the test plan document for Enterprise
Data Warehouse testing, Big Data cloud testing and business report testing.
● Coordination with global teams on project deliverables
● Test strategy, planning and estimation for Enterprise Data Warehouse projects, Hadoop projects,
Google Cloud and BI migration (Tableau & MicroStrategy) projects.
● Perform unit, integration, system and regression testing, UAT and deployment support for
implementations of various business requirements in the Big Data cloud and Enterprise Data Warehouse
environments.
● Identify, manage, report and escalate program level testing risks and issues
● Support for Bug fixes and issues reported
● Interaction with business users for requirement clarification and resolution of problems
reported
● Provided root cause analysis and recovery method from major/critical system issues.
ASSIGNMENT 3:
Customer : CVS Caremark USA
Industry : Retail
Project Name : PBM-DWH
Project Type : Quality Assurance(Testing)
Skills : Informatica Powercenter v9.1.1
Environment : Unix/Windows
Role : Test manager at Onsite
Kohl's Enterprise Intelligence Management team comprises data-level activities performed on the Enterprise Data
Warehouse and Big Data at Kohl’s, plus the Business Intelligence reports built from the data present in EDW and Big Data
through MicroStrategy, Tableau and Spotfire.
There are various tracks in Enterprise Intelligence Management, namely EDW (Enterprise Data
Warehouse), Big Data, MDM (Master Data Management) and EDW-Companion (teams that work with upstream
systems on data flow into EDW). All data-level flows into EDW are handled by the EIM team, and data flows into Big
Data are performed by the Analytics team, with the QCoE team partnering to validate the data-specific requirements.
The ETL tools/software primarily used are Informatica PowerCenter (EDW), Azkaban (Hive) and Airflow (Big
Query); the databases used are Teradata, Oracle, Netezza, SQL Server and DB2; the reporting tools are MicroStrategy,
Tableau and Spotfire. For Big Data, the widely used tools/software are Hive, Sqoop, Pig, Scala and Kafka. Big Data
projects are now ingesting data and migrating to Google Cloud Platform and BigQuery.
CVS Caremark is the largest pharmacy health care provider in the United States. CVS Caremark provides pharmacy
services through its more than 7,000 CVS/pharmacy and Longs Drugs stores; its pharmacy benefit management, mail
order and specialty pharmacy division, Caremark Pharmacy Services; its retail-based health clinic subsidiary,
MinuteClinic; and its online pharmacy, CVS.com. CVS Caremark Pharmacy Services, one of the nation's leading
pharmacy benefit management (PBM) companies, provides comprehensive prescription benefit management
services to over 2,000 health plans, including corporations, managed care organizations, insurance companies,
unions and government entities. Claims from various adjudication engines are received and loaded into the
database and used for reporting purposes.
● The scope of work comprised quality assurance on the data warehousing projects used
by the client’s business team to use and maintain their customers’ data.
● Responsibilities:
o Preparing test strategies and approaches for projects with respect to test data, test case creation and
test scripts for validation.
o Logging defects in the code and involvement in triage sessions with the business team on the same.
o Prepared QA status reports for the projects involved and reported them to the clients in a timely manner.
● Technologies Used :
o Informatica Powercenter v9.1.1
o Teradata SQL assistant
o Oracle Database.
o UNIX
ASSIGNMENT 4:
Customer : TJX Stores USA
Industry : Retail
Project Name : TJX_ENTERPRISE_QA
Project Type : Quality Assurance(Testing)
Skills : DataStage, Netezza
Environment : Unix/Windows
Role : Test manager at offshore.
Responsible for:
As a consultant for the data warehouse BI/OLAP application, my contributions are highlighted below:
• Conducted brainstorming sessions with the developers, business analyst, client and QA
manager to gather requirements and carry out the estimation.
• Developed test plan and conducted reviews with the QA Director.
• Developed test scripts covering the Business Requirements and conducted review of test cases with
Business and the Development team to identify the missing test scenarios
• Represented the ETL team in discussions, providing the required clarifications and highlighting
risks and status reports to the key stakeholders
• Work allocation process, effective tracking, controlling the test activities and monitoring the progress
of the work.
• Extensively involved in ETL testing using DataStage, verifying transformations on
columns; involved in data validation and record-count validation in mappings.
• Wrote SQL queries using TOAD/SQL Developer to validate the data mapping
• Executed test scripts to check the data flow from source tables to staging tables and from staging to data
warehouse tables, using aggregation functions such as AVG, MAX, MIN and SUM on data retrieved
from various tables in the DW
• Involved in logging defects and tracking them to closure by validating each defect once fixed, with root-cause
analysis via HP Quality Center.
• Sent daily status reports to the stakeholders showcasing the summary, status and issues;
involved in test data preparation
• Involved in Functional, System Testing and Regression Testing.
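The aggregate-based checks above amount to computing the same summary statistics (COUNT, SUM, MIN, MAX, AVG) on the staging table and the warehouse table and comparing them. A small sketch with hypothetical tables:

```python
import sqlite3

# Aggregate reconciliation: after a clean load, the same summary profile
# computed on staging and on the warehouse table should match exactly.
# Table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
for tbl in ("stg_orders", "dw_orders"):
    cur.execute(f"CREATE TABLE {tbl} (order_id INTEGER, qty INTEGER)")
    cur.executemany(f"INSERT INTO {tbl} VALUES (?, ?)",
                    [(1, 5), (2, 3), (3, 9)])

def summary(table):
    # COUNT/SUM/MIN/MAX/AVG profile of the quantity column
    return cur.execute(
        f"SELECT COUNT(*), SUM(qty), MIN(qty), MAX(qty), AVG(qty) FROM {table}"
    ).fetchone()

print(summary("stg_orders") == summary("dw_orders"))  # True
```

A mismatch in any one statistic localizes the defect: a COUNT difference points to dropped or duplicated rows, while a SUM or MAX difference with equal counts points to a transformation error.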
Brief description of the project: The purpose of TJX_ENTERPRISE_QA in creating the Enterprise Data
Warehouse is to create a single source of truth that the company can use to support its operations and
business planning. The project's goal is to create a robust ETL framework that maintains consistency,
flexibility, accuracy and high performance across the entire ETL process while maximizing reusability and
ensuring repeatable processing. Achieving this goal will allow us to provide TJX with the data it needs to
support strong company growth over the next several years.
TJX_ENTERPRISE_QA is part of a data center migration. The purpose of the TJX_ENTERPRISE_QA
project is to consolidate core member information to support the CRM initiatives and lay the foundation
for future key enterprise initiatives. It is an IT modernization project that builds foundational
capabilities that will be used for multiple purposes and shared by more than one division.
ASSIGNMENT 5:
Customer : Wal-Mart USA
Industry : Retail
Project Name : Titan-US Groceries
Project Type : Quality Assurance(Testing)
Skills : Informatica, Oracle BI (OBIEE OLAP), Informix, Oracle Database 10g, HP Quality
Center 10, PL/SQL Developer, TOAD, Retail - Warehouse Operations
Environment : Unix/Windows
Role : Test manager at offshore.
Responsible for:
As a manager for the data warehouse BI/OLAP application, my contributions are highlighted below:
• Conducted brainstorming sessions with the developers, business analyst, client and QA
manager to gather requirements and carry out the estimation.
• Developed test plan and conducted reviews with the QA Manager.
• Developed test scripts covering the Business Requirements and conducted review of test cases with
Business and the Development team to identify the missing test scenarios
• Represented the ETL team in discussions, providing the required clarifications and highlighting
risks and status reports to the key stakeholders wherever necessary.
• Work allocation process, effective tracking, controlling the test activities and monitoring the progress
of the work.
• Extensively involved in ETL testing using Informatica, verifying transformations on
columns; involved in data validation and record-count validation in mappings.
• Wrote SQL queries using TOAD/SQL Developer to validate the data mapping
• Executed test scripts to check the data flow from source tables to staging tables and from staging to data
warehouse tables, using aggregation functions such as AVG, MAX, MIN and SUM on data retrieved
from various tables in the DW
• Executed test scripts to validate the reports generated through Oracle BI Reports (OBIEE OLAP /BI
tool)
• Involved in report import, layout, graph and data checks; involved in report
business-rules and functionality checks
• Involved in logging defects and tracking them to closure by validating each defect once fixed, with root-cause
analysis via HP Quality Center.
• Sent daily status reports to the stakeholders showcasing the summary, status and issues.
• Involved in Test data Preparation
• Involved in Functional, System Testing and Regression Testing.
Brief description of the project: Wal-Mart is implementing a Multi-Tenant e-Commerce Platform (MTEP).
ASDA, the UK subsidiary, is the first customer for the MTE platform. The platform will also host a data
warehouse to cater to analytical and historical reporting of the data that resides in, or is related to, the
platform. The data warehouse will be extendable as and when new customers (tenants) are added to the
MTE platform, hence making it a multi-tenant e-commerce data warehouse. Wal-Mart intends to extend
this MTE platform to US Groceries in an implementation named “TITAN”. As part of this implementation,
both customer-facing and associate-facing applications, viz. e-Store, CSC, Fatwire, OMS and Reporting, will
be developed and localized for the USA.
ASSIGNMENT 6:
Customer : Union Bank of California
Industry : Banking
Project Name : BASEL II ITG-PMO
Project Type : Quality Assurance(Testing)
Environment : Informatica, Oracle, ePartner
Role : Test manager at onsite (Los Angeles, California, USA)
Responsible for:
• Interacting with the client and business users to understand the quality process to be adhered to, ensuring
the quality of the data that will be the source for the future BASEL data mart.
• Designing/Developing and review of the Test cases/ Test scripts for QA process.
• Ensuring the QA activity is completed on time to ensure the quality of data which is used to generate
reports.
Brief description of the project:
The UBOC data warehouse has problem reports/tickets associated with it. These may be due to new
functionality that needs to be added, incorrect data in the warehouse, or standards that the bank needs to adhere
to for certifications. The enhanced code was standardized and executed under SIT. The main focus
here was to ensure the quality of data was good in SIT before it went into production.
The project requires understanding the problem statement, analyzing the functionality and analyzing the
overall impact on the data warehouse. Once the analysis is complete, the functionality had to be achieved using
Oracle 9i to develop SQL test scripts that ensure the quality of data.
ASSIGNMENT 7:
Customer : Union Bank of California
Industry : Banking
Project Name : UBOC DDW SUPPORT
Project Type : Testing.
Environment : Informatica, Oracle, ePartner
Role : Test Manager
Responsible for:
• Interacting with the client and Business user to understand the problem statement, solution approach.
• Prepared Testing strategy, test plan and shared with the clients before proceeding to actual testing.
• Designing/Developing and review of test scripts.
• Ensuring the code is delivered on time with minimum defects.
Brief description of the project:
The UBOC data warehouse has problem reports/tickets associated with it. These may be due to new
functionality that needs to be added, incorrect data in the warehouse, or standards that the bank needs to adhere to.
ASSIGNMENT 8:
Customer : US Bank
Industry : Banking
Project Name : PowerTrack DWBI
Project Type : Testing
Environment : Informatica, Oracle
Role : Test manager.
Responsible for:
• Interacting with the onsite team to understand the problem statement, solution approach.
• Designing/Developing Testing process.
• Developing HLD and LLD for Testing process.
• Interacting and guiding the offshore team to develop the ETL specifications, Test Plans.
• Resolving all outstanding issues.
• Ensuring the designed documents/code is delivered on time with minimum defects.
Brief description of the project:
U.S. Bank’s vision to create and distribute innovative payment solutions to help companies efficiently
manage purchasing activity resulted in the creation of its Corporate Payment Systems (CPS) division in 1989. By
creating the first bank-issued corporate card for travel and entertainment (T&E) and the world’s first purchasing
card, the organization grew to process $16 billion in payments annually by 1996.
In 1996, in response to customers who were trying to apply purchasing cards to freight payment and product
payment – a “black hole” for most companies – U.S. Bank started development of a solution (PowerTrack) that
would automate the entire business process associated with both a shipper’s “procure to pay” process and its
opposite, the carrier’s “order to cash” process.
The project requires understanding the problem statement, analyzing the functionality and analyzing the
overall impact on the data warehouse. Once the analysis is complete, the functionality had to be achieved using
Informatica, by enhancing the existing components or developing new ones.
ASSIGNMENT 9:
Customer : US Bank
Industry : Banking
Project Name : PowerTrack DWBI
Project Type : Testing.
Environment : Informatica, Oracle
Role : Test Lead.
Responsible for:
• Interacting with the client to understand the problem statement, solution approach.
• Development of Data analysis report.
• Analysis on data cleansing and data quality check requirements.
• Analysis on the referential integrity issues.
• Development of data dictionary.
• Communicating with the offshore team to explain the problem statement, solution approach and
resolving their issues during design phase.
Brief description of the project:
U.S. Bank’s vision to create and distribute innovative payment solutions to help companies efficiently
manage purchasing activity resulted in the creation of its Corporate Payment Systems (CPS) division in 1989. By
creating the first bank-issued corporate card for travel and entertainment (T&E) and the world’s first purchasing
card, the organization grew to process $16 billion in payments annually by 1996.
In 1996, in response to customers who were trying to apply purchasing cards to freight payment and product
payment – a “black hole” for most companies – U.S. Bank started development of a solution (PowerTrack) that
would automate the entire business process associated with both a shipper’s “procure to pay” process and its
opposite, the carrier’s “order to cash” process.
The project requires understanding the problem statement, analyzing the functionality and analyzing the
overall impact on the data warehouse. Once the analysis is complete, the functionality had to be achieved using
Informatica, by enhancing the existing components or developing new ones.
ASSIGNMENT 10:
Customer : Union Bank of California
Industry : Banking
Project Name : UBOC PR AND CODE OPTIMIZATION
Project Type : Rework/ Maintenance
Environment : Informatica, Oracle, ePartner
Role : Test Lead.
Responsible for:
• Interacting with the onsite team to understand the problem statement, solution approach.
• Reviewing the developed code and the result for correctness.
• Resolving all outstanding issues.
• Ensuring the code is delivered on time with minimum defects.
ASSIGNMENT 11:
Customer : SWISS Re, Switzerland
Industry : Re-Insurance
Project Name : GBS-IDS
Project Type : Testing.
Environment : Informatica 7.1, DB2 UDB, UNIX
Role : Test Lead
Responsible for:
• Data validation by writing queries using DB2 UDB Development center
• Scheduling jobs in Tivoli Work Scheduler (TWS)
• Code reviews, testing and test reviews.
• Impact analysis on the new changes
• Adherence to SLA for tickets
• Identifying process improvements, implementing quality standards and preparing audit standards.
• Played Quality Controller role for GBS IDS project.
• Drive the metrics analysis & root cause closure action (RCA & CAPA).
Brief description of the project:
This project involves the maintenance of several folders dealing with the various lines of business of Swiss Re.
Users/customers raise requests in tools like Maximo and ITSM, which have to be resolved within specified
business hours. Service Level Agreements (SLAs) have to be met, failing which heavy penalties apply. SL1 tickets
have to be resolved in 2 business hours, SL2 in 10 hours, SL3 in a day, and so on. For a minor fix the tickets
are raised in ITSM; for a Job-Jar, tickets are raised in Maximo, for which estimates have to be made and the final
output delivered to the client within the specified period. The job ends by raising a request to migrate
the entities from the development environment to the production environment.
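The tiered SLA targets described above (SL1 in 2 business hours, SL2 in 10, SL3 in a day) reduce to a deadline computed from the ticket's raise time and severity. A minimal sketch; the tier table and helper names are illustrative, not the actual Maximo/ITSM configuration, and SL3 is taken here as one 8-hour business day:

```python
from datetime import datetime, timedelta

# Illustrative SLA tiers in business hours (SL3 assumed to be an 8-hour day).
SLA_HOURS = {"SL1": 2, "SL2": 10, "SL3": 8}

def sla_deadline(raised_at: datetime, severity: str) -> datetime:
    """Naive deadline: raise time plus the tier's hours.
    A real scheduler would also skip non-business hours and holidays."""
    return raised_at + timedelta(hours=SLA_HOURS[severity])

def is_breached(raised_at: datetime, severity: str,
                resolved_at: datetime) -> bool:
    return resolved_at > sla_deadline(raised_at, severity)

opened = datetime(2024, 1, 8, 9, 0)   # ticket raised 09:00
print(is_breached(opened, "SL1", datetime(2024, 1, 8, 10, 30)))  # False
print(is_breached(opened, "SL1", datetime(2024, 1, 8, 12, 0)))   # True
```

The same check, run against every open ticket, is what drives the penalty-avoidance reporting the SLA regime demands.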
(Dr.Ravikumar GK)