12/06 Vishal Sawant
HR at HDFC Bank


HDFC Bank - Project Manager - Pentaho/Hadoop (3-7 yrs)

Location : Mumbai | Job Code : 454414

Job Purpose : 

- Serve as an ETL Pentaho developer on the Hadoop system, responsible for creating end-to-end Pentaho solutions for the Bank's cross-sell programs.

- The data used for these programs includes very large datasets from varied systems and, occasionally, from third parties. Redesign, restructure and migrate all Retail Assets campaign business processes and jobs from the existing system to Pentaho running on Hadoop.

- Manage, execute and optimize all migrated processes during and between campaigns. Post migration, carry out in-depth analytics and research on the same data for customer-level profiling, finding and evaluating new insights and opportunities for the Retail Assets campaigns, and create new use cases on the Big Data platform with a direct impact on the Bank's business.

Job Responsibilities (JR) :

JR 1 :

- Understand the process of all the Retail Assets campaigns and migrate all the redesigned processes using Pentaho on the Hadoop system

- Manage, execute and optimize all the migrated processes during and between campaigns

- Post migration, carry out in-depth analytics and research on the migrated data for new campaign insights such as customer-level symmetry, customer-level profiling, etc.

- Create new use cases on the Big Data platform having a direct or indirect impact on the Bank's business

JR 2 :


- Work on ad-hoc projects in Pentaho on the Big Data platform

- Execute and optimize technical processes on an ongoing basis to reduce execution time and Hadoop infrastructure utilization.

- Automate activities to ensure minimal manual intervention.

- Strong technical knowledge of ETL and RDBMS

- Experience in writing database functions and procedures, and in troubleshooting them.

- Experience in visualization, Linux and database technologies.

- Data extraction, manipulation, processing and analysis.

Educational Qualifications :

- Graduation : Engineering/BCA/BCS

- Post-Graduation : Engineering/MCA/MCS/MBA

Key Skills :

- Strong technical knowledge of ETL and RDBMS

- Experience in troubleshooting, maintaining and supporting the setup

- Experience with virtualization, Linux and other database technologies

- Some exposure to Hadoop

- Working experience in any RDBMS (PostgreSQL, Oracle, MSSQL, MySQL)

- Experience in writing database functions/procedures

Experience Required :

- 2-4 years of experience in a development role in ETL with Pentaho, Talend, Informatica Big Data Management or Informatica PowerCenter is a must

- Exposure to, knowledge of, or experience with Big Data and Hadoop is desired

Major Stakeholders (intra-team and cross-functional stakeholders to be interacted with for discharging duties) :

- Risk Analytics Unit
