Roles and Responsibilities:

•Overall 7+ years of experience, with a minimum of 5+ years in Informatica BDM/BDE-related development activities

•Perform analysis, design, and development of ETL processes to support project requirements.

•Develop Informatica mappings, SQL and stored procedures, data maps for PowerExchange, and Unix shell scripts.

•Develop Sqoop scripts to transfer data between RDBMS and Hadoop.
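
As an illustration of this task, a Sqoop import is usually a CLI invocation; a small Python wrapper like the sketch below can assemble it consistently. The JDBC URL, table, and HDFS directory are hypothetical placeholders, not values from this posting.

```python
# Sketch of a Sqoop import command builder; the connection string,
# table name, and target directory are hypothetical placeholders.
def build_sqoop_import(jdbc_url, table, target_dir, num_mappers=4):
    """Assemble the argument list for a `sqoop import` invocation."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,        # e.g. jdbc:oracle:thin:@//host:1521/ORCL
        "--table", table,
        "--target-dir", target_dir,   # HDFS landing directory
        "--num-mappers", str(num_mappers),
        "--as-parquetfile",           # Parquet plays well with Hive and Spark
    ]

cmd = build_sqoop_import(
    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "SALES.ORDERS", "/data/raw/orders"
)
print(" ".join(cmd))
```

The list form can be handed straight to `subprocess.run` on an edge node where the `sqoop` client is installed.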

•Develop Hive tables and queries.
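
A minimal sketch of the Hive side: generating the DDL for an external table over a landing directory (for example, one populated by Sqoop). The table, columns, and location below are hypothetical placeholders.

```python
# Sketch: generate HiveQL DDL for an external table over an HDFS directory.
# Table name, columns, and location are hypothetical placeholders.
def external_table_ddl(table, columns, location, fmt="PARQUET"):
    """Build a CREATE EXTERNAL TABLE statement from (name, type) pairs."""
    cols = ",\n  ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table} (\n"
        f"  {cols}\n"
        f") STORED AS {fmt}\n"
        f"LOCATION '{location}';"
    )

ddl = external_table_ddl(
    "raw.orders",
    [("order_id", "BIGINT"), ("customer_id", "BIGINT"),
     ("amount", "DECIMAL(12,2)")],
    "/data/raw/orders",
)
print(ddl)
```

The generated statement can be run through `beeline -e` or any Hive client.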

•Develop Spark jobs (in Scala, Python, or Java) to stream, publish, or consume data from Hadoop.
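
A minimal PySpark batch-job skeleton in that spirit; the paths and the pipe-delimited record layout are hypothetical. The pure parsing helper is kept separate from `main()` so it can be unit-tested without a cluster.

```python
# Sketch of a Spark job structure: a pure, unit-testable transform plus a
# main() intended to run under spark-submit. Paths and the record layout
# are hypothetical placeholders.
def parse_order(line):
    """Parse one pipe-delimited record into (order_id, amount)."""
    order_id, amount = line.split("|")
    return int(order_id), float(amount)

def main():
    # Imported here so the module can be tested without a Spark install.
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.appName("orders-etl").getOrCreate()
    rdd = spark.sparkContext.textFile("/data/raw/orders_delim")
    # Sum amounts per order_id and write the result back to HDFS.
    totals = rdd.map(parse_order).reduceByKey(lambda a, b: a + b)
    totals.saveAsTextFile("/data/curated/order_totals")
    spark.stop()

if __name__ == "__main__":
    main()
```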

•Perform unit testing and QA, and work with business partners to resolve any issues discovered during UAT.

•Technical skills in order of priority: Informatica BDM, Hadoop programming (Sqoop, Hive, Spark), strong SQL, Mainframe (JCL/ESP), Teradata experience

•Debug Spark or Blaze program errors.

•Experience in Oracle, MS SQL Server, and/or DB2

•Experience in writing scripts using Shell and Python

•Experience in Scala programming, since Spark jobs convert BDM mappings into Scala programs, and errors can be debugged in the generated code

Technical experience:

Informatica PowerCenter, Informatica BDM, AWS, Hadoop, Hive, SQL Server, Oracle, shell scripting, Python, Scala programming
