
Informatica Developer

This requirement is valid until 24-09-2019


Location:  Kochi
Domain:  Insurance
Qualification:  Any Degree
Experience:  3-5 yrs.
Job Description with Roles & Responsibilities: 
    The person will be primarily responsible for creating and maintaining data streams and processes within the Hadoop and EDW platforms.

    Profile and analyze various data sources.

    Design and implement the data ingestion strategy.

    Design, develop, code, test, and debug complex new code packages, or make significant enhancements to existing code packages.

    Interact with business users to understand and document the requirements

    Required Skills

    Demonstrated experience with the Hadoop ecosystem (Hive, Pig, MapReduce, Spark, etc.)

    Demonstrated experience with ETL tools like Informatica

    Experience with scheduling tools such as Tivoli or AutoSys

    Development experience in one or more of the following: Java, Python, Sqoop, and Scala

    Demonstrated experience with databases such as Teradata, DB2, and Oracle

    Good understanding of, and experience working with, Hadoop and big data technologies

    Good knowledge of database structures, theories, principles, and practices

    Good knowledge of UNIX principles and practices

    Good knowledge of UNIX scripting theories, principles, and best practices

    Loading from disparate data sets

    Designing, building, installing, configuring, and supporting Hadoop

    Translate complex functional and technical requirements into detailed designs

    Write complex scripts in UNIX

    Perform analysis of vast data stores and uncover insights

    Maintain security and data privacy

    Create scalable and high-performance web services for data tracking and high-speed querying

    Managing and deploying HBase

    Being part of a POC effort to help build new Hadoop clusters

    Test prototypes and oversee handover to operational teams

    Propose best practices and standards

    INFORMATICA

    Good experience with Informatica theories, principles, and best practices

    2 to 4 years of relevant experience in data integration projects; 2 years of experience in Informatica consulting

    Application knowledge of AWS, Azure, Force.com, or Google is a plus

    Informatica PowerCenter, PowerExchange connectors, IDQ, B2B, Salesforce

    Expertise in the features of various Informatica editions

    Experience in UNIX and other shell scripting

    Experience in Hadoop and its databases

Exp : 5 to 10 years

Location : Bangalore

Duration : 6 Months

Job Type : Contract

Notice : Immediate joiners preferable

- The ETL developer is responsible for creating and managing ETL processes that load data from various sources into the enterprise data warehouse.

- The candidate would participate in design, development, testing, execution and quality assurance of both automated and manual data integration and exchange processes.

Position Responsibilities :

- Design and develop enterprise data warehousing and ETL solutions.

- Translate business requirements into ETL designs and mapping specifications.

- Create ETL mappings using Informatica PowerCenter.

- Provide technical and business knowledge support to the team.

Minimum Requirements :

- 5+ years of ETL and Informatica PowerCenter experience; should have worked in Informatica 9.x and above.

- 5+ years of SQL experience; should have worked with Oracle databases in recent projects.

- 5+ years of UNIX shell scripting experience.

- 2+ years of experience working with ERPs (SAP, E1).

Nice to Have :

- Manufacturing or Life Science domain exposure; Finance/GL data.

- Exposure to ERPs (SAP, E1, Oracle), at least interfacing with them.

- Exposure to other data integration and BI tools such as OBIEE.

- Exposure to other databases such as SQL Server.

- Exposure to Cognos Reporting.

- Exposure to Oracle Apex

- Exposure to Hyperion/FDM.

Analytical/Soft Skills :

- Should have great communication skills.

- Should have great analytical skills.

