Data Integration Lead/Developer (Talend & Hive)

22 March 2021

PiMARQ JD Code: 075

Role Name: Data Integration Lead/Developer (Talend & Hive)

About Company: Our client's platform delivers analytical business insights while keeping the complexity out. They develop and integrate business analytics, the key ingredient in any digital transformation, on a ready-to-use data lake platform supported by a range of managed and concierge services.

They have an amazing team with tremendous depth in data warehousing, business intelligence, and analytics, with experience ranging from 15 to 25 years. A majority of their team members have worked for large IT consulting companies such as Capgemini and DXC, and for large corporations such as GM and Unilever.


  • Data Integration Lead or Developer experienced in Talend and Hive

Core Skills & Competencies:

  • Cloud-native: Strong experience working in an Azure-based Big Data environment – ADLS (Azure Data Lake Storage), Azure SQL Database, and Azure SQL Data Warehouse
  • Working knowledge of Hive – HiveQL, design, optimisation, etc.
  • Good experience in SQL scripting and programming
  • Spark: Use the Spark framework in and with Talend – develop and configure Big Data batch jobs and troubleshoot ETL issues (Apache Spark, Databricks)
  • Perform performance tuning and administration to help build end-to-end solutions.


Candidates should have:

  • Experience designing, developing, and implementing solutions to extract, transform, and load (ETL/ELT) data into target systems (Hadoop or cloud-native).
  • Experience with Talend components (database, file, API, OS system calls, etc.), context management, and Git deployment
  • Hands-on experience connecting and processing data from application sources to the target reporting layer using Talend ETL / Big Data components
  • Number of Projects: Candidate should have worked on at least two real-world projects in the technical areas of need.
  • Lifecycle: Experience analysing business requirements and designing, building, and deploying complex jobs to load data into a data lake.
  • Deployment: Good knowledge of cloud deployments of BI solutions in the Azure cloud ecosystem.
  • Methodology: Experience working on Agile projects
  • DevOps: Working knowledge of DevOps-based service delivery – specifically GitHub integration
  • Experience on migration (data, process) projects is a plus.
  • Testing and Acceptance: Experience testing and reconciling data between systems.
  • Ability to work in a team environment, work remotely with minimum supervision, and demonstrate a strong work ethic.
  • Willingness to work with global clients (e.g. North America-based clients) across time zones in a distributed working model.

Location: Work from Home

Experience: 4-8 years

Job Type: Full Time

Apply for this position
