Data/Data Ops Engineering
Job Overview
- Date Posted: February 11, 2026
- Company Location:
- Expiration Date: March 13, 2026
- Experience: 5-10 Years
- Gender: Both
- Qualification: B.Tech (Bachelor of Technology), B.E. (Bachelor of Engineering)
- Career Level: Manager
Job Description
Experience: 5 to 8 years
Location: Remote
Contract Duration: 12 months; Working Hours: 2:30 PM – 10:30 PM IST
Skills and Responsibilities Needed:
Must-Have Experience & Skills
Technical/Product
● 4+ years of relevant Data Engineering/Ops experience
● Strong experience with ETL/ELT tooling (e.g. Dataflow, BigQuery, Cloud Composer, Cloud Functions)
● Proficiency in Python for data manipulation and ingestion logic
● Familiarity with PostgreSQL for relational data models and metadata handling
● Experience working with APIs and integrations (file systems, messaging tools, enterprise platforms)
● Strong proficiency with data integration and orchestration tools such as Airbyte or Cloud Composer
● Ability to design and document data mapping and tagging logic
● Understanding of observability best practices: logs, alerts, health metrics
● Familiarity with staging/operational/analytical data environments
Responsibilities
● Pipeline Scaling: Extend and automate ingestion connectors for diverse sources (email,
transcripts, files, conversational tools) using tools like Airbyte or Cloud Composer.
● Data Quality & Governance: Own data quality validation, schema alignment, and error
monitoring. Ensure data models support RBAC, analytics, and AI retrieval.
● Metadata Strategy: Maintain standardized metadata and end-to-end traceability from source → insight to enable better retrieval and client analytics.
● Observability: Support DevOps in implementing usage-based diagnostics, logging, and
alerting for ingestion health.
● Collaboration: Work hands-on with domain experts to align taxonomy and business logic.