Data Engineer I | Noida - Hybrid
Job Overview
- Date Posted: March 4, 2025
- Company Location: Noida
- Expiration Date: April 5, 2025
- Experience: 2-5 Years
- Gender: Both
- Company Name: C2FO
Job Description
Do you want to help transform the global economy? Join the movement disrupting the financial world and changing how businesses gain access to the working capital they need to grow. As the largest online platform for working capital, we serve over one million businesses in 160 countries, representing over $10.5 trillion in annual sales. Headquartered in Kansas City, C2FO has more than 600 employees worldwide, with operations throughout Europe, India, Asia Pacific, and Australia.
Here at C2FO, we value the quality of our technical solutions and are passionate about building the right thing, the right way to best solve the problem at hand. But beyond that, we also value our employees’ work-life balance and promote a continuous learning culture. We host bi-annual hackathons, have multiple book clubs focused on constant growth, and embrace a remote-first working environment. If you want to work where your voice will be heard and can make a real impact, C2FO is the place for you.
About The Position
Data is the lifeblood of any technology company. At C2FO, we know the best way to ensure our success is to provide our decision-makers with the most accurate, most relevant data possible.
Our Data Engineers use cutting-edge, open-source technologies to collect, process, and store the company’s data. They also work closely with talented scientists, engineers, and data experts to solve complicated long-term financial and working capital problems for clients in different industries, including Retail, Technology, and Health Care.
As a Data Engineer, you will be responsible for building and maintaining a scalable, reliable, and secure data infrastructure. You will work with stakeholders across the organization to design, develop, and maintain ETL pipelines, data models, and data lake architectures. You will also ensure the quality and accuracy of the data, and collaborate with data analysts, data scientists, and other teams to deliver data-driven insights and solutions.
Essential Duties
Work closely with Software Engineers, Data Scientists, Data Operations, and Business Analysts to meet the company’s data storage, access, and analysis needs.
Develop and implement processes for data cleaning, ensuring data quality and integrity.
Collaborate with the Development and Operations teams to deploy and maintain clustered computing across multi-cloud environments.
Monitor, maintain, and enhance existing data pipelines to ensure reliability and data integrity.
Maintain documentation of data infrastructure, processes, and data dictionaries.
Work independently on logically complex tasks with some external dependencies, tracking the work and reliably pushing it through the process.
Take ownership of the codebase you work in and contribute to any required improvements.
Solicit feedback from peers, teammates, and managers to identify improvement areas and take steps to learn and grow.
Follow defined engineering processes and share new tools or processes to help the team be more collaborative, effective, or efficient.
Stay updated with industry trends and emerging technologies to improve data infrastructure and processes continuously.
Basic Qualifications
Bachelor’s degree in computer science or a related field
2-3 years of relevant engineering experience
Some relevant experience in the data engineering space
Concerned with the success of their team
Respectful towards teammates regardless of their abilities
Able to work in a highly collaborative software development environment
Curious to understand the problem space of the work and the 'why' behind it
Passionate about testing, code quality, and continuous integration
Persistent in the face of roadblocks, resolving them efficiently and pulling in others as necessary
Comfortable with source control, especially git
Experience with cloud platforms – AWS or GCP
Necessary experience with:
SQL and query optimization.
Scala/Python.
Hands-on experience with Apache Spark or similar big data tools.
Preferred Qualifications
Passionate about solving problems for a fast-paced FinTech company
We would be delighted to hire someone with knowledge of or some experience with:
Databases like Redshift, Postgres, or similar tools
Understanding of data warehousing concepts and methodologies
Orchestration tools such as Airflow
Familiarity with DevOps tools such as Kubernetes (K8s), GitHub Actions, or Docker
Familiarity with Agile methodologies such as Scrum or Kanban
Benefits
At C2FO, we care for our customers and people – the vital human capital that helps our customers thrive. That’s why we offer a comprehensive benefits package, flexible work options for work/life balance, volunteer time off, and more. Learn more about our benefits here: https://www.c2fo.com/amer/us/en-us/about-us/careers
Commitment To Diversity And Inclusion
As an Equal Opportunity Employer, we value diversity and equality and empower our team members to bring their authentic selves to work daily. We recognize the power of inclusion, emphasizing that each team member was chosen for their unique ability to contribute to the overall success of our mission. We aim to create a workplace that reflects the communities we serve and our global, multicultural clients.
We do not discriminate based on race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by appropriate law. All employment decisions are based on qualifications, merit, and business needs.