Data Engineer
Job Description
As a Data Engineer, you will be responsible for developing and maintaining robust data pipelines and systems that support the collection, storage, and processing of large datasets. You will work closely with data scientists, analysts, and other engineering teams to ensure that data flows seamlessly and efficiently across the company’s infrastructure. Your role will be pivotal in ensuring that the data infrastructure is scalable, reliable, and able to support the needs of various business applications and stakeholders.
You will apply data engineering best practices, including data transformation, ETL automation, and performance optimization. The ideal candidate has experience with big data technologies, cloud platforms, and both relational and non-relational databases, along with a passion for data and analytics.
Responsibilities
- Design, develop, and maintain data pipelines, ETL processes, and data integrations to support analytics and data warehousing (a minimal ETL sketch follows this list).
- Build scalable, reliable infrastructure for extracting, transforming, and storing data from multiple internal and external sources.
- Work with stakeholders to define data requirements and develop solutions that meet business needs.
- Optimize data workflows and pipelines to improve performance, cost-efficiency, and scalability.
- Monitor data quality and integrity, ensuring clean, consistent, and timely data availability.
- Collaborate with data analysts and data scientists to ensure data solutions align with analytical requirements and support predictive modeling and reporting.
- Implement data security and compliance measures, ensuring sensitive information is protected.
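As a rough illustration of the pipeline and ETL work described above, the sketch below shows a minimal extract-transform-load step in Python using only the standard library. The source file, column names, and target table are hypothetical, and SQLite stands in for a real warehouse; a production pipeline would typically run under an orchestration tool and load into a platform such as Redshift, Snowflake, or BigQuery.

```python
import csv
import sqlite3

# Hypothetical input file and target database, for illustration only.
SOURCE_CSV = "raw_events.csv"
TARGET_DB = "warehouse.db"

def extract(path):
    """Read raw rows from a CSV export of a source system."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Normalize fields and drop rows that fail basic quality checks."""
    seen = set()
    for row in rows:
        event_id = (row.get("event_id") or "").strip()
        if not event_id or event_id in seen:
            continue  # drop blanks and duplicates
        try:
            amount = float(row.get("amount") or 0)
        except ValueError:
            continue  # drop rows with malformed amounts
        seen.add(event_id)
        yield (event_id, (row.get("user_id") or "").strip(), amount)

def load(records, db_path):
    """Write cleaned records into a local SQLite table (warehouse stand-in)."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS events ("
        "event_id TEXT PRIMARY KEY, user_id TEXT, amount REAL)"
    )
    con.executemany(
        "INSERT OR REPLACE INTO events (event_id, user_id, amount) VALUES (?, ?, ?)",
        records,
    )
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)
```

The same extract/transform/load structure carries over to larger tooling; the quality checks in transform are the kind of monitoring referred to above, just applied inline.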
Preferred Qualifications
- Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
- At least 3 years of experience in data engineering or related fields.
- Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, Microsoft SQL Server); an illustrative query appears after this list.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud Platform.
- Experience with data warehousing solutions such as Redshift, Snowflake, or BigQuery.
- Strong programming skills in languages like Python or Scala.
- Understanding of data modeling, ETL processes, and data architecture principles.
- Excellent problem-solving and debugging skills with a detail-oriented mindset.
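To give a concrete sense of the SQL and Python proficiency listed above, the hedged sketch below runs an aggregation with a window function through Python's standard sqlite3 module. The orders table, its columns, and the sample rows are hypothetical and exist only for illustration.

```python
import sqlite3

# In-memory database with a hypothetical orders table, for illustration only.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE orders ("
    "order_id INTEGER, customer_id INTEGER, amount REAL, ordered_at TEXT)"
)
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?, ?)",
    [
        (1, 101, 25.00, "2024-01-03"),
        (2, 101, 40.00, "2024-01-10"),
        (3, 102, 15.50, "2024-01-12"),
    ],
)

# Per-customer running total ordered by date (window function;
# requires SQLite 3.25+, which ships with modern Python builds).
query = """
SELECT
    customer_id,
    ordered_at,
    amount,
    SUM(amount) OVER (PARTITION BY customer_id ORDER BY ordered_at) AS running_total
FROM orders
ORDER BY customer_id, ordered_at;
"""

for row in con.execute(query):
    print(row)
con.close()
```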