Job Description
As a Data Engineer, you will design, build, and maintain the company’s data pipelines and architecture, ensuring data is available and reliable for analytics and operational needs. You will work closely with data scientists and business analysts to optimize data workflows and support data-driven decision-making across the organization.
Key Responsibilities
Data Pipeline & Architecture
- Design and implement scalable and efficient ETL pipelines for processing large datasets.
- Optimize data storage and retrieval systems to handle high-volume financial transactions and user data.
- Develop and maintain data models that support analytics and reporting needs.
Collaboration & Support
- Collaborate with engineering teams to ensure data solutions align with product and operational requirements.
- Provide support for data-related queries and troubleshoot data issues in production environments.
- Collaborate with product teams to ensure app updates and new features are reflected in the data and available for monitoring and reporting.
Data Quality & Security
- Implement data quality checks and monitoring systems to ensure the accuracy and integrity of financial data.
- Enforce data security protocols to protect sensitive information and ensure compliance with regulations.
- Continuously review and improve data management practices and infrastructure.
Innovation & Optimization
- Stay current with industry trends and technologies to enhance data processing and analytics capabilities.
- Identify and implement improvements in data engineering practices to boost performance and efficiency.
- Contribute to the development and maintenance of data documentation and best practices.
Qualifications
- 2+ years of experience in data engineering and ETL development.
- Advanced proficiency in SQL and Python.
- Experience with data warehousing solutions, big data technologies, and ETL tools.
- Strong analytical skills with a background in developing data models and integrating analytics.
- Solid foundation in relational and non-relational databases, as well as in writing efficient algorithms for large datasets.
- Knowledge of data security practices and regulatory requirements.