Responsibilities:
• Design and maintain efficient data architectures that meet business objectives and ensure reliable data access.
• Collaborate closely with cross-functional teams to identify data needs and deliver scalable data solutions.
• Develop ETL pipelines to extract, transform, and load data while upholding high standards of data quality and integrity.
• Optimize data storage and retrieval for better performance and efficiency.
• Monitor data pipeline performance and promptly resolve issues as they arise.
• Produce thorough documentation for data workflows and system architecture.
Requirements:
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 3+ years of experience in data engineering or related roles.
• Proficiency in programming languages such as Python, Java, or Scala.
• Solid experience with relational (SQL) databases and NoSQL technologies such as Cassandra or MongoDB.
• Familiarity with data warehousing solutions and big data technologies (e.g., Hadoop, Spark).
• Strong analytical skills and attention to detail.