Job Description
Cleans, prepares, and optimizes data for further analysis and modelling. Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to ELT principles and business goals.
Roles and Responsibilities
- Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to ELT principles and business goals.
- Solves complex data problems to deliver insights that help the business achieve its goals.
- Creates data products for engineers, analysts, and data scientists to accelerate their productivity.
- Engineers effective features for modelling in close collaboration with data scientists and business stakeholders.
- Leads the evaluation, implementation, and deployment of emerging tools and processes for analytics data engineering to improve productivity and quality.
- Partners with machine learning engineers, BI, and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
- Fosters a culture of sharing, re-use, and design for scale, stability, and operational efficiency of data and analytical solutions.
- Advises, consults, mentors, and coaches other data and analytics professionals on data standards and practices.
- Develops and delivers communication and education plans on analytic data engineering capabilities, standards, and processes.
- Learns about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics as necessary to carry out the role effectively.
Required Skills
- 5-10 years of relevant experience required.
- Experience designing and maintaining data warehouses and/or data lakes with big-data technologies such as Spark/Databricks or distributed databases such as Redshift and Snowflake, and experience housing, accessing, and transforming data in a variety of relational databases.
- Experience building data pipelines and deploying/maintaining them following modern data engineering best practices (e.g., dbt, Airflow, Spark, and the Python open-source data ecosystem).
- Knowledge of software engineering fundamentals and software development tooling (e.g., Git, CI/CD, JIRA), and familiarity with the Linux operating system and the Bash/Zsh shells.
- Experience with cloud data platforms (e.g., Azure) and developing solutions on cloud computing services and infrastructure in the data and analytics space.
- Basic familiarity with BI tools (e.g., Alteryx, Tableau, Power BI, Looker).
- Expertise in ELT and data analysis, primarily using SQL.
- Conceptual knowledge of data and analytics, such as dimensional modelling, reporting tools, data governance, and structured and unstructured data.
Education Qualifications
- Bachelor’s degree in computer science, statistics, engineering, or a related field.
Good to have
- Candidates with a Mexico work visa may also apply.