Data Engineer (GCP)
Remote
My client is looking for a Data Engineer to join their rapidly growing BI team. You will partner with analysts, engineers, and business stakeholders to deliver well-defined, transformed, tested, and code-reviewed data sets to the organisation. This position is responsible for analysing data and collaborating with business analysts and developers to translate data requirements into logical data models, developing ETL/data integration processes as needed, and supporting the continued evolution of data warehouses, data stores, and data marts.
This role is open to US-based remote candidates.
Key Responsibilities
- Apply engineering best practices to design, build, and maintain critical data pipelines to ensure operational reliability and modularity.
- Design and develop source-to-target mappings to support data transformation processes.
- Design highly scalable ETL processes and data transformations.
- Build ETL/ELT pipelines with tools such as Airflow, dbt, Google Dataflow, and Fivetran.
- Design databases, data models, and warehouse architecture.
- Program in SQL, shell scripting, and Python.
- Work with SQL databases and cloud warehouses such as PostgreSQL, MySQL, Snowflake, and BigQuery.
- Use modern BI/analytics tools such as Looker, Tableau, and Redash.
- Design and develop SQL functions, stored procedures, and views; help develop the reporting data mart; and provide support to the reporting team.
- Take responsibility for the integrity and quality of data and information stored in, and flowing between, business applications, the enterprise data warehouse, data marts, and business intelligence applications.
- Advise on new and improved data engineering strategies that strengthen data governance across the business, promote informed decision-making, and ultimately improve overall business performance.
Qualifications
- Bachelor’s or Master’s degree in Computer Science or an equivalent field
- Minimum 2 years of experience with data warehouse methodologies (star/snowflake schemas, OLAP, dimensional modelling, etc.)
- Minimum 2 years of experience writing complex SQL functions and stored procedures and applying query optimisation techniques
- Experience in data analysis, data modelling, data cleansing, and reporting
- Experience with extract, transform, and load (ETL) processes using Talend or similar platforms
- Experience with programming languages (Python preferred)
- Experience with cloud data platforms and services desired (GCP preferred)
- Scrum/Agile project experience, including use of Jira and Confluence, is a plus
Knowledge, Skills, and Abilities
- Proven understanding of the full software development lifecycle including testing, continuous integration and deployment
- Excellent problem solving, critical thinking, and communication skills
- Self-motivated, proactive and solution-oriented
- Commitment to the successful achievement of team and organizational goals
- Excellent interpersonal and collaborative skills
- Nice to have: experience handling semi-structured and unstructured data and the related tools and technologies
The company offers a competitive benefits package, including healthcare, dental, an unlimited PTO policy, 401(k) matching, life insurance, and discounts on products.
This organisation is growing rapidly and looking to hire as soon as possible, so if you believe you are a good fit for this role, please apply immediately.