BigQuery Engineer


Job Title: Data Engineer

Location: Hyderabad, India

Reports To: India Leadership Team

Job Overview:

We are seeking a Data Engineer with 3-6 years of experience to design and implement our data warehouse.

The ideal candidate will have a deep understanding of modern data architectures, including Data Warehousing, Data Lakehouse, and Medallion architecture.

With 4 to 6 years of experience in the field, you will be responsible for data engineering activities, including data warehousing and reporting.

You must possess a strong background in SQL, GCP/Google BigQuery, and other data platforms, and have hands-on experience building ETL pipelines with SQL stored procedures, Python, Spark, or Google Dataflow.

Additionally, you should be proficient with analytics tools such as Google Looker and have a proven track record in data governance, AI/ML integration, and data modeling.

 

Key Responsibilities:

Data Warehousing: Implement data warehousing, Data Lakehouse, and Medallion architectures to support our business needs.

Data Modeling: Develop and maintain data models that align with business needs, ensuring consistency across the organization. Collaborate closely with stakeholders to understand their data requirements and translate them into effective models.

Data Migration & ETL: Oversee the design, implementation, and monitoring of data migration strategies. Develop and maintain ETL pipelines using SQL stored procedures, Python, Spark, or Google Dataflow to ensure data is accurately and efficiently extracted, transformed, and loaded into our data platforms.

ETL Scheduling & Monitoring: Implement and manage ETL scheduling using tools such as Google Cloud Composer or Dataflow. Develop monitoring mechanisms to ensure the reliability and performance of ETL processes.

Data Dictionary & Metadata Management: Develop and maintain a comprehensive data dictionary that includes metadata for all datasets. Ensure that the data dictionary is up-to-date and accessible to all stakeholders.

AI/ML Data Engineering: Collaborate with data scientists and engineers to integrate AI/ML capabilities into our data architecture. Ensure that the integration is seamless, scalable, and meets business requirements.

Data Governance: Adhere to data governance policies and procedures to ensure the quality, integrity, and security of our data. Collaborate with stakeholders across the organization to enforce these policies.

Analytics Tools: Leverage Google Looker or similar analytics tools to provide insights and reports that drive business decisions. Ensure that these tools are integrated with our data architecture.

Collaboration & Communication: Act as a key liaison between the business and technical teams, ensuring that data needs are understood and addressed. Communicate complex technical concepts to non-technical stakeholders in a clear and concise manner.
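To make the ETL responsibilities above concrete, the pipeline work can be sketched as a minimal extract-transform-load sequence. This is an illustrative sketch only: the function names, sample records, and in-memory target below are assumptions for demonstration, not part of this role's actual stack (in practice the extract and load steps would talk to BigQuery, Dataflow, or another platform).

```python
# Minimal ETL sketch (illustrative only): extract raw records,
# transform them into a clean, typed shape, and load them into a
# target store. An in-memory list stands in for the warehouse table.

def extract():
    # Hypothetical raw source rows, e.g. from an upstream API or file.
    return [
        {"id": "1", "amount": "10.50", "region": " IN "},
        {"id": "2", "amount": "3.25", "region": "us"},
    ]

def transform(rows):
    # Cast types and normalise fields so downstream reporting is consistent.
    return [
        {
            "id": int(r["id"]),
            "amount": float(r["amount"]),
            "region": r["region"].strip().upper(),
        }
        for r in rows
    ]

def load(rows, target):
    # Stand-in for a warehouse load job: append transformed rows.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The same extract/transform/load separation applies whether the steps are SQL stored procedures, a Spark job, or a Dataflow pipeline; keeping the stages distinct is what makes scheduling and monitoring each stage independently possible.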

Qualifications:

Education: Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience:

4+ years of experience in data architecture, with a strong focus on Data Warehousing, Data Lakehouse, and Medallion architectures.

Proven experience designing and implementing data migration strategies using SQL, Google BigQuery, and similar platforms.

Hands-on experience building ETL pipelines with SQL stored procedures, Python, Spark, or Google Dataflow.

Experience with analytics tools such as Google Looker and data governance practices.

Skills:

Proficiency in SQL and in Google BigQuery or similar data platforms.

Experience with ETL scheduling and monitoring using tools such as Google Dataflow, Cloud Composer, or an equivalent orchestration layer.

Experience with table partitioning and clustering (e.g., in BigQuery).

Experience archiving historical data.

Experience with real-time data transformations using streams or APIs with Pub/Sub.

Experience setting up and troubleshooting CI/CD deployments.

Good knowledge of DAGs (directed acyclic graphs) and workflow orchestration.

Strong understanding of data modeling principles and best practices.

Experience integrating AI/ML workloads into the data architecture.

Excellent communication and collaboration skills, able to work effectively with cross-functional teams.
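The DAG knowledge listed above is the core idea behind orchestration tools such as Cloud Composer (Apache Airflow): tasks declare their upstream dependencies, and the scheduler runs them in an order where every task follows the tasks it depends on. A minimal, library-free sketch of that ordering (Kahn's topological sort) is shown below; the task names are hypothetical, not tied to any real pipeline here.

```python
from collections import deque

# Each task maps to the list of tasks it depends on (its upstream tasks).
# Hypothetical pipeline: a simple linear chain for illustration.
deps = {
    "extract": [],
    "transform": ["extract"],
    "quality_check": ["transform"],
    "load": ["quality_check"],
    "report": ["load"],
}

def topo_order(deps):
    # Count unmet dependencies for each task.
    indegree = {t: len(d) for t, d in deps.items()}
    # Map each task to the tasks that wait on it (downstream tasks).
    dependents = {t: [] for t in deps}
    for task, upstream in deps.items():
        for u in upstream:
            dependents[u].append(task)
    # Start with tasks that have no dependencies.
    ready = deque(t for t, n in indegree.items() if n == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in dependents[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    # If any task was never scheduled, the graph had a cycle.
    if len(order) != len(deps):
        raise ValueError("cycle detected: not a DAG")
    return order

schedule = topo_order(deps)
```

The "acyclic" part is what makes scheduling possible at all: a cycle means no task in the loop can ever start, which is why orchestrators reject cyclic task graphs.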

Apply