Data Engineer Lead - Vietnam
Date: Dec 26, 2025
Location: Hanoi, Hanoi, VN
Company: sistemasgl
At Globant, we are working to make the world a better place, one step at a time. We enhance business development and enterprise solutions to prepare them for a digital future. With a diverse and talented team present in more than 30 countries, we are strategic partners to leading global companies in their business process transformation.
We seek a Data Engineer Lead who shares our passion for innovation and change. This role is critical to helping our business partners evolve and adapt to consumers' personalized expectations in this new technological era.
You will get the chance to:
- As a Data Engineer Lead, you will be domain-agnostic, working across domains such as finance, pharmaceuticals, media and entertainment, manufacturing, and hospitality. You will get exposure to the following:
- Lead a Data Engineering team distributed across multiple locations.
- Get an opportunity to work on a variety of projects: build ELT/ETL jobs, write complex SQL queries, and create analytical tables to the Data Science team's requirements (a short illustrative sketch follows this list).
- Contribute to innovative accelerators and develop tools, methodologies, and frameworks that speed up the development of data science solutions.
- Build and maintain data pipelines that support data movement, deployment, and monitoring, ensuring that they meet the highest standards for quality and reliability.
- Gain hands-on experience with Google Cloud Platform, including BigQuery, GCS, Dataflow, Cloud Functions, Cloud Composer, and Cloud Scheduler.
- Design and develop multiple POCs/POVs for existing customers or prospective leads, preserving the knowledge gained through this research and innovation and using it to enhance the overall capability of the Artificial Intelligence Studio.
- Proactively interact with the client and make key technical decisions on design and architecture. Establish and maintain relationships with clients, acting as a trusted advisor and identifying opportunities for new or expanded business.
- Agree on scope, priorities, and deadlines with the project managers.
- Describe problems, provide solutions, and communicate clearly and accurately.
- Ensure the overall technical quality of the solution.
- Define metrics and set objectives across multiple complex tasks.
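For context, the sketch below shows one way an analytical table might be built in BigQuery with the google-cloud-bigquery Python client. The project, dataset, table, and column names are illustrative placeholders, not an actual client schema.

```python
# A minimal, illustrative sketch: materialize an analytical summary table in
# BigQuery for a downstream Data Science team. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-gcp-project")  # placeholder project id

query = """
CREATE OR REPLACE TABLE analytics.daily_order_summary AS
SELECT
  DATE(order_ts)    AS order_date,
  customer_id,
  COUNT(*)          AS orders,
  SUM(order_total)  AS revenue
FROM raw.orders
GROUP BY order_date, customer_id
"""

job = client.query(query)  # submit the statement as a query job
job.result()               # block until the table is (re)built
print(f"Rebuilt analytics.daily_order_summary (job {job.job_id})")
```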
What will help you succeed:
- Good communication, client-facing, and stakeholder management skills
- Experience in leading/managing a team
- A strong base in SQL along with Data Warehousing concepts is mandatory, plus strong experience with advanced data pipelines and data analysis techniques. You should be well versed in dbt (data build tool) as a data transformation tool on BigQuery.
- An expert professional with the zeal to learn and explore new methodologies. You should work collaboratively, bring innovative ideas, and be willing to research new areas, technologies, and algorithms to continuously improve the solutions and models.
CORE TECHNICAL SKILLS
- Programming: Strong programming skills in Python for data manipulation and pipeline development, and expertise in SQL for data querying and analysis.
- Data Pipeline Architecture: Designing, building, and maintaining robust, scalable, and fault-tolerant data pipelines to handle near real-time and batch data processing.
- Cloud Platforms: GCP services like Cloud Storage, BigQuery, Pub/Sub, Dataflow, Cloud Functions, Cloud Scheduler, and Cloud Composer.
- Data Processing Frameworks: Proficiency in Apache Beam, Spark, or similar frameworks for batch and stream processing (a minimal Beam sketch follows this list).
- Data Modeling: Ability to design efficient data models for storing and querying video and image data.
- Metadata Management: Understanding of metadata standards and tools for managing data lineage and quality.
- Data Quality: Implementing data validation and cleaning processes.
- Performance Optimization: Identifying and resolving performance bottlenecks in data pipelines.
- Monitoring and Alerting: Setting up monitoring and alerting systems to track pipeline health and performance.
- Big Data: Basic knowledge of Big Data concepts and related tools in the Hadoop ecosystem.
- Data Warehousing Concepts: Familiarity with terminology such as SCD, Star/Galaxy Schemas, Data Lake, and Data Warehouse.
- Data Governance (good to have): An understanding of Data Governance and Data Security.
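As a rough illustration of the pipeline skills above (GCP, Dataflow, Apache Beam), the sketch below shows a minimal Beam batch pipeline that reads CSV files from Cloud Storage and appends rows to a BigQuery table. The project, bucket, dataset, and table names are placeholders, not actual resources.

```python
# A minimal, illustrative Apache Beam pipeline: read CSV events from GCS,
# parse them, and append the rows to a BigQuery table.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line: str) -> dict:
    """Turn one CSV line into a row dict matching the BigQuery schema."""
    user_id, event_type, event_ts = line.split(",")
    return {"user_id": user_id, "event_type": event_type, "event_ts": event_ts}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",                  # use "DirectRunner" to test locally
        project="example-gcp-project",            # placeholder project id
        region="asia-southeast1",
        temp_location="gs://example-bucket/tmp",  # placeholder bucket
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText(
                "gs://example-bucket/events/*.csv", skip_header_lines=1)
            | "ParseLines" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```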
This position can be filled in Hanoi or Da Nang (#LI-Hybrid), or in Ho Chi Minh City (#LI-Remote).
Create with us digital products that people love. We will bring businesses and consumers together through AI technology and creativity, driving digital transformation to impact the world positively.
We may use AI and machine learning technologies in our recruitment process. Globant is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, veteran status, or any other characteristic protected by applicable federal, state, or local law. Globant is also committed to providing reasonable accommodations for qualified individuals with disabilities in our job application procedures. If you need assistance or an accommodation due to a disability, please let your recruiter know.
Final compensation offered is based on multiple factors, such as the specific role, hiring location, and individual skills, experience, and qualifications. In addition to competitive salaries, we offer a comprehensive benefits package. Learn more about life at Globant in the Globant Experience Guide.