Job Location
The Woodlands, TX
Date posted
August 4, 2025
Valid through
August 9, 2025
Position title
AWS Data Engineer
Description

We are seeking a highly skilled AWS Data Engineer with a strong background in Red Hat Linux and expertise in building ETL pipelines to support cloud data migration initiatives. The ideal candidate will have hands-on experience with AWS Migration Services and will play a critical role in designing, developing, and maintaining scalable data integration solutions in a secure and reliable cloud environment.

Responsibilities

·         Design, build, and optimize robust ETL pipelines for data migration and integration using AWS services such as AWS Glue, Data Migration Service (DMS), Lambda, and Step Functions.
·         Work with AWS Migration Services to migrate on-premises data sources to AWS cloud-based data lakes and databases.
·         Administer and troubleshoot Red Hat Linux environments as part of the data pipeline infrastructure.
·         Collaborate with data architects, cloud engineers, and stakeholders to define data migration requirements and strategies.
·         Ensure data quality, data governance, and security standards are enforced throughout the ETL lifecycle.
·         Monitor and maintain data workflows to ensure performance, scalability, and fault tolerance.
·         Automate and orchestrate data workflows using scripting (Python, Bash) and cloud-native tools.
·         Support and troubleshoot production data pipelines and perform root cause analysis on failures.

Qualifications

·         13+ years of IT experience, including 5+ years in Data Engineering or ETL development.
·         3+ years of hands-on experience with AWS services, including:
o    AWS Glue, AWS Lambda, AWS DMS, S3, Redshift, EMR, and Step Functions.
·         Proficiency in Red Hat Linux administration, shell scripting, and system troubleshooting.
·         Experience in designing and implementing ETL solutions for cloud migration projects.
·         Strong proficiency in Python, SQL, and cloud-native data processing frameworks.
·         Solid understanding of data warehousing, data lakes, and cloud-native architectures.
·         Familiarity with DevOps practices and tools such as CloudFormation, Terraform, or CI/CD pipelines is a plus.

Contacts

Regards,

Dhiva

813-906-5201
