Job: AWS Data Engineer

Published: May 4, 2025
Category: Job

Amidel is an Information Technology and Business Consulting Company that provides highly specialised solutions to large and small enterprises in both the private and public sectors.

We are seeking a highly motivated and experienced Intermediate/Senior AWS Data Engineer. The ideal candidate has a strong background in C# or Python programming and in building data pipelines on AWS, especially with AWS Glue Jobs using PySpark or AWS Glue Spark. This role offers an exciting opportunity to collaborate with leading financial institutions, contributing to the design and implementation of data pipelines. The candidate must hold at least the AWS Certified Data Engineer – Associate certification.


Job Family: Data Engineering/Computer Science/Finance

Minimum Experience:
5 Years Python/C# Development
3 Years AWS Data Engineering

Qualifications:
Bachelor’s degree in Computer Science, Information Systems, or related field.
Advantageous: AWS Certified Machine Learning – Specialty Certificate

Contract Type:
Permanent


Duties and Responsibilities

Responsibilities differ across client engagements but may include:

  • Create data models to extract information from various sources and store it in a usable format.
  • Lead the design, implementation, and successful delivery of large-scale, critical, or complex data solutions involving significant volumes of data.
  • Apply strong SQL expertise and a solid understanding of ETL and data modelling.
  • Ingest data into AWS S3 and perform ETL into RDS or Redshift.
  • Use AWS Lambda (C# or Python) for event-driven data transformations.
  • Design and implement security measures to protect data from unauthorized access or misuse.
  • Maintain data integrity by designing backup and recovery procedures.
  • Automate the migration process in AWS from development to production.
  • Deliver digestible, contemporary, and timely data content to support and drive business decisions.
  • Participate in all aspects of data engineering, from delivery planning, estimating, and analysis through to data architecture, pipeline design, delivery, and production implementation.
  • From day one, contribute to the design and implementation of complex data solutions, ranging from batch to streaming and event-driven architectures, across cloud, on-premises, and hybrid client technology landscapes.
  • Optimize cloud workloads for cost, scalability, availability, governance, and compliance.

Competencies

  • Must have experience with AWS Glue Jobs using PySpark or AWS Glue Spark.
  • Real-time ingestion using Kafka is an added advantage.
  • Strong SQL and C# or Python programming knowledge.
  • Object-oriented principles in C# or Python: classes and inheritance.
  • Expert knowledge of data engineering packages, libraries, and related functions in C# or Python.
  • AWS technical certifications (Developer Associate or Solutions Architect).
  • Experience with development and delivery of microservices using serverless AWS services (S3, RDS, Aurora, DynamoDB, Lambda, SNS, SQS, Kinesis, IAM).
  • Ability to understand and articulate requirements to technical and non-technical audiences.
  • Experience working with relational databases such as Postgres, SQL Server, and MySQL.
  • Knowledge of scripting and automation using tools such as PowerShell, Python, Bash, Ruby, or Perl.
  • Stakeholder management and communication skills, including prioritisation, problem solving, and interpersonal relationship building.
  • Ability to troubleshoot data issues and errors effectively and efficiently.
  • Strong experience in SDLC delivery, including waterfall, hybrid, and Agile methodologies.
  • Experience delivering in an agile environment.
  • Experience in implementing and delivering data solutions and pipelines on AWS Cloud Platform.
  • A strong understanding of data modelling, data structures, databases, and ETL processes.
  • An in-depth understanding of large-scale data sets, including both structured and unstructured data.
  • Knowledge and experience in delivering CI/CD and DevOps capabilities in a data environment.
  • Ability to clearly communicate complex technical ideas.
  • Experience in the financial industry is a plus.
  • An AWS Certified Machine Learning – Specialty Certificate is an advantage.

If you are passionate about working with data and want to work in a dynamic and challenging environment, we encourage you to apply. This is an excellent opportunity to make a significant impact at a leading investment bank and to grow your career in data engineering.

You may apply on the following link: Amidel Careers Portal.