
ETL Developer

Job Description

Alchemy is currently partnering with a global technology firm that is looking to recruit an ETL Developer to be based in Philadelphia, PA, USA.

Responsibilities

  • Interpret data, analyze results using statistical techniques and provide ongoing reports.
  • Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality.
  • Acquire data from primary or secondary data sources and maintain databases/data systems.
  • Identify, analyze, and interpret trends or patterns in complex data sets.
  • Filter and clean data by reviewing reports and performance indicators to locate and correct problems.
  • Work with management to prioritize business and information needs.
  • Locate and define new process improvement opportunities.
  • Interpret application/feature/component designs and develop them in accordance with specifications.
  • Code, debug, test, document, and communicate product/component/feature development stages.
  • Validate results with user representatives; integrate and commission the overall solution.
  • Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating new solutions.
  • Influence and improve customer satisfaction.
  • Set FAST goals for self/team; provide feedback on team members' FAST goals.

Qualifications

  • 8+ years of experience architecting and implementing complex ETL pipelines, preferably with the Spark toolset.
  • 4+ years of experience with Java, particularly within the data space.
  • Technical expertise in data models, database design and development, data mining, and segmentation techniques.
  • Good experience writing complex SQL and ETL processes.
  • Excellent coding and design skills, particularly in Java/Scala and Python.
  • Experience working with large data volumes, including processing, transforming, and transporting large-scale data.
  • Experience with AWS technologies such as EC2, Redshift, CloudFormation, EMR, S3, and AWS Analytics is required.
  • Experience with big data-related AWS technologies such as Hive, Presto, and Hadoop.