Opening for AWS Data Engineer in NYC, New York, USA
Email: [email protected]
Hi, I hope you are doing well. A brief version of the job description is attached below; if you are interested, please reply to my email with your updated resume.

Job Title: AWS Data Engineer
Location: NYC, NY
Duration: Long Term
Rate: $50/hr on C2C
Note: Passport number required. No OPT, CPT, or H1B visas.

Skills: Python, PySpark, Hadoop, AWS, Java (Spark)

Job Description:
We are seeking a highly skilled and motivated Data Engineer to join our team. The ideal candidate will have extensive experience with big data technologies and cloud platforms, particularly in data processing, ETL pipeline development, and data warehousing. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure that support our data-driven initiatives.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python and PySpark.
- Work with Hadoop ecosystems to manage and process large-scale datasets.
- Develop and optimize data workflows using Apache Spark with Java.
- Implement and maintain ETL processes to collect, process, and transform data from various sources.
- Collaborate with data scientists, analysts, and other stakeholders to ensure data availability and integrity.
- Deploy, monitor, and manage data infrastructure on AWS cloud services (e.g., S3, EMR, Lambda, Redshift).
- Ensure data security and compliance with industry standards.
- Troubleshoot and resolve issues related to data processing and infrastructure.
- Optimize performance and scalability of data solutions.
- Stay up to date with the latest developments in big data technologies and cloud computing.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience as a Data Engineer or in a similar role.
- Strong programming skills in Python and Java.
- Hands-on experience with PySpark and Apache Spark for data processing.
- Proficiency with Hadoop ecosystems (e.g., HDFS, Hive, MapReduce).
- Experience with AWS cloud services, including S3, EMR, Lambda, and Redshift.
- Knowledge of data warehousing concepts and ETL processes.
- Strong problem-solving skills and the ability to troubleshoot complex data issues.
- Excellent communication skills and the ability to work collaboratively in a team environment.

Preferred Qualifications:
- Experience with other big data tools and technologies (e.g., Kafka, Airflow).
- Knowledge of DevOps practices for data engineering (CI/CD pipelines, infrastructure as code).
- Familiarity with database management systems (e.g., PostgreSQL, MySQL).
- Certification in AWS or other relevant technologies.
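For context, a minimal sketch of the kind of PySpark ETL pipeline described under Key Responsibilities (extract from S3, transform, load as partitioned Parquet suitable for downstream loading into Redshift) might look like the following. The bucket names, paths, and column names are purely illustrative assumptions, not part of the posting.

# Illustrative only: a minimal PySpark ETL job of the kind described above.
# Bucket names, paths, and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

def run_etl(input_path: str, output_path: str) -> None:
    spark = (
        SparkSession.builder
        .appName("example-etl")  # hypothetical application name
        .getOrCreate()
    )

    # Extract: read raw JSON events (e.g. from S3 when run on EMR)
    raw = spark.read.json(input_path)

    # Transform: deduplicate, derive an event date, drop malformed rows
    cleaned = (
        raw.dropDuplicates(["event_id"])                 # hypothetical key column
           .withColumn("event_date", F.to_date("event_ts"))
           .filter(F.col("event_date").isNotNull())
    )
    daily = cleaned.groupBy("event_date", "user_id").count()

    # Load: write partitioned Parquet (could later be COPY'd into Redshift)
    daily.write.mode("overwrite").partitionBy("event_date").parquet(output_path)

    spark.stop()

if __name__ == "__main__":
    # Hypothetical S3 locations; any accessible input/output paths would work
    run_etl("s3://example-bucket/raw/events/", "s3://example-bucket/curated/daily_counts/")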
Posted: 07:01 PM 22-Aug-24