Enterprise Data Warehouse Scalable Engineer :: Local to NJ (local DL) :: USC/GC at Princeton, New Jersey, USA
Email: [email protected]
From: Shweta, ZealHire <[email protected]>
Reply to: [email protected]

Job Description

Title: Enterprise Data Warehouse Scalable Engineer (Hadoop/Python)
Location: Princeton, NJ (Hybrid)
Note: Local candidates only, with a local driver's license, within 30-50 miles of the client location.
Visa: USC and GC only (genuine status).
Note: Candidate must have excellent English and be a hands-on, architect-level lead.

Our Team:
The Enterprise Data Warehouse and Business Intelligence team is looking for a proven senior engineer to join its team! Our fast-growing, diverse team of engineers is responsible for building scalable ingestion systems that handle and prepare petabytes of data for reporting, dashboards, and advanced analytics. The company runs on data. Our data captures the who, what, when, where, and why of how our clients use the company's products, and our Product teams rely on those insights to drive adoption and usage.

What's In It For You:
- Join a fast-paced team of dedicated engineers who are committed to building automated, reliable, performance-oriented systems
- Shape the strategic and technological direction of the team
- Empower your career growth through exposure to new technologies and processes

Responsibilities:
- You will collaborate with cross-functional teams to enhance and ingest data from internal and external sources
- You will identify opportunities to simplify and automate workflows
- You will translate business requirements into robust, scalable data pipelines for key business metrics
- You will use technologies such as Hadoop, Spark, Hive, Kafka, and more
- You will work with structured and unstructured datasets

You will need to have:
- 5+ years of experience with DBMS, RDBMS, and ETL methodologies
- Experience building automated, scalable architectures in an enterprise setting
- Advanced SQL capabilities (required); knowledge of database design and experience working with extremely large data volumes is a plus
- Programming experience in Python and PySpark
- Familiarity with the Hadoop ecosystem (HDFS, Spark, Oozie)
- Strong understanding of data warehousing methodologies, ETL processing, and dimensional data modeling
- Strong problem-solving and troubleshooting skills
- Knowledge of Airflow is a plus
- BA, BS, MS, or PhD in Computer Science, Engineering, or a related technology field

Nice to Have:
- Knowledge of MPP systems
09:21 PM 24-Dec-24