Urgent Requirement: Lead Data Engineer - Encino, CA (Onsite)
Email: [email protected]
Hi,

I hope you are doing great! We have an urgent requirement for the position below. If you have an excellent match for this position, please send me a resume and contact details. Please reply at [email protected] for a quick response.

Note: Only GC and USC candidates will be considered for this role.

Job Description: Lead Data Engineer
Location: Encino, CA (Onsite)
Implementation Partner: Orion Inc
End Client: AssetMark
Contract: Long Term
Only GC and USC profiles
Note: Strong Data Vault modeling and Azure experience is mandatory.

Team and Responsibilities

We are currently looking for an experienced and detail-oriented candidate to join our team as Lead Data Engineer for one of our financial services clients, AssetMark, based out of Encino, CA. This position is for a self-motivated contributor working closely with software development teams.

Responsibilities:
- Define software feasibility, run data reviews, assess current databases to identify areas in need of improvement, and oversee data development teams.
- Build and maintain the Enterprise Data Model, including a Canonical Data Model, to meet the company's needs.
- Stay current with the industry's latest trends in data strategy and create innovative solutions.
- Provide updates to stakeholders on product development processes.
- Drive innovation and data strategy by building and improving data modeling, database design, and best practices around data modeling and architecture.
- Lead the data committee, gather requirements, and define data models and data solutions.
- Collaborate with stakeholders to understand the organization's data requirements, business objectives, and long-term goals.
- Develop a data strategy that aligns with the overall business strategy and ensures data-driven decision-making.
- Design and develop data models that cater to the specific needs of the organization.
- Establish efficient and robust data integration processes to extract, transform, and load (ETL) data from various sources into the Azure environment. Ensure data quality, consistency, and accuracy during the integration process.
- Experience working on near-real-time and real-time solutions (Change Data Capture (CDC), streaming data) using tools such as Kafka and Oracle GoldenGate (a minimal streaming sketch appears at the end of this description).
- Experience implementing a Medallion Architecture customized to the client's needs.
- Experience building ODS and data warehouse / data lakehouse solutions.
- Design and implement data warehousing solutions using Azure analytics and data platform services such as Databricks, Synapse, and Microsoft Fabric to enable scalable storage and efficient data retrieval for analytical processing.
- Establish data governance policies and procedures for data access, usage, and retention.
- Design data visualization solutions for reporting and business intelligence purposes.
- Work closely with cross-functional teams, including developers, data engineers, business analysts, and project managers, to ensure successful implementation of data solutions.
- Document the data architecture, data flows, and processes for reference and training purposes, enabling other team members to understand and work with the data architecture effectively.

Required Skills and Experience:
- Experience with Microsoft Fabric, Synapse Analytics, Azure Data Factory, Synapse Data Engineering, Azure Data Lake, Databricks, Microsoft Purview, Azure Stream Analytics, Azure DevOps, and standard Microsoft development environments.
- Experience working with the SQL, PL/SQL, Python, and PySpark programming languages; DBT is nice to have.
- Familiarity with the SDLC and common data warehousing development practices, including understanding of or experience in Agile development.
- Experience with data quality, scheduling, and SLA management.
- The lead must have built data models with Kimball and Data Vault type structures as lakehouses (a minimal Data Vault sketch appears right after this list).
- Experience with relational, NoSQL, columnar, and big data technologies, techniques, and trends on the Azure cloud.
- Strong experience with data analytics, data integration (ETL/ELT), data delivery, data preparation, data discovery, data modeling, data processing, and data storage.
- Experience evaluating metadata products such as Collibra, OpenMetadata, and Apache Atlas, and guiding the team to implement and manage them.
- Bachelor's degree in computer science or a related field, or equivalent experience and/or training.
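For candidates less familiar with the Data Vault structures named above, here is a minimal, illustrative PySpark sketch of the hub/satellite loading pattern. All table, column, and source names are hypothetical placeholders, not details of the client's environment; a real lakehouse load would read from and write to Delta tables on Azure Data Lake.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import sha2, concat_ws, current_timestamp, lit

spark = SparkSession.builder.appName("data-vault-sketch").getOrCreate()

# Hypothetical Bronze-layer extract; in practice this would be read from
# ADLS/Delta, e.g. spark.read.format("delta").load("/bronze/crm/customers").
raw = spark.createDataFrame(
    [("C001", "Ada Lovelace", "ada@example.com")],
    ["customer_id", "name", "email"],
)

# Hub: one row per business key, keyed by a deterministic hash.
hub_customer = (
    raw.select("customer_id")
       .dropDuplicates()
       .withColumn("hub_customer_hk", sha2(concat_ws("||", "customer_id"), 256))
       .withColumn("load_ts", current_timestamp())
       .withColumn("record_source", lit("crm")))

# Satellite: descriptive attributes plus a hash diff used to detect change
# between loads (a new satellite row is inserted only when hash_diff changes).
sat_customer = (
    raw.withColumn("hub_customer_hk", sha2(concat_ws("||", "customer_id"), 256))
       .withColumn("hash_diff", sha2(concat_ws("||", "name", "email"), 256))
       .withColumn("load_ts", current_timestamp())
       .select("hub_customer_hk", "hash_diff", "name", "email", "load_ts"))

hub_customer.show(truncate=False)
sat_customer.show(truncate=False)
```

In a medallion layout, hubs and satellites like these would typically land in the Silver layer, with Kimball-style dimensional marts derived from them in Gold.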
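The near-real-time CDC item in the responsibilities is likewise easiest to picture in code. Below is a minimal Spark Structured Streaming sketch that consumes a hypothetical Kafka CDC topic and lands the raw change events in Bronze; the broker address, topic name, and paths are placeholders, and it assumes the spark-sql-kafka connector and Delta Lake are available.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("cdc-stream-sketch").getOrCreate()

# Subscribe to a hypothetical CDC topic; each Kafka record's value holds a
# change event emitted by the capture tool (e.g. GoldenGate or Debezium).
events = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "crm.customers.cdc")          # placeholder topic
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string")))

# Land the raw change events in the Bronze layer for downstream merges.
query = (events.writeStream
    .format("delta")
    .option("checkpointLocation", "/bronze/_checkpoints/customers_cdc")
    .start("/bronze/crm/customers_cdc"))
```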