Hiring! -- AWS/Databricks/AI Architect - REMOTE (Washington, DC, USA). Please share Architect-level profiles only.
Email: [email protected]
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2047445&uid=
Please share resumes to [email protected]. Architect-level profiles only.

Job Title: AWS/Databricks/AI Architect
Location: Washington, DC (Remote)

Key Responsibilities:
Solutions Design: Architect end-to-end, scalable data solutions using Databricks, Azure, AWS, and other cloud-based services to meet client requirements.
Data Strategy & Architecture: Develop robust data architectures, implement ETL pipelines, and establish data governance frameworks to ensure data accuracy and consistency.
Technical Leadership: Lead the design and implementation of data platforms, define best practices, and mentor team members to elevate technical competencies within the organization.
Advanced Analytics & Machine Learning: Create pipelines and workflows that support advanced analytics and machine learning initiatives, using Databricks and other cloud-based services.
Collaboration & Client Engagement: Act as a trusted advisor, working closely with client stakeholders to understand business needs, present architectural solutions, and implement strategies that deliver high-impact results.
Performance Optimization: Identify bottlenecks and recommend solutions to optimize data workflows and architecture for both cost and performance.
Innovation & Experimentation: Champion the use of cutting-edge data technologies and methodologies to drive innovation and experimentation within client projects.

Qualifications & Experience:
15+ years of experience in a senior data engineering or solutions architect role, with a strong focus on the Databricks, Azure, or AWS cloud ecosystems.
Proven track record of designing and implementing data architectures that support large-scale data pipelines and analytics.
Strong proficiency in Python, PySpark, and SQL for data processing and manipulation.
Data applications (e.g., log analysis, alerting mechanisms, real-time systems monitoring, risk analysis, and more).
Big data engineering (e.g., Spark, Hadoop, Kafka).
Data warehousing & ETL (e.g., SQL, OLTP/OLAP/DSS).
Experience with, or exposure to, the IDMC tool.
Strong knowledge of data quality, data lineage, and data catalogs.
Experienced at designing, architecting, and presenting data systems to customers, and at managing the delivery of production solutions for those data architectures.
Experience with data science and machine learning (AI/ML) (e.g., pandas, R, scikit-learn, HPO).
Extensive experience creating ETL pipelines from scratch, handling large datasets, and developing solutions that align with business goals.
Extensive experience interacting with C-suite stakeholders.
Hands-on experience with data warehousing concepts, data modelling, and building data marts.
Ability to establish the Databricks Lakehouse architecture as the standard data architecture for customers through excellent technical account planning.
Industry experience in transportation, logistics, or other complex domains is highly desirable.
Fluency in verbal and written English, with the ability to communicate complex technical concepts to non-technical stakeholders.
Personality traits: analytical, solution-oriented, proactive, team player, and continuously learning.
11:06 PM 02-Jan-25