Data Architect (Snowflake) :: Remote at Snowflake, Arizona, USA
Email: [email protected]
From: Sri Sagar, brightsol [email protected]
Reply to: [email protected]

Job Summary:
We are seeking a highly skilled Data Architect with expertise in Snowflake to design, implement, and optimize data solutions for our organization. The ideal candidate will have extensive experience in data modeling, ETL processes, cloud data architecture, and performance optimization. This role requires strong analytical skills, problem-solving abilities, and a deep understanding of modern data platforms.

Key Responsibilities:
- Design and develop scalable, secure, and high-performance data architectures using Snowflake.
- Define and implement best practices for data management, governance, and security.
- Work with cross-functional teams to understand business requirements and translate them into technical solutions.
- Optimize Snowflake performance through indexing, partitioning, caching, and query tuning.
- Develop and maintain ETL pipelines using tools like dbt, Apache Airflow, Talend, Informatica, or Matillion.
- Ensure data quality, integrity, and consistency across multiple data sources.
- Implement data security and access control best practices to ensure compliance with industry standards.
- Evaluate and integrate third-party data tools and solutions as needed.
- Provide mentorship and technical guidance to data engineers and analysts.
- Stay up to date with the latest trends and advancements in data architecture, cloud technologies, and Snowflake innovations.

Required Skills & Qualifications:
- 5+ years of experience in data architecture, data engineering, or related roles.
- 3+ years of hands-on experience with Snowflake (design, implementation, and optimization).
- Strong experience with SQL and database performance tuning.
- Expertise in cloud platforms such as AWS, Azure, or Google Cloud.
- Proficiency in ETL/ELT development using tools like dbt, Apache Airflow, or Informatica.
- Experience with data warehousing concepts, dimensional modeling, and data lakes.
- Knowledge of Python, Java, or Scala for data processing.
- Experience with CI/CD pipelines and DevOps practices in data environments.
- Strong problem-solving, analytical thinking, and communication skills.

Preferred Qualifications:
- Snowflake SnowPro Certification.
- Experience with streaming data solutions like Kafka or Kinesis.
- Familiarity with data governance and compliance frameworks (GDPR, HIPAA, etc.).
- Knowledge of machine learning and AI-driven data solutions is a plus.

Keywords: continuous integration, continuous deployment, artificial intelligence
11:53 PM 10-Feb-25