
Snowflake Data Engineer || Remote || 12+ Years Experience Req at Snowflake, Arizona, USA
Email: [email protected]
Hi,

Hope you are doing well!
This is Divya from Siri Info Solutions. I have some urgent requirements with one of my clients. Please go through the job description below and let me know if you are interested. If you are not interested, please feel free to share this position with friends or colleagues who may be a good fit.

COMPLETE JOB DESCRIPTION IS BELOW FOR YOUR REVIEW:
Job Title: Snowflake Data Engineer
Location: Remote
Hire Type: Contract

Visa: H1B only (passport number required)
Job Description:
Experience required: 10-12+ years
    Experience in the IT industry
    Hands-on experience building production-grade data ingestion and processing pipelines in Snowflake
    Strong understanding of Snowflake architecture
    Well-versed in data warehousing concepts
    Expertise in Snowflake features and in integrating Snowflake with other data processing tools and platforms
    Able to build data pipelines for ETL/ELT (a minimal sketch follows this list)
    Excellent presentation and communication skills, both written and verbal
    Ability to problem-solve and architect in an environment with unclear requirements
    Able to create high-level and low-level design documents based on requirements
    Hands-on experience configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud
    Awareness of data visualisation tools and methodologies
    Able to work independently on business problems and generate meaningful insights
    Experience or knowledge of Snowpark, Streamlit, or GenAI is good to have but not mandatory
    Should have experience implementing Snowflake best practices
    Snowflake SnowPro Core Certification will be an added advantage
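As an illustration of the ingestion and ETL/ELT work described above, here is a minimal, hedged sketch of a load-and-transform step driven from Python with snowflake-connector-python. Every identifier in it (the account, ETL_WH, RAW_DB, raw_orders, @ext_stage, analytics.orders_clean) is a placeholder assumption, not something specified in this requisition.

# Minimal ELT sketch: bulk-load staged files into a raw table, then transform
# inside Snowflake. All names and credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder account identifier
    user="etl_user",             # placeholder credentials
    password="***",
    warehouse="ETL_WH",
    database="RAW_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Ingest: COPY INTO pulls files from an external stage (e.g. S3 or Azure Blob).
    cur.execute("""
        COPY INTO raw_orders
        FROM @ext_stage/orders/
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Transform: the ELT step is pushed down to Snowflake itself.
    cur.execute("""
        INSERT INTO analytics.orders_clean
        SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date, amount
        FROM raw_orders
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()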

Roles and Responsibilities:
    Gather requirements, create design documents, provide solutions to the customer, and work with the offshore team
    Write SQL queries against Snowflake and develop scripts to extract, load, and transform data
    Hands-on experience with Snowflake utilities and features such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, the query optimizer, metadata management, data sharing, stored procedures and UDFs, Snowsight, and Streamlit
    Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems
    Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF)
    Should have good experience integrating Python/PySpark with Snowflake and the cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage (see the first sketch after this list)
    Proficiency in the Python programming language, including data types, variables, functions, loops, conditionals, and other Python-specific concepts
    Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark
    Should have some experience with Snowflake RBAC and data security
    Should have good experience implementing CDC or SCD Type 2 (see the second sketch after this list)
    In-depth understanding of data warehouse and ETL concepts
    Experience in requirements gathering, analysis, development, and deployment
    Should have experience building data ingestion pipelines
    Optimize and tune data pipelines for performance and scalability
    Able to communicate with clients and lead a team
    Good to have experience with Airflow or other workflow management tools for scheduling and managing ETL jobs
    Good to have experience with deployment using CI/CD tools and with repositories such as Azure Repos, GitHub, etc.
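Two hedged sketches related to the responsibilities above follow. First, a Python/PySpark read from Snowflake using the spark-snowflake connector (the connector and Snowflake JDBC driver JARs must already be on the Spark classpath); connection values, table names, and the output path are placeholders, not details from this requisition.

# First sketch: PySpark integration with Snowflake via the spark-snowflake connector.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("snowflake-read").getOrCreate()

sf_options = {
    "sfURL": "my_account.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "etl_user",
    "sfPassword": "***",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

orders = (
    spark.read
    .format("net.snowflake.spark.snowflake")   # spark-snowflake data source
    .options(**sf_options)
    .option("dbtable", "ORDERS_CLEAN")
    .load()
)

# Simple downstream aggregate written back to cloud object storage (placeholder path).
daily = orders.groupBy("ORDER_DATE").sum("AMOUNT")
daily.write.mode("overwrite").parquet("s3://my-bucket/curated/daily_orders/")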
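Second, a minimal SCD Type 2 maintenance step in Snowflake driven from Python: expire current dimension rows whose tracked attributes changed, then insert new current versions. The tables and columns (dim_customer, stg_customer, valid_from/valid_to, is_current) are illustrative assumptions only.

# Second sketch: SCD Type 2 upkeep for a hypothetical dim_customer table
# fed from a hypothetical stg_customer staging table.
import snowflake.connector

EXPIRE_CHANGED_ROWS = """
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND (d.email <> s.email OR d.segment <> s.segment) THEN
      UPDATE SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
"""

INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer (customer_id, email, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL   -- new keys, plus keys whose current row was just expired
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)   # step 1: close out changed current rows
    cur.execute(INSERT_NEW_VERSIONS)   # step 2: open new current rows
finally:
    conn.close()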

Minimum Qualifications:
    B.E. or Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with solid IT experience relevant to the Snowflake Data Engineer role.
Skill Matrix:
    Snowflake, Python/PySpark, AWS/Azure, ETL concepts, & Data Warehousing concepts

Regards

Divya Pandey

LinkedIn: linkedin.com/in/divya-pandey-5345ba218
Email: [email protected]
