
Looking for Snowflake Data Architect || 15+ Years || OH Locals Only || at Columbus, Ohio, USA
Email: [email protected]
From: venkat, oceanblue ([email protected])

Reply to: [email protected]

Hi All,
Hope you are doing well.

Role: Snowflake Data Architect

Location: Columbus, OH (Hybrid)

Experience: 15+ Years

Responsibilities:

* Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.

* Provide Snowflake database technical support, developing reliable, efficient, and scalable solutions for various projects on Snowflake.

* Migrate existing data, frameworks, and programs from the ODM EDW IOP big data environment to the ODM EDW Snowflake environment using best practices.

* Design and develop Snowpark features in Python; understand the requirements and iterate.

* Interface with the open-source community and contribute to Snowflake's open-source libraries, including Snowpark Python and the Snowflake Python Connector.

* Create, monitor, and maintain role-based access controls, virtual warehouses, tasks, Snowpipe, and streams on Snowflake databases to support different use cases.

* Tune the performance of Snowflake queries and procedures; recommend and document Snowflake best practices.

* Explore new capabilities of Snowflake, perform POCs, and implement them based on business requirements.

* Responsible for creating and maintaining the Snowflake technical documentation, ensuring compliance with data governance and security policies.

* Implement Snowflake user/query log analysis, history capture, and user email alert configuration.

* Enable data governance in Snowflake, including row/column-level data security using secure views and dynamic data masking features.

* Perform data analysis, data profiling, data quality checks, and data ingestion across various layers using Hadoop/Hive/Impala queries, PySpark programs, and UNIX scripts.

* Follow the organization's coding standards document; create mappings, sessions, and workflows per the mapping specification document.

* Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.

* Create mock data, perform unit testing, and capture result sets for jobs developed in lower environments.

* Update the production support runbook and Control-M schedule document per each production release.

* Create and update design documents, providing detailed descriptions of workflows after every production release.

* Continuously monitor production data loads, fix issues, update the tracker document, and identify performance problems.

* Tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.

* Perform quality assurance checks and post-load reconciliation, and communicate with vendors to obtain corrected data.

* Participate in ETL/ELT code reviews and design reusable frameworks.

* Create change requests, work plans, test results, and BCAB checklist documents for code deployment to the production environment, and validate code post-deployment.

* Work with the Snowflake admin, Hadoop admin, ETL, and SAS admin teams on code deployments and health checks.

* Create a reusable Audit Balance Control framework to capture reconciliation results, mapping parameters, and variables, serving as a single point of reference for workflows.

* Create Snowpark and PySpark programs to ingest historical and incremental data.

* Create Sqoop scripts to ingest historical data from the EDW Oracle database into Hadoop IOP, and create Hive table and Impala view creation scripts for dimension tables.

* Participate in meetings to continuously upgrade functional and technical expertise.
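As a hypothetical illustration of the post-load reconciliation and Audit Balance Control duties above, a sketch of the core check might compare per-table row counts captured from the source and target and flag mismatches. This is a minimal plain-Python example: the table names and counts are invented, and a real implementation would query the Hadoop source and the Snowflake target rather than use in-memory dictionaries.

```python
def reconcile(source_counts, target_counts):
    """Compare per-table row counts captured before and after a load.

    Returns a dict of table -> (source_count, target_count) for every
    mismatch, treating a table missing on either side as a count of 0.
    """
    mismatches = {}
    for table in set(source_counts) | set(target_counts):
        src = source_counts.get(table, 0)
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches[table] = (src, tgt)
    return mismatches


# Invented counts for illustration only.
source = {"dim_provider": 1200, "dim_member": 45000, "fact_claims": 980000}
target = {"dim_provider": 1200, "dim_member": 44990, "fact_claims": 980000}

print(reconcile(source, target))  # {'dim_member': (45000, 44990)}
```

A mismatch report like this would typically feed the tracker document and trigger the vendor communication described above.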

Required Skill Sets:

* Proficiency in Data Warehousing, Data migration, and Snowflake is essential for this role.

* Strong experience in the implementation, execution, and maintenance of data integration technology solutions.

* Minimum of 4-6 years of hands-on experience with cloud databases.

* Minimum of 2-3 years of hands-on data migration experience from a big data environment to Snowflake.

Keywords: Ohio
06:46 PM 07-Mar-25



