Home

Long Term JOB REQUIREMENT: Data Architect (Microsoft Fabric & Azure Databricks), Atlanta, GA, Hybrid; need submissions. Atlanta, Georgia, USA
Email: [email protected]

Dear Partner,

Good Morning,

Greetings from Nukasani Group Inc.! We have an urgent, long-term contract project immediately available for a Data Architect (Microsoft Fabric & Azure Databricks) in Atlanta, GA (hybrid); submissions are needed. Please review the role below. If you are available, could you please send me your updated Word resume and the candidate submission details below immediately? If you are not available, any referrals would be greatly appreciated.

Interviews are in progress, so an urgent response is appreciated. I look forward to your immediate response and to working with you.

Candidate Submission Format - needed from you

Full Legal Name

Personal Cell No. (not a Google Voice number)

Email Id

Skype Id

Interview Availability

Availability to start, if selected

Current Location

Open to Relocate

Work Authorization

Total Relevant Experience

Education / Year of Graduation

University Name, Location

Last 4 digits of SSN

Country of Birth

Contractor Type

DOB: mm/dd

Home Zip Code

Assigned Job Details

Job Title: Data Architect (Microsoft Fabric & Azure Databricks)

Location: Atlanta, GA, Hybrid

Rate: Best competitive rate

Supervises the coordination of design and security for computer databases to store, track, and maintain a large volume of critical business information.

Job Description:

Data Architect - Microsoft Fabric & Azure Databricks

The Department of Early Care & Learning (DECAL) is seeking an experienced Data Architect to design and implement enterprise data solutions using Microsoft Fabric and Azure Databricks for integration with state-level systems. This role will focus on creating scalable data architecture that enables seamless data flow between IES Gateway and our analytics platform. The ideal candidate will have deep expertise in modern data architecture, with specific experience in Microsoft's data platform and Delta Lake architecture.

Work Location & Attendance Requirements:

Must be physically located in Georgia

On-site: Tuesday to Thursday, per manager's discretion

Mandatory in-person meetings:

All Hands

Enterprise Applications

On-site meetings

DECAL All Staff

Work arrangements subject to management's decision

While the intent may be a long-term tenure, this position is subject to annual budget restrictions. The initial contract is through the end of this fiscal year and is anticipated to be renewed July 1st.

Key Responsibilities:

Data Architecture:

Design end-to-end data architecture leveraging Microsoft Fabric's capabilities.

Design data flows within the Microsoft Fabric environment.

Implement OneLake storage strategies.

Configure Synapse Analytics workspaces.

Establish Power BI integration patterns.

Integration Design:

Architect data integration patterns between IES Gateway and the analytics platform using Azure Databricks and Microsoft Fabric.

Design Delta Lake architecture for IES Gateway data.

Implement medallion architecture (Bronze/Silver/Gold layers).

Create real-time data ingestion patterns.

Establish data quality frameworks.

Lakehouse Architecture:

Implement modern data lakehouse architecture using Delta Lake, ensuring data reliability and performance.
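To illustrate the Bronze/Silver/Gold medallion layering referenced above: in practice this would be built with PySpark DataFrames and Delta Lake tables, but the following standard-library Python sketch (with hypothetical table names and fields, not DECAL data) shows the idea of progressively refining raw records into a reporting-ready aggregate:

```python
# Medallion-architecture sketch: Bronze (raw) -> Silver (cleansed) -> Gold (aggregated).
# Illustrative only; a real lakehouse would use PySpark and Delta tables.
from collections import defaultdict

# Bronze: raw records landed as-is from the source system (hypothetical fields).
bronze = [
    {"grant_id": "G1", "county": "Fulton ", "amount": "1000"},
    {"grant_id": "G2", "county": "fulton", "amount": "250"},
    {"grant_id": "G3", "county": "DeKalb", "amount": None},  # incomplete record
]

def to_silver(rows):
    """Silver: validate, standardize, and type-cast the raw records."""
    out = []
    for r in rows:
        if r["amount"] is None:  # data-quality rule: drop incomplete rows
            continue
        out.append({
            "grant_id": r["grant_id"],
            "county": r["county"].strip().title(),
            "amount": float(r["amount"]),
        })
    return out

def to_gold(rows):
    """Gold: business-level aggregate ready for reporting."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["county"]] += r["amount"]
    return dict(totals)

gold = to_gold(to_silver(bronze))
print(gold)  # {'Fulton': 1250.0}
```

The same shape carries over to Spark: each function becomes a DataFrame transformation writing to its own Delta table per layer.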

Data Governance:

Establish data governance frameworks incorporating Microsoft Purview for data quality, lineage, and compliance.

Implement row-level security.

Configure Microsoft Purview policies.

Establish data masking for sensitive information.

Design audit logging mechanisms.
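As a rough sketch of the data-masking responsibility above (field names and the SSN format are illustrative assumptions, not DECAL specifications): in Fabric or Databricks this would typically be handled with dynamic data masking or Unity Catalog column masks rather than application code, but the logic amounts to:

```python
# Column-level masking sketch for sensitive fields; illustrative only.

def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits, e.g. '123-45-6789' -> '***-**-6789'."""
    digits = [c for c in ssn if c.isdigit()]
    return "***-**-" + "".join(digits[-4:])

def mask_record(record: dict, sensitive: set) -> dict:
    """Return a copy of the record with the listed fields masked."""
    return {k: (mask_ssn(v) if k in sensitive else v) for k, v in record.items()}

row = {"name": "Jane Doe", "ssn": "123-45-6789"}
print(mask_record(row, {"ssn"}))  # {'name': 'Jane Doe', 'ssn': '***-**-6789'}
```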

Pipeline Development:

Design scalable data pipelines using Azure Databricks for ETL/ELT processes and real-time data integration.

Performance Optimization:

Implement performance tuning strategies for large-scale data processing and analytics workloads.

Optimize Spark configurations.

Implement partitioning strategies.

Design caching mechanisms.

Establish monitoring frameworks.

Security Framework:

Design and implement security patterns aligned with federal and state requirements for sensitive data handling.

Required Qualifications:

Education: Bachelor's degree in computer science or a related field.

Experience:

6+ years of experience in data architecture and engineering.

2+ years hands-on experience with Azure Databricks and Spark.

Recent experience with Microsoft Fabric platform.

Technical Skills:

Microsoft Fabric Expertise:

Data Integration: Combining and cleansing data from various sources.

Data Pipeline Management: Creating, orchestrating, and troubleshooting data pipelines.

Analytics Reporting: Building and delivering detailed reports and dashboards to derive meaningful insights from large datasets.

Data Visualization Techniques: Representing data graphically in impactful and informative ways.

Optimization and Security: Optimizing queries, improving performance, and securing data.

Azure Databricks Experience:

Apache Spark Proficiency: Utilizing Spark for large-scale data processing and analytics.

Data Engineering: Building and managing data pipelines, including ETL (Extract, Transform, Load) processes.

Delta Lake: Implementing Delta Lake for data versioning, ACID transactions, and schema enforcement.

Data Analysis and Visualization: Using Databricks notebooks for exploratory data analysis (EDA) and creating visualizations.

Cluster Management: Configuring and managing Databricks clusters for optimized performance (e.g., autoscaling and automatic termination).

Integration with Azure Services: Integrating Databricks with other Azure services like Azure Data Lake, Azure SQL Database, and Azure Synapse Analytics.

Machine Learning: Developing and deploying machine learning models using Databricks MLflow and other tools.

Data Governance: Implementing data governance practices using Unity Catalog and Microsoft Purview.

Programming & Query Languages:

SQL: Proficiency in SQL for querying and managing databases, including skills in SELECT statements, JOINs, subqueries, and window functions.

Python: Using Python for data manipulation, analysis, and scripting, including libraries like Pandas, NumPy, and PySpark.
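To ground the SQL window-function requirement, here is a small self-contained example using Python's built-in sqlite3 module (the table and column names are made up for illustration): it ranks grants within each county by amount using ROW_NUMBER() with PARTITION BY.

```python
import sqlite3

# Rank grants within each county by amount using a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE grants (grant_id TEXT, county TEXT, amount REAL);
    INSERT INTO grants VALUES
        ('G1', 'Fulton', 1000), ('G2', 'Fulton', 250), ('G3', 'DeKalb', 400);
""")
rows = conn.execute("""
    SELECT grant_id, county,
           ROW_NUMBER() OVER (PARTITION BY county ORDER BY amount DESC) AS rn
    FROM grants
    ORDER BY county, rn
""").fetchall()
print(rows)  # [('G3', 'DeKalb', 1), ('G1', 'Fulton', 1), ('G2', 'Fulton', 2)]
conn.close()
```

The same query runs unchanged in most engines that support SQL window functions, including Databricks SQL and Fabric warehouses.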

Data Modeling:

Dimensional modeling

Real-time data modeling patterns

Soft Skills:

Strong analytical and problem-solving abilities

Excellent communication skills for technical and non-technical audiences

Experience working with government stakeholders

Preferred Experience:

Azure DevOps

Infrastructure as Code (Terraform)

CI/CD for data pipelines

Data mesh architecture

Certifications (preferred):

Microsoft Azure Data Engineer Associate

Databricks Data Engineer Professional

Microsoft Fabric certifications (as they become available)

Project-Specific Requirements:

Experience designing data architectures for grant management systems

Knowledge of federal/state compliance requirements for data handling

Understanding of financial data processing requirements

Experience with real-time integration patterns

This position requires strong expertise in modern data architecture with specific focus on Microsoft's data platform. The successful candidate will play a crucial role in designing and implementing scalable data solutions that enable efficient data processing and analytics for state-level grant management and reporting systems.

Skill | Required / Desired | Years of Experience

6+ years of experience in data architecture and engineering | Required |

2+ years of hands-on experience with Azure Databricks and Spark | Required |

Recent experience with the Microsoft Fabric platform | Required | 2

Azure Databricks experience | Required |

Proficiency in SQL for querying and managing databases, including SELECT statements, JOINs, subqueries, and window functions | Required | 3

Using Python for data manipulation, analysis, and scripting, including libraries like Pandas, NumPy, and PySpark | Required | 3

Thanks and regards,

Bhavani | Technical Recruitment | Nukasani Group

1001 E Chicago Ave, Unit B 111, Naperville IL 60540.

Email: [email protected]

People, Process, Technology Integrator.

An E-Verified Company

Submission Required Urgently  

2024 Nukasani Group | 1001 E Chicago Ave, Unit B 111, Naperville, IL 60540, US

Posted: 08:00 PM 03-Dec-24

