Sr. Data Analyst with Healthcare Experience (EAD or GC or Citizen) | 100% Remote, USA
Email: [email protected]
http://bit.ly/4ey8w48 https://jobs.nvoids.com/job_details.jsp?id=2136000&uid=
From: Bhuwan Tiwari, SoniTalent Corp [email protected]
Reply to: [email protected]

Role Title: Sr. Data Analyst
Work Location: 100% Remote
Duration: Contract
Visa: USC, GC, GC-EAD, H4-EAD only
Note: Candidate must provide the last 5 digits of their SSN, a LinkedIn profile, and copies of their visa and driver's license.
Work hours: 9am-5pm

What are the top 5-10 responsibilities for this position (please be detailed as to what the candidate is expected to do or complete on a daily basis)?
- Develop and manage effective working relationships with other departments, groups, and personnel with whom work must be coordinated or interfaced
- Communicate efficiently with the ETL architect, understanding the requirements and business processes, in order to transform data in a way that is geared toward the needs of end users
- Assist in the overall architecture of the ETL design, and proactively provide input on designing, implementing, and automating ETL flows
- Investigate and mine data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions
- Develop ETL pipelines and data flows into and out of the data warehouse using a combination of Azure Data Factory and Snowflake toolsets
- Develop idempotent ETL process designs so that interrupted, incomplete, or failed processes can be rerun without errors using ADF data flows and pipelines
- Work in Snowflake virtual warehouses as needed and automate data pipelines with Snowpipe for tedious ETL problems
- Capture changes in data dimensions and maintain versions of them using Streams in Snowflake, scheduled with Tasks (a minimal sketch of this pattern, combined with an idempotent MERGE load, follows the required-skills list below)
- Optimize every step of data movement, not only at the source and in transit but also at rest in the database, for accelerated responses
- Build a highly efficient orchestrator that can schedule jobs, execute workflows, perform data quality checks, and coordinate dependencies among tasks
- Test ETL system code, data design, pipelines, and data flows; perform root cause analysis on all processes, resolve production issues, and run routine tests on databases, data flows, and pipelines
- Document implementations and test cases, and build the deployment documents needed for CI/CD

What skills/attributes are required (please be detailed as to the number of years of experience for each skill)?
- 5+ years of data engineering experience with a focus on data warehousing
- 2+ years of experience creating pipelines in Azure Data Factory (ADF)
- 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools
- 5+ years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.
- 3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
- 2+ years of experience with GitHub, SVN, or similar source control systems
- 2+ years of experience processing structured and unstructured data
- Experience with HL7 and FHIR standards and with processing files in these formats
- 3+ years analyzing project requirements and developing detailed specifications for ETL requirements
- Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines
- Ability to adapt to evolving technologies and changing business requirements
- Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business
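As context for the Streams/Tasks and idempotent-rerun responsibilities above, here is a minimal, hypothetical sketch of that pattern in Python using the Snowflake connector. The schema, table, task, warehouse, and column names (stg.member_dim, dw.member_dim, etl.load_member_dim, ETL_WH, member_id, plan_code, effective_date) are illustrative assumptions, not actual project objects; the sketch only shows the shape of the technique: a Stream captures dimension changes, a scheduled Task fires only when the stream has data, and a MERGE keyed on the business key keeps reruns idempotent.

```python
# Hypothetical sketch: capture dimension changes with a Snowflake Stream and
# load them on a schedule with a Task, using MERGE so reruns stay idempotent.
# All object and column names are illustrative placeholders.
import snowflake.connector

STATEMENTS = [
    # Stream records inserts/updates/deletes landing in the staging dimension table.
    "CREATE OR REPLACE STREAM stg.member_dim_stream ON TABLE stg.member_dim",
    # Task wakes hourly, but only runs when the stream actually has new rows.
    """
    CREATE OR REPLACE TASK etl.load_member_dim
      WAREHOUSE = ETL_WH
      SCHEDULE  = '60 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('STG.MEMBER_DIM_STREAM')
    AS
    MERGE INTO dw.member_dim AS tgt
    USING stg.member_dim_stream AS src
      ON tgt.member_id = src.member_id
    WHEN MATCHED AND src.METADATA$ACTION = 'INSERT'
      THEN UPDATE SET plan_code = src.plan_code,
                      effective_date = src.effective_date
    WHEN NOT MATCHED AND src.METADATA$ACTION = 'INSERT'
      THEN INSERT (member_id, plan_code, effective_date)
           VALUES (src.member_id, src.plan_code, src.effective_date)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK etl.load_member_dim RESUME",
]


def deploy(conn: "snowflake.connector.SnowflakeConnection") -> None:
    """Run each statement once. Because the MERGE is keyed on member_id,
    re-executing an interrupted or failed batch updates existing rows
    instead of duplicating them."""
    cur = conn.cursor()
    try:
        for stmt in STATEMENTS:
            cur.execute(stmt)
    finally:
        cur.close()


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder credentials
        user="etl_user",
        password="***",
        role="ETL_ROLE",
        warehouse="ETL_WH",
    )
    try:
        deploy(conn)
    finally:
        conn.close()
```

The same idea carries over to ADF-driven loads: as long as each load step is a keyed MERGE (or truncate-and-reload of a bounded slice), the pipeline can be rerun after a failure without manual cleanup.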
What skills/attributes are preferred (what will set a candidate apart)?
- 2+ years of batch or PowerShell scripting
- 2+ years of experience with Python scripting
- 3+ years of data modeling experience in a data warehouse environment
- Experience or familiarity with Informatica Intelligent Cloud Services (specifically Data Integration)
- Experience designing and building APIs in Snowflake and ADF (e.g. REST, RPC)
- Experience with state Medicaid / Medicare / healthcare applications
- Azure certifications related to data engineering or data analytics

Can you please provide a summary of the project/initiative which describes what is being done?
As a member of the Optum Data Management team, the Data Engineer supports the Alabama EDS by developing and maintaining workflows, identifying and resolving data quality issues, and optimizing processes to improve performance. The Data Engineer will also support intrastate agencies by monitoring automated data extracts and working directly with state partners to create new extracts based on business specifications (a sketch of a basic scripted extract check follows the team-structure list below).

What does the ideal candidate background look like (ex: healthcare-specific background, etc.)? We want to be as specific as possible with our firms so they can find the type of candidate you're looking for.
Data Engineer with healthcare (Medicaid) experience and Microsoft Azure-based experience with Snowflake and Azure Data Factory.

Of the required skills listed, which would you consider the top 3? Please list your expectations regarding years of experience for each requirement.
- 5+ years of data engineering experience with a focus on data warehousing
- 2+ years of experience creating pipelines in Azure Data Factory (ADF)
- 3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL

What experience will set candidates apart from one another?
- 5+ years of data engineering experience with a focus on data warehousing
- 2+ years of experience creating pipelines in Azure Data Factory (ADF)
- 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools
- 5+ years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.
- 3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
- 2+ years of experience with GitHub, SVN, or similar source control systems
- 2+ years of experience processing structured and unstructured data
- Experience with HL7 and FHIR standards and with processing files in these formats
- 3+ years analyzing project requirements and developing detailed specifications for ETL requirements
- Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines
- Ability to adapt to evolving technologies and changing business requirements
- Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business

What does the team structure look like? How many members, and what is the breakdown of the team's skill sets (ex: 1 PM, 4 Developers, etc.)?
- 1 Project Manager
- 4 Data Engineers
- Business Analyst
- QA Testers
- DBA
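As a rough illustration of the extract-monitoring and data-quality work described in the project summary above, the following Python sketch checks a single daily extract file before it moves downstream. The file name, required columns, and thresholds are invented placeholders, not actual state-partner specifications.

```python
# Hypothetical data-quality check for one automated extract file.
# Column names, file name, and thresholds are illustrative assumptions.
import csv
import sys
from pathlib import Path

REQUIRED_COLUMNS = {"member_id", "claim_id", "service_date", "paid_amount"}
MIN_EXPECTED_ROWS = 1  # a silently empty extract is treated as a failure


def check_extract(path: Path) -> list[str]:
    """Return a list of data-quality issues found in one extract file."""
    issues: list[str] = []
    with path.open(newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        header = set(reader.fieldnames or [])
        missing = REQUIRED_COLUMNS - header
        if missing:
            issues.append(f"missing columns: {sorted(missing)}")
            return issues  # column-level checks below would be meaningless
        rows = 0
        blank_member_ids = 0
        for row in reader:
            rows += 1
            if not (row.get("member_id") or "").strip():
                blank_member_ids += 1
        if rows < MIN_EXPECTED_ROWS:
            issues.append(f"only {rows} data rows (expected at least {MIN_EXPECTED_ROWS})")
        if blank_member_ids:
            issues.append(f"{blank_member_ids} rows with blank member_id")
    return issues


if __name__ == "__main__":
    extract = Path(sys.argv[1]) if len(sys.argv) > 1 else Path("daily_claims_extract.csv")
    problems = check_extract(extract)
    if problems:
        print(f"{extract}: FAILED")
        for p in problems:
            print(f"  - {p}")
        sys.exit(1)  # non-zero exit lets an orchestrator flag the run
    print(f"{extract}: passed basic data-quality checks")
```

Exiting non-zero on any issue lets whatever orchestrator runs the extract (ADF, a scheduled task, etc.) flag the failure and notify end users, which is the pattern the responsibilities above describe.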
What does the interview process look like?
a. How many rounds? 3
b. Video vs. phone? Video
c. How technical will the interviews be? High

Thanks,
Bhuwan Tiwari
Sr. Technical Recruiter
SoniTalent Corporation | 5404 Merribrook Lane, Prospect, KY, USA
Cell: 859-946-4062 | [email protected]