| We are looking for a Data Platform Engineer with 10+ years of experience - Must be local to PA, SC, or MA - Remote role - Direct client requirement at Remote, Remote, USA |
| Email: [email protected] |
|
http://bit.ly/4ey8w48 https://jobs.nvoids.com/job_details.jsp?id=1104446&uid=

Hi Friends,

We are looking for a Data Platform Engineer with 10+ years of experience. If you have any matching candidates, please reply to me with resumes.

Role: Data Platform Engineer
Location: Remote; must live within an hour and a half of Carlisle PA, Greenville SC, or Boston MA

Top 3 Must-Haves:
- Azure Databricks (PySpark)
- Azure DevOps
- DataStage

Works with developers and IT staff to oversee code releases, combining an understanding of both engineering and coding. From creating and implementing software systems to analyzing data to improve existing ones, a DevOps Engineer increases productivity in the workplace.

Responsibilities:
- Utilize approved tools, adopt key performance indicators (KPIs), increase technology component reuse, and consolidate platforms, environments, and products with the goal of reducing overall IT costs.
- Convert and deploy all Data Platform Services into code-based artifacts using ARM templates or Bicep, with security and compliance built in.
- Proven solution design skills: able to script and automate a solution end-to-end (infrastructure and application layers).
- ETL in Azure Data Factory and Databricks.
- Support the planning and implementation of data design services, provide sizing and configuration assistance, and perform needs assessments.
- Monitor progress, manage risk, and ensure key stakeholders are kept informed about progress and expected outcomes.
- Working experience with Visual Studio, PowerShell scripting, and ARM templates.
- Conduct data profiling, cataloging, and mapping for technical design and construction of technical data flows.
- Security (best practices, containers, Linux hardening).
- Provision resources per security standards.
- Good understanding of Azure networking, mainly VNet, Subnet, Service Endpoints, Private Endpoints, NSG, and Azure Firewall.
- Help project teams ensure that all Azure resources communicate using managed identities, service principals, and RBAC roles.
- Understand connectivity to all data sources as part of data ingestion from multiple sources.

Technology: Azure Databricks, Azure Data Factory, Microsoft SQL, Azure RBAC, Azure CI/CD DevOps, Azure Kubernetes
Languages: Python, C#/C++/PowerShell, Bash, YAML

Qualifications:
- Proven track record with at least 4 years of experience in DevOps data platform development
- Strong understanding of DevOps concepts (Azure DevOps framework and tools preferred)
- Solid scripting skills in languages such as Python, Bash, JavaScript, or similar
- Solid understanding of monitoring/observability concepts and tooling
- Extensive experience with and strong understanding of cloud and infrastructure components
- Strong problem-solving and analytical skills, with the ability to troubleshoot complex DevOps platform issues and provide effective solutions
- Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders
- 4+ years of professional infrastructure and/or software development experience
- 3+ years of experience with AWS, GCP, Azure, or another cloud service (Azure preferred)

--
HARISH
IT Recruiter
[email protected]
Contact: 517-409-1222
https://www.linkedin.com/in/harish-naidu-3a9819243/
| 09:42 PM 09-Feb-24 |