Data Platform Engineer | Location: Remote (any visa is fine) | New York, USA
Email: [email protected]
Job links: http://bit.ly/4ey8w48 | https://jobs.nvoids.com/job_details.jsp?id=1104866&uid=

Role: Data Platform Engineer
Location: Remote (any visa is fine)
Immediate Interview

Top 3 Must-Haves:
- Azure Databricks (PySpark)
- Azure DevOps
- DataStage

Works with developers and IT staff to oversee code releases, combining an understanding of both engineering and coding. From creating and implementing systems software to analyzing data to improve existing systems, a DevOps Engineer increases productivity in the workplace.

Responsibilities:
- Utilize approved tools, adopt key performance indicators (KPIs), increase technology component reuse, and consolidate platforms, environments, and products with the goal of reducing overall IT costs.
- Convert and deploy all Data Platform Services into code-based artifacts using ARM templates or Bicep, with security and compliance built in.
- Proven solution design skills: able to script and automate a solution end to end (infrastructure and application layers).
- Build ETL in Azure Data Factory and Databricks (see the sketch after the Qualifications list).
- Provide support and advice to operational infrastructure teams to enable efficient operation and problem resolution of team infrastructures.
- Support the planning and implementation of data design services, provide sizing and configuration assistance, and perform needs assessments.
- Monitor progress, manage risk, and keep key stakeholders informed about progress and expected outcomes.
- Working experience with Visual Studio, PowerShell scripting, and ARM templates.
- Conduct data profiling, cataloging, and mapping for technical design and construction of technical data flows.
- Security (best practices, containers, Linux hardening); provision resources per security standards.
- Good understanding of Azure networking, mainly VNets, subnets, service endpoints, private endpoints, NSGs, and Azure Firewall.
- Help project teams ensure that all Azure resources communicate using managed identities, service principals, and RBAC roles.
- Understand connectivity to all data sources as part of data ingestion from multiple sources.

Technology: Azure Databricks, Azure Data Factory, Microsoft SQL, Azure RBAC, Azure DevOps CI/CD, Azure Kubernetes Service
Languages: Python, C#/C++/PowerShell, Bash, YAML

Qualifications:
- Proven track record with at least 4 years of experience in DevOps data platform development
- Strong understanding of DevOps concepts (Azure DevOps framework and tools preferred)
- Solid scripting skills in languages such as Python, Bash, JavaScript, or similar
- Solid understanding of monitoring/observability concepts and tooling
- Extensive experience with and strong understanding of cloud and infrastructure components
- Strong problem-solving and analytical skills, with the ability to troubleshoot complex DevOps platform issues and provide effective solutions
- Knowledge of developer tooling across the software development life cycle (task management, source code, building, deployment, operations, real-time communication)
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders
- 4+ years of professional infrastructure and/or software development experience
- 3+ years of experience with AWS, GCP, Azure, or another cloud service (Azure preferred)
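As a rough illustration of the Azure Databricks (PySpark) ETL work listed in the responsibilities, here is a minimal sketch. The storage path, column names, and target table are hypothetical placeholders, not details taken from this posting.

```python
# Minimal PySpark ETL sketch in the Databricks style.
# All paths, columns, and table names below are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files landed in ADLS Gen2 (assumed path) by an
# Azure Data Factory copy activity.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@<storage-account>.dfs.core.windows.net/orders/")
)

# Transform: deduplicate, enforce types, and drop invalid rows.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Load: write a Delta table for downstream consumers
# (Delta format is available on the Databricks runtime).
(
    curated.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("curated.orders")
)
```

On Databricks the `spark` session is already provided, and a job like this would typically be scheduled from Azure Data Factory or an Azure DevOps pipeline rather than run standalone.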
--
If I missed your call, email is the best way to reach me.

Regards,
Arja Narasimha Naidu (US IT Recruiter)
9 Com Technologies, Inc.
Email: [email protected]
p: (847) 348-1020 ext. 407
200 W Higgins Rd, Suite 302, Schaumburg, IL 60195
w: www.9comtech.com
LinkedIn: https://www.linkedin.com/in/arja-narasimha-naidu-36531123b