Fully Remote Senior Observability Data Pipeline Engineer (USC only) at Remote, USA
Email: [email protected]
From: Sonali, Kpg99 [email protected]
Reply to: [email protected]

Job Title: Fully Remote Senior Observability Data Pipeline Engineer (Vector.dev)
Duration: Long-term contract
Location: Remote
Rate: $53/hr on C2C
Project Name: AppDynamics Migration

Skill-Set:
Title: Vector.dev (10x) - a tool for building observability pipelines that collects, transforms, and routes logs and metrics
Level of Seniority/Years of Experience: Mid-Senior Level
Communication Importance: Important

Top 3-5 Skills:
Experience with Vector.dev.
Experience designing and managing data pipelines, specifically for observability migrations.
Experience with data pipeline technologies including Cribl, Confluent/Kafka, Airflow, and Apache NiFi, Vector.dev, or Fluentd.
Python.

Summary:
We are seeking a Senior Observability Data Pipeline Engineer to join our team. This role offers a unique opportunity to design, implement, and manage data pipelines that handle data from a variety of sources. You will ensure efficient data processing, routing, and delivery to support large-scale data analytics. If you have a strong background in data pipeline technologies and are passionate about working with large-scale data analytics platforms, we want to hear from you. You will play a crucial role in enabling automation engineers and solution engineers to perform their duties more effectively by ensuring that data flows seamlessly and is available when and where it's needed, facilitating better automation, monitoring, and solution implementation.

Key Responsibilities:
Design, implement, and manage data pipelines that handle data from diverse sources.
Ensure efficient data processing, routing, and delivery across various systems.
Work with large-scale data analytics platforms such as Splunk, the ELK stack, Databricks, and Snowflake.
Utilize and manage data pipeline technologies including Cribl, Confluent/Kafka, Airflow, and Apache NiFi.
Develop custom scripts and tools using Python and other relevant programming languages.
Collaborate with data engineering, analytics, and observability teams to optimize data workflows and infrastructure.
Monitor and troubleshoot data pipelines to ensure reliability and performance.
Stay current with the latest trends and best practices in data pipeline technologies and large-scale data analytics.
Provide guidance and mentorship to junior engineers and other team members.

Required Skills and Qualifications:
5 or more years of experience designing and managing data pipelines.
3 or more years of experience with data pipeline technologies such as Cribl, Confluent/Kafka, Airflow, Apache NiFi, Vector.dev, or Fluentd.
3 or more years of experience with Python.
1 or more years of experience with large-scale data analytics platforms such as Splunk, the ELK stack, Databricks, Snowflake, and/or ClickHouse.

Preferred Skills and Qualifications:
Solid understanding of data processing, routing, and delivery mechanisms.
Ability to work collaboratively with cross-functional teams.
Strong problem-solving skills and a proactive approach to identifying and addressing issues.
Excellent communication and documentation skills.
Proven ability to stay current with the latest technologies and best practices in data engineering and analytics.

Thanks & Regards,
Sonali Kumari
Technical Recruiter
KPG99, INC
Posted: 01:08 AM, 01-Mar-25