New Requirement On C2C - Data Engineer with DBT - 100% Remote - McLean, VA at McLean, Virginia, USA
Email: tarun.manukonda@srsconsultinginc.com
https://h1.nu/161Il https://jobs.nvoids.com/job_details.jsp?id=2071661&uid=

From: Manukonda Tarun Kumar, SRS CONSULTING INC tarun.manukonda@srsconsultinginc.com
Reply to: tarun.manukonda@srsconsultinginc.com

Hello,

Greetings from SRS Business Solutions. I am Manukonda Tarun Kumar. We have an immediate opening for the below position with one of our premium clients. If interested, please send me an updated resume and contact details to discuss the rate and role.

Role: Data Engineer
Location: McLean, VA (100% Remote)
Skills: Strong DBT and Python coding are a must; Snowflake, Looker, and AWS are needed.

Responsibilities:
- Develop data pipelines for streaming and batch data processing needs to move data in and out of the Snowflake data warehouse
- Collaborate with engineering and DevOps team members to implement, test, deploy, and operate data pipelines and ETL solutions
- Develop scripts to extract, load, and transform data, along with other utility functions
- Hands-on experience with Snowflake, including schema design, query optimization, and data load techniques
- Strong experience with DBT, including model development, testing, and documentation
- Familiarity with ETL/ELT tools and processes
- Troubleshoot issues such as data load problems and transformation/translation problems raised by team members, investigating and providing effective solutions
- Optimize data pipelines, ETL processes, and data integrations for large-scale data analytics use cases
- Ensure data quality, integrity, and governance standards are maintained in data processing workflows

Required Skills:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 7+ years of experience building and maintaining data pipelines and ETL/ELT processes in data-intensive organizations
- Design, build, and maintain scalable data pipelines using Snowflake and DBT
- Develop and manage ETL processes to ingest data from various sources into Snowflake
- Optimize and tune data pipelines for performance and cost efficiency
- Able to build data integrations and ingestion pipelines for streaming and batch data
- Competence in cloud database systems, including 3+ years with Snowflake
- Hands-on experience with data movement using Snowpipe, SnowSQL, etc.
- Strong coding skills with Python and SQL for manipulating and analyzing data
- Knowledge of data governance and data security best practices
- Hands-on experience with cloud platforms such as AWS and Google Cloud

Other Desired Skills:
- Minimum 5 years of designing and implementing operational, production-grade, large-scale data pipelines, ETL/ELT, and data integration solutions
- Hands-on experience with productionized data ingestion and processing pipelines
- Strong understanding of Snowflake internals and integration of Snowflake with other data processing and reporting technologies
- Proficiency with Kafka, AWS S3, SQS, Lambda, Pub/Sub, AWS DMS, and Glue
- Experience working with structured, semi-structured, and unstructured data

Preferred Skills:
- Background in healthcare data, especially patient-centric clinical data and provider data, is a plus
- Familiarity with API security frameworks, token management, and user access control, including OAuth, JWT, etc.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field

Thanks & Regards,
Tarun Kumar Manukonda
Address: 39465 Paseo Padre Pkwy, Suite 3200, Fremont, CA 94538
Email: tarun.manukonda@srsconsultinginc.com
LinkedIn: https://www.linkedin.com/in/manukonda-tarun-kumar-983576261/
Website: www.srsconsultinginc.com
10:45 PM 10-Jan-25