W2 Roles ONLY - Not for Benchsales Recruiters - Only Independent Consultants | Remote, USA
Email: [email protected] |
Hi All (Not for Benchsales),

Hope you are doing well. We are currently looking for Independent Consultants with independent visas for our W2 roles. The rates mentioned are non-negotiable. If interested, please contact me at [email protected]. W2 only (independent consultant requirement) - no OPT/CPT. Please share qualified consultants.

1. Python Developer (US Citizens only)
Location: Remote
Salary: $110k per annum (Jr level, 6 yrs); $120k per annum (Mid level, 8+ yrs)

Requirements:
- Bachelor's degree in Computer Science or a related field.
- Hands-on programming experience in Python.
- Hands-on experience with self-testing code (unit tests) and building systems.
- Emerging ability to lead and influence a team's technical direction.
- Hands-on experience with database and cloud technologies.
- Emerging ability to design resilient software components within a distributed system.
- Our core stack is Python, migrating from system software in C and legacy PHP in some places; as part of our move to the cloud, we're running Kubernetes and AWS technologies such as EKS, Lambda, RDS, Aurora, and DynamoDB.

2. Data Engineer
Salary: Mid-level (7+ yrs) - $120k per annum; Senior (9+ yrs) - $130k per annum; Principal (11+ yrs) - $140k per annum
Location: Fort Mill or Austin (Hybrid)

We are looking for a skilled Data Engineer to join our team and help build robust, scalable, and efficient data pipelines. The ideal candidate will have strong expertise in AWS, Python, Spark, ETL pipelines, SQL, and Pytest. This role involves designing, implementing, and optimizing data pipelines to support analytics, business intelligence, and machine learning initiatives.

Key Responsibilities:
- Design, develop, and maintain ETL pipelines using AWS services, Python, and Spark.
- Optimize data ingestion, transformation, and storage processes for high-performance data processing.
- Work with structured and unstructured data, ensuring data integrity, quality, and governance.
- Develop SQL queries to extract and manipulate data efficiently from relational databases.
- Implement data validation and testing frameworks using Pytest to ensure data accuracy and reliability.
- Collaborate with data scientists, analysts, and software engineers to build scalable data solutions.
- Monitor and troubleshoot data pipelines to ensure smooth operation and minimal downtime.
- Stay up to date with industry trends, tools, and best practices for data engineering and cloud technologies.

Required Skills & Qualifications:
- Experience in Data Engineering or a related field.
- Strong proficiency in AWS (S3, Glue, Lambda, EMR, Redshift, etc.) for cloud-based data processing.
- Hands-on experience with Python for data processing and automation.
- Expertise in Apache Spark for distributed data processing.
- Solid understanding of ETL pipeline design and data warehousing concepts.
- Proficiency in SQL for querying and managing relational databases.
- Experience writing unit and integration tests using Pytest.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work in a fast-paced environment.

Preferred Qualifications:
- Experience with Terraform, Docker, or Kubernetes.
- Knowledge of big data tools such as Apache Kafka or Airflow.
- Exposure to data governance and security best practices.

3. Product & AI/ML Data Analyst
Salary: $130k per annum
Location: Remote

Key Responsibilities:
- Conduct product analytics and user behavior analysis to drive data-backed decisions.
- Develop and maintain dashboards, reports, and data visualizations using Qlik Sense, Tableau, or similar tools.
- Apply AI/ML models to analyze trends, optimize product performance, and drive automation.
- Define and track product success metrics, KPIs, and A/B testing experiments.
- Leverage statistical analysis and predictive modeling to identify patterns and trends.
- Provide business intelligence insights that impact product growth, customer engagement, and retention.
- Collaborate with Product Managers, Data Scientists, and Business Leaders to deliver actionable insights.
- Ensure data integrity, consistency, and governance across reporting and analysis efforts.
- Stay up to date with emerging AI/ML trends and analytics best practices.

Required Skills & Qualifications:
- 7+ years of experience in data analysis, product analytics, and AI/ML-driven insights.
- Strong SQL skills for querying and analyzing large datasets (Presto SQL, BigQuery, etc.).
- Proficiency in BI tools such as Qlik Sense, Tableau, Amplitude, or Data Studio.
- Python experience for statistical modeling and AI/ML-driven analysis.
- Experience in A/B testing, experimental design, and data instrumentation.
- Knowledge of predictive modeling, clustering, and classification techniques.
- Ability to transform complex data into simple, compelling insights for business stakeholders.
- Experience working with large-scale datasets and structuring unstructured data for analysis.
- Strong communication skills to present data findings and drive decision-making.
- Experience in a scripting language to automate and reuse code (optional but encouraged).

Preferred Qualifications:
- Prior experience in fintech, e-commerce, SaaS, or AI-driven product analytics.
- Familiarity with data-driven decision-making in a product-led organization.
- Strong understanding of user segmentation, customer journey analysis, and engagement metrics.
- Hands-on experience in prompt engineering to accelerate AI-driven development.

4. Principal Data Architect
Salary: $160k per annum
Location: Austin / Fort Mill / New York City / San Diego
(must be local or willing to come to one of these locations 3 days a week)

Job Overview:
The Principal Data Architect is responsible for leading, analyzing, and designing the Enterprise Data Architecture (EDA) for the Mergers and Acquisitions (M&A) group, which supports our extraordinary company growth projections and digital experiences vision.

Responsibilities:
- Lead Enterprise Data Architecture modeling and design, leveraging cross-domain architects and leadership to create the best technical design delivering domain-driven, scalable, services-based solutions that eliminate fate sharing across teams supporting M&A.
- Blueprint the data architecture, including multi-year transitions and recommendations to drive transformation and modernization for M&A and the enterprise.
- Collaborate with business, product, and technology teams early in the M&A and product lifecycle to infuse end-to-end architecture thinking, including functional and non-functional aspects, into the conversation. Craft a high-level design early in the lifecycle and elaborate it to lower levels of detail as the conversation progresses.
- Collaborate with Lean Portfolio Management to provide a high-level vision of enterprise data solutions and development initiatives.
- Assist the Agile Program Management Office (ACDC) in identifying and designing development value streams.
- Participate in the strategy for building and maintaining the modern enabling data architectural runway initiatives via Enabler Epics.
- Promote Continuous Delivery Pipeline and DevSecOps capabilities.
- Facilitate the reuse of code, components, and proven patterns supporting data and modern architecture.
- Deliver data architecture in alignment with our Enterprise Architecture standards and advocate for Enterprise Architecture models and patterns.
- Apply enterprise-level data protection practices, including developing, deploying, and auditing products and services through privacy by design, privacy engineering, risk models, frameworks, and more.
- Formulate and execute the organization's data replication strategy.
- Drive the cloud migration strategy by engaging in an application rationalization process to identify the most suitable 7R approach for migration.

What are we looking for?
We want strong collaborators who can deliver a world-class client experience. We are looking for people who thrive in a fast-paced environment, are client-focused and team-oriented, and can execute in a way that encourages creativity and continuous improvement.

Requirements:
- 10+ years of hands-on design and implementation experience in the data architecture landscape.
- Previous experience migrating legacy platforms to a modern data platform.
- 5+ years of experience designing and building SQL, NoSQL, big data, GraphDB, API, Kafka, and streaming solutions.
- 3+ years of experience in the following technical disciplines: application development, middleware, database management or operations, security, and integration.

Core Competencies:
- Proficiency in defining current- and future-state data architecture models.
- In-depth knowledge of cloud data platforms: AWS (primary) and Azure (secondary). At least one cloud certification is required.
- Experience and expertise in designing, implementing, or maintaining the following: Master Data Management solutions, Data Marketplace solutions, Data Science platforms and productizing data, and enterprise analytics solutions.
- Experience/exposure with one or more neural network, machine learning, deep learning, and NLP tools and frameworks.
- Experience migrating monolithic on-prem applications to the cloud using refactoring, rearchitecting, and re-platforming approaches, including 12-factor. At least one industry-recognized tool/framework/certification is required.
- Experience in DevOps tools, automated deployments, and containerization. At least one industry-recognized tool/framework/certification is required.
- Excellent communication skills to translate business problems into technical solutions.
- Consult with customers to solve business problems by adapting current technology and/or capitalizing on emerging technology for competitive advantage to meet the strategic objectives of M&A.
- Conduct demonstrations and presentations of prototypes and proofs of concept to executive management.
- Understanding of agile methods (SAFe) and processes, and the capability to support agile product teams by providing advice and guidance on opportunities, impact, and risks, taking account of technical and architectural debt.

Preferences:
- More than 5 years of experience designing and building SQL, NoSQL, big data, GraphDB, API, Kafka, and streaming solutions is highly preferred.
- Previous proven data architecture experience, including contributing to and/or progressing the following: Enterprise Data Modeling; Information Architecture Modeling; Information Analysis Modeling; Physical Data Modeling; Enterprise Data Governance functions and frameworks (standards, policies, processes, QA); Enterprise Data Technology Landscape; Enterprise Information Analysis Maps; reference, master, metadata, and document practices; data virtualization; data streaming; Operational Data Store; Data Warehouse; and data management practice, including stewardship and lifecycle management.
- AWS Certification
- DevOps Certification
- Application Portfolio Rationalization Certification
- Data Security Certification

5. Sr Java Backend Developer
Salary: $120k per annum
Location: St Louis, MO (Hybrid)

Requirements:
- Strong Java developer with a very good understanding of microservices.
- Grasp of software engineering skills in modular design, data structures, and algorithms.
- Deep knowledge of and hands-on experience with modern application frameworks like Spring Boot and Angular.
- Experience building and operating critical production systems.
- Solid understanding of modern API design and RESTful principles.
- Fluent with Git (preferred) or another SCM system.
- Write well-designed and testable code.
- Strong experience in automation and build tools like Maven, Ant, and Gradle.
- Bachelor's degree in computer science, engineering, or a related discipline, or equivalent work experience.
- Experience in software development, design, and implementation of large-scale distributed systems and web services, building complex software that is testable and designed for extensibility.
- Good understanding of building, deploying, and maintaining critical applications in a cloud-based environment.
- Work extensively with open-source software; able to modify or extend code maintained as part of an open-source project.
- Employ both object-oriented development skills and systems engineering skills.

Skills Required: Java, Spring Boot, Microservices, REST API, PCF, Oracle, PostgreSQL, Jenkins, Kafka

With Regards,
[email protected]
[email protected] View All |
Posted: 10:53 PM, 28-Feb-25