| Achyutha Pranavi - Sr. AI/ML Engineer | Data Analyst |
| [email protected] |
| Location: Jersey City, New Jersey, USA |
| Relocation: REMOTE |
| Visa: GC |
| Resume file: AchyuthaPranavi_SeniorAI-ML_Engineer_DataAnalyst_Resume_2025_1758117445788.docx |
Name: ACHYUTHA PRANAVI K
Role: Senior AI/ML Engineer | Data Scientist | Generative AI / LLM Specialist | Data Analyst Email address: [email protected] Contact: +1 (347) 921-0136 LinkedIn: http://www.linkedin.com/in/achyutha-p-89a3b5216 Professional Summary Senior AI/ML Engineer & Data Analyst with 11+ years of experience designing and deploying AI, ML, analytics, and cloud-native solutions across finance, healthcare, retail, insurance, and telecom sectors. Experienced in AI solution architecture, GenAI/LLM engineering, and agentic AI (LangGraph, AutoGen, Semantic Kernel). Skilled in GCP Vertex AI, BigQuery, and multi-cloud deployments (Azure, AWS, GCP). Proven leadership in guiding AI/ML teams, mentoring engineers, and aligning technical roadmaps with enterprise goals. Strong programming foundation in Python (Advanced), SQL, PySpark, R, and Bash, with additional working knowledge of Java, JavaScript, Scala, and SAS. Proven expertise in cloud platforms Microsoft Azure (ML, OpenAI, Databricks, Synapse, Cognitive Search), AWS (SageMaker, Redshift, Glue, Lambda, QuickSight), and GCP (Vertex AI, BigQuery, Looker, Dataflow), along with Snowflake and Databricks for large-scale data and AI workloads. Hands-on experience leveraging Azure OpenAI (GPT models, embeddings, function calling) for enterprise-scale NLP, summarization, and clinical/financial document automation. Skilled in AI/ML frameworks including PyTorch, TensorFlow, Keras, Hugging Face Transformers, scikit-learn, and XGBoost, with practical experience building deep learning, forecasting, and NLP pipelines. Advanced practitioner in LLMs, Generative AI, and Agentic AI hands-on with LangChain, LangGraph, CrewAI, AutoGen, Semantic Kernel, MCP, function calling, and multi-agent workflows. Delivered RAG pipelines using FAISS, Pinecone, Azure Cognitive Search, and embeddings, improving retrieval accuracy and reducing hallucinations in financial and healthcare use cases. 
- Applied NLP techniques (GPT, BERT, LLaMA, Cohere) for summarization, sentiment analysis, entity recognition, multilingual NLP, translation consistency, and clinical text mining.
- Experienced in MLOps & DevOps: MLflow, Docker, Kubernetes (AKS/EKS), Azure DevOps, GitHub Actions, Airflow, Weights & Biases, Prometheus, Grafana, and Azure Monitor to manage full ML lifecycles, CI/CD automation, real-time inference pipelines, and model retraining.
- Background in data engineering & ETL with PySpark, pandas, Azure Data Factory, AWS Glue, Delta Lake, dbt, Alteryx, and Apache Airflow; optimized feature pipelines and ensured high-quality datasets for downstream ML tasks.
- Strong BI & analytics experience: Power BI, Tableau, QuickSight, Matplotlib, and Seaborn for executive dashboards, real-time KPI tracking, and campaign performance measurement.
- Applied advanced analytics, including Marketing Mix Modeling (MMM), ROI frameworks, cohort analysis, segmentation, and statistical testing, to improve campaign targeting (+23%) and revenue uplift (+12%).
- Designed time-series forecasting models (ARIMA, Prophet, Neural Prophet, LSTM) for retail demand planning, healthcare risk prediction, and financial trend analysis.
- Hands-on with databases and data warehousing (Snowflake, Redshift, SQL Server, PostgreSQL, MySQL, Oracle), including query optimization, modeling, and graph-based analytics with Neo4j and NetworkX.
- Implemented Responsible AI practices using SHAP, LIME, fairness metrics, and Azure AI Content Safety to ensure explainability, compliance, and governance in regulated domains.
- Exposure to GPU acceleration (CUDA, performance benchmarking), synthetic data (Gretel.ai, data augmentation, differential privacy), and emerging frameworks (Spring AI, Agno) for next-gen AI development.
- Experienced in RLHF, RAHF, PEFT/LoRA fine-tuning, and reward modeling to improve model alignment, relevance, and accuracy.
- Delivered business impact: reduced false claims by 30% (HCA Healthcare), automated compliance/audit workflows (-40% prep time), and cut analyst review time by 40% at Jefferies with LLM-powered summarization.
- Collaborative leader, mentor, and cross-functional partner, known for translating complex AI/ML concepts into actionable business insights and training teams on MLOps, GenAI, and responsible AI practices.

Technical Skills
Programming & Core Technologies: Python (Advanced), SQL, PySpark, R, Bash, Linux/Unix, SAS (Analytics & Actuarial Modeling)
Cloud Platforms: Microsoft Azure (ML, Azure ML Studio, Databricks, Synapse, Cognitive Search, OpenAI, Application Insights), AWS (SageMaker, Redshift, S3, Lambda, Athena, Glue, QuickSight, CloudWatch), GCP (Vertex AI, BigQuery, Looker, Dataflow, GCP AI/ML APIs), Snowflake
Data Visualization & BI: Power BI, Tableau, Amazon QuickSight, Matplotlib, Seaborn, Plotly, statsmodels
Big Data & ETL: PySpark, pandas, Azure Data Factory, AWS Glue, Delta Lake, dbt, Alteryx, Apache Airflow, Feature Stores (Feast, Delta Lake)
Databases & Warehousing: Snowflake, Amazon Redshift, SQL Server, PostgreSQL, MySQL, Oracle, Data Modeling, Query Optimization, Neo4j, NetworkX, Graph Embeddings, Knowledge Graphs
Analytics, Forecasting & ROI: Marketing Mix Modeling (MMM), ROI Measurement Frameworks, Advanced Statistics, A/B Testing, Cohort Analysis, KPI Development, Customer Segmentation, Market Research, ARIMA, Prophet, Neural Prophet, LSTM, Demand Forecasting
AI/ML Frameworks: PyTorch, TensorFlow, Keras, Hugging Face Transformers, scikit-learn, XGBoost, LangChain, Semantic Kernel, LangGraph, AutoGen, CrewAI
NLP, LLMs & GenAI:
  Frameworks & Tools: LangChain, LangGraph, AutoGen, CrewAI, Semantic Kernel, Azure OpenAI, Hugging Face Transformers
  Models & Techniques: GPT, BERT, LLaMA, Cohere, RAG (FAISS, Pinecone, Azure Cognitive Search), embeddings, semantic search, prompt engineering, multi-agent systems, function calling, MCP (Model Context Protocol)
  Training & Alignment: LoRA/PEFT, RLHF, RAHF, Reward Modeling, Prompt Tuning
  Applications: Text summarization, sentiment analysis, entity recognition, multilingual NLP, translation consistency checks, localization QA, clinical NLP
MLOps & DevOps: MLflow, Weights & Biases (W&B), Docker, Kubernetes (AKS/EKS), Azure DevOps, GitHub Actions, Airflow, Prometheus, Grafana, Azure Monitor, CI/CD Automation (YAML pipelines), Terraform, Real-Time Inference Pipelines, KServe, Kubeflow Pipelines, Vertex AI Pipelines
Responsible AI & Privacy: SHAP, LIME, Fairness Metrics, Azure AI Content Safety, HIPAA & Financial Compliance
API Development & Model Serving: FastAPI, Flask, Pydantic (LLM APIs, real-time inference, compliance systems)
Synthetic Data & Privacy: Gretel.ai, Differential Privacy, Data Augmentation, Synthetic Data Generation
Architecture & Leadership: AI/ML Solution Architecture, GenAI Roadmap Planning, AI Team Mentorship, Enterprise AI Strategy
Exposure / Working Knowledge:
  Programming & Scripting: Java (basic), JavaScript (basic), Scala (basic)
  Agentic AI (POC-level): Spring AI, Agno
  GPU & Performance: CUDA (GPU utilization optimization, throughput benchmarking)
  CRM/Business AI: Salesforce Einstein/CRM integration (LLM-based triggers & API workflows)
  Prototyping Tools: Streamlit (basic), Pydantic (data validation in FastAPI)

Education
Master of Science in Computer Science - University of Central Missouri (Aug 2011 - Dec 2012)
Bachelor of Science in Computer Science - Lovely Professional University (Aug 2007 - Jun 2011)

Certifications
Microsoft Certified: Azure AI Engineer Associate (2023)
AWS Certified Machine Learning - Specialty (2021)

Work Experience

Client: Jefferies Financial Group Inc, New York, NY | May 2024 - Present
Role: Senior AI/ML Engineer
Responsibilities:
- Led Generative AI initiatives on Azure, designing multi-agent LLM workflows with LangChain, LangGraph, CrewAI, and Azure OpenAI for financial research and compliance.
- Built production-grade RAG pipelines with FAISS, Azure Cognitive Search, and hybrid retrieval to cut SEC filing review time by 40%.
- Applied advanced prompt engineering and RLHF/RAHF to align LLM outputs with financial compliance guidelines.
- Developed containerized LLM APIs (FastAPI, Flask, Docker, Kubernetes/AKS) for low-latency inference in trading and audit workflows.
- Used MLflow and Azure DevOps CI/CD for experiment tracking, model versioning, and automated deployment pipelines.
- Benchmarked Azure ML vs. GCP Vertex AI + BigQuery to optimize multi-cloud portability and reduce infrastructure costs.
- Prototyped agent-to-agent (A2A) communication flows using MCP (Model Context Protocol), improving explainability and governance.
- Deployed synthetic data workflows with Gretel.ai to handle imbalanced datasets while maintaining data privacy.
- Deployed ML workloads on AKS with CI/CD and MLflow tracking.
- Implemented Neo4j and NetworkX graph analytics for fraud detection and financial entity linkage.
- Designed time-series forecasting models (ARIMA, Prophet, LSTM) for asset risk prediction and financial trend analysis.
- Built BI dashboards in Power BI and Azure Monitor, with Prometheus and Grafana for model drift detection, latency monitoring, and compliance reporting.
- Explored Cursor AI, Spring AI, and Agno in POCs for orchestration, reducing analyst workload by 65% and error rates by 30%.
- Applied parameter-efficient fine-tuning (LoRA/PEFT) for domain-specific model customization without heavy compute costs.
- Championed Responsible AI by embedding SHAP, LIME, fairness metrics, and Azure AI Content Safety into the ML lifecycle.
- Mentored junior engineers on GenAI, MLOps best practices, and multi-cloud AI architectures, strengthening internal capabilities.
- Led GenAI architecture design for enterprise-wide AI systems, aligning deployments with C-suite strategy and compliance requirements.
- Built agentic workflows using LangGraph and AutoGen, enabling multi-agent orchestration for financial document parsing and compliance automation.
- Deployed LLM APIs with FastAPI on Azure and GCP, containerized with Docker/Kubernetes and tracked via MLflow.
- Mentored AI engineers on best practices for RAG, LLM tuning, and scalable MLOps workflows.

Client: HCA Healthcare Inc, Nashville, TN | Nov 2022 - Apr 2024
Role: AI/ML Engineer
Responsibilities:
- Built fraud detection models in claims processing, reducing false claims by 30% using Python, XGBoost, and Azure ML.
- Designed patient risk stratification pipelines on Azure Databricks with PySpark, improving early detection for high-risk patients.
- Fine-tuned BERT-based LLMs for clinical note classification, raising medical coding accuracy by 18%.
- Integrated social determinants of health (SDoH) with patient records to enrich risk models.
- Developed secure ETL pipelines with Azure Data Factory and Delta Lake for healthcare data ingestion and cleaning.
- Leveraged Azure AutoML and MLflow to automate hyperparameter tuning and model tracking.
- Applied explainability frameworks (SHAP, LIME) to ensure clinician trust and HIPAA compliance.
- Prototyped clinical Q&A assistants using LangChain and Azure OpenAI for compliance-ready retrieval systems.
- Implemented multi-agent workflows for physician Q&A and note validation, reducing audit prep time by 40%.
- Containerized ML models with Docker and Kubernetes (AKS) and integrated them into Azure DevOps CI/CD.
- Built real-time monitoring dashboards in Power BI and Azure Monitor for accuracy, drift, and compliance metrics.
- Explored early adoption of agentic AI (LangChain, Cursor AI) for clinical document parsing and automation.
- Applied transfer learning to diagnostic imaging, improving anomaly detection accuracy in radiology use cases.
- Established data governance practices (lineage tracking, secure access controls) for HIPAA compliance.
- Collaborated with clinicians and compliance teams to translate medical expertise into ML-ready features and success metrics.

Client: Target Corp, Minneapolis, MN | Jan 2019 - Oct 2022
Role: Senior Data Analyst
Responsibilities:
- Designed and implemented AWS data pipelines (Redshift, S3, Glue, Athena) for transactions across 1,800+ stores, improving reporting speed by 35%.
- Delivered insights using SQL, SAS, and Python (pandas, NumPy) for customer behavior, conversion, and clickstream analysis.
- Built dashboards in Power BI, Tableau, and QuickSight, tracking inventory, supply chain, and sales KPIs.
- Partnered with marketing/e-commerce teams, improving campaign targeting by 23%.
- Applied forecasting models (ARIMA, Prophet, LSTM, regression) to optimize pricing and demand planning, contributing to a 12% revenue uplift.
- Automated recurring reports using Python scripting, AWS Lambda, and Alteryx ETL, cutting manual effort by 40%.
- Managed data modeling projects with dbt and Snowflake, improving metric consistency and data integrity.
- Enriched analytics by integrating external datasets (Nielsen, market data, social sentiment).
- Drove data governance with AWS Lake Formation, implementing lineage tracking and compliance controls.
- Implemented CI/CD pipelines (Git, AWS CodeCommit, Jenkins) for analytics workflows.
- Partnered with DevOps to deploy monitoring solutions via AWS CloudWatch, improving system reliability.
- Conducted A/B testing and statistical analysis, enabling data-driven product and marketing decisions.

Client: Allstate, Northbrook, IL | Sep 2015 - Dec 2018
Role: Data Analyst
Responsibilities:
- Conducted insurance data analysis (policyholder behavior, claims, underwriting) to support pricing and risk models.
- Leveraged AWS services (S3, Redshift, Athena, Glue, Lambda, EC2, IAM, CloudWatch) for large-scale insurance data processing.
- Built dashboards in Tableau, Power BI, and SAS Visual Analytics, improving KPI and claims visibility.
- Designed ETL pipelines (AWS Glue, Python, SQL) to automate CRM and actuarial data ingestion.
- Partnered with actuaries to build predictive models in SAS and Python (scikit-learn, pandas) for churn and fraud detection, reducing fraud by 18%.
- Applied SAS procedures (PROC REG, PROC LOGISTIC, PROC GLM) for actuarial modeling and segmentation.
- Optimized SQL queries in Redshift, improving reporting performance by 30%.
- Conducted cohort analysis and segmentation with SAS and SQL, improving cross-sell campaigns.
- Implemented data governance and quality checks in SAS for HIPAA compliance.
- Applied statistical anomaly detection in SAS and SQL to flag fraudulent claims.
- Collaborated with DevOps to deploy analytics workflows securely with AWS IAM, EC2, and CloudWatch.
- Trained junior analysts on SAS, SQL, and Tableau, supporting Allstate's transition to cloud-native Python/AWS analytics.

Client: Ooma Inc, Sunnyvale, CA | Feb 2013 - Aug 2015
Role: Data Analyst
Responsibilities:
- Partnered with engineering, operations, and marketing teams to analyze telecom datasets, improving efficiency by 20%+.
- Built dashboards in Tableau and Power BI, providing executives visibility into churn and call quality.
- Conducted customer behavior and churn analysis with Python (pandas, NumPy) and R, shaping retention strategies.
- Developed predictive models (logistic regression, decision trees) to forecast call drop rates and optimize network allocation.
- Built ETL pipelines (SQL Server, Oracle, MySQL, Python), standardizing telecom analytics workflows.
- Applied data validation/cleansing routines, increasing reporting accuracy by 30%.
- Performed market segmentation analysis, identifying demographics to support pricing/acquisition strategies.
- Designed A/B testing frameworks to evaluate feature adoption, guiding product development.
- Conducted VoIP and call quality analytics, detecting bottlenecks and improving reliability.
- Supported database migration projects, validating integrity and compliance with standards.
- Produced trend analyses for business reviews, highlighting risks and growth opportunities.
- Delivered training sessions on BI dashboards, fostering a data-driven culture across teams.