
Big data engineer || San Jose, CA - Onsite at San Jose, California, USA
Email: [email protected]
Hi,

Hope you are doing well!

This is Divya from Siri Info Solutions. I have some urgent
requirements with one of my clients. Please go through the job description and
let me know your interest. In case you are not interested, it would be nice to
let friends/colleagues who may be a potential fit know of this position.

COMPLETE JOB DESCRIPTION IS BELOW FOR YOUR REVIEW:

Job title: Cloud System Engineer

Location: San Jose, CA- onsite

Hire Type: Contract

Only H-1B & TN consultants with passport number.

Skills, Roles and Responsibilities

(Google/AWS/Azure public cloud, PySpark, BigQuery, and
Airflow on Google Cloud)

Participate in 24x7x365 SAP Environment rotational shift
support and operations

As a team lead you will be responsible for maintaining the
upstream Big Data environment day in and day out, through which millions of
financial records flow; it consists of PySpark, BigQuery, Dataproc, and Airflow.

You will be responsible for streamlining and tuning
existing Big Data systems and pipelines and building new ones. Making sure the
systems run efficiently and with minimal cost is a top priority

Manage the operations team in your respective shift; you
will be making changes to the underlying systems.

This role involves providing day-to-day support, enhancing
platform functionality through DevOps practices, and collaborating with
application development teams to optimize database operations.

Architect and optimize data warehouse solutions using
BigQuery to ensure efficient data storage and retrieval.

Install/build/patch/upgrade/configure big data
applications

Manage and configure BigQuery environments, datasets, and
tables.

Ensure data integrity, accessibility, and security in the
BigQuery platform.

Implement and manage partitioning and clustering for
efficient data querying.
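To illustrate why this responsibility matters, here is a minimal conceptual sketch (not the client's actual setup, and not the BigQuery API itself): with a partitioned table, a query that filters on the partition key reads only the matching partitions rather than scanning the whole table, which is the main way partitioning cuts query cost. The field names and sample rows are hypothetical.

```python
# Conceptual illustration of partition pruning, assuming date-partitioned data.
from collections import defaultdict

def build_partitioned_table(rows, key):
    """Group rows into partitions by a key function (e.g. event date)."""
    partitions = defaultdict(list)
    for row in rows:
        partitions[key(row)].append(row)
    return partitions

def query_partition(partitions, partition_value):
    """Scan only the requested partition; other partitions are never read."""
    return partitions.get(partition_value, [])

rows = [
    {"date": "2025-02-20", "amount": 10},
    {"date": "2025-02-21", "amount": 25},
    {"date": "2025-02-21", "amount": 5},
]
table = build_partitioned_table(rows, key=lambda r: r["date"])
hits = query_partition(table, "2025-02-21")  # reads 2 rows, not all 3
```

In BigQuery itself the same idea is expressed with a `PARTITION BY` clause at table creation plus a `WHERE` filter on the partition column; clustering further orders data within each partition.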

Define and enforce access policies for BigQuery datasets.

Implement query usage caps and alerts to avoid unexpected
expenses.
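A usage cap like the one described above can be sketched as a pre-flight cost check: estimate the cost of a query from its bytes-to-be-scanned (in BigQuery this figure comes from a dry run) and alert or refuse when it exceeds a budget. The per-TiB rate below is an assumption for illustration only; verify it against current GCP pricing.

```python
# Hedged sketch of a query usage-cap check, assuming on-demand per-TiB pricing.
PRICE_PER_TIB_USD = 6.25  # assumed rate; check current GCP pricing before use
TIB = 2**40               # bytes in one tebibyte

def estimated_cost_usd(bytes_processed: int) -> float:
    """Estimate on-demand query cost from bytes scanned."""
    return bytes_processed / TIB * PRICE_PER_TIB_USD

def exceeds_cap(bytes_processed: int, cap_usd: float) -> bool:
    """True when a dry-run estimate is over budget, so an alert should fire."""
    return estimated_cost_usd(bytes_processed) > cap_usd
```

With these numbers, a 5 TiB scan against a $25 cap would trigger an alert, while a 1 TiB scan would not. BigQuery also supports enforcing this server-side via custom quotas and per-query maximum bytes billed.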

Should be very comfortable troubleshooting issues and
failures on Linux-based systems, with a good grasp of the Linux command
line.

Create and maintain dashboards and reports to track key
metrics such as cost and performance.

Integrate BigQuery with other Google Cloud Platform (GCP)
services like Dataflow, Pub/Sub, and Cloud Storage.

Enable BigQuery access through tools such as Jupyter
notebooks, Visual Studio Code, and other CLIs.

Implement data quality checks and data validation
processes to ensure data integrity.
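Data quality checks of the kind described above often reduce to row-level validation before load: required fields present, no nulls in key columns, values within range. The sketch below is illustrative only; the field names (`id`, `amount`, `currency`) are hypothetical, not from the client's schema.

```python
# Hedged sketch of row-level data-quality validation for a pipeline batch.
def validate_row(row: dict) -> list:
    """Return a list of validation errors for one row (empty list = valid)."""
    errors = []
    for field in ("id", "amount", "currency"):  # assumed required fields
        if row.get(field) is None:
            errors.append(f"missing field: {field}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount must be non-negative")
    return errors

def validate_batch(rows):
    """Split a batch into valid rows and (row_index, errors) failures."""
    valid, failed = [], []
    for i, row in enumerate(rows):
        errs = validate_row(row)
        if errs:
            failed.append((i, errs))
        else:
            valid.append(row)
    return valid, failed

valid, failed = validate_batch([
    {"id": 1, "amount": 9.99, "currency": "USD"},
    {"id": 2, "amount": -3, "currency": "USD"},
    {"id": 3, "currency": "EUR"},
])
```

In practice the same checks would typically run as an Airflow task or as SQL assertions against staging tables before data is promoted.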

Manage and monitor data pipelines using Airflow and CI/CD
tools (e.g., Jenkins, Screwdriver) for automation.

Collaborate with data analysts and data scientists to
understand data requirements and translate them into technical solutions.

Provide consultation and support to application development
teams for database design, implementation, and monitoring.

Proficiency in Unix/Linux OS fundamentals, Perl/Python
scripting, and Ansible for automation.

Disaster Recovery & High Availability

Expertise in planning and coordinating disaster recovery,
including backup/restore operations.

Experience with geo-redundant databases and Red Hat
clustering.

Accountable for ensuring that delivery is within the
defined SLA and agreed milestones (projects) by following best practices and processes
for continuous service improvement.

Work closely with other support organizations (DB, Google,
PySpark data engineering, and Infrastructure teams).

Incident Management, Change Management, Release Management,
and Problem Management.

Desirable Skills:

Proven experience with cloud platforms such as AWS and
Google Cloud.

Should oversee cloud-based systems, including virtual
machines, storage solutions, and networking components, ensuring optimal
performance and scalability.

Should regularly monitor system performance, troubleshoot
issues, and implement necessary updates to maintain system integrity and
availability.

Able to implement and manage security measures to protect
cloud infrastructure, ensuring compliance with industry standards and
organizational policies.

Should control and manage user access permissions to cloud
resources, ensuring appropriate levels of access and data security.

Should develop and implement automation scripts to
streamline cloud operations and optimize resource utilization.

Should work closely with IT teams, including developers and
security professionals, to design and implement cloud solutions that meet
organizational needs.

Strong understanding of cloud security principles and best
practices.

Proficiency in scripting languages like Python, PowerShell,
or similar.

Excellent problem-solving skills and attention to detail.

Strong communication and collaboration abilities.

Regards

Divya Pandey

LinkedIn: linkedin.com/in/divya-pandey-5345ba218

Mail Id: [email protected]

10:05 PM 21-Feb-25

