
Data Bricks Developer at 100% Remote (Remote, USA)
Email: [email protected]
Job posting links:
http://bit.ly/4ey8w48
https://jobs.nvoids.com/job_details.jsp?id=2459213&uid=72b8c7fa1db34253a1fd04089c8592c4

From:

Rajeev Jeevan,

Cloud Security Web

[email protected]

Reply to: [email protected]

Job Title: Data Bricks Developer.
Location: 100% Remote.
Experience: 7+ years

Note: Mandatory skill: PySpark experience

Job Description

We are seeking an experienced DataBricks Developer with a strong background in PySpark to join our remote team supporting Farmcredit. The ideal candidate will have at least 7 years of hands-on experience in data engineering, with a focus on building, optimizing, and maintaining scalable big data solutions using DataBricks and PySpark.

Key Responsibilities

Design, develop, and optimize large-scale data pipelines and ETL processes using DataBricks and PySpark.

Collaborate with data architects, analysts, and business stakeholders to gather requirements and deliver robust data solutions.

Implement data transformation, cleansing, aggregation, and validation logic to ensure high data quality.

Optimize Spark jobs for performance and cost efficiency in cloud environments (Azure/AWS).

Integrate data from various structured and unstructured sources into the DataBricks platform.

Monitor, troubleshoot, and resolve issues in production data pipelines.

Develop and maintain technical documentation for data workflows and processes.

Ensure best practices in data security, governance, and compliance are followed.

Mandatory Skills

Extensive hands-on experience with PySpark for building and optimizing data pipelines.

Strong experience with DataBricks (preferably in Azure or AWS environments).

Proficiency in Python programming for data engineering tasks.

Solid understanding of distributed computing concepts and Spark architecture.

Experience with ETL/ELT development and data integration from multiple sources.

Strong SQL skills for data manipulation and querying.

Familiarity with cloud data platforms (Azure Data Lake, AWS S3, etc.).

Experience with version control tools (e.g., Git).

Nice-to-Have Skills

Experience with Delta Lake, DataBricks SQL, or MLflow.

Knowledge of data warehousing concepts and BI tools.

Familiarity with CI/CD pipelines for data engineering.

Exposure to data security, governance, and compliance frameworks.

Prior experience in the financial or agricultural domain.

Posted: 01:48 AM 28-May-25


To remove this job post, send "job_kill 2459213" as the subject from [email protected] to [email protected]. Do not write anything extra in the subject line, as this is an automatic system and will not work otherwise.

If pages are not loading, are taking too long to load, the server times out or is unavailable, or you hit any other issue, please contact the admin at [email protected]


