
Sr. Software Engineer – Python & PySpark

Experience: 3–6 years
Salary: Not Disclosed
Posted: June 12, 2025
Job Description
Job Type: Full Time Employee
Education: B.Sc/ M.Sc/ B.E/ M.E/ B.Com/ M.Com/ BBA/ MBA/ B.Tech/ M.Tech/ All Graduates

Job Title: Sr. Software Engineer – Python & PySpark
Job Requisition ID: 61427
Location: Bangalore, Karnataka, India
Posting Date: May 31, 2025
Employment Type: Right-to-Hire
Experience Required: 3 to 6 Years
Company: YASH Technologies


Primary Responsibilities

  1. Design, develop, test, and support data pipelines and data-centric applications.

  2. Build and industrialize data feeds to ensure scalability and maintainability.

  3. Create and manage data pipelines integrated into existing systems and cloud platforms (a minimal PySpark sketch follows this list).

  4. Implement security best practices in AWS pipelines, including encryption, access controls, and auditing.

  5. Improve data cleansing processes and enable connectivity between internal and external data sources.

  6. Establish and manage continuous quality improvement systems to optimize data quality.

  7. Translate user requirements into technical ingestion activities for data teams.

  8. Work collaboratively with cross-functional teams to support data pipeline design and integration.

  9. Ensure alignment with CI/CD workflows and promote test-driven development approaches.

  10. Apply version control using tools like Git and work with major repositories (e.g., GitHub, GitLab, Bitbucket, Azure DevOps).
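
To give a flavor of the pipeline work described above, here is a minimal PySpark sketch covering ingest, cleansing, and persistence. The S3 paths, column names, and cleansing rules are illustrative assumptions, not part of the posting.

```python
# Minimal PySpark pipeline sketch: ingest -> cleanse -> persist.
# Paths, columns, and cleansing rules are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-feed").getOrCreate()

# Ingest: read a raw CSV feed (hypothetical S3 location).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3://example-bucket/raw/orders/")
)

# Cleanse: trim the join key, drop rows missing it, deduplicate.
clean = (
    raw.withColumn("customer_id", F.trim(F.col("customer_id")))
       .filter(F.col("customer_id").isNotNull())
       .dropDuplicates(["order_id"])
       .withColumn("ingest_date", F.current_date())
)

# Persist: write partitioned Parquet for downstream consumers.
(
    clean.write
    .mode("overwrite")
    .partitionBy("ingest_date")
    .parquet("s3://example-bucket/curated/orders/")
)

spark.stop()
```

In practice a job like this would be parameterized and scheduled (for example via AWS Glue or EMR), with the same read–cleanse–write shape wired into the CI/CD workflow mentioned above.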


Required Qualifications

  1. Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field.

  2. Minimum 3 years of relevant industry experience in data engineering or software development.

  3. Proficient in AWS services related to data pipelines and serverless architectures.

  4. Strong programming skills in Python and PySpark for data pipeline development.

  5. Solid experience in building secure and efficient data solutions within AWS environments.

  6. Understanding of infrastructure as code using tools like AWS CloudFormation, AWS CDK, or Terraform (see the sketch after this list).

  7. Interest and ability to solve complex technical challenges related to data infrastructure.

  8. Nice to have: experience with multiple programming languages (e.g., Java, Scala, R, Rust, C++, TypeScript).

  9. Familiarity with CI/CD pipelines and cloud-native DevOps processes.

  10. Good communication skills, proactive learning attitude, and ability to work in collaborative environments.
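
As an illustration of qualification 6, and of the security practices (encryption, access controls, auditing) named under the responsibilities, here is a minimal infrastructure-as-code sketch using AWS CDK v2 in Python that provisions an encrypted, access-restricted S3 bucket. The stack and bucket names are hypothetical, and a real pipeline stack would add jobs, queues, and IAM roles.

```python
# Minimal AWS CDK v2 (Python) sketch: an encrypted, locked-down data bucket.
# Stack and bucket names are hypothetical placeholders.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3


class DataPipelineStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # Encryption at rest, no public access, versioning for auditability.
        s3.Bucket(
            self,
            "CuratedDataBucket",
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            versioned=True,
            enforce_ssl=True,  # bucket policy denies non-TLS requests
        )


app = cdk.App()
DataPipelineStack(app, "DataPipelineStack")
app.synth()
```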
