Careers


Come join us and build a brighter future together.

  • Innovate, lead and grow! You'll thrive in an environment that encourages creative thinking, collaboration, and leadership.
  • We are always on the lookout for curious, energetic, enthusiastic engineers who believe in a better tomorrow.

Openings

No. of Positions: 2
Position: Data Engineer
Location: On-site (Coimbatore)
Total Years of Experience: 3+ years

Key Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Azure Synapse, Databricks & Apache Spark (PySpark).
  • Data Integration: Integrate data from various sources, ensuring data quality and consistency.
  • Performance Optimization: Optimize data processing workflows for performance and cost-efficiency.
  • Collaboration: Work with data architects, analysts, and product owners to understand data requirements and deliver solutions.
  • Monitoring and Troubleshooting: Monitor data pipelines and troubleshoot issues to ensure data integrity and availability.
  • Documentation: Document data workflows, processes, and best practices.

Technical Skills:

  • Proficiency in Azure Synapse/Databricks and Apache Spark.
  • Strong PySpark and SQL skills for data manipulation and querying.
  • Familiarity with Delta Live Tables and Databricks workflows.
  • Experience with ETL tools and processes.
  • Knowledge of cloud platforms (AWS, Azure, GCP).
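
For candidates wondering about the flavour of day-to-day work, here is an illustrative sketch of the kind of SQL aggregation this role writes in Synapse or Databricks. The table and data are hypothetical, and the standard-library sqlite3 module stands in for a real warehouse engine:

```python
import sqlite3

# Hypothetical example: roll raw order events up into per-customer totals --
# the sort of transformation typically expressed in PySpark or Synapse SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
)

# Group and aggregate -- roughly df.groupBy("customer").sum("amount") in PySpark.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 150.0), ('bob', 75.5)]
```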

Soft Skills:

  • Excellent problem-solving abilities.
  • Strong communication and collaboration skills.
  • Ability to work in a fast-paced environment and manage multiple priorities.

No. of Positions: 1
Position: DevOps Architect
Location: Coimbatore (Onsite)
Total Years of Experience: 7+ years

Key Responsibilities:

  • Design, implement, and optimize scalable and reliable DevOps processes for continuous integration, continuous deployment (CI/CD), and infrastructure as code (IaC). 
  • Lead the architecture and implementation of cloud-based infrastructure solutions, leveraging AWS, Azure, or GCP, depending on project requirements. 
  • Collaborate with software development teams to ensure smooth integration of development, testing, and production environments. 
  • Implement and manage tools for automation, monitoring, and alerting across development and production environments (e.g., Jenkins, GitLab CI, Ansible, Terraform, Docker, Kubernetes). 
  • Oversee the management of version control, release pipelines, and deployment processes for a variety of applications. 
  • Design and implement infrastructure monitoring solutions, ensuring high availability and performance of systems. 
  • Foster a culture of continuous improvement and work closely with development and operations teams to enhance automation, testing, and release pipelines. 
  • Ensure security best practices are followed in the development and deployment pipeline (e.g., secret management, vulnerability scanning). 
  • Lead efforts to address performance bottlenecks, scaling challenges, and infrastructure optimization. 
  • Mentor and guide junior engineers in the DevOps space.

Required Skills:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience. 
  • 7+ years of experience in DevOps, cloud infrastructure, and automation tools. 
  • Strong experience with cloud platforms (AWS, Azure, GCP) and their services (EC2, Lambda, S3, etc.). 
  • Expertise in containerization technologies (Docker, Kubernetes) and orchestration tools. 
  • Extensive experience with automation tools (Jenkins, Ansible, Chef, Puppet, Terraform). 
  • Familiarity with infrastructure as code (IaC) principles and practices. 
  • Proficient with scripting languages (Bash, Python, Go, etc.). 
  • Strong knowledge of version control systems (Git, SVN). 
  • Experience with monitoring and logging tools (Prometheus, Grafana, ELK stack, New Relic). 
  • Excellent troubleshooting skills, with the ability to quickly identify and resolve complex issues. 
  • Strong communication and leadership skills, with a proven ability to collaborate across multiple teams. 
  • Solid understanding of Agile and Scrum methodologies (preferred).

Preferred Qualifications: 

  • Certifications in DevOps tools, cloud technologies, or Kubernetes. 
  • Experience with serverless architecture. 
  • Familiarity with security best practices in a DevOps environment. 
  • Experience with database management and backup strategies.

No. of Positions: 1
Position: MERN Stack Developer (Immediate Joiner)
Location: Coimbatore (Onsite)
Job Type: Full Time
Total Years of Experience: 5+ years

Job Summary:

We are looking for a highly skilled MERN Stack Developer with 5+ years of experience to join our team onsite in Coimbatore immediately. The ideal candidate should have expertise in MongoDB, Express.js, React.js, and Node.js and be capable of developing scalable web applications while ensuring high performance, security, and reliability.

Key Responsibilities:

  • Develop, maintain, and optimize web applications using the MERN stack.
  • Design and implement RESTful APIs and third-party integrations.
  • Ensure application performance, security, and scalability.
  • Write clean, maintainable, and efficient code following best practices.
  • Manage databases efficiently, ensuring data integrity and optimization.
  • Troubleshoot, debug, and resolve technical issues proactively.
  • Collaborate with UI/UX designers and backend teams for seamless integration.
  • Stay updated with emerging technologies and industry trends.

Required Skills & Qualifications:

  • 5+ years of hands-on experience in MERN stack development (MongoDB, Express.js, React.js, Node.js).
  • Strong proficiency in JavaScript, TypeScript, ES6+.
  • Hands-on experience with Redux, Hooks, Context API, and component-based architecture in React.
  • Expertise in backend development, API design, and microservices architecture.
  • Strong database management skills with MongoDB and query optimization.
  • Experience with authentication mechanisms like JWT, OAuth, session-based authentication.
  • Familiarity with cloud platforms, CI/CD pipelines, and DevOps practices.
  • Strong debugging and problem-solving skills.

Preferred Qualifications:

  • Experience with GraphQL, WebSockets, Docker, Kubernetes.
  • Knowledge of performance testing and optimization techniques.
  • Exposure to Agile methodologies and Scrum teams.

No. of Positions: 1
Position: Java Architect
Location: Coimbatore (Onsite)
Total Years of Experience: 10+ years

Responsibilities:

  • Design and develop scalable, high-performing enterprise Java applications and microservices.
  • Define architecture standards, best practices, and technical governance across development teams.
  • Collaborate with stakeholders to translate business requirements into technical solutions.
  • Lead architectural reviews, code reviews, and performance tuning exercises.
  • Define and maintain architectural artifacts such as system diagrams, data flow diagrams, and component-level specifications.
  • Guide teams on Java frameworks like Spring, Hibernate, and related technologies.
  • Implement API gateways, service orchestration, and secure communication between distributed services.
  • Leverage DevOps practices for CI/CD, containerization (Docker), orchestration (Kubernetes), and cloud deployment (AWS/GCP/Azure).
  • Ensure compliance with security, scalability, and maintainability standards.
  • Mentor junior developers and foster a strong technical culture.

Required Skills and Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 10+ years of professional experience in Java development.
  • Strong hands-on experience with Spring Boot and Microservices architecture.
  • Proficient in ReactJS for frontend development.
  • Strong knowledge of PostgreSQL and database management.
  • Solid understanding of REST APIs, JSON, and HTTP protocol.
  • Familiarity with version control tools like Git.
  • Good understanding of Agile methodologies (Scrum/Kanban).
  • Excellent problem-solving skills and ability to work independently and as part of a team.
  • Strong communication and interpersonal skills.

No. of Positions: 1
Position: Lead Data Engineer
Location: Hybrid or Remote
Total Years of Experience: 5+ years

Key Responsibilities:

  • Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use platforms such as Azure, Salesforce, and AWS.
  • Monitor active ETL jobs in production.
  • Build out data lineage artifacts to ensure all current and future systems are properly documented.
  • Assist with building out design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
  • Assess current and future data transformation needs to recommend, develop, and train new data integration tool technologies.
  • Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations.
  • Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs.
  • Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

Required Skills:

  • This job has no supervisory responsibilities.
  • Bachelor’s Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field, and 6+ years’ experience in business analytics, data science, software development, data modeling, or data engineering.
  • 5+ years’ experience with strong SQL query and development skills.
  • Experience developing ETL routines that manipulate and transfer large volumes of data and perform quality checks.
  • Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
  • Experience working in the healthcare industry with PHI/PII.
  • Creative, lateral, and critical thinker.
  • Excellent communicator.
  • Well-developed interpersonal skills.
  • Good at prioritizing tasks and time management.
  • Ability to describe, create and implement new solutions.
  • Experience with related or complementary open source software platforms and languages (e.g. Java, Linux, Apache, Perl/Python/PHP, Chef).
  • Knowledge of and hands-on experience with BI and reporting tools (e.g., Cognos, Power BI, Tableau).
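
As an illustrative sketch of the "checks and balances" this role implements alongside the Data Quality Analyst, here is a minimal batch-validation helper. The function name, fields, and thresholds are hypothetical, not part of any specific toolchain:

```python
# Illustrative sketch (hypothetical data and thresholds): the kind of
# data-quality gate a Lead Data Engineer might run on a batch before
# loading it downstream.
def quality_check(rows, required_fields, min_rows=1):
    """Return a list of human-readable data-quality violations."""
    issues = []
    if len(rows) < min_rows:
        issues.append(f"row count {len(rows)} is below minimum {min_rows}")
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing required field '{field}'")
    return issues

batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},  # fails the null/empty check
]
print(quality_check(batch, required_fields=["id", "email"]))
# ["row 1: missing required field 'email'"]
```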

Position: Data Architect
Location: Coimbatore
Type: Full-Time

Key Responsibilities:

  • Design and architect scalable, secure, and resilient data platforms leveraging cloud-native technologies (AWS, Snowflake, Redshift).
  • Lead the design and implementation of data pipelines (ETL/ELT) for both batch and real-time processing. 
  • Build and optimize data lakes and data warehouses to support enterprise analytics and AI/ML use cases. 
  • Define and implement data modeling standards, metadata management, and master data management (MDM) practices. 
  • Architect event-driven and streaming solutions using Kafka, EventBridge, and similar technologies. 
  • Ensure compliance with governance, data security, privacy, and regulatory requirements (HIPAA, GDPR, etc.). 
  • Drive adoption of best practices in data engineering, cloud architecture, and DevOps for data platforms. 
  • Collaborate with business stakeholders, data scientists, engineers, and product teams to deliver data-driven insights. 
  • Provide technical leadership, mentoring, and guidance to engineering teams. 

Required Skills & Experience:

  • 10+ years of experience in data architecture, engineering, or related technical roles. 
  • Proven expertise in the modern data stack: AWS Glue, Lambda, Kinesis, S3, Redshift/Snowflake. 
  • Strong programming and scripting experience with SQL and Python. 
  • Hands-on experience with workflow orchestration tools (Airflow, Prefect, Step Functions). 
  • Proficiency in Big Data technologies: Spark, PySpark, Scala. 
  • Solid understanding of streaming & event-driven architectures (Kafka, EventBridge/Event Bus).
  • Experience in data modeling, building data lakes/warehouses, and architecting analytical platforms. 
  • Strong knowledge of data governance, security, compliance, and MDM practices. 
  • Excellent problem-solving, analytical, and system design skills. 
  • Exceptional communication, stakeholder management, and leadership capabilities. 
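
To give a flavour of the streaming and event-driven work mentioned above, here is a minimal in-process publish/subscribe sketch. It only illustrates the pattern behind Kafka topics or EventBridge rules; names are hypothetical, and a real system adds partitioning, persistence, and retries:

```python
from collections import defaultdict

# Minimal in-process sketch of the publish/subscribe pattern underlying
# event-driven architectures. Purely illustrative, not a production bus.
class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callback for every event published on `topic`.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan the event out to all handlers subscribed to the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("orders.created", received.append)
bus.publish("orders.created", {"order_id": 42})
print(received)  # [{'order_id': 42}]
```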

Nice-to-Have Skills:

  • Exposure to ML Ops and supporting AI/ML pipelines. 
  • Experience with containerization and orchestration (Docker, Kubernetes). 
  • Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation). 
  • Experience in multi-cloud or hybrid-cloud environments. 

Don’t see a role that fits? We are growing rapidly and always on the lookout for passionate, smart engineers! If you are serious about your career, reach out to us at careers@hashagile.com.