Job Title: Solution Architect – Big Data (AWS)
Location: Pune, Bangalore, Chennai, Hyderabad, Gurgaon, Trivandrum, Kochi
Experience Level: 9+ years
Key Skills: AWS, PySpark, SQL
Job Summary:
We are looking for a highly skilled and motivated Solution Architect with deep expertise in designing and implementing innovative big data solutions on the Amazon Web Services (AWS) platform. As a key member of our team, you will lead the architecture and development of scalable, high-performance, and secure data systems, playing a critical role in driving our data-driven initiatives forward.
Key Responsibilities:
Solution Architecture & Design
Collaborate with product managers, engineers, and data scientists to gather business requirements and translate them into efficient, scalable data solutions on AWS. Architect AWS-based big data systems using services such as S3, EMR, Glue, Redshift, DynamoDB, Lambda, and Athena. Design well-architected, cost-effective solutions aligned with industry best practices.
Technology Leadership
Stay current with AWS advancements and big data technologies to evaluate and recommend appropriate tools. Provide technical leadership and mentorship to the data engineering team, promoting a collaborative and innovative environment.
Implementation & Optimization
Oversee the end-to-end development lifecycle, from proof of concept to production deployment. Identify and resolve performance bottlenecks, ensuring optimal scalability and cost-efficiency. Ensure solutions meet security, compliance, and data governance standards.
Collaboration & Communication
Work closely with both technical and non-technical stakeholders to communicate solutions and guide decision-making. Create and maintain documentation for architecture, data pipelines, and engineering processes.
Support & Maintenance
Assist in troubleshooting and resolving technical issues. Ensure the reliability, accuracy, and consistency of data across pipelines.
Technical Skills Required:
AWS Big Data Services: S3, EMR, Glue, Glue Catalog, Lambda, Redshift, DynamoDB, Athena, SageMaker
Workflow Orchestration: Airflow
Programming: Python, PySpark
Data Management: SQL, data modeling, warehousing, transformation, and integration
Security: IAM, encryption, and compliance implementation
Version Control: Git / Bitbucket
Nice to Have: Experience with training/processing jobs in SageMaker, Unix/Linux scripting, and Apache Spark/Flink/Hadoop
Qualifications:
Bachelor's or Master's degree in Computer Science, IT, or a related field. Proven experience in a Solution Architect or similar role, with a focus on big data systems on AWS. Solid understanding of data security, governance, and compliance. Strong problem-solving and analytical skills. Excellent verbal and written communication skills, with the ability to work effectively with diverse teams.