We are seeking an experienced Java Developer with strong data engineering skills to join our team working with cloud-native technologies and event-driven architectures.
Core Responsibilities
Event Streaming Development:
Design and implement event-driven solutions using Kafka and Kafka Streams (see the illustrative sketch after this list)
Develop and maintain Kafka Connect connectors for data integration
Optimize event streaming patterns and data flows
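To give a flavour of this kind of work, here is a minimal Kafka Streams sketch of a simple filtering topology. The topic names, application id, and broker address are illustrative placeholders only, not details of our actual platform.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder application id and broker address for illustration only.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-filter-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw order events, drop empty payloads, and route valid ones downstream.
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> value != null && !value.isEmpty())
              .to("orders-valid");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```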
Cloud Data Engineering:
Work with AWS data services including S3, Kinesis, and SNS/SQS (see the brief example after this list)
Implement data pipelines and ETL processes
Design and optimize data models for various use cases
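As a small illustration of the AWS side of the role, the sketch below writes a record to S3 using the AWS SDK for Java v2. The bucket name, object key, region, and payload are hypothetical; real pipelines would typically batch and partition data rather than write single objects.

```java
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

public class S3UploadExample {
    public static void main(String[] args) {
        // Credentials are resolved via the default provider chain; region is illustrative.
        try (S3Client s3 = S3Client.builder().region(Region.EU_WEST_1).build()) {
            // Write a small JSON record to S3, the kind of sink step a pipeline might end with.
            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("example-data-lake-bucket") // hypothetical bucket name
                    .key("events/2024/01/01/event.json")
                    .build();
            s3.putObject(request, RequestBody.fromString("{\"eventType\":\"order_created\"}"));
        }
    }
}
```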
Software Development:
Write clean, maintainable Java code following best practices
Participate in code reviews and technical discussions
Contribute to continuous improvement of development practices
Ideal Profile
Technical Skills:
Strong Java development experience (Java 11+)
Proven experience with Kafka, Kafka Streams, and Kafka Connect
Practical knowledge of AWS data services (S3, Kinesis, SNS/SQS)
Experience with data modeling and schema design
Solid understanding of distributed systems
Nice to Have:
Snowflake and data lake architectures
Grafana and observability practices
Kubernetes and container orchestration
General data engineering principles and patterns
Cloud-native development practices
Soft Skills:
Strong problem-solving and analytical skills
Team player with good communication abilities
Proactive approach to learning new technologies
Interest in data engineering and distributed systems