Job Description
The Integrations Developer III (Senior) will be responsible for architecting, designing, and developing scalable, real-time event-driven solutions and streaming data integrations in cloud-native environments. This role requires a strong background in Java development with modern frameworks, multithreading, and reactive programming, as well as extensive experience with Kafka, AWS services, and CI/CD automation. The ideal candidate will work collaboratively across Cloud, DevOps, and Data Engineering teams to deliver robust and secure event-driven architectures and mentor junior engineers on best practices.
Job Duties:
· Collaborate with architects and technical leads to design and implement scalable real-time event streaming platforms using Java 11/17+, Spring Boot, multithreading, and reactive programming paradigms.
· Design and develop event-driven architectures by integrating Confluent Kafka with AWS EventBridge to support serverless computing and microservices patterns.
· Configure and deploy Kafka Connect Source/Sink connectors for seamless data flow across AWS services including S3, DynamoDB, RDS, Lambda, and Kinesis.
· Utilize Kafka Streams and ksqlDB for real-time data processing and transformation pipelines.
· Manage and maintain Kafka Schema Registry, supporting serialization formats such as Avro, JSON Schema, and Protobuf.
· Monitor and tune Kafka cluster performance, including partition optimization, consumer lag tracking, and leveraging tiered storage for data retention.
· Secure Kafka deployments by implementing RBAC, ACLs, SSL/TLS encryption, and OAuth authentication.
· Automate infrastructure provisioning and deployment workflows for Kafka and AWS integrations using Terraform, Kubernetes, and Helm charts.
· Build and maintain CI/CD pipelines for real-time applications using Jenkins and GitHub Actions, ensuring repeatable and reliable deployments.
· Establish observability using Confluent Control Center, AWS CloudWatch, Prometheus, and Datadog to monitor system health and performance.
· Partner closely with cross-functional teams to ensure smooth integration of event-driven workflows into broader enterprise systems.
· Provide technical leadership by mentoring junior engineers, conducting code reviews, and sharing best practices across the engineering organization.
Education & Requirements:
· Bachelor’s Degree in Computer Science, Engineering, or related field
· 5+ years of experience with Java (Java 11/17+), Spring Boot, multithreading, and reactive programming
· 4+ years of experience implementing and managing Confluent Kafka and Kafka Connect integrations with AWS services (EventBridge, S3, RDS, DynamoDB, Lambda)
· Strong experience with Kafka Schema Registry using Avro, Protobuf, and JSON Schema
· Solid understanding of event-driven architecture, microservices patterns, and real-time data pipelines
· Hands-on experience automating infrastructure with Terraform, Kubernetes, and Helm
· Proficiency with CI/CD tools including Jenkins and GitHub Actions
· Strong experience in observability/monitoring using Prometheus, Datadog, and CloudWatch
· Demonstrated experience securing Kafka platforms (RBAC, ACLs, TLS/SSL, OAuth)
Plus:
· Experience with Kafka Streams and ksqlDB for complex data transformations
· Previous experience in FinTech, banking, or large-scale enterprise environments
· Experience mentoring junior developers and leading small teams