Kafka Engineer/Architect
Job Type | Contract |
Location | England |
Area | UK
Sector | Big Data & Analytics |
Salary | £650 - 750 per day |
Currency | GBP |
Start Date | ASAP |
Advertiser | remoteapi |
Job Ref | CR/079046 |
Description
Data Engineer/Architect – KAFKA
6-12 Months Contract
Our client is searching for a talented and motivated Kafka Engineer/Architect with experience in designing, performing POCs where needed, and developing the enterprise’s Apache Kafka Distributed Messaging and Integration Ecosystem.
What you'll do:
- Design, develop, and implement scalable data pipelines using technologies like Apache Kafka, Spark, and others (familiarity with a variety of data engineering tools is a plus).
- Design and develop the enterprise’s Apache Kafka Distributed Messaging and Integration Ecosystem.
- Develop playbooks for troubleshooting and support teams.
- Design monitoring solutions and baseline statistics reporting to support implementation requirements.
- Collaborate with data analysts and data scientists to understand data requirements and translate them into efficient data pipelines.
- Ensure data quality and integrity through data cleansing, transformation, and validation processes.
- Develop and implement automated solutions to streamline data processing tasks.
- Monitor and maintain data pipelines to ensure smooth operation and performance optimization.
Experience required:
- 3+ years of experience as a Data Engineer/Architect or similar data-centric role.
- Solid experience and knowledge of Kafka deployment.
- Strong Java, Spark, and Kafka experience.
- Experience implementing, testing, and deploying medium-to-large data pipelines.
- Strong proficiency in SQL and scripting languages (Python, Java, etc.).
- Experience with cloud platforms like AWS, Azure, or GCP (a plus).
- Familiarity with data warehousing concepts and technologies.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Big Data technologies such as Hadoop, Spark, Kafka, etc.: Hadoop 5+ years; Kafka 3+ years; Spark 4+ years; PySpark 3+ years.