I am trying to create a component that consumes data from one topic, processes it, and sends it to another topic, i.e., my component needs to be both a consumer and a producer. How do I configure this in my Spring Boot application?
It sounds like what you are looking for is the Kafka Streams API. It is an open-source Java API for manipulating events in flight: it reads from one topic, runs the records through processing steps, and writes the results to another topic, so it acts as both a consumer and a producer. See the Kafka Streams documentation for examples.
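Since you are on Spring Boot, here is a minimal sketch of how the Kafka Streams infrastructure can be wired up using spring-kafka's @EnableKafkaStreams support; the application id, bootstrap server address, and default serdes below are assumptions you would adjust for your setup:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.annotation.KafkaStreamsDefaultConfiguration;
import org.springframework.kafka.config.KafkaStreamsConfiguration;

@Configuration
@EnableKafkaStreams
public class KafkaStreamsConfig {

    // spring-kafka looks up the streams configuration under this well-known bean name
    @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
    public KafkaStreamsConfiguration kStreamsConfig() {
        Map<String, Object> props = new HashMap<>();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "word-count-app");      // assumption: any unique application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumption: local broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        return new KafkaStreamsConfiguration(props);
    }
}

With @EnableKafkaStreams in place, spring-kafka exposes a StreamsBuilder bean that you can inject wherever you define your topology.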
Setting different Serde configs for the producer and consumer:
// Consumer side: read the input topic with String serdes for key and value
KStream<String, String> wordCountInputStream = streamsBuilder.stream("word-count-input", Consumed.with(Serdes.String(), Serdes.String()));
// Transform into per-word counts (intermediate steps omitted)
KTable<String, Long> wordCounts = wordCountInputStream.mapValues(value -> value.toLowerCase()).....
// Producer side: write the counts with a Long value serde, since the values are now longs
wordCounts.toStream().to("word-count-output", Produced.with(Serdes.String(), Serdes.Long()));
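For completeness, here is one way the topology above can be registered as a Spring bean with the omitted steps filled in for the classic word count; the flatMapValues/groupBy/count chain is an assumption added to make the KTable<String, Long> type line up, not part of the original snippet:

import java.util.Arrays;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WordCountTopology {

    // The StreamsBuilder bean is provided by @EnableKafkaStreams (see config above)
    @Bean
    public KStream<String, String> wordCountStream(StreamsBuilder streamsBuilder) {
        // Consumer side: read the input topic with String serdes
        KStream<String, String> wordCountInputStream =
                streamsBuilder.stream("word-count-input", Consumed.with(Serdes.String(), Serdes.String()));

        // Assumed processing steps: lower-case, split into words, group by word, count
        KTable<String, Long> wordCounts = wordCountInputStream
                .mapValues(value -> value.toLowerCase())
                .flatMapValues(value -> Arrays.asList(value.split("\\W+")))
                .groupBy((key, word) -> word)
                .count();

        // Producer side: write the counts with a Long value serde
        wordCounts.toStream().to("word-count-output", Produced.with(Serdes.String(), Serdes.Long()));

        return wordCountInputStream;
    }
}

With this in place the application consumes from word-count-input and produces to word-count-output within a single streams topology, so you do not need a separate @KafkaListener consumer and KafkaTemplate producer.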