This guide shows how to use Kafka with EDFS, covering publish, subscribe, and the filter directive. All examples can be adapted to your needs, but the schema directives and `edfs__*` types belong to the EDFS schema contract and must not be modified.
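As a reference point, the Event-Driven Graph schema for the subscription used later in this guide might look roughly like the following. This is a hedged sketch: the topic name `employeeUpdated` and the exact shape of the filter condition are assumptions for illustration, while the `@edfs__kafkaSubscribe` and `@openfed__subscriptionFilter` directives come from the EDFS schema contract — check the EDFS schema reference for the authoritative signatures.

```graphql
# Illustrative Event-Driven Graph sketch — topic name and filter
# condition shape are assumptions; directive names are from the
# EDFS schema contract and must not be modified.
type Subscription {
  filteredEmployeeUpdatedMyKafka(employeeID: Int!): Employee!
    @edfs__kafkaSubscribe(topics: ["employeeUpdated"], providerId: "my-kafka")
    @openfed__subscriptionFilter(
      condition: { IN: { fieldPath: "id", values: ["{{ args.employeeID }}"] } }
    )
}

# The entity is declared unresolvable here; other subgraphs
# resolve its remaining fields (e.g., tag, details).
type Employee @key(fields: "id", resolvable: false) {
  id: Int! @external
}
```

Note that `providerId: "my-kafka"` must match the provider `id` in the router configuration shown below.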
Based on the example above, you will need a compatible router configuration.
```yaml
events:
  providers:
    kafka:
      - id: my-kafka # needs to match the providerId in the directive
        tls:
          enabled: true
        authentication:
          sasl_plain:
            password: "password"
            username: "username"
        brokers:
          - "localhost:9092"
```
In the example subscription below, fields that are not defined on the Event-Driven Graph, e.g., `tag` and `details.surname`, are resolved by one or more subgraphs implemented alongside it.
```graphql
subscription {
  filteredEmployeeUpdatedMyKafka(employeeID: 1) {
    id # resolved by the Event-Driven Graph (through the event)
    tag # resolved by another subgraph
    details { # resolved by another subgraph
      surname
    }
  }
}
```
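To see the subscription fire locally, you can publish a test event directly to the broker. The sketch below uses Kafka's standard console producer; the topic name `employeeUpdated` and the payload shape (a JSON object carrying the entity key field) are assumptions — use the topic configured in your `@edfs__kafkaSubscribe` directive and a payload matching your entity's key.

```shell
# Illustrative only: publish a test event so the subscription above
# receives an update for Employee id 1. Topic name and payload shape
# are assumptions; match them to your schema.
echo '{"id": 1}' | kafka-console-producer \
  --bootstrap-server localhost:9092 \
  --topic employeeUpdated
```

With the filter directive in place, only events whose `id` matches the `employeeID` argument (here, `1`) are delivered to the client.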