I have done something similar for a few customers. I have found a useful pattern is to have both raw queues (incoming data) and a clean queue (outgoing data). Outgoing data goes into a single queue only (so all changes are totally ordered and we avoid eventual consistency) with a well-defined data model (a custom DSL for defining it) and tables/REST APIs that correspond 1-to-1 to the data model. Then we need mappings from the raw queues to the clean queue.
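To make the shape concrete, here's a toy sketch of that pattern (all names and field layouts are made up, not from any real system): one mapping function per raw source, all feeding a single ordered clean queue.

```python
from collections import deque

# Hypothetical raw messages from two different source systems.
raw_orders = [{"ord_no": "A-1", "amt": "10.50"}]
raw_invoices = [{"invoiceId": "B-7", "total": 99.0}]

# One mapping per raw queue, each producing the shared canonical shape.
def map_order(msg):
    return {"id": msg["ord_no"], "amount": float(msg["amt"])}

def map_invoice(msg):
    return {"id": msg["invoiceId"], "amount": float(msg["total"])}

# Single outgoing queue: appending here defines a total order over all changes,
# regardless of which raw queue a change originally came from.
clean_queue = deque()

for msg in raw_orders:
    clean_queue.append(map_order(msg))
for msg in raw_invoices:
    clean_queue.append(map_invoice(msg))

print(list(clean_queue))
```

The key property is that consumers only ever see the one clean queue, so they get a single, ordered, well-modeled stream instead of reconciling N messy sources themselves.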
Interesting experience! Can you explain a bit more about the raw queues vs. clean queue? Is it literally just incoming and outgoing queues, or was there a specific problem you were trying to solve?
The raw data was whatever came from the individual source systems, which were mostly external. Since we are talking about large organizations, there are always various kinds of problems with the source data. So the aim was to bring it all together under a single, shared data model. It was very helpful to use an internal DSL for defining the clean models; we called it the canonical data model.
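An internal DSL for this kind of thing can be quite small. Here's a minimal sketch of the idea (purely illustrative, not the actual DSL described above): declare the canonical model once, then use it to validate and coerce records coming off the raw queues.

```python
class Field:
    """Declaration of one canonical field: a target type and whether it's required."""
    def __init__(self, type_, required=True):
        self.type = type_
        self.required = required

def model(**fields):
    """Build a validator function from a set of field declarations."""
    def validate(record):
        clean = {}
        for name, field in fields.items():
            if name not in record:
                if field.required:
                    raise ValueError(f"missing field: {name}")
                continue  # optional field absent: skip it
            clean[name] = field.type(record[name])  # coerce to the declared type
        return clean  # unknown fields in the raw record are silently dropped
    return validate

# Canonical 'Customer' model defined via the DSL (fields are made up).
Customer = model(
    id=Field(str),
    name=Field(str),
    credit_limit=Field(float, required=False),
)

print(Customer({"id": 42, "name": "Acme", "legacy_code": "X9"}))
```

The payoff is that every mapping from a raw queue ends by passing its output through the same canonical validator, so the clean queue can't accumulate per-source quirks.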