One of the best descriptions I can point to of this situation is this thread:
https://github.com/Azure/azure-service-bus-dotnet/issues/239
(I rarely see much get resolved on that forum. That thread was closed a few years ago, and once that happens, you can usually give up on getting a resolution there.)
So imagine you have a string - let's say its value is "test123" - and you want to send it over Azure Service Bus to a legacy system. That legacy system is on .NET Framework and uses `BrokeredMessage`s to receive everything. You, however, need to use .NET Standard (or potentially Core, if necessary), which relegates you to `Message`.
This doesn't work out of the box: once the Full Framework system gets the message and tries to deserialize it out of a `BrokeredMessage`, it throws exceptions along the following lines:

```
There was an error deserializing the object of type System.Byte[]. The input source is not correctly formatted.
```
The problem is this: when using `Message`s in .NET Standard, the serialized form of your body will typically look something like this:

```
t   e   s   t   1   2   3
116 101 115 116 49  50  51
```
However, to make this work with a `BrokeredMessage` out of the box, it needs to look something more like:

```
@   [ACKNOWLEDGE] s   t   r   i   n   g   [BACKSPACE] 3
64  6             115 116 114 105 110 103 8           51
h   t   t   p   :   /   /   s   c   h   e   m   a   s   .   m   ...
104 116 116 112 58  47  47  115 99  104 101 109 97  115 46  109 ...
Ö   [BELL] t   e   s   t   1   2   3
153 7      116 101 115 116 49  50  51
```
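To make the difference concrete, here is a small sketch that reproduces the two dumps above (in Python, purely for byte-level illustration - this is not part of any Service Bus SDK). The framed version follows my reading of the [MC-NBFX] binary XML record codes that `DataContractSerializer` appears to emit for a short string; the record names in the comments are assumptions taken from that spec, not an official API.

```python
SERIALIZATION_NS = "http://schemas.microsoft.com/2003/10/Serialization/"

def plain_body(s: str) -> bytes:
    """What a .NET Standard Message carries: just the raw UTF-8 bytes."""
    return s.encode("utf-8")

def framed_body(s: str) -> bytes:
    """Roughly what a BrokeredMessage expects for a string body:
    <string xmlns="...">s</string> encoded as .NET binary XML.
    Only handles bodies shorter than 256 bytes (Chars8 text record)."""
    text = s.encode("utf-8")
    assert len(text) < 0x100, "sketch only covers the Chars8 case"
    ns = SERIALIZATION_NS.encode("ascii")
    return (
        bytes([0x40, 6]) + b"string"        # 0x40: ShortElement record, name "string"
        + bytes([0x08, len(ns)]) + ns       # 0x08: ShortXmlnsAttribute record
        + bytes([0x99, len(text)]) + text   # 0x99: Chars8TextWithEndElement record
    )

print(list(plain_body("test123")))        # [116, 101, 115, 116, 49, 50, 51]
print(list(framed_body("test123"))[:10])  # [64, 6, 115, 116, 114, 105, 110, 103, 8, 51]
```

Running this reproduces both byte dumps above exactly, including the trailing `153 7 test123` sequence.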
So, to some extent, this can potentially be worked around in the .NET Standard sender by taking the original string, "test123", and manually padding it to match the .NET Framework layout. However, multiple things seem to get in the way of this:
- There are tiny nuances in the legacy / Full Framework format that are hard to predict. The main trouble spot is the `Ö` on the third line above: that byte varies in ways that seem a little quirky and hard to predict perfectly.
- That's just for strings. This needs to work with a wide range of generics (and not just types marked with the `Serializable` attribute).
- So far, I haven't found anything that translates payloads into the legacy format automatically. Even the suggestions in the thread above are not perfect, and throw exceptions on the receiving end like this:
```
{"Expecting element 'string' from namespace 'http://schemas.microsoft.com/2003/10/Serialization/'.. Encountered 'Element' with name 'base64Binary', namespace 'http://schemas.microsoft.com/2003/10/Serialization/'. "}
```
- Using heuristic reverse-engineering like this is both time-consuming and error-prone.
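As for why that `Ö` byte (decimal 153) seems to vary: as far as I can tell from the [MC-NBFX] spec, it is a text-record code, and the code changes with the length and type of the value, which would explain the quirkiness. Here is a hedged sketch of that rule - the record names and codes are my reading of the spec, not something I have confirmed against every possible payload:

```python
def text_record_code(payload: bytes) -> int:
    """[MC-NBFX] 'text with end element' record code for a chars payload.
    The code depends on how many bytes the length prefix needs."""
    if len(payload) < 0x100:
        return 0x99  # Chars8TextWithEndElement (the 153 in the dump above)
    if len(payload) < 0x10000:
        return 0x9B  # Chars16TextWithEndElement
    return 0x9D      # Chars32TextWithEndElement

print(hex(text_record_code(b"test123")))   # 0x99
print(hex(text_record_code(b"x" * 1000)))  # 0x9b
```

If that reading is right, hand-rolled padding would also have to switch record codes (and length-field widths) as payloads grow, which is exactly the kind of nuance that makes reverse-engineering this fragile.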
So what do you do about this when the legacy system is using .NET Framework and `BrokeredMessage`s? In my case - and in many other people's - it is not possible to adjust the legacy software to cater to the newer .NET Standard/Core software; it must be done the other way around, if feasible.
Is there a way to resolve this issue? Are Full Framework and Core/Standard systems just directly incompatible in the general case because of this?
In case it matters, I'm doing this with topics, but I'd expect the situation to be the same with queues.
**Update**
The title of the question is phrased as a yes-or-no question, but I am looking for an answer that goes into detail.
If the answer is no, they are compatible: please clearly explain a viable method to make these two work together. If the answer is yes, they are incompatible: please provide a clear, solid explanation of why.
In other words, please clearly explain how to put these two together in a professional, viable way that doesn't depend on twenty extra wacky dependencies or the like, or explain why and/or how this is not reasonably possible.