I'm querying an API for a set of data that may change in the future; however, I only want to process the response if there is a change.
What I figure would be a workable solution (though I have yet to find anything to support my way of thinking) is to hash the response and record the result in a table; on subsequent queries I can generate the hash again and compare it against the stored value before processing the data received.
The data returned can be considerable in size and may contain only the slightest change (even a single character), and I want to be certain that any such change will be reflected in the resulting hash. I'm also concerned about the speed of generating a hash over such a large dataset; I believe MD4 would be the fastest.
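A rough sketch of what I'm thinking, using Python and SHA-256 purely as an example (the hash algorithm and function names are placeholders, not a settled choice):

```python
import hashlib
from typing import Optional

def response_hash(body: bytes) -> str:
    # Any cryptographic hash has the avalanche property: a single-character
    # change in the input produces a completely different digest.
    return hashlib.sha256(body).hexdigest()

def process_if_changed(body: bytes, stored_hash: Optional[str]) -> Optional[str]:
    """Return the new hash if the body changed, or None to signal 'skip'."""
    new_hash = response_hash(body)
    if new_hash == stored_hash:
        return None  # unchanged: skip processing
    # ... process the data here, then persist new_hash for next time ...
    return new_hash
```

The caller would store the returned hash in the table and pass it back on the next query; a `None` result means the response was identical and can be ignored.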
What do I need to consider, or is the proposed method above sufficient?