Wednesday, April 16, 2014

C#: a better algorithm to make one DataSet equal to another? - Stack Overflow


Here is our problem:


We have a current dataset that our UI is bound to (20 tables, thousands of rows per table). Normally, we use a custom PubSub implementation to apply incremental changes from other participants in our business process (clients and servers). Due to old habits and some reliability issues with our PubSub, some of our customers insist on hitting our hard refresh button.


That button fetches a full copy of the data, clears the existing dataset (#1), and merges in the new dataset (#2), effectively making dataset #1 identical to #2. Our merge algorithm is fairly simplistic:


public static void MergeDataSets(ref DataSet original, DataSet updated)
{
    if (null == updated)
    {
        return;
    }

    if (null == original)
    {
        original = updated.Copy();
    }
    else
    {
        // TODO: ELI: it would be nice to have an actual merge algorithm to publish changes from updated to original.
        original.Clear();
        original.Merge(updated, false, MissingSchemaAction.Add);
    }
}

As you can see, I did make a note that we should design a true merge algorithm, but at the time our customers accepted the side effects in exchange for the performance. Or so we thought. Anyway, the major side effect is that any data grid (Infragistics) bound to our data gets reset, so all expanded/collapsed state, selection, and so on is lost.


So, can anybody recommend a 3rd-party library that already does something like that? If not, any recommendations on how to proceed with the implementation? If we do succeed, I'll be sure to post the results here...
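
One possible direction we have been sketching (untested, just to show the idea): instead of Clear + Merge, walk each table and sync rows in place by primary key, so the existing DataRow objects survive and the bound grids keep their state. The SyncDataSets name is just for illustration, and the sketch assumes every table has a matching schema and a primary key; relations, read-only/auto-increment columns, and error handling are not covered.

// Sketch only: assumes matching table names, matching schemas, and a primary key on every table.
// Requires: using System.Data; using System.Linq;
public static void SyncDataSets(DataSet original, DataSet updated)
{
    if (null == original || null == updated)
    {
        return;
    }

    foreach (DataTable updatedTable in updated.Tables)
    {
        DataTable originalTable = original.Tables[updatedTable.TableName];
        if (null == originalTable)
        {
            // Table exists only in the new snapshot: take it as-is.
            original.Tables.Add(updatedTable.Copy());
            continue;
        }

        DataColumn[] originalKeys = originalTable.PrimaryKey;
        DataColumn[] updatedKeys = updatedTable.PrimaryKey;

        // Update rows that exist in both, add rows that are new.
        foreach (DataRow updatedRow in updatedTable.Rows)
        {
            object[] key = originalKeys.Select(k => updatedRow[k.ColumnName]).ToArray();
            DataRow originalRow = originalTable.Rows.Find(key);
            if (null == originalRow)
            {
                originalTable.Rows.Add(updatedRow.ItemArray);
            }
            else
            {
                // Overwrite values in place; the DataRow object (and the grid's row state) survives.
                originalRow.ItemArray = updatedRow.ItemArray;
            }
        }

        // Remove rows that are no longer present in the new snapshot.
        for (int i = originalTable.Rows.Count - 1; i >= 0; i--)
        {
            object[] key = updatedKeys.Select(k => originalTable.Rows[i][k.ColumnName]).ToArray();
            if (null == updatedTable.Rows.Find(key))
            {
                originalTable.Rows.RemoveAt(i);
            }
        }
    }

    original.AcceptChanges();
}

The idea would be to call this from the refresh handler in place of MergeDataSets. Performance on thousands of rows per table is the part we would need to measure, though Rows.Find against an indexed primary key should keep the lookups cheap.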




