Thank you for your very detailed answer.
I can deal with that; besides, 10s was just an exaggeratedly high value for the example. We're probably going to go with a 1000ms (1s) interval.
It's not quite clear to me how you would want this to work differently. As you correctly expect, the RequestedUpdateRate limits the rate of changes sent by the server to the client. Even if the *first* change were sent by the server right away, what if the *next* change occurred just a second later? The server would still have to wait 9 additional seconds to satisfy the rate limit, so you would be back to the same problem.
Well, my understanding was as follows.
With RequestedUpdateRate = 10s:
Let's assume I start my subscription at time k.
Then my value would change at k+22s. I expected to get a notification at that exact time.
Then my value would change again at k+35s. I expected to get a notification at that exact time.
Then my value would change again at k+37s. I expected to get a notification at k+45s (previous change time + minimum interval of 10s).
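To make my expectation explicit, here is a tiny Python sketch of that (apparently wrong) mental model: push each change immediately, unless it falls within 10s of the previous notification, in which case defer it to previous notification + 10s. This is not any real OPC API, and the timeline and names are made up for illustration:

```python
RATE = 10  # RequestedUpdateRate in seconds
changes = [22, 35, 37]  # change times, in seconds after subscription start k

last_sent = float("-inf")  # time of the previous notification
for t in changes:
    # send at the change time, unless that violates the 10s minimum spacing
    send_at = max(t, last_sent + RATE)
    last_sent = send_at
    print(f"change at k+{t}s -> notification at k+{send_at}s")

# change at k+22s -> notification at k+22s
# change at k+35s -> notification at k+35s
# change at k+37s -> notification at k+45s
```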
Now, if I understand your answer, it should work as follows.
With RequestedUpdateRate = 10s:
Let's assume I start my subscription at time k.
Then my value would change at k+22s. I will get a notification at k+30s (k + 3x10s).
Then my value would change again at k+35s.
Then my value would change again at k+37s. I will get only one notification, at k+40s, and only if my value then differs from its value at k+22s.
To put it more briefly: the server sends a notification at the next "tick" of the timer (which has a 10s interval), not at the moment the value changes.
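Here is the same timeline under the tick-based model as I now understand it (again just an illustrative Python sketch, not real OPC code): the server samples on a fixed 10s timer and sends at most one notification per tick, and only when the value differs from the last one sent.

```python
RATE = 10  # RequestedUpdateRate in seconds
changes = {22: "A", 35: "B", 37: "C"}  # time -> new value (made-up values)

value = None     # current value on the server
reported = None  # last value actually sent to the client
for t in range(0, 51):  # walk second by second from k to k+50s
    if t in changes:
        value = changes[t]  # the item changes, but nothing is sent yet
    if t % RATE == 0 and value != reported:
        print(f"tick at k+{t}s -> notification with value {value!r}")
        reported = value

# tick at k+30s -> notification with value 'A'
# tick at k+40s -> notification with value 'C'  (the 'B' at k+35s is never reported)
```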
Am I correct?