...

Starting from the worst case in the table above, more and more data points are now set to a lower sampling rate, while the remaining data points stay unchanged. The resulting CPU load is:

| Sampling rate (ms) / Number of data points | 0 | 1000 | 2000 | 3000 | 4000 | 5000 |
|---|---|---|---|---|---|---|
| 1000 | ~67% | ~65% | ~64% | ~62% | ~60% | ~59% |
| 2000 | ~67% | ~64% | ~62% | ~59% | ~58% | ~56% |
| 5000 | ~67% | ~64% | ~61% | ~59% | ~56% | ~54% |
Note

Mixed operation with 1000 variables each at sampling rates of 500, 1000, 2000, 3000, and 4000 milliseconds (5000 variables in total) resulted in a CPU load of ~59%.

From this, a second conclusion can be drawn:

  • The CPU load can also be reduced by dividing the variables into groups with different sampling rates (see the sketch below).
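To illustrate this conclusion, the following minimal Python sketch polls several groups of variables at different sampling intervals from separate timers. It is only an illustration of the grouping idea: read_value(), the variable names, and the chosen intervals are hypothetical placeholders, not the API of the system measured above.

```python
import threading
import time

def read_value(name):
    """Hypothetical stand-in for the actual data acquisition call."""
    return 0

def poll_group(names, interval_s, stop_event):
    """Poll one group of variables at its own sampling interval."""
    while not stop_event.is_set():
        for name in names:
            read_value(name)
        stop_event.wait(interval_s)

# Variables split into groups with different sampling intervals (seconds),
# loosely mirroring the mixed operation described in the note above.
groups = {
    0.5: [f"var_{i}" for i in range(0, 1000)],
    1.0: [f"var_{i}" for i in range(1000, 2000)],
    2.0: [f"var_{i}" for i in range(2000, 3000)],
}

stop = threading.Event()
workers = [
    threading.Thread(target=poll_group, args=(names, interval, stop), daemon=True)
    for interval, names in groups.items()
]
for w in workers:
    w.start()

time.sleep(5)  # short demo period
stop.set()
for w in workers:
    w.join()
```

Because each group is only sampled at its own interval, fewer read operations occur per second overall, which is the effect reflected in the table above.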

...

This time as well, the values are recorded both with and without value changes; the sampling rate is again 500 milliseconds.



| Number of data points | Without value change | With value change |
|---|---|---|
| 1000 | ~10% | ~11% |
| 2000 | ~10% | ~12% |
| 3000 | ~11% | ~13% |
| 4000 | ~11% | ~14% |
| 5000 | ~12% | ~14% |
Tip

Even though combining data points into different groups/arrays means more effort in the actual project, this investment should be made for larger plants (see the sketch below).
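As a hedged illustration of the grouping/array idea from the tip, the following Python sketch combines many individual data points into one packed block per sampling cycle instead of handling each value as a separate record. read_value(), NUM_POINTS, and the block layout are assumptions made for this example only.

```python
import struct
import time

NUM_POINTS = 1000

def read_value(index):
    """Hypothetical acquisition of a single data point."""
    return float(index)

def sample_block():
    """Combine all data points of one group into a single packed record."""
    values = [read_value(i) for i in range(NUM_POINTS)]
    # One record (timestamp + all values) instead of 1000 individual entries.
    return struct.pack(f"<d{NUM_POINTS}f", time.time(), *values)

block = sample_block()
print(len(block), "bytes per sampling cycle")
```

Handling one combined record per cycle instead of thousands of individual entries is the kind of structural saving the tip refers to.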

...