Question

TrendMiner tag “update” overwrites history instead of appending – scalability limitation

  • February 12, 2026
  • 0 replies
  • 8 views

We are currently using the trendminer‑interface SDK to create and populate custom tags (e.g., predictions or calculated KPIs). During implementation and testing, we identified a major limitation that becomes critical at scale.

When using: 

client.io.tag.save(tag_data_dict, index=False)

to write data to an existing tag:

  • Saving a new dataset overwrites existing historical data for that tag.
  • It is not possible to append new time‑series data to an existing tag.
  • Writing data for a new time range clears previously uploaded history.

Although the documentation refers to this operation as an “update”, the actual behavior is a full overwrite of the time series, not an incremental append.
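The only safe pattern we have found is read‑merge‑write: keep the authoritative history client‑side, merge each new batch into it, and re‑upload the complete series. A minimal sketch of the merge step, using plain timestamp‑keyed dicts (illustrative only; the actual upload would still go through `client.io.tag.save` as above, and the real payload format depends on the SDK):

```python
def merge_history(history: dict, new_points: dict) -> dict:
    """Merge a batch of new time-series points into the locally held
    full history. Keys are ISO-8601 timestamps, values are tag values;
    on a timestamp collision the newer point wins."""
    merged = dict(history)               # copy so the original is untouched
    merged.update(new_points)            # append / overwrite by timestamp
    return dict(sorted(merged.items()))  # keep chronological order

history = {"2026-02-12T00:00:00Z": 1.0, "2026-02-12T01:00:00Z": 1.5}
batch = {"2026-02-12T02:00:00Z": 2.0}
full = merge_history(history, batch)
# `full` now holds all three points and must be re-uploaded in ONE
# save call, because save() replaces the entire series.
```

Note that this forces the client to hold (and re‑send) the full history on every update, which is exactly the scalability problem described below.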

Why this is a problem

Because appending is not supported, every update would require:

  • Re‑uploading the entire historical dataset for each tag
  • Maintaining full history externally

This leads to:

  • Significant performance and bandwidth overhead
  • Long execution times
  • High operational risk (partial uploads or failures can wipe history)
  • A design that does not scale for near‑real‑time or historian‑like use cases
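To put a rough number on the overhead (all figures below are illustrative assumptions, not measurements): with a few hundred tags at one‑minute resolution and a couple of years of history, every update cycle re‑sends on the order of gigabytes just to append a handful of new points:

```python
# Back-of-envelope cost of a full re-upload per update cycle.
# All figures are illustrative assumptions, not measured values.
tags = 500
points_per_tag = 2 * 365 * 24 * 60  # two years at 1-minute resolution
bytes_per_point = 20                # timestamp + value, rough estimate

total_points = tags * points_per_tag
total_mb = total_points * bytes_per_point / 1e6
print(f"{total_points:,} points, about {total_mb:,.0f} MB per update cycle")
# Roughly 10.5 GB re-sent just to append a few new points per tag.
```

Even if the per‑point size estimate is off by a factor of two in either direction, the conclusion is the same: re‑uploading full history on every update does not scale.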

Questions for the community/TrendMiner team

  1. Is there any supported way to append data to an existing tag time series (via SDK or API)?
  2. Is this overwrite behavior by design, or is append functionality planned?
  3. If append is intentionally not supported, what is the recommended scalable approach?