As an Expert license holder, I am embedding a Notebooks output into a DashHub tile via the “Pipeline” feature, and I am wondering what the documented limits are for bridging the Notebooks and DashHub environments.
I was testing the Pipeline feature to embed a visual output, specifically a plotly chart, from my Notebooks script into a tile in DashHub. I pulled in a saved View with two tags and converted them to a dataframe at one-minute granularity, which comes to about half a million data points (the tags only have data for 2025), and I can successfully plot these tags in a Notebooks output. However, when I publish the notebook to a Pipeline and configure a “Notebook output” tile in DashHub, the tile loads for ~10s and then fails with the error message: “The notebook pipeline is too big to be run by a kernel and cannot be loaded.”
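For context on the scale involved, here is a minimal runnable sketch that synthesizes a dataframe of comparable size (two tags at one-minute granularity over 2025, ~525,600 rows) and plots it with plotly. The View-loading call itself is product-specific, so randomly generated data stands in for it here:

```python
import numpy as np
import pandas as pd
import plotly.express as px

# Two tags at one-minute granularity over 2025: ~525,600 rows.
idx = pd.date_range("2025-01-01", "2026-01-01", freq="1min", inclusive="left")
df = pd.DataFrame({
    "timestamp": idx,
    # Synthetic random walks standing in for the two View tags.
    "tag_a": np.random.default_rng(0).normal(size=len(idx)).cumsum(),
    "tag_b": np.random.default_rng(1).normal(size=len(idx)).cumsum(),
})

# This renders fine in the Notebooks output; it is the DashHub tile that fails.
fig = px.line(df, x="timestamp", y=["tag_a", "tag_b"])
fig.show()
```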
For reference, in my Notebook script the View loads into the dataframe in about 120s, which doesn’t seem overly ambitious, so I am wondering at what point the dataframe becomes too large for the DashHub tile to handle.
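If the limiting factor turns out to be point count rather than load time, one mitigation I could try is downsampling in the notebook before publishing to the Pipeline; a minimal pandas sketch, reusing `df` from the snippet above:

```python
# Reduce one-minute samples to fifteen-minute means before plotting,
# cutting the point count ~15x (525,600 -> ~35,040 rows).
df_small = (
    df.set_index("timestamp")
      .resample("15min")
      .mean()
      .reset_index()
)
```

It would still be useful to know the actual limits rather than guessing at a safe resolution.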
Instead of stress-testing the feature myself, I am wondering whether there are documented hard limits for the Notebooks output tile in DashHub. Specifically:
- What is the maximum number of data points, or the maximum load time allowed, for a View in a DashHub Notebooks output tile kernel?
- About how many Views/tags could a Notebooks output tile handle in DashHub?
If the limits of this feature are known, it would be helpful to have them spelled out, or at least to have a way to determine them, as we evaluate more use cases for our users. Thanks!
