Notebooks-embedded DashHub tiles

As an Expert license holder, I am embedding a Notebooks output into a DashHub tile via the "Pipeline" feature, and I am wondering what the size limits are when bridging the Notebooks and DashHub environments.

I was testing the Pipeline feature to embed a visual output, specifically a Plotly chart, from my Notebooks script into a tile in DashHub. I pulled in a saved View with two tags, converted them to a dataframe at one-minute granularity, so about half a million data points (the tags only have data for 2025), and I am able to successfully plot these tags in a Notebooks output.

However, when I publish the notebook to a Pipeline and configure a "Notebook output" tile in DashHub, the tile loads for about 10 s and then fails with the error message: "The notebook pipeline is too big to be run by a kernel and cannot be loaded."

For reference, in my Notebook script the View loads into the dataframe in about 120 s, which doesn't seem overly ambitious, so I am wondering at what point the dataframe becomes too large.
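For reference on the data volume involved, here is a quick back-of-the-envelope sizing in plain Python (this is just illustrative arithmetic, not the actual Notebooks/Pipeline code): one sample per minute for all of 2025 gives roughly the half-million timestamps mentioned above.

```python
from datetime import datetime

# One row per minute for all of 2025 (not a leap year).
start = datetime(2025, 1, 1)
end = datetime(2026, 1, 1)
minutes_per_tag = int((end - start).total_seconds() // 60)

print(minutes_per_tag)      # 525600 timestamped rows
print(minutes_per_tag * 2)  # 1051200 values across the two tag columns
```

So the dataframe is on the order of 525,600 rows by 2 columns, which renders fine in a Notebooks output but apparently exceeds whatever the Pipeline kernel allows.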