# JobsRunsGetOutput200Response
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
notebook_output | Option<> |  | [optional]
sql_output | Option<> |  | [optional]
dbt_output | Option<> |  | [optional]
logs | Option<String> | The output from tasks that write to standard streams (stdout/stderr), such as [SparkPythonTask](https://docs.databricks.com/dev-tools/api/latest/jobs.html#/components/schemas/SparkPythonTask) and [PythonWheelTask](https://docs.databricks.com/dev-tools/api/latest/jobs.html#/components/schemas/PythonWheelTask). It is not supported for [NotebookTask](https://docs.databricks.com/dev-tools/api/latest/jobs.html#/components/schemas/NotebookTask), [PipelineTask](https://docs.databricks.com/dev-tools/api/latest/jobs.html#/components/schemas/PipelineTask), or [SparkSubmitTask](https://docs.databricks.com/dev-tools/api/latest/jobs.html#/components/schemas/SparkSubmitTask). Databricks restricts this API to return the last 5 MB of these logs. | [optional]
logs_truncated | Option<bool> | Whether the logs are truncated. | [optional]
error | Option<String> | An error message indicating why a task failed or why output is not available. The message is unstructured, and its exact format is subject to change. | [optional]
error_trace | Option<String> | If there was an error executing the run, this field contains any available stack traces. | [optional]
metadata | Option<> |  | [optional]
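
For orientation, here is a minimal sketch of how a struct for this response body might look and be deserialized with serde. This is an illustration under assumptions, not the generated code itself: the nested output fields (notebook_output, sql_output, dbt_output, metadata) are modeled as raw `serde_json::Value` because their concrete model types are not shown on this page, and the use of `serde`/`serde_json` here is an assumption about the surrounding client.

```rust
use serde::{Deserialize, Serialize};

/// Sketch of the run-output response body. Nested output models are kept
/// as raw JSON values because their concrete types are not listed here
/// (assumption for illustration only).
#[derive(Debug, Clone, Default, Serialize, Deserialize)]
pub struct JobsRunsGetOutput200Response {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub notebook_output: Option<serde_json::Value>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub sql_output: Option<serde_json::Value>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub dbt_output: Option<serde_json::Value>,
    /// Last 5 MB of stdout/stderr for tasks that write to standard streams.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub logs: Option<String>,
    /// Whether the logs are truncated.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub logs_truncated: Option<bool>,
    /// Unstructured error message; its exact format is subject to change.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub error: Option<String>,
    /// Stack traces, if an error occurred while executing the run.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub error_trace: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub metadata: Option<serde_json::Value>,
}

fn main() -> Result<(), serde_json::Error> {
    // Example payload deserialized into the sketch above.
    let body = r#"{"logs": "stdout line 1\n", "logs_truncated": false}"#;
    let output: JobsRunsGetOutput200Response = serde_json::from_str(body)?;
    if output.logs_truncated.unwrap_or(false) {
        println!("logs were truncated to the last 5 MB");
    }
    println!("logs: {:?}", output.logs);
    Ok(())
}
```

All fields are optional, so `Option` plus `skip_serializing_if` keeps absent fields out of serialized requests and tolerates their absence in responses.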