Clean Rooms

The Clean Rooms page tracks activity for Databricks Clean Rooms — secure environments where multiple organizations collaborate on shared data without exposing their underlying datasets. Notebook runs and collaboration events inside a clean room can be difficult to monitor, especially when activity spans multiple collaborators. LakeSentry surfaces notebook run counts, durations, and collaboration events to help teams understand clean room usage.

The top of the page shows headline metrics:

| Metric | What it shows |
| --- | --- |
| Active Rooms | Number of clean rooms that have not been deleted |
| Total Runs | Total notebook executions across all clean rooms |
| Successful | Number of successful notebook runs |
| Collaborators | Total number of distinct collaborating organizations |
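As a rough illustration (not LakeSentry's actual implementation), the headline metrics could be derived from a list of clean room events like this. The record shapes and field names below are assumptions for the sketch, not the real event schema:

```python
# Hypothetical event records; field names are illustrative only.
events = [
    {"room": "pricing", "type": "CLEAN_ROOM_CREATED", "collaborator": "creator"},
    {"room": "pricing", "type": "RUN_NOTEBOOK_COMPLETED", "status": "SUCCEEDED", "collaborator": "creator"},
    {"room": "pricing", "type": "RUN_NOTEBOOK_COMPLETED", "status": "FAILED", "collaborator": "collaborator"},
    {"room": "churn", "type": "CLEAN_ROOM_CREATED", "collaborator": "creator"},
    {"room": "churn", "type": "CLEAN_ROOM_DELETED", "collaborator": "creator"},
]

# Active Rooms: rooms seen in events minus rooms with a delete event.
deleted = {e["room"] for e in events if e["type"] == "CLEAN_ROOM_DELETED"}
all_rooms = {e["room"] for e in events}

# Total Runs / Successful: count completed notebook executions by status.
runs = [e for e in events if e["type"] == "RUN_NOTEBOOK_COMPLETED"]

metrics = {
    "active_rooms": len(all_rooms - deleted),
    "total_runs": len(runs),
    "successful": sum(1 for r in runs if r["status"] == "SUCCEEDED"),
    "collaborators": len({e["collaborator"] for e in events}),
}
print(metrics)
```

With the sample data above, one room remains active, two runs completed (one successfully), and two distinct collaborators appear.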

The Overview tab shows all tracked clean rooms in a table, along with a bar chart of notebook runs by room (top 10):

| Column | What it shows |
| --- | --- |
| Clean Room | Name and truncated central clean room ID |
| Status | Active or Deleted |
| Collaborators | Number of participating organizations |
| Notebook Runs | Total number of notebook executions, with a badge showing failed runs if any |
| Total Duration | Cumulative wall-clock time of all notebook runs |
| Last Activity | When the most recent event occurred, along with the event type |

The table defaults to sorting by Notebook Runs (descending). Sortable columns include:

  • Clean Room — Alphabetical by name
  • Collaborators — By collaborator count
  • Notebook Runs — By total execution count
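The sort behavior above can be sketched with plain Python sorting. The room summaries and field names here are hypothetical, chosen just to mirror the three sortable columns:

```python
# Illustrative room summaries; the field names are assumptions.
rooms = [
    {"name": "pricing", "collaborators": 2, "notebook_runs": 14},
    {"name": "churn", "collaborators": 3, "notebook_runs": 41},
    {"name": "attribution", "collaborators": 2, "notebook_runs": 5},
]

# Default sort: Notebook Runs, descending.
by_runs = sorted(rooms, key=lambda r: r["notebook_runs"], reverse=True)

# The other sortable columns.
by_name = sorted(rooms, key=lambda r: r["name"])            # alphabetical
by_collabs = sorted(rooms, key=lambda r: r["collaborators"], reverse=True)

print([r["name"] for r in by_runs])  # busiest room first
```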

The Daily Activity tab shows per-day, per-room activity metrics for the selected time range. You can filter by a specific clean room using the room_id parameter.

| Column | What it shows |
| --- | --- |
| Date | Day of recorded activity |
| Clean Room | Name of the clean room |
| Runs | Number of completed runs, with failed runs shown separately if any |
| Assets | Number of asset updates |
| Collaborators | Number of distinct collaborators active on that day |
| Avg Duration | Average notebook run duration for the day |
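The per-day, per-room rollup can be sketched as a group-by over completed-run events. This is a minimal illustration with assumed field names, not LakeSentry's actual aggregation:

```python
from collections import defaultdict

# Hypothetical completed-run events; field names are illustrative.
runs = [
    {"date": "2024-05-01", "room": "pricing", "status": "SUCCEEDED", "duration_s": 120},
    {"date": "2024-05-01", "room": "pricing", "status": "FAILED", "duration_s": 300},
    {"date": "2024-05-02", "room": "pricing", "status": "SUCCEEDED", "duration_s": 90},
]

# Group by (date, room) and accumulate run counts, failures, and duration.
daily = defaultdict(lambda: {"runs": 0, "failed": 0, "total_s": 0})
for r in runs:
    row = daily[(r["date"], r["room"])]
    row["runs"] += 1
    row["total_s"] += r["duration_s"]
    if r["status"] == "FAILED":
        row["failed"] += 1

for (date, room), row in sorted(daily.items()):
    avg = row["total_s"] / row["runs"]  # Avg Duration for the day
    print(date, room, row["runs"], row["failed"], f"{avg:.0f}s")
```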

LakeSentry ingests the following event types from the system.access.clean_room_events system table:

| Event type | What it records |
| --- | --- |
| CLEAN_ROOM_CREATED | When the clean room was created and by which collaborator |
| CLEAN_ROOM_DELETED | When the clean room was deleted |
| RUN_NOTEBOOK_STARTED | A notebook execution was initiated, including which collaborator started it |
| RUN_NOTEBOOK_COMPLETED | A notebook execution finished, including duration, status, and output schema |
| CLEAN_ROOM_ASSETS_UPDATED | Assets (tables, notebooks) were added, modified, or removed |
| ASSET_REVIEW_CREATED | An asset review was created (approvals, rejections, or auto-approvals) |
| OUTPUT_SCHEMA_DELETED | An expired output schema was cleaned up (initiated by the system) |

Each event records the initiating collaborator (identified by their alias — “creator”, “collaborator”, or a custom value) and the timestamp.
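One way to reason about these seven event types is to group them by what they affect. The grouping below is an illustration for this sketch, not LakeSentry's internal model:

```python
# The seven event types from system.access.clean_room_events, grouped
# into coarse categories. The grouping itself is an assumption.
LIFECYCLE = {"CLEAN_ROOM_CREATED", "CLEAN_ROOM_DELETED"}
RUNS = {"RUN_NOTEBOOK_STARTED", "RUN_NOTEBOOK_COMPLETED"}
ASSETS = {"CLEAN_ROOM_ASSETS_UPDATED", "ASSET_REVIEW_CREATED", "OUTPUT_SCHEMA_DELETED"}

def categorize(event_type: str) -> str:
    """Map an event type onto a coarse category (illustrative only)."""
    if event_type in LIFECYCLE:
        return "lifecycle"
    if event_type in RUNS:
        return "run"
    if event_type in ASSETS:
        return "asset"
    return "unknown"

print(categorize("RUN_NOTEBOOK_COMPLETED"))  # run
```

Returning "unknown" for unrecognized types is a deliberate choice here: system tables can gain new event types over time, and a dashboard should degrade gracefully rather than fail on them.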

To find the busiest clean rooms:

  1. Sort the clean room list by Notebook Runs (descending).
  2. Review the bar chart to see which rooms have the most executions.
  3. Check total duration to identify rooms with long-running notebooks.
  4. Review failed run counts; failed runs still consume compute time.

To track activity over time:

  1. Switch to the Daily Activity tab.
  2. Review runs, asset updates, and collaborator counts per day.
  3. Look at average run durations to spot performance changes over time.

To investigate failed runs:

  1. In the Overview tab, look for rooms with failed run badges.
  2. Switch to Daily Activity to see when failures occurred.
  3. Note the duration; long-running failures are expensive.
  4. Compare successful vs. failed run counts to identify reliability issues.
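The last comparison can be sketched as a simple failure-rate check per room. The counts and the 20% threshold below are arbitrary illustrative values, not a LakeSentry default:

```python
# Hypothetical per-room run counts; numbers are illustrative.
rooms = {
    "pricing": {"succeeded": 38, "failed": 3},
    "churn": {"succeeded": 12, "failed": 9},
}

# Flag rooms whose failure rate exceeds an arbitrary 20% threshold.
flagged = []
for name, counts in rooms.items():
    total = counts["succeeded"] + counts["failed"]
    rate = counts["failed"] / total
    if rate > 0.2:
        flagged.append(name)
    print(f"{name}: {rate:.0%} of runs failed")

print("investigate:", flagged)
```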