🤖 fix: suppress post-compaction continue notifications #3261

ammar-agent wants to merge 1 commit into main
Conversation
@codex review
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 7553b82751
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
Force-pushed 7553b82 to f02b2c4
@codex review

Addressed the auto-retry resume gap by preserving …
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: f02b2c4693
Force-pushed f02b2c4 to 4877a0c
@codex review

Addressed the source-specificity issue by marking only mid-stream auto-compaction follow-ups with …
Codex Review: Didn't find any major issues. You're on a roll.
Force-pushed 4877a0c to 830d092
@codex review

Refactored per feedback: notification-specific suppression is no longer threaded through backend stream/activity schemas or persisted metadata. The backend only marks the compaction follow-up source as …
Codex Review: Didn't find any major issues. Breezy!
Summary

Suppress notify-on-response for source-marked internal post-compaction resume turns. The policy is now owned by the transcript aggregator / response-completion layer instead of being threaded through backend stream/activity schemas. User-authored follow-up text after /compact (even exactly "Continue") still notifies normally.

Background

I reviewed the prior notification fixes, especially #2959 (🤖 fix: suppress notifications for auto-follow-up handoffs) and the earlier compaction-specific fixes #1892 and #2914. That work is not regressed: the current test suite still encodes its intended behavior, namely that compaction with a continue message should fire only ONE notification (for the continue response). The remaining gap is narrower: internal mid-stream compaction resumes are implementation-detail follow-ups, but if the workspace is backgrounded during the handoff, the activity generation update clears the stale active-stream context before the final background completion is inferred.
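The one-notification invariant above can be sketched as a toy model. The `Turn` shape and the `notificationsFor` helper are illustrative assumptions, not the real mux test suite:

```typescript
// Hypothetical model of the invariant preserved from #1892 / #2914 / #2959:
// a /compact flow with a continue message fires exactly one notification.

interface Turn {
  internal: boolean; // true for auto-compaction machinery turns
  text: string;
}

// The compaction summary turn is internal machinery; the "Continue"
// follow-up response is the one the user should hear about.
const compactionWithContinue: Turn[] = [
  { internal: true, text: "<compaction summary>" },
  { internal: false, text: "Continue" },
];

// Only non-internal turns produce a notification on response completion.
function notificationsFor(turns: Turn[]): number {
  return turns.filter((t) => !t.internal).length;
}
```

Under this model, `notificationsFor(compactionWithContinue)` is 1: exactly one notification, for the continue response.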
Implementation

- Mid-stream auto-compaction resumes are dispatched with an explicit source marker (dispatchOptions.source = "internal-resume").
- On-send compaction preserves user-authored prompts, including prompts whose text is exactly "Continue".
- The suppression policy lives in StreamingMessageAggregator and response-completion metadata, not in backend stream/activity schemas.
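The source-marked policy above might look roughly like the following sketch. Only `dispatchOptions.source = "internal-resume"` comes from the PR description; the other names (`dispatchResume`, `shouldNotifyOnResponse`, `CompletedTurn`) are assumptions, not the actual mux code:

```typescript
// Hypothetical sketch of the source-marked suppression policy; names other
// than dispatchOptions.source = "internal-resume" are illustrative.

type MessageSource = "user" | "internal-resume";

interface DispatchOptions {
  source: MessageSource;
}

interface CompletedTurn {
  source: MessageSource;
  text: string;
}

// Backend side: only the mid-stream auto-compaction resume gets the internal
// marker; every user-authored follow-up keeps source "user".
function dispatchResume(text: string, isAutoCompactionResume: boolean): CompletedTurn {
  const options: DispatchOptions = {
    source: isAutoCompactionResume ? "internal-resume" : "user",
  };
  return { source: options.source, text };
}

// Frontend side (aggregator / response-completion layer): the policy keys off
// the source marker only, never the prompt text, so a user-typed "Continue"
// still notifies.
function shouldNotifyOnResponse(turn: CompletedTurn): boolean {
  return turn.source !== "internal-resume";
}
```

Under this model, `shouldNotifyOnResponse(dispatchResume("Continue", true))` is `false` while `shouldNotifyOnResponse(dispatchResume("Continue", false))` is `true`, matching the summary's guarantee that user-typed "Continue" still notifies.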
Validation

- bun test src/browser/utils/messages/StreamingMessageAggregator.test.ts src/browser/stores/WorkspaceStore.test.ts
- make typecheck
- make static-check

Risks

Low to medium. This touches the response-completion notification policy and background activity handoff behavior, but the new suppression path is scoped to an explicit internal-resume source marker. User-authored compaction follow-ups still notify and are covered by tests.
Generated with mux • Model: openai:gpt-5.5 • Thinking: high • Cost: n/a