Systems

How I Prevent Access Rejections With Reviewable Login and Deterministic Demo State

A demo-access rejection pattern: App Review cannot validate safety if login and seeded chat states are not reviewable.

2026-03-27 · Anonymous Chat · Demo Account · Star Secret · Safety · iOS

For Star Secret, the rejection was simple and painful: review could not reliably enter a realistic conversation state, so core safety and moderation paths looked incomplete. The lesson was that demo-state engineering is part of submission readiness.

Developer forum and App Review FAQ threads repeat the same operational rule: if login is required, review credentials must be valid and feature-complete; otherwise the app is effectively unreviewable no matter how good the production architecture is.

An empty inbox does not prove messaging works. A missing conversation does not prove reporting works. A clean database does not prove moderation exists. In a normal productivity app, an empty state can still explain the product. In an anonymous chat app, an empty state can make the product look both incomplete and unsafe.

That is why I treat deterministic demo mode as engineering infrastructure, not launch decoration.

Star Secret is the kind of product where this matters. The app needs onboarding, profile boundaries, seeded conversations, block/report controls, policy links, and account deletion to be visible without requiring a real community to exist. The demo environment should exercise the same UI and safety code as production, but with controlled local or staging data.

Figure diagram: access gate → demo session (seeded identity) → seeded chats (safe examples) → UI flows → safety actions via production components (report, block, contact, delete account, policy links).
Figure 1: A deterministic demo mode is a state harness. It should make safety-critical paths testable without depending on live community activity.

Access Review Trigger Matrix (Chat and Social Apps)

Common trigger in chat/social apps | Guideline pressure | How I remove the rejection risk
Review credentials are missing, invalid, or under-provisioned | Review access requirement | Provide a stable demo account with full feature reach
Empty state hides moderation/report capabilities | Safety verification risk | Seed deterministic conversations that expose safety flows
Report/block/delete paths are not reachable in the review session | Safety and policy compliance | Ensure safety controls are reachable from first-run context
Demo mode differs from production safety behavior | Guideline 2.3.1 (accurate representation) risk | Reuse production components with controlled state only

Demo mode should not be fake UI

The easiest bad implementation is a screenshot gallery. It looks controlled, but it does not test the product.

A useful demo mode has three properties:

  1. It uses the same screens as production.
  2. It activates the same safety actions as production.
  3. It avoids fragile external state.

I model the environment explicitly:

enum RuntimeEnvironment: Equatable {
    case production(userID: UUID)
    case demo(seed: DemoSeed)
}

struct DemoSeed: Hashable {
    let personaID: UUID
    let conversations: [DemoConversation]
    let policiesVersion: String
}
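In practice the environment can be resolved once, at sign-in. A minimal sketch, simplified to reference the seed by name rather than a full `DemoSeed` value; the reviewer username is a hypothetical placeholder:

```swift
import Foundation

enum RuntimeEnvironment: Equatable {
    case production(userID: UUID)
    case demo(seedName: String)   // simplified: seed referenced by name here
}

// Hypothetical reviewer credential. The real check would live server-side
// or behind a build configuration, never as a plain string in a shipping binary.
let reviewDemoUsername = "appreview@starsecret.example"

func resolveEnvironment(username: String, userID: UUID) -> RuntimeEnvironment {
    // Case-insensitive match avoids a common review failure where
    // credentials are typed with different casing.
    username.lowercased() == reviewDemoUsername
        ? .demo(seedName: "review-v1")
        : .production(userID: userID)
}
```

Resolving this once at the access gate means no screen below it ever has to ask "am I in demo mode?" directly.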

The UI does not branch into a separate toy app. It receives a repository:

protocol ConversationRepository {
    func listConversations() async throws -> [ConversationSummary]
    func messages(in conversationID: UUID) async throws -> [ChatMessage]
    func report(conversationID: UUID, reason: ReportReason) async throws
    func block(peerID: UUID) async throws
}

Production uses a network-backed implementation. Demo mode uses a deterministic implementation that records actions locally:

actor DemoConversationRepository: ConversationRepository {
    private let seed: DemoSeed
    private var blockedPeers = Set<UUID>()
    // ReportEvent is a small local value type: conversation, reason, timestamp.
    private var reports: [ReportEvent] = []

    init(seed: DemoSeed) { self.seed = seed }

    func listConversations() async throws -> [ConversationSummary] {
        // Seeded conversations, minus any peer blocked in this session.
        seed.conversations
            .filter { !blockedPeers.contains($0.peer.id) }
            .map(ConversationSummary.init)   // assumes an init from DemoConversation
    }

    func messages(in conversationID: UUID) async throws -> [ChatMessage] {
        seed.conversations.first { $0.id == conversationID }?.messages ?? []
    }

    func report(conversationID: UUID, reason: ReportReason) async throws {
        reports.append(ReportEvent(conversationID: conversationID, reason: reason, createdAt: .now))
    }

    func block(peerID: UUID) async throws {
        blockedPeers.insert(peerID)
    }
}

This makes demo actions real enough to verify behavior, but safe enough to avoid polluting production data.
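Switching between the two implementations can then be a single decision at composition time. A sketch with the protocol requirements elided and a hypothetical `RemoteConversationRepository` standing in for the network-backed implementation:

```swift
import Foundation

enum RuntimeEnvironment {
    case production(userID: UUID), demo
}

// Requirements elided for brevity; see the ConversationRepository protocol above.
protocol ConversationRepository {}

actor DemoConversationRepository: ConversationRepository {}

// Placeholder name for the production, network-backed implementation.
final class RemoteConversationRepository: ConversationRepository {
    let userID: UUID
    init(userID: UUID) { self.userID = userID }
}

// One switch at app composition; every screen downstream sees only the protocol.
func makeConversationRepository(for environment: RuntimeEnvironment) -> any ConversationRepository {
    switch environment {
    case .production(let userID):
        return RemoteConversationRepository(userID: userID)
    case .demo:
        return DemoConversationRepository()
    }
}
```

Because the branch lives in one factory, there is no per-screen "if demo" logic to drift out of sync with production behavior.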

The state graph matters more than the login screen

Many developers think demo access means "provide a login." For anonymous or social apps, login is only the first edge in the graph.

The important states are:

State | What must be visible
No conversations | explanation and safe next action
Existing conversation | report, block, profile boundary
Blocked user | removed or restricted interaction
Report submitted | confirmation and support route
Account settings | delete account, policy, contact

I keep the demo data small but complete:

struct DemoConversation {
    let id: UUID
    let peer: DemoPeer
    let messages: [ChatMessage]
    let availableActions: Set<SafetyAction>
}

enum SafetyAction: String, Codable {
    case reportConversation
    case blockPeer
    case openGuidelines
    case contactSupport
}

The seed should include normal content, awkward content, and boundary content. It should not include extreme content just to prove the system is edgy. The goal is to demonstrate controls, not to create shock.
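A seed along those lines might look like the following sketch; the supporting types are restated in simplified form, and the persona names and message text are illustrative only:

```swift
import Foundation

struct DemoPeer { let id: UUID; let displayName: String }
struct ChatMessage { let id: UUID; let text: String }

enum SafetyAction: String, Codable {
    case reportConversation, blockPeer, openGuidelines, contactSupport
}

struct DemoConversation {
    let id: UUID
    let peer: DemoPeer
    let messages: [ChatMessage]
    let availableActions: Set<SafetyAction>
}

// Illustrative seed: one normal chat, one awkward chat, one boundary-testing
// chat. Every conversation exposes the full safety action set.
func makeReviewSeedConversations() -> [DemoConversation] {
    let allActions: Set<SafetyAction> = [.reportConversation, .blockPeer, .openGuidelines, .contactSupport]
    func conversation(peer: String, lines: [String]) -> DemoConversation {
        DemoConversation(
            id: UUID(),
            peer: DemoPeer(id: UUID(), displayName: peer),
            messages: lines.map { ChatMessage(id: UUID(), text: $0) },
            availableActions: allActions
        )
    }
    return [
        conversation(peer: "Nova", lines: ["Hi! How was your day?", "Mine was quiet."]),
        conversation(peer: "Echo", lines: ["Why won't you answer me?"]),      // awkward tone
        conversation(peer: "Vesper", lines: ["Can I ask where you live?"])    // boundary-testing
    ]
}
```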

Safety controls must be reachable from context

A common mistake is putting safety features only in Settings. Settings is necessary, but it is not enough. A user needs to report or block at the moment they encounter the problematic interaction.

For a conversation screen, the action menu should be contextual:

struct ConversationSafetyMenu: View {
    let peerID: UUID
    let conversationID: UUID
    let actions: ConversationSafetyActions

    var body: some View {
        Menu {
            Button("Report Conversation", role: .destructive) {
                actions.report(conversationID)
            }
            Button("Block User", role: .destructive) {
                actions.block(peerID)
            }
            Button("Community Guidelines") {
                actions.openGuidelines()
            }
        } label: {
            Image(systemName: "ellipsis.circle")
        }
    }
}

In demo mode, these buttons should still execute. A disabled report button teaches nothing.
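That property is easy to check in a unit test. A sketch with a simplified synchronous recorder standing in for the demo repository (the actor version above would be awaited inside an async test):

```swift
import Foundation

enum ReportReason: String { case harassment, spam, other }

struct ReportEvent {
    let conversationID: UUID
    let reason: ReportReason
}

// Simplified synchronous stand-in for the demo repository: just enough
// to assert that the menu's safety actions actually record state.
final class DemoSafetyRecorder {
    private(set) var reports: [ReportEvent] = []
    private(set) var blockedPeers = Set<UUID>()

    func report(conversationID: UUID, reason: ReportReason) {
        reports.append(ReportEvent(conversationID: conversationID, reason: reason))
    }

    func block(peerID: UUID) {
        blockedPeers.insert(peerID)
    }
}
```

Wiring the menu's closures to a recorder like this gives a fast regression signal: if a refactor silently disconnects the report button, the test fails before review does.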

Empty states are part of the system

In anonymous chat apps, an empty state can be misread as a broken backend. It can also hide the safety model.

The empty state should explain both the product and the boundary:

No conversations yet.
Start a private chat when you are ready. Report and block controls are available inside every conversation.

That text is not only copy. It is state documentation for the user.

Release-readiness takeaways

The more a product depends on social state, the more it needs deterministic state for testing.

For anonymous chat, demo mode should be built like a harness:

  1. Seed representative conversations.
  2. Use production UI components.
  3. Exercise report and block paths.
  4. Keep actions deterministic and isolated.
  5. Make empty, active, blocked, and reported states visible.
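The last item can even be enforced mechanically before submission. A sketch, assuming a hypothetical `DemoStateCoverage` check run in CI or from a debug menu:

```swift
// Hypothetical pre-submission check: the demo seed must surface every
// review-critical state from the checklist above.
enum ReviewState: CaseIterable {
    case empty, activeConversation, blockedUser, reportSubmitted, accountSettings
}

struct DemoStateCoverage {
    let reachableStates: Set<ReviewState>

    var missingStates: [ReviewState] {
        ReviewState.allCases.filter { !reachableStates.contains($0) }
    }

    var isSubmissionReady: Bool { missingStates.isEmpty }
}
```

A failing check names the exact state the seed forgot, which is far cheaper than learning the same fact from a rejection.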

This makes the app easier to evaluate, but more importantly, it makes the product safer to develop. If the safety path cannot be demonstrated without a real user on the other side, the architecture is not finished.