Ask AI
Ask AI is the primary conversational AI interface in TestRelic. It provides a full-page chat experience where you can ask questions about your test suite, request analysis, and generate structured artifacts — all using natural language.

Growth plan required

Ask AI requires the Growth plan. See Plans & Billing.

Accessing Ask AI

  • Click Ask AI in the left sidebar navigation.
  • The URL format is /ai for a new conversation or /ai/:conversationId to resume an existing conversation.

Starting a conversation

Type your question or request in the input box and press Enter. The AI responds with a streaming reply — text appears token-by-token as it is generated.

Example prompts

Analyze recent failures:

  What are the most common failure patterns in the last 7 days
  across the @staging repository?

Generate a test plan:

  Create a test plan for the checkout flow covering happy path,
  payment failure, and out-of-stock scenarios.

Ask about a specific run:

  Summarize the failures in @run-20240315 and suggest which ones
  are likely infrastructure issues vs real bugs.

Attaching context with @ mentions

The @ context picker lets you attach specific entities to your message so the AI can answer with precision. Supported context types:

Context type      What it is
------------      ----------
@repo             A specific repository
@test_run         A specific test run by ID
@test_case        A specific test case
@branch           A Git branch
@suite            A test suite grouping
@tag              A test tag label
@environment      A deployment environment
@integration      A connected integration

Type @ in the input box to open the context picker and search for the entity you want to attach.

Conversation history

All conversations are saved automatically. The Chat Sidebar on the left of the Ask AI page lists all your previous conversations. From there you can:

  • Resume any past conversation.
  • Rename a conversation for easier reference.
  • Delete conversations you no longer need.

Conversations are private to your user account within the organization.

Message feedback

You can rate any AI response using the thumbs up / thumbs down feedback controls on each message. This feedback helps improve the AI's responses over time.

AI artifacts

Some requests cause the AI to generate a structured artifact in addition to the text response. Artifacts appear in a dedicated panel to the right of the chat. The supported artifact types are:

Artifact type       Description
-------------       -----------
dashboard           A rendered analytics dashboard with charts and metrics
report              A formatted test quality or coverage report
test_plan           A structured test plan with scenarios and steps
presentation        A slide-style summary for sharing with stakeholders
code                A code snippet (e.g. a new test, a reporter config)
data_table          A tabular breakdown of test data
chart               An inline data visualization
navigation_paths    A test navigation path map

See AI Insights & Artifacts for more detail on individual artifact types.

Streaming and tool use

Behind the scenes, the AI uses tool calling to fetch data from your repositories, runs, and test cases. While a tool call is running, the UI shows an indicator. The result of the tool call is incorporated into the final AI response — you do not need to take any action.

The streaming connection uses Server-Sent Events (SSE). If the connection is interrupted, the platform automatically retries and resumes streaming.
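In the SSE wire format, each event arrives as optional `event:` and one or more `data:` lines, with a blank line terminating the event. The sketch below parses a raw SSE stream into events and reassembles the streamed text; the event names (`token`, `done`) are illustrative assumptions, not TestRelic's documented schema:

```typescript
// Minimal SSE frame parser. Event names ("token", "done") are hypothetical
// examples — TestRelic's actual event schema is not specified in these docs.
interface SseEvent {
  event: string;
  data: string;
}

function parseSseStream(raw: string): SseEvent[] {
  const events: SseEvent[] = [];
  // Per the SSE specification, a blank line terminates each event.
  for (const block of raw.split("\n\n")) {
    let event = "message"; // default event type when no "event:" field is sent
    const dataLines: string[] = [];
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) dataLines.push(line.slice(5).trim());
    }
    // Events with no data field are ignored.
    if (dataLines.length > 0) events.push({ event, data: dataLines.join("\n") });
  }
  return events;
}

// Example: two token events followed by a done event.
const raw =
  "event: token\ndata: Hel\n\n" +
  "event: token\ndata: lo\n\n" +
  "event: done\ndata: {}\n\n";
const events = parseSseStream(raw);
const text = events
  .filter((e) => e.event === "token")
  .map((e) => e.data)
  .join("");
console.log(text); // "Hello"
```

In a browser, the built-in EventSource API handles this parsing and the automatic reconnection described above; the sketch only illustrates what travels over the wire.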