Ask AI
Ask AI is the primary conversational AI interface in TestRelic. It provides a full-page chat experience where you can ask questions about your test suite, request analysis, and generate structured artifacts — all using natural language.
Ask AI requires the Growth plan. See Plans & Billing.
Accessing Ask AI
- Click Ask AI in the left sidebar navigation.
- The URL format is `/ai` for a new conversation, or `/ai/:conversationId` to resume an existing conversation.
Starting a conversation
Type your question or request in the input box and press Enter. The AI responds with a streaming reply — text appears token-by-token as it is generated.
Example prompts
What are the most common failure patterns in the last 7 days
across the @staging repository?
Create a test plan for the checkout flow covering happy path,
payment failure, and out-of-stock scenarios.
Summarize the failures in @run-20240315 and suggest which ones
are likely infrastructure issues vs real bugs.
Attaching context with @ mentions
The @ context picker lets you attach specific entities to your message so the AI can answer with precision. Supported context types:
| Context type | What it is |
|---|---|
| `@repo` | A specific repository |
| `@test_run` | A specific test run by ID |
| `@test_case` | A specific test case |
| `@branch` | A Git branch |
| `@suite` | A test suite grouping |
| `@tag` | A test tag label |
| `@environment` | A deployment environment |
| `@integration` | A connected integration |
Type @ in the input box to open the context picker and search for the entity you want to attach.
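Conceptually, a mention is just an `@` token embedded in your prompt text (e.g. `@staging` or `@run-20240315`) that the platform resolves to a concrete entity. As an illustration only, a minimal sketch of extracting such tokens from a prompt might look like this (hypothetical helper, not TestRelic's actual implementation):

```typescript
// Hypothetical helper: pull @ mentions (e.g. "@staging", "@run-20240315")
// out of a prompt string so they can be resolved to context entities.
// Illustrative sketch only; not TestRelic's actual parser.
function extractMentions(prompt: string): string[] {
  // Mentions are @ followed by word characters or hyphens.
  const matches = prompt.match(/@[\w-]+/g);
  return matches ? matches.map((m) => m.slice(1)) : [];
}

const prompt =
  "Summarize the failures in @run-20240315 across the @staging repository.";
extractMentions(prompt); // ["run-20240315", "staging"]
```

In practice you never write this yourself — the context picker inserts and resolves mentions for you — but it clarifies why entity names with hyphens and digits work as mention targets.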
Conversation history
All conversations are saved automatically. The Chat Sidebar on the left of the Ask AI page lists all your previous conversations. From there you can:
- Resume any past conversation.
- Rename a conversation for easier reference.
- Delete conversations you no longer need.
Conversations are private to your user account within the organization.
Message feedback
You can rate any AI response using the thumbs up / thumbs down feedback controls on each message. This feedback helps improve the AI's responses over time.
AI artifacts
Some requests cause the AI to generate a structured artifact in addition to the text response. Artifacts appear in a dedicated panel to the right of the chat. The supported artifact types are:
| Artifact type | Description |
|---|---|
| `dashboard` | A rendered analytics dashboard with charts and metrics |
| `report` | A formatted test quality or coverage report |
| `test_plan` | A structured test plan with scenarios and steps |
| `presentation` | A slide-style summary for sharing with stakeholders |
| `code` | A code snippet (e.g. a new test, a reporter config) |
| `data_table` | A tabular breakdown of test data |
| `chart` | An inline data visualization |
| `navigation_paths` | A test navigation path map |
See AI Insights & Artifacts for more detail on individual artifact types.
Streaming and tool use
Behind the scenes, the AI uses tool calling to fetch data from your repositories, runs, and test cases. While a tool call is running, the UI shows an indicator. The result of the tool call is incorporated into the final AI response — you do not need to take any action.
The streaming connection uses Server-Sent Events (SSE). If the connection is interrupted, the platform automatically retries and resumes streaming.
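For context, an SSE stream is plain UTF-8 text: each event is a run of `data:` lines terminated by a blank line, which is why partial text can be rendered token-by-token as chunks arrive. The sketch below parses that wire format incrementally. It is an illustrative example of how SSE framing works in general, not TestRelic's client code:

```typescript
// Minimal incremental SSE parser (illustrative; not TestRelic's client).
// Feed it raw chunks as they arrive; it returns the data payloads of any
// events completed by the chunk and buffers partial events until the
// terminating blank line shows up.
class SSEParser {
  private buffer = "";

  push(chunk: string): string[] {
    this.buffer += chunk;
    const events: string[] = [];
    let sep: number;
    // A blank line ("\n\n") terminates each event.
    while ((sep = this.buffer.indexOf("\n\n")) !== -1) {
      const raw = this.buffer.slice(0, sep);
      this.buffer = this.buffer.slice(sep + 2);
      const data = raw
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => line.slice(5).trimStart())
        .join("\n");
      if (data) events.push(data);
    }
    return events;
  }
}

const parser = new SSEParser();
parser.push("data: Hel");             // [] (event not yet complete)
parser.push("lo\n\ndata: world\n\n"); // ["Hello", "world"]
```

In the browser, the built-in `EventSource` API handles this framing — and automatic reconnection — for you, which is consistent with the retry behavior described above.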