Ask AI
Ask AI is the primary conversational AI interface in TestRelic. It provides a full-page chat experience where you can ask questions about your test suite, request analysis, and generate structured artifacts — all using natural language.
Ask AI requires the Growth plan. See Plans & Billing.
Accessing Ask AI
- Click Ask AI in the left sidebar navigation.
- The URL format is `/ai` for a new conversation or `/ai/:conversationId` to resume an existing conversation.
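The two URL shapes above can be captured in a small helper. This is an illustrative sketch, not part of the TestRelic API; the function name is hypothetical.

```typescript
// Hypothetical helper illustrating the two Ask AI URL shapes.
function conversationUrl(conversationId?: string): string {
  // A new conversation lives at /ai; an existing one at /ai/:conversationId.
  return conversationId ? `/ai/${conversationId}` : "/ai";
}
```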
Starting a conversation
Type your question or request in the input box and press Enter. The AI responds with a streaming reply — text appears token-by-token as it is generated.
Example prompts
What are the most common failure patterns in the last 7 days
across the @staging repository?
Create a test plan for the checkout flow covering happy path,
payment failure, and out-of-stock scenarios.
Summarize the failures in @run-20240315 and suggest which ones
are likely infrastructure issues vs real bugs.
Attaching context with @ mentions
The @ context picker lets you attach specific entities to your message so the AI can answer with precision. Supported context types:
| Context type | What it is |
|---|---|
| @repo | A specific repository |
| @test_run | A specific test run by ID |
| @test_case | A specific test case |
| @branch | A Git branch |
| @suite | A test suite grouping |
| @tag | A test tag label |
| @environment | A deployment environment |
| @integration | A connected integration |
Type @ in the input box to open the context picker and search for the entity you want to attach.
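Conceptually, a message with attached context carries both the text and a list of structured context items. The shapes below are an illustrative sketch (the field names and payload structure are assumptions, not TestRelic's actual wire format); only the eight context types come from the table above.

```typescript
// Hypothetical shapes for a chat message with attached @ context.
type ContextType =
  | "repo" | "test_run" | "test_case" | "branch"
  | "suite" | "tag" | "environment" | "integration";

interface ContextItem {
  type: ContextType; // one of the supported context types
  id: string;        // the entity identifier chosen in the picker
}

interface ChatMessage {
  text: string;
  context: ContextItem[];
}

const message: ChatMessage = {
  text: "Summarize the failures in @run-20240315",
  context: [{ type: "test_run", id: "run-20240315" }],
};
```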
Conversation history
All conversations are saved automatically. The Chat Sidebar on the left of the Ask AI page lists all your previous conversations. From there you can:
- Resume any past conversation.
- Rename a conversation for easier reference.
- Delete conversations you no longer need.
Conversations are private to your user account within the organization.
Message feedback
You can rate any AI response using the thumbs up / thumbs down feedback controls on each message. This feedback helps improve the AI's responses over time.
AI artifacts
Some requests cause the AI to generate a structured artifact in addition to the text response. Artifacts appear in a dedicated panel to the right of the chat. The supported artifact types are:
| Artifact type | Description |
|---|---|
| dashboard | A rendered analytics dashboard with charts and metrics |
| report | A formatted test quality or coverage report |
| test_plan | A structured test plan with scenarios and steps |
| presentation | A slide-style summary for sharing with stakeholders |
| code | A code snippet (e.g. a new test, a reporter config) |
| data_table | A tabular breakdown of test data |
| chart | An inline data visualization |
| navigation_paths | A test navigation path map |
See AI Insights & Artifacts for more detail on individual artifact types.
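If you render artifacts in your own tooling, the type column above maps naturally onto a discriminated union. This is a sketch under assumptions: the `Artifact` interface and the `panelFor` routing function are hypothetical; only the eight type names come from the table.

```typescript
// Hypothetical union over the artifact types listed above.
type ArtifactType =
  | "dashboard" | "report" | "test_plan" | "presentation"
  | "code" | "data_table" | "chart" | "navigation_paths";

interface Artifact {
  type: ArtifactType;
  title: string;
  payload: unknown; // structure varies per artifact type
}

// Route an artifact to an appropriate panel renderer by its type.
function panelFor(artifact: Artifact): string {
  switch (artifact.type) {
    case "code":
      return "code-viewer";
    case "chart":
    case "dashboard":
      return "chart-panel";
    default:
      return "document-panel";
  }
}
```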
Streaming and tool use
Behind the scenes, the AI uses tool calling to fetch data from your repositories, runs, and test cases. While a tool call is running, the UI shows an indicator. The result of the tool call is incorporated into the final AI response — you do not need to take any action.
The streaming connection uses Server-Sent Events (SSE). If the connection is interrupted, the platform automatically retries and resumes streaming.
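To make the SSE mechanics concrete, here is a minimal parser for event-stream frames. The `data:` field and blank-line frame separator follow the SSE specification; the payload contents are assumptions, not TestRelic's actual wire format.

```typescript
// Minimal sketch: extract the data payload from each SSE frame in a chunk.
// Per the SSE spec, frames are separated by a blank line and data lines
// begin with "data: "; multiple data lines in one frame join with "\n".
function parseSSE(chunk: string): string[] {
  return chunk
    .split("\n\n")                       // frames end with a blank line
    .filter((frame) => frame.length > 0)
    .map((frame) =>
      frame
        .split("\n")
        .filter((line) => line.startsWith("data: "))
        .map((line) => line.slice("data: ".length))
        .join("\n")
    );
}
```

In the browser, `EventSource` handles this framing and the automatic reconnection for you; a hand-rolled parser like this is only needed when reading the stream via `fetch`.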
Memory
Ask AI can remember context from past sessions so that follow-up conversations feel continuous. Memory is controlled by a toggle in the composer bar — it is on by default.
How memory works
Memory operates in three layers. The layers are independent and stack on top of each other:
| Layer | What it contains | Always active? |
|---|---|---|
| Current thread | The last 20 messages of the open conversation, loaded from the database | Yes — unaffected by the toggle |
| Recent conversation history | Short excerpts pulled from your other recent conversations across all threads | Only when Memory is on |
| Long-term memory | Durable facts about your preferences, recurring topics, and working patterns, extracted by the AI and stored across sessions | Only when Memory is on and enabled on your instance |
When Memory is on, all three layers are combined into the system context sent to the AI before each reply. When you turn Memory off, only the current thread history is used.
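The layering above can be sketched as a simple context builder. The interface and function names here are illustrative assumptions; only the three layers and the toggle behavior come from the description.

```typescript
// Hypothetical sketch of combining the three memory layers.
interface MemoryLayers {
  currentThread: string[]; // last 20 messages; always included
  recentHistory: string[]; // excerpts from other recent conversations
  longTermFacts: string[]; // durable facts extracted across sessions
}

function buildContext(layers: MemoryLayers, memoryOn: boolean): string[] {
  // With Memory off, only the current thread history is used.
  if (!memoryOn) return layers.currentThread;
  // With Memory on, all three layers stack into the system context.
  return [
    ...layers.longTermFacts,
    ...layers.recentHistory,
    ...layers.currentThread,
  ];
}
```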
Enabling and disabling memory
The Memory toggle appears:
- In the Ask AI composer bar at the bottom of the full-page chat.
- In the AI Assistant panel (the floating assistant available on every page).
Toggle it off for any conversation where you want the AI to respond without prior-session context — for example, when troubleshooting an isolated issue or starting fresh on a new project.
Memory is scoped to your user account. Colleagues in the same organization cannot see or access your memory context.
Ask AI Apps
Ask AI Apps are third-party service integrations — such as Slack, Gmail, GitHub, and Jira — that you connect to Ask AI. Once connected, the AI can call actions in those services directly from the chat: look up Slack threads, draft emails, open Jira issues, and more.
Connecting an app
- Go to Settings → Integrations → Ask AI Apps.
- Browse the available app catalog and click the app you want to connect.
- Complete the OAuth authorization flow for that service.
- The app appears as Connected in the list and is now available in Ask AI.
You can manage connected apps (view status, disconnect) from the same page at any time.
Using an app in a conversation
Once an app is connected, you can bring it into any Ask AI conversation in two ways:
- @ context picker — Type @ in the composer, search for the app by name, and select it from the list. The app is attached to your message as a composio_toolkit context item.
- Apps control in the composer bar — Click the apps icon in the composer bar to open the apps picker, then select the connected app.
After attaching an app, the AI can call its tools when it determines an action is relevant to your request. You can attach multiple apps to a single message.
How app tool calls work
When an app is attached, the AI is given access to its tools through an MCP server backed by the Composio integration layer. If the MCP path is unavailable, the platform falls back to direct Anthropic tool definitions for the same app. In both cases, the tool call is transparent: the UI shows a tool-call indicator while the action runs, and the result is incorporated into the AI's reply.
Connecting a Jira app alongside your test context lets Ask AI open tickets, query existing issues, and link failures directly — without leaving the chat.
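The MCP-first-then-fallback behavior can be sketched as a try/catch over two tool-loading paths. Both fetcher signatures are assumptions for illustration; only the preference order (MCP first, direct tool definitions as fallback) comes from the description above.

```typescript
// Sketch: prefer the MCP-backed tool path, fall back to direct
// tool definitions for the same app if the MCP path is unavailable.
function loadAppTools(
  fetchViaMcp: () => string[],
  fetchDirect: () => string[]
): { source: "mcp" | "direct"; tools: string[] } {
  try {
    return { source: "mcp", tools: fetchViaMcp() };
  } catch {
    return { source: "direct", tools: fetchDirect() };
  }
}
```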
Workspace vs personal connections
Some apps support a workspace scope (shared across your organization's Ask AI sessions) and a personal scope (visible only to you). The scope is shown when you initiate the connection and can be reviewed on the Ask AI Apps settings page.