JenniAI

JenniAI is a research and writing tool for students and career academics. I joined the team in the summer of 2022 and have led design ever since. The citation feature discussed in this case study unlocked a new part of the user journey: research discovery.

2022-23

Research

Interaction Design

Growth

Increased conversion rate by 65%

Retention

Improved day-10 retention by 24%

Feature in Production

Watch the video below for a look at the final version.

The Problem

After a market pivot, our product had to cater to the needs of researchers and postgraduate academics. Generative AI empowers ESL (English as a second language) students to translate their knowledge into clear, coherent pieces of writing.


However, generative AI has constraints that can lead to problematic outcomes in academia:


  • AI tools often hallucinate 'sources', eroding trust in the system and potentially leading the user to false information.

  • Our model would sometimes suggest a source to users, e.g. (James, 2021). However, the user had no clear path to verifying the legitimacy of the source; in fact, sometimes it did not exist at all.

  • Researchers must be able to back-up claims with relevant, high-quality sources.

The Solution

Focusing on redesigning the experience for new users, we prioritised the following in our design decisions:


  1. Guiding new users to the correct difficulty level

  2. Sharing the benefits of taking part in the challenge ahead of the first session

  3. Moving reminders from a 'paid' to a 'free' feature. Encouraging repeat use and consistency is an important lever to increase paid conversions.

User Research

As a 5-person start-up we adhered to Lean UX principles to optimize for speed of learning. This case study documents multiple build-measure-learn loops. User interviews served as the primary driver of insights.


  • Synthesis of the key outcomes
  • Understanding which factors users value when assessing a potential citation
  • Event-driven user text chats via Intercom
  • Google’s human-centred AI guidelines as a reference for best practices

Primary Frustration

When the AI hallucinates a citation, it creates more work for the user, forcing them to search for the source to validate its legitimacy.

Secondary Frustration

Users often already have an appropriate source but currently have no way to add it to their text or bibliography.

"How might we create an intuitive citation experience that allows researchers to verify the sources that our AI model has consulted?"

Design Iterations

I cycled through multiple design iterations, regularly sharing prototypes with our community of users for feedback.

Version 1

The system decided when to show a citation automatically (confidence threshold of at least 90%). Users could toggle citations on/off inside content settings.
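The V1 gating logic described above can be sketched roughly as follows. All names, shapes, and the confidence scale here are illustrative assumptions, not JenniAI's actual implementation:

```typescript
// Hypothetical sketch of the V1 behaviour: a citation is only surfaced
// when the model's self-reported confidence clears a fixed threshold,
// and the user can disable citations entirely via a settings toggle.
interface CitationSuggestion {
  source: string;      // e.g. "(James, 2021)"
  confidence: number;  // 0..1, model-reported confidence (assumed scale)
}

const CONFIDENCE_THRESHOLD = 0.9; // the ~90% threshold from the case study

function shouldShowCitation(
  suggestion: CitationSuggestion,
  citationsEnabled: boolean // the content-settings toggle
): boolean {
  return citationsEnabled && suggestion.confidence >= CONFIDENCE_THRESHOLD;
}
```

Note how the user's only lever here is a global on/off switch, which is exactly the lack of per-citation control the issues below call out.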

Issues

  • Not enough user control: citations appeared unpredictably

  • Violated the principle of user control and human-in-the-loop

  • The on/off setting had discoverability issues

Version 2

Allowing users to highlight a block of text and find citations that were used as part of the generative process.

Issues

  • Stopping to highlight breaks the user's writing flow

  • What if a block of text is edited after the citation is added? A user might change the sentiment of an argument, thus rendering the citation incorrect

  • Citations need to come from journals for Masters and PhD students

Final Version

A keyboard shortcut (@) surfaces a list of sources related to the preceding text block. Users can filter results by 'web' or 'journals'.


Manual citations allow users to add their own pre-found sources and format them with one click.


Users can add a placeholder: a mental reminder to come back later that keeps the writer in flow.
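The source lookup behind the @ shortcut could be sketched as a filter-then-rank step over candidate sources. The types, field names, and relevance scale below are assumptions for illustration, not JenniAI's actual API:

```typescript
// Hypothetical sketch of the final version's citation lookup: the list of
// candidate sources for the preceding text block can be narrowed to 'web'
// or 'journals' (the journal filter matters for Masters & PhD students),
// then ordered by relevance.
type SourceKind = "web" | "journals";

interface Source {
  title: string;
  kind: SourceKind;
  relevance: number; // 0..1, higher is more relevant (assumed scale)
}

function findCitations(sources: Source[], filter?: SourceKind): Source[] {
  return sources
    .filter((s) => filter === undefined || s.kind === filter)
    .sort((a, b) => b.relevance - a.relevance); // most relevant first
}
```

Keeping the filter optional preserves the default "show everything" behaviour while giving academic users a one-step path to journal-only results.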

Key Lessons Learned

Focus on past behaviours

During user interviews I resisted the urge to hint at possible solutions. Instead, I focused on understanding the detailed journey users take to find and integrate research studies into their writing. "Tell me about the last time you…"

Data to guide future improvements

Qualitative inquiry enabled me to uncover the root 'why' behind a user need. However, after shipping the first version of the feature, leaning heavily on product analytics helped us iterate on our citation ranking system to optimize result quality.

Unlock users' existing libraries

Researchers have often already done much of the heavy lifting through prior research and synthesis. Exploring interoperability with their existing data sources can help improve citation quality.

Challenge assumptions & iterate

Never be afraid to revisit the problem statement as more information is unveiled. Being truly iterative means being brave enough to change your assumptions rather than pushing forward on a flawed insight.

UI consistency is fundamental to functional UX

Do not introduce a new UI style arbitrarily. Focus on design consistency throughout the entire product experience.

Visit site