
Knowledge Q&A

The Knowledge Q&A feature lets you ask natural language questions and receive answers drawn directly from your saved content. It is powered by retrieval-augmented generation (RAG) using Google Gemini.

How It Works

When you ask a question, Mindweave follows a multi-step process to find the best answer (a simplified code sketch of this pipeline follows the list):

  1. Query embedding — Your question is converted into a vector embedding using Google Gemini's text-embedding-004 model.
  2. Retrieval — The most relevant content from your knowledge base is found using vector similarity search (pgvector).
  3. Context building — The top matching content items are assembled as context for the AI.
  4. Generation — Google Gemini generates a natural language answer using only your retrieved content as its source material.
  5. Citation — The answer includes references to the source content so you can verify and explore further.
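To make the five steps concrete, here is a minimal TypeScript sketch of such a pipeline. It assumes the official @google/generative-ai SDK, a node-postgres connection, and hypothetical table and column names (content_items, embedding, title, body) as well as an assumed generation model; it is an illustration of the approach, not Mindweave's actual implementation.

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";
import { Pool } from "pg";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

async function answerQuestion(question: string, userId: string) {
  // 1. Query embedding — convert the question into a vector with text-embedding-004.
  const embedder = genAI.getGenerativeModel({ model: "text-embedding-004" });
  const { embedding } = await embedder.embedContent(question);
  const queryVector = `[${embedding.values.join(",")}]`; // pgvector literal format

  // 2. Retrieval — nearest-neighbour search over stored embeddings using pgvector.
  const { rows } = await pool.query(
    `SELECT id, title, body
       FROM content_items
      WHERE user_id = $1
      ORDER BY embedding <=> $2
      LIMIT 5`,
    [userId, queryVector]
  );

  // 3. Context building — assemble the top matches into one context block.
  const context = rows
    .map((r, i) => `[${i + 1}] ${r.title}\n${r.body}`)
    .join("\n\n");

  // 4. Generation — ask Gemini to answer using only the retrieved content.
  const generator = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });
  const result = await generator.generateContent(
    `Answer the question using only the context below. Cite sources by their [number].\n\n` +
      `Context:\n${context}\n\nQuestion: ${question}`
  );

  // 5. Citation — return the answer together with the source items it drew from.
  return { answer: result.response.text(), sources: rows };
}
```

The `<=>` operator performs a cosine-distance ordering in pgvector, so the five rows returned are the items most semantically similar to your question.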

Using Knowledge Q&A

Access the Q&A feature from the Ask page in your dashboard. Type your question in natural language, just as you would ask a colleague.

Example questions:

  • "What notes do I have about React performance optimization?"
  • "Summarize my bookmarks related to machine learning."
  • "What were the key takeaways from my project retrospective?"
  • "Which links did I save about database indexing strategies?"

Tips for Good Questions

  • Be specific — "What are my notes about PostgreSQL indexing?" works better than "database stuff".
  • Ask about topics you have content on — The Q&A can only answer based on what you've saved. It won't make up information.
  • Use follow-up questions — If the first answer is too broad, ask a more targeted follow-up.
  • Check the sources — Review the cited content to verify the answer and discover related material.

Limitations

  • Answers are only as good as the content in your knowledge base.
  • Very recent content may not yet have embeddings generated.
  • Questions about topics with no matching content will return a "not enough context" message (see the sketch after this list).
  • For simple lookups, consider using search instead, which is faster for direct queries.
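The "not enough context" behaviour typically comes from a relevance check on the retrieval step: if even the closest match is too dissimilar to the question, the system declines to generate an answer rather than letting the model guess. The sketch below illustrates one way such a guard could work; the distance cutoff and the result shape are assumptions for illustration, not documented Mindweave internals.

```typescript
interface Match {
  id: string;
  distance: number; // cosine distance returned by the pgvector query
}

// Assumed threshold — a real system would tune this value empirically.
const MAX_DISTANCE = 0.5;

function buildAnswerOrFallback(matches: Match[]): { ok: boolean; message?: string } {
  // If nothing was retrieved, or the best match is still too far from the
  // question, refuse to generate and surface the fallback message instead.
  if (matches.length === 0 || matches[0].distance > MAX_DISTANCE) {
    return {
      ok: false,
      message: "Not enough context in your knowledge base to answer this question.",
    };
  }
  return { ok: true };
}
```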