
    Embedding

    A numerical vector representation of text that captures semantic meaning, enabling similarity search and AI understanding.

    What is an Embedding?

    An embedding is a dense vector (list of numbers) that represents a piece of text in a high-dimensional space. Texts with similar meanings have embeddings that are close together in this space.
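    "Close together" is usually measured with cosine similarity. The sketch below uses tiny hand-made 4-dimensional vectors purely for illustration; real embeddings have hundreds or thousands of dimensions and are produced by a model, not written by hand.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: 1.0 means same direction (same meaning),
    values near 0 mean the vectors are unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings (illustrative values only, not from a real model).
password_reset = [0.9, 0.1, 0.3, 0.0]
forgot_login   = [0.8, 0.2, 0.4, 0.1]
pizza_recipe   = [0.0, 0.9, 0.0, 0.8]

print(cosine_similarity(password_reset, forgot_login))  # high: similar meaning
print(cosine_similarity(password_reset, pizza_recipe))  # low: unrelated topics
```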

    How Embeddings Work

    1. Input — A sentence, paragraph, or document is fed into an embedding model
    2. Processing — The model analyzes the semantic meaning of the text
    3. Output — A fixed-length vector (e.g., 1536 dimensions) is produced
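    The key property of step 3 is that the output length is fixed no matter how long the input is. The toy stand-in below demonstrates only that shape contract; it derives numbers from a hash, whereas a real model derives them from semantic content.

```python
import hashlib

def toy_embed(text, dims=8):
    """Toy stand-in for an embedding model: deterministic and fixed-length.
    NOTE: a hash captures no meaning at all; this only illustrates that
    any input text maps to a vector of exactly `dims` numbers."""
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    return [b / 255.0 for b in digest[:dims]]

short = toy_embed("How do I reset my password?")
long = toy_embed("A much longer document. " * 200)

print(len(short), len(long))  # both are `dims` long, regardless of input size
```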

    Use Cases

    Semantic Search — Find documents similar in meaning, not just keyword matches
    Recommendation Systems — Suggest related content based on similarity
    Clustering — Group similar documents together automatically
    RAG Systems — Power retrieval in AI chatbots and Q&A systems
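    Semantic search, the first use case above, reduces to ranking documents by similarity to the query vector. A minimal sketch, assuming document embeddings have already been computed (the titles and 3-dimensional vectors here are made up for illustration):

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical pre-computed document embeddings (toy 3-d vectors).
docs = {
    "Resetting your password": [0.9, 0.1, 0.2],
    "Billing and invoices":    [0.1, 0.9, 0.3],
    "Account security tips":   [0.5, 0.4, 0.4],
}

def search(query_vec, k=2):
    """Return the k document titles most similar to the query vector."""
    ranked = sorted(docs.items(), key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [title for title, _ in ranked[:k]]

# A query about login problems lands nearest the password document,
# even though no keywords are compared.
print(search([0.8, 0.15, 0.3]))
```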

    Example

    The sentences "How do I reset my password?" and "I forgot my login credentials" would have very similar embeddings despite sharing few words, because they express similar intent.

    In Customer Support

    SiteSupport generates embeddings for every page of your website, storing them in a vector database. When a visitor asks a question, their query is also embedded, and the most semantically similar content is retrieved to answer their question.
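    The flow described above (embed pages, store the vectors, embed the visitor's query, retrieve the closest match) can be sketched with an in-memory stand-in for a vector database. The class and page names below are hypothetical, not SiteSupport's actual implementation:

```python
import math

class ToyVectorStore:
    """Minimal in-memory stand-in for a real vector database."""
    def __init__(self):
        self.items = []  # list of (text, vector) pairs

    def add(self, text, vector):
        self.items.append((text, vector))

    def most_similar(self, query_vector):
        """Return the stored text whose vector is closest to the query."""
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm_a = math.sqrt(sum(x * x for x in a))
            norm_b = math.sqrt(sum(x * x for x in b))
            return dot / (norm_a * norm_b)
        return max(self.items, key=lambda item: cos(query_vector, item[1]))[0]

# Indexing: each page is embedded and stored (toy 2-d vectors here;
# a real pipeline would call an embedding model per page).
store = ToyVectorStore()
store.add("Pricing page", [0.1, 0.9])
store.add("Password help page", [0.9, 0.1])

# Query time: the visitor's question is embedded with the same model,
# and the most semantically similar page is retrieved.
print(store.most_similar([0.85, 0.2]))
```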

