Hugging Face
Let's load the HuggingFaceEmbeddings class, which runs sentence_transformers models locally.
%pip install --upgrade --quiet langchain langchain-huggingface sentence_transformers
from langchain_huggingface.embeddings import HuggingFaceEmbeddings
API Reference: HuggingFaceEmbeddings
embeddings = HuggingFaceEmbeddings()  # defaults to the sentence-transformers/all-mpnet-base-v2 model
text = "This is a test document."
query_result = embeddings.embed_query(text)
query_result[:3]
[-0.04895168915390968, -0.03986193612217903, -0.021562768146395683]
doc_result = embeddings.embed_documents([text])
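If you want a specific model or encoding behaviour instead of the defaults, HuggingFaceEmbeddings also accepts model_name, model_kwargs and encode_kwargs. The snippet below is a minimal sketch: the model id and the device/normalization settings are example choices, not requirements.
embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-mpnet-base-v2",  # any sentence-transformers model id
    model_kwargs={"device": "cpu"},  # set to "cuda" if a GPU is available
    encode_kwargs={"normalize_embeddings": True},  # unit-length vectors suit cosine similarity
)
doc_result = embeddings.embed_documents([text])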
Hugging Face Inference API
We can also access embedding models via the Hugging Face Inference API, which does not require us to install sentence_transformers and download models locally.
import getpass
inference_api_key = getpass.getpass("Enter your HF Inference API Key:\n\n")
Enter your HF Inference API Key:
········
from langchain_community.embeddings import HuggingFaceInferenceAPIEmbeddings
API Reference: HuggingFaceInferenceAPIEmbeddings
embeddings = HuggingFaceInferenceAPIEmbeddings(
    api_key=inference_api_key, model_name="sentence-transformers/all-MiniLM-L6-v2"
)
query_result = embeddings.embed_query(text)
query_result[:3]
[-0.038338541984558105, 0.1234646737575531, -0.028642963618040085]
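As a quick sanity check on the remote embeddings, you can compare the query vector with a couple of document vectors using cosine similarity. This is a small illustrative sketch using numpy (not part of the LangChain API); it reuses the embeddings object created above, and the second sentence is just an arbitrary example.
import numpy as np

doc_vectors = embeddings.embed_documents([text, "An unrelated sentence about cooking."])
query_vector = np.array(embeddings.embed_query(text))

# cosine similarity between the query and each document vector
for vec in doc_vectors:
    vec = np.array(vec)
    print(round(float(vec @ query_vector / (np.linalg.norm(vec) * np.linalg.norm(query_vector))), 3))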
Hugging Face Hub
We can also access embedding models hosted on the Hugging Face Hub through the huggingface_hub package, which sends requests to a hosted inference endpoint rather than running the model locally. This requires us to install huggingface_hub.
!pip install huggingface_hub
from langchain_huggingface.embeddings import HuggingFaceEndpointEmbeddings
API Reference: HuggingFaceEndpointEmbeddings
embeddings = HuggingFaceEndpointEmbeddings()
text = "This is a test document."
query_result = embeddings.embed_query(text)
query_result[:3]
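HuggingFaceEndpointEmbeddings implements the same Embeddings interface as the classes above, so batch embedding works the same way; if you want to target a specific repository, the class also accepts a model identifier (for example model="sentence-transformers/all-MiniLM-L6-v2"), though you should confirm the exact parameters against the API reference for your installed version. A minimal sketch of batch usage:
# embed several documents in one call; each result is a list of floats
docs = [
    "This is a test document.",
    "LangChain exposes a common interface for embedding models.",
]
doc_results = embeddings.embed_documents(docs)
len(doc_results), len(doc_results[0])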