Hugging Face Hub
Learn about using Sentry for Hugging Face Hub.
This integration connects Sentry with Hugging Face Hub in Python.
Once you've installed this SDK, you can use Sentry AI Agents Monitoring, a Sentry dashboard that helps you understand what's going on with your AI requests. Sentry AI Monitoring will automatically collect information about prompts, tools, tokens, and models. Learn more about the AI Agents Dashboard.
Install sentry-sdk from PyPI with the huggingface_hub extra:

pip install "sentry-sdk[huggingface_hub]"
If you have the huggingface_hub package in your dependencies, the Hugging Face Hub integration will be enabled automatically when you initialize the Sentry SDK.
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    environment="local",
    traces_sample_rate=1.0,
    send_default_pii=True,
)
Verify that the integration works by initializing a transaction and invoking an agent. In this example, we're providing a function tool to roll a die.
import random

import sentry_sdk
from huggingface_hub import InferenceClient


def roll_die(sides):
    """Roll a die with a given number of sides"""
    return f"Rolled a {random.randint(1, sides)} on a {sides}-sided die."


HF_TOKEN = "..."

# Function tool definition handed to the model so it can call roll_die.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "roll_die",
        "description": "Roll a die with a given number of sides",
        "parameters": {
            "type": "object",
            "properties": {
                "sides": {
                    "type": "integer",
                    "description": "The number of sides of the die"
                }
            },
            "required": ["sides"],
        },
    }
}]


def main():
    sentry_sdk.init(...)  # same as above

    client = InferenceClient(token=HF_TOKEN)

    with sentry_sdk.start_transaction(name="testing_sentry"):
        response = client.chat_completion(
            model="Qwen/Qwen2.5-72B-Instruct",
            messages=[{
                "role": "user",
                "content": "Greet the user and use the die roll tool to roll a 6-sided die."
            }],
            tools=TOOLS,
            tool_choice="auto",
        )


if __name__ == "__main__":
    main()
After running this script, the resulting data should show up in the "AI Spans" tab on the "Explore" > "Traces" page on Sentry.io, and in the AI Agents Dashboard.
It may take a couple of moments for the data to appear in sentry.io.
The Hugging Face Hub integration will connect Sentry with all supported Hugging Face Hub methods automatically.
All exceptions are reported.
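As a minimal sketch of that behavior (the token and model name below are placeholders), an exception raised inside an instrumented call such as chat_completion should be reported to Sentry even if your own code catches it:

import sentry_sdk
from huggingface_hub import InferenceClient

sentry_sdk.init(...)  # same as above

client = InferenceClient(token="...")

try:
    client.chat_completion(
        model="this/model-does-not-exist",  # placeholder: a request expected to fail
        messages=[{"role": "user", "content": "Hello"}],
    )
except Exception:
    # The integration reports the error to Sentry when the call fails;
    # handling it here only keeps the script from crashing.
    pass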
Sentry considers LLM and tokenizer inputs/outputs as PII (personally identifiable information) and doesn't include PII data by default. If you want to include the data, set send_default_pii=True in the sentry_sdk.init() call. To explicitly exclude prompts and outputs despite send_default_pii=True, configure the integration with include_prompts=False as shown in the Options section below.
By adding HuggingfaceHubIntegration to your sentry_sdk.init() call explicitly, you can set options for HuggingfaceHubIntegration to change its behavior:
import sentry_sdk
from sentry_sdk.integrations.huggingface_hub import HuggingfaceHubIntegration

sentry_sdk.init(
    # ...
    # Add data like inputs and responses;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        HuggingfaceHubIntegration(
            include_prompts=False,  # LLM inputs/outputs will not be sent to Sentry, despite send_default_pii=True
        ),
    ],
)
You can pass the following keyword arguments to HuggingfaceHubIntegration():

include_prompts
Whether LLM and tokenizer inputs and outputs should be sent to Sentry. Sentry considers this data personally identifiable information (PII) by default. If you want to include the data, set send_default_pii=True in the sentry_sdk.init() call. To explicitly exclude prompts and outputs despite send_default_pii=True, configure the integration with include_prompts=False.
The default is True.
The integration supports the following versions:
- Python: 3.8+
- huggingface_hub: 0.22+