In the labeling function, go to the model you want to use and look for the model ID.
In this example we can see that it is 1968.
Step 2: Make a request
We are now ready to make a request to Labelf's API. It is a standard HTTP REST interface, so it should feel familiar.
We will use the endpoint https://api.app.labelf.ai/v2/models/{model_id}/inference to run inference with the model. To call the API, add the bearer token we generated previously to the Authorization header. We also need to specify which model to run inference with and which texts to perform inference on. A maximum of 8 texts is allowed per call.
import requests
# Replace with your bearer token
bearer_token = ""
# Replace with your model ID
model_id = 0
# Replace texts with your own texts, max 8 text items in the texts array
json_data = {
'texts': [
'Breakfast was not tasty',
]
}
headers = {
'Authorization': f'Bearer {bearer_token}',
}
response = requests.post(f'https://api.app.labelf.ai/v2/models/{model_id}/inference', headers=headers, json=json_data)
print(response.json())
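Before using the result, it can be worth checking that the call actually succeeded. The snippet below is a minimal sketch that builds on the response object from the Python example above; it uses raise_for_status() from the requests library, which turns 4xx/5xx responses (for example an invalid bearer token or model ID) into exceptions.
# Minimal sketch, building on the 'response' object from the example above.
# raise_for_status() raises an HTTPError for 4xx/5xx responses, e.g. if the
# bearer token or model ID is wrong.
response.raise_for_status()
predictions = response.json()
print(predictions)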
fetch('https://api.app.labelf.ai/v2/models/YOUR_MODEL_ID/inference', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_BEARER_TOKEN',
'Content-Type': 'application/json'
},
// Replace texts with your own texts, max 8 text items in the texts array
body: JSON.stringify({
'texts': [
'Breakfast was not tasty'
],
'max_predictions': 2
})
});
curl --location --request POST 'https://api.app.labelf.ai/v2/models/YOUR_MODEL_ID/inference' \
--header 'Authorization: Bearer YOUR_BEARER_TOKEN' \
--header 'Content-Type: application/json' \
--data-raw '{ "texts": ["Breakfast was not tasty"], "max_predictions": 2 }'
Step 3: You have now implemented your own AI model!
We can't wait to see what you have built, and we encourage you to join our Discord and discuss your implementation with us and others!