
[NLP] Support the different mask tokens used by NLP models for Fill Mask #2178

Merged
merged 2 commits into from
Jul 12, 2023
6 changes: 6 additions & 0 deletions specification/ml/_types/inference.ts
@@ -265,6 +265,12 @@ export class NerInferenceOptions {

/** Fill mask inference options */
export class FillMaskInferenceOptions {
/** The string/token which will be removed from incoming documents and replaced with the inference prediction(s).
* In a response, this field contains the mask token for the specified model/tokenizer. Each model and tokenizer
* has a predefined mask token which cannot be changed; it is therefore recommended not to set this value in requests.
* However, if this field is present in a request, its value must match the predefined value for that model/tokenizer,
* otherwise the request will fail. */
mask_token?: string
/** Specifies the number of top class predictions to return. Defaults to 0. */
num_top_classes?: integer
/** The tokenization options to update when inferring */
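As a rough sketch of how these options might be populated on the client side, the snippet below redeclares a minimal `FillMaskInferenceOptions` shape (the interface and values here are illustrative assumptions, not part of the spec diff) and shows why `mask_token` is usually left unset: each model family has its own fixed token, e.g. BERT-style tokenizers use `[MASK]` while RoBERTa-style tokenizers use `<mask>`.

```typescript
// Illustrative redeclaration of the fields shown in the diff above.
// In the actual spec, num_top_classes is an `integer` alias.
interface FillMaskInferenceOptions {
  mask_token?: string
  num_top_classes?: number
}

// Recommended usage: omit mask_token so the model's predefined
// token is used automatically.
const defaultOpts: FillMaskInferenceOptions = {
  num_top_classes: 3,
}

// If mask_token IS set, it must match the model's predefined token
// exactly, or the request will fail. Example for a BERT-style model:
const bertOpts: FillMaskInferenceOptions = {
  mask_token: "[MASK]", // hypothetical value; depends on the tokenizer
  num_top_classes: 3,
}

console.log(defaultOpts.mask_token, bertOpts.mask_token)
```

The design choice in the PR follows from this: since the token is fixed per tokenizer, the field is effectively read-only in responses, and request-side validation only needs an equality check against the predefined value.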