Transformers.js documentation

backends/onnx

You are viewing the main version, which requires installation from source. If you'd prefer a regular npm install, check out the latest stable version (v3.0.0).


Handler file for choosing the correct version of ONNX Runtime, based on the environment. Ideally, we could import the onnxruntime-web and onnxruntime-node packages only when needed, but dynamic imports don’t seem to work with the current webpack version and/or configuration. This is possibly due to the experimental nature of top-level await statements. So, we just import both packages, and use the appropriate one based on the environment:

  • When running in node, we use onnxruntime-node.
  • When running in the browser, we use onnxruntime-web (onnxruntime-node is not bundled).

This module is not directly exported, but can be accessed through the library's `env` object:

import { env } from '@huggingface/transformers';
console.log(env.backends.onnx);
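
Since `env.backends.onnx` exposes the underlying ONNX Runtime environment, it can be used to configure the WASM backend before any session is created. A hedged sketch follows; the exact fields mirror `onnxruntime-web`'s `env.wasm` and may differ between versions, so treat the specific settings as illustrative:

```javascript
import { env } from '@huggingface/transformers';

// Configure the WASM backend before creating any pipeline or session.
// These fields are forwarded to onnxruntime-web's `env.wasm`.
env.backends.onnx.wasm.numThreads = 1; // disable multi-threading
env.backends.onnx.wasm.proxy = false;  // run inference on the main thread
```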

backends/onnx.deviceToExecutionProviders([device]) ⇒ `Array.<ONNXExecutionProviders>`

Map a device to the execution providers to use for the given device.

Kind: static method of backends/onnx
Returns: Array.<ONNXExecutionProviders> - The execution providers to use for the given device.

| Param | Type | Default | Description |
| --- | --- | --- | --- |
| [device] | * | | (Optional) The device to run the inference on. |
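
To make the device-to-providers mapping concrete, here is a deliberately simplified, hypothetical reimplementation. The device names and provider lists below are illustrative; the library's actual `DEVICE_TO_EXECUTION_PROVIDER_MAPPING` covers more devices and environment-specific fallbacks:

```javascript
// Hypothetical, simplified mapping from a device name to ONNX Runtime
// execution providers. Not the library's actual table.
const DEVICE_TO_PROVIDERS = {
  cpu: ['cpu'],
  wasm: ['wasm'],
  webgpu: ['webgpu'],
  cuda: ['cuda'],
};

// Fallback when no device is given (mirrors the idea behind `defaultDevices`).
const DEFAULT_PROVIDERS = ['wasm'];

function deviceToExecutionProviders(device = null) {
  if (device === null) return DEFAULT_PROVIDERS;
  const providers = DEVICE_TO_PROVIDERS[device];
  if (!providers) {
    throw new Error(`Unsupported device: "${device}"`);
  }
  return providers;
}
```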


backends/onnx.createInferenceSession(buffer, session_options, session_config) ⇒ `*`

Create an ONNX inference session.

Kind: static method of backends/onnx
Returns: * - The ONNX inference session.

| Param | Type | Description |
| --- | --- | --- |
| buffer | Uint8Array | The ONNX model buffer. |
| session_options | * | ONNX inference session options. |
| session_config | Object | ONNX inference session configuration. |


backends/onnx.isONNXTensor(x) ⇒ `boolean`

Check if an object is an ONNX tensor.

Kind: static method of backends/onnx
Returns: boolean - Whether the object is an ONNX tensor.

| Param | Type | Description |
| --- | --- | --- |
| x | any | The object to check. |
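
One common way such a check can be approximated without importing ONNX Runtime is duck-typing on the properties every `ort.Tensor` carries (`dims`, `type`, `data`). This is a sketch of the idea, not the library's actual implementation (which checks against the ONNX Tensor class itself):

```javascript
// Hedged sketch: detect an ONNX-tensor-like object by its shape.
// `isONNXTensorLike` is a hypothetical helper, not part of the library.
function isONNXTensorLike(x) {
  return (
    x !== null &&
    typeof x === 'object' &&
    Array.isArray(x.dims) &&   // tensor dimensions, e.g. [1, 3, 224, 224]
    typeof x.type === 'string' && // element type, e.g. 'float32'
    x.data !== undefined          // backing typed array
  );
}
```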


backends/onnx.isONNXProxy() ⇒ `boolean`

Check if ONNX’s WASM backend is being proxied.

Kind: static method of backends/onnx
Returns: boolean - Whether ONNX’s WASM backend is being proxied.


backends/onnx~defaultDevices : `Array.<ONNXExecutionProviders>`

Kind: inner property of backends/onnx


backends/onnx~wasmInitPromise : `Promise<any>` | `null`

To prevent multiple calls to initWasm(), we store the first call in a Promise that is resolved when the first InferenceSession is created. Subsequent calls will wait for this Promise to resolve before creating their own InferenceSession.

Kind: inner property of backends/onnx
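
The "store the first call in a promise" pattern described above can be sketched in a self-contained way. Here `initBackend` and `createSession` are stand-ins for `initWasm()` and the real session constructor; the point is that every caller awaits the same promise, so the expensive initialization runs exactly once even under concurrent calls:

```javascript
// Sketch of the one-time-initialization pattern used for wasmInitPromise.
let initPromise = null; // holds the first (and only) init call
let initCount = 0;      // for illustration: how many times init actually ran

async function initBackend() {
  initCount += 1; // expensive one-time setup would go here
}

async function createSession(id) {
  initPromise ??= initBackend(); // first caller kicks off initialization
  await initPromise;             // all callers wait for the same promise
  return { id };                 // stand-in for an InferenceSession
}
```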


backends/onnx~DEVICE_TO_EXECUTION_PROVIDER_MAPPING : `*`

Kind: inner constant of backends/onnx


backends/onnx~supportedDevices : `*`

The list of supported devices, sorted by priority/performance.

Kind: inner constant of backends/onnx


backends/onnx~ONNX_ENV : `*`

Kind: inner constant of backends/onnx


backends/onnx~ONNXExecutionProviders : `*`

Kind: inner typedef of backends/onnx

