
Creating Frame Processor Plugins

Expose your Frame Processor Plugin to JS

To make the Frame Processor Plugin available to the Frame Processor Worklet Runtime, create the following wrapper function in JS/TS:

```ts
import { VisionCameraProxy, Frame } from 'react-native-vision-camera'

const plugin = VisionCameraProxy.getFrameProcessorPlugin('scanFaces')

/**
 * Scans faces.
 */
export function scanFaces(frame: Frame): object {
  'worklet'
  if (plugin == null) throw new Error('Failed to load Frame Processor Plugin "scanFaces"!')
  return plugin.call(frame)
}
```
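The guard matters because the proxy returns undefined when the native plugin was never registered (for example, after a failed native build). The wrapper is also the natural place to forward options to the native side. The sketch below illustrates this pattern in plain TypeScript with a mock plugin object instead of the real VisionCameraProxy, so it runs standalone; the `includeLandmarks` option name is a made-up example, not part of the library.

```typescript
// Minimal stand-ins for the library's types -- illustration only.
interface Frame { width: number; height: number }
interface FrameProcessorPlugin {
  call(frame: Frame, options?: Record<string, unknown>): unknown
}

// Mock registry standing in for VisionCameraProxy.getFrameProcessorPlugin();
// a real lookup returns undefined when the native plugin is not registered.
const plugins: Record<string, FrameProcessorPlugin | undefined> = {
  scanFaces: { call: (_frame, options) => ({ faces: [], options }) },
}

const plugin = plugins['scanFaces']

// Wrapper with a guard and an options argument. The 'worklet' directive is
// omitted here because this sketch runs outside the Worklet Runtime; it is
// required in real VisionCamera code.
export function scanFaces(frame: Frame, includeLandmarks = false): object {
  if (plugin == null)
    throw new Error('Frame Processor Plugin "scanFaces" is not registered!')
  // The options object is forwarded to the native plugin's callback.
  return plugin.call(frame, { includeLandmarks }) as object
}
```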

Test it!

Simply call the wrapper Worklet in your Frame Processor:

```tsx
function App() {
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet'
    const faces = scanFaces(frame)
    console.log('Faces in Frame:', faces)
  }, [])

  return (
    <Camera frameProcessor={frameProcessor} {...cameraProps} />
  )
}
```
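Keep in mind that the Frame Processor runs for every camera frame, often 30 to 60 times per second, so an expensive plugin call is usually throttled. VisionCamera provides helpers for this; the plain-TypeScript sketch below only illustrates the underlying idea with a hypothetical `createFpsLimiter` helper, and is not the library's implementation.

```typescript
// Hypothetical throttle helper: runs the callback at most `fps` times/second.
function createFpsLimiter(fps: number): (cb: () => void) => void {
  const minIntervalMs = 1000 / fps
  let lastRun = -Infinity
  return (cb) => {
    const now = Date.now()
    if (now - lastRun >= minIntervalMs) {
      lastRun = now
      cb()
    }
  }
}

// Only invoke the (expensive) callback ~2 times per second,
// even though "frames" arrive far more often.
const runAtMost2Fps = createFpsLimiter(2)
let calls = 0
for (let i = 0; i < 10; i++) {
  runAtMost2Fps(() => { calls++ })
}
// The 10 back-to-back iterations fall inside one 500 ms interval,
// so only the first callback actually runs.
```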

Next Steps

If you want to distribute your Frame Processor Plugin, publish it as an npm package:

  1. Create a blank Native Module using bob or create-react-native-module
  2. Name it vision-camera-plugin-xxxxx where xxxxx is the name of your plugin
  3. Remove the generated template code from the Example Native Module
  4. Add VisionCamera to peerDependencies: "react-native-vision-camera": ">= 3"
  5. Implement the Frame Processor Plugin in the iOS, Android and JS/TS Codebase using the guides above
  6. Publish the plugin to npm. Users can then install it with npm i vision-camera-plugin-xxxxx and add it to their babel.config.js file.
  7. Add the plugin to the official VisionCamera plugin list for more visibility
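For step 6, the babel.config.js entry on the consumer side depends on which Worklets runtime the app uses. With VisionCamera v3 and react-native-worklets-core it typically looks like the sketch below; the preset name is the standard React Native one and may differ in newer templates, so check the current VisionCamera installation docs for your version.

```javascript
// babel.config.js (consumer app) -- sketch, assuming VisionCamera v3
// with react-native-worklets-core as the Worklets runtime.
module.exports = {
  presets: ['module:metro-react-native-babel-preset'],
  plugins: [
    ['react-native-worklets-core/plugin'],
  ],
}
```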

🚀 Next section: Browse Community Plugins