As research in image and video editing progresses rapidly, new ideas and tools for manipulating and generating images are continually emerging. Frameworks like ComfyUI, while useful, do not cover every aspect comprehensively. Our platform provides a powerful environment not only for end users such as artists and designers, but also for researchers and developers: it enables the quick implementation of innovative UI ideas, enhancing creative processes and productivity. These tools are not limited to advanced AI features; they also include traditional image editing tools.
The SAM Auto Matte tool operates in point mode and utilizes ComfyUI workflows as its backend.
To support development, we offer a lightweight environment with basic functions such as zoom, which is the most challenging part of developing such a tool.
The same tool in rectangular mode, running in our test environment.
The steps are similar to those for custom UI components, so tools are also based on Web Components. For parameter selection, we offer a floating toolbar that is positioned automatically depending on document size, scroll position, and screen size. Each tool needs an additional web component for the dialog elements of this toolbar, named with the suffix `-floating-toolbar`. The `layer` parameter points to an internal data object storing all information of the current tool. Both components are used automatically when a tool is selected in the toolbar. Quite often, such a tool offers different variations, so the main application supports a menu with sub-tools for switching between these states.
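The toolbar naming convention can be sketched as a tiny helper. The function name `floatingToolbarTag` is a hypothetical illustration, not part of the Gyre API; only the `-floating-toolbar` suffix itself comes from the text:

```javascript
// Derive the floating-toolbar component name from a tool component name
// by appending the "-floating-toolbar" suffix described above.
// floatingToolbarTag is a hypothetical helper, not a Gyre API call.
function floatingToolbarTag(toolTag) {
  return `${toolTag}-floating-toolbar`;
}

// For the SAM rectangle tool component mentioned later in this section:
console.log(floatingToolbarTag("fds-image-editor-sam-rect"));
// "fds-image-editor-sam-rect-floating-toolbar"
```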
For testing a tool, just use our test environment from our SDK:
The first script activates the connection to a ComfyUI server with workflow management, workflow pre-parser, and execution of any ComfyUI workflows by name.
With the test environment tag, you can easily start the test environment with an example image:
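A minimal sketch of such a page, assuming a hypothetical element name and attribute (the actual tag shipped with the SDK may differ):

```html
<!-- gyre-test-environment and its image attribute are assumptions -->
<gyre-test-environment image="example_image.png"></gyre-test-environment>
```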
Now activate our plugin in the test environment:
```javascript
document.…;
```
The test environment also uses a stripped-down version of the manifest to keep things simple.
To activate the plugin on the test environment canvas, just use:
```javascript
stage.…; // so in this example:
stage.…;
```
Both tool components need to provide a `refresh()` function, which is called automatically whenever something changes in the main application: usually activation of the tool, a zoom change, or the selection of a new sub-tool.
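A minimal sketch of the `refresh()` contract, assuming a plain class; the class name and fields are illustrative, and a real tool would be a Web Component:

```javascript
// Sketch only: class and field names are assumptions, not the Gyre API.
class ExampleToolComponent {
  constructor() {
    this.zoom = 1;          // current zoom factor set by the main application
    this.subTool = "point"; // currently selected sub-tool
    this.rendered = "";
  }

  // Called automatically by the main application after tool activation,
  // a zoom change, or the selection of a new sub-tool.
  refresh() {
    this.rendered = `zoom=${this.zoom} subTool=${this.subTool}`;
  }
}
```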
Inside the tool plugin, we provide a helper function for converting mouse coordinates to image coordinates and vice versa. This calculation depends on three factors: the coordinate, the zoom factor, and the device pixel ratio. `canvas` is simply the canvas object of the Gyre API here.
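A sketch of such a conversion, assuming screen pixels are scaled by the device pixel ratio and the zoom factor. The function names and the exact formula are assumptions, not the Gyre helper itself, which may also account for canvas offsets:

```javascript
// Convert a mouse (screen) coordinate to an image coordinate and back.
// mouseToImage/imageToMouse are hypothetical names for illustration.
function mouseToImage(mouseCoord, zoom, devicePixelRatio = 1) {
  return (mouseCoord * devicePixelRatio) / zoom;
}

function imageToMouse(imageCoord, zoom, devicePixelRatio = 1) {
  return (imageCoord * zoom) / devicePixelRatio;
}
```

Under these assumptions, at zoom factor 2 on a standard display a mouse position of 200 px maps to image coordinate 100, and converting back returns the original value.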
In addition to the `refresh()` function, the tool has to provide `prepareForSave()` and `prepareAfterLoad()` functions. In a complex Gyre file, the current state of each tool is stored as well, so these functions usually provide a serialized version of `toolsLayer` without unneeded data.
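These hooks can be sketched as follows, assuming `toolsLayer` holds the tool state and that transient fields such as a cached preview count as unneeded data. Apart from the three names taken from the text (`prepareForSave`, `prepareAfterLoad`, `toolsLayer`), everything here is an assumption:

```javascript
// Sketch only: the class name and the toolsLayer fields are assumptions.
class SamAutoMatteTool {
  constructor() {
    this.toolsLayer = { points: [], mode: "point", preview: null };
  }

  // Serialize toolsLayer without unneeded data (here: the preview cache).
  prepareForSave() {
    const { preview, ...persistent } = this.toolsLayer;
    return JSON.stringify(persistent);
  }

  // Restore the persisted state and re-initialize transient fields.
  prepareAfterLoad(serialized) {
    this.toolsLayer = { ...JSON.parse(serialized), preview: null };
  }
}
```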
This section provides detailed information about the JSON manifest structure for defining tools in your image editing plugin. The manifest is stored in `gyre_entry/gyre_ui_tools.json`, in the `plugins` array. Each plugin entry provides the following values:
"tool"
for tool-plugins."-floating-toolbar"
added.true
.490
. This will overwritten by sub-tool parameter for the toolbar width."Auto Matte"
.["image"]
.true
if you want the refresh()
called as user scrolls the canvas. This is useful if the tool is working on visible part of canvas only (see canvas.screen
object in Gyre API).Each tool object has the following structure:
"rect"
."Rectangular Selection"
."fds-image-editor-sam-rect"
.440
The icons object contains key-value pairs, where each key is the name of an icon and the value is an object with the following structure:

- `1024`.
- `1024`.
Here is an example of a JSON manifest for image editing tools:
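A hypothetical sketch of such a manifest: the values (`"tool"`, `"Auto Matte"`, `490`, `["image"]`, the `rect` sub-tool with component `fds-image-editor-sam-rect`, the `1024` icon sizes) come from the descriptions above, but every key name (`type`, `name`, `component`, `toolbarWidth`, `layers`, `refreshOnScroll`, `subTools`, `icons`, `width`, `height`) and the top-level component name `fds-image-editor-sam` are assumptions, not the documented schema:

```json
{
  "plugins": [
    {
      "type": "tool",
      "name": "Auto Matte",
      "component": "fds-image-editor-sam",
      "toolbarWidth": 490,
      "layers": ["image"],
      "refreshOnScroll": true,
      "subTools": [
        {
          "id": "rect",
          "name": "Rectangular Selection",
          "component": "fds-image-editor-sam-rect",
          "toolbarWidth": 440
        }
      ],
      "icons": {
        "sam": { "width": 1024, "height": 1024 }
      }
    }
  ]
}
```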