FaaS UI Tutorial

Objective: In this tutorial you will learn to use the Dataloop platform to push, deploy, execute, and trigger Functions.

SDK Tutorial

To learn how to set up and work with the FaaS module using the Dataloop Python SDK, read here

1. Codebase

The following Python code consists of two image-manipulation functions:

  • RGB to grayscale over an image – turns a color image to an image composed of shades of gray.
  • CLAHE (Contrast Limited Adaptive Histogram Equalization) Histogram Equalization over an image – reduces noise amplification to improve the contrast of the image.

Each function receives a single item (an image), so an item event can later be used as a trigger to invoke the function.

import dtlpy as dl
import cv2
import numpy as np

class ServiceRunner(dl.BaseServiceRunner):
    def rgb2gray(self, item: dl.Item):
        """
        This function converts an RGB image to gray.
        This will also add a modality to the original item.
        :param item: dl.Item to convert
        :return: None
        """
        buffer = item.download(save_locally=False)
        bgr = cv2.imdecode(np.frombuffer(buffer.read(), np.uint8), -1)
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        gray_item = item.dataset.items.upload(local_path=gray,
                                              remote_path='/gray' + item.dir,
                                              remote_name=item.name)
        # add modality
        item.modalities.create(name='gray',
                               ref=gray_item.id)
        item.update(system_metadata=True)

    def clahe_equalization(self, item: dl.Item):
        """
        This function performs histogram equalization (CLAHE).
        This will add a modality to the original item.
        Based on opencv: https://docs.opencv.org/4.x/d5/daf/tutorial_py_histogram_equalization.html
        :param item: dl.Item to convert
        :return: None
        """
        buffer = item.download(save_locally=False)
        bgr = cv2.imdecode(np.frombuffer(buffer.read(), np.uint8), -1)
        # create a CLAHE object (arguments are optional)
        lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
        # cv2.split returns a tuple; convert to a list so the L channel can be replaced
        lab_planes = list(cv2.split(lab))
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        lab_planes[0] = clahe.apply(lab_planes[0])
        lab = cv2.merge(lab_planes)
        bgr_equalized = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)
        bgr_equalized_item = item.dataset.items.upload(local_path=bgr_equalized,
                                                       remote_path='/equ' + item.dir,
                                                       remote_name=item.name)
        # add modality
        item.modalities.create(name='equ',
                               ref=bgr_equalized_item.id)
        item.update(system_metadata=True)

To continue with the tutorial, save the code as a Python file (you may call it “main.py”) and then either compress it into a zip file or upload it to your Git repository.

Since the code requires a Python library that is not part of the standard library (cv2, i.e. OpenCV), create a requirements.txt file listing the libraries that need to be installed, and add it to the same folder as the entry point (“main.py”):
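For example, the requirements.txt for this codebase can be as short as a single line. The headless OpenCV build shown here is an assumption suited to server environments without a display; pinned versions are optional:

```text
# cv2 (OpenCV) is the only non-standard dependency of main.py;
# numpy is installed automatically as an OpenCV dependency
opencv-python-headless
```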



Now you have all the files in one compressed folder:
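If you chose the zip route, one portable way to build the archive is Python's standard-library zipfile module (the file names below are the ones from step 1; `touch` just creates empty stand-ins for this sketch):

```shell
# stand-in files; use your real main.py and requirements.txt
touch main.py requirements.txt
# build the zip with Python's standard library, so no extra tools are needed
python3 -m zipfile -c image-process.zip main.py requirements.txt
# verify the archive contents
python3 -m zipfile -l image-process.zip
```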

2. Push a Package of Multiple Functions

In this tutorial you will learn how to create and push code as a package to the Dataloop FaaS library.

The code package will serve as an app that you can install/uninstall in your project.

  1. From the left-side navigation menu, go to Application (FAAS) → Application hub.

  2. Click the "Add Function" button.

  3. Provide an app name. This name will serve as the service name.
    We can name our tutorial app “image-process-app”.

  4. Select your preferred way of uploading the example code:

  • Codebase “Item” type – for uploading a compressed folder (for example, “.zip”).
  • Codebase “Git” type – to link a Git repository.
  5. Our package's codebase uses one Python library that is not a standard library. Therefore, we need to make sure it’s pre-installed before running the entry point. One way to do so is to use a custom Docker image (information on this process can be found here). The other way is to add a requirements.txt file to the package codebase, as discussed in step 1 (Codebase). Alternatively, you can upload the requirements file directly:
    In this tutorial example, we upload the requirements file with the codebase in the zipped file.

  6. Multiple functions may be defined in a single package under a “module” entity. This way you will be able to use a single codebase for various services (applications).

    Define the module name (no limitations). The “main.py” file you created should be defined as the module entry point.

  7. List the functions defined in the main.py file – “rgb2gray” and “clahe_equalization” – along with their inputs and outputs. In our example, no outputs were defined.

  8. When ready, click "Create" and the new package will be listed in the table:
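For reference, the same push can be done with the Python SDK. The sketch below is illustrative, not authoritative: the project name, module name, and source path are placeholders, and the SDK tutorial linked above covers the full flow:

    import dtlpy as dl

    project = dl.projects.get(project_name='My Project')  # placeholder project name
    module = dl.PackageModule(
        name='image-process-module',   # module name - no limitations
        entry_point='main.py',         # the file created in step 1
        class_name='ServiceRunner',
        functions=[
            dl.PackageFunction(
                name='rgb2gray',
                inputs=[dl.FunctionIO(type=dl.PackageInputType.ITEM, name='item')]),
            dl.PackageFunction(
                name='clahe_equalization',
                inputs=[dl.FunctionIO(type=dl.PackageInputType.ITEM, name='item')]),
        ])
    package = project.packages.push(package_name='image-process-app',
                                    src_path='/path/to/codebase',  # folder with main.py + requirements.txt
                                    modules=[module])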

3. Deploy a Service

You can now create a service from the package. To do that, you need to define which module the service will serve. Notice that a service can only contain a single module. The module’s functions will be added to the service automatically.

Multiple services can be deployed from a single package. Each service can get its own configuration:
a different module and settings (computing resources, triggers, UI slots, etc.).

In our example, there is only one module in the package. Let’s deploy the service:


Click the "Install" button of the code package (under the “Library” tab). An installation configuration popup will appear:


Set an event-based Trigger to the Service

When creating a service, we can configure a trigger to automatically run the service functions. When you bind a trigger to a function, that function will execute when the trigger fires. The trigger is defined by a given time pattern or by an event in the Dataloop system.

An event-based trigger is related to a combination of resource and action. A resource can be several entities in our system (item, dataset, annotation, etc.) and the associated action will define a change in the resource that will prompt the trigger (update, create, delete).

The resource object that triggered the function will be passed as the function's parameter (input).
Separate triggers must be set for each function in your service.

Let’s set a trigger to invoke the “rgb2gray” function:
The trigger we set invokes the function every time a new item is created.
We added a DQL filter that monitors image items only, and only within a specific dataset (identified by its dataset ID), rather than the entire project.

To find your dataset ID, go to the Dataset Browser → click on any item → the Dataset ID can be found in the right panel and copied by clicking the clipboard icon.

	{
		"$and": [{
			"type": "file",
			"hidden": false,
			"datasetId": "61e0349e9e3e2358d715ee06",
			"metadata.system.mimetype": {
				"$eq": "image/*"
			}
		}]
	}
Filters are created using the Dataloop Query Language (DQL).


You can also add filters to specify the item type (image, video, JSON, directory, etc.) or a certain format (jpeg, jpg, WebM, etc.).
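To make the filter's semantics concrete, here is a plain-Python stand-in for the DQL above. This is not the Dataloop query engine; in particular, treating the wildcard mimetype as a glob match via fnmatch is an assumption of this sketch:

```python
from fnmatch import fnmatch

def matches_filter(item: dict) -> bool:
    """Minimal stand-in for the DQL filter above: every clause
    in the $and list must hold for the item to fire the trigger."""
    clauses = [
        item.get("type") == "file",
        item.get("hidden") is False,
        item.get("datasetId") == "61e0349e9e3e2358d715ee06",
        # "$eq" against "image/*": modeled here as a glob match
        fnmatch(item.get("metadata", {}).get("system", {}).get("mimetype", ""), "image/*"),
    ]
    return all(clauses)

image_item = {"type": "file", "hidden": False,
              "datasetId": "61e0349e9e3e2358d715ee06",
              "metadata": {"system": {"mimetype": "image/jpeg"}}}
video_item = {**image_item, "metadata": {"system": {"mimetype": "video/mp4"}}}
print(matches_filter(image_item))  # True  - image in the watched dataset
print(matches_filter(video_item))  # False - mimetype clause fails
```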

To trigger the function only once (only on the first item event), set “Execution Mode” to “Once”, otherwise use “Always”.
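For comparison, a trigger with the same resource, action, execution mode, and filter can be created from the SDK. This sketch assumes the `service` object from the deployment step; note that `dl.Filters` already restricts results to visible files by default:

    filters = dl.Filters(resource=dl.FiltersResource.ITEM)
    filters.add(field='datasetId', values='61e0349e9e3e2358d715ee06')
    filters.add(field='metadata.system.mimetype', values='image/*')

    trigger = service.triggers.create(
        name='image-process-trigger',
        function_name='rgb2gray',
        resource=dl.TriggerResource.ITEM,
        actions=dl.TriggerAction.CREATED,
        execution_mode=dl.TriggerExecutionMode.ALWAYS,  # or ONCE
        filters=filters)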

Optional: CRON Trigger at Predefined Intervals

Click CRON to set a predefined interval for triggering the function, for example processing image files once a day. For more specific scheduling, use regular CRON expressions.
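For example, "0 0 * * *" fires once a day at midnight. The toy checker below is only a sketch of how the five CRON fields (minute, hour, day of month, month, weekday) map onto a timestamp; it supports just "*" and plain numbers, not ranges or steps:

```python
from datetime import datetime

def cron_matches(expr: str, dt: datetime) -> bool:
    """Check a 5-field CRON expression against a datetime.
    Sketch only: supports '*' and single numeric values."""
    fields = expr.split()
    # CRON counts weekdays with 0 = Sunday; Python's weekday() has 0 = Monday
    values = [dt.minute, dt.hour, dt.day, dt.month, (dt.weekday() + 1) % 7]
    return all(f == "*" or int(f) == v for f, v in zip(fields, values))

print(cron_matches("0 0 * * *", datetime(2024, 1, 1, 0, 0)))    # True:  midnight
print(cron_matches("0 0 * * *", datetime(2024, 1, 1, 12, 30)))  # False: midday
```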

Advanced Options – Compute Settings

By default, packages are deployed on Regular-S GCS instances. Click the ADVANCED button to change settings such as machine type, replicas, and concurrency, and even apply a custom Docker image.

Final Installation

Click "INSTALL" to deploy the service.

Bot user - Please note that if this is the first time you are deploying a service, your project probably does not have a bot user yet. In that case, a pop-up dialog will ask for your permission to create a bot user in order to deploy the package. Follow the dialog steps to create the bot.


Your service should now be listed on the INSTALLED apps list. Click the service to see the sidebar or click the 3-dot icon for more information and options.

4. Execute the Function

Once the service is ready, you may execute the available functions on an input.

An execution means running the function on a service with specific inputs (arguments). The execution input will be provided to the function that the execution runs.

Now that the service is up, it can be executed manually (on-demand) or automatically, based on a set trigger (time/event).

We will manually upload an item to the dataset we specified in the trigger’s DQL filter. You can use the following image or any other you want: Sample item. Once the image is uploaded, the “item created” trigger event invokes the “rgb2gray” function from our service. The transformed image will be saved in the same dataset.
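The same execution can be started manually from the SDK; the item and project IDs below are placeholders you would copy from the platform:

    execution = service.execute(function_name='rgb2gray',
                                item_id='<item-id>',        # the uploaded image's ID
                                project_id='<project-id>')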

Congratulations! You have successfully created, deployed, and executed a Dataloop function!

5. FaaS UI Slots – Optional

Activate UI Slots

Assigning a function to UI slots creates a button in the Dataloop platform, allowing users to invoke the FaaS function when needed.

  1. Click the “Library” tab.
  2. Click the 3-dot icon to the right of your package and then click "Activate UI Slots".
  3. Pick your module and function, write your Display Name, and select a Post Action to decide what to do with the function’s output.
  4. Add your UI scopes – select the relevant resource for the function and choose the panel for the button to appear in (the dataset dashboard or studio).
Post Action

The UI Slots Post-Action defines what to do with the function’s output. Currently, three are available:

  1. drawAnnotation - for an annotation output, drawAnnotation Post-Action will draw on the item.
  2. download - for an item output, download Post-Action will download it.
  3. noAction - do nothing with output.

Activate UI Slots

To add a UI button that lets you execute the functions from other places in the platform:

  • Click "Activate UI Slots" to activate your pre-defined slots.

  • Select the recipe, mark the slots, and click OK.

  • Execute the function through the UI-Slot in the dataset dashboard or annotation studio.

6. Service Overview

The table listing and the sidebar panel provide you with information about the service status (running, paused, etc.), and allow you to pause/run the service by clicking the action button.
In addition, the sidebar panel provides access to information and tools:

  • Triggers – to see the triggers of your service, click "VIEW Triggers." On this page, you can also create new triggers.
    You can also see project-level triggers on the triggers page, which lists all triggers in the system, their type, settings, etc.

  • Logs – to see the logs of your service, click "VIEW LOGS." In the logs tab, filter your executions or sort the log in ascending/descending order to view the most recent or oldest content.

  • Executions History – click "EXECUTIONS" to view the executions log. Click on any execution to see that execution’s details. Click the repeat icon to rerun a failed execution.
    You can also see project-level executions on the executions page, which lists all executions in the system, their statuses, services, etc.