Synchronous Filter

This chapter of the tutorial shows how to implement filters: components that have outputs as well as inputs and that contain the interesting processing logic.

We start with the result of the previous chapter, a simple app with a debug display & a video source.

The code examples included in this chapter are available as ready-to-run examples under examples/tutorial/02_*.py.

Add a synchronous filter

By synchronous we mean that every input image has to pass through the filter before anything else is done with it. A filter is not synchronous per se; rather, it is a decision of the subsequent components to take their input solely from that filter instead of allowing input to bypass it. But we’ll come to that later.
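The routing idea can be sketched with a minimal publish/subscribe loop (plain Python, not the async2v API): whether a filter is "synchronous" is decided by the consumer's choice of input key, not by the filter itself. Here the display reads only from the hypothetical key 'flipped', so every value must pass the filter first.

```python
# Minimal sketch of event-key routing; all names here are illustrative,
# not part of async2v.
from collections import defaultdict

subscribers = defaultdict(list)


def subscribe(key, handler):
    subscribers[key].append(handler)


def publish(key, value):
    for handler in subscribers[key]:
        handler(value)


displayed = []
# The "filter": consumes 'source', re-publishes a transformed value on 'flipped'.
subscribe('source', lambda frame: publish('flipped', frame[::-1]))
# The "display": takes its input solely from 'flipped', never from 'source',
# which is exactly what makes the filter synchronous for this pipeline.
subscribe('flipped', displayed.append)

publish('source', 'abc')
print(displayed)  # ['cba']
```

If the display subscribed to 'source' instead, the filter would be bypassed; that choice lives entirely in the consumer.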

For now, we just add a new custom EventDrivenComponent with one input field and one output field, whose task is to flip input images horizontally.

#!/usr/bin/env python3
# PYTHON_ARGCOMPLETE_OK
import cv2

from async2v.application import Application
from async2v.cli import ApplicationLauncher
from async2v.components.base import EventDrivenComponent
from async2v.components.opencv.video import VideoSource, Frame
from async2v.components.pygame.display import OpenCvDebugDisplay
from async2v.components.pygame.main import MainWindow
from async2v.event import OPENCV_FRAME_EVENT
from async2v.fields import Latest, Output


class FlipFilter(EventDrivenComponent):

    def __init__(self):
        self.input: Latest[Frame] = Latest(key='source', trigger=True)
        self.debug_output: Output[Frame] = Output(key=OPENCV_FRAME_EVENT)

    async def process(self) -> None:
        flipped_image = cv2.flip(self.input.value.image, 1)
        self.debug_output.push(Frame(flipped_image, source=self.id))


class Launcher(ApplicationLauncher):

    def __init__(self):
        super().__init__()
        self.add_configurator(MainWindow.configurator())
        self.add_configurator(VideoSource.configurator())

    def register_application_components(self, args, app: Application):
        displays = [
            OpenCvDebugDisplay(),
        ]
        main_window = MainWindow(displays, config=MainWindow.configurator().config_from_args(args))
        video_source = VideoSource(config=VideoSource.configurator().config_from_args(args))
        flip_filter = FlipFilter()
        app.register(main_window, video_source, flip_filter)


def main():
    Launcher().main()


if __name__ == '__main__':
    main()

A lot of things are happening here now. Let’s go through them step by step:

  • The flip filter is an EventDrivenComponent. This means: its process method is only called when one of its trigger fields has received a new input.
  • The filter has one input field, input. It is a Latest field, meaning that it always retains the last value it received. It also sets trigger=True, making it a trigger field.
  • The input field listens on the key source. This is the default key the VideoSource component pushes its frames to.
  • The filter has one output field, debug_output. For the sake of simplicity, the output key is set to OPENCV_FRAME_EVENT, which causes the OpenCvDebugDisplay to display the frames.
  • The Frame objects pushed to the output have source set to the component id (which in this case is FlipFilter0 at runtime). This allows the OpenCvDebugDisplay to display frames from different sources in different tiles.
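The actual flipping is done by cv2.flip with flipCode=1, which mirrors the image around the vertical axis, i.e. reverses the column order of each row. A dependency-free sketch of that operation, using nested lists in place of an image array:

```python
# Illustrates what cv2.flip(image, 1) does: flipCode=1 reverses the columns
# of each row (horizontal mirror). This helper is only for illustration,
# not something the tutorial code uses.

def flip_horizontal(image):
    """Reverse the columns of each row, like cv2.flip(image, 1)."""
    return [row[::-1] for row in image]


image = [
    [1, 2, 3],
    [4, 5, 6],
]
print(flip_horizontal(image))  # [[3, 2, 1], [6, 5, 4]]
```

For reference, cv2.flip also accepts flipCode=0 (vertical flip) and flipCode=-1 (both axes).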

Add a primary, non-debug display

Let’s extend our example with a single display that defines the final output of our application. We add it as the first display, making it the new default display instead of the debug display. To make our application more flexible, we also make the input & output keys of our filter component configurable.

#!/usr/bin/env python3
# PYTHON_ARGCOMPLETE_OK
import cv2

from async2v.application import Application
from async2v.cli import ApplicationLauncher
from async2v.components.base import EventDrivenComponent
from async2v.components.opencv.video import VideoSource, Frame
from async2v.components.pygame.display import OpenCvDebugDisplay, OpenCvDisplay
from async2v.components.pygame.main import MainWindow
from async2v.event import OPENCV_FRAME_EVENT
from async2v.fields import Latest, Output


class FlipFilter(EventDrivenComponent):

    def __init__(self, input_key: str, output_key: str):
        self.input: Latest[Frame] = Latest(key=input_key, trigger=True)
        self.output: Output[Frame] = Output(key=output_key)
        self.debug_output: Output[Frame] = Output(key=OPENCV_FRAME_EVENT)

    async def process(self) -> None:
        flipped_image = cv2.flip(self.input.value.image, 1)
        output_frame = Frame(flipped_image, source=self.id)
        self.output.push(output_frame)
        self.debug_output.push(output_frame)


class Launcher(ApplicationLauncher):

    def __init__(self):
        super().__init__()
        self.add_configurator(MainWindow.configurator())
        self.add_configurator(VideoSource.configurator())

    def register_application_components(self, args, app: Application):
        displays = [
            OpenCvDisplay('flipped'),
            OpenCvDebugDisplay(),
        ]
        main_window = MainWindow(displays, config=MainWindow.configurator().config_from_args(args))
        video_source = VideoSource(config=VideoSource.configurator().config_from_args(args))
        flip_filter = FlipFilter('source', 'flipped')
        app.register(main_window, video_source, flip_filter)


def main():
    Launcher().main()


if __name__ == '__main__':
    main()

The OpenCvDisplay takes one argument, source, that defines the event key to read input frames from.

You can now switch between the new default display and the debug display with F2 & F3. We also now have a better understanding of how the FlipFilter is used as a synchronous filter – every image that is displayed has to pass through the FlipFilter first, as we only display output from the flipped event key.

Draw a component diagram

Use the graph sub-command to draw a component graph (output defaults to graph.pdf, requires graphviz):

./synchronous_filter.py graph

This creates a graph like this:

(Image: ../_images/tutorial_02_02_graph.png)

Here we see again how our FlipFilter is a synchronous part of our processing pipeline that is not bypassed.

Check out the next chapter to learn how to perform expensive calculations asynchronously while retaining a smooth video stream.