
Introduction

Why Nodes?

Engineers love programming languages. Analysts prefer SQL. Business people are fond of visual tools.

It’s a bit tricky to communicate when teammates speak and think in different ways, especially in a fast-moving world where people change teams, new challenges arise daily, and the environment is constantly transforming. If any change takes weeks, you are behind.

Fabrique.ai Nodes give teams an intuitive, lightweight way to easily build, run, maintain, reuse, extend, and share event flow processing projects. No code is needed.

The Node Collection is deliberately limited to a dozen typical operations that cover the most common scenarios and shorten the learning curve. At the same time, the Node Collection is extendable, and new functionality can be added without friction.

Node

Nodes have Input and Output Ports.

A Node receives values on its Input Ports, executes its processing logic, and transmits the resulting values to its Output Ports.

A Port has an input field to specify the operation on the value; a label in the input field prompts the operation.

Depending on the Node, Input and Output Ports may operate on values of different types in different ways: for example, query a path, apply a formula, name a variable, define a value, apply a pattern, or write code.

A Port may require you to specify the type of data: for example, integer, number, boolean, string, object, array, or any.

Input and Output Ports can be combined into a Port Group: for example, unnamed, condition, if condition, or else.

Nodes can have default, obligatory, or optional Ports. Ports can be added, deleted, or hidden if the processing logic of the Node allows it.

Nodes may require special configuration parameters and encrypt sensitive credentials before storing them.

Nodes may verify configurations to minimize the risk of errors.

Nodes produce logs and error messages to help detect errors in Runtime.

Fabrique provides the SDK to extend the Node Collection and the Nodes functionality.

Node Collection

Nodes are divided into six Groups according to the specifics of their processing logic.

Conditional Nodes

  • Filter → gets values to the Input Ports and transmits the values to the Output Ports without any change if the value in the conditional Port is true; otherwise, does not transmit.

  • If-Else → gets values to the Input Ports and transmits the values to the if condition Output Ports without any change if the value in the conditional Port is true; otherwise, to the else Output Ports.
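Although Nodes are configured visually rather than in code, the conditional behavior can be sketched in Python (the function names below are hypothetical, for illustration only):

```python
def filter_node(values, condition):
    """Pass values through unchanged when the condition is true;
    otherwise transmit nothing."""
    return values if condition else None

def if_else_node(values, condition):
    """Route values to the if-condition outputs when the condition
    is true, and to the else outputs otherwise."""
    return (values, None) if condition else (None, values)
```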

Functional Nodes

  • Function → gets values of variables to the Input Ports, applies functions to the values, and sends the results to the Output Ports.

  • Random → gets a trigger to the Input Port, applies a random function from the collection, generates a random value, and sends the value to the Output Port.
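The Functional Group can be sketched the same way, assuming a simple formula for Function and a uniform random value for Random (both are illustrative choices, not the Node defaults):

```python
import random

def function_node(x, y):
    """Apply a formula to the input values; here, a weighted sum."""
    return 2 * x + y

def random_node(trigger):
    """On each trigger, generate a random value in [0, 1)."""
    return random.random()
```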

In/Out Nodes

  • JSONin → reads JSON messages from the Input Topic, deserializes the JSON messages, decomposes JSON objects to elements (except arrays), gets values of the elements, sends the values of the elements to the Output Ports.

  • JSONout → gets values to the Input Ports, composes the values to elements and objects of JSON messages, serializes the JSON messages, writes the JSON messages to the Output Topic.

  • SQSWrite → gets values to the Input Ports, composes JSON messages, serializes JSON messages, connects to the AWS SQS PubSub service, publishes the JSON messages to the Queue.

  • SQSRead → connects to the AWS SQS PubSub service, subscribes to the Queue, consumes JSON messages, deserializes the JSON messages, decomposes elements and objects of the JSON messages, sends the values of the elements and objects to the Output Ports.

  • Timer → emits periodic trigger events with a timestamp to the Output Port.
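The decomposition that JSONin performs can be approximated in Python: a JSON message is deserialized, nested objects are flattened to dotted paths, and arrays are passed through intact (a simplified sketch, not the actual implementation):

```python
import json

def jsonin(message):
    """Deserialize a JSON message and decompose the object into
    flat element values (nested objects are flattened to dotted
    paths; arrays are kept intact, as the JSONin Node does)."""
    def walk(obj, prefix=""):
        out = {}
        for key, value in obj.items():
            path = f"{prefix}.{key}" if prefix else key
            if isinstance(value, dict):
                out.update(walk(value, path))
            else:
                out[path] = value  # scalars and arrays pass through
        return out
    return walk(json.loads(message))
```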

Modifying Nodes

  • Constants → gets a trigger to the Input Port, declares values of elements, objects and arrays, sends the values to the Output Ports.

  • Parser → gets a String value to the Input Port, applies patterns, automatically generates the Output Ports, extracts values in accordance with the patterns, and sends the values to the Output Ports.

  • TypeConverter → gets a value of one type to the Input Port, modifies the value to another type of data, and sends the modified value to the Output Port.

  • REST GetClient → gets a trigger and params to the Input Ports, requests the external REST API Server, gets a response from the REST API Server, sends the response to the Output Ports.

  • REST PostClient → gets a trigger and params to the Input Ports, requests the external REST API Server, sends the params to the REST API Server, sends the response to the Output Ports.

  • REST PutClient → gets a trigger and params to the Input Ports, requests the external REST API Server, updates the values on the REST API Server, sends the response to the Output Ports.

  • REST DeleteClient → gets a trigger and params to the Input Ports, requests the external REST API Server, sends the request on deletion, sends the response to the Output Ports.
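The Parser Node’s pattern-based extraction can be illustrated with a regular expression whose named groups play the role of the auto-generated Output Ports (the pattern and sample string below are made-up examples):

```python
import re

def parser_node(text, pattern):
    """Apply a pattern to a string value and extract named values;
    each named group corresponds to an Output Port."""
    match = re.match(pattern, text)
    return match.groupdict() if match else {}
```

For example, the pattern `(?P<method>\w+) (?P<path>\S+) (?P<status>\d+)` applied to `"GET /home 200"` yields three port values: method, path, and status.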

Stateful Nodes

  • DictionaryWrite → gets values for the key-value pair to the Input Port, looks up the key in the Dictionary Collection, writes the key-value pair to the Dictionary Collection, returns the status of the write.

  • DictionaryGet → gets a value of the key to the Input Port, looks up the key in the Dictionary Collection, gets the value for the key, sends the value to the Output Port.

  • DictionaryDelete → gets a value of the key to the Input Port, looks up the key in the Dictionary Collection, deletes the key-value pair from the Dictionary Collection, sends the boolean status of deletion to the Output Port.

  • WindowWrite → gets a timestamp to the Input Port, applies an aggregation function to the time window that matches the timestamp, appends the aggregate to the Window Collection, and sends the window timestamp to the Output Port.

  • WindowRead → gets a timestamp to the Input Port, requests the counted aggregates from the Window Collection, and sends the values of the selected aggregates to the Output Ports.
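A minimal in-memory sketch of the shared state behind the Dictionary Nodes (the real Dictionary Collection is a managed, shared store; this only mirrors the write/get/delete semantics):

```python
class DictionaryCollection:
    """Toy key-value state used by DictionaryWrite,
    DictionaryGet, and DictionaryDelete."""
    def __init__(self):
        self._data = {}

    def write(self, key, value):
        self._data[key] = value
        return True  # write status

    def get(self, key):
        return self._data.get(key)  # None if the key is absent

    def delete(self, key):
        # boolean status of deletion
        return self._data.pop(key, None) is not None
```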

Structural Nodes

  • ArrayToElement → gets an array to the Input Ports, iterates the array, extracts elements from the array, transmits element by element to the Output Ports.

  • ArrayToArray → gets an array to the Input Port, applies a group operation from the collection to the elements of the array, and sends the modified array to the Output Port.

  • ElementToArray → gets the values of an array's elements one by one to the Input Port, composes the elements into an array, sends the array to the Output Port.

  • Decompose → gets a nested object as a value to the Input Port, decomposes elements from the object, and sends the elements to the Output Ports.

  • Compose → gets values to the Input Ports, composes the values to elements and objects of JSON structure, and sends the JSON structure to the Output Port.
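The array handling in the Structural Group can be sketched as a pair of inverse operations: ArrayToElement emits elements one by one, and ElementToArray collects them back (illustrative only):

```python
def array_to_element(array):
    """Iterate the array and emit its elements one by one."""
    for element in array:
        yield element

def element_to_array(elements):
    """Collect individually received elements back into an array."""
    return list(elements)
```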

Node Graph

The Output Ports of one Node can be linked with the Input Ports of other Nodes.

Linked Nodes form a directed acyclic synchronous computational graph.

Node Graph is an intuitive way to build, test, reuse, and explain event processing logic.

Node Graph allows realizing quite different processing logic: consuming messages from an external service, filtering messages, checking conditions, splitting a flow of events into multiple sub-flows, flattening a nested structure, enriching messages, collecting aggregates, detecting anomalies, etc.

Node Graph allows processing a micro batch of messages at once.
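The synchronous DAG execution a Node Graph performs can be sketched with Python's standard graphlib: each node runs after its upstream nodes, in topological order (a toy illustration with made-up node names, not Fabrique internals):

```python
from graphlib import TopologicalSorter

# hypothetical graph: node name -> (function, upstream node names)
graph = {
    "source":   (lambda: 3,        []),
    "double":   (lambda x: x * 2,  ["source"]),
    "plus_one": (lambda x: x + 1,  ["double"]),
}

def run(graph):
    """Execute nodes in topological order, feeding each node
    the results of its upstream nodes."""
    deps = {name: set(ups) for name, (_, ups) in graph.items()}
    results = {}
    for name in TopologicalSorter(deps).static_order():
        fn, ups = graph[name]
        results[name] = fn(*(results[u] for u in ups))
    return results
```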

caution

All Ports have to be linked, hidden, or deleted. Unlinked Ports are not allowed.

Actor

Instead of building a Monster Node Graph, it’s much more practical to decompose complex processing logic into multiple stand-alone Simple Node Graphs.

The rule of thumb: make Node Graphs as simple as possible.

Simple Node Graphs make it possible to atomize and isolate the processing logic, computational resources, error handling, testing, collaborative work, etc.

Simple Node Graphs are isolated from each other by Actors.

An Actor is a microservice that can be deployed and maintained independently, parallelized across multiple instances, and provided with the required computational resources.

Topic

Actors consume input messages, execute Simple Node Graphs, and produce output results.

If an Actor consumes messages from an upstream Actor, or produces messages to feed a downstream Actor, then a Topic is required to link two or more Actors.

caution

Actors are not allowed to be linked directly.

Topics link Actors asynchronously to decouple the execution of Simple Node Graphs.

Topics and Actors make it possible to build complex directed cyclic asynchronous computational graphs.

Actors employ the special JSONin and JSONout Nodes from the In/Out Group to link input and output Topics respectively.

Actors may consume messages from multiple Input Topics and produce messages to multiple Output Topics.

caution

Once processed, a message is deleted from a Topic by retention time.

State

Besides messages, Actors may employ shared States.

One Actor may collect aggregates. Another Actor may consume aggregates independently.

Nodes from the Stateful Group are used to write, read, update, and delete records in collections.

One Actor may operate with multiple collections.

caution

Once created or updated, a record is retained until deleted on command or by retention time.

Project

Actors, Topics, and States form a Project.

Projects have Versions.

caution

Once saved, a Project gets a new Version.

A Version of a Project can be Deployed in Runtime.

A Version of a Project can be Duplicated, Shared, or made Public.