Streamsheets can be used both in private environments (e.g. Smart Home) and in industrial environments (e.g. Smart Factory, Industrie 4.0, Industrial IoT).
In this blog post you find a 4-minute video that shows how to control an industrial machine that sorts incoming parts according to their color and shape. An MQTT broker (www.mosquitto.org) is used to connect the machine to the Streamsheet server.
In order to visualize the machinery I have installed a factory simulation program called FactoryIO (www.factoryio.com). This easy-to-use program allows you to build virtual industrial machinery in a 3D environment, including sensors and actuators that operate in real time.
To make FactoryIO compatible with MQTT I am using a special gateway for FactoryIO that Cedalo has developed. It converts the internal memory map of FactoryIO into a simple JSON payload and publishes the payload on an MQTT broker at a high frequency.
At the same time the MQTT gateway is also able to subscribe to an MQTT topic and receive JSON-based payloads to control the actuators (e.g. Pusher 1 to 4) on the machinery.
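To give an idea of what such an exchange looks like, here is a minimal sketch of the two payload directions. The field names (`sensor1` … `sensor4`, `pusher1` …) are my own assumptions for illustration; the real gateway derives its keys from FactoryIO's memory map.

```python
import json

# Hypothetical payload shapes for the FactoryIO MQTT gateway.
# Field names are assumptions; the gateway maps them from FactoryIO's
# internal memory map.

def build_sensor_payload(sensor_states):
    """Serialize the current sensor readings as a JSON payload (gateway -> broker)."""
    return json.dumps({f"sensor{i}": bool(v)
                       for i, v in enumerate(sensor_states, start=1)})

def parse_actuator_command(payload):
    """Decode a JSON command payload into actuator name/value pairs (broker -> gateway)."""
    return {name: bool(value) for name, value in json.loads(payload).items()}

payload = build_sensor_payload([True, False, False, True])
command = parse_actuator_command('{"pusher1": 1, "pusher2": 0}')
```

Both directions use plain JSON objects, so any MQTT client that can publish and subscribe to the right topics can talk to the simulated machinery.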
In the following video you can see how the Streamsheet is built from the ground up. I recommend viewing the video in full-screen mode.
The Streamsheet works in a cyclic mode that recalculates the formulas every 20 ms. In the first part of the video the sensor status of Sensors 1 to 4 is linked with the corresponding Streamsheet cells.
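Conceptually, one calculation cycle reads the latest MQTT payload and copies the sensor fields into cells. The sketch below illustrates that idea in plain Python; the cell names (`A1` … `A4`) and payload keys are assumptions, not Streamsheets internals.

```python
# Minimal sketch of one 20 ms calculation cycle: the latest MQTT payload
# is read and its sensor fields are linked to Streamsheet-like cells.
# Cell addresses and payload keys are illustrative assumptions.

CYCLE_MS = 20

def recalculate(cells, payload):
    """One calculation cycle: copy sensor fields into named cells."""
    for i in range(1, 5):
        cells[f"A{i}"] = payload.get(f"sensor{i}", False)
    return cells

cells = {}
latest_payload = {"sensor1": True, "sensor2": False,
                  "sensor3": True, "sensor4": False}
recalculate(cells, latest_payload)  # in the real sheet this runs every 20 ms
```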
In addition to reading values from the incoming MQTT payloads, the video also shows how to create functions that publish JSON data to an MQTT topic. The MQTT gateway of FactoryIO subscribes to this topic and controls the actuators (e.g. Pusher 1).
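The command direction can be sketched the same way. Here the topic name `factoryio/command` and the `pusher1` key are hypothetical, and `publish_fn` stands in for whatever MQTT client's publish call is actually used.

```python
import json

# Hedged sketch of publishing a pusher command to the gateway's topic.
# Topic and field names are assumptions; publish_fn stands in for a
# real MQTT client's publish call.

def make_pusher_command(pusher_index, active):
    """Build the topic and JSON command payload for a single pusher actuator."""
    return "factoryio/command", json.dumps({f"pusher{pusher_index}": int(active)})

def publish_command(publish_fn, pusher_index, active):
    topic, payload = make_pusher_command(pusher_index, active)
    publish_fn(topic, payload)

sent = []  # capture outgoing messages instead of a live broker
publish_command(lambda topic, payload: sent.append((topic, payload)), 1, True)
```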
To synchronize all actions in time, the Streamsheet uses a function called EDGE.DETECT(). This function determines whether a certain condition has become true and can then hold and delay the signal for a given number of milliseconds. This makes sure that the pusher is activated at the right moment.
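To make the timing idea concrete, here is a pure-Python sketch of that behavior: detect a rising edge of a condition, then delay the output by a number of milliseconds, counted in 20 ms cycles. This illustrates the concept as described above, not Streamsheets' actual implementation of EDGE.DETECT().

```python
# Illustrative rising-edge detector with a delayed output signal,
# counted in 20 ms calculation cycles. Not Streamsheets' implementation.

CYCLE_MS = 20

class EdgeDetect:
    def __init__(self, delay_ms):
        self.delay_cycles = delay_ms // CYCLE_MS
        self.prev = False
        self.pending = []  # countdowns for edges waiting to fire

    def step(self, condition):
        """Advance one 20 ms cycle; return True when a delayed edge fires."""
        self.pending = [c - 1 for c in self.pending]
        if condition and not self.prev:   # rising edge detected this cycle
            self.pending.append(self.delay_cycles)
        self.prev = condition
        fired = any(c <= 0 for c in self.pending)
        self.pending = [c for c in self.pending if c > 0]
        return fired

# A rising edge at cycle 0 with a 60 ms delay fires 3 cycles later.
ed = EdgeDetect(delay_ms=60)
trace = [ed.step(c) for c in [True, True, False, False, False]]
```

Queuing the countdowns in a list means overlapping edges (a second part arriving before the first pusher fires) each get their own delayed signal.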
In the next part of this use case I will show how to integrate some AI capabilities into this scenario. A locally installed voice recognition system (Snips, see http://www.snips.ai) will then allow the user to control by voice which belt picks which parts.
In another blog post on this use case I show an analytics dashboard for the sorter station with charts and statistical functions. The following screenshot gives a short preview of this dashboard. Follow this link to see how I created it.