reused in another script. When two nodes are
connected directly, the second node receives the first
node's output value as its input. The system is designed
so that users do not need to assign and parse each
output into a variable before feeding it to the next
node, although the output, or a portion of it, can still
be assigned to a variable in circumstances where this
is necessary. Every script must contain at least one
start node and one end node, which define the entry
and exit points of the script.
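The implicit pass-through behaviour described above can be sketched as follows. All names here (`Node`, `connect`, `run_chain`) are illustrative stand-ins, not the framework's actual API:

```python
# Minimal sketch of implicit output-to-input chaining between nodes.
# Node, connect, and run_chain are hypothetical names, not the
# framework's real implementation.

class Node:
    def __init__(self, func):
        self.func = func          # the node's processing step
        self.next = None          # the node wired after this one

    def connect(self, other):
        """Wire this node's output to another node's input."""
        self.next = other
        return other

def run_chain(start, value=None):
    """Execute nodes in order; each output becomes the next input."""
    node = start
    while node is not None:
        value = node.func(value)  # no manual variable handling needed
        node = node.next
    return value

# Example: a two-node chain that doubles a number, then formats it.
double = Node(lambda x: x * 2)
label = Node(lambda x: f"result={x}")
double.connect(label)
# run_chain(double, 21) -> "result=42"
```

The point of the sketch is that the wiring itself carries the value forward, which is why the user never writes the intermediate assignment explicitly.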
The visual script presented in Figure 5 works as
follows. The process begins at the "Start" node. One
branch leads to the "Keyboard Input" node, which
registers the user's keystrokes as events; when one of
these events occurs, the node connected to it is
executed. The second block connected to "Start" is a
variable definition, which sets the virtual variable
"human" to null. This means the variable has been
registered: it exists but holds no value. The flow then
moves to a "While" node, which evaluates the
condition "human == null". While this condition is
true, the nodes wired to its "true" output are executed:
they request a camera frame, run a remote machine
learning algorithm to recognize a person, and assign
the algorithm's result to the "human" variable. The
"Machine Learning" node outputs a JavaScript object,
which is assigned to "human". If the node fails to
detect a human in the provided image, the while loop
cycles again; otherwise, the loop ends and the
"Logging" node fires, notifying the user. Returning to
the "Keyboard Input" node: if the "F" key is pressed
after a human has been detected, the speaker mounted
on the robotic system utters the text specified in the
node, the variable is reset to null, and the "While"
node restarts, as the script requires. The welcome
message is not played if the user presses Esc. Finally,
a long press of the Esc key leads to the "End" node,
which stops the entire visual scripting program.
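The detection loop at the core of this script can be summarized in a few lines of conventional code. The helpers `capture_frame`, `detect_human`, and `log` below are illustrative placeholders for the camera, remote machine-learning, and logging nodes, not functions exposed by the framework:

```python
# Sketch of the Figure 5 detection loop, assuming hypothetical
# capture_frame/detect_human/log callables standing in for the
# camera, machine-learning, and logging nodes.

def detect_loop(capture_frame, detect_human, log):
    human = None                     # variable defined as null
    while human is None:             # the "human == null" condition
        frame = capture_frame()      # request a camera frame
        human = detect_human(frame)  # None again if no person found
    log("Human detected")            # the "Logging" node fires
    return human                     # e.g. an object describing the person
```

In the visual script, pressing "F" afterwards resets `human` to null, which re-enters this loop; a long press of Esc exits to the "End" node instead.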
3.3.6 Fault Tolerant System
The process of controlling a robotic system can be
fraught with faults, timeouts, and delays. To promote
consistency and keep the user informed of what is
happening, a few rules have been established. While a
button-triggered action is in progress, the button is
disabled until the action finishes. Furthermore, when
the user must be made aware of an issue or an
essential message, a pop-up is displayed at the bottom
right of the screen. In some cases, the pop-up requires
the user's acknowledgement before it disappears.
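The button-disabling rule amounts to a simple reentrancy guard. The `Button` class below is a minimal sketch of that pattern, not the framework's UI code:

```python
# Sketch of the button-guard rule: while an action runs, further
# presses are rejected. The Button class is illustrative only.

class Button:
    def __init__(self, action):
        self.action = action
        self.busy = False

    def press(self):
        """Run the action unless one is already in progress."""
        if self.busy:               # ignore presses during an action
            return False
        self.busy = True            # disable the button
        try:
            self.action()
        finally:
            self.busy = False       # re-enable once the action finishes
        return True
```

The `finally` clause guarantees the button is re-enabled even if the action fails, which matters in a system where faults and timeouts are expected.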
4 CONCLUSIONS
The proposed framework offers a scalable and highly
distributable solution for the real-time control of
robotic systems and for the monitoring of
autonomous systems. To link this solution with an
existing robotic system, users only need to export the
protocol already in use into the framework and
configure the appropriate settings.
The use of the framework will assist administrators
and operators from the planning stage to the
realization of a real-time intervention. Operators can
prepare for the intervention by customizing their user
interface or by creating scripts and plugins, ensuring
that the intervention is carried out under the best
conditions. The framework also allows for
autonomous system monitoring by customizing the
dashboard so that all essential information is provided
to the administrator, either on a screen installed on the
machine or remotely on a computer. All the data
collected by the program are centralized on a server,
simplifying data-driven enhancement of the entire
system and analysis of operator behaviour.