• Section A notifies the operator of the steps being performed during the routine;
• Section B indicates the type of PCB detected during the barcode read function and the number of components it requires;
• Section C presents the number of components inserted, the number of damaged components, and the total number of picked components;
• Section D indicates the number of components left in the box, showing a yellow LED when it is almost empty and a red LED when it is empty;
• Section E shows the number of pins read, i.e., the result of the external vision system (see the sketch after this list).
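The interface state implied by Sections B to E can be pictured as a small data structure. The sketch below, in Python, is only illustrative: the InterfaceState class, its field names, and the low-stock threshold used for the Section D LED are assumptions, not the described implementation.

```python
# Minimal sketch (hypothetical names) of the interface state from Sections B-E,
# with the Section D warning LED derived from the remaining component count.
from dataclasses import dataclass


@dataclass
class InterfaceState:
    pcb_type: str           # Section B: PCB type detected by the barcode read
    required: int           # Section B: number of components the PCB requires
    inserted: int = 0       # Section C: components inserted
    damaged: int = 0        # Section C: damaged components
    picked: int = 0         # Section C: total picked components
    box_remaining: int = 0  # Section D: components left in the box
    pins_read: int = 0      # Section E: pins read by the external vision system

    def box_led(self, low_threshold: int = 5) -> str:
        """Section D LED: yellow when almost empty, red when empty (threshold assumed)."""
        if self.box_remaining == 0:
            return "red"
        if self.box_remaining <= low_threshold:
            return "yellow"
        return "green"


state = InterfaceState(pcb_type="PCB-A", required=3, box_remaining=4)
print(state.box_led())  # -> "yellow"
```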
Figure 15 shows a possible sequence of what can be observed in the interface over several iterations.
Two images were recorded during the filling of the first PCB (Figure 15-1 and Figure 15-2). The first shows the pick-and-place and validation tasks for the first component, where the number of pins read by the external vision system validates the component and results in its insertion. The second shows what was expected to be the last component of that PCB, which turned out to be damaged: 42 pins were read instead of 44, giving one damaged component out of three. The last image (Figure 15-3) illustrates a possible state of the interface during a day of work.
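The validation rule visible in Figure 15-2 (42 pins read instead of the 44 expected) can be expressed as a simple pin-count comparison. The following sketch is illustrative only; the function name and the strict-equality criterion are assumptions rather than the authors' exact implementation.

```python
# Sketch of the pin-count check implied by Figure 15-2: a component is
# accepted only when the external vision system reads all expected pins.
def validate_component(pins_read: int, pins_expected: int) -> bool:
    """Return True when the component passes the pin-count inspection."""
    return pins_read == pins_expected


print(validate_component(44, 44))  # first component: validated and inserted
print(validate_component(42, 44))  # damaged component: rejected
```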
6 DISCUSSION
During Test A, we confirmed that the 30-second requirement was met. We performed some insertion trials to compare the cobot with a human worker, concluding that the worker can be faster by 1.280 seconds on average. However, other major requirements must be considered, such as efficiency, repeatability, and precision. Humans are susceptible to fatigue, and after hours of work their precision and efficiency decline, leading to delays in production or, in the worst cases, damaged PCBs. In contrast, the cobot always performs at its highest level, sustaining these requirements throughout hours of work. Industrial robots can also meet these requirements, but they have some disadvantages: they are usually more expensive, they are intended for heavier payloads, and they require an isolated work area.
Moreover, different camera positions were tested during this test to find the easiest and fastest way to achieve more efficient operation. With the camera mounted on top, variations in environmental lighting do not affect the precision or the decision speed of the external vision system. Two other options were considered: using the cobot's own camera for the inspection, which did not meet the minimum requirements for this process; and attaching an external camera to the cobot, which met the quality and resolution requirements but increased the cycle time of the entire process, since the robot would need to place the component in an intermediate position to take the reading.
Test B confirms the high level of inspection achieved by the external vision system when validating minor displacements and defects of the components' pins. It thus delivers a sensitivity and accuracy that the human eye cannot achieve.
One of the few tasks in this work cell that still requires a human worker is replacing an empty component box. In Test C, we concluded that this type of information cannot be presented as a mere label; it also needs to be displayed using three LEDs representing the quantity of components left in the box, giving an explicit luminous warning to the human worker. In addition, to reduce the cycle time of this process, the operator cooperates with the cobot by informing it of the limited barcode search area, since each type of PCB has its barcode in a different area.
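One way to picture the operator-provided barcode search areas is as a per-PCB-type region of interest applied before decoding. The sketch below is purely illustrative: the ROI values, the BARCODE_ROIS table, and the decode_barcode callback are hypothetical and not part of the described system.

```python
# Sketch (assumed structure) of restricting the barcode read to the
# operator-defined search area for each PCB type.
from typing import Dict, Tuple

# (x, y, width, height) regions supplied by the operator per PCB type (example values)
BARCODE_ROIS: Dict[str, Tuple[int, int, int, int]] = {
    "PCB-A": (10, 10, 200, 60),
    "PCB-B": (300, 420, 180, 50),
}


def read_barcode(image, pcb_type: str, decode_barcode) -> str:
    """Crop the image to the operator-defined area before decoding it."""
    x, y, w, h = BARCODE_ROIS[pcb_type]
    roi = image[y:y + h, x:x + w]  # assumes a NumPy-style image array
    return decode_barcode(roi)     # decoding routine passed in by the caller
```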
As mentioned before, the human worker could
replace some functionalities of this work cell, like the