3D Scan Applications in Logistics - Using CoAP with RGB-D Cameras

Author Fabian Hüßler
Date 23 June 2019
Degree Bachelor
Title 3D Scan Applications in Logistics: Using CoAP with RGB-D Cameras

(This version includes corrections from July 3, 2019)

Abstract The embedded devices that form the Internet of Things (IoT) are developing rapidly, with increasing processing power and decreasing chip sizes and prices. Future homes will be equipped with smart, network-interoperable devices that communicate over various network protocol stacks. In the fields of home and industrial automation, cameras providing color and depth information prove very useful in many applications, such as face recognition, pose tracking, or environmental 3D scanning.
The Constrained Application Protocol (CoAP) is a popular IoT protocol for low-power and lossy wireless networks. CoAP is commonly used to transmit small sensor readings, while image sizes may be on the order of megabytes. This thesis aims to provide a comprehensive Application Programming Interface (API) that makes camera resources from the state-of-the-art, low-cost Intel RealSense RGB-D (color and depth) cameras retrievable by a CoAP client. It also gives an insight into basic camera concepts and the use of cameras in logistics companies. As an example, the provided CoAP client computes object dimensions from received point cloud data and can display the color image as well as the depth image in grayscale. The client can also monitor a resource by repeating the initial request. The application is evaluated in several test cases, which show that CoAP can be used for simple 3D scan applications, but packet drops become a bottleneck: with the default protocol parameters (NSTART = 1), CoAP effectively becomes a stop-and-wait protocol. The median time to transmit a color image with a resolution of 1280x720 pixels over a wireless network is 14.6 s. The median time to transmit a full point cloud derived from a 1280x720-pixel depth image over a wireless network could be reduced to 16 s.
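
A minimal sketch of such a retrieval-and-monitoring client, using the aiocoap Python library rather than the thesis's own implementation; the host camera.local and the resource path /color are illustrative assumptions. Block-wise transfer (RFC 7959), which aiocoap handles transparently, is what allows megabyte-sized images to travel over CoAP at all.

    # Polling CoAP client sketch (assumed resource layout, not the thesis API).
    import asyncio
    from aiocoap import Context, GET, Message

    CAMERA_URI = "coap://camera.local/color"  # hypothetical camera resource

    async def fetch_image(ctx):
        request = Message(code=GET, uri=CAMERA_URI)
        # aiocoap reassembles block-wise (RFC 7959) responses transparently,
        # so a multi-megabyte image arrives here as a single payload.
        response = await ctx.request(request).response
        return response.payload

    async def monitor(period_s=5.0):
        ctx = await Context.create_client_context()
        while True:  # monitor by repeating the initial request
            payload = await fetch_image(ctx)
            print(f"received {len(payload)} bytes")
            await asyncio.sleep(period_s)

    if __name__ == "__main__":
        asyncio.run(monitor())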
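
The abstract does not say how the object dimensions are computed; a common, simple choice is the axis-aligned bounding box of the received points, sketched here with NumPy under that assumption:

    import numpy as np

    def object_dimensions(points: np.ndarray) -> np.ndarray:
        # Extent of the axis-aligned bounding box of an (N, 3) array of
        # XYZ points; one assumed reading of "object dimensions".
        return points.max(axis=0) - points.min(axis=0)

    # Example: a synthetic cloud filling a 2.0 m x 1.0 m x 0.5 m box
    cloud = np.random.rand(10000, 3) * np.array([2.0, 1.0, 0.5])
    print(object_dimensions(cloud))  # approximately [2.0, 1.0, 0.5]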
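
For scale, a rough throughput estimate from the reported median, assuming the 1280x720 color image is sent uncompressed at 3 bytes per pixel (the abstract does not state the encoding):

    1280 px x 720 px x 3 B/px = 2,764,800 B ≈ 2.76 MB
    2.76 MB / 14.6 s ≈ 190 kB/s ≈ 1.5 Mbit/s

This is consistent with the abstract's observation that the stop-and-wait behavior under NSTART = 1, rather than raw link capacity, is the bottleneck.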
