Visual Sorting of Express Parcels Based on Multi-Task Deep Learning

Visual sorting of express parcels in complex scenes has long been a key problem for intelligent logistics sorting systems, and existing methods still struggle to sort disorderly stacked parcels quickly and accurately. To achieve accurate detection and efficient sorting of disorderly stacked express parcels, we propose a robot sorting method based on multi-task deep learning. First, a lightweight object detection network model is proposed to improve the real-time performance of the system. A scale variable, trained jointly with the network weights, is used to sparsify the model and automatically identify unimportant channels; a pruning strategy then reduces the model size and increases detection speed without losing accuracy. Second, an optimal sorting position and pose estimation network model based on multi-task deep learning is proposed. Using an end-to-end network structure, position and pose information are combined for joint training, and the optimal sorting position and pose of each express parcel are estimated in real time; this joint model is shown to further improve sorting accuracy. Finally, the accuracy and real-time performance of the method are verified by robotic sorting experiments.
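The abstract mentions a scale variable trained jointly with the network weights to identify unimportant channels, followed by pruning, and a multi-task loss combining position and pose terms. The sketch below is a minimal illustration of how such a scheme is commonly realized, assuming the scale variable corresponds to batch-normalization scaling factors with an L1 sparsity penalty; all function names, the penalty weight, the pruning ratio, and the loss weights are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def sparsity_penalty(model: nn.Module, lam: float = 1e-4) -> torch.Tensor:
    """L1 penalty on batch-norm scale factors (the assumed "scale variable"),
    added to the detection loss so small scales mark unimportant channels."""
    penalty = torch.zeros(())
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            penalty = penalty + m.weight.abs().sum()
    return lam * penalty

def select_prunable_channels(model: nn.Module, prune_ratio: float = 0.5) -> dict:
    """Rank all channels by their scale factor and mark the smallest fraction
    for removal (True = keep channel). Threshold is illustrative."""
    scales = torch.cat([m.weight.abs().detach().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    threshold = torch.quantile(scales, prune_ratio)
    return {name: (m.weight.abs().detach() > threshold)
            for name, m in model.named_modules()
            if isinstance(m, nn.BatchNorm2d)}

def multitask_loss(pos_pred, pos_gt, pose_pred, pose_gt,
                   w_pos: float = 1.0, w_pose: float = 1.0) -> torch.Tensor:
    """Joint objective combining sorting-position and pose terms for
    end-to-end training; the weighting is an assumption."""
    return w_pos * F.mse_loss(pos_pred, pos_gt) + w_pose * F.mse_loss(pose_pred, pose_gt)
```

In a typical training loop of this kind, `sparsity_penalty` is added to the task loss at every step, and `select_prunable_channels` is applied once after training to build the slimmer model that is then fine-tuned.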

Media type:

E-article

Year of publication:

2020

Published:

2020

Contained in:

Sensors (Basel, Switzerland) - 20(2020), 23, 27 Nov.

Language:

English

Contributors:

Han, Song [Author]
Liu, Xiaoping [Author]
Han, Xing [Author]
Wang, Gang [Author]
Wu, Shaobo [Author]

Links:

Full text

Topics:

Intelligent logistics sorting system
Journal Article
Multi-task deep learning
Object detection network
Robotic sorting
Warehouse automation

Notes:

Date Completed 07.12.2020

Date Revised 14.12.2020

published: Electronic

Citation Status PubMed-not-MEDLINE

DOI:

10.3390/s20236785

PPN (catalogue ID):

NLM318273667