Increasing the Energy-Efficiency in Vacuum-Based Package Handling Using Deep Q-Learning

Felix Gabriel, Johannes Bergers, Franziska Aschersleben and Klaus Dröder
Additional contact information
Felix Gabriel: Institute of Machine Tools and Production Technology, Technische Universität Braunschweig, Langer Kamp 19b, 38106 Braunschweig, Germany
Johannes Bergers: Institute of Machine Tools and Production Technology, Technische Universität Braunschweig, Langer Kamp 19b, 38106 Braunschweig, Germany
Franziska Aschersleben: Institute of Machine Tools and Production Technology, Technische Universität Braunschweig, Langer Kamp 19b, 38106 Braunschweig, Germany
Klaus Dröder: Institute of Machine Tools and Production Technology, Technische Universität Braunschweig, Langer Kamp 19b, 38106 Braunschweig, Germany

Energies, 2021, vol. 14, issue 11, 1-13

Abstract: Billions of packages are automatically handled in warehouses every year. The gripping systems are, however, most often oversized in order to cover a wide range of carton types, package masses, and robot motions. In addition, a targeted optimization of the process parameters aimed at reducing this oversizing requires prior knowledge, personnel resources, and experience. This paper investigates whether the energy efficiency of vacuum-based package handling can be increased without prior knowledge of the optimal process parameters. The core method varies the input pressure of the vacuum ejector in accordance with the robot trajectory and the resulting inertial forces at the gripper-object interface. The control mechanism is trained by reinforcement learning with a deep Q-agent. In the proposed use case, the energy efficiency can be increased by up to 70% within a few hours of learning. It is also demonstrated that the approach generalizes across multiple different robot trajectories. In the future, industrial applicability can be enhanced by deploying the deep Q-agent in a decentralized system that collects data from different pick-and-place processes, enabling a generalizable and scalable solution for energy-efficient vacuum-based handling in warehouse automation.
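To illustrate the control idea summarized in the abstract, the following sketch shows an agent that selects a discrete ejector input pressure for each trajectory segment and is rewarded for low air consumption as long as the package is held. This is not the authors' implementation: a simple tabular Q-learner stands in for the paper's deep Q-network, and all state variables, pressure levels, and reward weights are illustrative assumptions.

```python
# Illustrative sketch only: a tabular Q-learner choosing an ejector input pressure
# per trajectory segment. States, actions, and rewards are hypothetical stand-ins
# for the deep Q-agent and the process model described in the paper.
import numpy as np

rng = np.random.default_rng(0)

N_SEGMENTS = 5               # coarse phases of the pick-and-place trajectory (assumed)
ACCEL_BINS = 3               # low / medium / high inertial load at the gripper (assumed)
PRESSURES = [2.0, 4.0, 6.0]  # candidate ejector input pressures in bar (assumed)

# Q-table indexed by (trajectory segment, acceleration bin, pressure action)
q = np.zeros((N_SEGMENTS, ACCEL_BINS, len(PRESSURES)))

def step(accel_bin, action):
    """Toy environment: reward = negative air consumption, large penalty on package loss."""
    pressure = PRESSURES[action]
    # Assumption: holding succeeds if the pressure roughly matches the inertial load.
    holds = pressure >= 2.0 * (accel_bin + 1) - 0.5
    reward = -pressure if holds else -100.0
    return reward, holds

alpha, gamma, eps = 0.1, 0.9, 0.2

for episode in range(2000):
    accel_profile = rng.integers(0, ACCEL_BINS, size=N_SEGMENTS)  # random trajectory
    for s in range(N_SEGMENTS):
        a_bin = accel_profile[s]
        # epsilon-greedy selection over the pressure levels
        if rng.random() < eps:
            action = rng.integers(len(PRESSURES))
        else:
            action = int(np.argmax(q[s, a_bin]))
        reward, holds = step(a_bin, action)
        # bootstrap from the next segment's best value (terminal at the last segment or on a drop)
        next_best = 0.0 if (s == N_SEGMENTS - 1 or not holds) else np.max(q[s + 1, accel_profile[s + 1]])
        q[s, a_bin, action] += alpha * (reward + gamma * next_best - q[s, a_bin, action])
        if not holds:
            break  # a dropped package ends the episode

# After training, the greedy policy picks the lowest pressure that still holds the load.
print(np.argmax(q, axis=-1))
```

In this toy setting the learned policy converges to the lowest pressure level that still secures the package for each inertial-load bin, which is the energy-saving behavior the paper targets; the actual work replaces the table with a deep Q-network acting on continuous process signals.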

Keywords: vacuum-based handling; energy-efficiency; deep Q-learning; automation
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2021
References: View complete reference list from CitEc

Downloads: (external link)
https://www.mdpi.com/1996-1073/14/11/3185/pdf (application/pdf)
https://www.mdpi.com/1996-1073/14/11/3185/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:14:y:2021:i:11:p:3185-:d:565111


Energies is currently edited by Ms. Agatha Cao

More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jeners:v:14:y:2021:i:11:p:3185-:d:565111