Open Access System for Information Sharing


Article
Cited 7 times in Web of Science · Cited 7 times in Scopus

Unified deep neural networks for end-to-end recognition of multi-oriented billet identification number (SCIE, SCOPUS)

Title
Unified deep neural networks for end-to-end recognition of multi-oriented billet identification number
Authors
Koo, G.; Yun, J.P.; Choi, H.; Kim, S.W.
Date Issued
2021-04
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
Abstract
In this study, a novel framework for recognizing a billet identification number (BIN) using deep learning is proposed. Because a billet, a semi-finished product, can be rolled during processing, its BIN may appear rotated at various angles. Most product numbers, including BINs, are combinations of individual characters; such a number is determined by the class of each character and its order (positioning), and both pieces of information remain constant even when the number is rotated. Inspired by this observation, the proposed framework of deep neural networks produces two outputs: one for the class of each individual character, and the other for the order of that character within the BIN. Compared with a previous study, the proposed network requires an additional annotation but no additional labeling effort. The multi-task learning over the two annotations plays a positive role in the representation learning of the network, as shown in the experimental results. Furthermore, to achieve good BIN recognition performance, we analyzed various networks within the proposed framework. The proposed algorithm was then compared with a conventional algorithm to evaluate its BIN recognition performance.
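The two-output design described in the abstract can be illustrated with a minimal sketch: a shared feature vector per detected character region feeds two task-specific heads, one predicting the character class and one predicting the character's order within the BIN, trained with a summed cross-entropy loss. All dimensions, head shapes, and the equal loss weighting below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

FEAT_DIM = 64      # shared feature size (assumed)
NUM_CLASSES = 36   # e.g. digits 0-9 plus A-Z (assumed)
MAX_ORDER = 10     # assumed maximum number of characters in a BIN

# Placeholder backbone output: shared features for a batch of 8 character regions.
features = rng.standard_normal((8, FEAT_DIM))

# Two task-specific linear heads on top of the shared representation.
W_cls = rng.standard_normal((FEAT_DIM, NUM_CLASSES)) * 0.01
W_ord = rng.standard_normal((FEAT_DIM, MAX_ORDER)) * 0.01

def softmax(x):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class_probs = softmax(features @ W_cls)  # (8, NUM_CLASSES): character class
order_probs = softmax(features @ W_ord)  # (8, MAX_ORDER): position within BIN

# Multi-task loss: sum of the two cross-entropies (equal weights assumed).
cls_labels = rng.integers(0, NUM_CLASSES, size=8)
ord_labels = np.arange(8) % MAX_ORDER
loss = (-np.log(class_probs[np.arange(8), cls_labels]).mean()
        - np.log(order_probs[np.arange(8), ord_labels]).mean())
```

The order labels cost no extra manual labor because, once the character classes of a BIN are transcribed in reading order, each character's position index follows automatically, which matches the abstract's remark about the additional annotation.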
URI
https://oasis.postech.ac.kr/handle/2014.oak/105128
DOI
10.1016/j.eswa.2020.114377
ISSN
0957-4174
Article Type
Article
Citation
EXPERT SYSTEMS WITH APPLICATIONS, vol. 168, 2021-04
Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
