DC Field | Value | Language |
---|---|---|
dc.contributor.author | Park, JS | - |
dc.contributor.author | Han, JH | - |
dc.date.accessioned | 2016-03-31T13:57:26Z | - |
dc.date.available | 2016-03-31T13:57:26Z | - |
dc.date.created | 2009-03-18 | - |
dc.date.issued | 1998-01 | - |
dc.identifier.issn | 0031-3203 | - |
dc.identifier.other | 1998-OAK-0000000048 | - |
dc.identifier.uri | https://oasis.postech.ac.kr/handle/2014.oak/20898 | - |
dc.description.abstract | This paper presents a novel method of velocity field estimation for the points on moving contours in a 2-D image sequence. The method determines the corresponding point in the next image frame by considering the curvature change of a given point on the contour. Traditional methods produce errors in optical flow estimation for points with low curvature variation, since they compute solutions by approximating the normal optical flow. The proposed method computes optical flow vectors of contour points by minimizing curvature changes. As a first step, snakes are used to locate smooth curves in 2-D imagery. Thereafter, the extracted curves are tracked continuously. Each point on a contour has a unique corresponding point on the contour in the next frame whenever the curvature distribution of the contour varies smoothly. The experimental results showed that the proposed method computes accurate optical flow vectors for various moving contours. (C) 1997 Pattern Recognition Society. Published by Elsevier Science Ltd. | - |
dc.description.statementofresponsibility | X | - |
dc.language | English | - |
dc.publisher | PERGAMON-ELSEVIER SCIENCE LTD | - |
dc.relation.isPartOf | PATTERN RECOGNITION | - |
dc.subject | contour motion | - |
dc.subject | optical flow | - |
dc.subject | snakes | - |
dc.subject | tracking | - |
dc.subject | contour matching | - |
dc.subject | OPTICAL-FLOW | - |
dc.subject | CORNER DETECTION | - |
dc.subject | TRACKING | - |
dc.title | Contour motion estimation from image sequences using curvature information | - |
dc.type | Article | - |
dc.contributor.college | Department of Computer Science and Engineering | - |
dc.identifier.doi | 10.1016/s0031-3203(97)00031-9 | - |
dc.author.google | Park, JS | - |
dc.author.google | Han, JH | - |
dc.relation.volume | 31 | - |
dc.relation.issue | 1 | - |
dc.relation.startpage | 31 | - |
dc.relation.lastpage | 39 | - |
dc.contributor.id | 10077431 | - |
dc.relation.journal | PATTERN RECOGNITION | - |
dc.relation.index | SCI-level, SCOPUS-indexed paper | - |
dc.relation.sci | SCI | - |
dc.collections.name | Journal Papers | - |
dc.type.rims | ART | - |
dc.identifier.bibliographicCitation | PATTERN RECOGNITION, v.31, no.1, pp.31 - 39 | - |
dc.identifier.wosid | 000071506300004 | - |
dc.date.tcdate | 2019-01-01 | - |
dc.citation.endPage | 39 | - |
dc.citation.number | 1 | - |
dc.citation.startPage | 31 | - |
dc.citation.title | PATTERN RECOGNITION | - |
dc.citation.volume | 31 | - |
dc.contributor.affiliatedAuthor | Han, JH | - |
dc.identifier.scopusid | 2-s2.0-0031633646 | - |
dc.description.journalClass | 1 | - |
dc.description.wostc | 14 | - |
dc.type.docType | Article | - |
dc.subject.keywordPlus | OPTICAL-FLOW | - |
dc.subject.keywordPlus | CORNER DETECTION | - |
dc.subject.keywordPlus | TRACKING | - |
dc.subject.keywordAuthor | contour motion | - |
dc.subject.keywordAuthor | optical flow | - |
dc.subject.keywordAuthor | snakes | - |
dc.subject.keywordAuthor | tracking | - |
dc.subject.keywordAuthor | contour matching | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Engineering | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
library@postech.ac.kr Tel: 054-279-2548
Copyright © 2017 Pohang University of Science and Technology. All rights reserved.