Angular-Driven Feedback Restoration Networks for Imperfect Sketch Recognition

Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review

4 Scopus Citations

Original language: English
Pages (from-to): 5085-5095
Number of pages: 11
Journal / Publication: IEEE Transactions on Image Processing
Online published: 15 Apr 2021
Publication status: Published - 2021



Automatic hand-drawn sketch recognition is an important task in computer vision. However, the vast majority of prior works focus on exploiting the power of deep learning to achieve better accuracy on complete and clean sketch images, and thus fail to achieve satisfactory performance when applied to incomplete or corrupted sketch images. To address this problem, we first develop two datasets that contain different levels of scrawled and incomplete sketches. Then, we propose an angular-driven feedback restoration network (ADFRNet), which first detects the imperfect parts of a sketch and then refines them into high-quality images, to boost the performance of sketch recognition. By introducing a novel "feedback restoration loop" to deliver information between the middle stages, the proposed model can improve the quality of generated sketch images while avoiding the extra memory cost associated with popular cascading generation schemes. In addition, we employ a novel angular-based loss function to guide the refinement of sketch images and to learn a powerful discriminator in the angular space. Extensive experiments conducted on the proposed imperfect sketch datasets demonstrate that the proposed model efficiently improves the quality of sketch images and achieves superior performance over current state-of-the-art methods.

Research Area(s)

  • angular-based loss function, attention module, Convolution, Deep learning, feedback restoration loop, Image recognition, Image restoration, Imperfect sketch recognition, Semantics, Support vector machines, Task analysis

Bibliographic Note

Information for this record is supplemented by the author(s) concerned.

Download Statistics

No data available