Topic information

Topic ID: 585
Partner Email: dtdim@iinf.bas.bg
Project Title: Real-Time Segmentation and Tracing of a Mobile Object in a Video Clip
Abstract: A team at the Institute of Information and Communication Technologies of BAS is working on an experimental system, EFIRS, for efficient (fast and noise-resistant) access to databases of images (DBI), mainly following the concept of CBIR (Content-Based Image Retrieval). Briefly put, EFIRS extracts the relevant content from a given input image and uses it to organize a fast index search for similar images in a given DBI. EFIRS was targeted at the practice of the Patent Department of the Republic of Bulgaria, more precisely at their large DBI of professional trademarks and service marks. At the same time, some other, in certain respects non-standard, applications of the access methods developed in EFIRS also deserve interest, such as those intended for efficient search for 3D objects, where the corresponding 3D samples in the DB are obtained by so-called “surrounding filming”.

The idea of recognition from a database (DB) of 3D samples is simple: the DBI must store a sufficient number of images (i.e., 2D projections) of every object the system has to recognize. The 2D projections of every 3D object in the DBI must differ from one another and cover sufficiently many viewpoints, evenly distributed over the space sector in front of the camera, so that the noise-resistance mechanism of EFIRS is activated (by “smoothing” from one projection to another). If “a well structured scenario” is followed, the surrounding filming yields enough information for storing in the DBI. Surrounding filming without any scenario is in fact an ordinary video sequence of a given object in front of the camera. This idea of surrounding filming is what inspires the topic of this diploma work.

Purpose of the diploma work: to develop and experiment with a method and a software program for real-time processing of video clips and for extraction (segmentation) of a mobile object. The mobile object is assumed to be the “biggest spot” in most of the frames of the video clip, for example a face, a gesticulating hand in the sign language of people with hearing loss, an automobile in motion, etc. The problem is to be solved in two settings: (1) an immobile camera, i.e., the object makes local movements in front of the camera, and (2) a mobile camera that traces the object's movement in the environment.

The work includes the following parts:
- A module for extraction of separate frames from the video clip;
- A module for segmentation of a mobile object in a series of frames from an immobile camera, following the principle that the traced object is dominant (i.e., the biggest moving spot in the clip); an illustrative sketch of this principle is given below;
- A module for estimation of the movement of the scene with respect to the working video camera;
- A module for estimation of the relative movement of the object with respect to the environment in the scene considered;
- A module for extraction (segmentation) of the object by deletion of the environment from the clip frames.
A test system is to be built to demonstrate the modules developed. Recommended environment for the software experiments: Borland C/C++ Builder.
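The following C++ fragment is only an illustrative sketch of the dominant-spot principle behind the second module, not part of the topic materials: it differences two consecutive grayscale frames from an immobile camera, thresholds the result, and keeps the largest connected region as the segmented mobile object. The Frame structure, the threshold value, and the function name segmentDominantMotion are assumptions introduced here for the sketch.

// Sketch only: dominant moving spot via frame differencing and
// largest connected component (immobile-camera case).
#include <cstdlib>
#include <queue>
#include <vector>

struct Frame {
    int width = 0, height = 0;
    std::vector<unsigned char> pixels;  // row-major grayscale, width*height bytes
};

// Returns a binary mask (1 = mobile object) containing only the largest
// connected region of inter-frame change; an all-zero mask if nothing moved.
std::vector<unsigned char> segmentDominantMotion(const Frame& prev,
                                                 const Frame& curr,
                                                 int threshold = 25) {  // assumed value
    const int w = curr.width, h = curr.height, n = w * h;
    std::vector<unsigned char> moving(n, 0), mask(n, 0);

    // 1. Frame differencing: mark pixels whose intensity changed noticeably.
    for (int i = 0; i < n; ++i)
        moving[i] = (std::abs(int(curr.pixels[i]) - int(prev.pixels[i])) > threshold) ? 1 : 0;

    // 2. Label 4-connected components of changed pixels (BFS) and remember the biggest.
    std::vector<int> label(n, 0);
    int nextLabel = 0, bestLabel = 0, bestSize = 0;
    for (int start = 0; start < n; ++start) {
        if (!moving[start] || label[start]) continue;
        int size = 0;
        ++nextLabel;
        std::queue<int> q;
        q.push(start);
        label[start] = nextLabel;
        while (!q.empty()) {
            int p = q.front(); q.pop();
            ++size;
            const int x = p % w, y = p / w;
            const int  nbr[4] = {p - 1, p + 1, p - w, p + w};
            const bool ok[4]  = {x > 0, x < w - 1, y > 0, y < h - 1};
            for (int k = 0; k < 4; ++k)
                if (ok[k] && moving[nbr[k]] && !label[nbr[k]]) {
                    label[nbr[k]] = nextLabel;
                    q.push(nbr[k]);
                }
        }
        if (size > bestSize) { bestSize = size; bestLabel = nextLabel; }
    }

    // 3. Keep only the dominant (largest) spot as the segmented object.
    if (bestLabel)
        for (int i = 0; i < n; ++i)
            if (label[i] == bestLabel) mask[i] = 1;
    return mask;
}

For the mobile-camera case (modules three and four), such differencing would only make sense after the global motion of the scene has been estimated and compensated, e.g., by aligning consecutive frames before the subtraction step.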
Advisor: Assoc. Prof. Dr Dimo T. Dimov
Link:
Degree: Bachelor
Keywords:
Artificial intelligence & Neural networks
Computer vision
Image processing