Université de Liège Réseau des Bibliothèques

Institutional server for doctoral theses

Abstract page for ULgetd-10232012-110510

Author: Declercq, Arnaud
Author's e-mail: Arnaud.Declercq@BuroHappold.com
URN: ULgetd-10232012-110510
Language: French
Title: Real-time Simultaneous Modelling and Tracking of Articulated Objects
Degree: Doctorat en sciences de l'ingénieur (Doctorate in Engineering Sciences)
Department: FSA - Department of Electricity, Electronics and Computer Science
Jury:
Name | Role
Sebe, Nicu | Committee Member
Wehenkel, Louis | Committee Member
Van Droogenbroeck, Marc | Committee Chair
Piater, Justus | Director
Verly, Jacques | Director
Keywords:
  • computer vision
  • tracking
  • learning
Defence date: 2012-09-25
Access type: Public/Internet
Abstract:

In terms of capability, there is still a huge gap between the human visual system and existing computer vision algorithms. To achieve results of sufficient quality, these algorithms are generally extremely specialised in the task they have been designed for. All the knowledge available during their implementation is used to bias the output result and/or facilitate the initialisation of the system. This leads to increased robustness but lower reusability of the code. In most cases, it also severely limits the freedom of the user by constraining them to a limited set of possible interactions.

In this thesis, we propose to go in the opposite direction by developing a general framework capable of both tracking and learning objects as complex as articulated objects. The robustness will be achieved by using one task to assist the other. The method should be completely unsupervised, with no prior knowledge about the appearance or shape of the objects encountered (although we decided to focus on rigid and articulated objects). With this framework, we hope to provide directions for a more difficult and distant goal: that of completely eliminating the time-consuming prior design of object models in computer vision applications. This long-term target will reduce the time and cost of implementing computer vision applications. It will also provide greater freedom in the range of objects that can be used by the program.

Our research focuses on three main aspects of this framework. The first is to create an object description effective on a wide variety of complex objects and able to assist the object tracking while being learnt. The second is to provide both tracking and learning methods that can be executed simultaneously in real time. This is particularly challenging for tracking when a large number of features is involved. Finally, our most challenging task, and the core of this thesis, is to design robust tracking and learning solutions able to assist each other without creating counter-productive bias when one of them fails.
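The mutual-assistance idea described in the abstract (confident tracking feeds the learnt model, and the learnt model fills in when tracking fails) can be sketched roughly as follows. This is an illustrative Python sketch, not the thesis's actual algorithm: the `OnlineObjectModel` class, the scalar per-feature descriptors, and the fixed confidence threshold are all simplifying assumptions made here for illustration, whereas the thesis works with articulated object structure.

```python
class OnlineObjectModel:
    """Running appearance model: a per-feature running mean.

    Illustrative only; each "feature" is reduced to a single scalar
    descriptor (a hypothetical simplification, not the thesis's model).
    """

    def __init__(self):
        self.means = {}    # feature id -> running mean descriptor
        self.counts = {}   # feature id -> number of confident updates

    def update(self, feature_id, observation):
        # Incremental running mean over confident observations only.
        n = self.counts.get(feature_id, 0)
        mean = self.means.get(feature_id, observation)
        self.means[feature_id] = (mean * n + observation) / (n + 1)
        self.counts[feature_id] = n + 1

    def predict(self, feature_id):
        # Returns None until the feature has been learnt at least once.
        return self.means.get(feature_id)


def track_and_learn(frames, confidence_threshold=0.5):
    """One pass of simultaneous tracking and learning.

    `frames` is a list of dicts: feature id -> (observation, confidence).
    The model is updated only from confident observations, so a tracking
    failure does not bias the learnt model; conversely, the model's
    prediction replaces the observation when tracking is unreliable.
    """
    model = OnlineObjectModel()
    trajectory = []
    for frame in frames:
        estimates = {}
        for fid, (obs, conf) in frame.items():
            if conf >= confidence_threshold:
                model.update(fid, obs)      # learning assisted by tracking
                estimates[fid] = obs
            else:
                pred = model.predict(fid)   # tracking assisted by learning
                estimates[fid] = pred if pred is not None else obs
        trajectory.append(estimates)
    return model, trajectory
```

The confidence gate is the key design point: without it, a tracking failure would corrupt the model, and the corrupted model would in turn mislead later tracking, producing exactly the counter-productive bias the abstract warns against.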

Other version:
Files:
File name | Size | Estimated download time, 56K modem (HH:MM:SS) | Estimated download time, ADSL (HH:MM:SS)
[Public/Internet] PhD_Thesis_Declercq_Arnaud.pdf | 17.79 MB | 00:42:20 | 00:01:34

Although every effort has been made to respect the rights of rights holders, any rights holder who finds that a work over which they hold rights has been used in BICTEL/e ULg without their explicit authorisation is invited to contact the Direction du Réseau des Bibliothèques as soon as possible.


© Réseau des Bibliothèques de l'ULg, Grande traverse, 12 B37 4000 LIEGE