Full metadata record
DC Field	Value	Language
dc.contributor.author	Dick, A.	en
dc.contributor.author	Torr, P.	en
dc.contributor.author	Cipolla, R.	en
dc.identifier.citation	International Journal of Computer Vision, 2004; 60(2):111-134	en
dc.description	The original publication can be found at www.springerlink.com	en
dc.description.abstract	This paper describes the automatic acquisition of three-dimensional architectural models from short image sequences. The approach is Bayesian and model based. Bayesian methods necessitate the formulation of a prior distribution; however, designing a generative model for buildings is a difficult task. To overcome this, a building is described as a set of walls together with a ‘Lego’ kit of parameterised primitives, such as doors or windows. A prior on wall layout, and a prior on the parameters of each primitive, can then be defined. Part of this prior is learnt from training data and part comes from expert architects. The validity of the prior is tested by generating example buildings using MCMC and verifying that plausible buildings are generated under varying conditions. The same MCMC machinery can also be used to optimise the structure recovery, this time generating a range of possible solutions from the posterior. The fact that a range of solutions can be presented allows the user to select the best when the structure recovery is ambiguous.	en
dc.description.statementofresponsibility	A. R. Dick, P. H. S. Torr and R. Cipolla	en
dc.publisher	Kluwer Academic Publ	en
dc.subject	architectural modelling; structure and motion; object recognition	en
dc.title	Modelling and interpretation of architecture from several images	en
dc.type	Journal article	en
pubs.library.collection	Computer Science publications	en
dc.identifier.orcid	Dick, A. [0000-0001-9049-7345]	en
Appears in Collections: Computer Science publications
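The abstract above tests a learned prior by generating example buildings with MCMC. A minimal sketch of that idea, using Metropolis–Hastings to draw samples from a toy Gaussian prior on a single primitive parameter (a window width); the parameter name, prior mean, and standard deviation here are illustrative assumptions, not values from the paper:

```python
import math
import random


def log_prior(width):
    """Log of a Gaussian prior on window width.

    Hypothetical values: mean 1.2 m, standard deviation 0.3 m.
    """
    return -0.5 * ((width - 1.2) / 0.3) ** 2


def sample_prior(n_steps=10_000, step=0.1, seed=0):
    """Draw samples from the prior with a random-walk Metropolis chain."""
    rng = random.Random(seed)
    width = 1.0  # arbitrary starting point
    samples = []
    for _ in range(n_steps):
        proposal = width + rng.gauss(0.0, step)
        # Accept with probability min(1, prior(proposal) / prior(width)).
        if math.log(rng.random()) < log_prior(proposal) - log_prior(width):
            width = proposal
        samples.append(width)
    return samples


samples = sample_prior()
mean_width = sum(samples) / len(samples)
```

In the paper's setting the state is far richer (wall layout plus the parameters of every door and window primitive), and the same chain is reused on the posterior to propose alternative reconstructions; the sketch only shows the sampling mechanism in its simplest form.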

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.