Full metadata record
DC Field | Value | Language
dc.contributor.author | Rieger, Anke | de
dc.date.accessioned | 2004-12-06T12:53:38Z | -
dc.date.available | 2004-12-06T12:53:38Z | -
dc.date.created | 1995 | de
dc.date.issued | 1999-10-29 | de
dc.identifier.issn | 0943-4135 | de
dc.identifier.uri | http://hdl.handle.net/2003/2590 | -
dc.identifier.uri | http://dx.doi.org/10.17877/DE290R-5094 | -
dc.description.abstract | We address the problem of guiding a robot so that it can decide, based on perceived sensor data, which future actions to take in order to reach a goal. To realize this guidance, the robot has access to a (probabilistic) automaton (PA) whose final states represent concepts that have to be recognized in order to verify that a goal has been achieved. The contribution of this work is to learn these PAs from classified sensor data of robot traces through known environments. Within this framework, we account for the uncertainties arising from ambiguous perceptions. We introduce a knowledge structure, called a prefix tree, in which the sample data, represented as cases, are organized. The prefix tree is used to derive and estimate the parameters of deterministic as well as probabilistic automaton models, which reflect the knowledge implicit in the data and which are used for recognition in a restricted first-order logic framework. | en
dc.format.extent | 257726 bytes | -
dc.format.extent | 796763 bytes | -
dc.format.mimetype | application/pdf | -
dc.format.mimetype | application/postscript | -
dc.language.iso | en | de
dc.publisher | Universität Dortmund | de
dc.relation.ispartofseries | Forschungsberichte des Lehrstuhls VIII, Fachbereich Informatik der Universität Dortmund ; 18 | de
dc.subject.ddc | 004 | de
dc.title | Inferring probabilistic automata from sensor data for robot navigation | en
dc.type | Text | de
dc.type.publicationtype | report | -
dcterms.accessRights | open access | -
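
The abstract describes organizing classified sensor traces in a prefix tree and estimating the parameters of a probabilistic automaton from it. The Python sketch below illustrates that general idea only; the trace format, node layout, and relative-frequency estimate are illustrative assumptions, not the report's actual algorithm.

    # Hypothetical sketch: a frequency prefix tree over classified sensor traces,
    # from which transition probabilities of a probabilistic automaton are
    # estimated by relative frequencies. Not the report's implementation.
    from collections import defaultdict

    class PrefixTreeNode:
        def __init__(self):
            self.counts = defaultdict(int)   # sensor symbol -> how often it was observed here
            self.children = {}               # sensor symbol -> child node
            self.final = defaultdict(int)    # concept label -> how often a trace ended here

    def build_prefix_tree(traces):
        """traces: iterable of (symbols, concept) pairs, e.g. (['wall', 'door'], 'room_A')."""
        root = PrefixTreeNode()
        for symbols, concept in traces:
            node = root
            for s in symbols:
                node.counts[s] += 1
                node = node.children.setdefault(s, PrefixTreeNode())
            node.final[concept] += 1
        return root

    def transition_probabilities(node):
        """Relative-frequency estimate of the outgoing transition distribution of one state."""
        total = sum(node.counts.values()) + sum(node.final.values())
        return {s: c / total for s, c in node.counts.items()}

    # Example: two noisy traces of the same corridor, both ending in the same concept.
    traces = [(['wall', 'door', 'wall'], 'room_A'),
              (['wall', 'wall', 'door'], 'room_A')]
    tree = build_prefix_tree(traces)
    print(transition_probabilities(tree))    # {'wall': 1.0} at the root state

In such a sketch, ambiguous perceptions show up as several outgoing symbols with nonzero counts at the same node, which is where a probabilistic rather than a deterministic automaton model becomes necessary.
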
Appears in Collections: LS 08 Künstliche Intelligenz

Files in This Item:
File | Description | Size | Format
report18_ps.pdf | DNB | 251.69 kB | Adobe PDF
report18_ps.ps | - | 778.09 kB | Postscript


This item is protected by original copyright


