The robot cleans up
M. E. Messinger and
R. J. Nowakowski
Additional contact information
M. E. Messinger: Dalhousie University
R. J. Nowakowski: Dalhousie University
Journal of Combinatorial Optimization, 2009, vol. 18, issue 4, article 3, pp. 350-361
Abstract:
Imagine a large building with many corridors. A robot cleans these corridors in a greedy fashion: the next corridor cleaned is always the dirtiest incident to the robot's current location. We determine bounds on the minimum, s(G), and maximum, S(G), number of time steps (over all edge weightings) before every edge of a graph G has been cleaned. We show that Eulerian graphs have a self-stabilizing property that holds for any initial edge weighting: after the initial cleaning of all edges, every subsequent cleaning requires s(G) time steps. Finally, we show that the only self-stabilizing trees are a subset of the superstars.
Keywords: Cleaning process; Searching; Greedy algorithms; Edge traversing
Date: 2009
Downloads: http://link.springer.com/10.1007/s10878-009-9236-7 (abstract, text/html)
Access to the full text of the articles in this series is restricted.
Persistent link: https://EconPapers.repec.org/RePEc:spr:jcomop:v:18:y:2009:i:4:d:10.1007_s10878-009-9236-7
Ordering information: This journal article can be ordered from
https://www.springer.com/journal/10878
DOI: 10.1007/s10878-009-9236-7
Journal of Combinatorial Optimization is currently edited by My T. Thai