Abstract:
Fall-related injuries are one of the biggest problems the elderly and visually impaired populations face every day. The Centers for Disease Control and Prevention reported that about one-third of Americans over the age of 65 fall every year. According to the American Foundation for the Blind, 25 million Americans suffer from total or partial vision loss, and they are twice as likely to fall as their sighted counterparts. This creates a need for preventative systems that can detect objects that may constitute a tripping hazard. This paper describes the implementation of a component of a much larger system that will allow elderly and visually impaired people to navigate safely in indoor environments using off-the-shelf smartphones. The component is a floor detection module that identifies the floor ahead of a walking person in structured and unstructured environments in real time. A structured environment is an area with a well-defined shape, such as a hallway; an unstructured environment is an area without a known shape (not all rooms are the same). Floor detection is a challenging task given the real-time nature of the system and the use of resource-constrained devices such as smartphones. The evaluation of the system showed that it can work in real time, with a run time of 1 s. The accuracy of the floor detection module was measured at 87.6% in unstructured environments and 93% in structured environments.