Semi-automated DroidBot: Semi-automated Android UI testing
- Mentors
- Hanno Lemoine, Yuanchun Li
- Organization
- The Honeynet Project
The solution will consist of an Android app that reads the input provided by the user and generates an interaction model, which DroidBot can then read and use to automate the appropriate view.
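As an illustration only, the recorded interactions could be held in a small data model along these lines; the class and field names are assumptions made for this sketch, not DroidBot's actual interaction-model format.

```kotlin
// Illustrative sketch only: names and fields are assumptions, not DroidBot's format.
data class RecordedEvent(
    val type: String,                    // e.g. "touch", "gesture", "text"
    val timestampMs: Long,
    val x: Int? = null,                  // screen coordinates for touch events
    val y: Int? = null,
    val text: String? = null,            // typed text captured by the logging IME
    val targetResourceId: String? = null // element the event was reconciled to
)

data class InteractionModel(
    val packageName: String,
    val events: List<RecordedEvent>
)
```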
First, a layout dump will be generated for every new page. The dumpWindowHierarchy method can be used to write the layout to a file, and this file can later be read to understand which UI elements the recorded events are associated with. This will be useful later, when the input events need to be reconciled with the UI.
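A minimal sketch of the layout dump step, assuming the recorder runs in an instrumentation context where UiAutomator's UiDevice is available; the function name and file-naming scheme are illustrative.

```kotlin
import androidx.test.platform.app.InstrumentationRegistry
import androidx.test.uiautomator.UiDevice
import java.io.File

// Sketch only: assumes an instrumentation context where UiAutomator is available.
fun dumpCurrentLayout(outputDir: File, pageIndex: Int): File {
    val device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation())
    // Write the view hierarchy of the current screen as XML for later reconciliation.
    val dumpFile = File(outputDir, "layout_$pageIndex.xml")
    device.dumpWindowHierarchy(dumpFile)
    return dumpFile
}
```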
Touch input events will be recorded using the getevent tool. Simple touches and gestures will be differentiated. Only events whose coordinates fall within the layout area will be processed by the system.
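The sketch below shows one way a raw `getevent -lt` stream could be classified into simple touches versus gestures, by comparing the distance between the first and last coordinates of a trace against a threshold. The event labels and the 30-pixel threshold are assumptions; actual getevent output varies between devices.

```kotlin
import kotlin.math.hypot

// Sketch: classify raw touch traces from `getevent -lt` output as "touch" or "gesture".
// Event label names follow a typical multi-touch stream; exact output varies by device.
fun classifyTouchTraces(lines: Sequence<String>, tapThresholdPx: Double = 30.0): List<String> {
    val results = mutableListOf<String>()
    var xs = mutableListOf<Int>()
    var ys = mutableListOf<Int>()
    var tracking = false
    // The reported coordinate is the last (hexadecimal) token on the line.
    fun hexValue(line: String) = line.trim().substringAfterLast(' ').toInt(16)

    for (line in lines) {
        when {
            "BTN_TOUCH" in line && "DOWN" in line -> {
                tracking = true; xs = mutableListOf(); ys = mutableListOf()
            }
            tracking && "ABS_MT_POSITION_X" in line -> xs.add(hexValue(line))
            tracking && "ABS_MT_POSITION_Y" in line -> ys.add(hexValue(line))
            "BTN_TOUCH" in line && "UP" in line && tracking -> {
                if (xs.isNotEmpty() && ys.isNotEmpty()) {
                    val dist = hypot((xs.last() - xs.first()).toDouble(),
                                     (ys.last() - ys.first()).toDouble())
                    results += if (dist < tapThresholdPx) "touch" else "gesture"
                }
                tracking = false
            }
        }
    }
    return results
}
```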
Since getevent is not reliable for keypress input, it cannot be used for text entry. Instead, an IME that logs text input will be implemented. This also reduces logical complexity: getevent logs would otherwise have to be parsed for keyboard input, even though keyboard layouts can vary wildly.
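A minimal sketch of such a logging IME, assuming the recorder ships its own keyboard. The keyboard UI is omitted and the typeAndLog helper is a hypothetical name; any key handler would call it instead of committing text directly.

```kotlin
import android.inputmethodservice.InputMethodService
import android.util.Log
import android.view.View

// Sketch of an IME that records the text it commits.
class LoggingIme : InputMethodService() {

    // Called by the (omitted) keyboard view whenever the user enters text.
    fun typeAndLog(text: String) {
        // Record the text for the interaction model before committing it to the app.
        Log.d("LoggingIme", "text input: $text")
        currentInputConnection?.commitText(text, 1)
    }

    override fun onCreateInputView(): View {
        // A real implementation would inflate a keyboard layout here and wire its
        // keys to typeAndLog(); an empty view keeps the sketch short.
        return View(this)
    }
}
```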
The input events will be reconciled with the UI by performing each input action on the innermost element whose bounds contain the event's X and Y coordinates.
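A sketch of this lookup over a UiAutomator layout dump, assuming each node carries a bounds="[x1,y1][x2,y2]" attribute as written by dumpWindowHierarchy; among all nodes containing the point, the deepest one is chosen.

```kotlin
import java.io.File
import javax.xml.parsers.DocumentBuilderFactory
import org.w3c.dom.Element

// Sketch: map an (x, y) touch point to the innermost node of a layout dump.
data class Bounds(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left until right && y in top until bottom
}

private fun parseBounds(attr: String): Bounds? {
    // Bounds attributes look like "[0,0][1080,1920]"; pull out the four integers.
    val nums = Regex("-?\\d+").findAll(attr).map { it.value.toInt() }.toList()
    return if (nums.size == 4) Bounds(nums[0], nums[1], nums[2], nums[3]) else null
}

fun findInnermostNode(layoutDump: File, x: Int, y: Int): Element? {
    val doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().parse(layoutDump)
    var best: Element? = null
    var bestDepth = -1

    fun visit(element: Element, depth: Int) {
        val bounds = parseBounds(element.getAttribute("bounds"))
        // Keep the deepest node whose bounds contain the touch point.
        if (bounds != null && bounds.contains(x, y) && depth > bestDepth) {
            best = element
            bestDepth = depth
        }
        val children = element.childNodes
        for (i in 0 until children.length) {
            (children.item(i) as? Element)?.let { visit(it, depth + 1) }
        }
    }

    visit(doc.documentElement, 0)
    return best
}
```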
Interactions can be sent to DroidBot either as a script or by directly deserializing them into DroidBot's UTG class.
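As an example of the script route, the sketch below serializes the hypothetical InteractionModel from the first sketch to a JSON file. The field names are placeholders rather than DroidBot's actual script schema, which the real exporter would have to follow.

```kotlin
import org.json.JSONArray
import org.json.JSONObject
import java.io.File

// Sketch: export recorded interactions as JSON. Reuses the hypothetical
// InteractionModel/RecordedEvent classes sketched earlier; field names are placeholders.
fun exportInteractions(model: InteractionModel, output: File) {
    val events = JSONArray()
    for (event in model.events) {
        events.put(JSONObject().apply {
            put("type", event.type)
            put("timestamp_ms", event.timestampMs)
            event.x?.let { put("x", it) }
            event.y?.let { put("y", it) }
            event.text?.let { put("text", it) }
            event.targetResourceId?.let { put("target_resource_id", it) }
        })
    }
    val root = JSONObject()
        .put("package_name", model.packageName)
        .put("events", events)
    // Pretty-print with two-space indentation for easier inspection.
    output.writeText(root.toString(2))
}
```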