android, android-sdk-2.3

Using Android as a Touch Screen HMI


I've been asked to develop an HMI application on an Android tablet, but I'm not sure whether a tablet is a good solution (I have zero experience developing for Android).

The project will consist of two hardware devices:

  1. A device that monitors and controls water systems (this is the brain of the system)
  2. A touch screen that allows a human to view and interact with the brain (this is the HMI)

The brain and HMI will be built in a single box to make it appear as though it's one device.

Typically, the devices used for HMIs are built with the intent of serving one particular application, and because they usually run Linux, Windows Embedded, etc., the developer has a lot of control over how the HMI runs, looks, and feels.

Can anyone fill me in on how much control the Android SDK gives the developer? i.e.

  • Does the Android SDK make it possible for the developer to do pretty much whatever s/he wants to do to the device?
  • Can the developer lock down (deny access to) specific features of the Android tablet?

I have a handful of other concerns about using an Android tablet for this project, but I suppose the answers to the questions above will bring me closer to making an informed decision on a path forward.


Solution

  • Does the Android SDK make it possible for the developer to do pretty much whatever s/he wants to do to the device?

    No. The Android SDK makes it possible for the developer to write apps. If you want to "do pretty much whatever", you would need to build your own custom firmware to run on that device. Such custom firmware, IMHO, is pretty much de rigueur for this sort of industrial situation anyway.

  • Can the developer lock down (deny access to) specific features of the Android tablet?

    Via custom firmware, yes. Via the Android SDK, generally no.
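    To give a feel for how far the plain SDK can take you, below is a minimal sketch of the common "home replacement" kiosk approach: a full-screen activity that keeps the screen on, swallows the Back key, and is registered in AndroidManifest.xml as a HOME launcher (an intent-filter with android.intent.category.HOME and android.intent.category.DEFAULT) so pressing Home returns to it. The package, class, and layout names are hypothetical, and even with this in place the Home/Recents behavior and system settings cannot be reliably blocked without custom firmware or a rooted device.

```java
// Hypothetical HMI activity for a Gingerbread-era (Android 2.3) tablet.
// Register it in AndroidManifest.xml with an intent-filter for
// android.intent.category.HOME and android.intent.category.DEFAULT so it
// acts as the launcher; that is roughly the limit of SDK-only lock-down.
package com.example.hmi;

import android.app.Activity;
import android.os.Bundle;
import android.view.Window;
import android.view.WindowManager;

public class HmiActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Hide the title bar and the status bar so only the HMI UI is visible.
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                             WindowManager.LayoutParams.FLAG_FULLSCREEN);

        // Keep the display on; an operator panel should never blank out.
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

        // R.layout.hmi_main is a placeholder for your actual HMI layout.
        setContentView(R.layout.hmi_main);
    }

    @Override
    public void onBackPressed() {
        // Swallow the Back key so the operator cannot navigate away.
        // (Home and Recents cannot be blocked from the SDK alone.)
    }
}
```

    This gets you a single-purpose, always-on screen, but it is still just an app sitting on stock Android; anything stronger (disabling USB debugging, the notification shade, factory reset, etc.) needs custom firmware, as noted above.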