Activity Recognition in Android — Still, Walking, Running, Driving and much more

Nowadays, almost everyone has a smartphone, and we use them throughout our day-to-day lives. One of the best things about the Android applications on these phones is that they try to understand their users better and better. Today, many applications use the user's location to provide location-related feeds. A common example is a news application that takes your current location and presents news relevant to that location.

If you are an Android developer, then to give your users a better experience of your application, you have to understand them better. You should know what your users are doing at any instant of time: the more you know about your users, the better the application you can build for them. Many applications already make use of this kind of Activity Recognition. For example, a distance-tracking app can start recording when you start driving a car or a bike and stop when you stop driving, and in this way it can find the distance you travelled on a particular day. Another example is a health and fitness app that determines how many metres or kilometres you walked or ran on a particular day and, from that, the calories you burnt.

So, in this blog, we will learn how to use the Activity Recognition feature in our Android application to detect whether a user is Still, Walking, Running, Driving, or doing something else. So, let’s get started.

Activity Recognition Client

In order to find the activity that the user is performing at a particular moment, you would have to constantly read the sensors of the device and then feed the collected data to some Machine Learning algorithm to work out which activity the user is currently doing. But hold on! Do we need to learn Machine Learning algorithms just to recognize the user's activities? No, you don't need to implement any Machine Learning algorithm yourself to detect activities.

In Android, we have the Activity Recognition Client, which wakes up the device at a regular interval, collects data from the device's sensors, and then uses that data, with the help of some Machine Learning algorithm, to determine the ongoing activities. All you need to do is use the Activity Recognition Client and the API will do the rest for you.

The Activity Recognition Client returns a list of activities that the user might be doing, each with a confidence percentage. This confidence tells you how sure the API is about an activity: for example, if an activity is reported with a confidence of more than 75%, it is quite likely that the user is actually performing it. In other words, the confidence parameter tells you the probability that a given activity is being performed by the user.
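
To make the confidence value concrete, here is a tiny sketch (separate from the project we build below) showing how the reported activities could be filtered by confidence. ActivityRecognitionResult and DetectedActivity come from Google Play Services, and the threshold of 75 is just an example:

import com.google.android.gms.location.ActivityRecognitionResult
import com.google.android.gms.location.DetectedActivity

// Keep only the activities whose confidence is at least 75 out of 100.
// `result` is an ActivityRecognitionResult delivered by the API.
fun likelyActivities(result: ActivityRecognitionResult): List<DetectedActivity> =
        result.probableActivities.filter { it.confidence >= 75 }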

Activities detected by the Activity Recognition Client

The Activity Recognition Client can detect the following activities (a small sketch after the list shows the constants these correspond to in code):

  • STILL: Detected when the mobile device is not moving, i.e. the user is sitting somewhere or the device is lying still.
  • ON_FOOT: Detected when the device is moving at a normal human pace, i.e. the user carrying it is either walking or running.
  • WALKING: A sub-activity of ON_FOOT, detected when the user carrying the device is walking.
  • RUNNING: Also a sub-activity of ON_FOOT, detected when the user carrying the device is running.
  • IN_VEHICLE: Detected when the device is in a bus, a car, or some other kind of vehicle, i.e. the user holding it is travelling in a vehicle.
  • ON_BICYCLE: Detected when the device, or the user carrying it, is on a bicycle.
  • TILTING: Detected when the device is being lifted and its angle relative to a flat surface is changing.
  • UNKNOWN: Reported when the Activity Recognition Client is unable to determine the current activity of the device.
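
Each of these activities corresponds to an Int constant on the DetectedActivity class, which is what the API actually reports. As a quick illustration (we will write a fuller version of this mapping in MainActivity later), the constants can be turned into readable labels like this:

import com.google.android.gms.location.DetectedActivity

// Illustrative helper: map a DetectedActivity type constant to a readable label.
fun activityLabel(type: Int): String = when (type) {
    DetectedActivity.STILL -> "Still"
    DetectedActivity.ON_FOOT -> "On Foot"
    DetectedActivity.WALKING -> "Walking"
    DetectedActivity.RUNNING -> "Running"
    DetectedActivity.ON_BICYCLE -> "On Bicycle"
    DetectedActivity.IN_VEHICLE -> "In Vehicle"
    DetectedActivity.TILTING -> "Tilting"
    else -> "Unknown"
}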

Activity Recognition example

We have learned the concept of the Activity Recognition Client. Now it's time for some practice :) Let's build one example with the Activity Recognition Client to understand Activity Recognition in a better way.

Open Android Studio and create a project with Empty Activity template.

Adding dependencies and permissions

The Activity Recognition Client is part of Google Play Services. So, to add the dependency of Google Play Services, add the below line to your app-level build.gradle file:

implementation 'com.google.android.gms:play-services-location:16.0.0'

After adding the Google Play Services dependency, add the ACTIVITY_RECOGNITION permission to the AndroidManifest.xml file:

<uses-permission android:name="com.google.android.gms.permission.ACTIVITY_RECOGNITION" />
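
One more thing to keep in mind: starting with Android 10 (API level 29), the platform defines its own android.permission.ACTIVITY_RECOGNITION as a runtime permission. If your app targets API 29 or higher, you will also need to declare that permission in the manifest and request it at runtime before activity updates are delivered. A minimal sketch of such a check, using the support-library helpers this project already relies on (the request code is arbitrary), could look like this:

import android.app.Activity
import android.content.pm.PackageManager
import android.os.Build
import android.support.v4.app.ActivityCompat
import android.support.v4.content.ContextCompat

// The permission string is written out because Manifest.permission.ACTIVITY_RECOGNITION
// only exists when compiling against API 29 or higher.
const val ACTIVITY_RECOGNITION_PERMISSION = "android.permission.ACTIVITY_RECOGNITION"
const val PERMISSION_REQUEST_CODE = 100 // arbitrary request code

fun requestActivityRecognitionPermission(activity: Activity) {
    if (Build.VERSION.SDK_INT >= 29 &&
            ContextCompat.checkSelfPermission(activity, ACTIVITY_RECOGNITION_PERMISSION) !=
            PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(activity,
                arrayOf(ACTIVITY_RECOGNITION_PERMISSION), PERMISSION_REQUEST_CODE)
    }
}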

Adding the UI of the application

So, the process of adding the dependencies and permissions is done. The next step is to add the UI for our Main Activity. Our application will have one TextView to display the name of the current activity, one TextView to display its confidence percentage, and two Buttons to start and stop tracking. So, the activity_main.xml file looks something like this:

<?xml version="1.0" encoding="utf-8"?>
<android.support.constraint.ConstraintLayout xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    xmlns:android="http://schemas.android.com/apk/res/android">

    <TextView
        android:id="@+id/txt_activity"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_margin="24dp"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginRight="8dp"
        android:layout_marginBottom="48dp"
        android:textAllCaps="true"
        android:textColor="@color/colorPrimary"
        android:textSize="18dp"
        android:textStyle="bold"
        app:layout_constraintBottom_toTopOf="@+id/txt_confidence"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

    <TextView
        android:id="@+id/txt_confidence"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center_horizontal"
        android:layout_margin="24dp"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:textAllCaps="true"
        android:textSize="14dp"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <Button
        android:id="@+id/btn_start_tracking"
        android:layout_width="240dp"
        android:layout_height="wrap_content"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginRight="8dp"
        android:layout_marginBottom="8dp"
        android:text="Start Tracking"
        app:layout_constraintBottom_toTopOf="@+id/btn_stop_tracking"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

    <Button
        android:id="@+id/btn_stop_tracking"
        android:layout_width="240dp"
        android:layout_height="wrap_content"
        android:layout_marginStart="8dp"
        android:layout_marginLeft="8dp"
        android:layout_marginEnd="8dp"
        android:layout_marginRight="8dp"
        android:layout_marginBottom="8dp"
        android:text="Stop Tracking"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintStart_toStartOf="parent" />

</android.support.constraint.ConstraintLayout>

Now, add the string values to the res/values/strings.xml file:

<resources>
    <string name="app_name">Activity Recognition</string>
    <string name="activity_in_vehicle">In Vehicle</string>
    <string name="activity_on_bicycle">On Bicycle</string>
    <string name="activity_on_foot">On Foot</string>
    <string name="activity_running">Running</string>
    <string name="activity_still">Still</string>
    <string name="activity_tilting">Tilting</string>
    <string name="activity_walking">walking</string>
    <string name="activity_unknown">Unknown</string>
</resources>

Now we are done with the UI part of our application, so let’s move on to the coding part.

Creating an IntentService

After making the UI of the application, our next task is to create a class that extends IntentService. This class receives the recognition results and broadcasts the list of probable activities that the user might be performing, i.e. WALKING, RUNNING, ON_FOOT, etc. The code of DetectedActivitiesIntentService is:

class DetectedActivitiesIntentService : IntentService(TAG) {

    override fun onHandleIntent(intent: Intent?) {
        if (intent == null) return

        val result = ActivityRecognitionResult.extractResult(intent) ?: return

        // Get the list of the probable activities associated with the current state of the
        // device. Each activity is associated with a confidence level, which is an int between
        // 0 and 100.
        val detectedActivities = result.probableActivities

        for (activity in detectedActivities) {
            broadcastActivity(activity)
        }
    }

    // Broadcast each detected activity and its confidence locally so that
    // MainActivity can pick it up and update the UI.
    private fun broadcastActivity(activity: DetectedActivity) {
        val intent = Intent(MainActivity.BROADCAST_DETECTED_ACTIVITY)
        intent.putExtra("type", activity.type)
        intent.putExtra("confidence", activity.confidence)
        LocalBroadcastManager.getInstance(this).sendBroadcast(intent)
    }

    companion object {
        // The TAG is used to name the worker thread of the IntentService.
        private val TAG = DetectedActivitiesIntentService::class.java.simpleName
    }
}

Activity Running in Background

Our next step is to write the code for the MainActivity.kt file. But before that, one thing should be kept in mind: battery performance. If you ask the Activity Recognition Client to update the activities at a very frequent interval, it will drain the battery of the device faster. At the same time, if you want your application to continuously track the ongoing activity, it makes sense to run the detection in a background Service; the detection interval, which we will set later through DETECTION_INTERVAL_IN_MILLISECONDS, is what balances the freshness of the updates against battery consumption.

So, let's make a Service class that requests and receives activity updates in the background. Here is the code:

class BackgroundDetectedActivitiesService : Service() {

    private lateinit var mIntentService: Intent
    private lateinit var mPendingIntent: PendingIntent
    private lateinit var mActivityRecognitionClient: ActivityRecognitionClient

    internal var mBinder: IBinder = LocalBinder()

    inner class LocalBinder : Binder() {
        val serverInstance: BackgroundDetectedActivitiesService
            get() = this@BackgroundDetectedActivitiesService
    }

    override fun onCreate() {
        super.onCreate()
        mActivityRecognitionClient = ActivityRecognitionClient(this)
        mIntentService = Intent(this, DetectedActivitiesIntentService::class.java)
        mPendingIntent = PendingIntent.getService(this, 1, mIntentService, PendingIntent.FLAG_UPDATE_CURRENT)
        requestActivityUpdatesButtonHandler()
    }

    override fun onBind(intent: Intent): IBinder? {
        return mBinder
    }

    override fun onStartCommand(intent: Intent, flags: Int, startId: Int): Int {
        super.onStartCommand(intent, flags, startId)
        return Service.START_STICKY
    }

    fun requestActivityUpdatesButtonHandler() {
        val task = mActivityRecognitionClient.requestActivityUpdates(
                MainActivity.DETECTION_INTERVAL_IN_MILLISECONDS,
                mPendingIntent)

        task.addOnSuccessListener {
            Toast.makeText(applicationContext,
                    "Successfully requested activity updates",
                    Toast.LENGTH_SHORT)
                    .show()
        }

        task.addOnFailureListener {
            Toast.makeText(applicationContext,
                    "Requesting activity updates failed to start",
                    Toast.LENGTH_SHORT)
                    .show()
        }
    }

    fun removeActivityUpdatesButtonHandler() {
        val task = mActivityRecognitionClient.removeActivityUpdates(
                mPendingIntent)
        task.addOnSuccessListener {
            Toast.makeText(applicationContext,
                    "Removed activity updates successfully!",
                    Toast.LENGTH_SHORT)
                    .show()
        }

        task.addOnFailureListener {
            Toast.makeText(applicationContext, "Failed to remove activity updates!",
                    Toast.LENGTH_SHORT).show()
        }
    }

    override fun onDestroy() {
        super.onDestroy()
        removeActivityUpdatesButtonHandler()
    }

    companion object {
        private val TAG = BackgroundDetectedActivitiesService::class.java.simpleName
    }
}
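
One detail the snippets above do not show: both DetectedActivitiesIntentService and BackgroundDetectedActivitiesService must be registered inside the <application> tag of the AndroidManifest.xml file, otherwise the system will never start them. Assuming the classes live in your application's root package, the entries would look something like this:

<service
    android:name=".BackgroundDetectedActivitiesService"
    android:exported="false" />

<service
    android:name=".DetectedActivitiesIntentService"
    android:exported="false" />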

Code for MainActivity.kt

So, our final task is to write the code for the MainActivity.kt file. Here, a BroadcastReceiver is used to receive the activity updates, i.e. whenever there is a change in the detected activity, the new activity and its confidence will be received and shown on the screen. Following is the code for the MainActivity.kt file:

class MainActivity : AppCompatActivity() {

    private val TAG = MainActivity::class.java.simpleName
    internal lateinit var broadcastReceiver: BroadcastReceiver

    private lateinit var txtActivity: TextView
    private lateinit var txtConfidence: TextView
    private lateinit var btnStartTracking: Button
    private lateinit var btnStopTracking: Button

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        txtActivity = findViewById(R.id.txt_activity)
        txtConfidence = findViewById(R.id.txt_confidence)
        btnStartTracking = findViewById(R.id.btn_start_tracking)
        btnStopTracking = findViewById(R.id.btn_stop_tracking)

        btnStartTracking.setOnClickListener { startTracking() }

        btnStopTracking.setOnClickListener { stopTracking() }

        broadcastReceiver = object : BroadcastReceiver() {
            override fun onReceive(context: Context, intent: Intent) {
                if (intent.action == MainActivity.BROADCAST_DETECTED_ACTIVITY) {
                    val type = intent.getIntExtra("type", -1)
                    val confidence = intent.getIntExtra("confidence", 0)
                    handleUserActivity(type, confidence)
                }
            }
        }

        startTracking()
    }

    private fun handleUserActivity(type: Int, confidence: Int) {
        var label = getString(R.string.activity_unknown)

        when (type) {
            DetectedActivity.IN_VEHICLE -> {
                label = "You are in Vehicle"
            }
            DetectedActivity.ON_BICYCLE -> {
                label = "You are on Bicycle"
            }
            DetectedActivity.ON_FOOT -> {
                label = "You are on Foot"
            }
            DetectedActivity.RUNNING -> {
                label = "You are Running"
            }
            DetectedActivity.STILL -> {
                label = "You are Still"
            }
            DetectedActivity.TILTING -> {
                label = "Your phone is Tilted"
            }
            DetectedActivity.WALKING -> {
                label = "You are Walking"
            }
            DetectedActivity.UNKNOWN -> {
                label = "Unkown Activity"
            }
        }

        Log.e(TAG, "User activity: $label, Confidence: $confidence")

        if (confidence > MainActivity.CONFIDENCE) {
            txtActivity.text = label
            txtConfidence.text = "Confidence: $confidence"
        }
    }

    override fun onResume() {
        super.onResume()

        LocalBroadcastManager.getInstance(this).registerReceiver(broadcastReceiver,
                IntentFilter(MainActivity.BROADCAST_DETECTED_ACTIVITY))
    }

    override fun onPause() {
        super.onPause()

        LocalBroadcastManager.getInstance(this).unregisterReceiver(broadcastReceiver)
    }

    private fun startTracking() {
        val intent = Intent(this@MainActivity, BackgroundDetectedActivitiesService::class.java)
        startService(intent)
    }

    private fun stopTracking() {
        val intent = Intent(this@MainActivity, BackgroundDetectedActivitiesService::class.java)
        stopService(intent)
    }

    companion object {

        val BROADCAST_DETECTED_ACTIVITY = "activity_intent"

        internal val DETECTION_INTERVAL_IN_MILLISECONDS: Long = 1000

        val CONFIDENCE = 70
    }
}

Now, run the application, start tracking, and move around a little: you will see the detected activity along with its confidence displayed on the screen.

Conclusion

In this blog, we learned how to use the Activity Recognition Client in our application to determine the activities that users are performing at any instant of time. The Activity Recognition Client reports the ongoing activities along with a confidence percentage that tells you how sure the API is about each one.

So, that’s it for this blog. Keep Learning :)

Team MindOrks!