SafeVision - Designing a Body-Worn Camera for Public Safety


Project Overview

Objective

SafeVision is a wearable, touch-screen, front-facing body-worn camera. As part of a product suite, it is compatible with a companion app!

View Prototype

Target Audience

Public safety officers: police first responders / frontline, highway patrol, and corrections officers.

Timeline

8 weeks | Jan-March 2023

Role

UX/UI Designer

Contributions

Designed as part of a team of two, I was responsible for creating the interface for the body-worn camera along with the button interactions reflected on the screen; my partner designed the companion application. I took the lead on creating the style guide, the button interactions, and the logged-in vs. logged-out functionality.

Project Statement

Body-worn cameras (BWCs) are widely used by United States state and local law enforcement agencies in the performance of duties that require open and direct contact with the public. Local police and sheriffs’ offices acquire body-worn cameras to improve officer safety, increase evidence quality, reduce civilian complaints, and reduce agency liability. Current body-worn cameras, however, lack the ability to portray situations, danger cues, spatial awareness, and timing exactly as they are experienced.

How might we create an easy-to-use wearable application so that officers can quickly and accurately enter/view necessary information?

Project Scope

Design and develop an interactive mobile application (Android operating system) for public safety body-worn cameras. The form factor includes a radio component through which public safety officers can receive and relay information. The form factor and accompanying mobile application will have redundancies in interactions. Creatively, this product should be kept simple and minimal; the user should not have to go through multiple screens to complete an action.

Target Users
  • Police First Responder/ Frontline
  • Highway Patrol
  • Corrections Officers

Process

The Challenge
Research
Design
Ideate
The Outcome

The Challenge

Redesign a body worn camera (BWC) user interface considering all product requirements, constraints and physical controls.

Product Requirements
  • Types of Features
    • Capture: videos, photos, voice notes
    • Pre-buffering event capture
    • System/device notifications
    • Media tagging and integrated metadata
    • Media access and management
    • Real-time (RT) broadcasting
  • Physical Controls:
    • PTT button
    • Power button
    • Volume toggle
    • Programmable buttons (2)
    • Emergency button
    • Video record slider
  • Functions:
    • Emergency alert
    • Pre-buffering event capture
    • Video, Photo, & Audio recording
    • Speech to text notes
    • Real-time (RT) broadcasting
    • System Notifications/ Status updates
    • Device indicators (recording, battery, signal strength, recording capacity, etc.)
    • Media tagging and integrated metadata (time, date, geo tagging)
    • Media access and management

UX Requirements
  • Recording indicator is required
    • Rationale:
      • People need to be aware that they are being recorded.
      • Some states require officer announcement to civilian for recording.
      • The device must offer the ability to disable all visual & audible indicators for specialist operational roles that require presence to be concealed
  • Simplify device status information
    • Show only what is active or needs attention
  • Real-time transmission of footage from body-worn camera to a remote location
  • Automatic video recording is agency definable
  • Data integrity
    • File uploads should preserve the original format and retain any associated metadata
    • Each recorded incident should have its own file or files, with a unique file name or code (see the sketch after this list)
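
To ground the data-integrity items above, the sketch below shows one way the metadata and unique file naming could be modeled, assuming a Kotlin codebase. The IncidentMetadata type, the buildIncidentFileName helper, and the naming pattern are illustrative placeholders, not the product's actual specification.

  import java.time.Instant
  import java.time.ZoneOffset
  import java.time.format.DateTimeFormatter
  import java.util.UUID

  // Hypothetical record of a single capture; fields mirror the requirement
  // for integrated metadata (time, date, geo tagging).
  data class IncidentMetadata(
      val officerId: String,      // badge or login ID of the recording officer
      val capturedAt: Instant,    // UTC timestamp of the capture
      val latitude: Double?,      // null if no GPS fix was available
      val longitude: Double?,
      val mediaType: MediaType
  )

  enum class MediaType(val extension: String) {
      VIDEO("mp4"), PHOTO("jpg"), AUDIO("m4a")
  }

  // Illustrative naming scheme: officer ID + UTC timestamp + short unique code,
  // so every incident resolves to its own uniquely named file.
  fun buildIncidentFileName(meta: IncidentMetadata): String {
      val stamp = DateTimeFormatter
          .ofPattern("yyyyMMdd'T'HHmmss")
          .withZone(ZoneOffset.UTC)
          .format(meta.capturedAt)
      val code = UUID.randomUUID().toString().take(8)
      return "${meta.officerId}_${stamp}_${code}.${meta.mediaType.extension}"
  }

Under this hypothetical scheme a photo taken by officer 4821 would receive a name like 4821_20230312T141500_3f9a2c1d.jpg, so each capture stays in its own uniquely named file while the time, date, and location travel with it as metadata.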

The Solution

This section covers the methods and process used to address the problem, user needs, business requirements, and pain points, focusing specifically on how the features were designed to meet the objectives.

Secondary Research

To better understand the user, secondary research was conducted. Some key insights pulled from this research were:

  • BWCs are intended to increase transparency, improve police encounters, enhance evidence pertaining to an officer encounter, and provide opportunities for improvement through officer training.
  • Officers can wear body-worn cameras on their clothing (on the chest, shirt pocket, collar, or shoulder) or mounted on a helmet or glasses.
  • Police officers are open to and supportive of the use of body-worn cameras.
  • There are over 324,951 police officers currently employed in the United States.
  • The average age of an employed police officer is 39 years old.

To better understand the product, research into the Android operating system and the Motorola form factor was conducted. Some artifacts utilized were the Motorola Si500 product guide and documentation and the Android OS style guide.

Read the full secondary research report.

BWC Concept Map

Definition: Concept maps are visual representations of information. They can take the form of charts, graphic organizers, tables, flowcharts, Venn Diagrams, timelines, or T-charts.

This concept map helped me grasp the interactions and mindset of the user. It also allowed me to dive into the limitations and capabilities of the device. This tool acted as a guide and inspiration board for further understanding the form factor. Grasping the placement of the buttons allowed me to imagine the possibilities for integrating the physical controls with the touch screen. Plotting potential questions about the system helped me better understand my users' mental models around the BWC and situational touchpoints.

User Flow

Definition: A user flow is a visual representation of how the user moves through a website or application and shows what happens at each step along the way.

User Tasks
  • Turn device on/off
  • Change volume- up/down
  • Search for event capture
  • Take/ View/ Upload pictures
  • Start/ Stop/ Review/ Upload video and audio recording
  • Append captures (STT or voice notes)
  • View device indicators
  • View/Dismiss system notifications
  • Send/Cancel emergency alert
  • Receive emergency alert/view who is sending alert

The user flows reflected both the optimal path and the alternatives I believed users would take while using the system. To begin this process, I outlined the different tasks the system would need to be capable of accomplishing. Then I played out the possible touchpoints for initiating each task, ranging from pushing a device button to tapping on the screen to interacting with an alert. Laying out the steps allowed me to indicate what happens, or would happen, at each step of a user's journey. It let me better understand which critical-situation tasks needed to be simplified and which could carry more components. Planning this before designing helped me see where officers could get confused and what needed to change to make their experience smoother and easier.

Site map

Definition: A UX sitemap is a hierarchical diagram of a website or application, that shows how pages are prioritized, linked, and labeled. If a user flow is like the street view details, the sitemap is like the bird’s eye view.

Creating a site map allowed me to visualize the hierarchy of my app and determine where essential tasks would fit in. I was also able to connect the physical controls to actions done in the app. This site map included the initial touchpoint for the users along with the various screens, buttons, and notification alerts. An approach that further helped me understand the tasks and capabilities of the system was including a simple flow between the pages. For example, noting where the several recording mechanisms would save to provided context for the filing system. Knowing this set a precedent for the sketches done later.

Sketches

Definition: UX Sketching is simply rough drawing by hand, commonly used for generating, communicating and refining ideas. It can be for your own reference, or to be shared with co-workers, team managers, or clients and stakeholders.

I heavily relied on these sketches to provide context for the screen design. In addition to designing screen alternatives, they also allowed me to envision the micro-interactions of screen components. Annotating the sketches gave me a way of documenting interactions for the screen that were not drawn. Sketching also provided a map for where required components should be placed and the best path to execution. These sketches were a great way to plan out the gestures and functions users would perform along with outlining screen variations.

The Results

A body-worn camera and companion app that make detailing events quick, simple, and intuitive!

Style Guide

The style guide for this body-worn camera was created with the mindset of applying its styles and components across a product suite. The wearable BWC is compatible with a companion app.

Wireframes v1

To start off the wireframe design, the first screens focus on the key interaction touchpoints needed. Each action the user will take is reflected on the screen. The emergency device shutdown and the officer log-in process were the first two designed. Keeping in mind the device's small screen, it was key that inadvertent actuations were minimized. With the need to keep the interface simple while still supporting a multitude of tasks, bringing in the priorities from the user flows was essential.

Early on, I designed a logged-out capture feature to aid users. It allows an officer to quickly capture a voice, image, or video recording without logging in. Once logged in, the user can then save the recording as needed. Keeping in mind that a quick reaction time is critical in emergency situations, it was imperative that the officer be able to complete an action first and worry about saving it later.
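
A minimal sketch of how this logged-out capture flow could behave, assuming a Kotlin implementation: captures taken before login are held in a pending queue on the device, then attributed to the officer and saved once they sign in. PendingCapture and PendingCaptureQueue are hypothetical names standing in for however the real device stores unattributed media.

  import java.time.Instant

  // Hypothetical record of a capture made while no officer is logged in.
  data class PendingCapture(
      val filePath: String,            // where the raw media was written on-device
      val capturedAt: Instant,
      val officerId: String? = null    // unknown until an officer signs in
  )

  class PendingCaptureQueue {
      private val pending = mutableListOf<PendingCapture>()

      // Hold a capture made before login so the officer never has to stop and authenticate.
      fun hold(capture: PendingCapture) {
          pending += capture
      }

      // After login, attribute every held capture to the officer and hand it
      // off to whatever save/upload routine the device uses.
      fun flushTo(officerId: String, save: (PendingCapture) -> Unit) {
          pending.forEach { save(it.copy(officerId = officerId)) }
          pending.clear()
      }
  }

The point of the queue is simply that the capture action never blocks on authentication; attribution and saving happen afterwards, mirroring the act-first, save-later priority described above.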

Annotated Wireframes

Definition: Wireframes communicate structure, behaviors, and content for individual pages.

The annotated wireframes allowed me to define the user interactions with the BWC form factor and screen. Annotations provided a more technical view of the prototype so developers would be able to implement it. Having the annotations also allowed me to communicate feature implementations that were not possible to prototype. This was another way to highlight future "nice to haves" for the system. I laid out the annotations screen by screen, including the different iterations of a screen. For example, there are two versions of the Home page detailing a logged-out and a logged-in view. This helped with understanding the functionality available without a user signed in.

View the Annotated Wireframes

Interactive Prototypes

These prototypes brought all the previous design artifacts to life. The interactive prototypes combined both the physical controls and the touch-screen capabilities. To do this, I split my prototype into two separate versions. The first was the device screen prototype, containing all the interactions with the system. The second was a device button interaction prototype, highlighting the screen's response to the physical control touchpoints.

Device Screen Prototype

This version showcases a user's entire journey while using the BWC. It includes the optimal path to complete tasks as well as other options the user might take, ultimately simulating a typical flow.

View Device Screen Prototype!

Device Button Interaction Prototype

This prototype highlights only the interactions a user performs when a button is pushed. It does not include the other on-screen actions that would lead the user to further immerse themselves in the system.

View Device Button Prototype!
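
As context for how these button interactions might map to the Android operating system named in the project scope, below is a minimal sketch that routes hardware key presses to on-screen responses through Activity.onKeyDown. The volume keys use standard Android keycodes; the emergency keycode is a placeholder, since the PTT, emergency, and programmable buttons on a real device would expose vendor-specific keycodes, and the two UI hooks are hypothetical stand-ins for the screen responses shown in the prototype.

  import android.app.Activity
  import android.view.KeyEvent

  // Placeholder: the real PTT/emergency/programmable buttons report
  // vendor-specific keycodes, not a standard Android constant.
  private const val KEYCODE_EMERGENCY_PLACEHOLDER = KeyEvent.KEYCODE_F1

  class CameraScreenActivity : Activity() {

      override fun onKeyDown(keyCode: Int, event: KeyEvent?): Boolean =
          when (keyCode) {
              KeyEvent.KEYCODE_VOLUME_UP -> { showVolumeIndicator(up = true); true }
              KeyEvent.KEYCODE_VOLUME_DOWN -> { showVolumeIndicator(up = false); true }
              KEYCODE_EMERGENCY_PLACEHOLDER -> { showEmergencyCountdown(); true }
              else -> super.onKeyDown(keyCode, event)
          }

      // Hypothetical UI hooks standing in for the screen responses in the prototype.
      private fun showVolumeIndicator(up: Boolean) { /* update the on-screen volume overlay */ }
      private fun showEmergencyCountdown() { /* surface the emergency-alert confirmation screen */ }
  }

Each physical control would get its own branch and hook, keeping the on-screen response one step away from the button press.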


Lessons Learned
