SafeVision is a wearable, touch-screen, front-facing body-worn camera. As part of a product suite, it is compatible with a companion app!
View Prototype
8 weeks | Jan-March 2023
Role: UX/UI Designer
Designed as part of a team of two. I was responsible for creating the interface for the body-worn camera along with the button interactions reflected on the screen, taking the lead on the style guide, button interactions, and logged-in vs. logged-out functionality. My partner designed the companion application.
Body-worn cameras (BWCs) are widely used by United States state and local law enforcement agencies in the performance of duties that require open and direct contact with the public. Local police and sheriffs’ offices acquire body-worn cameras to improve officer safety, increase evidence quality, reduce civilian complaints, and reduce agency liability. Current body-worn cameras lack the ability to portray situations, danger cues, spatial awareness, and timing exactly as they are experienced.
How might we create an easy-to-use wearable application so that officers can quickly and accurately enter/view necessary information?
Design and develop an interactive mobile application (Android operating system) for public safety body-worn cameras. The form factor includes a radio component through which public safety officers can receive and relay information. Form factors and accompanying mobile applications will have redundancies in interactions. Creatively, this product should be kept simple and minimal. The user should not have to go through multiple screens to complete an action.
Redesign a body-worn camera (BWC) user interface considering all product requirements, constraints, and physical controls.
Methods and process used to address the specific problem, user needs, business requirements, and pain points. This section focuses specifically on how the features were designed to address the objectives.
To better understand the user, secondary research was conducted. Some key insights pulled from this research were:
To better understand the product, research into the Android operating system and Motorola form factor was conducted. Some artifacts utilized were the Motorola Si500 product guide and documentation and the Android OS style guide.
Read full secondary research report.
Definition: Concept maps are visual representations of information. They can take the form of charts, graphic organizers, tables, flowcharts, Venn Diagrams, timelines, or T-charts.
This concept map helped me grasp the interactions and mindset of the user. It also allowed me to dive into the limitations and capabilities of the device. This tool acted as a guide and inspiration board for further understanding the form factor. Grasping the placement of the buttons allowed me to imagine the possibilities for integrating the physical controls with the touch screen. Plotting potential questions about the system helped me better understand my user's mental models around the BWC and situational touch points.
Definition: A user flow is a visual representation of how the user moves through a website or application and shows what happens at each step along the way.
The user flows reflected both the optimal path and the alternatives I believed users would take while using the system. To begin this process, I outlined the different tasks the system would need to be capable of accomplishing. Then I played out the possible touchpoints for initiating each task. These ranged from pushing a device button to tapping on the screen to interacting with an alert. Laying out the steps allowed me to indicate what happens or would happen at each step in a user's journey. It let me better understand which critical-situation tasks needed to be simplified and which could have more components associated with them. Planning this before designing helped me anticipate where officers could get confused and what needed to change to make their experience smoother and easier.
Definition: A UX sitemap is a hierarchical diagram of a website or application, that shows how pages are prioritized, linked, and labeled. If a user flow is like the street view details, the sitemap is like the bird’s eye view.
Creating a site map allowed me to visualize the hierarchy of my app and determine where essential tasks would fit in. I was also able to connect the physical controls to actions done in the app. This site map included the initial touchpoint for users along with the various screens, buttons, and notification alerts. An approach to the site map that further helped me understand the tasks and capabilities of the system was including a simple flow between the pages. For example, noting where each of the recording mechanisms would save provided context for the filing system. Knowing this set a precedent for the sketches done later.
Definition: UX Sketching is simply rough drawing by hand, commonly used for generating, communicating and refining ideas. It can be for your own reference, or to be shared with co-workers, team managers, or clients and stakeholders.
I heavily relied on these sketches to provide context for the screen design. In addition to designing screen alternatives, they also allowed me to envision the micro-interactions of screen components. Annotating the sketches gave me a way of documenting interactions for the screen that were not drawn. Sketching also provided a map for where required components should be placed and the best path to execution. These sketches were a great way to plan out the gestures and functions users would perform, along with outlining screen variations.
A body-worn camera and companion app that make detailing events quick, simple, and intuitive!
The style guide for this body-worn camera was created with the mindset of applying the styles, components, and artifacts to a product suite. The wearable BWC is compatible with a companion app.
To start off the wireframe design, the first screens focused on the key interaction touchpoints needed. Each action the user will take is reflected on the screen. The emergency device shutdown and the officer log-in process were the first two screens created. Keeping in mind the device's small screen, it was key that any inadvertent actuations were minimized. With the need to keep the interface simple yet capable of accomplishing a multitude of tasks, bringing in the priorities from the user flows was key.
Early on, I designed a locked feature to aid users. It allows an officer to quickly capture a voice, image, or video recording without logging in. Once logged in, the user can then save the recording as needed. Keeping in mind that a quick reaction time is key in emergency situations, it was imperative that the officer had the ability to complete an action first and worry about saving it later.
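As a rough sketch of how this logged-out capture behavior could be modeled on the Android side, the Kotlin below keeps captures in a pending queue until an officer signs in. All class and function names here are illustrative assumptions, not taken from the actual SafeVision build.

```kotlin
// Illustrative sketch of the locked quick-capture behavior described above.
// Names are hypothetical, not from the SafeVision codebase.

sealed class SessionState {
    object LoggedOut : SessionState()                     // locked mode: capture allowed, saving deferred
    data class LoggedIn(val officerId: String) : SessionState()
}

enum class CaptureType { VOICE, IMAGE, VIDEO }

data class PendingCapture(val type: CaptureType, val filePath: String)

class CaptureController {
    private var state: SessionState = SessionState.LoggedOut
    private val pending = mutableListOf<PendingCapture>()

    // An officer can start a capture at any time, even before logging in.
    fun capture(type: CaptureType, filePath: String) {
        when (val s = state) {
            is SessionState.LoggedOut -> pending += PendingCapture(type, filePath)
            is SessionState.LoggedIn -> save(s.officerId, PendingCapture(type, filePath))
        }
    }

    // Once the officer logs in, anything captured while locked is saved to their record.
    fun logIn(officerId: String) {
        state = SessionState.LoggedIn(officerId)
        pending.forEach { save(officerId, it) }
        pending.clear()
    }

    private fun save(officerId: String, capture: PendingCapture) {
        // Persist the capture against the officer's ID (storage layer omitted).
    }
}
```

In this model, a locked device never blocks capture; it only defers where the file is attributed, which mirrors the save-after-login behavior described above.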
Definition: Wireframes communicate structure, behaviors, and content for individual pages.
The annotated wireframes allowed me to define the user interactions with the BWC form factor and screen. Annotations provided a more technical view of the prototype so developers would be able to implement it. Having the annotations also allowed me to communicate feature implementations that were not possible to prototype. This was another way to highlight future "nice to haves" for the system. I laid out the annotations screen by screen, covering the different iterations of each screen. For example, there are two versions of the Home page detailing a logged-out and a logged-in view. This helped clarify what functionality is available without a user signed in.
These prototypes brought all the previous design artifacts to life. The interactive prototypes combined both the physical controls and the touch screen capabilities. To do this, I split my prototype into two separate versions. The first version was the device screen prototype containing all the interactions with the system. The second was a device button interaction prototype, highlighting the screen response to the physical control touchpoints.
This version showcases a user's entire journey while using the BWC. It includes the optimal path to complete tasks as well as other options the user might take, ultimately simulating a typical flow.
This prototype highlights only the interactions a user will perform when a button is pushed. It does not include the other actions on the screen that would lead the user to further immerse themselves in the system.
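To give a sense of how a physical control could drive the on-screen response shown in this prototype, here is a minimal Android-flavored sketch that handles a hardware key press and updates the UI. The key code (KEYCODE_CAMERA) and the helper functions are assumptions for illustration; the Si500's actual button mapping is defined in its product documentation.

```kotlin
import android.app.Activity
import android.view.KeyEvent

class DeviceScreenActivity : Activity() {

    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
        return when (keyCode) {
            // Assumed mapping: the device's record button surfaces as a standard key event.
            KeyEvent.KEYCODE_CAMERA -> {
                // A physical press starts a recording and updates the touch screen
                // so the on-screen state mirrors the hardware control.
                startRecording()
                showRecordingIndicator()
                true
            }
            else -> super.onKeyDown(keyCode, event)
        }
    }

    private fun startRecording() { /* recording pipeline omitted */ }
    private fun showRecordingIndicator() { /* update the screen's recording state */ }
}
```

The point of the sketch is the redundancy called for in the brief: the same action can be triggered from the touch screen or from a physical button, and either path lands the user in the same on-screen state.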