WinterWinds
Role: UI/UX Designer
Projects I Worked On: Industrial Condition Monitoring Device, Wildfire Resource Management System, Company Marketing & Content Management
Programs I Often Used: Figma, Adobe Illustrator, Adobe Photoshop, Miro, Jira, Notion
At WinterWinds Robotics I performed a wide variety of duties, all in service of ensuring our work met user needs. It was my job to advocate for users during every stage of product development, from initial ideation to final release.
WinterWinds operates within many industries, but its main products focus on two core areas: public safety and energy, both industries that severely lack thoughtful, considerate user experiences. Many products in these industries give users cluttered, unintuitive, and sometimes just plain ugly interfaces. I found the work uniquely challenging because the users I designed for were not typical consumers; they had niche abilities and somewhat strained relationships with new technologies. I always had to balance creating designs that were practical and familiar to users against creating designs that were innovative, offering features that may not have been familiar to users but greatly improved on the flaws of previous products.
PUBLIC SAFETY
One of WinterWinds’ main products gives first responders a much improved way of tracking various kinds of data during emergency situations. I had two main roles in this project: first, to design the interface and system through which first responders displayed and accessed the gathered data, and second, to ensure that the experience of building, installing, and repairing the physical product was as smooth and painless as possible.
Note: The following design artifacts are not the actual ones created while I was at WinterWinds and were created with notional data in order to protect proprietary information.
DISCOVERY PHASE
When I started working on this project, I had next to no knowledge of how data was tracked during emergency response, so my very first goal was to research the topic as thoroughly as I could. Since my boss was a former first responder, I was able to contact and interview many subject matter experts in emergency response. In these interviews, I focused on gathering information about how the current system of tracking data worked, the general organizational structure of responders during emergencies, and first responders’ relationships with technology.
Talking to first responders was incredibly insightful and confirmed what I had suspected about their feelings toward newer technologies. The majority of potential users were hesitant to try new things and quick to give up on something if frustrated. Based on these findings, I outlined some essential requirements.
The dashboard or app needed to be simple and easy to navigate. To quote one interviewee, “…something a third grader can figure out.”
High level emergency and safety details needed to be readily available on the home screen.
Any features considered ‘complex’ needed to serve non-essential functions.
All interactions needed to be doable with a sweaty or gloved hand, given the different emergency conditions users might be in.
DEFINE PHASE
I began the prototyping process by creating a few sketches that fleshed out some initial high level ideas, and I put together a few user journeys to map out the flow of the dashboard. I wanted to limit the number of screens users would need to interact with to fewer than four. Additionally, I limited all essential interactions to swipe and tap gestures, and any interaction requiring typing was kept away from active emergency functionality.
Since most of our users would be using this technology to view their location relative to the main area of an emergency and to other resources, I decided that the first screen they see should be a map view. Right away, they can see where the active emergency is based on how areas of the map are colored, and they can see the locations of other resources based on various icons. Tapping any of those icons brings up a table that clearly indicates status and provides exact location data.
Users such as active responders in the field, who only care about high level details pertaining to themselves, can quickly scan information, while users stationed at incident command, who have more time and less urgency, can drill down into more specific details.
DESIGN PHASE
The design below reflects the need for a simple, easy to use app that even an impatient first responder can handle. Every interaction requires only a simple tap or swipe. Users can quickly check on resources through the map screen and an easy to scan table view. They can access detailed screens dedicated to every deployed asset, view their most important vitals, and see what they are seeing via camera stream. Additionally, users can update their own profiles and review their location history if needed. Most importantly, users have the option to send warning alerts to other responders who may be in danger, and they can always report if they are experiencing an escalating crisis themselves.
In addition to designing and iterating on the application, I also helped make sure that the entire process of creating and delivering our physical product was as seamless and user friendly as possible. To truly understand our product, I learned about wiring and electrical assembly, vehicle installation, federal product requirements, and more. With this new knowledge, I helped our engineers pick an electronic component layout that would be easy to access and repair in the future, assisted in choosing the product’s visual look and color scheme, and helped determine the best way for the product to be installed. Additionally, it was my responsibility to decide how we would teach users to interact with our physical product and its accompanying application interface.