SAMMY: Smarter Room Controls for More Productive Workers

 
SAMMY was an AI-driven, integrated lighting, HVAC, people, and asset management system.

Simply Complex

The SAMMY system comprised several parts: edge sensors, web services, an AI backend, a smartphone app, and a desktop app. These parts were controlled and interacted with via a collection of novel indoor navigation and room control technologies. After discussing the proposed system with all of the stakeholders, I devised this list of requirements for the mobile application:

  1. Reduce the amount of energy used in a building

  2. Reduce the complexity of scheduling and attending meetings

  3. Reduce the need for onsite help with HVAC settings
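To make the integration concrete, here is a minimal, hypothetical sketch of how a single edge-sensor reading might drive a lighting and HVAC decision; every name, type, and threshold below is invented for illustration and is not SAMMY's actual implementation.

```typescript
// Hypothetical model of one edge-sensor reading flowing through the system.
interface SensorReading {
  roomId: string;
  temperatureC: number;
  occupied: boolean;
}

// A command the backend might push to a room's lighting and HVAC.
interface RoomCommand {
  lights: "on" | "off";
  hvacSetpointC: number;
}

// Naive rule: unoccupied rooms get lights off and a relaxed setpoint,
// which is the spirit of requirement 1 (reduce energy use).
function recommend(reading: SensorReading): RoomCommand {
  if (!reading.occupied) {
    return { lights: "off", hvacSetpointC: 18 };
  }
  return { lights: "on", hvacSetpointC: 21 };
}

const cmd = recommend({ roomId: "A-101", temperatureC: 24, occupied: false });
console.log(cmd.lights); // "off"
```

A rule this naive would never ship, but it shows the shape of the loop: sensor reading in, room command out, with energy savings as the default behavior.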

To understand how we might satisfy these requirements, I conducted informal interviews and observation sessions. My findings are summarized as follows:

  1. Users do not use online scheduling tools

    • Users call, visit, or email the president’s executive assistant and request a room reservation (research staff are high value and always accommodated)

  2. Lighting is binary (on or off); all other lighting features are ignored

    • The mapping between the light controls and the actual lights isn’t understood

  3. All visitors are escorted (independent of IP risks)

  4. New employees are shown the space and then left unescorted

  5. HVAC system is not trusted

  6. HVAC system is overcorrected

From these observations and more meetings with stakeholders, these basic personas were identified:

  1. Professional employees (professionals) engaged in core business

    • Gender unimportant

    • Age unimportant

  2. Support staff (support)

    • Gender unimportant

    • Age unimportant

  3. Visiting professionals (who are not an IP risk)

    • Gender unimportant

    • Age unimportant

    • May not speak English

For these personas, the mobile UI must:

  1. Integrate lighting, HVAC, scheduling and indoor navigation into a single app

    • Reduce navigation, scheduling and energy use errors

    • Reduce IT support calls (frequency & duration)

Once the basic scope of the project was agreed upon, a set of acceptance requirements was developed. As with all the items listed above, these requirements were subject to change. Product development in a research setting is much more iterative than in a more traditional enterprise. The requirements were as follows:

  1. With minimal instruction, all professionals can:

    • Schedule and navigate to a meeting

    • Adjust the climate and lighting to suit their needs

    • Accurately and concisely communicate concerns about the system to the support staff

  2. Leveraging their domain knowledge, support can:

    • Review energy usage

    • Preemptively repair or replace equipment

    • Establish guidelines based on accurate data

    • Quickly assist users based on minimal interactions 

  3. The underlying sensor mesh network and AI perform adequately to provide the data needed

  4. System must integrate with existing systems (negotiable)

  5. Comfort (qualitative), productivity (quantitative) and energy efficiency (quantitative) are optimized

There is a lot of information contained in these outlines. Not every member of the team needs or wants to review all of it. As the Creative Director, it’s my job to make sure that a solid design foundation is laid. Whether or not anyone looks at the foundation is immaterial. They will certainly look at the products that we build upon it.

A sample UX flow

Over the course of several days, many user experience flows were produced and iterated upon using a combination of OmniGraffle and PowerPoint. Some illustrated the entire system; others (like the one pictured) illustrated a particular scenario. In every case, these were made in consultation with the development team. Designing things that can’t be built is pointless. However, limiting designs to things that are easy to build can be self-defeating. The trick is striking just the right balance. Pick your battles, as they say. Please note: some elements in this UI flow have been altered or removed to protect IP.

As with the outlines above, most of my colleagues didn’t need or want to review these UX flows. This is a quirk of the “collection of experts” corporate research setting in which I developed this particular UI. In a more traditional enterprise, each of these steps is formally accepted and approved. Whatever the case, the work must be done. This principled approach allows me to produce UI designs that avoid obvious logical errors and user pain points. Even within this informal setting, the work was indirectly validated during deployment tests.

Videos were produced to better visualize and explain SAMMY. I used Adobe Premiere, Adobe After Effects, Adobe Media Encoder, and Audacity to produce them. The following video features the project manager and several members of the development team.

In the video, you may have noticed the edge sensors deployed as part of SAMMY’s mesh network. To keep the project on track, I used TurboCAD Pro to design a simple housing that was robust enough to deploy while meeting all of the necessary technical requirements. The housings were 3D printed on a MakerBot Replicator+ using MakerBot Print software.

Edge sensor housing

As the project matured, several unique UI elements were designed. As usual, I began this process with hand-drawn sketches, often produced during meetings. This approach significantly shortened the development time. Below is an example of one of these sketches; it explores HVAC and lighting control options, as well as a method for depicting floor changes during indoor navigation.

Sketches of UI elements

After several rounds of pencil sketches, low-fidelity black-and-white mock-ups were produced. These led to medium-fidelity greyscale mock-ups. Both iterations were used to validate the basic design and to perform informal “guerrilla” style UX tests with colleagues (the old-fashioned paper mock-up kind!). I used Adobe Illustrator throughout the design process, exporting PNG (occasionally JPEG) and SVG files for production as needed. Adobe XD was also used to produce simple interactive prototypes. Removing as much decoration as possible often reveals interesting quirks about a proposed UI and, helpfully, areas where additional design work is needed. In the end, a UI must communicate clearly. This step is where basic interactions are developed. Below are screens from a set of medium-fidelity mock-ups. The charts and maps were provided by our development team and were for placement only.

Medium fidelity mock-ups

All of this work led to a functional and robust proof-of-concept application. The application was produced for FXPAL (a Fuji Xerox research lab). Unfortunately, FXPAL has closed; it’s my understanding that Fujifilm has continued to develop these ideas.

TL;DR

I worked with a diverse group of professionals to design and build a robust room control prototype.