Friday, June 6, 2014

June 5th - CAPSTONE Event

Team Members: All

And that was that! We are officially done!

We presented our project alongside all the others at the Capstone Event on the evening of June 5th. We talked to quite a few people, many of whom were very interested in our solution to a mundane problem. Michael had a really professional, honed pitch; Eric explained many of the technical aspects; Micheal charmed friends and strangers alike into checking out our project; and I filled in the gaps when people seemed out of breath or were otherwise absent. Overall, we made a really good case and walked away with a few impressed visitors!

The judges were much harsher critics; rightly so, but it still felt a bit stifling. One of the major concerns raised was that we should be more entrepreneurial in our endeavors: if we had made this on our own, we could have differentiated ourselves in the market. I understand why it was raised, but I still feel it ignored the context of our project and its original intent, which is that it was made for Ombitron and will continue to be used by them.

At the end of the day, we have an awesome project that will serve as the basis for future products by Ombitron, which is a fantastic portfolio piece and something I think all of us can be proud of.

Our final CAPSTONE Poster
Thanks for keeping up with this blog and this project!

Thursday, May 29, 2014

May 29th - Finalizing Presentation Script/Poster

Team members involved: Jonathan Wai and Micheal Seng

Currently, we are about 80% done with the poster. From the feedback we got last week, a lot of people didn't understand what our project was about. They didn't know what problem we were trying to solve or which industry Ombitron is in. Therefore, we are focusing on cleaning up our content to make sure people understand our project. We also struggled with whether the problem statement for our poster should explain Ombitron as a whole or just the dashboard. After today's meeting, all of us agreed to give readers an understanding of both Ombitron and our dashboard.

Here is a quick peek at our design:


This is the second draft of our poster - we realized this design is much cleaner and more concise than our previous draft. This week we will polish both the content and the layout before we print it out.

May 29th - Present Completed Project to Client for Final Review and Receive Final Feedback

Team members involved: All

Since our capstone project is pretty much finished, we had the honor of going into Ombitron and presenting our product to them. We got to explain our process, design decisions, and how we worked as a team. This part of the milestone was very important for practicing our presentation skills in front of an audience. From this presentation, we now have a clearer idea of how we want to present our project at the capstone event.

The second part of this deliverable had to do with receiving final feedback on our project based on the user testing done by Ombitron, so we could make corrections based on the last bit of user feedback we received. Unfortunately, Paul was very brief with us about giving meaningful feedback. Our team came up with a long list of questions to ask Paul, but his response only answered a few of the questions we had. I can only imagine he already has a lot on his plate, though, being the CEO of the company. Regardless, Paul did let us know that he gave multiple demos of the dashboard to clients. They didn't have anything negative to say about the dashboard, and found it easy to use. They were able to tell what the graphs were and found the data valuable and easy to understand. Paul also agreed with us about changing the colors of one of the icons and gave us more information about how the hardware measures humidity.

This part of the deliverable helped us resolve any worries we had about making last-minute changes. Perhaps our process for obtaining information from Paul was flawed since we contacted him over email; we could have gotten more useful feedback if we had chosen a different way to reach him. Nevertheless, this part of the deliverable helped our team finalize our capstone project and allows us to focus solely on the poster from here on out.


Below is a screenshot of the email we sent Paul and the feedback we received.

Tuesday, May 27, 2014

May 24th (UNLISTED) - Success Definition and Operationalization

A few weeks ago, we were tasked with creating a document on "Success Definition and Operationalization": basically, a list of goals that would define whether or not our project was a success. These are listed below, along with our analysis of our success in each regard: successful goals are shown in green text, and failed/uncertain goals in red.

-----------------------------------------

1.) Users can easily connect and organize devices within the dashboard
    • Automated connectivity removes mundane tasks from user workload
    • Device Groups in “folder system” allows easy management of devices
    • Admin users can manage/group users to have specific permissions
This success definition primarily focused on the Dashboard's functionality and whether users responded well to those specific features. This was indeed the case: all of Ombitron's testers reportedly had no problem understanding these capabilities and completing the tasks given to them.


-----------------------------------------

2.) Users of all skill levels can monitor the status of all connected devices quickly and accurately

    • Users will need only basic training to be acquainted with the system
    • Users can scan the Detail View for metrics information and graph performance overviews
    • Minimal effort to discover errors, measured by TLX Task Load for 50 connected devices
    • System outperforms competition with direct visualization of device metrics vs. sole list view
Unfortunately, we were unable to obtain any further testing data or metrics from Ombitron aside from a few quotes, making it impossible to critically analyze the results of Ombitron's formal testing. We could do some testing of our own to make up for this, though our data would not be significantly useful; the target audience of company workers and businessmen will not be the ones we test with. That lack of context will hamper our efforts, as students and friends don't have the same mindset, nor the same interest, for such a product. On the other hand, if they can use it well, that speaks to the effectiveness and simplicity of the Dashboard for non-specialists, which might be a good point to make.
That... might not be a bad idea. We may or may not conduct a few more user testing sessions with our Dashboard. If we do, expect an extra blog post about that.

We do know that the visualization of the Folder View provided some users great clarity in testing, so that one goal was a success. Those who preferred the List View used that, but more casual users found the Folders more appealing and clear. Having both options allows the Dashboard to be more adaptable depending on the preference of the user.

In the meantime, we have requested any of the following information from our client:
  • What confidentiality agreements are in place that prevent us from gaining any of the information requested in these other questions?
  • Who was tested: businessmen, stakeholders, random employees, target users, other? Not looking for names, just benign tester information. How many, and for how long?
  • What were their tasks? How many tasks, and how complex?
  • Which features would be used in these tests?
  • How many testers passed all tasks? Were there any difficulties in any of them? Which task took the longest? The shortest?
  • What questions were asked by testers?
  • Did the facilitator provide any assistance during testing? If so, what?
  • Were there any specific testing metrics used? SUS, TLX Task Load, custom forms, or other?
  • Are there feedback sheets or transcripts we can get a look at?
  • Which companies "demoed" the site? If that information is unavailable, at least what was their size (startup, corporation, multinational) or industry (tech, industrial, venue, etc)?

-----------------------------------------

3.) The results are used by Ombitron for current or future projects
    • Web 10.0’s dashboard as the baseline for Ombitron’s other products and services
    • High SUS reported when used by client companies (CenturyLink Arena, others)
The second bullet of this Success Definition suffers from the same problem as the one above.

The first, however, is very, very much a success. Ombitron is using our design moving forward, and the demoed design has been ours from the beginning. We expect continuing development based on our ideas for this product and ones like it in the future!

-----------------------------------------

So hopefully that covers it. The lack of testing data is a setback, but hopefully our own testing sessions will give us the stats we need to provide context as to whether our goals have been achieved or not. 

Tuesday, May 20, 2014

May 20th - Post-CAPSTONE Submissions Update

Team Members Involved - ALL

So Capstone deadlines came and went! Well, with the exception of the student survey fiasco. But luckily we were prepared, so Web 10.0 is nearly ready to deploy for the event on June 5th!

But we never got a chance to share our final work. It's with great honor that I formally introduce our project!


"Is Your Refrigerator Running?" Device Monitoring System Dashboard

Abstract
CenturyLink Arena is responsible for maintaining over a hundred refrigerators to keep products fresh for their customers. Every broken fridge costs CenturyLink more than $1000 in replacement parts and product. Existing Device Monitoring Systems (DMSs) can be used to track fridge failures, but are difficult to navigate and require data analysis training to use. 

Web 10.0 has teamed with Seattle startup Ombitron Inc. to create a new platform-as-a-service DMS to solve these issues. Ombitron hardware detects a variety of critical metrics about a device, then Web 10.0’s Dashboard interface delivers this information to users in a streamlined and visually appealing style for at-a-glance notification. Detailed metrics and time graphs are also available for each device to track history and changes in performance. This Dashboard will be adapted to suit a wide range of devices, and is currently being considered for use by multiple companies.
___________________________________________________

See our product in action...


Or visit us live and in-person at the iSchool Capstone event!
iSchool Capstone
June 5th, 6-9pm
HUB Ballroom, University of Washington
___________________________________________________

Thanks for following us this far! Here's a sneak peek at our poster design!

Thursday, May 15, 2014

May 8th - User Testing of Database and Advanced Features

Team Members Involved - Eric Oltean, Micheal Seng, Jonathan Wai

Unfortunately for us, Paul was out of the office for the last two weeks! However, Eric linked up with him this Friday and got the client feedback from Ombitron's testing sessions.

At least two companies tried out the prototype for general user testing and potential use. For security, we haven't been given the company names or their representatives. Here is a link to the demo they had access to: http://temphum.ombitron.net/

Pros: Most everyone liked the Dashboard! There was a lot of positive feedback, though none of it was particularly specific. All clients were able to use the Dashboard effectively and complete the testing tasks. Some of their comments:

  • Interface was very "Usable," "visual" 
  • Easy to navigate; no problems switching between tabs, views, or back out
  • Clients liked the "two systems for viewing": some preferred the List View and found no use for the Folder View, while others loved the Folder View and its simplification of the List.
  • "Simple overall," with no major problems of user experience

Cons: A few nitpicks here and there, but nothing major from a design standpoint. Instead, most were slight technical adjustments, a few content adjustments, and perhaps even some per-client customization based on their individual needs.

  • Temperature graphs were noted to be in Celsius. Since the product will primarily be launched in the US, all clients wanted this information in Fahrenheit.
  • The refresh rate of information (the rate of capture from device to dashboard) didn't need to be as frequent as the demo's setting of updating every 9 seconds. Clients suggested refresh rates anywhere between 1- and 5-minute intervals, perhaps even up to 20-minute intervals.
  • Client-specific request: On the Detail View, "Status and Location" is the first thing seen on the page, in the upper left. However, to this particular client, this information is not as important, since the user should already know which fridge they're looking at. The client requested that graphs or specific metrics be displayed first, then status and location if necessary (lower in the hierarchy).
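Those first two tweaks are simple to implement. Here's a minimal sketch of how they might look; the names (`celsius_to_fahrenheit`, `POLL_INTERVAL_SECONDS`) are illustrative, not taken from the actual Ombitron codebase:

```python
# Illustrative sketch of two client-requested tweaks, not actual Ombitron code.

# Clients suggested polling every 1-5 minutes instead of every 9 seconds;
# this picks the middle of that range as a default.
POLL_INTERVAL_SECONDS = 3 * 60

def celsius_to_fahrenheit(celsius):
    """Convert a Celsius reading to Fahrenheit for US-facing graphs."""
    return celsius * 9.0 / 5.0 + 32.0

# Sanity checks: freezing and boiling points of water.
print(celsius_to_fahrenheit(0))    # 32.0
print(celsius_to_fahrenheit(100))  # 212.0
```

The conversion would happen at display time, so the hardware and database can keep reporting Celsius and each client's view can be configured independently.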
So overall, things are looking very good! This weekend we'll hash out the Promotional Video and Abstract, along with these final tweaks, and then the poster. Getting close to the end!

Thursday, May 8, 2014

May 8th - (UNLISTED) Ideation and CAPSTONE Progress

Team Members Involved - All

We spent the majority of our meeting today on the remaining Capstone requirements, namely the Abstract, Poster, and Video submissions.

In our discussions on how to proceed with those steps, we discovered a few points of information that were previously missing. We've added them below for reference.

  • Get a list of other DMS-able devices (to show our Dashboard as a proof-of-concept)
  • Highlight the fact that the Dashboard can be connected to devices via Ethernet connections and mobile internet
  • Focus on the strengths of our dashboard; simplified measurements display, fast information gathering

Abstract
We're still working on it! It's difficult to come up with the right words and concise approach to introduce our project without putting too much focus on background information, while avoiding over-simplification. Michael and I will be working on getting the abstract concise and up to par with the rest of our project.

Poster
We've been working on the poster design, trying to take layout cues from our Dashboard. Unfortunately, there isn't enough room in this particular design for images, so the idea will have to be modified to fit the poster format. We'll also take cues from last year's Capstone posters to see which layouts handled images and text in equal but minimal measure.

Video
A lot of groups are opting out of this assignment, so we're opting in! One of the major choices we must make for this video is whether to entice or inform. Which direction fits the video's purpose and content best? Every team member had a different idea, so we're each writing our own scripts and figuring out which direction works best. Scripts will be due by this Saturday, and filming will occur the next weekend.

Personally, I'm hoping we can put "Is your refrigerator running?" on the Poster at least, and perhaps in the video as well. It's such a great tagline!