wk11_Sara & Stella

Halfway

From last week’s feedback, we noticed that the iOS system actually couldn’t read users’ messages. Therefore, we decided to create an app that helps two or more people find a place to meet up at a midpoint between them, based on the information they share on Facebook. It picks up personal preferences from your Facebook account, as well as your locations, to figure out what kind of place you’d want to meet at.
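As a rough illustration of the midpoint idea, the geographic midpoint of two users’ coordinates can be computed on a sphere. This is only a sketch of one possible approach; the function name and the spherical-midpoint formula are our illustrative assumptions, not the app’s actual implementation.

```python
from math import radians, degrees, cos, sin, atan2, sqrt

def midpoint(lat1, lon1, lat2, lon2):
    """Geographic midpoint of two (lat, lon) points, treating the Earth as a sphere."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    bx = cos(lat2) * cos(dlon)
    by = cos(lat2) * sin(dlon)
    lat_m = atan2(sin(lat1) + sin(lat2),
                  sqrt((cos(lat1) + bx) ** 2 + by ** 2))
    lon_m = lon1 + atan2(by, cos(lat1) + bx)
    return degrees(lat_m), degrees(lon_m)
```

The app would then search for venues near this point that match both users’ preferences; a real version might also weight the midpoint by travel time rather than straight-line distance.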

Insights from user test on 04/05

1. Users care about the price of each place, so it is important to show the price range in a conspicuous way: “$”, “$$”, and “$$$” indicate low to high prices.

2. What if users want to revote? Add an “Edit Choices” button after they submit their votes.

3. Users feel confused by the dragging icons (“They look like a burger menu”). Delete the dragging icons and add text instead.

4. How do users know their friends have already made decisions? The app sends a notification when both have decided.

5. How can users check that the chosen place is at a midpoint? Both people’s locations will be shown on the map.

6. Users want to know more details about the places.

Iterated App Map

Iterated Wireframe

 

Week 11 – Franky Wang

App design for smart cars

+++ Group work with Yin Hu +++

Insights from user test on Apr 5th

1 – The button patterns are inconsistent. Currently there are three kinds of buttons: navigation buttons, reaction (functional) buttons, and function buttons. Some share the same shape, while others are differentiated by color. A clearer rule should be designed so the operations are easier to understand.

     

       

2 – The two circles on the index page are hard to operate; they sit so close together that users may mis-tap.

3 – Smaller problems include the hard-to-understand seat controls, too much text on a single button, and confusion between the auto and manual settings. The detailed problems are shown on the corrected wireframe.

Updated app map:

Week 12 Kelsey Yu – Project 3 improved wireframes

++++ This is a group project with Yao Huang++++

Project 3 wireframe, improved

What I learned from paper prototype:

1: Allow users to connect their bank account to the application

2: Remove statistics from the tab bar

3: Make the app more intelligent so it can automatically recommend investment decisions, including the amount to invest and whether to sell or buy

4: Don’t ask users to add a card when they log in.

 

Week 10: Jaeky Cheong, Cici Xiang

Color picker

Project 3: Personalized eyeshadow, blush, and lipstick color suggestions
Functions of the app:
1. Analyze pictures (skin tone, outfit colors)
2. Suggest lipstick, blush, and eyeshadow colors that suit the user’s outfit, skin color, current trends, and the season.
3. Suggest related cosmetic products.
4. Purchase products in the app.
App map:
1. Homepage -search bar/ icon of search by pic/ icon of search by color
2. Picture – take a picture now/ uploading picture
3. Shopping Cart
4. Profile – pictures you posted/ products and colors you saved/ user’s skin tone/ user preference/ address/ name/ headshot
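To illustrate how “analyze picture (skin tone)” might work at its very simplest, the sketch below averages a region of pixels and snaps the result to the nearest named tone. The bucket names, reference colors, and function names are hypothetical placeholders for illustration, not values or code from the actual app.

```python
# Hypothetical tone buckets — illustrative reference colors, not the app's real data.
SKIN_TONE_BUCKETS = {
    "light": (235, 210, 190),
    "medium": (200, 160, 130),
    "deep": (120, 80, 60),
}

def average_color(pixels):
    """Average RGB of a list of (r, g, b) pixels, e.g. sampled from a cheek region."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def tone_bucket(rgb, buckets=SKIN_TONE_BUCKETS):
    """Name of the bucket whose reference color is nearest in RGB space."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(buckets, key=lambda name: sq_dist(rgb, buckets[name]))
```

In a real app the pixels would come from a face-detected region of the uploaded photo, and the matched tone would then drive the lipstick/blush/eyeshadow suggestions.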

Week 10 – Ting and Jason

Smart app for meetings.

We are designing an app to record, analyze, and provide related references for group meetings. 

Inspiration:
Every time during studio critique, guest crits give us so many references and suggestions that it is hard to keep up in such a short time. We usually take notes in a notebook but still miss a lot. We have also all had the experience of a crit mentioning an unfamiliar name or term that needed to be looked up afterward. Recording the whole conversation could be one way to solve this problem, but how can we use machine learning to improve this experience?

Questions:
We followed this guideline from Human-Centered Machine Learning by Josh Lovejoy and Jess Holbrook to start off.

  • Describe the way a theoretical human “expert” might perform the task today.
    Ans:
    1. We would expect the expert to check whether anyone mentioned incorrect information.
    2. To differentiate small talk, jokes, and sarcasm.
    3. To read the tone of the members and tell who agreed or disagreed with a certain idea.
    4. To categorize the transcription by recognizing each speaker’s voice.
    5. To look things up in a dictionary or on Google.
  • If your human expert were to perform this task, how would you respond to them so they improved for the next time? Do this for all four phases of the confusion matrix.
    Ans:
    1.The user will be asked to delete unrelated references on the report tab.
  • If a human were to perform this task, what assumptions would the user want them to make?
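The “four phases of the confusion matrix” mentioned above can be counted for a binary question such as “was this suggested reference actually relevant?”. Below is a minimal sketch under that framing; the function name and the lists of booleans are our illustrative assumptions, not part of the project’s code.

```python
def confusion_counts(predicted, actual):
    """Count the four confusion-matrix cells for a binary relevance classifier.

    `predicted` and `actual` are parallel lists of booleans: did the app flag
    the reference as relevant, and was it actually relevant to the critique.
    """
    cells = {"TP": 0, "FP": 0, "FN": 0, "TN": 0}
    for p, a in zip(predicted, actual):
        if p and a:
            cells["TP"] += 1   # kept a reference the user wanted
        elif p and not a:
            cells["FP"] += 1   # suggested something unrelated (user deletes it)
        elif a:
            cells["FN"] += 1   # missed a reference the user wanted
        else:
            cells["TN"] += 1   # correctly left out an irrelevant one
    return cells
```

Each cell maps to a different design response: false positives motivate the “delete unrelated reference” affordance on the report tab, while false negatives would call for a way to add missed references manually.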