
Our Library


An Introduction to Effective Business Research

Research has a fairly straightforward definition: the systematic study of a subject. Business research can mean several things: research on, for, or within a business. For the purpose of this essay, we want to focus on the latter two cases. In particular, we want to ask: how can decision-makers within a company identify and conduct good research; and how do we find answers in a world that is constantly changing?

Research conducted for and within a business has several defining characteristics. First, we do not conduct research for research’s sake. We need research to deliver results. Second, businesses have limited resources. We need to conduct research efficiently. Third, an overseer outside of or within an organization needs to approve the project and acknowledge the validity of the recommendations.

We also need to recognize the reality of conducting research in business environments. Unless your role is in R&D, a three-month research project would already be considered very generous. An agile sprint only lasts two weeks. In the face of a crisis where information needs to flow to executives in real time, a business decision needs to be made in one to three days.

Business research textbooks advocate for the highest standards of research, which is admirable, but is often not effective or even possible for executives. We do not have the resources, expertise, or need to boil the ocean and launch a multiyear study for every business decision - this would be ineffective. We cannot use conclusive data gathered and analyzed one or two years after the event to make business decisions either - this would be impossible.

In order for us to conduct research for and within a business, we need to be pragmatic, and instead answer questions to the best of our knowledge within resource constraints. Paradoxically, effective business research requires us to be comfortable with not knowing. This should not be a foreign concept. In the previous Coursebook, Managing Complexity, we discussed the idea of complexity: there are subjects that lie beyond the boundaries of human knowledge and confound even the most sophisticated minds and technologies.

In the essay “Innovating systematically in complex conditions through guided trial and error”, we looked at how we can think about innovating, and the process can apply to research as well: faced with limited time and resources, how do you derive new and actionable insights? In the essay “Basic Research Standards for Evidence-Based Decision-Making in Business Environments”, we established the minimum research standards executives should adhere to.

We always recommend following the basic research standards we have established. In this Coursebook, we will explore in further detail where these best practices come from, give you an idea of how a theory is legitimized, and help you anticipate challenges when you put forward your research. We want you to be able to design a research project well and secure the necessary sign-offs without delay.

In the process of addressing how to conduct effective business research, we will also discuss questions such as: Given strict standards and resource constraints, what do we have control over? Ultimately, we will lay out pragmatism, the research philosophy, as a foundation for multidisciplinary and mixed-method research. Executives today deal with a vast array of sources and topics in order to make decisions - by offering a thorough treatment of the topic, we will begin to open avenues to connect business decision-making with rigorous academic research.


Digital Mockups: Design Patterns, Accessibility, Responsive Design, Visual Design, UI

by Marvin Cheung, Head of Research and Strategy

If you recall Brad Frost’s Atomic Design Methodology, this is when all the pieces come together. Each industry has its own set of UX conventions. These are tried, tested, and effective methods to get things done. Make sure to research them thoroughly before creating a design system of your own.

There are two sets of Design Patterns you should be familiar with: Google’s Material Design guidelines and Apple’s Human Interface Guidelines. The simple thing to do would be to familiarize yourself with the graphics, starting with Google’s Material Design guidelines. IBM’s Carbon Design System is also worth mentioning. There is no need to memorize every detail and number, but you might be asked to double-check the final interface and identify any anomalies even if you are not a UX Designer. Pay special attention to the accessibility sections of the two guides. Especially if you are designing for the Web, make sure to have a basic understanding of responsive design principles.
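If responsive design is new to you, the core mechanism is simply reacting to viewport size. Here is a minimal browser-side sketch in TypeScript; the breakpoint value and class names are arbitrary examples of ours, not a standard:

    // Watch a breakpoint and swap layout classes as the viewport changes.
    const mobileQuery = window.matchMedia("(max-width: 600px)"); // arbitrary breakpoint

    function applyLayout(isMobile: boolean): void {
      document.body.classList.toggle("layout-mobile", isMobile);
      document.body.classList.toggle("layout-desktop", !isMobile);
    }

    applyLayout(mobileQuery.matches); // set the initial layout
    mobileQuery.addEventListener("change", (e) => applyLayout(e.matches)); // react to changes

In real projects, most of this is expressed in CSS media queries; the script-based version above is just the idea made explicit.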

It is entirely possible to build a branded design system by changing the fonts, colors, corners, and drop shadows of the Design Patterns. As far as tools are concerned, we generally recommend using Figma because it allows for online collaboration and is free for a limited number of users. However, Sketch, Adobe XD, Illustrator, Photoshop, and InVision are also common tools that UX Designers use to put mockups together. To animate interfaces, designers typically use Adobe After Effects and Facebook’s Origami Studio. Stay tuned for Branding and Identity: Design 101.

Happy creating!

Recommended readings: 


Zero-based Thinking: User Stories, Storyboarding, Wireframing, Paper Prototyping

by Marvin Cheung, Head of Research and Strategy

Zero-based thinking is when you start fresh: if you could start over, what would you do instead? Minuscule improvements upon competitors’ products will only get you so far. Maybe there is a radically different approach towards solving the problem.

This is when user stories, storyboarding, and paper prototyping can help. 

  1. Interview and talk to your users. Get to know as much as you can about their workflow. Are different Jobs to be Done emerging? What are your users' pain points? Post-it notes or a Miro Board can help.

  2. Storyboard out the different contexts when the product will be used - stick figures work too!

  3. Brainstorm solutions. One of the questions that always gets the conversation moving is “What is the absolute worst way to solve the problem?”

  4. With reference to the research you have done, use a thick marker to begin drawing up rough, possible solutions on paper. Go wild!

  5. Iterate. Test different solutions with users, refine the product, and see the results.

  6. Finally, once the dust settles, you want to arrive at a set of screens that flow well from one to another. This is when we can blu-tack it to a board neatly and use it as a basis for a digital mockup. 

Recommended readings:


Know the field: Competitive Analysis, Mobile-first Approach, Information Architecture, and Jobs to be Done

by Marvin Cheung, Head of Research and Strategy

Before putting pen to paper, you want to conduct a little more research. At the heart of the competitive analysis is the question “What is everybody doing?”:

  • What are the solutions being offered?

  • How are people solving the problem currently?

  • Are people happy with the current solution?

While compiling a feature list of competitors’ products can be simple, successfully reverse engineering the conditions from which design decisions were made requires experience. Especially if you are developing a new product with no quantitative data available, being able to guess why certain design decisions were made can be incredibly helpful.

Here, much like with Art or Literature, you would assume that the creators of the product have made decisions deliberately and based on quality data. We recommend taking a mobile-first approach towards designing: to design for the smallest screen size first. Some of the questions you want to ask:

  • Why did they opt for a particular wording?

  • Why is the Call to Action (CTA) where it is?

  • Which group of users are they prioritizing?

  • What are the Jobs to be Done?

An information audit and an accompanying user flow will be helpful here:

  • Can you account for all of the content on the page?

  • What steps are being taken to arrive at the solution?

The readings below give a good overview of the tools available to complete this task. Google’s “Basics of UX” in particular will help make sense of this step. 

Recommended readings:


Setting a UX Design brief

by Marvin Cheung, Head of Research and Strategy

Design is an iterative, hypothesis-driven process. Through preliminary research, you want to narrow down the set of hypotheses you want to test. The first step is to understand the problem space. There are several things you can do, for example:

  • Read through articles from online sources, e.g. Google Search, Quora, and other discussion forums. What questions are people asking?

  • Read through articles from academic sources, e.g. on JSTOR. Has the problem been well researched?

  • Go through social media, e.g. meme pages. What is the worst part of people’s experience of a particular problem?

  • Talk to friends in different industries. Might the solution be applicable to an unexpected target audience? What are some of the early thoughts?

By assigning different weights to different sources, you should arrive at a set of problem statements. Geunbae Lee’s article “Designer’s indispensable skill: the ability to write and present a solid problem statement” elaborates on what a problem statement should look like. Amy Ko’s article “How to understand problems” discusses some of the more technical qualitative user research methods, including user interviews, that can help you set a UX Design brief. People working with existing products will most likely have to work with quantitative user research as well. We will elaborate on user research methodologies in UX Research 101. 

Nevertheless, there are two guiding questions behind all UX briefs:

  1. What do we know to be true? 

  2. What hypothesis do we want to test? More specifically, what is the most critical hypothesis we need to test?

There are generally two entry points to a UX Design project. They have corresponding hypotheses:

  1. There is a new product, and you have to test the product vision, i.e. “Do people want the solution we are offering?”

  2. There is an existing product, and you want to figure out “How can we serve User Group X better?”

Practically speaking, a UX brief should have these components (a minimal typed sketch follows this list):

  1. Problem statement: What is the problem you are trying to solve?

  2. Goal: What are you optimizing for? E.g. engagement, transactions

  3. Users: Who will use the product? (We will elaborate on Jobs to be Done in the next section)

  4. Stakeholders: Who needs to sign off on the project? Does any regulatory body need to be involved? 

  5. Constraints: How might we anticipate and navigate around different limitations?
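To make these components concrete, here is one way to capture a brief as a typed record in TypeScript. This is a sketch only; the field names and example values are our own, not a standard:

    // A UX brief captured as a typed record, one field per component above.
    interface UXBrief {
      problemStatement: string; // what are we trying to solve?
      goal: string;             // the metric we are optimizing for
      users: string[];          // who will use the product
      stakeholders: string[];   // who needs to sign off
      constraints: string[];    // limitations to navigate around
    }

    const brief: UXBrief = {
      problemStatement: "New users abandon onboarding before the first key action.",
      goal: "Increase onboarding completion rate",
      users: ["first-time consumer users"],
      stakeholders: ["product owner", "legal"],
      constraints: ["two-week sprint", "no new backend work"],
    };

Writing the brief down as data also makes it easy to version and revisit as assumptions change.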

Throughout the project, you will find yourself refining and iterating on the brief, perhaps even scrapping it altogether and starting a new brief based on the information you discover. That is entirely okay! The goal here is to do as little work as possible to test as many different assumptions as you can. Make sure to keep a log of everything you learn along the way, and all the assumptions you did not manage to test. This list will be invaluable as you continue to develop the product.

As an aside, the main criticism of many spec (speculative) projects in design portfolios is that they do not do enough to reflect the real-world challenges of creating a product. Here we want to acknowledge some of the questions that need to be asked beyond “Do people want our product?” but fall outside the scope of this Coursebook. Innovation by Design: MVP 101 and Product Management 101 will be released soon.

  1. Will anybody be willing to pay for what we are offering? 

  2. Will they pay enough to maintain the quality of the new offering?

  3. Will the new product or feature fit into the overall business?

  4. Will it cannibalize the sales of an existing offering?

  5. Will it encourage nefarious activities in any way?

  6. Will it be compatible with the company mission? 

  7. Will it require expertise that will be very difficult to hire for?

  8. Will the additional work be fulfilling to the team in the long term?

Recommended readings:


The Atomic Design Methodology: UX Design in an ideal world

by Marvin Cheung, Head of Research and Strategy

Before diving into the details of UX Design and UX Research, we want to have a high-level understanding of how the disparate components come together to make up an interface. Brad Frost’s Atomic Design Methodology provides a fantastic overview.

According to the framework, you can break a product down into pages, templates, organisms, molecules, and atoms. An atom can be a text field, a button, an image, a logo etc. After designing the atoms, you can put them together into molecules. For example, a search bar consisting of a text field, a search icon, and the search button can be considered one molecule. Multiple molecules form an organism, like a navigation bar, and multiple organisms form a page template. Fill in all of the placeholders and you have a page! 
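As a rough, ideal-world illustration of that hierarchy, here is a minimal TypeScript sketch in which each level is a plain function returning an HTML string. The function names and markup are our own, not part of Frost’s methodology:

    // Atoms: the smallest building blocks of the interface.
    const textField = (placeholder: string): string =>
      `<input type="text" placeholder="${placeholder}" />`;
    const icon = (name: string): string => `<span class="icon icon-${name}"></span>`;
    const button = (label: string): string => `<button>${label}</button>`;

    // Molecule: a search bar composed of three atoms.
    const searchBar = (): string =>
      `<div class="search-bar">${textField("Search")}${icon("search")}${button("Go")}</div>`;

    // Organism: a navigation bar composed of molecules and atoms.
    const navBar = (logoUrl: string): string =>
      `<nav><img src="${logoUrl}" alt="logo" />${searchBar()}</nav>`;

    // Template: a page skeleton with placeholders; fill them in and you have a page.
    const pageTemplate = (nav: string, main: string): string =>
      `<html><body>${nav}<main>${main}</main></body></html>`;

    console.log(pageTemplate(navBar("/logo.svg"), "<h1>Hello!</h1>"));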

Does it sound too good to be true? Because it is. While it is not a bad framework by any means - it is especially useful for UX audits or case interviews - the actual UX process when you work in-house will be a lot messier. There are many reasons why, to name a few:

  1. You may change the functionality of the product at any point in the design process, especially if you are in the process of building an MVP.

  2. You may decide to change your tech stack and the new platforms you use may have different design limitations.

  3. You may run out of time and need to short circuit the process.

  4. You may and should prioritize certain components of your product over others. There is no reason to design the full interface before user testing. 

However, the Atomic Design Methodology is still a framework we recommend, because it helps you think about interfaces in a modular way. Just remember:

  1. If you follow the Atomic Design Methodology, you will be creating a design system. Most organizations do not have the budget, time, or expertise to put one together, nor do they need to. You can opt for an open-source UI library instead.

  2. There is no reason to follow a framework to a tee. Do what works for your team!

Recommended reading: Frost, Brad. “Atomic Design Methodology”. AtomicDesign.BradFrost.com, 2016. https://atomicdesign.bradfrost.com/chapter-2/ 


“What are the UX benchmarks for organizations of different maturity?”

by Marvin Cheung, Head of Research and Strategy

While it takes a long time for big corporations to incorporate UX fully into their decision-making processes, new ventures will have a much easier time finding Product-Market Fit if they incorporate UX best practices from Day 1.

Realistically, even strong UX Designers at a startup will be limited by time, budget, and resources. We have several benchmarks that take resource constraints into account to help you determine whether your venture is ahead of or behind the curve. Bear in mind that being ahead, i.e. confirming something before you have enough data to support a decision, can cause problems down the road as well.

  • Pre-MVP: Once you have a vision, you should have a few brand assets, such as a pixel-perfect logo and a name. Logos and names are the most difficult to change, so you want to try to get these right. You can also make some basic branding decisions, such as whether you want to go with a light or dark theme, what colors to use (ideally two to three), and what fonts to use. Generally, we recommend picking a header font from Google Fonts, since it has the most third-party support, and a body font from the list of Web Safe Fonts for page speed, legibility, and general ease of use (a starter token sketch follows this list).

  • MVP: When you are making quick iterations, there is nothing wrong with using User Interface (UI) packs and templates. You can also use no-code solutions, e.g. Wix or Squarespace, to test landing pages and ideas. We recommend wireframing and rapid prototyping to test out as many ideas as you can before committing anything to code. The platform you use will pose certain design constraints, so there are times when you will have to be creative with how you translate your mockup into the final interface. You should also begin to think about quantitative user research at this point.

  • Pre-Scale Up: Before you make a big marketing push, you want to clean up your design. Some startups, often ones led by entrepreneurs without a design background, will opt for a rebrand at this stage as users in the early majority category tend to have certain product and design expectations. By cleaning up design quirks and optimizing the user flow, you will have a more efficient marketing funnel as well. This translates to cost savings.

  • Scale Up and beyond: You should have a coherent design system at this stage with reusable components. Especially for consumer products with a wide user base, small design changes can have a large impact on revenue. Scale Ups and beyond tend to focus on interaction design and microinteractions. Some organizations go further and employ a corporate nudging team that uses behavioural sciences to promote certain behaviours, though the ethics of this practice are contested.
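Following up on the Pre-MVP point above, basic branding decisions can be collected in one place from Day 1. A minimal sketch in TypeScript; the specific theme, colors, and fonts below are hypothetical placeholders of ours, not recommendations:

    // A starter set of brand tokens: a theme, two to three colors, a display
    // font from Google Fonts, and a web-safe body font.
    interface BrandTokens {
      theme: "light" | "dark";
      colors: { primary: string; accent: string; background: string };
      fonts: { header: string; body: string };
    }

    const tokens: BrandTokens = {
      theme: "light",
      colors: { primary: "#1a73e8", accent: "#f9ab00", background: "#ffffff" },
      fonts: {
        header: "'Montserrat', sans-serif", // hypothetical Google Fonts pick
        body: "Georgia, serif",             // web-safe body stack
      },
    };

    // One place to change the brand; every component reads from here.
    console.log(`h1 { font-family: ${tokens.fonts.header}; color: ${tokens.colors.primary}; }`);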

Recommended readings:


“What is the absolute minimum UX standard I should adhere to?”

by Marvin Cheung, Head of Research and Strategy

Whether you are at a startup or a Fortune 500, you can find yourself needing to work within tight resource constraints. This is when we receive questions like “what is the absolute minimum UX standard I should adhere to?” There is no one-size-fits-all answer, but these are the questions you should be asking:

  1. Does the design comply with all relevant regulations? 

  2. What immediate impact might the new release have if things go wrong? Is the interface usable? Might customers misunderstand the interface in any way? Might customers pay for the wrong product? 

  3. What long-term impact might the new release have? What are the customer support issues it might create? Might it damage your organization’s reputation? Are you creating long term product problems down the road?

  4. What is the absolute worst case scenario? How might you limit your risks?

Recommended reading:


“How much UX Design do I really need?”

by Marvin Cheung, Head of Research and Strategy

This is one of the questions we receive the most. UX Design practices will vary between geographies, organizations, and even within organizations. There are several contextual questions you should ask when you are determining your UX strategy or when you are trying to place UX in the overall value chain:

  1. Are you in a saturated market?

  2. Are you working with enterprise or consumer products?

  3. Are you in a mature or nascent tech ecosystem?

  4. How price sensitive are your customers? 

Unfortunately, spending $100 on UX does not translate directly into 100 UX points. Some UX initiatives, e.g. discussing new user insights during weekly all-hands meetings, can be easy and effective. Sarah Berchild talks in more depth about some of the small things that can help deliver a strong UX impact in her article “It starts with you”.

“Benefits of UX by role:

  • Product Owner/ Manager
    Benefit: UX can help a Product Owner by providing insight into user’s needs.

  • Front End Developer
    Benefit: UX can help by working closely to make sure ideas are feasible to build.

  • Finance
    Benefit: UX can help by measuring outcomes that affect the business.

  • Operations
    Benefit: UX can help by making products easier to use and streamlining processes.

  • Legal
    Benefit: UX can help by keeping customers informed without overwhelming them with information.

  • Sales
    Benefit: UX can help by increasing revenue for user-centric products.” 

- Sarah Berchild

Recommended reading:


Basic Research Standards for Evidence-Based Decision-Making in Business Environments

by Marvin Cheung, Head of Research and Strategy

Research in Business Environments

In this section, we will add to the previous section on the relationship between frameworks and focus on working within a specific framework. In other words, instead of looking at the relationship between problem-solution pairs, we will examine the general best practices for resolving a single problem-solution pair. More specifically, we will explore the ways in which we can evaluate a solution.

There are two key challenges to resolving a problem-solution pair regardless of which framework you use. First, there is so much flexibility in problem formulation that it can be quite daunting. Second, we have to navigate incomplete, inaccurate, and even incorrect information in the real world. 

To address these two challenges, we will formulate guidelines based on academic research methods. It is important here to understand the differences between academic research and research in a business environment. While the quality of data and the amount of resources available are obvious differences, there are more nuanced ones.

Research in business environments does not have to be generalizable. For example, you can use existing theories to understand a phenomenon you are observing, e.g. decreasing customer satisfaction, or you can confirm whether a piece of information you found online applies to your company. It does not matter whether or not your findings can be applied to a larger population.

We tend to avoid conducting generalizable research in business environments because it can be expensive, but there are times when generalizable research is necessary: for example, in R&D functions where a new technology or theory can advance business goals, or when generalizable research is a prerequisite for operating in the industry, such as when clinical trials are needed. These research projects need to conform to strict industry standards. We recommend involving domain experts in these scenarios.

We can, however, develop general best practices for everyday research. How we conduct research and the standards we adopt for our insights significantly influence our ability to make sound decisions. As you will see, non-generalizable research is not necessarily easier. People can be very scrupulous when several million dollars is on the line.

There are several stages common across all problem-solution pairs. We will elaborate on each in the subsequent parts:

  1. Problem formulation: Are you asking the right question?

  2. Solution generation: Are your findings useful?

  3. Communication: Are you delivering your findings intentionally?

Part I: Problem formulation

While in earlier sections we have spoken about a starting condition and a series of problem-solution pairs in the abstract through the pre-HCD Design Thinking framework, we now want to apply the innovation process to help us structure our inquiry.

In the first layer: we have the overarching research question. Oftentimes, you can get a solid research question just by adding the question word “how” in front of the relevant goal or metric. For example, “how might we find product-market fit”, “how might we increase revenue”, or “how might we decrease churn”.

In the second layer: we want to specify the framework we will use to break down the research question. What is the angle? A well-formulated problem needs to have a clear subject and should say something about where you believe the bottleneck is. It also needs to be answerable, i.e. testable and falsifiable to an acceptable degree of certainty within resource constraints.

A problem in the second layer tends to take one of these forms:

  1. Exploratory research: what do we think about the subject?

  2. Descriptive research: what are the characteristics of the subject?

  3. Evaluative research: how effective is the subject?

To formulate a problem well, you want to ask:

  1. What do we know to be true with a high degree of certainty?

  2. What can we infer right away based on past research or experiences?

  3. What are the areas that require further research?

In the third layer: we identify several interconnected variables that require further investigation and delineate their relationships. Resolving these relationships should provide specific, actionable outcomes. For example, a sentence in the copy needs to be replaced, or a new image is needed for the website. As you resolve problem-solution pairs, you will iterate and move between layers of abstraction. Changing frameworks, the variables you study, etc. is both common and expected.
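One way to keep the three layers straight is to write an inquiry down as a small data structure. The following TypeScript sketch is our own illustration, with hypothetical example values:

    // Layer 1: the overarching research question ("how might we ...").
    // Layer 2: the framework - the angle and an answerable problem form.
    // Layer 3: interconnected variables and their suspected relationships.
    type ProblemForm = "exploratory" | "descriptive" | "evaluative";

    interface ResearchInquiry {
      researchQuestion: string;                        // layer 1
      framework: { angle: string; form: ProblemForm }; // layer 2
      variables: { name: string; suspectedEffectOn: string[] }[]; // layer 3
    }

    const inquiry: ResearchInquiry = {
      researchQuestion: "How might we decrease churn?",
      framework: { angle: "onboarding as the bottleneck", form: "evaluative" },
      variables: [
        { name: "time to first key action", suspectedEffectOn: ["30-day retention"] },
        { name: "signup copy clarity", suspectedEffectOn: ["time to first key action"] },
      ],
    };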

This is a very broad description of how we formulate problems. For people unfamiliar with managing complexity, it can be easier to start by getting a sense of what a streamlined process looks like. Problem formulation in the abstract can be difficult to grasp. We have included Bain & Company’s Case Interview Preparation page in the list of recommended readings. They have case studies available with video walkthroughs. These case studies take out the complexity and nuances of a situation, for example, stakeholder disagreements and other uncertainties outlined in Part IIC: Managing Uncertainties, but they nevertheless offer guidance on how to begin investigating problem formulation.

Part II: Solution Generation

The solution generation process is in some ways more straightforward than the problem formulation process. The steps are fairly similar across most problem types:

  1. Examine easy-to-access existing literature. This can include news articles, academic essays, blog posts, government reports, corporate publications etc.

  2. Connect available literature to operational data. Do your best to understand the problem you have with the data available. (We will elaborate on how to work with operational data in later Coursebooks.)

  3. Create custom solutions to resolve the problem, if necessary. This can include integrating new monitoring solutions, building data pipelines, custom dashboards, and so on. It is important to weigh resource considerations with the associated risks. Sometimes it is better to accept the risk than to build a custom solution.

When we evaluate the credibility and usefulness of a solution, we examine it across several factors:

  1. Is the research ethical?

  2. Is the research comprehensive?

  3. Has the researcher accounted for different uncertainties?

  4. Are there errors in the data or analysis?

Part IIA: Ethics

Unethical methods damage the credibility of the researcher, the institution, and the findings. Ethical best practices are established to help prevent behaviours that might harm organizations, researchers, research subjects, and the public. 

If you are in a leadership role, you will be responsible for your organization’s ethical standards. Even if you are not in a leadership role, you should always voice your concern through proper channels and in accordance with your employee handbook if you believe that your work will intentionally or unintentionally promote an unethical agenda. This can include promoting unhealthy behaviours or creating detrimental financial, physical, or mental health impacts on children, teenagers, and even adults.

Within a research project, there are two overarching questions:

  1. The ends: Will the research be used to promote unethical or illegal behaviours?

  2. The means: Will the research put anyone in harm’s way?

“Principlism” or the “Four Principles Approach” by Tom Beauchamp, Ruth Faden, and James Childress from the 1970s continues to provide guidance to researchers:

  1. Respect for autonomy: we should not interfere with the subject’s intended course in life and should avoid violations ranging from manipulative under-disclosure of relevant information to overriding the subject’s refusal to participate as a research subject.

  2. Nonmaleficence: we should avoid causing harm.

  3. Beneficence: the risk of harm presented by interventions must constantly be weighed against possible benefits for subjects and the general public.

  4. Justice: benefits, risks, and costs should be fairly distributed - we should not recruit subjects unable to give informed consent or subjects who do not have the option to refuse participation.

To be clear, there is no circumstance in everyday research where you should prioritize your research over the participant’s safety. For example, if you are considering an ethnographic study examining people’s behaviour in supermarkets and you see a tin can about to fall on your participant’s head - please intervene.

If your research requires you to put participants at risk in any way, you should stop and seek legal advice. Some product tests are regulated by government agencies including the FDA and its European counterparts the EFSA, EMA, and ECHA. This includes but is not limited to: human foods, human drugs, vaccines, blood, biologics, medical devices, radiation-emitting electronic products, cosmetics, as well as animal and veterinary products.

There are a few additional best practices we have adapted from CITI’s Responsible Conduct of Research (RCR) training, originally designed for academic researchers:

  • Authorship: The general consensus is that authorship is based on intellectual rather than material contribution, e.g. data or funding. The research team should collectively decide who qualifies as an author, ideally before the project begins. Each author is responsible for reviewing the manuscript and can be held responsible for the work that is published. Those who do not qualify for author status can be recognized in the acknowledgements.

  • Plagiarism: Although it may seem obvious, it is important to avoid plagiarism. Always put quotation marks around direct quotations, and attribute an idea you referenced to the original source. Missing citations make it difficult for others, including people who need to sign off on a project, to check the work. You should also be prepared to cite the source of a piece of information when you are giving a presentation.

  • Conflicts of interest: While there will always be a financial conflict of interest when you are conducting a study as an employee of an organization, you should still be wary of personal biases. For example, if you are a strong advocate for an idea, are you asking leading questions or bullying the interviewee into agreeing with you? As organizations mature, dedicated researchers can help maintain objectivity.

  • Data management: Ask for and record as little Personally Identifiable Information (PII) as possible. In most circumstances, an anonymous transcript of a user interview is sufficient. A screen recording with voice of how users interact with your product can be helpful, but very rarely will recording the face of the interviewee add value to your research. You should clearly communicate the data that will be collected, as well as how it will be stored and used. The tendency here is to over-collect, but the amount of PII needs to be balanced with the accuracy of the answers. Social-desirability bias can lead to an over-reporting of desirable behaviour and an under-reporting of undesirable behaviour. Please consult your legal team for details of the appropriate data privacy practices.
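As one small illustration of the data-minimization point above, obvious identifiers can be scrubbed from a transcript before it is stored. A minimal TypeScript sketch; the patterns are deliberately simplistic, our own invention, and no substitute for legal review:

    // Replace obvious personally identifiable information in a transcript
    // with placeholders before the transcript is stored.
    function redactTranscript(text: string): string {
      return text
        .replace(/[\w.+-]+@[\w-]+\.[\w.]+/g, "[email]")  // email addresses
        .replace(/\+?\d[\d ()-]{7,}\d/g, "[phone]")      // rough phone-number pattern
        .replace(/\bInterviewee:\s*[A-Z][a-z]+ [A-Z][a-z]+/g, "Interviewee: [name]");
    }

    console.log(
      redactTranscript("Interviewee: Jane Doe, reachable at jane@example.com or +1 555-010-9999.")
    );
    // -> "Interviewee: [name], reachable at [email] or [phone]."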

Part IIB: Comprehensiveness

One of the most common questions we receive is “When do I know I have enough research?” The simple answer is that you should exhaust all resources available to you within the resource constraints. There are some signs, however, that may indicate comprehensiveness before you reach that point:

  1. If your new sources are beginning to repeat information you already know, that is the first sign that your research process is close to completion. There are often three to five key reports and authors on the subject that everyone references. Can you identify them and discuss their relationship?

  2. If you can identify errors in your sources’ reasoning and begin to develop your own perspective, this is the second sign that your research process is close to completion. At this point, the Socratic method can be helpful. Depending on the size of the project and your own workflow, you can either have an informal discussion of your ideas before you start writing, or you can have a discussion after your first draft.

  3. The final sign is when you finish writing. Depending on the context, you might need a final project sign-off from your stakeholders. If they sign off, brilliant. You can also continue to validate your ideas through presentations and roundtable discussions. Publication is rare, since most works are either confidential or not up to publication standards due to resource constraints.

Part IIC: Managing Uncertainties

Uncertainties arise when we work with incomplete, inaccurate, and even incorrect information. To craft a credible and useful solution, we need to account for known, unknown, and unknowable uncertainties, a framework by Clare Chua Chow and Rakesh K. Sarin first published in 2002 in the journal Theory and Decision. There is no simple metric or combined uncertainty metric that can tell us when we need to eliminate a piece of information entirely. We can, however, still identify common uncertainties. Managing them well will require experience and good judgement.

Known uncertainties are the easiest to manage. Their presence is easily detectable, and they skew findings in a predictable direction:

  1. Conflicts of interest: corporate reports and research funded by corporations tend to advocate for specific private interests. Some are helpful but it is important to be critical of any omissions, research methodologies, and gaps in reasoning.

  2. Missing research methodologies: reports, especially those by corporations, have in the past included very narrow and bizarre studies with odd metrics to prove a point. Sample selection bias, social desirability bias, and the Hawthorne effect are examples of threats to a study’s internal validity. Review a study’s methodology or the legal fine print on marketing materials whenever possible.

  3. No acknowledgement of the limitations of the study: some studies make overly generalized and unsubstantiated claims. This calls into question the research’s external validity. A closer look at the relationship between the study’s research methodology and the conclusions will often reveal any gaps in the author’s reasoning.

  4. Fuzzy language and buzzwords: what does it mean when a company says they use artificial intelligence or make sustainability claims? Be wary of ambiguous or poorly defined terms. 

  5. Social impact claims: we are incredibly careful when an organization makes social impact claims. Social problems are wicked problems where accounting for the second and third order impacts of an intervention is both difficult and expensive. We typically expect to see results from an ethnographic study to understand the potential impacts of an intervention on a specific community, and a randomized controlled trial (RCT) to understand the efficacy of an intervention.

Unknown uncertainties are more difficult to manage. Though their presence can be detected, they skew findings in an unpredictable direction:

  1. Incomplete research: with resource constraints or limited expertise, some research may simply not meet the comprehensiveness criteria. Effects of incomplete research can include failing to account for a confounding variable, i.e. a variable that affects both the dependent and independent variables, which creates a spurious correlation where there is no clear causal relationship. There are also projects we consider to be unrealistic, when certain aspects clearly go against the known logics of the industry. Formulating an answerable problem, acknowledging the limitations of a study, and consulting experts are key.

  2. Misinformation and disinformation: this is particularly problematic when working with pop culture or news sources. We have explored this in further detail in the recommended reading “Media — To sell a crisis: Understanding the incentives and control system behind sensationalist news and misinformation”.

  3. Uncalibrated tools: people often assume that digital tools, such as Google Analytics, are perfect. You can get a sense of how accurate your tools are if you do a few pre-tests, e.g. How often does it fail to track a click? When does it fail? What is the margin of error? (A sketch of one such pre-test follows this list.)

  4. Presence of systemic corruption: corruption muddies data, reports, and findings. We recommend being extra cautious when referencing a report on countries that are highly corrupt. Transparency International’s Corruption Perceptions Index (CPI) is a good reference.
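One way to run the pre-test suggested in item 3 is to compare a trusted count, e.g. server-side logs, against what the analytics tool reports. A minimal TypeScript sketch, assuming you can export both counts; the numbers are hypothetical:

    // Estimate how often an analytics tool fails to track a click by
    // comparing it against a trusted server-side count.
    function trackingLossRate(serverCount: number, analyticsCount: number): number {
      if (serverCount === 0) throw new Error("no server-side events to compare against");
      return (serverCount - analyticsCount) / serverCount;
    }

    const loss = trackingLossRate(10_000, 9_350); // hypothetical pre-test counts
    console.log(`~${(loss * 100).toFixed(1)}% of clicks go untracked`); // ~6.5%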

Unknowable uncertainties are incredibly difficult to identify. Their presence is difficult to detect, and they skew findings in an unpredictable direction:

  1. Unofficial narratives: these are the details left out of official reports. Stakeholder disagreements, details under confidentiality agreements etc. can be the root cause of an action without ever showing up on reports. This requires an insider’s perspective.

  2. Errors: mistakes and errors during the research process at a reputable organization are rare but can happen. The most common error is miscommunication when information flows up the chain of command. It is important to do a gut check when you read reports. 

Part IID: Errors

We want to describe some of the errors we commonly observe, with reference to the framework by Andrew W. Brown, Kathryn A. Kaiser, and David B. Allison in the article “Issues with data and analyses” published in the Proceedings of the National Academy of Sciences of the USA in 2018. Some errors are minor and do not impact the findings significantly, while others can invalidate the entire project.

  • Errors in design: poor data collection methods, research design, or sampling techniques can produce bad data. The most common mistake is when there is a mismatch between the concept being studied and how it is operationalized or measured. For example, we have seen papers that use residential real estate water usage figures to estimate commercial real estate water usage. Any conclusions from then onwards are questionable.

  • Errors in data management: this can be as simple as having one or two typos in the code you use to analyze the data. The bigger challenge, however, is when people fail to recognize the expiration date of their data. You need to review the validity of your data whenever there are big changes. This can include drastic changes in the macroenvironment, e.g. the pandemic, or the product itself, e.g. a rebrand. (A small data-validity guard is sketched after this list.)

  • Errors in statistical analysis: it is true that if you torture the numbers long enough, they will say anything. We are especially cautious when we read papers that look at the statistical correlation between two or more macroeconomic indices without fully considering the nuances of the content, the limitations of individual indices, and the limitations of the statistical methods applied. 

  • Errors in logic: at a basic level, you can either disagree with the premises or the conclusion. The most common problem is when researchers make an unjustified generalization, e.g. because a certain demographic responds well to a product in North America, the product will perform well in Asia as well. Confusing correlation and causation is also common and problematic. The Stanford Encyclopedia of Philosophy describes other common logical fallacies in detail.

  • Errors in communication: this happens generally towards the end of a paper, when there is a mismatch between the conclusions of a study and the ambitions of the author. Overzealous authors or bloggers can extrapolate and exaggerate the impacts of a study. Sensationalized language and the overuse of hedging, e.g. might, could, etc., are two of the signs we pay attention to.
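Returning to the data-management point, one cheap guard is to tag each dataset with the regime it was collected under and refuse to mix regimes silently. A minimal TypeScript sketch; the field names and datasets are our own hypothetical examples:

    // Tag each dataset with the "regime" it was collected under (e.g. pre- vs
    // post-rebrand) and flag analyses that silently mix regimes.
    interface Dataset { name: string; regime: string; rows: number[] }

    function assertSingleRegime(datasets: Dataset[]): void {
      const regimes = new Set(datasets.map((d) => d.regime));
      if (regimes.size > 1) {
        throw new Error(`mixing regimes: ${Array.from(regimes).join(", ")} - revalidate first`);
      }
    }

    assertSingleRegime([
      { name: "signups-2019", regime: "pre-pandemic", rows: [120, 140] },
      { name: "signups-2021", regime: "post-pandemic", rows: [90, 80] },
    ]); // throws: the two datasets were collected under different conditions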

Part III: Effective Communications

Writing is a great way to think through a problem and clarify the relationship between the variables you have identified. However, there are times when a full paper is not needed. Think carefully about what you want to spend time and resources on. There are many faster alternatives:

  1. Share a quote: sometimes all you need to do is share a snippet from an article you have read with a colleague.

  2. Write notes on a presentation slide: get to the point; keep it short and simple.

  3. Others: notes on the company whiteboard, group Slack messages etc. are all great options.

In the event a more formal document is needed, it should be as short as possible. By the time you reach two to three pages, you should include a one-paragraph executive summary. Longer papers may benefit from a one-page memo. These summaries should provide an overview of the topic and details of the recommendations. You should also assume that unless there are specific questions related to the methodology or the reasoning of the report, no one except your manager or an investor in a due diligence process will read the paper in full. Unfortunately, your colleagues are busy people too.

Getting the summary right is critical and we generally look for these components:

  1. Context: this should include any relevant information surrounding the problem. For example, people or partnering organizations who have been involved in the research, the inspiration behind the project etc.

  2. Research question: this is the overarching question described in the first layer from Part I. It is the high level question with reference to the business goal.

  3. Method: this should include the sources, data, and analytical methods you used to explore the question.

  4. Recommendations: a specific course of action you recommend based on your research, which may include further research if necessary, as well as the limitations of the study.

There are a few tips we recommend when it comes to business writing, which apply to both short and long-form essays:

  1. Keep it simple. Write simple sentences in an active voice with a clear subject, verb, and object. As a researcher, it is your responsibility to communicate your findings to the readers. A report that is easier to read is more likely to be read.

  2. Structure it well. A generic structure is entirely okay and even encouraged. Start with an introduction explaining the context, the significance of the research, definitions for key terms, and details of the framework you will use. Then write a few paragraphs explaining your findings and end with a paragraph with your recommendations. Keeping it simple is key.

  3. Write good topic sentences. Each paragraph should have a clear, self-contained idea. The topic sentence should identify or introduce the idea, with the sentences that follow providing support or clarification.

  4. Avoid overly long sentences. Consider splitting a sentence in two if it is longer than two lines.

  5. Always define the technical terms you use. Write as though you are talking to somebody who is not familiar with the topic - we would not need the research project if we already knew everything about it. A successful report will be read by many executives across departments. Clearly defining technical terms will help you communicate with your investors and stakeholders too.

Recommended readings:


Innovating systematically in complex conditions through guided trial and error

by Marvin Cheung, Head of Research and Strategy

Trial and error in complex conditions

UC Berkeley professors Horst Rittel and Melvin Webber observed some of the properties of complexity in 1973 and called the type of challenge that involves complexity “wicked problems”. At a basic level, a wicked problem is a challenge whose factors are deeply interwoven and cannot be understood in isolation. There is also no way to fully account for a solution’s knock-on effects until it is executed.

Common innovation challenges, such as finding product-market fit or scaling, can be considered wicked problems. Even with the most sophisticated technologies, there is no way to know for certain whether a new product will succeed without testing it. Its success will depend on many factors, ranging from the quality of the product or its marketing to its final design.

We can use variations of a puzzle as a metaphor for wicked problems and their inverse, “tame” problems. A tame problem is like a thousand-piece puzzle from a box: you can follow the picture, start with the edge pieces, and work towards the middle; it is not easy, but there is a definitive goal and the relationship between the parts is clear.

We can model this through a tree diagram. There are one thousand pieces originally. You can start with any piece. Each time you find a match, the number of possible pieces decreases by one. Like any other game, there are established best practices that can facilitate the process. For example, you can start with the corner pieces and work your way into the center.

A wicked problem is like a hundred ten-piece puzzles mixed into the same box. Some of the ten-piece puzzles will produce a more desirable picture than others, but you only have time to put one or two complete puzzles together. Immediately, we begin to see some characteristics of real-world challenges. First, there is little indication of what a successful end result looks like. Second, there are resource constraints.

By definition, we know that some form of trial and error is required. There are, however, different kinds of trial and error, as outlined in Donald T. Piele and Larry E. Wood’s essay “Thinking Strategies with the Computer” as part of the anthology The Best of Creative Computing Volume 3 published in 1980 by Creative Computing Press. The three, with varying degrees of effectiveness, are random, systematic, and guided trial and error. 

The most basic strategy is random trial and error. This would be equivalent to moving the algebraic symbols around randomly when you are stuck on a math question. You pick up a random piece, build the first puzzle, then pick up another random piece and build the second puzzle. At the end of this process, you have two complete puzzles. The chances of you liking both puzzles are the same as the chances of you hating both of them.

The better strategy is systematic trial and error. Instead of picking pieces at random, we set a parameter of trying to build a puzzle with at least one blue piece, and proceed to list the rest of the colours in order of preference in case there are no blue pieces. In this scenario, we decrease the likelihood of getting a puzzle you really hate, but we still have very little control over the outcomes. 

The most effective strategy is guided trial and error. Say we start with trying to build a blue race car puzzle but fail to find any blue pieces. You then decide that orange is your next favourite colour and consider building a monotone puzzle. You discover fifteen orange pieces - either enough for one orange monotone ten-piece puzzle, fifteen puzzles with one orange piece each, or anything in between. Unfortunately, none of the fifteen orange pieces work together. You see the opportunity to build a puzzle with an orange motorcycle and complete it. In this scenario, though the top choice was not available, you manage to find a close second.

The idea of guided trial and error is further observed by mathematician Keith Devlin of Stanford’s Center for the Study of Language and Information in 2006, with reference to how gamers are reshaping the business world, and by the Headwaters Science Institute in its account of the process of scientific research.

While guided trial and error may seem like an intuitive strategy, we are only just scratching the surface of the challenge. The bigger question remains: How do we apply a guided trial and error process to innovation, and even more importantly, is there a repeatable process we can use to help us innovate systematically? The answer, as it turns out, is not quite so simple. 

Existing innovation best practices

We will start with common best practices and build up to our new recommendations. For example, in the guided trial and error scenario, we have already introduced a popular idea in lean startup: iterations. Rapid trial and error will increase the chances of discovering a desirable outcome. We have also introduced the idea of Minimum Viable Product: build only as much as you need to get a sense of the prototype’s desirability.

To introduce the next best practice, we need to develop the metaphor further to better reflect the challenges of innovating in the real world: imagine trying to tackle the hundred ten-piece puzzle challenge in a team of five, where not everyone is allowed to see the puzzle pieces. Your colleagues, investors, advisors, and customers all have different motivations and viewpoints, yet the ultimate success of the project requires some alignment of your stakeholders. 

As a response to this complexity of intertwined stakeholder needs, Human-Centered Design (HCD), commonly associated with IDEO and the Stanford Design School, gained recognition. One of the hallmarks of HCD is the use of post-it notes and mind maps to facilitate communications and help secure stakeholder buy-in.

If we return to the original guided trial and error scenario, we would still see two missing pieces in our current understanding of innovation: (1) how we choose our parameters, and (2) how we evaluate solutions within the parameters. We will address (2) in the next section of this Coursebook.

To understand the knowledge gap, we need to recognize the limitations of the hundred ten-piece puzzles metaphor in representing innovation challenges. For one, constraints are rarely as convenient as “puzzle pieces with only orange (monotone)”; even “puzzle pieces with some orange” would have increased the level of difficulty significantly. We have also taken perfect eyesight and perfect information for granted here. In real life, we have to work with imperfect instruments, as well as incomplete, inaccurate, and even incorrect information.

At its core, (1) is the more abstract part of how we think about problems: how should we guide the thinking process in a way that enables innovation? From a practical point of view, how do we connect the many frameworks available and choose the right parameter at the right time?

We found the answer in the field of design. Design is of interest to the field of innovation because designers produce creative outcomes in every project. We can observe the properties of wicked problems within a project: there are clients and stakeholders with different needs, and at each point there is an infinite number of possibilities. A work of architecture can take many forms, and an empty canvas can carry whatever image you put on it. This is elaborated on in Richard Buchanan’s essay from 1992, “Wicked Problems in Design Thinking”, published by MIT Press.

Design thinking in this context is not the same as HCD. It comes as a surprise to many people, including designers, that a lot of the literature on design thinking predates IDEO. In fact, while the design thinking literature was being canonized in 1991 at the first symposium on Research in Design Thinking held at TU Delft, nowhere does David Kelley, co-founder of IDEO, mention the phrase “design thinking” in his 2002 TED Talk, titled “Human-centered Design”. Stanford’s d.school, also co-founded by David Kelley, only began teaching ‘design thinking’ in 2005, according to Design Thinking: Understand - Improve - Apply, published by Springer in 2011. As a further demonstration of the difference between pre-HCD design thinking and the current understanding of Design Thinking, none of the literature quoted below is included in IDEO’s article on the history and evolution of design thinking.

It seems unavoidable that we discuss, however briefly, the term “design thinking”: to think like a designer. As you may have noticed, design thinking is not included in the title of this essay. This is not only because of the semantic shift from the original meaning to HCD, but also because the term design thinking itself misses the point. All of the previous attempts to “scientise” design, as observed by Nigel Cross in his essay “Design Discipline versus Design Science” from 2001, published by MIT Press, have inevitably failed. Attempts to codify design ignore a fundamental truth of the discipline: art and design challenge the boundaries of norms. The ways of thinking are not static. Here, we are not so much interested in the subject of design itself, or how we can define a way of thinking exclusive to designers, but in the subject of innovation and what we can learn from the pre-HCD design thinking literature.

We can begin by reformulating (1) in the language of design. “In design, ‘the solution’ does not arise directly from ‘the problem’; the designer’s attention oscillates, or commutes, between the two, and an understanding of both gradually develops, as Archer (1979) has suggested [...] Designers use alternative solution conjectures as a means of developing their understanding of the problem,” notes Nigel Cross in the proceedings of Research in Design Thinking published in 1992.

While it may seem at first glance that the creative process is like a pendulum swinging between two distinct ends with no apparent start or end, we know this to be untrue. As Cross observes, “a design solution is not an arbitrary construct - it usually bears some relationship to the problem, as given, which is, after all, the starting condition for considering solution possibilities.” 

We can cut through the pendulum swing and frame the creative process instead as a starting condition, followed by a series of problem-solution pairs. We have seen this at work in the guided trial and error scenario when we started with the parameter “blue racecar”, moved to the solution “no blue pieces”, and then to the second parameter “orange monotone puzzle” with the second solution “fifteen orange pieces”. We can also see how “blue racecar”, for example, serves as a parameter, a solution conjecture, and a hypothesis of a possible and highly desirable outcome. 

We can map our starting condition as well as problem-solution pairs in a tree diagram too. At the top layer, we have one starting condition, then we branch off to our second layer with the first problem-solution pair “blue racecar” and “no blue pieces”. Also on the second layer is the problem-solution pair “orange monotone puzzle” and “fifteen orange pieces”, which branches off into the third layer with the problem-solution pair “none of the fifteen orange pieces work together” and “build the puzzle with an orange motorcycle”.
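Written down directly, the tree of problem-solution pairs from the puzzle scenario might look like the following TypeScript sketch; the type and field names are our own illustration:

    // A starting condition followed by branching problem-solution pairs.
    interface PairNode {
      problem: string;      // the parameter / solution conjecture
      solution: string;     // what we learned when we tried it
      children: PairNode[]; // follow-up pairs at the next layer
    }

    const inquiryTree: { startingCondition: string; pairs: PairNode[] } = {
      startingCondition: "a hundred ten-piece puzzles mixed into one box",
      pairs: [
        { problem: "blue racecar", solution: "no blue pieces", children: [] },
        {
          problem: "orange monotone puzzle",
          solution: "fifteen orange pieces",
          children: [
            {
              problem: "none of the fifteen orange pieces work together",
              solution: "build the puzzle with an orange motorcycle",
              children: [],
            },
          ],
        },
      ],
    };

Note that the branches are not mutually exclusive, which is precisely where the point about MECE in the next paragraph comes in.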

Much of what we described above coincides with existing thinking in business. DMAIC in lean six sigma advocates for a continuous effort to Define, Measure, Analyze, Improve, and Control. The McKinsey Mind, published by McGraw-Hill Education in 2001, discusses in detail the importance of a hypothesis-led investigation, so we do not have to “boil the ocean” looking for solutions, and the use of logic trees with issues and sub-issues. Where we see a clear point of divergence is the MECE principle (Mutually Exclusive, Collectively Exhaustive) in the making of the logic tree. By definition, complexity involves non-mutually exclusive parts that are closely interconnected.

Innovating systematically in complex conditions

This is when we need to leave the metaphor behind and instead consider the real world example described in Richard Buchanan’s essay from 1992: “Managers of a large retail chain were puzzled that customers had difficulty navigating through their stores to find merchandise. Traditional graphic design yielded larger signs but no apparent improvement in navigation - the larger the sign, the more likely people were to ignore it. Finally a design consultant suggested that the problem should be studied from the perspective of the flow of customer experience. After a period of observing shoppers walk through stores, the consultant concluded that people often navigate among different sections of a store by looking for the most familiar and representative examples of a particular type of product. This led to a change in display strategy, placing those products that people are most likely to identify in prominent positions.”

We can see many of the prior principles at work here:

  1. They iterated and ran small tests.

  2. There is a starting condition (customers have difficulty navigating the client’s stores) followed by two problem-solution pairs.

  3. They took a hypothesis-led approach.

The larger question still remains: how did they move from signage and graphic design to product placement and customer experience? Is that just a coincidence? Pre-HCD Design Thinking literature says otherwise. Specifically, we observed three types of flexibility that enable innovation. First, flexibility with the starting condition. Second, flexibility in exploring different layers of abstraction. Third, flexibility with the frameworks used. We also found that designers add information and perspective to manage the immense flexibility of the process.

The first strategy we found was having flexibility with the starting condition. Quoting from Cross’s 1992 essay: “Thomas and Carroll (1979) concluded that ‘Design is a type of problem solving in which the problem solver views the problem or acts as though there is some ill-definedness in the goals, initial conditions or allowable transformations’”. The literature makes explicit not just that there is flexibility in how we ideate hypotheses, but that we should be ready to accept that the original starting condition is renegotiable. A starting condition is not written in stone, and we can pivot.

The second strategy we found was allowing flexibility in exploring different layers of abstraction. Because the parts and the wholes of a complex challenge are interconnected, and the boundaries between them are unclear, we can and should be able to move between different levels of abstraction. In the Doctrine of Placements, elaborated in his 1992 essay, Buchanan observes how many designers work across the four broad areas he identified - symbolic and visual communications, material objects, activities and organized services, and complex systems or environments - to deliver creative outcomes.

Specifically, Buchanan notes how designers treat these areas as interconnected, “with no priority given to any single one”. While the “sequence of signs, things, actions, and thought could be regarded as an ascent from confusing parts to orderly wholes”, “there is no reason to believe that parts and wholes must be treated in ascending rather than descending order.”

One implication is that when we structure our thinking, we can at any point move up and across branches in the tree. Executing a project is different: going down a certain path requires committing significant time and resources, creating a unidirectional navigation pattern down the decision tree. In the thinking process, we can change paths with far smaller commitments, creating a multi-directional navigation pattern in the tree.
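As a rough sketch of this contrast, the Node structure from the earlier sketch can be extended with a parent reference, so that any point in the thinking tree can move up to revisit its framing or across to a sibling conjecture. Again, the names are our own illustration, not a prescribed method:

  from dataclasses import dataclass, field
  from typing import List, Optional

  @dataclass
  class Node:
      problem: str
      solution: Optional[str] = None
      parent: Optional["Node"] = None
      children: List["Node"] = field(default_factory=list)

      def branch(self, problem: str) -> "Node":
          child = Node(problem, parent=self)
          self.children.append(child)
          return child

      def up(self) -> "Node":
          # Revisiting the framing costs little in the thinking process.
          return self.parent if self.parent is not None else self

      def across(self) -> List["Node"]:
          # Sibling conjectures remain available for reconsideration.
          if self.parent is None:
              return []
          return [c for c in self.parent.children if c is not self]

  root = Node("starting condition")
  a = root.branch("conjecture A")
  b = root.branch("conjecture B")

  # In execution, descending from root to a forecloses b; in thinking,
  # we can still move up from a and across to b at any point.
  assert a.up() is root
  assert a.across()[0] is b

The asymmetry is the point: descending a branch in execution is expensive, while re-entering a sibling branch in thought is nearly free.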

The third strategy, flexibility with the frameworks used, was also identified in Buchanan’s 1992 publication: “Although the [retail chain navigation challenge] is a minor example, it does illustrate a double repositioning of the design problem [...] There are so many examples of conceptual repositioning in design that it is surprising that no one has recognized the systematic pattern of invention that lies behind design thinking in the twentieth century [...] Understanding the difference between a category and a placement is essential if design thinking is to be regarded as more than a series of creative accidents. Categories have fixed meanings that are accepted within the framework of a theory or a philosophy, and serve as the basis for analyzing what already exists. Placements have boundaries to shape and constrain meaning, but are not rigidly fixed and determinate.”

Creatively switching between disciplines, frameworks, and layers of abstraction helps illuminate the relationship between the parts and the wholes, and enables us to deliver innovation in complex situations. This applies to both the problem formulation process and the solution generation process. Emerging literature in Cross Domain Deterrence reveals the efficiency of leveraging capabilities in one domain to compensate for and strengthen capabilities in another. Quite simply, formulating solution combinations across departments gives us more flexibility, options, and control.

Having too many options can feel overwhelming. Indeed, this seems to add to the inherent ambiguity of the innovation process. Pre-HCD Design Thinking literature has in fact observed ways in which designers manage the immense flexibility. Cross, in the 1992 publication, notes: “In early observational studies of urban designers and planners, Levin (1965) realized that they ‘added information’ to the problem as given, simply in order to make a resolution of the problem possible [...] Darke (1979) from her interviews with successful architects [...] also concluded that the architects had all found, generated or imposed particular strong constraints, or a narrow set of objectives, upon the problem, in order to help generate the early solution concept.” In innovation, these constraints can include your values (what you will and will not do), a vision (how your venture will align with a well-researched projection of the future), or anything else you discover using different frameworks.


An Introduction to Complexity

by Marvin Cheung, Head of Research and Strategy

In a conversation, when something takes too long to explain, requires too much contextual information, or has iffy components, we might say - it is complicated. The idea of complexity goes a little further. While we use the word complicated to describe a challenging and messy situation, we use the word complex to describe a complicated situation we cannot fully comprehend.

Fixing a classic car is complicated. It may require parts to be shipped from abroad, or niche expertise to fix the particular engine. Fixing an oil spill is complex. It is not just a matter of scooping up the oil. There are second and third order effects that are difficult to predict, and you have no way of simulating an oil spill of the same magnitude at the same time and location, nor can you unspill it. 

Many strategy challenges can be considered complex, especially novel strategy problems that fall under the umbrella of innovation. Say you are considering launching a new feature you do not have the capacity to build in-house. Regardless of your hiring plan, you will have to adjust your pricing strategy to reflect the increase in costs. How will this affect your customers in different price tiers, and will the aggregate effect increase or decrease profits?

Complexity defies the many efforts to study it. In many ways, that is the nature of complexity. It lies beyond the boundaries of human knowledge, and it confounds even the most sophisticated minds and technology. While there is no metric to measure complexity, we can still see why the world is increasing in complexity.

As we become increasingly connected, traditional boundaries of markets start to break down. Long gone are the insular markets with homogeneous demographics. Companies small and large, with a local or global footprint, find themselves subjected to rapidly changing macro and micro dynamics from international influences. Companies “cannot develop models of the increasingly complex environment in which they operate”, notes John C. Camillus in an article published by the Harvard Business Review. 

While we are still far from being able to understand complexity itself, there is general consensus that the number of stakeholders involved contributes greatly to the complexity of a challenge. One reason is that each stakeholder has a different set of perspectives and needs that we must account for. Another, less appreciated, reason is that the reliability of our information on individual stakeholders decreases as the number of stakeholders increases.
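To give a feel for how quickly this compounds, consider a deliberately simplified model - our own illustration, not a claim from the complexity literature - in which our information on each stakeholder is independently accurate with a fixed probability. Even holding per-stakeholder reliability constant, confidence in the overall stakeholder picture erodes fast:

  # Illustrative only: assumes independence and a fixed per-stakeholder
  # reliability, both strong simplifications of real stakeholder networks.
  p = 0.95  # reliability of our information on any one stakeholder

  for n in (1, 5, 10, 20, 50):
      joint = p ** n  # probability the entire stakeholder picture is accurate
      print(f"{n:2d} stakeholders -> joint reliability {joint:.2f}")

  # With 1 stakeholder the picture is 95% reliable; with 50, roughly 8% -
  # it becomes near-certain that some part of our picture is wrong.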

In the next two sections, we will first investigate strategies to manage complexity, and then establish basic information standards for evidence-based decision-making in high-uncertainty environments.
