A Complete Guide to Primary and Secondary Research in UX Design


To succeed in UX design, you must know what UX research methods to use for your projects.

This impacts how you:

  • Understand and meet user needs
  • Execute strategic and business-driven solutions
  • Differentiate yourself from other designers
  • Use your resources more efficiently
  • Innovate within your market

Primary and secondary research methods are crucial to uncovering this. The former is when you gather firsthand data directly from sources, while the latter synthesizes existing data and translates it into insights and recommendations.

Let's dive deep into each type of research method and its role in UX research.

If you are still hungry to learn more, specifically how to apply it practically in the real world, you should check out Michael Wong's UX research course. He teaches you the exact process and tactics he used to build a UX agency that generated over $10M in revenue.

What is primary research in UX design

Primary UX research gathers data directly from the users to understand their needs, behaviors, and preferences.

It's done through interviews, surveys, and observing users as they interact with a product.

Primary research in UX: When and why to use it

Primary research typically happens at the start of a UX project so that the design process is grounded in a deep understanding of user needs and behaviors.

By collecting firsthand information early on, teams can tailor their designs to address real user problems.

Here are the reasons why primary research is important in UX design:

1. It fast-tracks your industry understanding

Your knowledge of the industry may be limited at the start of a project. Primary research helps you get up to speed because you interact directly with real customers, allowing you to work more effectively.

Example: Imagine you're designing an app for coffee lovers, but you're not a coffee drinker yourself. Through user interviews, you learn how users prefer to order their favorite drinks, what they love or hate about existing coffee apps, and which features are on their wishlist.

This crucial information will guide you on what to focus on in later stages when you do the actual designing.

2. You'll gain clarity and fill knowledge gaps

There are always areas we know less about than we'd like. Primary research helps fill these gaps by observing user preferences and needs directly.

Example: Let's say you're working on a website for online learning. You might assume that users prefer video lessons over written content, but your survey results show that many users prefer written material because they can learn at their own pace.

With that in mind, you'll prioritize creating user-friendly design layouts for written lessons.

3. You get to test and validate any uncertainties

When unsure about a feature, design direction, or user preference, primary research allows you to test these elements with real users.

This validation process helps you confidently move forward since you have data backing your decisions.

Example: You're designing a fitness app and can't decide between a gamified experience (with points and levels) or a more straightforward tracking system.

By prototyping both options and testing them with a group of users, you discover that the gamified experience concept resonates more.

Users are more motivated when they earn points and progress through levels. As a result, you pivot to the gamified design.

Types of primary research methods in UX design

Here's a detailed look at common primary research methods in UX:

1. User interviews

  • What is it: User interviews involve one-on-one conversations with users to gather detailed insights, opinions, and feedback about their experiences with a product or service.
  • Best used for: Gathering qualitative insights on user needs, motivations, and pain points.
  • Tools: Zoom and Google Meet for remote interviews; Calendly for scheduling; Otter.ai for transcription.

2. Surveys

  • What is it: Surveys are structured questionnaires designed to collect quantitative data on user preferences, behaviors, and demographics.
  • Best used for: Collecting data from many users to identify patterns and trends.
  • Tools: Google Forms, SurveyMonkey, and Typeform for survey creation; Google Sheets and Notion for note taking.

3. Usability testing

  • What is it: Usability testing involves observing users interact with a prototype or the actual product to identify usability issues and areas for improvement.
  • Best used for: Identifying and addressing usability problems.
  • Tools: FigJam, Lookback.io , UserTesting, Hotjar for conducting and recording sessions; InVision, Figma for prototype testing; Google Sheets to log usability issues and track task completion rates. ‍
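Metrics like task completion rate and time-on-task are simple to compute once sessions are logged, whether in a spreadsheet or a script. Here is a minimal sketch, assuming a hypothetical log format (the participants, task name, and timings are made up):

```python
from statistics import mean

# Hypothetical usability-test log: one row per participant attempt at a task.
sessions = [
    {"participant": "P1", "task": "checkout", "completed": True,  "seconds": 74},
    {"participant": "P2", "task": "checkout", "completed": False, "seconds": 180},
    {"participant": "P3", "task": "checkout", "completed": True,  "seconds": 96},
    {"participant": "P4", "task": "checkout", "completed": True,  "seconds": 130},
]

def task_metrics(log, task):
    """Completion rate and mean time-on-task (among completers) for one task."""
    attempts = [row for row in log if row["task"] == task]
    completers = [row for row in attempts if row["completed"]]
    return {
        "completion_rate": len(completers) / len(attempts),
        "mean_seconds": mean(row["seconds"] for row in completers),
    }

# 3 of 4 participants completed the task; mean time is computed over completers only.
print(task_metrics(sessions, "checkout"))
```

Averaging time only over successful attempts is a common convention, since failed attempts often end at an arbitrary cutoff.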

4. Contextual inquiry

  • What is it: This method involves observing and interviewing users in their natural environment to understand how they use a product in real-life situations.
  • Best used for: Gaining deep insights into user behavior and the context in which a product is used.
  • Tools: GoPro or other wearable cameras for in-field recording; Evernote for note-taking; Miro for organizing insights. ‍

5. Card sorting

  • What is it: Card sorting is when users organize and categorize content or information.
  • Best used for: Designing or evaluating the information architecture of a website or application.
  • Tools: FigJam, Optimal Workshop, UXPin, and Trello for digital card sorting; Mural for collaborative sorting sessions. ‍
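Card-sort results are often analyzed by computing, for each pair of cards, the share of participants who placed the two cards in the same group (a similarity matrix), which then informs the information architecture. Here is a minimal sketch with hypothetical participants and cards:

```python
from itertools import combinations

# Hypothetical open card-sort results: each participant's groupings of cards.
sorts = [
    [{"latte", "espresso"}, {"muffin", "scone"}],   # participant 1
    [{"latte", "espresso", "muffin"}, {"scone"}],   # participant 2
    [{"latte", "espresso"}, {"muffin", "scone"}],   # participant 3
]

def similarity(sorts, card_a, card_b):
    """Share of participants who placed both cards in the same group."""
    together = sum(
        any(card_a in group and card_b in group for group in groups)
        for groups in sorts
    )
    return together / len(sorts)

# Print the pairwise similarity for every pair of cards.
cards = sorted({card for groups in sorts for group in groups for card in group})
for a, b in combinations(cards, 2):
    print(f"{a} / {b}: {similarity(sorts, a, b):.2f}")
```

Pairs with high similarity (here, latte and espresso) are strong candidates to live under the same category; tools like Optimal Workshop compute this matrix for you.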

6. Focus groups

  • What is it: Group discussions with users that explore their perceptions, attitudes, and opinions about a product.
  • Best used for: Gathering various user opinions and ideas in an interactive setting.
  • Tools: Zoom, Microsoft Teams for remote focus groups; Menti or Slido for real-time polling and feedback. ‍

7. Diary studies

  • What is it: A method where users record their experiences, thoughts, and frustrations while interacting with a product over a certain period of time.
  • Best used for: Understanding long-term user behavior, habits, and needs.
  • Tools: Dscout, ExperienceFellow for mobile diary entries; Google Docs for simple text entries. ‍

8. Prototype testing

  • What is it: Prototype testing is when users evaluate the usability and design of early product prototypes.
  • Best used for: Identifying usability issues and gathering feedback on design concepts.
  • Tools: Figma for creating and sharing prototypes; Maze for unmoderated testing and analytics. ‍

9. Eye-tracking

  • What is it: A method that analyzes where and how long users look at different areas on a screen.
  • Best used for: Understanding user attention, readability, and visual hierarchy effectiveness.
  • Tools: Tobii, iMotions for hardware; Crazy Egg for website heatmaps as a simpler alternative. ‍

10. A/B testing

  • What is it: A/B testing compares two or more versions of a webpage or app feature to determine which performs better in achieving specific goals.
  • Best used for: Making data-driven decisions on design elements that impact user behavior.
  • Tools: Optimizely, Google Optimize for web-based A/B testing; VWO for more in-depth analysis and segmentation. ‍
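Deciding whether one variant "performs better" usually comes down to a significance test on the conversion counts, which A/B tools run for you behind the scenes. A common choice is the two-proportion z-test; here is a minimal sketch using only the standard library (the traffic and conversion numbers are made up):

```python
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 120/1000 conversions on A vs. 150/1000 on B.
z, p = ab_test(120, 1000, 150, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A small p-value (conventionally below 0.05) suggests the difference is unlikely to be chance, but sample size and test duration matter as much as the arithmetic.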

11. Field studies

  • What is it: Research done in real-world settings to observe and analyze user behavior and interactions in their natural environment.
  • Best used for: Gaining insights into how products are used in real-world contexts and identifying unmet user needs.
  • Tools: Notability, OneNote for note-taking; Voice Memos for audio recording; Trello for organizing observations. ‍

12. Think-aloud protocols

  • What is it: A method in which users verbalize their thought process while interacting with a product. It helps uncover their decision-making process and pain points.
  • Best used for: Understanding user reasoning, expectations, and experiences when using the product.
  • Tools: UsabilityHub, Morae for recording think-aloud sessions; Zoom for remote testing with screen sharing.

Challenges of primary research in UX

Here are the obstacles that UX professionals may face with primary research:

  • Time-consuming : Primary research takes significant time to plan, conduct, and analyze. This is particularly true for methods that involve a lot of user interaction.
  • Resource intensive : A considerable amount of resources is needed, including specialized tools or skills for data collection and analysis.
  • Recruitment difficulties : Finding and recruiting suitable participants willing to put in the effort can be challenging and costly.
  • Bias and validity : The risk of bias in collecting and interpreting data makes it important to design the research strategy carefully so that the findings are accurate and reliable.

What is secondary research in UX design

Once primary research is conducted, secondary research analyzes this data and converts it into insights. Researchers may also find common themes and ideas and translate them into meaningful recommendations.

Journey maps, personas, and affinity diagrams can help them better understand the problem.

Secondary research also involves reviewing existing material: published books, articles, studies, and online information such as competitor websites and analytics that support design ideas and concepts.

Secondary research in UX: Knowing when and why to use it

Secondary research is a flexible method in the design process. It fits in both before and after primary research.

At the project's start, looking at existing research and what's already known can help shape your design strategy. This groundwork helps you understand the design project in a broader context.

After completing your primary research, secondary research comes into play again. This time, it's about synthesizing your findings and forming insights or recommendations for your stakeholders.

Here's why it's important in your design projects:

1. It gives you a deeper understanding of your existing research

Secondary research synthesizes your primary research findings to identify common themes and patterns. This allows for a more informed approach and uncovers opportunities in your design process.

Example: When creating personas or proto-personas for a fitness app, you might find common desires for personalized workout plans and motivational features.

This data shapes personas like "Fitness-focused Fiona," a detailed profile that embodies a segment of your audience with her own set of demographics, fitness objectives, challenges, and likes. ‍

2. Learn more about competitors

Secondary research in UX is also about leveraging existing data in the user landscape and competitors.

This may include conducting a competitor or SWOT analysis so that your design decisions are not just based on isolated findings but are guided by a comprehensive overview. This highlights opportunities for differentiation and innovation.

Example: Suppose you're designing a budgeting app for a startup. You can check Crunchbase, an online database of startup information, to learn about your competitors' strengths and weaknesses.

If your competitor analysis reveals that all major budgeting apps lack personalized advice features, this shows an opportunity for yours to stand out by offering customized budgeting tips and financial guidance. ‍

Types of secondary research methods in UX

1. Competitive analysis

  • What is it: Competitive analysis involves systematically comparing your product with its competitors in the market. It's a strategic tool that helps identify where your product stands relative to the competition and what unique value proposition it can offer.
  • Best used for: Identifying gaps in the market that your product can fill, understanding user expectations by analyzing what works well in existing products, and pinpointing areas for improvement in your own product.
  • Tools: Google Sheets to organize and visualize your findings; Crunchbase and SimilarWeb to look into competitor performance and market positioning; and UserVoice to get insights into what users say about your competitors.

2. Affinity mapping

  • What is it: A collaborative sorting technique used to organize large sets of information into groups based on their natural relationships.
  • Best used for: Grouping insights from user research, brainstorming sessions, or feedback to identify patterns, themes, and priorities. It helps make sense of qualitative data, such as user interview transcripts, survey responses, or usability test observations.
  • Tools: Miro and FigJam for remote affinity mapping sessions.

3. Customer journey mapping

  • What is it: The process of creating a visual representation of the customer's experience with a product or service over time and across different touchpoints.
  • Best used for: Visualizing the user's path from initial engagement through various interactions to the final goal.
  • Tools: FigJam and Google Sheets for collaborative journey mapping efforts.

4. Literature and academic review

  • What is it: This involves examining existing scholarly articles, books, and other academic publications relevant to your design project. The goal is to deeply understand your project's theoretical foundations, past research findings, and emerging trends.
  • Best used for: Establishing a solid theoretical framework for your design decisions. A literature review can uncover insights into user behavior and design principles that inform your design strategy.
  • Tools: Academic databases like Google Scholar, JSTOR, and specific UX/UI research databases. Reference management tools like Zotero and Mendeley can help organize your sources and streamline the review process.

Challenges of secondary research in UX design

These are the challenges that UX professionals might encounter when carrying out secondary research:

  • Outdated information : In a world where technology changes fast, the information you use must be current, or it might not be helpful.
  • Challenges with pre-existing data : Using data you didn't collect yourself can be tricky because you have less control over its quality. Always review how it was gathered to avoid mistakes.
  • Data isn't just yours : Since secondary data is available to everyone, you won't be the only one using it. This means your competitors can access similar findings or insights.
  • Trustworthiness : Check where your information comes from to make sure it's reliable, and watch out for bias in the data as well.

The mixed-method approach: How primary and secondary research work together

Primary research lays the groundwork, while secondary research weaves a cohesive story and connects the findings to create a concrete design strategy.

Here's how this mixed-method approach works in a sample UX project for a health tech app:

Phase 1: Groundwork and contextualization

  • User interviews and surveys (Primary research) : The team started their project by interviewing patients and healthcare providers. The objective was to uncover the main issues with current health apps and what features could enhance patient care.
  • Industry and academic literature review (Secondary research) : The team also reviewed existing literature on digital health interventions, industry reports on health app trends, and case studies on successful health apps. ‍

Phase 2: Analysis and strategy formulation

  • Affinity mapping (Secondary research) : Insights from the interviews and surveys were organized using affinity mapping. It revealed key pain points like needing more personalized and interactive care plans.
  • Competitive benchmarking (Secondary research) : The team also analyzed competitors’ apps through secondary research to identify common functionalities and gaps. They noticed a lack of personalized patient engagement and, therefore, positioned their app to fill this void in the market. ‍

Phase 3: Design and validation

  • Prototyping : With a good grasp of what users need and the opportunities in the market, the startup created prototypes. These prototypes included AI-powered personalized care plans, medication reminders, and interactive health-tracking tools.
  • Usability testing (Primary research) : The prototypes were tested with a sample of the target user group, including patients and healthcare providers. Feedback was mostly positive, especially for the personalized care plans. This shows that the app has the potential to help patients get more involved in their health. ‍

Phase 4: Refinement and market alignment

  • Improving design through iterations: The team continuously refined the app's design based on feedback from ongoing usability testing.
  • Ongoing market review (Secondary research) : The team watched for new studies, healthcare reports, and competitors' actions. This helped them make sure their app stayed ahead in digital health innovation. ‍

Amplify your design impact and impress your stakeholders in 10+ hours

Primary and secondary research methods are part of a much larger puzzle in UX research.

However, understanding the theoretical part is not enough to make it as a UX designer nowadays.

The reason?

UX design is highly practical and constantly evolving. To succeed in the field, UX designers must do more than just design.

They must understand the bigger picture and know how to deliver business-driven design solutions rather than designs that just look pretty.

Sometimes, the best knowledge comes from those who have been there themselves. That's why finding the right mentor with experience and who can give practical advice is crucial.

In just 10+ hours, the Practical UX Research & Strategy Course dives deep into strategic problem-solving. By the end, you'll know exactly how to make data-backed solutions your stakeholders will get on board with.

Master the end-to-end UX research workflow, from formulating the right user questions to executing your research strategy and effectively presenting your findings to stakeholders.

Learn straight from Mizko—a seasoned industry leader with a track record as a successful designer, $10M+ former agency owner, and advisor for tech startups.

This course equips you with the skills to:

  • Derive actionable insights through objective-driven questions.
  • Conduct unbiased, structured interviews.
  • Select ideal participants for quality data.
  • Create affinity maps from research insights.
  • Execute competitor analysis with expertise.
  • Analyze large data sets and user insights systematically.
  • Transform research and data into actionable frameworks and customer journey maps.
  • Communicate findings effectively and prioritize tasks for your team.
  • Present metrics and objectives that resonate with stakeholders.

Designed for flexible, self-paced learning, this course allows you to progress at your own speed.

With 4000+ designers from top tech companies like Google, Meta, and Squarespace among its alumni, this course empowers UX designers to integrate research skills into their design practices.

Here's what students have to say about the 4.9/5 rated course:

"I'm 100% more confident when talking to stakeholders about User Research & Strategy and the importance of why it needs to be included in the process. I also have gained such a beautiful new understanding of my users that greatly influences my designs. All of the "guesswork" that I was doing is now real, meaningful work that has stats and research behind it." - Booking.com Product Designer Alyssa Durante

"I had no proper clarity of how to conduct a research in a systematically form which actually aligns to the project. Now I have a Step by Step approach from ground 0 to final synthesis." - UX/UI Designer Kaustav Das Biswas

"The most impactful element has been the direct application of the learnings in my recent projects at Amazon. Integrating the insights gained from the course into two significant projects yielded outstanding results, significantly influencing both my career and personal growth. This hands-on experience not only enhanced my proficiency in implementing UX strategies but also bolstered my confidence in guiding, coaching, mentoring, and leading design teams." - Amazon.com UX designer Zohdi Rizvi

Gain expert UX research skills and outshine your competitors.


Mizko, also known as Michael Wong, brings a 14-year track record as a Founder, Educator, Investor, and Designer. His career evolved from lead designer to freelancer, and ultimately to the owner of a successful agency generating over $10M in revenue from Product (UX/UI) Design, Web Design, and No-code Development. His leadership at the agency contributed to the strategy and design for over 50 high-growth startups, aiding them in raising a combined total of over $400M in venture capital.

Notable projects include: Autotrader (acquired by eBay), PhoneWagon (acquired by CallRail), Spaceship ($1B in managed funds), Archistar ($15M+ raised), and many more.

Nielsen Norman Group

World Leaders in Research-Based User Experience

Design Thinking in Practice: Research Methodology

January 10, 2021

Project Overview 

Over the last decade, we have seen design thinking gain popularity across industries. Nielsen Norman Group conducted a long-term research project to understand design thinking in practice. The research project included 3 studies involving more than 1000 participants and took place from 2018 to 2020: 

  • Intercepts and interviews with 87 participants
  • Digital survey with 1067 respondents
  • In-depth case study at an institution practicing design thinking 

The primary goals of the project were to investigate the following:

  • How do practitioners learn and use design thinking?
  • How does design thinking provide value to individuals and organizations?
  • What makes design thinking successful or unsuccessful? 

This description of what we did may be useful in helping you interpret our results and apply them to your own design-thinking practice. 

Project Findings

The findings from this research are shared in the following articles and videos:

  • What Is Design Thinking, Really? (What Practitioners Say) (Article) 
  • How UX Professionals Define Design Thinking in Practice (Video) 
  • Design Thinking: The Learner’s Journey (Article)

Study 1: Intercepts and Interviews

In the first study, we investigated how UX and design professionals define design thinking.

This study consisted of 71 in-person intercepts in Washington DC, San Francisco, Boston, and North Carolina and 16 remote interviews over the phone and via video conferencing. These 87 participants were UX professionals from a diverse range of countries with varying roles and experience.

Intercepts consisted of two questions:

  • What do you think of when you hear the phrase “design thinking”?
  • How would you define design thinking?

Interviews consisted of 10 questions, excluding demographic-related questions:

  • What are the first words that come to mind when I say “design thinking”?
  • Can you tell me more about [word they supplied in response to question 1]?
  • How would you define design thinking? Why?
  • What does it mean to practice design thinking?
  • What are the positive or negative effects of design thinking on:
      ◦ Products and services
      ◦ Clients/customers
  • Using this scale, what is your experience using design thinking?
  • Using this same scale, how successful has design thinking been in your experience?
  • What could have been better?
  • What is good about design thinking? What is bad about design thinking?

Study 2: Digital Survey

Our second study consisted of a qualitative digital survey that ran for two months and had 1067 professional respondents, primarily from UX-related fields. The survey had 14 questions, excluding demographic-related questions. An alternative set of 4 questions was shown to those with little to no experience using design thinking.

  • Which of the following best describes your experience with design thinking?
  • Where did you learn design thinking?
  • How would you rate your organization on the following?
      ◦ UX maturity
      ◦ Frequency of crossteam collaboration
      ◦ User-centered approach
      ◦ Research-driven decision making
  • How often do you, yourself, practice design thinking?
  • In your own words, what does it mean to practice design thinking?
  • When do you use design thinking?
  • What methods or exercises are used?
      ◦ In what situations is each one used and why?
      ◦ Which ones are done individually versus as a group?
      ◦ How is each exercise executed?
  • To what extent does design thinking do the following for your organization?
      ◦ Gives your organization a competitive advantage
      ◦ Drives innovation
      ◦ Fosters collaboration
      ◦ Provides structure to the organization
      ◦ Increases likelihood of success
  • Please describe a situation where design thinking positively influenced your organization and why it was successful.
  • Please describe a situation where design thinking may have negatively influenced your organization and why it was negative.
  • To what extent do you agree or disagree with the following statements?
      ◦ Design thinking negatively affects efficiency.
      ◦ Design thinking requires a collaborative environment to work well.
      ◦ Anyone can learn and practice design thinking.
      ◦ Design thinking is rigid.
      ◦ Design thinking requires all involved to be human-centered.
      ◦ Design thinking takes a lot of time.
      ◦ Design thinking has low return on investment.
      ◦ Design thinking empowers personal growth.
      ◦ Design thinking grows interpersonal relationships.
      ◦ Design thinking improves organizational progress.
The 1067 survey participants had diverse backgrounds: they held varying roles across industries and were located across the globe. 94 responses were invalid, so we excluded them from our analysis.  

The largest group of participants (33%) were UX designers, followed by UX researchers (13%) and UX consultants (12%). 

Percentages of Different Job Roles

Of participants who responded “Other”, the most common response provided was an executive role (n=20). This included roles such as CEO, VP, director, founder, and “head of.” Other mentioned roles included service designer (n=17), manager (n=14), business designer or business analyst (n=11), and educator (including teacher, instructor, and curriculum designer) (n=11).

Geographically, we had respondents from 67 different countries. The largest share of survey participants work in the United States (34%), followed by India (8%), the United Kingdom (7%), and Canada (5%). 

Percentage of Participants by Country

Our survey participants also represented diverse industries, with the largest share in software (22%) and finance or insurance (14%). 

Percentage of Participants by Each Industry

Of participants who responded “Other,” the most common response provided was agency or consulting (n=26), followed by telecommunications (n=17), marketing (n=8), and tourism (n=7).

Study 3: Case Study

Our third and final study consisted of an in-person case study at a large, public ecommerce company. The case study involved 9 interviews with company employees, 6 observation sessions of design-thinking (or related) workshops, and an internal resource and literature audit. 

The interviews were 1-hour long and semistructured. Of the 8 participants, 3 were on the same team but had different roles: 1 UX designer, 1 product manager, and 1 engineer. The other 5 interviewees (3 design leaders and 2 UX designers) worked in different groups across the organization. Each participant completed the same digital survey from the second study prior to interviewing.    

In addition to interviews, we conducted 6 observation sessions: 3 design-thinking workshops, 2 meetings, and 1 lunch-and-learn. After the workshops, all participants were invited to fill out a survey about the workshop. The survey had 5 questions: 

  • We achieved our goal of [x]. 
  • The time and resources spent to conduct the workshop were worth it.
  • What aspects were of greatest value to you, and why? 
  • Were there any aspects you felt were not useful, and why?
  • Will the workshop or its output impact any of your future work? If so, how?
  • What is your role?

Lastly, we conducted a resource and literature audit of the company’s internal resources related to design thinking available to employees.  


InVisionApp, Inc.

Inside Design

4 types of research methods all designers should know

Emily Esposito · Oct 22, 2018

Remember that fifth-grade science project where you learned about primary research for the first time? Like most things we learned in elementary school, you probably didn’t expect it to creep back into your day-to-day adult life. In reality, though, designers have to conduct research and analyze data all the time.

Design research is a critical step in creating the best user experience. It helps you understand your customers’ behavior and turn it into actionable insights to improve your design.


Primary research

Perhaps the most important method in design research, this involves you or your team going directly to the source (your customers) to ask questions and gather data. Most often, the goal is to better understand who you are designing for or to validate your ideas with the actual end user.

Some examples of primary research include:

User interviews

One-on-one interviews are a great place to start when collecting primary research. There are three main types of interviews: directed, non-directed, and ethnographic. Directed interviews are the most common and follow a standard question-and-answer format. Non-directed interviews are used when participants may not feel comfortable with direct questions; instead, the interview is set up as a conversation with some rough guidelines. Ethnographic interviews involve observing people in their day-to-day environment (very similar to the contextual inquiry method covered below).


User groups.

Also known as focus groups, these are structured interviews involving three to six participants. A moderator guides the discussion, and participants provide verbal and written feedback through a series of exercises. This research method is best when you need to get a lot of user insight in a short period of time.

Contextual inquiry

You first ask users a set of standard questions, then observe them in their natural environment as they complete their everyday tasks. It’s not just an interview or an observation—you want to watch people perform tasks as they explain what they are doing and why. This type of research is especially important in the beginning of the design process to learn what is important to users and how they interact with similar tools or services.

Asking users to document their own experience will help you see your product through their eyes.


Diary study

Occurring over an extended period of time (from a week to a month, or even longer), participants are asked to keep a diary and log specific information about their activities. In-situ logging is the simplest way to collect data from diaries—users report all details about the activities as they complete them.
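If diary entries are collected digitally, in-situ logging reduces to a simple timestamped record per activity. Here is a minimal sketch of what that data structure might look like; the field names are illustrative, not a standard format:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DiaryEntry:
    participant_id: str
    activity: str      # what the participant was doing
    notes: str         # free-text detail reported in situ
    logged_at: datetime = field(default_factory=datetime.now)

def entries_per_day(entries):
    """Group a participant's entries by day to see activity patterns over the study."""
    counts = {}
    for e in entries:
        day = e.logged_at.date().isoformat()
        counts[day] = counts.get(day, 0) + 1
    return counts
```

Even a tally like this can show whether participants log consistently or drop off mid-study, which is a common diary-study risk.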

Usability testing

Once you’re deeper into the design process and have a prototype to share, usability testing helps you put that design into the wild to gather feedback. Here, you would ask potential or current users to complete a set of tasks using your prototype.
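The raw output of such sessions is usually per-task pass/fail and timing data. A hedged sketch of how you might summarize it, assuming each session record is a simple dict (the data shape is an assumption, not a fixed format):

```python
def summarize_usability(sessions):
    """sessions: list of dicts like {"task": str, "completed": bool, "seconds": float}.
    Returns completion rate and average time-on-task per task."""
    by_task = {}
    for s in sessions:
        stats = by_task.setdefault(
            s["task"], {"attempts": 0, "completions": 0, "total_seconds": 0.0}
        )
        stats["attempts"] += 1
        stats["completions"] += int(s["completed"])
        stats["total_seconds"] += s["seconds"]
    return {
        task: {
            "completion_rate": round(v["completions"] / v["attempts"], 2),
            "avg_seconds": round(v["total_seconds"] / v["attempts"], 1),
        }
        for task, v in by_task.items()
    }
```

Completion rate and time-on-task are two of the most common usability metrics, so even a small script like this turns session notes into comparable numbers across tasks.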

Secondary research

Secondary research is when you use existing data like books, articles, or the internet to validate or support existing research. You may use secondary research to create a stronger case for your design choices and provide additional insight into what you learned during primary research.

Work with existing content, like presentations or articles, to present a strong case for your design choices.

This type of research method is quick and cheap—all you need is internet access or a library card to start. However, some common challenges with secondary research include not being able to find the specific information you need, or battling outdated, low-quality data. Here are some places where you could gather secondary research:

  • Internal data, like your company database, sales reports, or historical information
  • Government statistics or information from government agencies
  • University research centers
  • Respected magazines and newspapers


Generative or exploratory research.

Generative research, also known as exploratory research, focuses on a deeper understanding of user needs and desires. It is usually conducted at the beginning of the design project when you need to answer basic questions like, “What problem are we solving for our customers?” This discovery phase helps you to identify a design hypothesis and validate it with your customers. You won’t always know what the outcome or answers will be, but they will create a strong foundation to make good design decisions going forward.

You’ll see a lot of overlap between generative research and primary research since the whole point of generative research is to get out and talk to your users. Examples of generative research include interviews, user groups, surveys, and contextual inquiries.

Before you start your research, make sure you know what you intend to learn from the results.

Evaluative research.

After gathering your generative research, you’re prepared to design a solution for your customers. Evaluative research allows you to test that solution, giving users the opportunity to “evaluate” your prototype. Your goal is to collect feedback to help refine and improve the design experience. One of the most popular ways to conduct evaluative research is to have people use your product or service as they think out loud (again, a subset of primary research). A perfect example of this research method is usability studies.

And, for whichever type of evaluative research you choose, there are two types: summative and formative. Summative evaluation emphasizes the outcome more than the process (looking at whether the desired effect is achieved), while formative evaluation is used to strengthen the idea being tested (monitoring the success of a process).

Keep asking questions

How do you decide which research method to use? It depends on what you’re trying to learn. You may start with primary research and find that more questions arise after getting to know your customers better (and that’s a good thing!). These new questions will help you decide what you need to learn next. When in doubt, always follow the questions.


by Emily Esposito

Emily has written for some of the top tech companies, covering everything from creative copywriting to UX design. When she's not writing, she's traveling the world (next stop: Japan!), brewing kombucha, and biking through the Pacific Northwest.


The 4 types of research methods in UI/UX design (and when to use them)

  • User Experience
  • 4 minute read
  • by Rich Staats

Design research is a necessary part of creating a user-centered product. When done right, you’re able to gather data that helps you:

  • Identify and solve relevant design problems.
  • Better understand the product’s end users.
  • Improve your designs based on data-driven research.

Though there are many different ways to collect data and do design research, they can broadly be categorized as either primary, secondary, exploratory, or evaluative research. In this article, we’ll explain these four types of research methods in the context of UI/UX design and when you should use them in your design process.

Primary research

Primary research is the simplest (and perhaps most effective) way to come up with data to get a better understanding of the audience for which you’re designing. The purpose of primary research is to validate design ideas and concepts early on in the design process. The data you collect from primary research allows you to design meaningful, user-centered solutions.

Let’s take a look at some examples of primary research:

Interviews

Conducting interviews with individuals or in small groups is a great starting point, and there are many ways to go about it. Depending on your project, you might conduct direct interviews or indirect interviews. Direct interviews are simple question-answer format interviews, whereas indirect interviews are set up in a more conversational style. You’ll also have to decide whether you’ll interview people in person or remotely.

Focus groups

Focus groups are structured, group interviews in which a moderator guides the discussion. As a UI/UX designer, you might consider using this research method when you need to gather user insight quickly.

Usability testing

Once you develop a prototype, you can recruit test participants and conduct usability tests to uncover foundational issues with the product’s user experience and gather user feedback. The idea is to define user goals and turn them into realistic task scenarios that the test participants would have to complete using your prototype.

Secondary research

Secondary research is when you use existing books, articles, or research material to validate your design ideas and concepts or support your primary research. For example, you might want to use the material you gather from secondary research to:

  • Explain the context behind your UI design.
  • Build a case for your design decisions.
  • Reinforce the data you gathered from primary research.

Generally speaking, secondary research is much easier (and faster) to do than primary research. You’ll be able to find most of the information you need on the internet, in the library, or your company’s archives. Here are some places you can collect secondary research from:

  • Your company’s internal data, which may include information contained in your company’s files, databases and project reports.
  • Client’s research department, e.g. the data your client has regarding user behavior with previous versions of the website/application, user interests, etc.
  • Industry statistics, i.e. the industry’s general consensus, standards and conventions.
  • Relevant books, articles, case studies and magazines.

Websites have evolved a great deal over the last two decades, and so has the way users interact with them. This is why one of the most common challenges with secondary research in UI/UX design is outdated data. In such cases, UI/UX designers resort to other research methods (such as primary research or exploratory research) to gather the data they need.

Exploratory research

Exploratory research is usually conducted at the start of the design process with a purpose to help designers understand the problem they’re trying to solve. As such, it focuses on gathering a thorough understanding of the end user’s needs and goals.

In the Define the Problem stage of the design thinking process, you can use exploratory research techniques to develop a design hypothesis and validate it with the product’s intended user base. By doing so, you’ll be in a better position to make hypothesis-driven design decisions throughout the design process.

You can validate your hypothesis by running experiments. Here are some of the ways you can validate your assumptions depending on where you are in the design process:

  • Conducting interviews and surveys
  • Organizing focus groups
  • Conducting usability tests
  • Running various A/B tests

Essentially, you’re combining exploratory research and primary research techniques to define the problem accurately. You can do this by asking questions that encourage interview participants to explore different design concepts and think outside the box.

Before you begin collecting data, remember to write down the experiment you’re running and define the outcomes that validate your design hypothesis. After doing exploratory research, you should have enough data to begin designing a solution.
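For the A/B-test route specifically, defining the outcome up front usually means comparing conversion rates between two variants and deciding in advance what counts as a real difference. Here is a minimal two-proportion z-test sketch using only the Python standard library; the 0.05 threshold and variable names are illustrative assumptions, not fixed rules:

```python
from math import sqrt
from statistics import NormalDist

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates.
    conv_a/conv_b: conversions; n_a/n_b: participants per variant."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# A common convention: treat the hypothesis as validated when p < 0.05.
```

Writing the threshold down before the experiment runs keeps you from rationalizing a weak result after the fact, which is exactly the discipline the paragraph above recommends.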

Evaluative research

Exploratory research gives you enough data to begin designing a solution. Once you have a prototype on hand, you can use evaluative research to test that solution with real users. The goal of evaluative research is to help designers gather feedback that allows them to improve their product’s design.

There are two main functions of evaluative research: summative and formative.

  • Summative evaluation is all about making a judgment regarding the efficacy of the product once it’s complete.
  • Formative evaluation, on the other hand, focuses on evaluating the product and making improvements (i.e., detecting and eliminating usability problems) during the development process.

For example, you can conduct usability tests in which you ask test participants to use the product to perform a set of tasks. Keep in mind that the purpose of evaluative research is to gather feedback from users regarding your product’s design. In case you’re short on time or low on budget, you can choose to conduct usability studies that fit your time and budget constraints (such as guerrilla usability testing).

Deciding which research method to use depends on what data you’re trying to gather and where you are in the design process. The information you collect through your design research will enable you to make informed design decisions and create better user-centered products.

Let’s quickly recap the four types of research methods UI/UX designers can use in the design process:

  • Primary research  is used to generate data by conducting interviews, surveys, and usability tests and/or organizing focus group sessions.
  • With secondary research,  you’re able to use existing research material to validate your design ideas and support your primary research.
  • Exploratory research  is when you come up with a design hypothesis and run experiments to validate it.
  • Once you have a prototype, you can use evaluative research  to see if there’s any room for improvement.

Which of these research methods do you use in your design process and how? Let us know in the comments section below.



Secondary research: definition, methods, and examples

19 min read

This ultimate guide to secondary research helps you understand changes in market trends, customers’ buying patterns, and your competition using existing data sources.

In situations where you’re not involved in the data gathering process ( primary research ), you have to rely on existing information and data to arrive at specific research conclusions or outcomes. This approach is known as secondary research.

In this article, we’re going to explain what secondary research is, how it works, and share some examples of it in practice.


What is secondary research?

Secondary research, also known as desk research, is a research method that involves compiling existing data sourced from a variety of channels. This includes internal sources (e.g. in-house research) or, more commonly, external sources (such as government statistics, organizational bodies, and the internet).

Secondary research comes in several formats, such as published datasets, reports, and survey responses, and can also be sourced from websites, libraries, and museums.

The information is usually free — or available at a limited access cost — and gathered using surveys, telephone interviews, observation, face-to-face interviews, and more.

When using secondary research, researchers collect, verify, and analyze existing data, then incorporate it to help confirm their research goals for the research period.

As well as the above, it can be used to review previous research into an area of interest. Researchers can look for patterns across data spanning several years and identify trends — or use it to verify early hypothesis statements and establish whether it’s worth continuing research into a prospective area.

How to conduct secondary research

There are five key steps to conducting secondary research effectively and efficiently:

1.    Identify and define the research topic

First, understand what you will be researching and define the topic by thinking about the research questions you want to be answered.

Ask yourself: What is the point of conducting this research? Then, ask: What do we want to achieve?

This may indicate an exploratory reason (why something happened) or confirm a hypothesis. The answers may indicate ideas that need primary or secondary research (or a combination) to investigate them.

2.    Find research and existing data sources

If secondary research is needed, think about where you might find the information. This helps you narrow down your secondary sources to those that help you answer your questions. What keywords do you need to use?

Which organizations are closely working on this topic already? Are there any competitors that you need to be aware of?

Create a list of the data sources, information, and people that could help you with your work.

3.    Begin searching and collecting the existing data

Now that you have the list of data sources, start accessing the data and collect the information into an organized system. This may mean you start setting up research journal accounts or making telephone calls to book meetings with third-party research teams to verify the details around data results.

As you search and access information, remember to check the data’s date, the credibility of the source, the relevance of the material to your research topic, and the methodology used by the third-party researchers. Start small and as you gain results, investigate further in the areas that help your research’s aims.
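Those four checks — the data’s date, the credibility of the source, relevance to your topic, and whether the methodology is documented — can be captured as a simple screening filter. A sketch under stated assumptions: the cutoff values, the 1–5 credibility scale, and the field names are all illustrative, not established standards:

```python
from datetime import date

def screen_source(source, topic_keywords, max_age_years=5,
                  min_credibility=3, today=None):
    """source: dict with 'published' (date), 'credibility' (1-5, illustrative
    scale), 'title' (str), and 'methodology_documented' (bool).
    Returns True if the source passes all four screening checks."""
    today = today or date.today()
    fresh = (today.year - source["published"].year) <= max_age_years
    credible = source["credibility"] >= min_credibility
    relevant = any(k.lower() in source["title"].lower() for k in topic_keywords)
    return fresh and credible and relevant and source["methodology_documented"]
```

Making the screening criteria explicit like this also leaves an audit trail: you can later explain exactly why a source was kept or discarded.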

4.    Combine the data and compare the results

When you have your data in one place, you need to understand, filter, order, and combine it intelligently. Data may come in different formats where some data could be unusable, while other information may need to be deleted.

After this, you can start to look at different data sets to see what they tell you. You may find that you need to compare the same datasets over different periods for changes over time or compare different datasets to notice overlaps or trends. Ask yourself: What does this data mean to my research? Does it help or hinder my research?
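Comparing the same dataset over different periods, as described above, can be sketched in plain Python. The metric names here are placeholders for whatever your datasets actually track:

```python
def period_over_period(current, previous):
    """Percent change for each metric present in both periods.
    current/previous: dicts mapping metric name -> numeric value."""
    changes = {}
    for metric, value in current.items():
        if metric in previous and previous[metric]:
            changes[metric] = round(
                100 * (value - previous[metric]) / previous[metric], 1
            )
    return changes
```

Skipping metrics that are missing (or zero) in the earlier period mirrors the filtering step above: some data will simply be unusable for comparison and is better dropped than forced.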

5.    Analyze your data and explore further

In this last stage of the process, look at the information you have and ask yourself if this answers your original questions for your research. Are there any gaps? Do you understand the information you’ve found? If you feel there is more to cover, repeat the steps and delve deeper into the topic so that you can get all the information you need.

If secondary research can’t provide these answers, consider supplementing your results with data gained from primary research. As you explore further, add to your knowledge and update your findings. This will help you present clear, credible information.

Primary vs secondary research

Unlike secondary research, primary research involves creating data first-hand by directly working with interviewees, target users, or a target market. Primary research focuses on the method for carrying out research, asking questions, and collecting data using approaches such as:

  • Interviews (panel, face-to-face or over the phone)
  • Questionnaires or surveys
  • Focus groups

Using these methods, researchers can get in-depth, targeted responses to questions, making results more accurate and specific to their research goals. However, it does take time to do and administer.

Unlike primary research, secondary research uses existing data, which also includes published results from primary research. Researchers summarize the existing research and use the results to support their research goals.

Both primary and secondary research have their places. Primary research can support the findings found through secondary research (and fill knowledge gaps), while secondary research can be a starting point for further primary research. Because of this, these research methods are often combined for optimal research results that are accurate at both the micro and macro level.

Sources of Secondary Research

There are two types of secondary research sources: internal and external. Internal data refers to in-house data that can be gathered from the researcher’s organization. External data refers to data published outside of and not owned by the researcher’s organization.

Internal data

Internal data is a good first port of call for insights and knowledge, as you may already have relevant information stored in your systems. Because you own this information — and it won’t be available to other researchers — it can give you a competitive edge. Examples of internal data include:

  • Database information on sales history and business goal conversions
  • Information from website applications and mobile site data
  • Customer-generated data on product and service efficiency and use
  • Previous research results or supplemental research areas
  • Previous campaign results

External data

External data is useful when you: 1) need information on a new topic, 2) want to fill in gaps in your knowledge, or 3) want data that breaks down a population or market for trend and pattern analysis. Examples of external data include:

  • Government, non-government agencies, and trade body statistics
  • Company reports and research
  • Competitor research
  • Public library collections
  • Textbooks and research journals
  • Media stories in newspapers
  • Online journals and research sites

Three examples of secondary research methods in action

How and why might you conduct secondary research? Let’s look at a few examples:

1.    Collecting factual information from the internet on a specific topic or market

There are plenty of sites that hold data for people to view and use in their research. For example, Google Scholar, ResearchGate, or Wiley Online Library all provide previous research on a particular topic. Researchers can create free accounts and use the search facilities to look into a topic by keyword, before following the instructions to download or export results for further analysis.

This can be useful for exploring a new market that your organization wants to consider entering. For instance, by viewing the U.S. Census Bureau demographic data for that area, you can see what the demographics of your target audience are, and create compelling marketing campaigns accordingly.

2.    Finding out the views of your target audience on a particular topic

If you’re interested in seeing the historical views on a particular topic, for example, attitudes to women’s rights in the US, you can turn to secondary sources.

Textbooks, news articles, reviews, and journal entries can all provide qualitative reports and interviews covering how people discussed women’s rights. There may be multimedia elements like video or documented posters of propaganda showing biased language usage.

By gathering this information, synthesizing it, and evaluating the language, who created it and when it was shared, you can create a timeline of how a topic was discussed over time.

3.    When you want to know the latest thinking on a topic

Educational institutions, such as schools and colleges, create a lot of research-based reports on younger audiences or their academic specialisms. Dissertations from students can also be submitted to research journals, making these journals useful places to see the latest insights from a new generation of academics.

Information can be requested — and sometimes academic institutions may want to collaborate and conduct research on your behalf. This can provide key primary data in areas that you want to research, as well as secondary data sources for your research.

Advantages of secondary research

There are several benefits of using secondary research, which we’ve outlined below:

  • Easily and readily available data – There is an abundance of readily accessible data sources that have been pre-collected for use, in person at local libraries and online using the internet. This data is usually sorted by filters or can be exported into spreadsheet format, meaning that little technical expertise is needed to access and use the data.
  • Faster research speeds – Since the data is already published and in the public arena, you don’t need to collect this information through primary research. This can make the research easier to do and faster, as you can get started with the data quickly.
  • Low financial and time costs – Most secondary data sources can be accessed for free or at a small cost to the researcher, so the overall research costs are kept low. In addition, by saving on preliminary research, the time costs for the researcher are kept down as well.
  • Secondary data can drive additional research actions – The insights gained can support future research activities (like conducting a follow-up survey or specifying future detailed research topics) or help add value to these activities.
  • Secondary data can be useful pre-research insights – Secondary source data can provide pre-research insights and information on effects that can help resolve whether research should be conducted. It can also help highlight knowledge gaps, so subsequent research can consider this.
  • Ability to scale up results – Secondary sources can include large datasets (like Census data results across several states) so research results can be scaled up quickly using large secondary data sources.

Disadvantages of secondary research

The disadvantages of secondary research are worth considering in advance of conducting research:

  • Secondary research data can be out of date – Secondary sources can be updated regularly, but if you’re exploring the data between two updates, the data can be out of date. Researchers will need to consider whether the data available provides the right research coverage dates, so that insights are accurate and timely, or if the data needs to be updated. Also, fast-moving markets may find secondary data expires very quickly.
  • Secondary research needs to be verified and interpreted – Where there’s a lot of data from one source, a researcher needs to review and analyze it. The data may need to be verified against other data sets or your hypotheses for accuracy and to ensure you’re using the right data for your research.
  • The researcher has had no control over the secondary research – As the researcher has not been involved in the secondary research, invalid data can affect the results. It’s therefore vital that the methodology and controls are closely reviewed so that the data is collected in a systematic and error-free way.
  • Secondary research data is not exclusive – As data sets are commonly available, there is no exclusivity and many researchers can use the same data. This can be problematic where researchers want to have exclusive rights over the research results and risk duplication of research in the future.

When do we conduct secondary research?

Now that you know the basics of secondary research, when do researchers normally conduct secondary research?

It’s often used at the beginning of research, when the researcher is trying to understand the current landscape . In addition, if the research area is new to the researcher, it can form crucial background context to help them understand what information exists already. This can plug knowledge gaps, supplement the researcher’s own learning or add to the research.

Secondary research can also be used in conjunction with primary research. Secondary research can become the formative research that helps pinpoint where further primary research is needed to find out specific information. It can also support or verify the findings from primary research.

You can use secondary research where high levels of control aren’t needed by the researcher, but a lot of knowledge on a topic is required from different angles.

Secondary research should not be used in place of primary research as both are very different and are used for various circumstances.

Questions to ask before conducting secondary research

Before you start your secondary research, ask yourself these questions:

  • Is there similar internal data that we have created for a similar area in the past?

If your organization has past research, it’s best to review this work before starting a new project. The older work may provide you with the answers, and give you a starting dataset and context of how your organization approached the research before. However, be mindful that the work is probably out of date and view it with that note in mind. Read through and look for where this helps your research goals or where more work is needed.

  • What am I trying to achieve with this research?

When you have clear goals, and understand what you need to achieve, you can look for the perfect type of secondary or primary research to support the aims. Different secondary research data will provide you with different information — for example, looking at news stories for a breakdown of your market’s buying patterns won’t be as useful as internal or external e-commerce and sales data sources.

  • How credible will my research be?

If you are looking for credibility, you want to consider how accurate the research results will need to be, and if you can sacrifice credibility for speed by using secondary sources to get you started. Bear in mind which sources you choose — low-credibility data sites, like political party websites that are highly biased to favor their own party, would skew your results.

  • What is the date of the secondary research?

When you’re looking to conduct research, you want the results to be as useful as possible, so using data that is 10 years old won’t be as accurate as using data that was created a year ago. Since a lot can change in a few years, note the date of your research and look for earlier data sets that can tell you a more recent picture of results. One caveat to this is using data collected over a long-term period for comparisons with earlier periods, which can tell you about the rate and direction of change.

  • Can the data sources be verified? Does the information you have check out?

If you can’t verify the data by looking at the research methodology, speaking to the original team or cross-checking the facts with other research, it could be hard to be sure that the data is accurate. Think about whether you can use another source, or if it’s worth doing some supplementary primary research to replicate and verify results to help with this issue.

We created a front-to-back guide on conducting market research, The ultimate guide to conducting market research, so you can understand the research journey with confidence.

In it, you’ll learn more about:

  • What effective market research looks like
  • The use cases for market research
  • The most important steps to conducting market research
  • And how to take action on your research findings

Download the free guide for a clearer view on secondary research and other key research types for your business.


Primary and Secondary Research


You may be hearing and reading a lot about the terms “primary research” and “secondary research.” In fact, they are fancy terms for very simple concepts.

So basically…

If you heard it from  someone else , as “second-hand” information, it’s  secondary research .

If  you were the first person  (the primary person) to discover something, then it’s  primary research .

Secondary research is important because it allows you to catch up on what everyone else has already found and researched (hopefully thoroughly). After a literature review or other form of secondary research, you will be prepared to venture into the topic with confidence because you will know the existing knowledge on the topic.

Primary research is special because you are the first (or one of the few) people to study the phenomena. Considering design is always changing and the ways people react to it are seldom the same (depending on culture, time period, context, and many other factors), it’s very likely your research may be exploring uncharted territory.

Designorate


Design thinking, innovation, user experience and healthcare design

How to Successfully Apply the Inspiration in Design Thinking

Over the last few decades, different models of the design thinking process have been introduced, such as the d.school design thinking, IDEO human-centered design , double diamond, IBM design thinking , Google design sprint, and others. Noticeably, these models share a similar flow of phases: they start with understanding the problem being addressed, reflect it in the form of a prototype, and end with the product or service release and the measurement of outcomes.

Although you may find different names for the stages, all design thinking processes tend to offer an inclusive model that builds an effective solution to the problem while keeping the consumer at the heart of the development process. Accordingly, these processes share the following characteristics:

  • Reflective – the process clearly aims to understand the consumer’s problem and reflect it in the form of a design challenge or brief that can later be transformed into a prototype
  • Iterative – both the team and clients contribute to the ongoing improvement of the created prototypes in order to reach the most efficient solution
  • Measurable – even after the product is released to the market, its success is measured and feedback is collected to evaluate its efficiency and to improve future versions of the product


Related articles:

  • Design Thinking Guide: What, Why and How
  • How Design Thinking Reshaped Microsoft Products
  • How Design Thinking Approaches Problems

Design Thinking: Inspiration

In our previously published Design Thinking Guide , we explored the different processes and the flow of stages in each model. So, how can we conduct each stage, and what tools can we use in each? Here, we follow a generic design thinking process that, in line with the Stanford d.school and IDEO models, includes three main phases: inspiration, ideation, and implementation. We start with the first phase: inspiration.


Before starting the inspiration phase, or at its very beginning, it is essential to clearly set up the project that will be covered during the following phases. This includes defining the challenge, forming the team, and conducting secondary research, as follows.

Defining the Project

At the beginning of the project, there should be a clear understanding of what we need to do. The answer to this question can be expressed in the form of a design challenge and a project plan:

  • The design challenge is a simple statement that frames the project and answers the question: what is the problem that needs to be solved? Reaching the design challenge may involve brainstorming to arrive at the clearest possible definition of the solution needed – for example, improving the public transportation experience for the elderly, or lighting houses in rural areas.
  • The project plan turns the challenge into an actionable project by defining elements such as the timeline, meeting dates, budget, staff, and the different tasks and who will carry out each one.

Building the Team

The team should be interdisciplinary, reflecting the different departments working on the project as well as management representatives, so that there is a clear vision of all the required tasks and of the challenges the project may face, and everyone stays on the same page. According to the Design Thinking at Apple Case Study , developing a new product requires assembling a team that represents the different departments to work on new product ideas.


Secondary Research

Secondary research investigates the existing resources around the challenge being addressed: published research, projects, local news, and existing raw data. Conducting secondary research is crucial for validating the challenge and identifying whether any solutions already exist in the literature. This helps us determine whether we can move forward with the project, and it reduces the cost of the primary research.

Primary Research

Primary research refers to the original research conducted by the team to investigate consumer needs, regardless of any previous literature. While there are various methods that can be used in the course of primary research, qualitative methods such as interviews, focus groups, and observation are efficient tools for building an empathic understanding of the consumer’s needs and experience.

Individual Interviews

You can use interviews not only to get answers to questions but also to build a picture of the user experience by watching consumers as they talk and observing their body language. To conduct an effective interview: 1) the research team should be experienced enough to collect data based on the attendees’ answers and impressions, 2) the attendees should be a representative sample that reflects the end users, and 3) the interview questions should be chosen wisely to answer the targeted research questions. Opening or general questions are helpful to ease the consumer in before jumping to more detailed questions.

While individual interviews can give an in-depth understanding of the consumer through visits to their home or workplace, it is expensive and time-consuming to meet with each person individually.

Group Interviews

Group interviews include more than one participant in order to hear different voices and a diversity of opinions. Although this is less expensive and time-consuming compared with individual interviews, it does not give an insightful view of each participant’s life and experience.

Observation (Immersion)

Observation is one of the essential tools of primary research; it allows the design research team to engage with the consumer’s life by actually being part of it. The team may travel to the consumer’s location, watch how they use different products, and take notes based on their observations. While it is one of the more expensive tools, especially if the consumers live in another country, it helps designers build their solutions on real knowledge they saw and experienced first-hand during their field trips. The IKEA design process provides a successful example of using observation to understand consumers.


The inspiration phase is where the team conducts the design research that can be considered the core of all the subsequent phases. By the end of this phase, the team should have a clear understanding of the design challenge to be addressed and a project plan covering the timeline, budget, and team. This stage is crucial to the success of the project, as it acts as a pivot point that the other stages return to during measurement and iteration after each of the ideation and implementation phases.


Dr Rafiq Elmansy

As an academic and author, I've had the privilege of shaping the design landscape. I teach design at the University of Leeds and am the Programme Leader for the MA Design, focusing on design thinking, design for health, and behavioural design. I've developed and taught several innovative programmes at Wrexham Glyndwr University, Northumbria University, and The American University in Cairo. I'm also a published book author and the proud founder of Designorate.com, a platform that has been instrumental in fostering design innovation. My expertise in design has been recognised by prestigious organizations. I'm a fellow of the Higher Education Academy (HEA), the Design Research Society (FDRS), and an Adobe Education Leader. Over the course of 20 years, I've had the privilege of working with esteemed clients such as the UN, World Bank, Adobe, and Schneider, contributing to their design strategies. For more than 12 years, I collaborated closely with the Adobe team, playing a key role in the development of many Adobe applications.





Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes . Revised on 20 March 2023.

A research design is a strategy for answering your research question  using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Introduction
  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive , allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive , with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation ).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

The table below shows some common types of qualitative design. They often have similar approaches in terms of data collection, but focus on different aspects when analysing the data.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling . The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
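The difference between the two approaches can be sketched in a few lines of Python. The population size, sample size, and seed below are all hypothetical, chosen only for illustration:

```python
import random

# Hypothetical sampling frame: IDs for a population of 10,000 survey panellists.
population = list(range(10_000))

# Simple random (probability) sampling: every individual has an equal, known
# chance of selection, so results can be generalised statistically.
random.seed(42)  # fixed seed so the example is reproducible
probability_sample = random.sample(population, k=200)

# A convenience (non-probability) sample, by contrast, just takes whoever is
# easiest to reach -- here, the first 200 IDs in the frame.
convenience_sample = population[:200]

print(len(probability_sample))       # 200
print(len(set(probability_sample)))  # 200 -- sampled without replacement
```

Note that the convenience sample systematically excludes most of the population, which is exactly the kind of bias to watch for with non-probability methods.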

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations , which events or actions will you count?

If you’re using surveys , which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity has already been established.

Reliability and validity

Reliability means your results can be consistently reproduced , while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
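The three descriptive measures above can be computed with Python’s standard library. The test scores here are invented purely for illustration:

```python
from collections import Counter
import statistics

# Hypothetical test scores from a sample of 12 participants.
scores = [72, 85, 85, 90, 64, 78, 85, 70, 92, 78, 88, 85]

# Distribution: the frequency of each score.
distribution = Counter(scores)

# Central tendency: the mean (average) score.
mean_score = statistics.mean(scores)

# Variability: the sample standard deviation.
spread = statistics.stdev(scores)

print(distribution[85])  # 85 occurs 4 times
print(mean_score)        # 81.0
```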

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
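As a minimal illustration of an association test, Pearson’s correlation coefficient can be computed by hand. The paired measurements below (say, hours of product use against a satisfaction score) are hypothetical:

```python
import math

# Hypothetical paired measurements for six participants.
x = [1, 2, 3, 4, 5, 6]  # hours of product use
y = [2, 4, 5, 4, 5, 7]  # satisfaction score

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Pearson's r: the covariance of x and y, divided by the product of their spreads.
cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
r = cov / math.sqrt(sum((a - mean_x) ** 2 for a in x)
                    * sum((b - mean_y) ** 2 for b in y))

print(round(r, 3))  # 0.878 -- a strong positive association
```

An r near +1 or -1 indicates a strong linear association, and an r near 0 indicates none; as the guide notes, even a strong correlation does not by itself establish cause and effect.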

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalise the variables that you want to measure.
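As a sketch of how such an operational definition might be coded, the function below combines the three indicators mentioned above into one composite score. The 0–10 rating scales and the equal weighting are assumptions for illustration, not an established instrument:

```python
def social_anxiety_score(self_rating, avoidance, physical_symptoms):
    """Operationalise 'social anxiety' as the average of three hypothetical
    indicators, each rated 0-10: a self-rating, behavioural avoidance of
    crowded places, and physical symptoms in social situations."""
    indicators = [self_rating, avoidance, physical_symptoms]
    if not all(0 <= v <= 10 for v in indicators):
        raise ValueError("each indicator must be rated on a 0-10 scale")
    return sum(indicators) / len(indicators)

print(social_anxiety_score(6, 9, 3))  # 6.0
```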

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts, and meanings, use qualitative methods .
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.


McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 21 May 2024, from https://www.scribbr.co.uk/research-methods/research-design/



Design Thinking in Education: Perspectives, Opportunities and Challenges

The article discusses design thinking as a process and mindset for collaboratively finding solutions for wicked problems in a variety of educational settings. Through a systematic literature review the article organizes case studies, reports, theoretical reflections, and other scholarly work to enhance our understanding of the purposes, contexts, benefits, limitations, affordances, constraints, effects and outcomes of design thinking in education. Specifically, the review pursues four questions: (1) What are the characteristics of design thinking that make it particularly fruitful for education? (2) How is design thinking applied in different educational settings? (3) What tools, techniques and methods are characteristic for design thinking? (4) What are the limitations or negative effects of design thinking? The goal of the article is to describe the current knowledge base to gain an improved understanding of the role of design thinking in education, to enhance research communication and discussion of best practice approaches and to chart immediate avenues for research and practice.


Crossan, M. M., & Apaydin, M. (2010). A multi-dimensional framework of organizational innovation: A systematic review of the literature. Journal of Management Studies, 47 (6), 1154–1191. 10.1111/j.1467-6486.2009.00880.x Search in Google Scholar

Dorst, K. (2011). The core of ‘design thinking’ and its application. Design Studies , 32 (6), 521–532. 10.1016/j.destud.2011.07.006 Search in Google Scholar

Douglass, H. (2016). Engineering Encounters: No, David! but Yes, Design! Kindergarten Students Are Introduced to a Design Way of Thinking. Science and Children , 53 (9), 69–75. 10.2505/4/sc16_053_09_69 Search in Google Scholar

Dunne, D., & Martin, R. (2006). Design Thinking and How It Will Change Management Education: An Interview and Discussion. Academy of Management Learning & Education , 5 (4), 512–523. 10.5465/amle.2006.23473212 Search in Google Scholar

Elsbach, K. D., & Stigliani, I. (2018). Design Thinking and Organizational Culture: A Review and Framework for Future Research. Journal of Management , 0149206317744252. 10.1177/0149206317744252 Search in Google Scholar

Eppler, M. J., & Kernbach, S. (2016). Dynagrams: Enhancing design thinking through dynamic diagrams. Design Studies , 47 , 91–117. 10.1016/j.destud.2016.09.001 Search in Google Scholar

Ferguson, R., Barzilai, S., Ben-Zvi, D., Chinn, C. A., Herodotou, C., Hod, Y., Kali, Y., Kukulska-Hulme, A., Kupermintz, H., McAndrew, P., Rienties, B., Sagy, O., Scanlon, E., Sharples, M., Weller, M., & Whitelock, D. (2017). Innovating Pedagogy 2017: Open University Innovation Report 6. Milton Keynes: The Open University, UK. Retrieved April 3, 2018 from https://iet.open.ac.uk/file/innovating-pedagogy-2017.pdf Search in Google Scholar

Ferguson, R., Coughlan, T., Egelandsdal, K., Gaved, M., Herodotou, C., Hillaire, G., ... & Misiejuk, K. (2019). Innovating Pedagogy 2019: Open University Innovation Report 7. Retrieved March 3, 2019 from https://iet.open.ac.uk/file/innovating-pedagogy-2019.pdf Search in Google Scholar

Fabri, M., Andrews, P. C., & Pukki, H. K. (2016). Using design thinking to engage autistic students in participatory design of an online toolkit to help with transition into higher education. Journal of Assistive Technologies , 10 (2), 102–114. 10.1108/JAT-02-2016-0008 Search in Google Scholar

Fouché, J., & Crowley, J. (2017). Kidding around with Design Thinking. Educational Leadership , 75 (2), 65–69. Search in Google Scholar

Fontaine, L. (2014). Learning Design Thinking by Designing Learning Experiences: A Case Study in the Development of Strategic Thinking Skills through the Design of Interactive Museum Exhibitions. Visible Language , 48 (2). Search in Google Scholar

Gallagher, A., & Thordarson, K. (2018). Design Thinking for School Leaders: Five Roles and Mindsets That Ignite Positive Change . ASCD. Search in Google Scholar

Gestwicki, P., & McNely, B. (2012). A case study of a five-step design thinking process in educational museum game design. Proceedings of Meaningful Play . Search in Google Scholar

Glen, R., Suciu, C., Baughn, C. C., & Anson, R. (2015). Teaching design thinking in business schools. The International Journal of Management Education , 13 (2), 182–192. 10.1016/j.ijme.2015.05.001 Search in Google Scholar

Goldman, S., Kabayadondo, Z., Royalty, A., Carroll, M. P., & Roth, B. (2014). Student teams in search of design thinking. In Design Thinking Research (pp. 11–34). Springer. 10.1007/978-3-319-01303-9_2 Search in Google Scholar

Goldschmidt, G. (2017). Design Thinking: A Method or a Gateway into Design Cognition?. She Ji: The Journal of Design, Economics, and Innovation , 3 (2), 107–112. 10.1016/j.sheji.2017.10.009 Search in Google Scholar

Gottlieb, M., Wagner, E., Wagner, A., & Chan, T. (2017). Applying design thinking principles to curricular development in medical education. AEM Education and Training , 1 (1), 21–26. 10.1002/aet2.10003 Search in Google Scholar

Gross, K., & Gross, S. (2016). Transformation: Constructivism, design thinking, and elementary STEAM. Art Education , 69 (6), 36–43. 10.1080/00043125.2016.1224869 Search in Google Scholar

Grots, A., & Creuznacher, I. (2016). Design Thinking: Process or Culture? In Design Thinking for Innovation (pp. 183–191). Springer. Search in Google Scholar

Groth, C. (2017). Making sense through hands: Design and craft practice analysed as embodied cognition . Thesis. Search in Google Scholar

Harth, T., & Panke, S. (2018). Design Thinking in Teacher Education: Preparing Engineering Students for Teaching at Vocational Schools. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 392–407). Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Harth, T. & Panke, S. (2019). Creating Effective Physical Learning Spaces in the Digital Age – Results of a Student-Centered Design Thinking Workshop. In S. Carliner (Ed.), Proceedings of E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 284-294). Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Hawryszkiewycz, I., Pradhan, S., & Agarwal, R. (2015). Design thinking as a framework for fostering creativity in management and information systems teaching programs. In Pacific Asia Conference on Information Systems . AISEL. Search in Google Scholar

Hernández-Ramírez, R. (2018). On Design Thinking, Bullshit, and Innovation. Journal of Science and Technology of the Arts , 10 (3), 2–45. 10.7559/citarj.v10i3.555 Search in Google Scholar

Hodgkinson, G. (2013). Teaching Design Thinking. In J. Herrington, A. Couros & V. Irvine (Eds.), Proceedings of EdMedia 2013--World Conference on Educational Media and Technology (pp. 1520–1524). Victoria, Canada: Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Holzer, A., Gillet, D., & Lanerrouza, M. (2019). Active Interdisciplinary Learning in a Design Thinking Course: Going to Class for a Reason , 906–911. https://doi.org/10.1109/TALE.2018.8615292 10.1109/TALE.2018.8615292 Search in Google Scholar

Jacobs, C. D. (2016). “Making Is Thinking”: The Design Practice of Crafting Strategy. In Design Thinking for Innovation (pp. 131–140). Springer. 10.1007/978-3-319-26100-3_9 Search in Google Scholar

Jensen, C. N., Seager, T. P., & Cook-Davis, A. (2018). LEGO® SERIOUS PLAY® In Multidisciplinary Student Teams. International Journal of Management and Applied Research , 5 (4), 264–280. 10.18646/2056.54.18-020 Search in Google Scholar

Johansson-Sköldberg, U., Woodilla, J., & Çetinkaya, M. (2013). Design thinking: Past, present and possible futures. Creativity and Innovation Management , 22 (2), 121–146. 10.1111/caim.12023 Search in Google Scholar

Jordan, S., & Lande, M. (2016). Additive innovation in design thinking and making. International Journal of Engineering Education , 32 (3), 1438–1444. Search in Google Scholar

Kaptelinin, V., & Nardi, B. (2012). Activity theory in HCI: Fundamentals and Reflections. Synthesis Lectures Human-Centered Informatics , 5 (1), 1–105. 10.2200/S00413ED1V01Y201203HCI013 Search in Google Scholar

Keele, S. (2007). Guidelines for performing systematic literature reviews in software engineering. In Technical report, Ver. 2.3 EBSE Technical Report. EBSE. sn. Search in Google Scholar

Kimbell, L. (2011). Rethinking design thinking: Part I. Design and Culture , 3 (3), 285–306. 10.2752/175470811X13071166525216 Search in Google Scholar

Koria, M., Graff, D., & Karjalainen, T.-M. (2011). Learning design thinking: International design business management at Aalto University. Review on Design, Innovation and Strategic Management , 2 (1), 1–21. Search in Google Scholar

Kwek, S. H. (2011). Innovation in the Classroom: Design Thinking for 21st Century Learning. (Master’s thesis). Retrieved March 3, 2019 from http://www.stanford.edu/group/redlab/cgibin/publications_resources.php Search in Google Scholar

Larson, L. (2017). Engaging Families in the Galleries Using Design Thinking. Journal of Museum Education , 42 (4), 376–384. 10.1080/10598650.2017.1379294 Search in Google Scholar

Leeder, T. (2019). Learning to mentor in sports coaching: A design thinking approach. Sport, Education and Society , 24 (2), 208–211. 10.1080/13573322.2018.1563403 Search in Google Scholar

Lee, D., Yoon, J., & Kang, S.-J. (2015). The Suggestion of Design Thinking Process and its Feasibility Study for Fostering Group Creativity of Elementary-Secondary School Students in Science Education. Journal of The Korean Association For Science Education , 35 , 443–453. 10.14697/jkase.2015.35.3.0443 Search in Google Scholar

Levy, Y., & Ellis, T. J. (2006). A systems approach to conduct an effective literature review in support of information systems research. Informing Science: International Journal of an Emerging Transdiscipline, 9 (1), 181–212. 10.28945/479 Search in Google Scholar

Leifer, L., & Meinel, C. (2016). Manifesto: Design thinking becomes foundational. In Design Thinking Research (pp. 1–4). Springer. 10.1007/978-3-319-19641-1_1 Search in Google Scholar

Leverenz, C. S. (2014). Design thinking and the wicked problem of teaching writing. Computers and Composition , 33 , 1–12. 10.1016/j.compcom.2014.07.001 Search in Google Scholar

Liedtka, J. (2015). Perspective: Linking design thinking with innovation outcomes through cognitive bias reduction. Journal of Product Innovation Management , 32 (6), 925–938. 10.1111/jpim.12163 Search in Google Scholar

Lindberg, T., Meinel, C., & Wagner, R. (2011). Design thinking: A fruitful concept for it development? In Design Thinking (pp. 3–18). Springer Berlin Heidelberg. Search in Google Scholar

Lor, R. (2017). Design Thinking in Education: A Critical Review of Literature. In International academic conference on social sciences and management / Asian conference on education and psychology. conference proceedings (pp. 37–68). Bangkok, Thailand. Search in Google Scholar

Louridas, P. (1999). Design as bricolage: anthropology meets design thinking. Design Studies , 20 (6), 517–535. 10.1016/S0142-694X(98)00044-1 Search in Google Scholar

MacLeod, S., Dodd, J., & Duncan, T. (2015). New museum design cultures: harnessing the potential of design and ‘design thinking’ in museums. Museum Management and Curatorship , 30 (4), 314–341. 10.1080/09647775.2015.1042513 Search in Google Scholar

Martin, R. (2009). The design of business: Why design thinking is the next competitive advantage . Harvard Business Press. Search in Google Scholar

Matthews, J. H., & Wrigley, C. (2017). Design and design thinking in business and management higher education. Journal of Learning Design , 10 (1), 41–54. 10.5204/jld.v9i3.294 Search in Google Scholar

McLaughlin, J. E., Wolcott, M. D., Hubbard, D., Umstead, K., & Rider, T. R. (2019). A qualitative review of the design thinking framework in health professions education. BMC Medical Education , 19 , 98. 10.1186/s12909-019-1528-8 Search in Google Scholar

Melles, G., Howard, Z., & Thompson-Whiteside, S. (2012). Teaching design thinking: Expanding horizons in design education. Procedia – Social and Behavioral Sciences, 31, 162–166. 10.1016/j.sbspro.2011.12.035 Search in Google Scholar

Melles, G., Anderson, N., Barrett, T., & Thompson-Whiteside, S. (2015). Problem finding through design thinking in education. In Inquiry-based learning for multidisciplinary programs: A conceptual and practical resource for educators (pp. 191–209). Emerald Group Publishing Limited. Search in Google Scholar

Micheli, P., Wilner, S. J., Bhatti, S., Mura, M., & Beverland, M. B. (2018). Doing Design Thinking: Conceptual Review, Synthesis and Research Agenda. Journal of Product Innovation Management . Search in Google Scholar

Molinari, A., & Gasparini, A. A. (2019). When Students Design University: A Case Study of Creative Interdisciplinarity between Design Thinking and Humanities. Open Education Studies , 1 (1), 24–52. 10.1515/edu-2019-0002 Search in Google Scholar

Motschnig, R., Pfeiffer, D., Gawin, A., Gawin, P., Steiner, M., & Streli, L. (2019). Enhancing stanford design thinking for kids with digital technologies a participatory action research approach to challenge-based learning . 2018-October . Search in Google Scholar

Munyai, K. (2016). Design Thinking: A Methodology towards Sustainable Problem Solving in Higher Education in South Africa . International Association for the Development of the Information Society. Retrieved from https://eric.ed.gov/?q=title%3a%22Design+Thinking%22&pg=3&id=ED571612 Search in Google Scholar

Mumford, C., Zoller, T., & Proforta, T. (2016). How to Teach Design Thinking within Entrepreneurship- A Practical Guide. In United States Association for Small Business and Entrepreneurship. Conference Proceedings (pp. 1–3). Boca Raton: United States Association for Small Business and Entrepreneurship. Search in Google Scholar

Ohly, S., Plückthun, L., & Kissel, D. (2017). Developing Students’ Creative Self-Efficacy Based on Design-Thinking: Evaluation of an Elective University Course. Psychology Learning & Teaching , 16 (1), 125–132. 10.1177/1475725716681714 Search in Google Scholar

Panke, S. (2016). Creative Needs Assessment in Instructional Design: Selected Examples. In EdMedia+ Innovate Learning (pp. 349–353). Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Panke, S., Gaiser, B., & Werner, B. (2007). Evaluation as Impetus for Innovations in E-learning—Applying personas to the design of community functions. MERLOT Journal of Online Learning and Teaching , 3 (2), 179–190. Search in Google Scholar

Panke, S., Allen, G., & McAvinchey, D. (2014). Re-Envisioning the University Website: Participatory Design Case Study. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 1540–1549). Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Panke, S., & Harth, T. (2018). Design Thinking for Inclusive Community Design:(How) Does it Work? In EdMedia+ Innovate Learning (pp. 284–296). Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Panke, S. (2018a). EdTech Research – Where to Publish, How to Share (Part 3): Social Networks and Identifiers for Sharing Articles and Monitoring Citations. AACE Review. Retrieved on March 3, 2019 from http://www.aace.org/review/edtech-research-publish-share-part-3-social-networks-identifiers-sharing-articles-monitoring-citations/ Search in Google Scholar

Panke, S. (2018b). EdTech Research: Finding, Organizing and Citing Research – Bibliographic Formats & Tools. AACE Review. Retrieved on March 3, 2019 from http://www.aace.org/review/edtech-research-finding-organizing-and-citing-research-bibliographic-formats-tools/ Search in Google Scholar

Parrish, J., Parks, R., & Taylor, A. (2017). Building Bridges with Student Mentoring: A Design Thinking Approach. College and University , 92 (1), 31. Search in Google Scholar

Peters, R. A., & Maatman, J. (2017). Long-Term Trends Accentuate the Import of Creative and Critical Thinking Skills Developed by Design Thinking and Ill-Defined Questions. Teaching Public Administration , 35 (2), 190–208. 10.1177/0144739416680850 Search in Google Scholar

Pope-Ruark, R., Moses, J., & Tham, J. (2019). Iterating the Literature: An Early Annotated Bibliography of Design-Thinking Resources. Journal of Business and Technical Communication , 33 (4), 456–465. 10.1177/1050651919854096 Search in Google Scholar

Primus, D. J., & Sonnenburg, S. (2018). Flow Experience in Design Thinking and Practical Synergies with Lego Serious Play. Creativity Research Journal , 30 (1), 104–112. 10.1080/10400419.2018.1411574 Search in Google Scholar

Pruitt, J., & Grudin, J. (2003, June). Personas: practice and theory. In Proceedings of the 2003 conference on Designing for user experiences (pp. 1–15). ACM. 10.1145/997078.997089 Search in Google Scholar

Purdy, J. P. (2014). What can design thinking offer writing studies? College Composition and Communication , 612–641. Search in Google Scholar

Rauth, I., Köppen, E., Jobst, B., & Meinel, C. (2010). Design thinking: an educational model towards creative confidence. In DS 66-2: Proceedings of the 1st international conference on design creativity (ICDC 2010) . Search in Google Scholar

Razzouk, R., & Shute, V. (2012). What Is Design Thinking and Why Is It Important? Review of Educational Research , 82 (3), 330–348. 10.3102/0034654312457429 Search in Google Scholar

Renard, H. (2014). Cultivating Design Thinking in Students through Material Inquiry. International Journal of Teaching and Learning in Higher Education , 26 (3), 414–424. Search in Google Scholar

Retna, K. S. (2016). Thinking about “Design Thinking”: A Study of Teacher Experiences. Asia Pacific Journal of Education , 36 , 5–19. 10.1080/02188791.2015.1005049 Search in Google Scholar

Rittel, H. W., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4 (2), 155–169. 10.1007/BF01405730 Search in Google Scholar

Roos, D. J., & Grey, J. (2004). Playing Seriously with Science Strategy. Imagination Lab Foundation Working Papers Series (45). Retrieved on March 3, 2019 from http://www.imagilab.org/research_workingpapers.htm#46 Search in Google Scholar

Sanders, E. B. N., & Stappers, P. J. (2008). Co-creation and the new landscapes of design. CoDesign , 4 (1), 5–18. 10.1080/15710880701875068 Search in Google Scholar

Schlenker, L. (2014). Design in Practice: Scenarios for Improving Management Education. In 11th International Conference on Cognition and Exploratory Learning in Digital Age, CELDA (pp. 187–194). Porto, Portugal. Search in Google Scholar

Sharples, M., McAndrew, P., Weller, M., Ferguson, R., FitzGerald, E., Hirst, T., & Whitelock, D. (2014). Innovating pedagogy 2014. Milton Keynes, UK: The Open University. Search in Google Scholar

Sharples, M., de Roock, R., Ferguson, R., Gaved, M., Herodotou, C., Koh, E., … Wong, L. H. (2016). Innovating Pedagogy 2016: Open University Innovation Report 5 . Milton Keynes: The Open University. Search in Google Scholar

Sheehan, N. T., Gujarathi, M. R., Jones, J. C., & Phillips, F. (2018). Using Design Thinking to Write and Publish Novel Teaching Cases: Tips from Experienced Case Authors. Journal of Management Education , 42 (1), 135–160. 10.1177/1052562917741179 Search in Google Scholar

Skaggs, P. (2018). Design Thinking: Empathy through Observation, Experience, and Inquiry. In E. Langran & J. Borup (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 1168–1172). Washington, D.C., United States: Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Sonalkar, N., Mabogunje, A., Pai, G., Krishnan, A., & Roth, B. (2016). Diagnostics for design thinking teams. In Design Thinking Research (pp. 35–51). Springer. 10.1007/978-3-319-19641-1_4 Search in Google Scholar

Suzianti, A., & Atthousi, H. N. (2019). Implementation of design thinking approach in designing learning support tools in the classroom for hearing impaired person (case study: Elementary school students in SLB-B Santi Rama). ACM International Conference Proceeding Series, 75–80. 10.1145/3332324.3332338 Search in Google Scholar

Taheri, M., Unterholzer, T., Hölzle, K., & Meinel, C. (2016). An educational perspective on design thinking learning outcomes. In ISPIM Innovation Symposium (p. 1). The International Society for Professional Innovation Management (ISPIM). Search in Google Scholar

Thoring, K., & Müller, R. M. (2011). Understanding the Creative Mechanisms of Design Thinking: An Evolutionary Approach. Proceedings of the Second Conference on Creativity and Innovation in Design , 137–147. 10.1145/2079216.2079236 Search in Google Scholar

Valentim, N. M. C., Silva, W., & Conte, T. (2017). The students’ perspectives on applying design thinking for the design of mobile applications. In Proceedings of the 39th International Conference on Software Engineering: Software Engineering and Education Track (pp. 77–86). IEEE Press. Search in Google Scholar

Vaughn, M. (2018). How Making and Makerspaces Promote Healthy Mindsets for Learning (p. 8). Search in Google Scholar

van de Grift, T., & Kroeze, R. (2016). Design Thinking as a Tool for Interdisciplinary Education in Health Care. Academic Medicine , 91 (1), 1234–1238. 10.1097/ACM.0000000000001195 Search in Google Scholar

von Thienen, J., Meinel, C., & Nicolai, C. (2014). How design thinking tools help to solve wicked problems. In Design thinking research (pp. 97–102). Springer. 10.1007/978-3-319-01303-9_7 Search in Google Scholar

von Thienen, J., Royalty, A., & Meinel, C. (2017). Design thinking in higher education: How students become dedicated creative problem solvers. In Handbook of research on creative problem-solving skill development in higher education (pp. 306–328). IGI Global. 10.4018/978-1-5225-0643-0.ch014 Search in Google Scholar

von Thienen, J., Clancey, W., Corazza, G., & Meinel, C. (2017). Theoretical Foundations of Design Thinking. Part I: John E. Arnold’s Creative Thinking Theories. In Theoretical Foundations of Design Thinking (pp. 13–40). IGI Global. Search in Google Scholar

von Thienen, J., Meinel, C., & Corazza, G. E. (2017). A short theory of failure. In Electronic Colloquium on Design Thinking Research (Vol. 17, pp. 1–5). Search in Google Scholar

Watson, A. D. (2015). Design Thinking for Life. Art Education , 68 (3), 12–18. 10.1080/00043125.2015.11519317 Search in Google Scholar

Wright, G., & West, R. (2010). Using Design Thinking to Improve Student Innovation. In J. Sanchez & K. Zhang (Eds.), Proceedings of E-Learn 2010--World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 391–395). Orlando, Florida, USA: Association for the Advancement of Computing in Education (AACE). Search in Google Scholar

Wrigley, C., & Straker, K. (2017). Design thinking pedagogy: The educational design ladder. Innovations in Education and Teaching International , 54 (4), 374–385. 10.1080/14703297.2015.1108214 Search in Google Scholar

Webster, J., & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS Quarterly , 26 (2), 13–23. Search in Google Scholar

Yeager, D. S., Romero, C., Paunesku, D., Hulleman, C. S., Schneider, B., Hinojosa, C., … Dweck, C. S. (2016). Using Design Thinking to Improve Psychological Interventions: The Case of the Growth Mindset During the Transition to High School. Journal of Educational Psychology , 108 (3), 374–391. 10.1037/edu0000098 Search in Google Scholar

Zientek, L. R., Werner, J. M., Campuzano, M. V., & Nimon, K. (2018). The use of Google Scholar for research and research dissemination. New Horizons in Adult Education and Human Resource Development , 30 (1), 39–46. 10.1002/nha3.20209 Search in Google Scholar

© 2019 Stefanie Panke, published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 Public License.


Open Education Studies


A meta-synthesis of primary and secondary student design cognition research

  • Published: 12 February 2019
  • Volume 30 , pages 243–274, ( 2020 )


Greg J. Strimel (ORCID: orcid.org/0000-0002-4847-4526), Eunhye Kim, Michael E. Grubbs & Tanner J. Huffman


Design within primary and secondary schools has been increasingly emphasized over the past decade. As a response to this increased interest, qualitative research examining students’ cognitive processes involved in the practices of design has been on the rise. These studies have commonly employed the concurrent think-aloud research methodology to examine and describe an individual’s or group’s thought processes while engaged in a design task. However, the variety of coding schemes used to code and describe the collected think-aloud data has limited the synthesis of findings across design cognition studies, which can be a concern as the synthesis of qualitative studies can potentially lead to the development of more formal and possibly more generalizable theories (Glaser and Strauss in Status passage, Aldine, Chicago, 1971 ). Nevertheless, a study conducted by Grubbs, Strimel, and Kim ( 2018 ) examined the different coding schemes used in analyzing the design cognition of primary and secondary students that were published between 1995 and 2016. Their investigation led to the identification of three distinct themes for the foundation and intent of the various design cognition coding schemes and provided a basis for a more informed meta-synthesis of design cognition research. Therefore, this study examined the design cognition studies identified by Grubbs et al. ( 2018 ) and synthesized both the findings and discussions of each, according to the three coding scheme themes. The results of this investigation can provide deeper insights into primary and secondary students’ design thinking and can help inform design pedagogy.



Atman, C., Adams, R. S., Cardella, M., Turns, J., Mosborg, S., & Saleem, J. (2007). Engineering design processes: A comparison of students and expert practitioners. Journal of Engineering Education, 96 (4), 359–379.

Atman, C. J., & Bursic, K. M. (1998). Verbal protocol analysis as a method to document engineering student design processes. Journal of Engineering Education, 87 (2), 121–132.

Bowers, D. H., & Evans, D. L. (1990). The role of visualization in engineering design. In Proceedings of the NSF symposium on modernization of the engineering design graphics curriculum , Austin, TX.

Crismond, D. P., & Adams, R. S. (2012). A scholarship of integration: The matrix of informed design. Journal of Engineering Education, 101 (4), 738–797. https://doi.org/10.1002/j.2168-9830.2012.tb01127.x .

Critical Appraisal Skills Programme (CASP). 10 questions to help you make sense of qualitative research. Retrieved from http://docs.wixstatic.com/ugd/dded87_25658615020e427da194a325e7773d42.pdf .

Cross, N. (2004). Expertise in design: An overview. Design Studies, 25 (5), 427–441.

De Vries, M. J. (2018). The T and E in STEM: From promise to practice. In M. J. de Vries, S. Fletcher, S. Kruse, P. Labudee, M. Lang, I. Mammes, C. Max, D. Münk, B. Nicholl, J. Strobel, & M. Winterbottom (Eds.), Research in technology education . New York: Waxmann.

El-Zanfaly, D. (2015). [I3] Imitation, iteration and improvisation: Embodied interaction in making and learning. Design Studies, 41 (Part A), 79–109. https://doi.org/10.1016/j.destud.2015.09.002 .

Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data . Cambridge, MA: Massachusetts Institute of Technology Press.

Erwin, E. J., Brotherson, M. J., & Summers, J. A. (2011). Understanding qualitative metasynthesis: Issues and opportunities in early childhood intervention research. Journal of Early Intervention, 33 (3), 186–200.



Strimel, G. J., Kim, E., Grubbs, M. E., et al. A meta-synthesis of primary and secondary student design cognition research. International Journal of Technology and Design Education, 30, 243–274 (2020). https://doi.org/10.1007/s10798-019-09505-9

Accepted: 06 February 2019

Published: 12 February 2019

Issue Date: April 2020

Keywords: Design cognition, Think-aloud protocols, Engineering education, Technology education


Princeton Correspondents on Undergraduate Research

Design Thinking in Research

I remember it like it was just yesterday. The steps of the scientific method: Question. Research. Hypothesis. Experiment. Analysis. Conclusion. I can still hear the monotonous voices of my classmates reciting the six steps to the satisfaction of the middle school science fair judges.

Princeton student researchers working at the Lewis Thomas lab

For our middle school science fair, I had created a web-based calculator that could output an individual’s carbon footprint based on a variety of overlooked environmental factors, like food consumption and public transportation usage. Having worked on the project for several months, I was quite content when I walked into our gym and stood proudly next to my display board. Moments later, the first judge approached my table. Without even introducing himself, he glanced at my board and asked me, “Where’s your hypothesis?” Since my project involved creating a new tool rather than exploring a scientific cause-effect relationship, I told him that I didn’t think a hypothesis made sense for my project. To my dismay, he told me that the lack of a hypothesis was a clear violation of the scientific method, and that consequently my project would not be considered.

This was quite disheartening, especially because I was a sixth grader making my very first attempt at scientific research. But at the same time, I was confident that the scientific method wasn’t an inflexible set of principles that all scientific research had to follow. A few years later, my suspicions were vindicated when my dad recommended a book called Design Thinking by Peter Rowe. While the book deals primarily with building design, its ideas are very applicable in engineering research, where researchers don’t necessarily have hypotheses but rather envision final products. Formally, design thinking is a five- to seven-step process:

Steps to the Design Thinking Process

  • Empathize – observing the world, understanding the need for research in one’s field
  • Define – defining one particular way in which people’s lives could be improved by research
  • Ideate – relentless brainstorming of ideas without judgment or overanalysis
  • Prototype – sketching, modeling, and outlining the implementation of potential solutions
  • Choose – choosing the solutions that provide the highest level of impact without jeopardizing feasibility
  • Implement – creating reality out of an idea
  • Learn – reflecting on the results and rethinking the process for endless improvement
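The steps above form a loop: Learn feeds back into Empathize. For readers who think in code, here is a minimal Python sketch of that loop. Every function and name in it (design_thinking, ideate, and so on) is a hypothetical illustration of how the stages hand off to one another, not an implementation of any real methodology or tool.

```python
# A minimal, hypothetical sketch of the design-thinking loop described above.
# All function names, data, and scoring rules are illustrative assumptions.

def design_thinking(observations, max_iterations=3):
    """Run the Empathize -> Define -> Ideate -> Prototype -> Choose ->
    Implement -> Learn cycle a fixed number of times."""
    solution = None
    for _ in range(max_iterations):
        need = define(observations)                  # Empathize + Define
        ideas = ideate(need)                         # brainstorm without judgment
        prototypes = [prototype(i) for i in ideas]   # sketch/model each idea
        chosen = choose(prototypes)                  # impact vs. feasibility
        solution = implement(chosen)                 # create reality out of an idea
        observations = learn(solution, observations) # reflect, then loop again
    return solution

def define(observations):
    # Pick the most frequently observed pain point as the need to address.
    return max(set(observations), key=observations.count)

def ideate(need):
    # Relentless brainstorming, represented here as generating candidates.
    return [f"idea addressing '{need}' #{n}" for n in range(1, 4)]

def prototype(idea):
    # A stand-in "model" of the idea with a toy impact score.
    return {"idea": idea, "impact": len(idea) % 5, "feasible": True}

def choose(prototypes):
    # Highest impact among the feasible options.
    feasible = [p for p in prototypes if p["feasible"]]
    return max(feasible, key=lambda p: p["impact"])

def implement(chosen):
    return f"shipped: {chosen['idea']}"

def learn(solution, observations):
    # Feed the outcome back in as a new observation for the next cycle.
    return observations + [solution]

result = design_thinking(["slow checkout", "slow checkout", "confusing menu"])
print(result)
```

Note how learn() appends the outcome back into the observations, so each cycle redefines the need with fresh evidence. That feedback is what distinguishes this loop from the linear scientific-method recipe described earlier.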

More generally, advocates of design thinking call it a “method of creative action.” In design thinking, researchers are less concerned with solving one narrowly defined problem than with exploring a broader solution space. In fact, design thinkers don’t necessarily identify a problem or question at all (as the scientific method requires); they are more concerned with reaching a particular goal that improves society.

This view of research is particularly insightful especially in disciplines beyond the scientific realm. One aspect that particularly appeals to me is the relative importance placed on the solution’s impact. In design thinking, researchers empathize. They understand at a personal level the limitations of current solutions. And once they implement their solutions, they learn from the results and dive right back into the entire process. Societal impact is their overall goal – an idea that carries over into humanities and social science research.

The most important aspect, in my opinion, is the freedom of design thinking. In design thinking, the ‘brainstorming’ process and the solution are given the most attention. Design thinkers are primarily concerned with the overall effectiveness of potential solutions, worrying about the individual details afterwards. This inherently promotes a creative and entrepreneurial research process. Combined with the methodology and analysis components of the scientific method, the principles of design thinking help research ideas blossom into realities. In a sense, design thinking repackages the scientific method to create a general research process in non-scientific fields. Artists, fashion designers, and novelists all use design thinking when creating their products.

So while I certainly didn’t impress the judges that day at the science fair, I did learn something far more valuable than anything a display board could teach. To complete a satisfying research project, one doesn’t need to rigorously follow a well-outlined protocol. Often, all one needs is the drive to design creative and impactful solutions.

— Kavi Jain, Engineering Correspondent



Design Thinking

Primary and secondary research

Primary research (also called field research) involves collecting data that does not already exist. It can take many forms, including questionnaires and telephone interviews. Secondary research (also known as desk research) involves the summary, collation, and/or synthesis of existing research, rather than the primary collection of data from, for example, research subjects or experiments.
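As a rough illustration of the field-versus-desk distinction, here is a toy Python lookup. The method lists are common examples chosen for illustration, not an exhaustive taxonomy:

```python
# A toy classifier for the primary (field) vs. secondary (desk) distinction.
# The method lists below are illustrative examples, not a complete taxonomy.

PRIMARY = {            # field research: data collected firsthand
    "questionnaire",
    "telephone interview",
    "user interview",
    "usability test",
}
SECONDARY = {          # desk research: synthesis of existing data
    "literature review",
    "competitor analysis",
    "meta-analysis",
    "industry report review",
}

def classify(method: str) -> str:
    method = method.lower().strip()
    if method in PRIMARY:
        return "primary (field) research"
    if method in SECONDARY:
        return "secondary (desk) research"
    return "unclassified"

print(classify("Telephone Interview"))  # primary (field) research
print(classify("literature review"))    # secondary (desk) research
```

The useful distinction is not the tool itself but where the data comes from: firsthand from people, or secondhand from existing sources.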


4 chemistry lab designs that keep our Bunsens burning

Stephen Blair

May 15, 2024


Chemistry laboratories play a critical role in scientific discovery and innovation, propelling our understanding of the natural world, informing our daily lives and shaping the course of society.

Within chemistry teaching labs, the next generation of scientists can discover and learn. In research labs, chemists unravel the mysteries of matter, forge new compounds, and pioneer breakthroughs that reverberate across disciplines. From advancing medicine and clean energy solutions to training young scientists, chemistry labs spark novel ideas that galvanize scientific progress and promise. 

Design is pivotal in propelling chemistry labs toward greater efficiency, safety and scientific breakthroughs. Here are four projects that show what’s possible with chemistry lab design today. 

A radical mid-century chemistry lab renovation  

Yale Sterling Chemistry Laboratory Renovation  

Higher education facilities around the U.S. face the challenge of renovating mid-century laboratories in existing buildings — some with historical value and importance to the campus fabric. One such laboratory building was the Sterling Chemistry Lab (SCL) on Yale University’s Science Hill. To improve STEM teaching and learning at Yale, this building underwent a major interior transformation while still preserving the beauty of the historic exterior architecture. 

In a bold approach to sustainability and preservation, our design carved out the building interior, inserted state-of-the-art chemistry and biology labs, and married the new STEM environment with the existing building shell. Science is at the forefront of the design and is on display throughout, showing current and prospective students Yale’s commitment to STEM education. Ultimately, this renovation helps the university enhance STEM teaching principles through collaborative learning spaces and hands-on approaches to science education. 

Enhancing capabilities for an R1 research center  

University of Wisconsin-Milwaukee, Chemistry Building  

Chemistry forms the fundamental basis for numerous growing industries, and the demand for STEM professionals is rapidly outpacing that of non-STEM fields. To meet this demand, the University of Wisconsin-Milwaukee (UWM) is embarking on constructing a cutting-edge new chemistry building that will equip its students with the necessary tools to excel across a broad range of industries. 

One of the few R1 research institutions in the Midwest, UWM seeks to provide state-of-the-art science facilities that fully support learning and research as they evolve. The new chemistry building, replacing a 50-year-old facility that has not been updated since 1972, will shift paradigms for how chemistry is taught, studied and shared, and include space for the thousands of students who take chemistry and biochemistry classes each year. 

A mass timber home for bold climate solutions  

California Institute of Technology, Resnick Sustainability Center  

A giant maker space for scientists, Caltech’s Resnick Sustainability Center will be a dynamic hub for critical research into our most pressing climate and sustainability challenges. Chemistry labs are expertly integrated into the larger building program, including physical sciences, life sciences, and engineering. The building is designed not only to support undergraduate education with state-of-the-art teaching laboratories and learning centers but also to maximize wellness by providing daylight-filled environments where students can participate in group activities or individual study and experience a positive, welcoming space open to imaginative inquiry. 

Aligned with the building’s sustainability mission, a soaring, low-carbon, timber-framed atrium houses the center’s social and collaborative spaces, and the swooping glass curtain wall, which floods this multi-story space with natural light, incorporates a mass timber grid shell. This project is targeting LEED Platinum certification. 

Advancing integration at one of the top laboratories in the world  

Johns Hopkins University Applied Physics Laboratory, Building 201  

The Johns Hopkins Applied Physics Laboratory (APL) Building 201 is a veritable wonderland for scientists, boasting cutting-edge tools, equipment and laboratories that push the boundaries of research and development. Whether unraveling the mysteries of the universe, pioneering new materials, or confronting complex national security challenges, Building 201 equips researchers with everything they need to explore new frontiers. 

Chemistry labs are radically interwoven into other disciplines like materials testing, nanotechnologies, and synthetic chemistry. The building’s overall design promotes scientists’ interaction through openness, transparency and visual connectivity; leverages workplace strategies to integrate lab and office environments; and helps to increase connections to the university and outside industries. 


Building a human-centered organization

Breaking down insights from 7 years of research and application.

For those of us helping our enterprises readjust to a new normal, mobilize a response to the global crisis, or prepare to freeze entire businesses, we can’t help but reflect on the essential purpose of these organizations.

Organizations are facing an unprecedented convergence of social, technological, political, and ecological forces. Short-term changes in market dynamics pose an existential threat to businesses that cannot adapt to customer expectations, shifting supply chains, and fluctuations in capacity. Long-term super-crises, from a changing climate to rising economic inequality, are putting ambiguous but fast-approaching expiration dates on our way of life.

As stewards of large enterprises, it’s difficult at times not to notice gaps between our best intentions and our best outcomes. Even the most well-intentioned organizations sometimes stumble over themselves in trying to fulfill their customers’ and communities’ needs.

Some organizations, including IBM, have turned to the values of human-centered design as a framework to balance the needs of the organization with the needs of its users, customers, and community. But many human-centered design movements have struggled to achieve impact. Often relegated to the delivery and execution layers of the organization, these movements often struggle to structurally shift an organization’s value to users, customers, and communities.

Since IBM embarked on its journey to build a sustainable culture of human-centered outcomes in 2013, we’ve built a workforce of over 20,000 human-centered design professionals and trained over 200,000 employees in human-centered practices. We’ve worked through our own stakeholder challenges, and we’ve guided dozens of clients on their journeys. Although our transformation is ongoing, we’ve packaged the lessons we’ve learned into a framework any organization can use to create a more human-centered organization.

We won’t downplay the difficulty of this task. Doing this for your own organization will require a deep understanding of, and empathy for, specific people and circumstances. However, the insights we’ve uncovered from 7 years of working with IBM and our clients have shown up consistently across industries. They’ve been useful for us, and we hope they’ll prove valuable to your journey, too.

The human-centered organization, defined

To guide our path forward, we found it necessary to create a working definition of the ideal organization. We call this the human-centered organization.

The human-centered organization exists to fulfill a purpose for its users, customers, and community, and orients all of its innovation and operations activities around those people. It has instilled the principles of human-centered design and applied them, in their purest form, to every aspect of the organization.

A human-centered organization:

  • focuses on creating better human experiences
  • builds resilience and de-risks innovation through continuous iteration and learning
  • cares as much about the experience of its diverse, empowered teams as it does about its customers
  • intentionally, actively embeds these principles into the fabric of the organization

Of course, building this organization is ongoing—a utopian ideal, if you will, that we continually strive to meet.

Changing how we value businesses

Most analysts in the industry measure the value of human-centered design against traditional business metrics, and the data has not failed to deliver: design-led organizations report higher revenue than traditional organizations, deliver outcomes to market faster, and ultimately outperform the S&P 500.

Metrics like these help human-centered design win allies in the C-suite. Yet, these traditional metrics for business success don’t address many of the drivers that demand human-centered solutions to begin with: the value of an enterprise to its users and customers, the welfare of its employees, and the resilience of the organization in the midst of external threats.

So, how should a human-centered organization measure its value? Some business leaders have been asking a similar question in different contexts.

Larry Fink, the founder and chief executive of the investment firm BlackRock, stated in his 2020 letter to CEOs that because of climate change, “we are on the edge of a fundamental reshaping of finance.” He announced that his firm would put sustainability at the center of its investment strategy, arguing that “a strong sense of purpose and a commitment to stakeholders helps a company connect more deeply to its customers and adjust to the changing demands of society. Ultimately, purpose is the engine of long-term profitability.”

Fink, along with Jamie Dimon and Warren Buffett, has pushed to eliminate quarterly earnings reports in order to disincentivize “short-termism.” The 1950s management theory of stakeholder capitalism has seen a comeback, gaining so much influence in the World Economic Forum that it updated the Davos Manifesto to include the theory for the first time since 1973.

While understanding the performance of an organization through the lens of multiple stakeholders and success metrics isn’t controversial, we’re only beginning to learn how to integrate these lenses into the way leaders run their organizations. However, here’s what we know so far: becoming a human-centered organization involves a fundamental transformation of an organization’s purpose. This is more than just a change in ways of working, skills, or infrastructure; it’s a change to the reason we come to work.

Ask yourself

  • How has your organization’s purpose changed over the years? What triggered those changes?
  • How does your current purpose serve people?
  • Most importantly, how does this purpose trickle down into your employees’ day-to-day operations?

An evolution of purpose, not just operations

Here are 3 forms of transformation we often see take place alongside human-centered transformations. Each of these are a type of operational transformation—an evolution of how work gets done, but not an evolution of why work gets done.

However, each of these can still provide critical support for a human-centered transformation.

“Just drive.” “Where to?” “You’re not hearing me, just drive.”

One of the most common types of transformations we encounter is the agile transformation. This movement intends to empower teams to respond to changing conditions, decreasing the time between design decisions and customer outcomes.

Paired with strong human-centered purpose, agile transformations empower the whole organization to execute on a shared mission. But organizations should take care: increasing delivery capacity alone is not a surefire way to remain competitive in a market of rapidly changing customer expectations. In fact, the opposite often occurs: without a shared understanding of the value delivered to people, iteratively delivering smaller increments of work can result in a fractured customer experience.

"Why don’t you get a design thinker on it?"

Skills transformations create a new organizational capability by bringing in a new skill or discipline. The classic form most commonly related to human-centered organizational transformation is the “design transformation” wherein an organization hires a formally-trained design workforce and simply facilitates their physical integration into the organization.

Alone, introducing any new skill or discipline into an organization generally doesn’t produce meaningful structural change to the organization. However, as an organization’s purpose evolves, it may require new skills to deliver a new class of outcomes.

"The same thing you were doing before, now with AI."

Digital transformations seek to increase efficiency and amplify innovation potential by changing an organization’s operating infrastructure. New digital routes to market are established, new internal tools are developed, and new policies are enacted. Especially today, it may seem more than ever that a failure to “go digital” can make the difference between an organization brought to its knees and an organization poised to help lift a society out of crisis.

When oriented around a clear purpose, digital transformations can support and perpetuate better ways of knowing and acting. But we’ve seen many costly digital transformations fail because they didn’t address the real needs of the people they served. Digital infrastructure hardens bad behaviors as well as good ones. And changes to infrastructure don’t raise an organization’s innovation potential unless employees know how to apply these tools to fulfill the organization’s new purpose.

What does a human-centered organization stand for?

  • It intentionally, actively embeds these principles into the fabric of the enterprise.
  • Everything the enterprise does is focused on driving better experiences for users, customers, and the community.
  • It builds resilience and de-risks innovation through continuous iteration and learning.
  • It invests in the experience of its employees, shareholders, and suppliers as much as it does in that of its users, customers, and the community.

A focus on human outcomes

  • Instead of framing opportunities, projects, and organizational constructs around internal logistics, human-centered organizations frame them from the outside-in: as user or customer needs.
  • Instead of measuring success against metrics relevant only to the organization, human-centered organizations measure success against metrics relevant to users, customers, or the broader community of stakeholders.
  • Instead of designing for abstract, fixed personas, human-centered organizations recognize the dynamic nature of their users and customers, empowering them to actively participate in the design and delivery of their experiences.

It begins at the mission statement and permeates down to every standard an employee needs to meet in each of their tasks. It spans from the front of the marketing funnel to the last conversation with a support agent. It goes from how they compensate executives to how they name fields in performance management systems . Perhaps the most important hallmark of the ideal human-centered organization is that it aligns everything it does to the needs it fulfills for the people it serves.

This principle also reflects the ideals of the largest segment of today’s workforce. According to a 2017 Deloitte Millennial Survey, 9 in 10 Millennials across the world believe that “the success of a business should be measured in terms of more than just its financial performance.”

Even for-profit enterprises in ruthlessly competitive markets should be able to agree with some interpretation of this principle: market share comes from making customers happier than the competition. Yet, in our experience with ourselves and our clients, a focus on human outcomes is perhaps the most challenging principle for an organization to adopt.

We’ve worked with non-profits who frame their purpose as strongly human-centered, yet struggle to orient their organizational infrastructure or nurture their employees’ human-centered practices. Conversely, we’ve seen teams inside enterprises enthusiastically adopt human-centered mindsets, even as their senior leadership teams foster practices that perpetuate poor user experiences.

Diverse, empowered teams

It invests in the experience of its employees, shareholders, and suppliers as much as it does in that of its users, customers, and the community.

  • Instead of investing solely in what customers see, human-centered organizations empower employees, shareholders, and suppliers with experiences that help them drive user and customer outcomes.
  • Instead of building homogeneous teams, human-centered organizations build diverse and inclusive teams.
  • Instead of forcing teams to gain approval for every minor decision, human-centered organizations empower their employees to learn through action.

The design-literate call it “service design.” Human resources departments call it “employee experience.” Danny Meyer, the restaurant magnate behind storied institutions like the Union Square Cafe and Shake Shack, calls it the “virtuous cycle of enlightened hospitality.” Whatever you call it, this principle recognizes that poor user or customer outcomes can be traced back to decisions made by employees and stakeholders.

This isn’t just the responsibility of a human resources department. It encompasses everything an employee touches within an organization, from their sense of psychological safety amongst team members and senior leaders, to the affordances and constraints embedded within organizational infrastructure, to the welfare of their families and dependents. It also expands past the employee, into the experience of an organization’s shareholders and suppliers—from the information they receive, to their cost of doing business.

What does this look like in practice?

Mature enterprises are creating the building blocks that act as both a source of employee empowerment and connective tissue that makes the organization feel cohesive and consistent to everyone. Digital design systems aim to help software development teams deliver higher-quality, more coherent experiences, faster; customer experience management platforms help teams identify experiences to prioritize; and since the definition and boundaries of design are in constant flux, career frameworks help employees envision a future for themselves as next-generation leaders of the organization.

In response to a deluge of studies run by organizations like Deloitte and Google—each linking diversity in management to financial outcomes, and psychological safety to team performance—enterprises are investing in creating a more diverse workforce and inclusive workplace, applying diversity and inclusion targets to hiring and retention practices.

From C-level executives forgoing a year’s worth of compensation to save jobs and preserve employee culture through a crisis, to organizations that equitably redistribute compensation across people at every level, we’re seeing stories of enterprises actively prioritizing the welfare of the employee over the comfort of senior leadership.

Above all, it’s important to note that a transformation will fail if you can’t convince all stakeholders that it will help them fulfill their own (selfish) purposes better, as well as accelerate the organization’s achievements.

“We must constantly remember how important the human factor is to building great teams, to working well with our customers, and to creating amazing organizations,” Royal Philips executive committee member Ronald de Jong told MIT Sloan Management Review. “Being cognizant of the psychodynamics of change and transformation are vital to a company’s long-term success.”

Restless reinvention

  • Instead of falling in love with a solution, human-centered organizations learn to fall in love with the fundamental human need behind it.
  • Instead of planning projects in large waterfall releases, human-centered organizations minimize risk by iteratively developing ideas, making decisions, and delivering outcomes.

Fundamental needs rarely change, even as the ways we address them change constantly. Yesterday’s horse-drawn carriage was a prototype for today’s automobile. Today’s automobile is just another prototype for tomorrow’s transportation breakthrough. 

While the problem is defined by a fundamental human need—getting from A to B—the solution at any point in time works within the constraints and affordances of the era (technological advancements, evolving resources, changing consumer expectations, etc.). Enterprises are resilient when they stay stubborn about the problem they solve. They fail when they stay stubborn about a specific solution.

Accepting this principle seems to result in 2 contradictory phenomena. The first phenomenon is an unwavering focus on a fundamental need. Paradoxically, the most resilient organizations build their identity around a mission that changes slowly, if ever.

But the second phenomenon is a willingness to continuously evolve the methods used to solve the problem. Resilient organizations rapidly iterate and bring to market new ways to serve their customers, employees, and stakeholders.

This is where the value of “going digital” reveals itself today.

Externally, digital (and particularly mobile-first) enterprises can more flexibly service their users and customers through a broad array of channels. Internally, digital enterprises can iterate and retool their business infrastructure faster than “analog” businesses.

But digital enterprises should not be lulled into a false sense of security. Although today’s environment has tipped the balance in their favor, the same cannot be said about the unknowable next paradigm. Restless reinvention isn’t a matter of technology—it’s a matter of an organization’s capacity for change.

  • What do users, customers, and the community say about your organization? What structural barriers within your organization prevent people from focusing on human outcomes?
  • What do employees and shareholders say about your organization? What keeps employees from bringing the best version of themselves to their work?
  • How do the interests of different stakeholders (e.g. users, shareholders, community, employees) conflict with one another?
  • What fundamental human need does or can your organization anchor itself in? How many different ways have you fulfilled the need?

How do people build a human-centered organization?

Putting the principles to use.

In practice, applying the 3 principles discussed above looks different depending on your context.

We often see organizations apply human-centered principles through the adoption of design thinking: a structured approach to help teams deliver human-centered outcomes more often. But enterprises that begin with “design thinking transformations” often fail to really transform. This isn’t to say that the spirit of these efforts is misguided. What is misguided is the common belief that as long as everyone is an expert in design thinking, the organization will successfully become human-centered.

Through our experience coaching and tracking hundreds of thousands of design thinkers across hundreds of distinct job roles, we began to see successful behaviors organize into 4 related but distinct practices. While “design thinking” does play a key role, building a sustainable culture of human-centered outcomes requires all of these practices.

More importantly, we discovered that these practices need to balance each other out in any human-centered organization. Over-committing to one of the practices produces diminishing returns. The absence of any of the 4 greatly undermines an organization’s ability to conduct the other practices effectively.

Explore the practices

Team-level practices:

  • Deliver outcomes (focus on external outcomes): deliver outcomes using human-centered design.
  • Guide teams (focus on internal capability): ensure that a team is effectively practicing to sustainably deliver great human outcomes.

Organization-level practices:

  • Invest in opportunities (focus on external outcomes): deploy and direct human-centered teams to deliver human outcomes aligned to organizational goals.
  • Transform conditions (focus on internal capability): foster sustainable organizational conditions for human-centered design to scale.

from theory to tangible outcomes

For any transformation to win legitimacy, it must quickly prove its ability to deliver tangible outcomes. Building a human-centered delivery capability is often the first step.

You’re a designer working for the IT department at a large car rental company. You’re tasked with adding something the CMO calls “value-added services” to the mobile app. The one you’re working on right now is a customer loyalty rewards feature that gives car renters a daily allowance at local restaurants.

A typical day has you working through wireframes and testing coded prototypes. With limited access to customers, you spend a lot of time listening in on customer calls, studying up on what the giants in the automotive insurance industry are doing, and looking at usage stats on the app. So far, task completion rates hover just below 75%. The app has 4 stars in the App Store. You’re good at your job.

But you see the writing on the wall. Things have been turbulent in the car rental industry for some time as business travelers increasingly turn to services like ride-sharing. The mobile app had just 2 new downloads in the last week.

Having spent hours listening to customer calls, you think you understand the real job to be done: mobility in an unfamiliar city. And though you love cars, you also have a passing interest in urban planning. What if the app was also a public transit pass? What if it connected to an electric scooter service or crowdsourced a bike share? What if it got people free admission into restaurants? You’re full of ideas—if only you didn’t work for a car rental company.

Delivering outcomes is the stated goal of design thinking, Lean UX, and agile development alike. It’s the bread and butter of researchers, designers, front-end developers, and marketers, among others. When business leaders say their organization is “human-centered,” they’re often referring to the way teams deliver outcomes.

This is the most visible of the 4 practices, because it produces the most tangible results—a critical element for any change movement to win legitimacy. For this reason alone, when organizations look to transform, they often build out their human-centered delivery capability first.

To build out this capability, we often see change leaders hire new delivery talent (e.g. formally trained designers), provide training to select teams (e.g. pilot teams), and establish incubators and innovation hubs to achieve quick wins that support program expansion. As programs win legitimacy, change leaders often look to transfer this capability out of innovation hubs and integrate it into their core business.

Like many other enterprises, IBM began its transformation through a rapid increase in its human-centered delivery capability. This consisted of 2 elements: hiring human-centered skillsets (e.g. designers, front-end developers) that the company lacked, and re-skilling its existing workforce in human-centered practices (e.g. design thinking).

In 2013, we hired 100 formally-trained designers and onboarded 7 multi-disciplinary teams (approximately 1,000 employees) working on critical projects across the business. The designers and front-end developers were embedded into those 7 existing teams. We kicked off the integration of these people and the introduction to the new practices with a 1-week face-to-face bootcamp.

  • a funded project
  • how the problem is framed
  • what a team will deliver

  • Great human outcomes: human needs, well fulfilled
  • Inclusive outcomes: solutions that reflect the needs and input of many users and stakeholders
  • Continuously improved outcomes: solutions that continuously improve in response to new information about organizational objectives, technical capacity, and human needs

  • Research users’ needs
  • Synthesize insights
  • Ideate on solutions
  • Prototype concepts
  • Deliver solutions
  • Evaluate outcomes

  • Is the project doing better than it was before?
  • Is the need satisfied, or are people able to satisfy a need?
  • Is the solution more desirable or easier to use than it would have been otherwise?

Investing in opportunities ensures favorable project parameters. Without a strong investor, even the most competent delivery practitioner’s success is suppressed by poor project parameters.

For example, let’s say a highly competent software delivery team gets assigned a new project. If the success criteria are framed as a technical feature, the team must either work to gain support to reframe the opportunity, or deliver the feature as requested—resulting in a poor outcome.

Guiding teams integrates human-centered practices into every part of delivery. Without practitioners guiding teams, the team will be unable to integrate human-centered practices into their day-to-day workflow.

For example, a delivery team may uncover a new insight about their users’ behavior. But if the insight doesn’t get translated into human-centered tasks in the backlog, the insight may be lost in the course of delivery.

Transforming conditions makes delivering outcomes easy. Without transformation allies, a delivery team will be unequipped to conduct basic practices required to deliver good human outcomes.

For example, a business may deploy a delivery team to serve a new customer segment. If organizational policies prevent the team from getting direct access to primary customer research, they will struggle to develop an offering that customers value.

from workshop facilitation to practice integration

Without further actions taken, any “enlightenment” gained in special moments like workshops, design sprints, or innovation incubator projects gives way to the inertial influence of business-as-usual practices. Teams must integrate human-centered practices into their way of working. This requires ongoing operational contact with their upline leadership and the political capital to have the right conversations with the right people at the right time.

You’re still a designer working for the IT department at a large car rental company.

You’ve had some problems with the rest of the team: The engineers don’t build things as designed. Instead, they seem to be prioritizing other things on their own. Those things may be important, but so is rewarding customers for their loyalty. You know that the single biggest ROI from human-centered design is its ability to align everyone around the same customer priorities. But you’re having trouble getting the team focused on this.

At the start of a new release, you get the whole team together—designers, engineers, product owners—to help synthesize transcripts from customer calls and decide on a direction. Sticky notes in hand, your whole team bumbles through personas, scenario maps, and storyboards—but they make it through, and you’re proud of them. You snap photos of each artifact and document them in a shared folder.

As you wrap up, the team congratulates you on a good workshop. “That was great! I feel aligned,” they say. But as you ramp up into the next week, you see the same old behaviors. To the detriment of the newly designed and aligned-upon loyalty rewards experience, the developers still seem to be spending their time on “database migration to managed cloud.”

With a heavy sigh, you figure it’s time to schedule another 1-on-1 with the development manager.

Change movements need to prove concrete value, quickly. But efforts to achieve “quick wins” don’t result in sustainable behavior change if the practices that got them those wins aren’t rapidly integrated into their routines.

Without further actions taken, teams fall back on old habits. But for a select few, these moments of enlightenment are revelatory, and they recognize the need to build the relationships necessary to introduce and integrate human-centered delivery practices into the team’s routine way of working.

Some will refocus their role on the team, while some will turn to “design thinking coaches” or “master facilitators” outside their organization for help. But teams should remember that the goal of this practice is not merely education or workshop facilitation; it’s lasting change in the team’s behavior and mindset over time.

At IBM, after teams came back from our intensive, week-long design bootcamps, we saw some teams go straight back to their old habits. But we also saw teams that truly changed, adopted new ways of working, and delivered transformational outcomes.

We spent a long time researching both archetypes: the failures and the successes. What emerged was an understanding that teams that successfully adapted shared an interesting trait: their success came down to specific people.

These people built good relationships within their cross-functional delivery teams, built strong rapport “up the chain” with senior stakeholders, and leveraged “back-channel” means to spur their teams upward.

We called these people “magic people.”

Since then, we’ve intentionally developed programming to increase the rate at which magic people (now we call them Coaches) emerge: advanced facilitation training, Coach training, communities of practice, and more.

At IBM, being a Coach isn’t a full-time role. The most effective Coaches are on the team, but they have some enhanced responsibilities and respect.

  • how work should get done

  • A human-centered team that regularly frames opportunities as human needs and fulfills those needs through human-centered design practices
  • A diverse, empowered team inclusive of a range of perspectives
  • An agile team capable of rapid iteration and responsiveness to new information about organizational objectives, technical capacity, and human needs

  • Facilitate team alignment
  • Integrate human-centered activities into team rituals
  • Advocate for team needs to stakeholders
  • Mentor individuals
  • Convene communities

  • How much is the team collaborating?
  • How much does the team trust each other?
  • Do team members believe their work is valuable?
  • Do team members believe themselves to have agency over the outcome?
  • Do people feel like they are growing on the team?
  • How long do people stay on the team?
  • How long does it take for the team to deliver human outcomes of a certain quality?
  • How much external input does the team require to deliver improved human outcomes?

Delivering outcomes gives a path to impact. Without teams delivering outcomes, there’s no point in trying to guide teams.

For example, a design facilitator with expertise in guiding teams may be hired into an organization, but without a direct relationship with delivery teams themselves, they can’t directly impact a team’s way of working.

Investing in opportunities aligns the team around a human-centered objective a guide can work with. Without a well-defined human-centered opportunity, any human-centered guidance will conflict with the team’s business objectives.

For example, let’s say a sales team gets assigned a new client. If the team is pressured to achieve a sales target at the expense of client satisfaction, guides will have trouble changing the behavior and mindset of their teams.

Transforming conditions makes delivering outcomes easy. Without transformation allies, a delivery team can’t act on the best—or even the most obvious—guidance.

For example, a business may deploy a delivery team to serve a new customer segment. If the delivery team is blocked from conducting primary customer research, it doesn’t matter how well a guide can fit research into a workflow.

from trusting the process to framing the opportunity

Teams are constrained by the resources and opportunities they’re given. Stakeholders who provide resources and assign opportunities must learn to apply human-centered principles, too.

You’re the head of IT at a car rental company.

Which means you’re also “head of digital.” And with the company increasingly relying on digital channels, it also means you’re “head of transformation.”

You are operationally inclined. You like to sit in the trenches with the teams. You even participated in the customer loyalty rewards workshop the designer led last week. Day-to-day, you’re handling 2 big initiatives: cutting costs by migrating the entire rates engine to a managed cloud, and working with the CMO to create “value-added services” for renters through the mobile app.

The “value-added services” thing keeps you up at night. The CEO thinks this will save the company. But you see the writing on the wall. Things have been turbulent in the car rental industry for some time as business travelers increasingly turn to services like ride-sharing; in fact, the company lost close to 30% of its business in the last 4 years, and your department’s budget for new initiatives this year is half of what it was last year. You’ve been here a while, seen things come and go. But this is different.

The customer loyalty rewards tool finally ships. In the first week after launch, only 32 people used their points through the app. The second week, 36 people used their points. While the team is publicly congratulated for the release, you know in your heart that it was a failure.

You’re tired of pushing out promises of success to the next release. In the ensuing post-mortem with the team, the CMO blurts out, “Our customers are driving around an unfamiliar city, and all we can care about is what car they’re driving?”

The team pauses. You think back to all the things your designer said. What *if* the app was also a public transit pass? What *if* it crowdsourced a bike share? *How might we help people feel like they know where they’re going like it’s their own neighborhood?* The retrospective ends on a tense note, but out of the corner of your eye, you see the designer grinning ear-to-ear.

Teams don’t operate in a vacuum, nor are they ever completely autonomous. A team’s mere formation presumes a problem to be solved; the resources provided to a team presume a solution to be delivered. Team members trying to integrate human-centered practices into their team routines will discover an upper limit imposed by 1 of 2 factors: the resources available to the team or the team’s mandate itself. However, these factors are often decided upon by stakeholders distant from the delivery team and, more to the point, distant from the real users of the very thing they’re funding.

Furthermore, these stakeholders have often been schooled in legacy artifacts like project planning charts, feature/function lists, or estimates of points cleared from a backlog. They often struggle to understand or engage with human-centered teams whose deliverables and artifacts generally look fundamentally different than those of a “traditional” team. Faced with this uncertainty, stakeholders either resist the unknown and suppress human-centered practices, or they “trust the process” without adequately lending support or connecting the team’s work to the organization’s broader objectives.

Confronted with poor resource support or a poorly framed mandate, savvy guides may attempt to remediate the situation by trying to help these organizational stakeholders understand the downstream impact of their practices.

An error often made at this point is to train these stakeholders on the same practices used by a delivery team (e.g. send them to an “executive design thinking workshop”). However, we’ve found that being a good stakeholder is a distinct practice with distinct objectives and methods: identifying human-centered opportunities, framing human-centered problems, properly chartering teams, and then knowing how to evaluate the progress of a human-centered development team.

Not surprisingly, our experience with ourselves and our clients has shown time and again: a strong grasp of this practice is critical for the success of any transformation movement. When business leaders with resources and organizational mandate advocate for human-centered practices, it lends crucial legitimacy to the movement; without their support, the movement risks making change for change’s sake.

At IBM, as we bootstrapped our program, our focus was not on enabling individuals, but on the teams delivering against business priorities. We worked closely with those teams and kept senior leaders engaged and apprised of the teams’ progress. But we’d often run into a problem: While teams generally understood how these new practices integrated with their day-to-day work, their first and second-line management didn’t know how to support them.

We created a design thinking course for executives, but that didn’t work. It wasn’t that these business leaders didn’t understand “design thinking.” It was that they didn’t know how to manage design thinking teams. Ultimately, we caught our error, and redesigned the curriculum for business leaders to specifically address what it takes to be a good customer, stakeholder, and advocate of human-centered design teams.

As our program’s scope grew from a handful of teams to the entirety of the business, our focus shifted from investing directly in strategic projects to giving business leaders the means to grow and support their teams for themselves. Fostering strong relationships with our senior-most executives, we built a corporate communications program that gave these leaders insight into the health of their delivery teams doing the work and the tools to be good stakeholders.

  • a human-centered delivery team
  • upline management support
  • how success is measured
  • who’s on the team
  • when things get done
  • where in the organization teams get deployed and money gets spent

  • Success criteria defined as the fulfillment of human needs
  • Diverse, empowered teams staffed with the necessary perspectives, skills, and authority
  • Teams deployed to and resources invested in opportunities that fulfill the most important needs
  • An environment conducive to agile iteration and responsiveness to new information about organizational objectives, technical capacity, and human needs

  • Reframe organizational objectives as human opportunities
  • Deploy the right amount of resources and diverse, empowered teams
  • Interface between human-centered teams and organizational demands by tracking team progress, advocating for team insight, and celebrating team success
  • Support (and remove blockers to) human-centered methods

  • Are customers willing to advocate for the organization or solution?
  • How fast do customers join us?
  • How fast are customers leaving us?
  • Are teams fulfilling the intended need for the intended user or customer type?
  • Does the intended market segment see the solution or organization as desirable?
  • How many people with this need pay the organization to fulfill it?
  • Is the organization gaining market share?
  • How much does it cost to acquire and service a user?
  • How much do people end up spending?
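Questions like “how fast are customers joining or leaving us?” come down to simple cohort arithmetic. A minimal sketch, in Python; the function names and figures are illustrative, not from the source:

```python
def churn_rate(customers_at_start, customers_lost):
    """Share of the starting customer base lost during the period."""
    return customers_lost / customers_at_start

def acquisition_rate(customers_at_start, customers_gained):
    """New customers gained during the period, relative to the starting base."""
    return customers_gained / customers_at_start

# e.g. a month that starts with 1,000 customers, loses 50, and gains 80
print(churn_rate(1000, 50))        # 0.05
print(acquisition_rate(1000, 80))  # 0.08
```

Tracked period over period, even these crude rates make a customer-experience investment legible to stakeholders who are otherwise schooled in feature lists and backlog points.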

Guiding teams and delivering outcomes generate the return on investment. Without people capable of guiding teams and delivering outcomes, investing in an opportunity may not net any return.

For example, an enterprise may choose to invest in serving a new customer segment. Without a delivery team capable of conducting primary customer research, the investment may fail to pay off as the team struggles to develop an offering customers love, stay for, or pay for.

Transforming conditions reduces internal risk to their investments. Without transformation allies, bold investments in human-centered opportunities may be hindered by limitations in organizational infrastructure.

For example, an organization may choose to invest in a new AI service offering that hinges on the expertise of content designers. But without a formalized career path for content designers, it may be difficult to hire, let alone retain, the new skillset long enough to support a viable offering.

from heroic efforts to business-as-usual

Nobody is hostile to human-centered outcomes. But organizations are resistant to change. Teams daring enough to defy the odds face an uphill battle against the status quo. To achieve a lasting, sustainable culture of great human outcomes, change leaders must make the organization habitable to human-centered practitioners.

You’re still the head of IT at a car rental company.

The company has its back against the wall. It’s undergoing a reinvention, but you know it won’t come easy.

Last year, after a series of failed attempts at bringing “value-added services” to the mobile app, you took a chance on a project you believed was crazy. It was a small win, but now it’s the fastest-growing segment of the business.

You know that others will scoff at you when you proclaim proudly that “We are not here to rent cars—we are here to make every city in America feel like home.” Of course they’d scoff. The infrastructure that has helped the company survive as long as it has tells them otherwise. The company has been renting out cars, not delivering “experiences.” Its lifeblood has always been charging money to rent a capital asset. Your job is to build an entirely new company while managing the old one.

You’ve had one success. Now you’ve been asked to repeat this success everywhere. You are terrified, but you have allies. The CMO is in your court. The design department has been waiting for this moment for years. The CEO is curious and encouraging—mostly.

Until this point, we’ve only discussed the practices that enable a single team to deliver great human outcomes. But for most large organizations, these practices aren’t enough to make substantive changes to an organization’s outcomes. What does it take for these principles to truly impact the organization?

First, it takes an order-of-magnitude increase in the adoption of human-centered principles, at every level of the organization.

Most large enterprises are made of many teams, each of which contributes to a collective outcome. Consider a customer of a typical enterprise software offering: at various parts of the customer’s journey, they will interact with touchpoints owned by the marketing team, the sales team, multiple product squads, and multiple support teams. It is often difficult to value the relatively small improvements made by a few human-centered teams in the midst of many poor customer experiences. If you want to dramatically improve a customer journey, every team needs to operate at their best, in concert.

Second: it takes a willingness to upend the status quo and re-envision the systems and processes that form the backbone of the organization.

Consider a human-centered delivery team operating within a traditional organization. In this environment, competing incentives like short-term revenue targets may prevent investment in the most impactful human outcomes. The effort required to deliver a great human outcome can be described as heroic. Faced with organizational realities beyond their control, many teams would simply choose to take the path of least resistance.

Transforming conditions requires visionary leadership to create a culture where human-centered decision-making is the convenient and obvious thing to do. This practice includes the work of convening stakeholders, forming a vision, and driving alignment across the organization. But it also includes the groundwork of culture change: changing the management system to include measurement of and incentives for great customer experience; building spaces that foster human-centered understanding and creativity; intentionally building pipelines for talent with diverse thought and lived experience; and more.

In 2012, IBM established a small division of the company with a clear mandate: build a sustainable culture of human-centered outcomes.

As we bootstrapped the program, we focused on creating a microclimate that would prove value quickly. This meant working with the most strategic projects in the business to achieve their objectives. But it also meant scouring the market to build a formally-trained human-centered design workforce, defining human-centered design practices, and commissioning the first physical design studios and virtual workplaces to support those practices.

As our movement evolved from inception to explosive growth, we shifted our tactics to enabling the entire business at scale. For example, we shifted our resources away from providing high-touch coaching of project teams, toward building self-service tooling for enablement and measurement and fostering communities of practice. We developed a management system that gave senior business leaders insight into the behavior and health of the delivery teams doing the work. And we created new policies that aligned our measurements and incentives with human outcomes, such as linking Net Promoter Score with a portion of executive compensation.
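The Net Promoter Score mentioned above reduces to a simple formula: the percentage of promoters (ratings of 9–10) minus the percentage of detractors (ratings of 0–6), with passives (7–8) ignored. A minimal sketch, assuming standard 0–10 survey ratings; the sample data is illustrative:

```python
def net_promoter_score(ratings):
    """Net Promoter Score from a list of 0-10 survey ratings.

    Promoters rate 9-10, detractors rate 0-6; passives (7-8) count
    toward the total but neither group. NPS = % promoters - % detractors,
    so it ranges from -100 to +100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# 4 promoters, 2 passives, 2 detractors out of 8 responses
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # 25
```

Tying a portion of compensation to a number like this is what converts “human outcomes” from a slogan into an incentive the management system can actually enforce.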

Now, as we move from explosive growth to continuous evolution, our focus shifts again. Within our walls, we continue to drive increasing standards for excellence into our products and services. But we’ve also shifted our focus to guiding our clients and partners as they begin their human-centered transformation journeys, too.

  • an organization
  • what an organization stands for and how everyone aligns to this standard
  • who gets to participate in innovation and how they’re enabled to succeed
  • the innovation methods used by the whole organization
  • the digital and physical environments available to teams tasked to innovate
  • what teams work on and how their problems are framed
  • measurements, incentives, and infrastructure that impact a team’s ability to innovate

An organization effective at improving the welfare of all people in its stakeholder ecosystem

A resource pool of diverse, inclusive, and empowered teams

A resource pool of enabled human-centered design practitioners

A resilient community of practice that support each others’ practice growth

An organization with practices suited for problems they face

An organization with the physical and digital environments that facilitate cross-functional and cross-organizational collaboration

An organization that frames investments as problems to be solved for people

An organization that measures success by the value delivered to humans

An organization that incentivizes its people based on the value they deliver to humans

An organization with infrastructure that supports human-centered practices

Define a vision for human-centered change

Measure the organization’s external outcomes and internal maturity

Build urgency and align stakeholders

Build a program to identify talent needs, find and screen talent, onboard talent, and retain them through career growth

Define shared language and practices and mature them through awareness, evaluation, adoption, and integration campaigns

Commission new spaces by identifying space needs and creating or procuring physical tools, digital tools, and environments

Identify strategic project engagement criteria and opportunities

Pilot, scale, and integrate new incentives, processes, and infrastructure that enable a human-centered practice

Do employees believe their work is valuable?

Do employees believe they have agency to make outcomes happen?

Do employees feel like they are growing inside the organization?

Are employees staying with the organization?

Are teams applying human-centered practices?

Are teams delivering improved human outcomes?

How long does it take for the average team to deliver human outcomes of a certain quality?
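
Questions like these become a management system only when they are collected and rolled up consistently. As a minimal sketch of that roll-up, the snippet below computes a practice-adoption rate and a median time-to-outcome from hypothetical team records (field names and data are illustrative assumptions, not IBM's actual tooling):

```python
from datetime import date
from statistics import median

# Hypothetical per-team records; field names are illustrative assumptions.
teams = [
    {"uses_hcd": True,  "started": date(2020, 1, 6),  "outcome_delivered": date(2020, 3, 2)},
    {"uses_hcd": True,  "started": date(2020, 2, 3),  "outcome_delivered": date(2020, 4, 27)},
    {"uses_hcd": False, "started": date(2020, 1, 20), "outcome_delivered": None},
]

# "Are teams applying human-centered practices?"
adoption_rate = sum(t["uses_hcd"] for t in teams) / len(teams)

# "How long does it take for the average team to deliver human outcomes?"
cycle_days = [(t["outcome_delivered"] - t["started"]).days
              for t in teams if t["outcome_delivered"]]
median_cycle = median(cycle_days)

print(f"adoption: {adoption_rate:.0%}, median days to outcome: {median_cycle}")
```

The same pattern extends to the employee-belief questions above: record each response, aggregate, and trend the results over time.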

Investing in opportunities provides the legitimacy required to get a change initiative off the ground. Without investor allies, a change initiative may be out of touch with the needs of the organization, perceived as coming from an “ivory tower,” or seen as tinkering unnecessarily with an already precariously balanced ecosystem.

For example, we saw an enterprise attempt to transform the way its sellers were incentivized in order to improve long-term client relationship health. The transformation wasn’t sponsored by sales leaders, so the reform was perceived as a threat to quarterly sales targets.

Guiding teams helps practitioners delivering outcomes navigate transformed conditions. Without guidance, practitioners may have difficulty understanding how to take advantage of newly transformed conditions.

For example, we saw a regional government adopt matrix management in order to facilitate collaboration across a citizen journey. Without guidance on how to behave in this new paradigm and integrate these behaviors into the team’s workflow, the new management structure failed to overcome the cultural silos that inhibited cross-organizational collaboration.

Delivering outcomes proves the value of the change. Without delivering outcomes, it will be difficult to retain continued support for any initiative to transform conditions.

For example, an organization deployed formally trained designers to strategic projects. But because those teams lacked the research skills needed to deliver improved outcomes, the initiative to build a formally trained design capability lost legitimacy.

  • Which of these practices have your organization’s past transformations focused on?
  • To what extent do these practices exist in your organization today?
  • Which ones are the strongest? Which are the weakest?

It starts with the first step

For those scrambling to respond to our current crisis, we understand that it may feel like an inopportune moment to introspect on your organization’s principles and practices. But in times like these, focusing on human outcomes is more important than ever.

Becoming human-centered isn’t a head thing. It’s a heart thing. We’ve found, at least for us, it’s best to learn through doing. You don’t have to license a training platform for your company. You don’t have to fully—or even partially—understand how your whole company gets “enabled.” What you need to do is to start doing it.

Find a team that’s working on something critical. If you can, start with a couple of teams. See if their leadership wants to deliver something better, faster, and for less than it would normally cost.

You’ll need some human-centered expertise on these teams, but you don’t all need to be experts. Within a few weeks you’ll know whether these approaches are working.

Let us know your thoughts

Every story, idea, and circumstance helps us either validate or improve upon this point of view. We’re always eager to talk with people who believe systems should work in service of people—especially those striving to make that a reality. If you have feedback to offer, want to share a case study, saw your own story within our insights, or want to learn more about our research, contact us.


Systems Thinking in Action: Undergraduate Research Takes On Complex Problems

The projects presented at this year’s IEEE Systems and Information Engineering Design Symposium, hosted by the UVA School of Engineering and Applied Sciences, explored alternative solutions to a wide range of problems: a team of Duke University students worked to improve audible safety alerts for freedivers; a Purdue University research group applied systems thinking to assess the performance of spaceports (the sites for launching or receiving spacecraft); and students from the UVA School of Data Science harnessed artificial intelligence as a method for detecting signs of human trafficking in state-level court cases.

One UVA team from the Department of Systems and Information Engineering sought to help neurodiverse members of the Charlottesville community gain meaningful employment.

Building Bridges

Through a local business called VIAble Ventures, the VIA Center for Neurodevelopment provides adults with intellectual and developmental disabilities with jobs as artisans, making candles, bath salts, and sachets and selling them online and at Charlottesville farmers markets. For Sophie Kikuchi, a fourth-year student in the Department of Systems and Information Engineering, the service-minded business was a dream client for a systems engineering project.

In research showcased at the symposium, Kikuchi and her teammates, advised by systems engineering professors Sara Riggs and Robert Riggs, applied systems thinking to help VIAble Ventures boost its sales and employ more adults with autism. Focusing on online sales, the group found ways to highlight top-selling products on the VIAble Ventures website and put the business’s unique mission front and center online.

“The mission of VIAble Ventures is just one that I've really grown to love,” Kikuchi said. “I feel like it's making an impact in the sense that, hopefully, getting the new website up will help increase the sales and employ more people with autism. The program does a lot of on-site job training and soft-skill teaching. It’s a great stepping stone for these individuals to be able to gain the experience they need to earn a paycheck.”

Kikuchi’s team presented one of the 100 papers featured this year at UVA’s design symposium, which has become one of the field’s leading student-focused forums for applied research, development and design over the past 20 years. Joining Kikuchi and her peers from UVA systems and information engineering at the conference were students from 32 higher education institutions in total, nine of which were international.

From Consulting to Co-design  

Another student team from the Department of Systems and Information Engineering looked at ways to streamline operations for outpatient cancer infusion centers, which will need to meet the nation’s growing demand for cancer care.

Rupa Valdez, an associate professor in the Department of Systems and Information Engineering, advised the team as they closely examined the workings of an outpatient cancer infusion center in the Mid-Atlantic region. “It’s exciting to see students learn to integrate their quantitative skills with qualitative approaches to understanding system complexity,” Valdez said. “I also find it meaningful to watch students build trusting relationships with clients throughout the project, enabling us to move from a consulting to a co-design model.”

Kikuchi was able to experience that same shift to co-designing solutions with the client while working with VIAble Ventures. “It’s been really fun working with the people at VIA,” Kikuchi said. “They’ve been so responsive. They like to be very hands-on and give feedback on new website designs. They’re excited about it, too, which has made the experience even better on our end.”

For students, presenting projects at the Systems and Information Engineering Design Symposium each year allows them not only to showcase their own work, but to see the vast array of subjects that systems engineers can explore, Kikuchi said. “It really shows how broadly you can apply the education you get through UVA systems engineering,” she said.

Explore Systems and Information Engineering

Many of the new technologies being introduced today hold the promise of substantial societal benefit if we successfully apply the technologies to solve real-world problems, such as treating and curing diseases and protecting critical systems from cyber attacks.


Apply Now for the Experiential Learning Design Accelerator

Have you been thinking about designing or redesigning an undergraduate course to integrate a community-based, experiential learning component? Do you already have a relationship with a community group or organization that could partner with you on the course? If so, this new opportunity is for you.

This summer, a small cohort of CU Boulder faculty will join a University of Arizona program to support the design of undergraduate courses that integrate community-engaged, experiential learning opportunities. The program starts with a five-day, virtual, intensive workshop and offers the opportunity to participate in a year-long community of learning. The program introduces faculty to a human-centered design approach for integrating external stakeholders and real projects in undergraduate courses. In this transformative experiential learning model, students work alongside their peers and community partners to develop desirable, feasible, and viable solutions to challenges informed by partners’ needs.

The Experiential Learning Design Accelerator is offered by the University of Arizona, in partnership with CU Engage, and with support from CU Boulder PACES and the Center for Teaching and Learning. The University of Arizona will provide mentorship and a community of practice among faculty who have successfully developed scholarship, including grant proposals, papers and presentations based on engaged learning pedagogy. CU Engage will support the CU Boulder cohort in connecting to a CU Boulder community of scholars interested in community engaged scholarship.

The five-day, online summer intensive will take place August 12-16:

  • 10 am-4 pm, Monday-Friday, for faculty
  • 10 am-4 pm Friday for community partners, plus an invitation to participate at different times throughout the week

Faculty participants and their community partners (one partner per faculty member) will receive stipends to support their participation: $2,000 for faculty and $1,000 for community partners.

Apply by May 8 by submitting:

  • A 3-minute video pitch
  • A brief informational form

Strong applications will present an idea for (re)design of an undergraduate course that builds on an existing relationship between a faculty (tenure stream, teaching, research) member and an (off-campus) community partner.

Apply for the design accelerator here.

Contact Vandna Sinha ([email protected]) with questions about the program, eligibility, or the application.



