Framework: Context Analysis of Technologies in Social Change Projects
Context analysis helps you understand the elements of an environment and a group of potential users so that you can design a better technology project. It should involve key stakeholders, including implementing partners, donors, local and national authorities, and community members. We suggest five key lines of inquiry that context analyses should consider:

People: Levels of education and literacy; information habits and needs; access to disposable income for equipment; electrical power to charge devices; airtime and data to run them; and network access.

Community: How membership of specific groups may affect access to technology and communication habits. For example, a nomadic clan may have characteristics shared by its members, with variations in levels of access and freedom within the clan differentiated by gender and age.

Market environment: An understanding of the key players; legal and regulatory issues; the mobile market, including both cost and distribution of agent networks; and the infrastructure, including commercial mobile infrastructure such as the availability of short codes and APIs. All are critical to making good design decisions.

Political environment: An understanding of governance and control of, and access to, communications infrastructure by government and other actors.

Implementing organization: Many interventions have failed because staff were unable to maintain the technology, because power or internet access was unreliable, because staff capacity was low or was lost, or because the intervention was not supported by a broader culture of innovation and adaptive learning.
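The five lines of inquiry lend themselves to a reusable checklist for planning fieldwork. Below is a minimal sketch that organizes them into an interview guide; the guiding questions are illustrative assumptions drawn from the framework text, not an official instrument.

```python
# A minimal sketch: the five lines of inquiry as a checklist structure.
# The area names come from the framework above; the questions under each
# area are illustrative assumptions, not part of the framework itself.
CONTEXT_INQUIRY = {
    "People": [
        "What are levels of education and literacy?",
        "Do users have income for equipment, power, airtime, and data?",
    ],
    "Community": [
        "How does group membership (e.g. gender, age) shape access?",
    ],
    "Market environment": [
        "Who are the key players? What legal/regulatory issues apply?",
        "Are short codes and APIs commercially available?",
    ],
    "Political environment": [
        "Who governs and controls communications infrastructure?",
    ],
    "Implementing organization": [
        "Can staff maintain the technology over time?",
    ],
}

def interview_guide(lines_of_inquiry):
    """Flatten the checklist into a numbered interview guide."""
    guide = []
    n = 1
    for area, questions in lines_of_inquiry.items():
        for q in questions:
            guide.append(f"{n}. [{area}] {q}")
            n += 1
    return guide

for item in interview_guide(CONTEXT_INQUIRY):
    print(item)
```

Keeping the areas and questions as data rather than prose makes it easy to adapt the guide per project while still covering every line of inquiry.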
Science Forum: Ten common statistical mistakes to watch out for when writing or reviewing a manuscript | eLife
A Field Guide to “Fake News” and Other Information Disorders
A Field Guide to “Fake News” and Other Information Disorders explores the use of digital methods to study false viral news, political memes, trolling practices and their social life online. It responds to an increasing demand for understanding the interplay between digital platforms, misleading information, propaganda and viral content practices, and their influence on politics and public life in democratic societies.
Diary Studies: Understanding Long-Term User Behavior and Experiences
Search for “diary studies” - User Experience Magazine
Pinboard: bookmarks for adrianh tagged 'diarystudy'
resources on diary studies
Working within resource constraints: a qualitative segmentation study: Journal of Strategic Marketing: Vol 0, No 0
What, who, when: 3 steps for planning market research | Pearson Insight
Two of the most valuable resources out there on scientific writing
Qualitative Research for Social Marketing: One Organization’s Journey to Improved Consumer Insight
In this paper, we describe how PSI's qualitative research program developed from 2003 to 2013, and how using an interpretive approach and more appropriate data collection methods improved our consumer insight and marketing planning process.
The Customer-Centered Innovation Map
original “jobs to be done” article from 2008
“How to Map a Customer Job” – Anthony Ulwick
So how is it done? We’ve found that all jobs have the same eight steps: define, locate, prepare, confirm, execute, monitor, modify, and conclude. To use job mapping, we look for opportunities to help customers at every step.
Challenge Mapping Part 1 - Challenge Map Basics — 7 League Studio
There are a few enormous benefits to using challenge maps. First, challenge maps help teams surface the key decision points that will have the greatest potential impact, both for users and the business. Challenge maps also help teams get aligned and on the same page about the most impactful next step. Finally, and maybe most importantly, challenge maps help teams see where their thinking has been too limited, inspire fresh thinking, and unlock innovation.
Writing Publishable Mixed Research Articles: Guidelines for Emerging Scholars in the Health Sciences and Beyond
Hello, and Thanks for All the Fish: Tips for effective research recruiting
Data Playbook Toolkit | Global Disaster Preparedness Center
The Data Playbook Beta is a recipe book or exercise book with examples, best practices, how-tos, session plans, training materials, matrices, scenarios, and resources. The data playbook will provide resources for National Societies to develop their literacy around data, including responsible data use and data protection. The content aims to be visual, remixable, collaborative, useful, and informative. There are nine modules, 65 pieces of content, and a methodology for sharing curriculum across all the sectors and networks. Material has been compiled, piloted, and tested in collaboration with many contributors from across IFRC and National Societies. Each module has a recipe that puts our raw materials in suggested steps to reach a learning objective. To help support you in creating your own recipe, we also include a listing of 'ingredients' for a topic, organised by type:
What’s Wrong With Your Survey? How to Reduce Error and Increase Reliability
Your Friendly Guide to Colors in Data Visualisation | Chartable
Evaluating Effect Size in Psychological Research: Sense and Nonsense - David C. Funder, Daniel J. Ozer, 2019
Daniel J. O’Keefe PUBLICATIONS AND PAPERS
research on health comm messaging effects
Message Pretesting Using Perceived Persuasiveness Measures: Reconsidering the Correlational Evidence: Communication Methods and Measures: Vol 0, No 0
Why and how to use video in research :: Social Change
How to spot a statistical problem: advice for a non-statistical reviewer | BMC Medicine | Full Text
Back to Reality: The Challenges and Joys of Conducting User Research in VR — UIE’s All You Can Learn Library
Quanta Magazine – Illuminating Science | Quanta Magazine
4 Ways to Turn Eye-Glazing Data Into Eye-Opening Stories | Inc.com
User Testing with Sensitive Data (Video)
How to conduct user research for systems with confidential or otherwise sensitive data, for example in domains like healthcare or financial services, where it can be problematic to record screens or otherwise share the user's information.
Cognitive Mapping in User Research
Unconventional Techniques for Better Insights from Satisfaction Surveys
Priming and User Interfaces
Summary: Exposure to a stimulus influences behavior in subsequent, possibly unrelated tasks. This is called priming; priming effects abound in usability and web design.
A Fundamental Mind Shift For Usability Testing - Jared M. Spool - Medium
This idea, that five to eight users will reveal 85% of all usability problems, is an old myth. It’s not true. It’s never been true.
What science reporters should know about meta-analyses before covering them
Reference Collection to push back against “Common Statistical Myths” - data analysis - Datamethods Discussion Forum
How to ask Why: Stated versus Revealed Preference Research
centre himalayan studies
Affinity Diagramming: Collaborate, Sort and Prioritize UX Ideas (Video)
Green Institute Nepal
Social media analytics: A practical guidebook for journalists and other media professionals | Publications | DW Akademie | DW | 17.07.2019
This guidebook helps media professionals at small media houses develop a better understanding of how to use data for improving their social media performance. Also includes worksheets and templates.
A Favorite User Research Trick - Jared M. Spool - Medium
How to use Screening Questions to Select the Right Participants for User Research
2019 UX Research Tools Map
How to create a better research poster in less time (including templates) - YouTube
Every field in science uses the same, old, wall-of-text poster design. If we can improve the knowledge transfer efficiency of that design even by a little bit, it could have massive ripple effects on all of science. Also, poster sessions tend to suck, so here's my pitch to make them more efficient AND more fun with a new approach to designing scientific posters/academic posters that is both more usable, and easier to create!
Social and Behavior Change Monitoring Guidance | Breakthrough ACTION and RESEARCH
Breakthrough ACTION has distilled guidance on social and behavior change (SBC) monitoring methods into a collection of technical notes. Each note provides an overview of a monitoring method that may be used for SBC programs along with a description of when to use the method and its strengths and weaknesses.
Doing ethical research with vulnerable users – Bernard Tyers
Dot Voting: A Simple Decision-Making and Prioritizing Technique in UX
Unsupervised word embeddings capture latent knowledge from materials science literature | Nature
Here we show that materials science knowledge present in the published literature can be efficiently encoded as information-dense word embeddings [11, 12, 13] (vector representations of words) without human labelling or supervision. Without any explicit insertion of chemical knowledge, these embeddings capture complex materials science concepts such as the underlying structure of the periodic table and structure–property relationships in materials. Furthermore, we demonstrate that an unsupervised method can recommend materials for functional applications several years before their discovery. This suggests that latent knowledge regarding future discoveries is to a large extent embedded in past publications. Our findings highlight the possibility of extracting knowledge and relationships from the massive body of scientific literature in a collective manner, and point towards a generalized approach to the mining of scientific literature.
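The paper trains skip-gram embeddings on millions of abstracts; as a toy sketch of the general intuition only (not the paper's method), one can build co-occurrence-based word vectors from a tiny made-up corpus and compare words by cosine similarity. The corpus sentences and window size here are illustrative assumptions.

```python
import math
from collections import defaultdict

# Toy sketch: co-occurrence word vectors, not the skip-gram embeddings
# used in the paper. Words appearing in similar contexts end up with
# similar vectors, which is the intuition behind the paper's result.
corpus = [
    "the cathode material licoo2 stores lithium",
    "the cathode material limn2o4 stores lithium",
    "the semiconductor crystal silicon conducts electrons",
    "the semiconductor crystal germanium conducts electrons",
]

window = 2
cooc = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    tokens = sentence.split()
    for i, w in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if i != j:
                cooc[w][tokens[j]] += 1  # count words within the window

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (dicts)."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0) * v.get(k, 0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Words sharing a context (cathode materials) are closer to each other
# than to words from an unrelated context (semiconductors).
sim_cathodes = cosine(cooc["licoo2"], cooc["limn2o4"])
sim_cross = cosine(cooc["licoo2"], cooc["silicon"])
print(sim_cathodes, sim_cross)
```

At the paper's scale this same "similar context, similar vector" property is what lets embeddings recover relationships like the periodic table's structure without supervision.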
How You Can Have More Impact as a People Analyst
Making Personas Truly Valuable by Making Them Scenario-based
Two new Online Tools for investigating Behaviour Change: The Theory & Techniques Tool and the Behaviour Change Technique Study Repository - Human Behaviour Change Project (HBCP)
Understanding how and why people change - Journal of Marketing Management
We applied a Hidden Markov Model (see Figure 1) to examine how and why behaviours did or did not change. The longitudinal repeated-measures design meant we knew about food waste behaviour at two points (the amount of food wasted before and after the program), changes in the amount of food wasted reported over time for each household (more or less food wasted), and other factors (e.g. self-efficacy). By using a new method we could extend our understanding beyond the overall effect (households in the Waste Not Want Not program group wasted less food after participating compared to the control group).
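The study fits a Hidden Markov Model to repeated household measurements. As a minimal sketch of the underlying machinery only — two hypothetical hidden states, two observation symbols, and made-up probabilities, none taken from the study — the forward algorithm computes the likelihood of an observed waste sequence:

```python
# Minimal forward algorithm for a two-state HMM. States, symbols, and all
# probabilities are illustrative assumptions, not values from the study.
states = ["high_waster", "low_waster"]
obs_symbols = ["waste_high", "waste_low"]

start = {"high_waster": 0.6, "low_waster": 0.4}  # initial state probabilities
trans = {                                         # state transition probabilities
    "high_waster": {"high_waster": 0.7, "low_waster": 0.3},
    "low_waster": {"high_waster": 0.2, "low_waster": 0.8},
}
emit = {                                          # emission probabilities
    "high_waster": {"waste_high": 0.9, "waste_low": 0.1},
    "low_waster": {"waste_high": 0.2, "waste_low": 0.8},
}

def forward(observations):
    """Return P(observations) under the HMM via the forward algorithm."""
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for o in observations[1:]:
        alpha = {
            s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
            for s in states
        }
    return sum(alpha.values())

# Likelihood of a household reporting high waste before the program
# and low waste after it (the two measurement points in the design).
print(forward(["waste_high", "waste_low"]))
```

Fitting such a model to many households lets the transition matrix itself answer the "how and why" question: it estimates the probability of moving between latent waster states, rather than only the average before/after difference.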