yabs.io

Yet Another Bookmarks Service

Viewing weinreich's Bookmarks

Tag: evaluation

[https://popcollab.medium.com/how-do-we-know-if-we-have-transformed-narrative-oceans-751ba416c341] - - public:weinreich
entertainment_education, evaluation, social_change, storytelling - 4 | id:1489287 -

And the result is a new beta framework: INCITE — Inspiring Narrative Change Innovation through Tracking and Evaluation. This new learning and evaluation framework has been developed to equip the pop culture narrative change field — composed of artists, values-aligned entertainment leaders and companies, movement leaders, cultural strategists, narrative researchers, philanthropic partners, and more — with a shared methodology to unearth learnings and track short- and long-term impact, at both the individual and collective levels. This launch of the beta INCITE framework is the first step in a road-testing process set to take place over 2024 to make it useful and usable by field members and funders alike.

[https://towardsdatascience.com/ditch-statistical-significance-8b6532c175cb] - - public:weinreich
campaign_effects, evaluation, health_communication, how_to, quantitative, research - 6 | id:1484440 -

“Significant” p-value ≠ “significant” finding: The significance of statistical evidence for the true X (i.e., statistical significance of the p-value for the estimate of the true X) says absolutely nothing about the practical/scientific significance of the true X. That is, significance of evidence is not evidence of significance. Increasing your sample size in no way increases the practical/scientific significance of your practical/scientific hypothesis.

“Significant” p-value = “discernible” finding: The significance of statistical evidence for the true X does tell us how well the estimate can discern the true X. That is, significance of evidence is evidence of discernibility. Increasing your sample size does increase how well your finding can discern your practical/scientific hypothesis.
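To make the distinction concrete, here is a minimal Python sketch (not from the article; the effect size and settings are illustrative assumptions): a fixed, tiny true effect never becomes more practically significant, but the p-value shrinks as the sample grows, so the effect becomes more discernible.

```python
# Illustrative simulation (assumed settings, not from the article): a trivially
# small true effect becomes "statistically significant" with a large enough
# sample, even though its practical significance never changes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect = 0.02  # tiny difference in means, practically negligible

for n in (100, 10_000, 1_000_000):
    control = rng.normal(0.0, 1.0, n)
    treated = rng.normal(true_effect, 1.0, n)
    result = stats.ttest_ind(treated, control)
    estimate = treated.mean() - control.mean()
    print(f"n={n:>9,}  estimated effect={estimate:+.3f}  p={result.pvalue:.4f}")

# The estimated effect hovers around 0.02 regardless of n, while the p-value
# shrinks as n grows: the finding becomes more discernible, not more important.
```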

[https://journals.sagepub.com/doi/10.1177/15245004231187134] - - public:weinreich
design, evaluation, management, social_marketing, strategy - 5 | id:1484374 -

While failure in social marketing practice represents an emerging research agenda, the discipline has not yet considered this concept systematically or cohesively. This lack of a clear conceptualization of failure in social marketing to aid practice thus presents a significant research gap.

[https://www.who.int/europe/publications/i/item/WHO-EURO-2022-6045-45810-65956] - - public:weinreich
behavior_change, evaluation - 2 | id:1414227 -

This framework proposes a stagewise model for evaluating the effectiveness and sustainability of behaviourally and culturally informed interventions in complex settings, with detailed guidance and accompanying tools. It presents the theoretical background, addresses the challenges of assessing causality during times of change and of influencing factors, and provides a method for measuring the unintended positive and negative effects of interventions on well-being, trust and social cohesion.

[https://psyarxiv.com/58udn] - - public:weinreich
behavior_change, campaign_effects, evaluation - 3 | id:1287028 -

Social and behavioral science research proliferated during the COVID-19 pandemic, reflecting the substantial increase in influence of behavioral science in public health and public policy more broadly. This review presents a comprehensive assessment of 742 scientific articles on human behavior during COVID-19. Two independent teams evaluated 19 substantive policy recommendations (“claims”) on potentially critical aspects of behaviors during the pandemic, drawn from the most widely cited behavioral science papers on COVID-19. The teams were made up of the original authors and an independent team, all of whom were blinded to the other team's reviews throughout. Both teams found evidence in support of 16 of the claims; for two claims, teams found only null evidence; and for no claims did the teams find evidence of effects in the opposite direction. One claim had no evidence available to assess. Seemingly due to the risks of the pandemic, most studies were limited to surveys, highlighting a need for more investment in field research and behavioral validation studies. The strongest findings support interventions that combat misinformation and polarization and that use effective forms of messaging engaging trusted leaders and emphasizing positive social norms.

[https://www.nngroup.com/articles/cognitive-walkthrough-workshop/?utm_source=Alertbox&utm_campaign=27cc444eff-EMAIL_CAMPAIGN_2020_11_12_08_52_COPY_01&utm_medium=email&utm_term=0_7f29a2b335-27cc444eff-24361717] - - public:weinreich
design, evaluation, how_to, research - 4 | id:1080276 -

A cognitive walkthrough is a technique used to evaluate the learnability of a system. Unlike user testing, it does not involve users (and, thus, it can be relatively cheap to implement). Like heuristic evaluations, expert reviews, and PURE evaluations, it relies on the expertise of a set of reviewers to assess the interface. Although cognitive walkthroughs can be conducted by an individual, they are designed to be done as part of a group in a workshop setting where evaluators walk through a task in a highly structured manner from a new user’s point of view.

[https://behaviorchangeimpact.org/] - - public:weinreich
behavior_change, bibliography, campaign_effects, evaluation, sample_campaigns - 5 | id:1074480 -

Research consistently shows that evidence-based social and behavior change (SBC) programs can increase knowledge, shift attitudes and norms, and produce changes in a wide variety of behaviors. SBC has proven effective in several health areas, such as increasing the uptake of family planning methods, condom use for HIV prevention, and care-seeking for malaria. Between 2017 and 2019, a series of comprehensive literature reviews was conducted to consolidate evidence that shows the positive impact of SBC interventions on behavioral outcomes related to family planning, HIV, malaria, reproductive empowerment, and the reproductive health of urban youth in low- and middle-income countries. The result is five health area-specific databases that support evidence-based SBC. The databases are searchable by keyword, country, study design, intervention, and behavior, and extract intervention details, research methodologies, and results to facilitate searching. For each of the five health areas, a “Featured Evidence” section highlights a list of key articles demonstrating impact.

[https://emerge.ucsd.edu/] - - public:weinreich
evaluation, quantitative, research - 3 | id:1022011 -

EMERGE (Evidence-based Measures of Empowerment for Research on Gender Equality) is a project focused on gender equality and empowerment measures to monitor and evaluate health programs and to track progress on UN Sustainable Development Goal (SDG) 5: Achieve Gender Equality and Empower All Women and Girls. As reported by UN Women (2018), only 2 of the 14 SDG 5 indicators have accepted methodologies for measurement and data widely available. Of the remaining 12, 9 are indicators for which data are collected and available in only a limited number of countries. This assessment suggests notable measurement gaps in the state of gender equality and empowerment worldwide. EMERGE aims to improve the science of gender equality and empowerment measurement by identifying these gaps through the compilation and psychometric evaluation of available measures and supporting scientifically rigorous measure development research in India.

[https://www.meta-analysis-learning-information-center.com/] - - public:weinreich
evaluation, how_to, quantitative, research - 4 | id:958540 -

The Meta-Analysis Learning Information Center (MALIC) believes in equitably providing cutting-edge and up-to-date techniques in meta-analysis to researchers in the social sciences, particularly those in education and STEM education.

[https://brooketully.com/results/] - - public:weinreich
evaluation, social_marketing, strategy - 3 | id:802633 -

Achieving sustained behavior change takes a long time. I mean, hell, we’re still running ads about buckling seat belts, and most states made it a law 35 years ago! Beyond achieving behavior change, seeing the positive impact of said change on species, habitats, and ecosystems can take even longer. So how can we balance these longer-term goals with the need to show more immediate outcomes?

[https://breakthroughactionandresearch.org/wp-content/uploads/2019/10/guidelines-for-costing-sbc-interventions.pdf] - - public:weinreich
behavior_change, evaluation, management, price - 4 | id:574107 -

Costing is the process of data collection and analysis for estimating the cost of a health intervention. High-quality cost data on SBC are critical not only for developing budgets, planning, and assessing program proposals, but also for advocacy, program prioritization, and agenda setting. To better serve these data needs, these guidelines aim to increase the quantity and quality of SBC costing information. By encouraging cost analysts to use a standardized approach based on widely accepted methodological principles, we expect the SBC Costing Guidelines to result in well-designed studies that measure cost at the outset, allowing assessment of cost-effectiveness and benefit-cost ratios for SBC programming. Such analyses could also potentially help advocates for SBC to better make the case for greater investment in SBC programming. These guidelines lay out a consistent set of methodological principles that reflect best practice and that can underpin any SBC costing effort.
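As a rough illustration of the analyses such cost data enable, the sketch below computes a cost-effectiveness ratio (cost per additional adopter of the behavior) and a benefit-cost ratio; all figures are hypothetical assumptions, not taken from the guidelines.

```python
# Hypothetical worked example (all numbers are assumptions, not from the guidelines):
# cost-effectiveness = program cost per additional person adopting the behavior;
# benefit-cost ratio  = monetized benefits divided by program cost.
program_cost = 250_000.0      # total cost of the SBC intervention (USD)
people_reached = 40_000       # people exposed to the intervention
adoption_lift = 0.05          # extra 5 percentage points adopting the behavior
benefit_per_adopter = 180.0   # monetized benefit per new adopter (USD)

new_adopters = people_reached * adoption_lift
cost_per_adopter = program_cost / new_adopters             # cost-effectiveness
bcr = (new_adopters * benefit_per_adopter) / program_cost  # benefit-cost ratio

print(f"New adopters attributable to the program: {new_adopters:,.0f}")
print(f"Cost per additional adopter: ${cost_per_adopter:,.2f}")
print(f"Benefit-cost ratio: {bcr:.2f}")
# -> 2,000 adopters, $125.00 per adopter, and a BCR of 1.44 under these assumptions.
```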

[https://www.squarepeginsight.com/post/all-that-glitters-is-not-gold-8-ways-behaviour-change-can-fail] - - public:weinreich
behavior_change, evaluation - 2 | id:488545 -

Before we dive in, here is a quick summary of the proposed taxonomy of behaviour change failures:
- No effect
- Backfiring
- Intervention is effective, but it's offset by a negative side effect
- Intervention isn't effective, but there's a positive side effect
- A proxy measure changes, but not the ultimate target behaviour
- Successful treatment effect offset by later (bad) behaviour
- Environment doesn't support the desired behaviour change
- Intervention triggers counteracting forces

[https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(20)30224-2] - - public:weinreich
behavior_change, campaign_effects, evaluation, theory - 4 | id:436909 -

The behavioural change enterprise disproportionately focuses on promoting successes at the expense of examining the failures of behavioural change interventions. We review the literature across different fields through a causal explanatory approach to identify structural relations that impede (or promote) the success of interventions. Based on this analysis, we present a taxonomy of failures of behavioural change that catalogues different types of failures and backfiring effects. Our analyses and classification offer guidance for practitioners and researchers alike, and provide critical insights for establishing a more robust foundation for evidence-based policy.

Behavioural change techniques are currently used by many global organisations and public institutions. The amassing evidence base is used to answer practical and scientific questions regarding what cognitive, affective, and environmental factors lead to successful behavioural change in the laboratory and in the field. In this piece we show that there is also value in examining interventions that inadvertently fail to achieve their desired behavioural change (e.g., backfiring effects). We identify the underlying causal pathways that characterise different types of failure, and show how a taxonomy of causal interactions that result in failure exposes new insights that can advance theory and practice.

[https://www.comminit.com/health/content/facilitation-guide-integrated-evaluation-methodology-most-significant-change-and-photovo?utm_medium=email&utm_campaign=drumbeat784&utm_content=facilitation-guide-integrated-evaluation-methodology-most-significant-ch] - - public:weinreich
behavior_change, evaluation, how_to, research - 4 | id:350257 -

[https://breakthroughactionandresearch.org/our-work/costing-and-economic-evaluation/] - - public:weinreich
behavior_change, evaluation, how_to, management - 4 | id:272141 -

Currently Available Costing and Economic Evaluation Products:
- The Business Case for Investing in Social and Behavior Change (report)
- Guidelines for Costing Social and Behavior Change Interventions (report)
- The Added Value of Costing Social and Behavior Change Interventions (brief)
- Social and Behavior Change Business Case and Costing Webinar
- Generating Evidence to Inform Integrated Social and Behavior Change Programming in Nigeria
- Making the Business Case for Social and Behavior Change Programming (activity brief)

[https://www.ahrq.gov/ncepcr/tools/self-mgmt/pemat.html] - - public:weinreich
evaluation, health_communication - 2 | id:272090 -

The Patient Education Materials Assessment Tool (PEMAT) is a systematic method to evaluate and compare the understandability and actionability of patient education materials. It is designed as a guide to help determine whether patients will be able to understand and act on information. Separate tools are available for use with print and audiovisual materials.
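To show how such an assessment yields comparable numbers, here is a small Python sketch of a PEMAT-style percentage score, assuming the tool's convention of rating each item agree (1) or disagree (0), excluding not-applicable items, and reporting the share of applicable items rated agree; the ratings below are invented for illustration.

```python
# Sketch of a PEMAT-style percentage score (item ratings are invented):
# each item is rated 1 (agree), 0 (disagree), or None (not applicable),
# and the score is agreed items / applicable items * 100.
from typing import Optional, Sequence

def pemat_score(ratings: Sequence[Optional[int]]) -> float:
    """Percentage of applicable items rated 'agree' (1); None means N/A."""
    applicable = [r for r in ratings if r is not None]
    if not applicable:
        raise ValueError("No applicable items to score.")
    return 100.0 * sum(applicable) / len(applicable)

# Hypothetical ratings for a printed brochure.
understandability = [1, 1, 0, 1, None, 1, 1, 0, 1, 1, 1, None, 1]
actionability = [1, 0, 1, 1, 1, None, 0]

print(f"Understandability: {pemat_score(understandability):.0f}%")
print(f"Actionability: {pemat_score(actionability):.0f}%")
```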
