Top 10 Application-Design Mistakes
Here is our current list of the top 10 application-design mistakes that are both egregious and commonplace.
Based on their comprehensive review of available research, Duckworth, Milkman, and Laibson propose a framework that organizes evidence-based self-control strategies along two dimensions based on how the strategies are implemented and who is initiating them. They observe that in some cases the best self-control strategy involves us changing the situation to create incentives or obstacles that help us exercise self-control, such as using apps that restrict our phone usage or keeping junk food out of the house. In other cases it’s more effective to change how we think about the situation — for example, by making an if-then plan to anticipate how we’ll deal with treats in the office — so that exercising self-control becomes more appealing or easier to accomplish. Other strategies work better when someone else implements them for us. For example, our electricity company might use social norms to prompt a change in our thinking, showing us how our energy usage compares with that of our neighbors. And policymakers often use situational constraints to prompt behavior focused on the long-term. Examples range from incentives (e.g., tax rebates for eco-friendly building materials) to penalties (e.g., raising taxes on cigarettes and alcohol). Employers are increasingly using another type of situational constraint, defaults, to encourage employees to save for retirement; many are requiring people to opt out of an employer-provided retirement plan if they don’t want to participate.
Giving advice, as opposed to receiving it, appears to help unmotivated people feel powerful because it involves reflecting on knowledge they already have. So if you're completely clueless about the resources or strategies necessary for progress, asking for help is probably the best first step. But if you, like most of us, know what you need to do but are having trouble actually doing it, giving someone advice may be the push you need.
An industry rule of thumb, verified by USA TODAY through interviews with nearly a dozen influencers, marketing professionals and influencer platform founders, is a baseline rate of about 1 percent of follower count per sponsored Instagram post, or $100 for every 10,000 followers. That means someone with 100,000 followers might start around $1,000 per sponsored post, while an influencer with 1 million followers could charge $10,000. And some experts called that conservative. Along with pricing structures based on follower counts, CPEs (cost per engagement) have emerged as another way to calculate marketing rates. Engagement is typically defined by interactions with content such as likes, comments, clicks or shares. An engagement rate can be found by adding up all engagements on a post, dividing that total by the follower count and multiplying by 100.
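The two pricing rules above are simple arithmetic. As a minimal sketch (the function names are mine, not from the article), the flat rule of thumb and the engagement-rate formula can be written as:

```python
def baseline_post_rate(followers):
    """Rule-of-thumb flat rate: about $100 per 10,000 followers."""
    return followers / 10_000 * 100

def engagement_rate(likes, comments, clicks, shares, followers):
    """Engagement rate: total engagements / followers * 100 (a percentage)."""
    total = likes + comments + clicks + shares
    return total / followers * 100

# 100,000 followers -> roughly $1,000 per sponsored post
print(baseline_post_rate(100_000))   # 1000.0

# 1,500 total engagements on a post for a 100,000-follower account
print(engagement_rate(likes=1200, comments=200, clicks=50,
                      shares=50, followers=100_000))  # 1.5 (percent)
```

This matches the article's examples: 1 million followers gives $10,000 under the flat rule.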
In one study, the researchers found that smokers with limited familiarity with information technology were more likely to consider antismoking messages manipulative and boring when they browsed those messages on a website with interactive features such as sliders, mouseovers and zooming tools.
Sometimes we marketers can climb so far up the brand ladder from functional benefits to emotional benefits to social benefits, we can lose touch with why people are buying our products in the first place. There is power in purpose-driven brands. And yet, when every piece of marketing attempts to communicate some kind of social purpose, social purpose can start to lose its meaning, particularly when purpose is left to the agency.
short video re: different modes of decisionmaking
Data viz - "I'll pause for a moment so you can let this information sink in."
Graphic of layers - 1) what you share, 2) what your behavior tells them, 3) what the machine thinks about you
To provide technical advice to the scriptwriters of Callaloo, helping translate science into relevant messages and actions that address knowledge, attitude and behavior changes in the key results areas. Building knowledge, shifting attitudes and ultimately changing behaviors will support reaching the objectives of the program. The following HIV/AIDS scriptwriters guide has been developed based on the results of the knowledge, attitude and behavior change (KAB) baseline survey conducted between January and March 2012, supplemented by current research conducted by key partners (refer to Sources of Information).
Convinced Maasai elders to do away with cutting while keeping the rest of the coming-of-age ceremony
The 2019 Edelman Trust Barometer reveals that trust has changed profoundly in the past year—people have shifted their trust to the relationships within their control, most notably their employers. Globally, 75 percent of people trust “my employer” to do what is right, significantly more than NGOs (57 percent), business (56 percent) and media (47 percent).
This article reports on new research that finds certain messages reduce fear of sharks, key to promoting conservation-minded responses to shark bites. Here it is argued that the sophistication in public feelings toward these highly emotional events has allowed new actors to mobilize and given rise to the ‘Save the Sharks’ movement. In a unique experiment coupling randomly assigned intent-based priming messages with exposure to sharks in a ‘shark tunnel’, a potential path to reduce public fear of sharks and alter policy preferences is investigated. Priming for the absence of intent yielded significant fear extinction effects, providing a viable means of increasing support for non-lethal policy options following shark bite incidents. High levels of pride and low levels of blame for bite incidents are also found. In all, this article provides a step towards improving our understanding of fear and fear reduction in public policy.
The Subtask 8 deliverable was to create a testable toolbox for behaviour change interventions:
• A description and evaluation of the validity and effectiveness of the Collective Impact Approach in the energy arena, as a peer-reviewed paper (Rotmann, 2016 and 2017a; Cobben, 2017).
• A Decision-making Tree that enables Behaviour Changers to better utilise the findings of ST1 & 2.
• A peer-reviewed paper on the impact of storytelling in energy research (Rotmann, 2017b; Moezzi, Janda and Rotmann, 2017; Rotmann, 2018).
• A collection of sector stories from each Behaviour Changer (see ST6 Final reports & Rotmann, 2017b). This includes a list of behavioural intervention tools each Behaviour Changer has at their disposal in each of their national and sectoral contexts (see Task 24 workshop minutes and ST6 Final reports).
• Continued testing and development of evaluation tools created in ST 3 & 9 (Rotmann and Chapman, 2018).
• A testable toolbox for national Behaviour Changers (when choosing to take part in ST11; see Cowan et al., 2017 and 2018) and/or a synthesis of internationally-valid tools to feed into the Overarching Story.
Week 1
When Everything Looks Like a Nail: Building Better “Behavioral Economics” Teams, by Jason Collins
Nudges Alone Won’t Save Nemo: Conservation in the Great Barrier Reef, by John Pickering
From Ph.D. to Policy: Facilitating Connections Between Junior Scholars and Policymakers, by Ashley Whillans and Heather Devine
Shouldn’t We Make It Easy to Use Behavioral Science for Good?, by Manasee Desai
RCTs Are Not (Always) the Answer, by Tania Ramos and João Matos
Week 2
Why Governments Need to Nudge Themselves, by Michael Hallsworth and Mark Egan
Behavioral Development Economics, by Syon Bhanot and Aishwarya Deshpande
Why Governments Should Treat Cybersecurity the Way They Do Infectious Diseases, by Karen Renaud and Stephen Flowerday
Pour One Out for Nudge’s Forgotten Peers, by Jesse Dashefsky
Helping Parents Follow Through, by Nadav Klein, Keri Lintz, Ariel Kalil, and Susan E. Mayer
Week 3
A New Model for Integrating Behavioral Science and Design, by Sarah Reid and Ruth Schmidt
Applying Behavioral Science Upstream in the Policy Design Process, by Kate Phillips
Lessons in “Nudging” From the Developing World, by Abigail Goodnow Dalton
Choice Architecture 2.0: How People Interpret and Make Sense of Nudges, by Job Krijnen
What the Origins of the “1 in 5” Statistic Teaches Us About Sexual Assault Policy, by Alexandra Rutherford
Bonus
Nudge Turns 10: A Q&A With Cass Sunstein, by Elizabeth Weingarten
Nudge Turns 10: A Q&A With Richard Thaler, by Evan Nesterak
Government policies and services can be hard to navigate for people who are already under pressure. By understanding the effects of scarcity, we can make these easier to access for the people who need them. https://bi.dpc.nsw.gov.au/blog/2018/12/13/a-guide-to-reducing-the-effects-of-scarcity/
Whenever you're trying to change a behavior, you should ask yourself the following four questions:
1. Am I clearly prompting the target person to do the behavior I want?
2. Is the behavior really hard to do?
3. Is the target person motivated to do the behavior I want them to do?
4. Am I rewarding the target person for doing the behavior?
That's it. That's your behavior-design checklist.
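As a minimal sketch (my own encoding, not from the source), the four-question checklist can be turned into a simple diagnostic that reports which questions still need attention:

```python
def behavior_design_checklist(prompted, easy, motivated, rewarded):
    """Return the checklist items that still need work (empty list = all clear)."""
    gaps = []
    if not prompted:
        gaps.append("Add a clear prompt for the target behavior")
    if not easy:
        gaps.append("Make the behavior easier to do")
    if not motivated:
        gaps.append("Increase the person's motivation")
    if not rewarded:
        gaps.append("Reward the behavior when it happens")
    return gaps

# Example: the behavior is prompted and rewarded, but hard and unmotivating
print(behavior_design_checklist(prompted=True, easy=False,
                                motivated=False, rewarded=True))
```

An empty result means all four questions have been answered; anything returned is a candidate place to intervene first.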
Innovative solutions based on how people act and make decisions in the real world are often buried in academic journals. The Behavioral Evidence Hub (B-Hub) brings them into the light of day. On the B-Hub you’ll find strategies proven to amplify the impact of programs, products, and services—and improve lives. Projects + checklists
We demonstrate that the mere-measurement effect occurs because asking an intention question is not perceived as a persuasion attempt. In experiments 1 and 2, we show that when persuasive intent is attributed to an intention question, consumers adjust their behavior as long as they have sufficient cognitive capacity to permit conscious correction. In experiment 3 we demonstrate that this finding holds with product choice and consumption, and we find that persuasion knowledge mediates the effects. In experiment 4, we show that when respondents are educated that an intention question is a persuasion attempt, the behavioral impact of those questions is attenuated.
"Pink alert" story - bearing silent witness - nurses
The brain, it seems, does not make much of a distinction between reading about an experience and encountering it in real life; in each case, the same neurological regions are stimulated. Keith Oatley, an emeritus professor of cognitive psychology at the University of Toronto (and a published novelist), has proposed that reading produces a vivid simulation of reality, one that “runs on minds of readers just as computer simulations run on computers.” Fiction — with its redolent details, imaginative metaphors and attentive descriptions of people and their actions — offers an especially rich replica. Indeed, in one respect novels go beyond simulating reality to give readers an experience unavailable off the page: the opportunity to enter fully into other people’s thoughts and feelings.