A visualisation of all available AI principles. There is no need for new ones, but there is a need to operationalise and contextualise them across the AI life cycle, for all stakeholders involved: data scientists, business execs, procurement, and regulators.
Consumers, employees, students, and others are often subjected to “sludge”: excessive or unjustified frictions, such as paperwork burdens, that cost time or money; that may make life difficult to navigate; that may be frustrating, stigmatizing, or humiliating; and that might end up depriving people of access to important goods, opportunities, and services. Because of behavioral biases and cognitive scarcity, sludge can have much more harmful effects than private and public institutions anticipate. To protect consumers, investors, employees, and others, firms, universities, and government agencies should regularly conduct Sludge Audits to catalogue the costs of sludge and to decide when and how to reduce it. Much of human life is unnecessarily sludgy. Sludge often has costs far in excess of benefits, and it can hurt the most vulnerable members of society.
However, nudges aimed at reducing carbon emissions could have a pernicious indirect effect if they offer the promise of a ‘quick fix’ and thereby undermine support for policies of greater impact.
So what counts as the “right” kind of problem for behavioral science to solve? Put more bluntly: How might our sense about what we should solve, or even what qualifies as a problem worth solving, be biased by how we think about what we can solve?
To ensure these partnerships are beneficial to all involved—companies, employees, customers, and researchers—behavioral scientists need a set of ethical standards for conducting research in companies. To address this need, we created The Behavioral Scientist’s Ethics Checklist. In the checklist, we outline six key principles and questions that behavioral scientists and companies should ask themselves before beginning their research. To illustrate how each principle operates in practice, we provide mini case studies highlighting the challenges other researchers and companies have faced.
Graphic of layers - 1) what you share, 2) what your behavior tells them, 3) what the machine thinks about you
In that study, gender and ethnicity information was removed from descriptions of potential job candidates, a design intended to interrupt unconscious biases against women and ethnic minorities. The results were surprising: blind recruitment made things worse for women and members of ethnic minorities. These results illustrate the limits of behavioural economics in action.
How do the photos used by development organisations affect perceptions of international development? How do agencies ensure that images preserve their subjects’ dignity? Has social media created new opportunities for self-representation, or just reinforced the use of outdated visual clichés? These are some of the questions addressed during last week’s #DevPix Twitter chat hosted by the Overseas Development Institute. The topic sparked a lively conversation…