Because evil exists beyond the limits of reason, what matters for Ricoeur is not that we identify evil, but that we respond to it appropriately. He rightly observes that the tragedy of evil is not the act committed, but the experience of the victim. Separating evil perpetrated from evil suffered shifts the concern from what or who is evil to the best possible action in the face of it, which according to him is “not a solution, but a response.”
In the common conception, solutions to evil require retribution, and the most obvious way to achieve retribution is through violence. Responses, on the other hand, engender what Ricoeur calls “wisdom,” an unwavering commitment to relieve and prevent suffering. Any violence used in a response to evil would, therefore, be focused on the alleviation of suffering rather than the attempt to stamp out evil where we think we see it. (Steven Paulikas, “How Should We Respond to ‘Evil’?”)
“As computer programmers, our formative intellectual experience is working with deterministic systems that have been designed by other human beings. These can be very complex, but the complexity is not the kind we find in the natural world. … Success in the artificially constructed world of software design promotes a dangerous confidence. … [T]he real world is a stubborn place. It is complex in ways that resist abstraction and modeling. It notices and reacts to our attempts to affect it. Nor can we hope to examine it objectively from the outside, any more than we can step out of our own skin.” (Maciej Cegłowski, “The Moral Economy of Tech”)
Technology might be a social bandage at best and a crutch at worst. Insofar as technology seeks to solve social problems—human problems—it can’t do much to enlighten us. Technology itself doesn’t make us better people; that’s not work we can pawn off to anyone (or anything) else.
For instance, modern cars may be safer than ever, saving thousands of lives in accidents; but they do nothing to discourage drunk driving. If anything, safer cars mean that we can take more chances, since we’re more likely to survive accidents. Today, algorithms that make decisions for us mean we’re outsourcing that intellectual labor, risking the loss of deliberative skills, including moral reasoning. Those are exactly the skills we need for social progress.
Because it’s built on a technological foundation, modern society overprivileges empirical knowledge. Many seem to believe that engineering is real, while ethics is just opinions, and opinions don’t matter much. As a result, we see an emphasis on STEM (science, technology, engineering, and mathematics) education from primary schools to most universities. But non-empirical matters, such as ethics, continue to be difficult to teach and are often neglected, despite lip service to the arts and humanities.
More than tolerance, wisdom is difficult to sustain. We’re having to re-learn lost lessons—sometimes terrible lessons—from the past, and intergenerational memory is short. “History does not repeat itself, but it rhymes,” as Mark Twain (might have) said. (Patrick Lin, “Technological vs. Social Progress: Why the Disconnect?”)