Corporate social responsibility, or CSR, is a deeply embedded concept in Western society. It embodies the idea that corporations have an ethical responsibility to society that extends beyond financial returns and beyond their immediate shareholders. CSR organizations, contractors, and reporters have proliferated in recent decades as activist pressure around labor rights, equity, and environmental destruction, including climate change, has ramped up. This...