This report explores the risks of AI in Pakistan, focusing on algorithmic unfairness in automated decision-making and its impact on constitutional rights to equality and non-discrimination. It examines how societal biases can be embedded in AI systems, reinforcing inequalities. Drawing on expert insights, the report identifies gaps between legal frameworks and AI development and provides policy recommendations for ethical and fair AI governance in Pakistan.
Project lead: Uzma Nazir Chaudhry
We work across Pakistan, driving legal reform, advocacy, and policy change to protect human rights and empower communities.