{"id":54567,"date":"2025-08-07T11:00:45","date_gmt":"2025-08-07T15:00:45","guid":{"rendered":"https:\/\/ehrc.ca\/bias-in-ai-systems\/"},"modified":"2025-08-14T16:14:57","modified_gmt":"2025-08-14T20:14:57","slug":"prejuges-dans-les-systemes-d-ia","status":"publish","type":"post","link":"https:\/\/ehrc.ca\/fr\/prejuges-dans-les-systemes-d-ia\/","title":{"rendered":"Pr\u00e9jug\u00e9s dans les syst\u00e8mes d\u2019IA"},"content":{"rendered":"<p><span data-contrast=\"auto\">Les pr\u00e9jug\u00e9s dans l\u2019intelligence artificielle (IA) constituent une pr\u00e9occupation majeure pour les chercheur\u00b7euse\u00b7s, les \u00e9thicien\u00b7ne\u00b7s et les responsables des politiques. Si les syst\u00e8mes d\u2019IA peuvent sembler neutres, ils <\/span><a href=\"https:\/\/www.nature.com\/articles\/s43588-024-00741-1\" target=\"_blank\" rel=\"noopener\"><span data-contrast=\"none\">codifient, refl\u00e8tent et amplifient souvent<\/span><\/a><span data-contrast=\"auto\"> les pr\u00e9jug\u00e9s soci\u00e9taux int\u00e9gr\u00e9s dans les donn\u00e9es avec lesquelles ils sont entra\u00een\u00e9s. Ces pr\u00e9jug\u00e9s se manifestent dans de nombreux secteurs et apparaissent dans les mod\u00e8les de langage, les syst\u00e8mes de reconnaissance faciale, les outils d\u2019embauche, les algorithmes de surveillance et les diagnostics de sant\u00e9.<\/span><span data-ccp-props=\"{}\">\u00a0<\/span><\/p>\n<p><span data-contrast=\"auto\">Le contenu suivant fournit un regard neuf sur les derni\u00e8res recherches, le journalisme d\u2019enqu\u00eate et les d\u00e9veloppements politiques en mati\u00e8re de pr\u00e9jug\u00e9s dans l\u2019IA. Il met en \u00e9vidence le chemin parcouru pour comprendre et traiter ce probl\u00e8me, ainsi que les efforts en cours pour rendre l\u2019IA plus juste et \u00e9quitable.<\/span><span data-ccp-props=\"{}\">\u00a0<\/span><\/p>\n<h5><span data-contrast=\"auto\">Pr\u00e9jug\u00e9s dans les mod\u00e8les de langage et le traitement du langage naturel<\/span><span data-ccp-props=\"{}\">\u00a0<\/span><\/h5>\n<p><span data-contrast=\"auto\">Les grands mod\u00e8les de langage (GML), comme les transformeurs g\u00e9n\u00e9ratifs pr\u00e9entra\u00een\u00e9s et LLaMA, ont montr\u00e9 des pr\u00e9jug\u00e9s persistants fond\u00e9s sur la race, le genre, la religion et la classe sociale. Une \u00e9tude de\u202f2024 analysant 77\u202fGML a r\u00e9v\u00e9l\u00e9 que la plupart d\u2019entre eux <\/span><a href=\"https:\/\/www.nature.com\/articles\/s43588-024-00741-1\" target=\"_blank\" rel=\"noopener\"><span data-contrast=\"none\">d\u00e9montraient des biais endogroupes et de d\u00e9rogation de l\u2019autre groupe<\/span><\/a><span data-contrast=\"auto\">, refl\u00e9tant les pr\u00e9jug\u00e9s sociaux humains.<\/span><span data-ccp-props=\"{}\">\u00a0<\/span><\/p>\n<p><span data-contrast=\"auto\">Des analyses plus approfondies ont r\u00e9v\u00e9l\u00e9 que <\/span><a href=\"https:\/\/aclanthology.org\/2025.naacl-long.600\/\"><span data-contrast=\"none\">la plupart des strat\u00e9gies d\u2019att\u00e9nuation<\/span><\/a><span data-contrast=\"auto\"> ont \u00e9t\u00e9 mises en \u0153uvre principalement en anglais, ce qui a entra\u00een\u00e9 la propagation de st\u00e9r\u00e9otypes occidentaux dans d\u2019autres langues, un ph\u00e9nom\u00e8ne qualifi\u00e9 de \u00ab\u202fcolonialisme num\u00e9rique\u202f\u00bb. 
Deeper analyses have revealed that [most mitigation strategies](https://aclanthology.org/2025.naacl-long.600/) have been implemented primarily in English, causing Western stereotypes to propagate into other languages, a phenomenon described as "digital colonialism." For example, AI systems have [carried gendered tropes](https://www.wired.com/story/ai-bias-spreading-stereotypes-across-languages-and-cultures-margaret-mitchell/), such as the "dumb blonde" stereotype, into languages where they did not previously exist.

Some models also reinforce bias by fabricating scholarly citations to support harmful claims, a practice [one researcher has called](https://www.wired.com/story/ai-bias-spreading-stereotypes-across-languages-and-cultures-margaret-mitchell/) "hallucinated justifications."

##### Bias in facial recognition systems

Facial recognition software has shown significant performance disparities across demographic groups. The United States National Institute of Standards and Technology (NIST) [reported in 2019](https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf) that most commercial systems produced higher false positive (mismatch) rates for Black, Asian and female faces. Author Joy Buolamwini has recounted how facial recognition tools [failed to detect her face](https://www.penguinrandomhouse.ca/books/670356/unmasking-ai-by-dr-joy-buolamwini/9780593241844) until she put on a white mask, exposing systemic racial bias in computer vision models.
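To make that metric concrete, here is a minimal sketch of computing a false positive rate per demographic group from non-mated comparison scores. The data are synthetic and the 0.6 threshold is hypothetical; real evaluations such as NIST's use large sets of mated and non-mated trials.

```python
# Minimal sketch: per-group false positive rates for a face matcher.
# All data below are synthetic and the 0.6 threshold is hypothetical;
# real evaluations (e.g., NIST's) use large mated/non-mated trial sets.
import random

random.seed(0)
THRESHOLD = 0.6  # similarity above this counts as a "match"

def false_positive_rate(nonmated_scores: list[float]) -> float:
    """Share of non-mated pairs (different people) wrongly declared a match."""
    return sum(s > THRESHOLD for s in nonmated_scores) / len(nonmated_scores)

# Pretend the matcher produces slightly higher similarity scores for
# non-mated pairs in group B, a pattern like the disparities NIST reported.
group_a = [random.gauss(0.40, 0.10) for _ in range(10_000)]
group_b = [random.gauss(0.48, 0.10) for _ in range(10_000)]

print(f"FPR group A: {false_positive_rate(group_a):.3%}")
print(f"FPR group B: {false_positive_rate(group_b):.3%}")
```

Even a modest shift in the score distribution produces a several-fold gap in false matches, which is why equal thresholds can yield very unequal error rates.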
Used by law enforcement, facial recognition has led to wrongful arrests, [disproportionately affecting Black men](https://www.washingtonpost.com/podcasts/post-reports/arrested-by-ai/). Amnesty International [has described these applications](https://www.amnesty.org.uk/predictive-policing) in the United Kingdom as a form of "automated racism."

##### Bias in hiring and recruitment algorithms

AI-driven recruitment tools have shown a tendency to reproduce historical biases related to gender and race. For example, Amazon's discontinued resume-screening AI tool [penalized applications from women](https://link.springer.com/article/10.1007/s13347-022-00543-1) by downgrading mentions of "women's" activities.

Video interview tools have [demonstrated linguistic and ableist biases](https://www.theguardian.com/technology/2023/mar/27/robot-recruiters-can-bias-be-banished-from-ai-recruitment-hiring-artificial-intelligence), unfairly penalizing non-native speakers and people with disabilities because the algorithms interpret their voices or atypical facial features negatively.

##### Bias in predictive policing and criminal justice

Critics have widely condemned predictive policing software for reinforcing racial profiling. A 2023 investigation found that the PredPol software [disproportionately targeted communities of colour](https://themarkup.org/prediction-bias/2023/10/02/predictive-policing-software-terrible-at-predicting-crimes) despite the low accuracy of its predictions.

Amnesty International found that 32 of 45 UK police forces were using forecasting tools that [reinforced racist patterns of law enforcement](https://www.amnesty.org.uk/predictive-policing). In the United States, [wrongful arrests linked to faulty facial recognition tools](https://www.washingtonpost.com/podcasts/post-reports/arrested-by-ai/) have prompted lawsuits and public backlash.
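Part of why such tools keep pointing at the same neighbourhoods is a feedback loop: patrols are sent where crime was previously recorded, and crime is mostly recorded where patrols are sent. A toy simulation of that dynamic (entirely synthetic numbers; this is not PredPol's actual model):

```python
# Toy simulation of a predictive-policing feedback loop (synthetic numbers,
# not PredPol's actual model). Both areas have the SAME true crime rate,
# but area 0 starts with a few more recorded incidents. Patrols go to the
# area predicted to have more crime, and only patrolled crime is recorded,
# so the initial gap becomes self-reinforcing.
import random

random.seed(1)
TRUE_RATE = 0.5        # identical underlying crime rate in both areas
recorded = [6, 4]      # a small historical recording gap seeds the loop

for day in range(365):
    hot_spot = 0 if recorded[0] >= recorded[1] else 1   # the "prediction"
    # Ten patrols in the hot spot; each has TRUE_RATE chance of recording a crime.
    recorded[hot_spot] += sum(random.random() < TRUE_RATE for _ in range(10))

print(f"after one year: area 0 = {recorded[0]} recorded crimes, "
      f"area 1 = {recorded[1]}")
# Despite identical true rates, nearly all recorded crime sits in area 0,
# and each day's data appears to justify the next day's deployment there.
```

The recorded data end up saying far more about where police were sent than about where crime actually occurs, which is the core of the critique.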
##### Bias in health care algorithms

A 2019 study found that a [widely used health care risk algorithm](https://pubmed.ncbi.nlm.nih.gov/31649194/) discriminated against Black patients by identifying them as lower risk. The algorithm used health care spending as a proxy for need, which disadvantages Black patients: because of reduced access to care, they typically spend less even when their health needs are greater.

Other studies have shown that [diagnostic tools perform worse on darker skin](https://www.npr.org/sections/health-shots/2023/06/06/1180314219/artificial-intelligence-racial-bias-health-care) and that clinical notes often contain racialized language, which can bias downstream AI systems.
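The proxy mechanism is easy to reproduce: if one group spends less per unit of need, ranking patients by predicted spending systematically under-selects that group. A minimal sketch with synthetic numbers (the 0.7 access factor is illustrative, not a figure from the study):

```python
# Minimal sketch of proxy bias (synthetic numbers; the 0.7 access factor is
# illustrative, not a figure from the 2019 study). Patients in group B have
# the same distribution of true need but spend less per unit of need because
# of reduced access to care. Ranking by spending then under-selects them.
import random

random.seed(0)

def patient(group: str) -> dict:
    need = random.uniform(0, 10)              # true health need, same for both groups
    access = 1.0 if group == "A" else 0.7     # group B spends less per unit of need
    return {"group": group, "need": need, "spending": need * access}

patients = [patient("A") for _ in range(500)] + [patient("B") for _ in range(500)]

# "Algorithm": enroll the top 20% in a care-management program.
by_spending = sorted(patients, key=lambda p: p["spending"], reverse=True)[:200]
by_need = sorted(patients, key=lambda p: p["need"], reverse=True)[:200]

for label, selected in [("spending proxy", by_spending), ("true need", by_need)]:
    share_b = sum(p["group"] == "B" for p in selected) / len(selected)
    print(f"ranked by {label}: share of group B enrolled = {share_b:.0%}")
```

Ranking by true need enrols both groups roughly equally; ranking by the spending proxy cuts group B's share dramatically, even though nothing about need differs.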
##### Broader manifestations of AI bias

AI image generators [often depict high-status professions as white and male](https://www.penguinrandomhouse.ca/books/670356/unmasking-ai-by-dr-joy-buolamwini/9780593241844), while low-status roles or criminality are represented by people of colour. Likewise, automated content moderation systems have [shown gender bias](https://www.theguardian.com/technology/2023/mar/27/robot-recruiters-can-bias-be-banished-from-ai-recruitment-hiring-artificial-intelligence) by flagging images of women as explicit more often than equivalent images of men.

##### Mitigation efforts

Many organizations are working to combat AI bias, including the [Algorithmic Justice League](https://www.ajl.org/), the [Distributed AI Research Institute (DAIR)](https://www.dair-institute.org/) and [Hugging Face](https://huggingface.co/). Policy responses include the U.S. White House's [Blueprint for an AI Bill of Rights](https://bidenwhitehouse.archives.gov/ostp/ai-bill-of-rights/), the European Union's [AI Act](https://commission.europa.eu/news-and-media/news/ai-act-enters-force-2024-08-01_fr) and Canada's [Artificial Intelligence and Data Act (AIDA)](https://ised-isde.canada.ca/site/innover-meilleur-canada/fr/loi-lintelligence-artificielle-donnees-liad-document-complementaire). Health-specific [FDA and HHS regulations](https://www.healthindustrywashingtonwatch.com/2025/01/articles/regulatory-developments/hhs-developments/office-for-civil-rights-hhs-developments/hhs-recent-guidance-on-ai-use-in-health-care/) now require bias disclosure and transparency for AI tools.

##### Tackling AI bias requires collective action

Bias in AI is not a technical glitch: it reflects structural inequities. While awareness is growing and regulatory measures are under way, mitigation remains uneven.
Developers, auditors and communities must engage actively in inclusive development, conduct independent audits and take part in decision-making to ensure that AI serves everyone equitably.