3 Questions To… Aurélie Jean

3 Questions To… Aurélie JEAN, PhD, Computational Scientist & Entrepreneur. Having worked at the Massachusetts Institute of Technology and Bloomberg, Dr. Aurélie JEAN calls upon everyone - including and especially women - to fully understand and capture the underlying technical mechanisms of computer science and coding as a means of empowerment.


We often use the metaphor of the “black box” to describe our current data society, in which algorithms help make daily decisions that affect our lives - from the loan we get, the insurance rate we pay, and the opportunities at school that shape our success, to assessments of our risk of recidivism - in an opaque and incontestable way. As an expert in applied mathematics and computer programming, what tools do you believe can be used to promote algorithm explainability? Should we design technical standards or legal obligations, such as a right to audit?

Aurélie JEAN: Algorithm explainability is a challenge. That being said, we (designers of algorithms) should all work to make users, decision makers and state organizations understand the mechanisms of a given algorithm. This also provides an opportunity to improve the algorithm thanks to close collaboration between users and designers. This explainability can be shaped in a broad range of ways, depending on the type of algorithm. Indeed, while explicit algorithms (those explicitly defined by humans) are easily explainable by listing their explicit criteria, hypotheses and logic conditions, implicit algorithms are much less straightforward to explain, as their criteria are implicitly and abstractly defined within what we call the invariants of the neural network created through machine learning.
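The distinction can be sketched in a few lines of Python. This is a toy illustration only: the loan-approval rule, the perceptron, and all data and thresholds are made up for the purpose of the contrast.

```python
# Explicit algorithm: every criterion is human-stated and can be listed.
def explicit_approval(income, debt):
    """Approve if income exceeds a threshold and debt ratio is low."""
    return income > 30_000 and debt / income < 0.4

# Implicit algorithm: a tiny perceptron whose decision rule is learned
# from (fabricated) examples rather than written down by a human.
def train_perceptron(samples, labels, epochs=50, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

samples = [(0.2, 0.9), (0.8, 0.2), (0.3, 0.8), (0.9, 0.1)]  # toy data
labels = [0, 1, 0, 1]
w, b = train_perceptron(samples, labels)

print(explicit_approval(50_000, 10_000))  # the criteria are inspectable
print(w, b)  # the learned parameters are numbers, not listable criteria
```

The explicit rule can be audited by reading it; the trained weights are the “invariants” of the learned model, and nothing in them states a criterion a user could contest directly.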

"It seems obvious that auditing algorithms does not make sense at all."

Once we understand how algorithms are designed and used, it seems obvious that auditing algorithms does not make sense at all. Indeed, it is impossible to provide users with the list of implicit criteria, because the invariants are much too abstract and largely unknown. Instead, it is more valuable and concrete to audit the best practices of developing, deploying and testing algorithms within an organization. When developing an algorithm, one should guarantee that the data chosen to calibrate or train it is statistically representative, and that relevant tests are run on the algorithm to identify any risk of underlying bias. One could also list the basic logic mechanisms of the algorithm as well as its purpose: what it is meant to achieve and what it is not. Providing users and states with the list of best practices for a given algorithm is key to helping them formulate relevant, well-framed questions or even possible solutions to algorithmic issues. Auditing those best practices is also important to reveal any abuse of data usage, thereby preventing scandals and minimizing the reputational risk for the company that develops those algorithms.
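Two of those best practices can be made concrete in code. The sketch below is a hypothetical illustration - the group names, data and reference shares are all fabricated - of checking whether a training set is statistically representative of a reference population, and of comparing approval rates across groups to flag a possible bias.

```python
from collections import Counter

def representativeness_gap(training_groups, population_shares):
    """Largest absolute gap between training-set and population shares."""
    counts = Counter(training_groups)
    total = len(training_groups)
    return max(abs(counts[g] / total - share)
               for g, share in population_shares.items())

def approval_rate_gap(groups, decisions):
    """Difference between the highest and lowest per-group approval rate."""
    rates = {}
    for g in set(groups):
        picks = [d for grp, d in zip(groups, decisions) if grp == g]
        rates[g] = sum(picks) / len(picks)
    return max(rates.values()) - min(rates.values())

training_groups = ["A"] * 70 + ["B"] * 30   # fabricated training sample
population = {"A": 0.5, "B": 0.5}           # assumed reference shares
print(representativeness_gap(training_groups, population))
# ≈ 0.2: the training set over-represents group A

decisions = [1] * 60 + [0] * 10 + [1] * 9 + [0] * 21  # fabricated outputs
print(approval_rate_gap(training_groups, decisions))
# a large gap between groups is a signal to investigate for bias
```

Neither check requires opening the model itself; both audit the practices around it, which is exactly what remains feasible when the algorithm is implicit.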


You claim that everyone should be exposed to coding. Without anyone needing to become a software developer, you are convinced that it is a powerful way to capture the feel of this discipline, to understand the process of categorization, and to see what influences the choice of an algorithm. What public policy measures would you advise implementing toward that goal?

Indeed, being exposed to coding is a powerful way to understand the basic concepts of data, algorithms, data erasure, data analysis, data mirror copies, memory… It is also a way to develop greater self-confidence towards technologies and to be in the driving seat from the outset, by becoming a strong collaborator of the tech companies. As I often say, users are the best enemies of tech companies, as they can challenge and disturb them in order to improve the technologies. But to do so, people need to understand how technologies are built and how they work.

"Users are the best enemies of tech companies as they can challenge and disturb them in order to improve the technologies. But to do so, people need to understand how technologies are built and how they work."

In practice, this can be achieved in schools, or in companies in the context of continuous learning. Methods have been created to teach algorithms and coding to kids without using computers, such as those proposed by the company COLORI, which were designed for 3-6 year olds. Coding classes are offered in elementary and middle schools in many countries all over the world. In France, mandatory and optional computer science classes are now provided to high school students. This is a big deal, as France is one of the few countries in the world to offer such classes country-wide. With regard to adults and professionals, companies should provide their employees with coding and algorithm training to develop their analytical mindset and to enable them to become power users of technologies. Even though many employees will never write a single line of code in their career, they will need to work closely with tech people as companies become more and more tech-focused amid the current artificial intelligence transformation. As I often say, coding and algorithms are the next language to learn!


We often hear about regulation when it comes to platforms and dominant tech players. In your view, what is the user’s role in this equilibrium? Are there responsibilities that users should bear in mind when using such services?

I believe that we need regulations to frame the development and usage of such technologies. That being said, those regulations need to be crafted smartly, to accelerate innovation as well as to concretely protect users. To reach that goal, they should be written not only by legislators but in close collaboration with scientists and developers of technologies working in the industry. Also, those regulations should be framed not only around the obligations of tech actors, but also around the duties and roles of users. Indeed, having responsibilities helps users be free, as they understand their leverage and role in the ecosystem. Users should also be more proactive and understand the main mechanisms of technologies, to better protect and defend themselves.
