
Gender bias in the age of automation

Published: Sunday, 21 May, 2023 at 12:00 AM


In today's fast-moving world, Artificial Intelligence (AI), Machine Learning (ML) and automation are among the most hyped terms. AI broadly refers to systems that make decisions or predictions in ways that mimic human judgement; ML is the branch of AI in which systems learn that behaviour from data; and automation refers to completing a predefined set of tasks without human intervention.

The basic procedure for building ML models should be understood not only by technical people but by everyone, so that the capacity and impacts of these models can be appreciated. ML models are built by training them on data: the training data is the seed, and the model is the tree that grows from that seed.
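The seed-and-tree point can be made concrete with a toy sketch (the data and function names here are invented for illustration): a model knows only what its training data contained, so a different seed yields a different tree.

```python
# Toy illustration: the "model" grows entirely from its training data,
# just as a tree grows from a seed. A different seed gives a different tree.
from collections import Counter

def train(sentences):
    """Learn word frequencies from the training data (the 'seed')."""
    counts = Counter()
    for s in sentences:
        counts.update(s.lower().split())
    return counts  # the 'tree': everything it knows came from the seed

def predict_most_common(model):
    """The model can only reflect what its training data contained."""
    return model.most_common(1)[0][0]

model_a = train(["engineers build systems", "engineers write code"])
model_b = train(["nurses care for patients", "nurses help people"])
print(predict_most_common(model_a))  # prints "engineers"
print(predict_most_common(model_b))  # prints "nurses"
```

Neither model can say anything about words it never saw, which is exactly why biased training data produces a biased model.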

The seamless integration of AI, ML and automation into every sphere of human life is now plainly visible. From shopping to research, they are everywhere. These new technologies are meant to make human life easy, trouble-free and effortless to the greatest possible extent. But as they have soared, controversies around them have arisen too. Critics argue that AI reinforces gender biases.

We know about Siri, Alexa, Cortana and Google Assistant, the AI voice assistants of Apple, Amazon, Microsoft and Google respectively. The names of these assistants, especially Siri, Alexa and Cortana, are female names, and all of them ship with female-sounding default voices.

Yet AI does not need a gender to perform its functions. If an AI uses a physical avatar to carry out its activities, the avatar should be designed only so that interaction with humans becomes easier or more meaningful.

In fact, the gender presentation of AI is merely a user-interface choice, nothing more. UNESCO has pointed out that the above-mentioned AI assistants are designed with submissive personalities, which ultimately reinforces stereotypically feminine traits.

AI can alter a company's hiring process and, eventually, affect the whole company. In 2014-2015, such an incident took place at Amazon, where women were discriminated against during the resume screening process. Amazon had introduced an automated resume screening model, and unfortunately the model was trained on a biased dataset.

The model was built with a branch of AI called Natural Language Processing (NLP), and it learned to rely on linguistic signals that penalized women, so women's resumes were filtered out by the screening model.
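A hypothetical sketch can show the mechanism (this is not Amazon's actual system; the resumes, labels and scoring scheme below are all invented): a model that weights words by how often they appeared in past hires will absorb whatever bias that history contains.

```python
# Invented sketch of how a screening model trained on biased historical
# hiring data can come to penalize gendered words.
from collections import Counter

def train_screening_model(resumes, hired):
    """Weight each word by how often it appears in hired vs rejected resumes."""
    hired_words, rejected_words = Counter(), Counter()
    for text, was_hired in zip(resumes, hired):
        (hired_words if was_hired else rejected_words).update(text.lower().split())
    vocab = set(hired_words) | set(rejected_words)
    return {w: hired_words[w] - rejected_words[w] for w in vocab}

def score(model, resume):
    return sum(model.get(w, 0) for w in resume.lower().split())

# Historical data skewed against women (the biased 'seed'):
resumes = ["captain chess club", "women's chess club captain", "captain debate team"]
hired = [True, False, True]
model = train_screening_model(resumes, hired)

# "women's" now carries a negative weight purely because of the biased history,
# so an unrelated new resume containing it is scored down:
print(score(model, "women's coding club"))  # prints -1
```

The word "women's" says nothing about ability, yet the model penalizes it, which mirrors the linguistic-signal problem described above.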

As soon as Amazon officials discovered this, they discarded the biased model. AI is powerful enough to influence the opinions and behaviour not only of companies but also of individuals. As a result, AI is significantly shaping the world.

The Matilda Effect is a bias in which the achievements of women scientists go unacknowledged and are instead credited to their male colleagues. Steps should be taken so that societal bugs like the Matilda Effect cannot take possession of AI.


The gender disparity of AI is actually a reflection of people's thoughts and limitations. The norms and values held in society are transferred to new technologies through the training step of machine learning. It must be kept in mind that behind all these technical upgrades there are humans, and humans are not above flaws.

The world today is focused on development and progress, ultimately pursuing the Sustainable Development Goals and equality between men and women. Ethical AI aligns with human-centred values, so gender bias can have no place in it. The quest to establish ethical AI, and thereby resolve gender disparity, is essential if development is to be meaningful.

Diversity can be a weapon against gender bias in AI: diversity in the training data, and diversity among the people who label that data. Alongside these, verifying that AI models are not biased towards any demographic, by testing them across different demographic groups, can also be a way out. Lastly, the practice of building unbiased ML models should be encouraged wherever necessary.
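The demographic check mentioned above can be sketched as a simple comparison of selection rates across groups, one common fairness measure known as the demographic parity difference. The groups, decisions and tolerance below are illustrative assumptions, not a standard.

```python
# Minimal sketch of one bias check: comparing a model's selection rates
# across demographic groups (demographic parity difference).

def selection_rate(decisions):
    """Fraction of people in a group the model selected (1) vs rejected (0)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_by_group):
    """Largest difference in selection rate between any two groups."""
    rates = {g: selection_rate(d) for g, d in decisions_by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

# Illustrative model decisions for two demographic groups:
outcomes = {
    "group_a": [1, 1, 0, 1, 0, 1],  # 4 of 6 selected
    "group_b": [1, 0, 0, 0, 1, 0],  # 2 of 6 selected
}
gap, rates = demographic_parity_gap(outcomes)
print(rates)
if gap > 0.1:  # an illustrative tolerance, chosen for this sketch
    print(f"Warning: selection-rate gap of {gap:.2f} across groups")
```

A large gap does not prove discrimination by itself, but it flags a model that deserves a closer audit before deployment.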

To equip women for today's advancements and to eradicate gender bias from root to leaves, from society to ML models, the importance of STEM (Science, Technology, Engineering, and Mathematics) education for women cannot be overstated. Ensuring gender-equal rewards and equal opportunities contributes no less to solving these biases. The intersection of modern technology with gender is not a newly coined concern, but settling it ethically is now a top priority.

The writer is a research associate at the Bangladesh Institute of Governance and Management (BIGM).

Editor : Iqbal Sobhan Chowdhury
Published by the Editor on behalf of the Observer Ltd. from Globe Printers, 24/A, New Eskaton Road, Ramna, Dhaka.