ML Bringing Diversity to STEM

Even though progress has been made over the past decades, gender and racial disparities persist in STEM (science, technology, engineering, and math) fields.

A 2021 Pew Research study found that only 9 percent and 8 percent of STEM jobs are held by Black and Hispanic workers, respectively. And while the study found that women hold 50 percent of all STEM jobs (including health-related jobs), the percentages are far lower for jobs in physical sciences (40 percent), computing (25 percent), and engineering (15 percent).

Could machine learning help researchers better understand the factors that contribute to those disparities? Or are machine-learning tools partly to blame for the gender and racial discrepancies in STEM? Haewon Jeong, a postdoctoral fellow in the lab of Flavio Calmon, Assistant Professor of Electrical Engineering at the Harvard John A. Paulson School of Engineering and Applied Sciences, is embarking on a research study to explore both questions.

Jeong received a 2021 Harvard Data Science Initiative Postdoctoral Research Fund award for the project. She will collaborate with Nilanjana Dasgupta, Professor in the Department of Psychological and Brain Sciences at the University of Massachusetts Amherst, who studies implicit bias and stereotypes in STEM education, and Muriel Médard, Cecil H. and Ida Green Professor in the Electrical Engineering and Computer Science Department at MIT.

“We will use machine learning tools to find out what factors feed into implicit bias about science and math in middle school students,” Jeong said. “Because machine learning can pick up subtle patterns that humans often miss, we may be able to use it to detect students who are likely to fall prey to implicit bias and may be at risk for giving up on more advanced math classes in school.”

Jeong and her collaborators will rely on a dataset Dasgupta gathered from a five-year longitudinal field study at 10 U.S. middle schools. Middle school is a critical time to focus on STEM education, Jeong said, since students are developing stronger critical-thinking and problem-solving skills while also beginning to consider future career paths.

That survey data was collected from 3,000 students in grades 7 to 9 and their parents. It gathered demographic and socioeconomic information and included questions about students’ self-confidence in math and science classes. The surveys also captured students’ scores on implicit association tests (IATs), which measure subconscious attitudes and beliefs by timing how quickly individuals pair concepts (such as male and female) with attributes (such as logical).
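To make that measurement concrete, here is a minimal sketch of how an IAT-style score can be computed from response latencies. The simplified scoring formula, block labels, and example response times are illustrative assumptions, not details reported from Dasgupta’s study.

```python
import numpy as np

def iat_d_score(compatible_ms, incompatible_ms):
    """Simplified IAT-style score: the difference in mean response latency
    between the incompatible and compatible pairing blocks, divided by the
    pooled standard deviation of all latencies. Larger positive scores
    indicate a stronger implicit association for the 'compatible' pairing."""
    compatible = np.asarray(compatible_ms, dtype=float)
    incompatible = np.asarray(incompatible_ms, dtype=float)
    pooled_sd = np.std(np.concatenate([compatible, incompatible]), ddof=1)
    return (incompatible.mean() - compatible.mean()) / pooled_sd

# Hypothetical response times (milliseconds) for one student
compatible_block = [620, 580, 640, 600, 590]     # e.g., "male + logical" pairings
incompatible_block = [720, 760, 700, 740, 710]   # e.g., "female + logical" pairings
print(f"D-score: {iat_d_score(compatible_block, incompatible_block):.2f}")
```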

The survey also asked parents for their perceptions of how proficient their children are at STEM subjects, how hard they work in STEM classes, and how much they enjoy them.

By building machine-learning models to analyze that dataset, Jeong hopes to reveal factors that contribute to students’ biases about STEM education and careers.
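The article does not say which models Jeong will use, but one common way to surface contributing factors is to fit a classifier to the survey features and inspect which ones it relies on most. The sketch below is a rough illustration of that idea: the data file, column names, and bias cutoff are hypothetical, not part of the project.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical survey table; the file and column names are illustrative, not the study's schema
df = pd.read_csv("middle_school_survey.csv")
features = ["grade", "math_confidence", "science_confidence",
            "parent_perceived_proficiency", "parent_perceived_enjoyment"]
X = df[features]
y = df["iat_d_score"] > 0.35   # treat scores above an illustrative cutoff as "biased"

model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)

# Rank the survey factors by how heavily the model leans on them
for name, importance in sorted(zip(features, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name:32s} {importance:.3f}")
```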

While the algorithms may provide valuable insights, the risks the technology poses when applied to testing, grading, and class placement inspired Jeong to study the downsides of machine learning.

“Machine learning can be a double-edged sword,” she said. “If you just use machine learning without care, you can induce more bias.”

Jeong will use different tests to measure the fairness of machine-learning models used in educational settings. One method involves verifying individual fairness: whether two similar individuals are treated similarly by the algorithm.
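As a rough illustration, an individual-fairness check can be written as a scan over pairs of students whose feature vectors are nearly identical, flagging pairs that nonetheless receive very different predictions. The distance measure and thresholds below are illustrative assumptions, not Jeong’s actual tests.

```python
import numpy as np

def individual_fairness_violations(model, X, similarity_threshold=0.1, outcome_gap=0.2):
    """Flag pairs of students whose feature vectors are close ('similar
    individuals') but whose predicted probabilities of advanced-class
    placement differ by more than `outcome_gap`."""
    X = np.asarray(X, dtype=float)
    probs = model.predict_proba(X)[:, 1]   # probability of advanced-class placement
    violations = []
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            if (np.linalg.norm(X[i] - X[j]) < similarity_threshold
                    and abs(probs[i] - probs[j]) > outcome_gap):
                violations.append((i, j, probs[i], probs[j]))
    return violations
```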

Another involves studying group fairness. When machine learning is used for class placement, for instance, male and female students should be placed into an advanced class at similar rates.
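That criterion is often called demographic parity, and a minimal sketch of it fits in a few lines; the placement outcomes and group labels below are hypothetical.

```python
import numpy as np

def demographic_parity_gap(placements, groups):
    """Difference in advanced-class placement rates between two groups
    (here, female vs. male students). A gap near zero satisfies this
    notion of group fairness."""
    placements = np.asarray(placements)   # 1 = placed in the advanced class, 0 = not
    groups = np.asarray(groups)
    rate_f = placements[groups == "F"].mean()
    rate_m = placements[groups == "M"].mean()
    return abs(rate_f - rate_m)

# Hypothetical placements for eight students
print(demographic_parity_gap([1, 0, 1, 1, 0, 1, 1, 1],
                             ["F", "F", "F", "F", "M", "M", "M", "M"]))
```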

The datasets themselves will likely pose the biggest challenges to this project, she said.

“Data can be noisy. Especially with survey data, we don’t know if each student responded to the survey honestly,” she said. “If the data are too noisy, that could lead to a lot of missing pieces. If that’s the case, can we really find something statistically significant from this data and, if not, can we find other sources of similar data to validate our hypothesis?”
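One way to picture the trade-off she describes is to compare dropping incomplete survey rows against filling in the gaps. The tiny table and median imputation below are a minimal sketch under assumed column names, not the project’s actual data handling.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

# Hypothetical survey responses with gaps where students skipped questions
responses = pd.DataFrame({
    "math_confidence":    [4.0, np.nan, 3.0, 5.0, np.nan],
    "science_confidence": [3.0, 2.0, np.nan, 4.0, 4.0],
})

print(f"Rows with any missing answer: {responses.isna().any(axis=1).mean():.0%}")

# Option 1: drop incomplete rows (risks losing too much data if noise is high)
complete_only = responses.dropna()

# Option 2: fill gaps with the per-question median so the sample size is preserved
imputed = pd.DataFrame(SimpleImputer(strategy="median").fit_transform(responses),
                       columns=responses.columns)

print(f"Kept {len(complete_only)} of {len(responses)} rows after dropping; "
      f"{len(imputed)} after imputing")
```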

Ultimately, Jeong is hoping the project will inspire others to become interested in this area of research and dig deeper into the factors that lead to disparities in STEM education. A future research study could use AI to analyze audio or video transcripts from students, which could paint a clearer picture of the factors that contribute to implicit biases, she said.

And as she prepares to dive into this complex project, Jeong draws on her own experiences as a STEM student to tackle such a thorny problem. During her undergraduate studies in Korea, all 90 of the electrical engineering professors in her university department were men.

“The theme of this project, increasing diversity in STEM fields, really resonates with me. This project marks the first time in my entire academic career that I have had female mentors,” she said. “So I really want to know how we can encourage more women and students of color to pursue STEM education and careers.”

This research is also supported by the National Science Foundation.
