As far as we can tell, the answer's a tepid "sure, a little." We're seeing small changes filter through in the form of corporate and academic commitments, but continuing studies demonstrate there's a lot of work left to be done when it comes to actual recruitment and equal-opportunity treatment.

While this is probably true throughout most traditional industries, in the STEM fields it's especially interesting to view the issue of diversity through the lens of data science and data scientists.

I interviewed Radhika Krishnan, Chief Product Officer at Hitachi Vantara, to find out what was at the heart of the issue of diversity in 2022. Krishnan told me that, if we want to solve the problem, we first need to make sure we're taking steps to understand it:

"Measuring how much diversity you have in the data sciences goes a long way, but we need to be careful about what we say constitutes the 'data science' realm. We can't just assume we are talking about data engineers and data architects. It also pertains to the domain experts who are collecting the data and mining the insights. There must be diversity there as well."

But there's more than one problem when it comes to issues of diversity in data science. There's the science problem, and the scientist problem. As Krishnan explained, bias can creep into data or systems through either of these avenues:

"As straightforward as it sounds, a lot of this comes down to customer centricity. If you know your customer base, that's key. Your customer base is rarely all one type of people. It doesn't matter what industry you're in, you need to have diversity in the organization that's catering to the customers, because the customer base itself is diverse."

Yet it takes more than identifying and measuring problems to solve them. Arguably, nobody should be unaware of STEM's diversity issues in 2022. Unfortunately, developers can't just push a magic button labeled "fix bias" to solve the problem.

Bias enters most technology systems unintentionally. The developers aren't necessarily trying to make machines or algorithms that work better for certain demographics than others. That's why you'll often hear bias referred to as something that "creeps" into systems.

Luckily, it's 2022 and we're starting to see some solutions to these problems creep onto the scene themselves. Chief among these is transparency and explainability. As Krishnan transparently explained to me:

"Such transparency aids in complying with regulatory requirements, but also in providing context to leadership on everything from decision-making to operational results. It also helps root out bias in the algorithms. I would say this is where we go back to thinking about the customer. Making sure your data is representative of the customer base you're going after is going to be important."

At the end of the day, however, solving the problem of diversity in STEM is no longer just an issue of supporting a diverse workforce. It's also a matter of supporting clients, partners, customers, and users in the global technology space.

But it still takes effort to get there. As the above-mentioned studies all conclude, STEM still gravitates toward an imbalance of straight, white, cis men, despite global efforts to balance the field. In other words, there's still work to do.
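Krishnan's point about keeping your data representative of the customer base can be made concrete with a simple check. Below is a minimal sketch, in Python, of what that might look like, assuming you have demographic labels attached to your training records and a rough breakdown of the customer base you serve; the group names, counts, and the 5% tolerance are illustrative assumptions, not figures from Hitachi Vantara or the interview.

```python
# Minimal sketch: compare the demographic mix of a training dataset
# against a reference customer-base distribution. Group labels, counts,
# and the 5% tolerance are hypothetical, for illustration only.

from collections import Counter


def representation_gaps(sample_groups, reference_shares):
    """Return each group's share in the sample minus its expected share."""
    counts = Counter(sample_groups)
    total = sum(counts.values())
    return {
        group: counts.get(group, 0) / total - expected
        for group, expected in reference_shares.items()
    }


if __name__ == "__main__":
    # Hypothetical demographic labels attached to training records.
    sample = ["group_a"] * 700 + ["group_b"] * 250 + ["group_c"] * 50

    # Hypothetical make-up of the customer base being served.
    reference = {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15}

    for group, gap in representation_gaps(sample, reference).items():
        flag = "UNDERREPRESENTED" if gap < -0.05 else "ok"
        print(f"{group}: {gap:+.2%} vs. customer base ({flag})")
```

In practice you would plug in real labels and real shares, but even a rough check like this is one concrete way to ask Krishnan's question of a dataset before it trains anything.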
According to Krishnan, closing that gap means taking the burden off those seeking better treatment by continuing to implement changes at the organizational level through policy:

"One other important aspect is mentorship programs for those in underserved demographics. The importance of having a role model can't be overstated."