Strategies Learned at the DISC Unconference for Measuring Diversity in the Julia Community

This November, NumFOCUS hosted the Diversity and Inclusion in Scientific Computing (DISC) Unconference alongside the 2017 NYC PyData Conference.

The unconference was a fantastic learning opportunity and a great place to meet individuals whose interests in diversity align well with ours at Julia Computing. Its attendees proved to be a valuable source of ideas for our upcoming diversity initiatives, since many of them have already encountered and considered some of the challenges we face.


Unconference participants sort their ideas to choose topics to cover during the event.

One topic covered at the unconference that was particularly helpful to me was how to quantify diversity with metrics. Julia Computing and the broader Julia community want to make the Julia user community more diverse, but there are many ways we might attempt to do this. How can we determine what works and what does not? How can we know if our efforts are paying off?

One way to evaluate our success might be to use the results of surveys given to attendees at Julia events to measure how diverse our audience is and how diversity within our audiences changes over time. Prior to the unconference, I had not thought about the trade-off between collecting personal information for diversity statistics accurately and collecting it responsibly. Mandatory self-reporting of, for example, gender, race, and age may allow conference and workshop organizers to generate more representative statistics than optional self-reporting would, but requiring attendees to provide this information may make them uncomfortable. Furthermore, asking an attendee to identify with one of a set of pre-defined categories may leave some attendees feeling excluded.

DISC Unconference participants discussed strategies for quantifying diversity responsibly. For example, event organizers may give attendees a mandatory diversity survey in which each question offers the option to decline to respond. By requiring survey participation without requiring self-reporting, this approach can increase self-reporting among those who are comfortable sharing how they identify but who might not have taken the time to do so in an optional survey. Inclusive wording might, for example, simply ask attendees whether they identify as under-represented in their field because of their gender, rather than asking them to specify their gender. Moreover, surveys that request personal information should tell attendees why it is being collected and what will be done with it. Thanks to these discussions, I feel more prepared to measure the effectiveness of our diversity efforts at Julia Computing.
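As a rough sketch of how results from such a survey might be tallied, suppose (hypothetically; this example is not from the unconference discussion) that every attendee takes the survey and the under-representation question offers the answers "yes", "no", and "decline". The Julia snippet below counts each answer and reports declined responses as their own category rather than discarding them:

```julia
# Hypothetical survey responses: each entry is one attendee's answer to
# "Do you identify as under-represented in your field?"
# Possible answers: "yes", "no", or "decline" (prefer not to say).
responses = ["yes", "no", "decline", "yes", "no", "no", "decline", "yes"]

# Count each answer, keeping declined responses as their own category
# rather than dropping them from the totals.
counts = Dict{String,Int}()
for r in responses
    counts[r] = get(counts, r, 0) + 1
end

# Report each answer as a count and a percentage of all respondents.
total = length(responses)
for (answer, n) in counts
    println(answer, ": ", n, " (", round(100n / total; digits=1), "%)")
end
```

Reporting the share of declined responses alongside the others makes clear how much of the audience a given statistic actually describes, which matters when comparing the same survey across events over time.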

Jane Herriman is Director of Diversity and Outreach at Julia Computing and a PhD candidate in applied physics and materials science at Caltech.