
Applications for the CoAct Open Calls on Gender Equality (July 1st – September 30th, 2021) are open!

- July 28, 2021 in Open Knowledge

CoAct is launching a call for proposals, inviting civil society initiatives to apply for our cascading grants of up to €20,000 to conduct Citizen Social Science research on the topic of Gender Equality. A maximum of four (4) applicants will be selected across three (3) different open calls. Applications from a broad range of backgrounds are welcome, including feminist, LGBTQ+, non-binary and critical masculinity perspectives. Eligible organisations can apply until September 30th, 2021, 11:59 PM GMT. All information for submitting applications is available here: https://coactproject.eu/opencalls/

If selected, CoAct will support your work by
  • providing funding for your project (10 months max), alongside dedicated activities, resources and tools
  • providing a research mentoring program for your team. In collaborative workshops you will be supported to co-design and explore available tools, working together with the CoAct team to achieve your goals.
  • connecting you to a community of people and initiatives, tackling similar challenges and contributing to common aims. You will have the opportunity to discuss your projects with the other grantees and, moreover, will be invited to join CoAct’s broader Citizen Social Science network.
You should apply if you:
  • are an ongoing Citizen Social Science project looking for support, financial and otherwise, to grow and become sustainable;
  • are a community interested in co-designing research to generate new knowledge about gender equality topics, broadly defined;
  • are a not-for-profit organization focusing on community building, increasing the visibility of specific communities, or increasing civic participation, and are interested in exploring the use of Citizen Social Science in your work.
Read more about the Open Calls here: https://coactproject.eu/opencalls/

OK Justice Programme secures definitive guidance on the use of algorithms in online exams. Our first win in the fight to ensure that Public Impact Algorithms do no harm!

- July 6, 2021 in Open Knowledge

An independent inquiry has adopted nearly all of our recommendations in our first challenge to the misuse of Public Impact Algorithms. Strong guidance has been given to the UK’s Bar Standards Board on the use of “remote proctoring software”, which should now inform others’ use of this technology.

Photo by Jon Tyson on Unsplash

About The Justice Programme

The Justice Programme is a project of the Open Knowledge Foundation, which works to ensure that public impact algorithms do no harm. Find out more about The Justice Programme here, and Public Impact Algorithms here.

The story so far

During the Covid pandemic, many educational institutions started using remote proctoring software to monitor students during their exams – i.e. watching students through their webcams, in combination with facial recognition and behavioral recognition technology. Remote proctoring software invades students’ privacy, and runs a serious risk of replicating discrimination through the use of opaque algorithmic systems. Read more about it here.

In the UK, the Bar Standards Board (BSB) contracted with Pearson VUE to provide such software for the vocational exams for barristers in 2020. The BSB justified the use of remote proctoring software on the grounds that it was necessary to ensure the ‘integrity’ of the exams. With funding from the Digital Freedom Fund, we notified the BSB that we intended to bring legal action, due to concerns that the procedural protections against the use of opaque systems – namely a data protection impact assessment and an equality impact assessment – had not been properly conducted, if at all.

The Independent Inquiry

In response, the BSB announced that the use of remote proctoring software would be paused whilst an independent expert inquiry took place. The inquiry was run by Professor Huxley-Binns, an expert in the topic, working alongside Dr Sarabajaya Kumar, an expert in diversity and disability. The focus of the inquiry was to find out what happened, why it happened, who was responsible, and what can be done to prevent it from happening again.

Our submissions to the inquiry

The Justice Programme litigation team gave lengthy evidence to the inquiry, culminating in a set of recommendations. When the inquiry’s findings were published, seven of our eight recommendations had been adopted by Professor Huxley-Binns. This is a big achievement! The BSB has agreed to adopt these recommendations in the form of an action plan. The report and action plan should act as a safeguard, in that future students experiencing problems with remote proctoring software can use the recommendations to hold the BSB to account. At a time when so many new algorithmic decision-making systems are still unregulated, safeguards such as guidelines and recommendations are extremely important. Our recommendations included:
  • putting the voices, needs and experiences of students at the centre of any future procurement and/or deployment of exam solutions based on emerging technologies.
  • consolidating and simplifying the data protection framework, with clear data protection standards for all course and exam providers and the timely use of data protection impact assessments to identify and mitigate the risks before contracts are put in place
  • ensuring open access to the type of technology being used, and in all cases ensuring it is transparent and explained to the end user.
What’s Next?

Use of remote proctoring technology is expanding fast, and this case is only a drop in the ocean. We need to raise awareness of the potential harms of these opaque technologies and challenge further misuse. Here at The Justice Programme we are already researching the use of remote proctoring in migrant language testing in the UK, as well as in student doctors’ vocational exams, where harms have been widely reported. Stay tuned for further news, here and on the OKFJP Twitter channel.

Open Knowledge Justice Programme challenges the use of algorithmic proctoring apps

- February 26, 2021 in Open Knowledge

Today we’re pleased to share more details of the Justice Programme’s new strategic litigation project: challenging the (mis)use of remote proctoring software.

What is remote proctoring?

Proctoring software uses a variety of techniques to ‘watch’ students as they take exams. These exam-invigilating software products claim to detect, and therefore prevent, cheating. Whether or not the software can actually do what it claims, there is concern that it breaches privacy, data and equality rights, and that the negative impacts of its use on students are significant and serious.

Case study: Bar Exams in the UK

In the UK, barristers are lawyers who specialise in courtroom advocacy. The Bar Professional Training Course (BPTC) is run by the professional regulatory body, the Bar Standards Board (BSB). In August 2020, because of COVID-19, the BPTC exams took place remotely, using a proctoring app from US company Pearson VUE. Students taking exams had to allow their room to be scanned and an unknown, unseen exam invigilator to surveil them. Students had to submit a scan of their face to verify their identity, and were prohibited from leaving their seat for the duration of the exam. That meant up to 5 hours (!) without a toilet break. Some students had to relieve themselves in bottles and buckets under their desks whilst maintaining ‘eye contact’ with their faceless invigilator. Muslim women were forced to remove their hijabs, and at least one individual withdrew from sitting the exam rather than, as they felt it, compromise their faith. The software had numerous functionality errors, including suddenly freezing without warning and deleting text. One third of students were unable to complete their exam due to technical errors.

Our response

The student reports, alongside our insight into the potential harms caused by public impact algorithms, prompted us to take action. We were of the opinion that what students were subjected to breached data, privacy and other legal rights, as follows:

Data Protection and Privacy Rights
  • Unfair and opaque algorithms. The software used algorithmic decision-making in relation to the facial recognition and/or matching identification of students and behavioural analysis during the exams. The working of these algorithms was unknown and undisclosed.
  • The app’s privacy notices were inadequate. There was insufficient protection of the students’ personal data. For example, students were expressly required to confirm that they had ‘no right to privacy at your current location during the exam testing session’ and to ‘explicitly waive any and all claims asserting a right to individual privacy or other similar claims’. Students were asked to consent to these questions just moments before starting an extremely important exam and without being warned ahead of time.
  • The intrusion involved was disproportionate. The software required all students to carry out a ‘room scan’ (showing the remote proctor around their room). They were then surveilled by an unseen human proctor for the duration of the exam. Many students felt this was unsettling and intrusive.
  • Excessive data collection. The Pearson VUE privacy notice reserved a power of data collection of very broad classes of personal data, including biometric information, internet activity information (gleaned through cookies or otherwise), “inferences about preferences, characteristics, psychological trends, preferences, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes” and protected characteristics.
  • Inadequately limited purposes. Students were required to consent to Pearson VUE disclosing their personal data to third parties “in order to manage day to day business needs”, and to consent to the future use of “images of your IDs for the purpose of further developing, upgrading, and improving our applications and systems”.
  • Unlawful data retention. Pearson VUE’s privacy notice states in relation to data retention that “We will retain your Personal Data for as long as needed to provide our services and for such period of time as instructed by the test sponsor.”
  • Data security risks. Given the sensitivity of the data that was required from students in order to take the exam, high standards of data security are required. Pearson VUE gave no assurances regarding the use of encryption. Instead there was a disclaimer that “Information and Personal Data transmissions to this Site and emails sent to us may not be secure. Given the inherent operation and nature of the Internet, all Internet transmissions are done at the user’s own risk.”
  • Mandatory ‘opt-ins’. The consent sought from students was illusory, as it did not enable students to exert any control over the use of their personal data. If they did not tick all the boxes, they could not participate in the exam. Students could not give a valid consent to the invasion of privacy occasioned by online proctoring when their professional qualification depended on it. They were in effect coerced into surrendering their privacy rights. According to the GDPR, consent must be “freely given and not imposed as a condition of operation”.
Equality Rights

Public bodies in the UK have a legal duty to carefully consider the equalities impacts of the decisions they make. This means that a policy, project or scheme must not unlawfully discriminate against individuals on the basis of a ‘protected characteristic’: their race, religion or belief, disability, sex, gender reassignment, sexual orientation, age, marriage or civil partnership and/or pregnancy and maternity.

In our letter to the BSB, we said that the BSB had breached its equality rights duties by using software that featured facial recognition and/or matching processes, which are widely proven to discriminate against people with dark skin. The facial recognition process also required female students to remove their religious dress, thereby breaching the protections afforded to people to observe their religion. Female Muslim students were unable to request being observed by female proctors, despite the negative cultural significance of unknown male proctors viewing them in their homes. We also raised the fact that some people with disabilities, and women who were pregnant, were unfairly and excessively impacted by the absence of toilet breaks for the duration of the assessment. The use of novel and untested software, we said, had the potential to discriminate against older students with fewer IT skills.

The BSB’s Reply

After we wrote to express these concerns, the BSB:
  • stopped the planned use of remote proctoring apps for the next round of bar exams
  • announced an inquiry into its use of remote proctoring apps in August 2020, to produce an independent account of the facts, circumstances and reasons why things went wrong. The BSB invited us to make submissions to this inquiry, which we have done. You can read them here.

Next steps

Here at the Open Knowledge Justice Programme, we’re delighted that the BSB has paused the use of remote proctoring, and we keenly await the publication of the findings of the independent inquiry. However, we have recently been concerned to discover that the BSB has delegated decision-making authority for the use of remote proctoring apps to individual education providers – e.g. universities and law schools – and that many of these providers are scheduling exams using remote proctoring apps. We hope that the independent inquiry’s findings will conclusively determine that this must not continue.

Sign up to our mailing list or follow the Open Knowledge Justice Programme on Twitter to receive updates.

What is a public impact algorithm?

- February 4, 2021 in Open Knowledge

Meg Foulkes discusses public impact algorithms and why they matter.

When I look at the picture of the guy, I just see a big Black guy. I don’t see a resemblance. I don’t think he looks like me at all.

This is what Robert Williams said to police when he was presented with the evidence upon which he had been arrested for stealing watches in June 2020. Williams had been identified by an algorithm when Detroit Police ran grainy security footage from the theft through a facial recognition system. Police arrested Williams without questioning him or checking for any alibi. It was not until the matter came to trial that Detroit Police admitted that he had been charged falsely, and solely, on the output of an algorithm.

It’s correct to say that in many cases, when AI and algorithms go wrong, the impact is pretty innocuous – like when a music streaming service recommends music you don’t like. But often, AI and algorithms go wrong in ways that cause serious harm, as in the case of Robert Williams. Although he had done absolutely nothing wrong, he was deprived of a fundamental right – his liberty – on the basis of a computer output.

It’s not just on an individual scale that these harms are felt. Algorithms are written by humans, so they can reflect human biases. What algorithms can do is amplify this prejudice on a massive scale, by automatically entrenching the bias. The bias isn’t exclusively racialised: last year, an algorithm used to determine exam grades disproportionately downgraded disadvantaged students. Throughout the pandemic, universities have been turning to remote proctoring software that falsely identifies students with disabilities as cheats. For example, those who practice self-stimulatory behavior, or ‘stimming’, may get algorithmically flagged again and again for suspicious behavior, or have to disclose sensitive medical information to avoid this.

We identify these types of algorithms as ‘public impact algorithms’ to clearly name the intended target of our concern. There is a big difference between the harm caused by inaccurate music suggestions and algorithms that have the potential to deprive us of our fundamental rights. To call out these harms, we have to precisely define the problem. Only then can we hold the deployers of public impact algorithms to account, and ultimately achieve our mission of ensuring public impact algorithms do no harm.

Sign up to our mailing list or follow the Open Knowledge Justice Programme on Twitter to receive updates.

Open Knowledge Justice Programme takes new step on its mission to ensure algorithms cause no harm

- January 27, 2021 in Open Knowledge Foundation, Open Knowledge Justice Programme

Today we are proud to announce a new project for the Open Knowledge Justice Programme: strategic litigation. This might mean we will go to court to make sure public impact algorithms are used fairly and cause no harm. But it will also include advocacy in the form of letters and negotiation.

The story so far

Last year, Open Knowledge Foundation made a commitment to apply our skills and network to the increasingly important topics of artificial intelligence (AI) and algorithms. As a result, we launched the Open Knowledge Justice Programme in April 2020. Our mission is to ensure that public impact algorithms cause no harm. Public impact algorithms have four key features:
  • they involve automated decision-making
  • using AI and algorithms
  • by governments and corporate entities and
  • have the potential to cause serious negative impacts on individuals and communities.
We aim to make public impact algorithms more accountable by equipping legal professionals, including campaigners and activists, with the know-how and skills they need to challenge the effects of these technologies in their practice. We also work with those deploying public impact algorithms to raise awareness of the potential risks and to build strategies for mitigating them. We’ve had some great feedback from our first trainees!

Why are we doing this?

Strategic litigation is more than just winning an individual case. Strategic litigation is ‘strategic’ because it plays a part in a larger movement for change. It does this by raising awareness of the issue, changing public debate, collaborating with others fighting for the same cause and, when we win (hopefully!), making the law fairer for everyone.

Our strategic litigation activities will be grounded in the principle of openness, because public impact algorithms are overwhelmingly deployed opaquely. This means that the experts who would be able to unpick why and how AI and algorithms are causing harm cannot do so, and the technology escapes scrutiny. Vendors of the software say they can’t release the software code they use because it’s a trade secret. This proprietary knowledge, although used to justify decisions potentially significantly impacting people’s lives, remains out of our reach.

We’re not expecting all algorithms to be open. Nor do we think that would necessarily be useful. But we do think it’s wrong that governments can purchase software and not be transparent about key points of accountability, such as its objectives, an assessment of the risk it will cause harm, and its accuracy.

Openness is one of our guiding principles in how we’ll work, too. As far as we are able, we’ll share our cases for others to use, re-use and modify for their own legal actions, wherever they are in the world. We’ll share what works, and what doesn’t, and make learning resources to make achieving algorithmic justice through legal action more readily achievable.

We’re excited to announce our first case soon, so stay tuned! Sign up to our mailing list or follow the Open Knowledge Justice Programme on Twitter to receive updates.

Launching the Open Knowledge Justice Programme

- April 14, 2020 in Featured, Open Knowledge Foundation, Open Knowledge Justice Programme

Supporting legal professionals in the fight for algorithmic accountability

Last month, Open Knowledge Foundation made a commitment to apply our unique skills and network to the emerging issues of AI and algorithms. We can now provide you with more details about the work we are planning to support legal professionals (barristers, solicitors, judges, legal activists and campaigners) in the fight for algorithmic accountability.

Algorithmic accountability has become a key issue of concern over the past decade, following the emergence and spread of technologies embedding mass surveillance, biased processes or racist outcomes into public policies, public service delivery or commercial products. Despite a growing and diverse community of researchers and activists discussing and publishing on the topic, legal professionals across the world have access to very few resources to equip themselves in understanding algorithms and artificial intelligence, let alone enforcing accountability.

In order to fill this gap, we are pleased to announce today the launch of the Open Knowledge Justice Programme. The exact shape of the programme will evolve in response to feedback from the legal community as well as contributions from domain experts, but our initial roadmap includes a mix of interventions across our open algorithm action framework – spanning shared definitions, standard resources and literacy – as outlined below:
Accountability
  • Contribution to the public debate through participation in conferences, seminars and outreach to experts
  • Building a global community of legal professionals and civic organisations to build a common understanding of the issues and needs for action raised by algorithms and AI from a legal perspective
  • Participation in the elaboration of the European Union’s AI policy
  • Contribution to current UK working groups around algorithms, AI and data governance
  • Participation in other national and international public policy debates to embed accountability in upcoming regulations, in collaboration with our partners
  • Developing open learning content and guides on existing and potential legal analysis of algorithms and AI in the context of judicial review or other legal challenges

Monitoring
  • Mapping of relevant legislation, case law and ethics guidelines with the help of the community of experts
  • Delivering trainings for legal professionals on algorithm impact investigation and monitoring

Improvement
  • Curation, diffusion and improvement of existing algorithm assessment checklists, such as the EU checklist
  • Training and supporting public administration lawyers on algorithmic risk
How these plans came about

These actions build on our past experience developing the open data movement. But we’ve also spent the last six months consulting with legal professionals across the UK. Our key finding is that algorithms are becoming part of legal practice, yet few resources exist for legal professionals to grapple with the issues that they raise. This is due in part to the lack of a clear legal framework, but mainly because the spread of algorithm-driven services, either public or private, has accelerated much faster than the public debate and public policies have matured.

What is an algorithm? What is the difference between algorithms and artificial intelligence? Which laws govern their use in the police force, in public benefit allocation, in banking? Which algorithms should legal professionals be on the lookout for? What kind of experts can help legal professionals investigate algorithms, and what kind of questions should be asked of them? All these questions, although some are seemingly basic, are what lawyers, including judges, are currently grappling with. The Open Knowledge Justice Programme will answer them.

Stay tuned for more on the topic! For comments, contributions or if you want to collaborate with us, you can email us at contact@okfn.org

Introducing the 2018 Class of School of Data Fellows!

- June 27, 2018 in School of Data, School of Data Fellows

This blog has been reposted from the School of Data blog.

School of Data is delighted to announce its sixth class of fellows. From June 2018 until January 2019, the programme will allow fellows to deepen their data literacy skills and work alongside local partner organisations to enhance the data literacy network local to them. We were really pleased to receive a large number of applications and would like to both congratulate and wish all our new fellows the very best for their fellowship!

Pamela Gonzales is passionate about data visualization and bridging the digital divide for women. She is the co-founder of Bolivia Tech Hub, a collaborative space for tech projects that contributes to the prosperity of an innovative ecosystem in Bolivia. Pamela is also the Regional Ambassador for Technovation, a San Francisco-based program that equips girls with the skills needed to solve real-world problems through technology. She holds a Bachelor of Science degree in Computer Science from Universidad Mayor de San Andres.

Odanga Madung is the co-founder and Data Science Lead at Odipo Dev, a data science and analytics firm operating out of Nairobi, Kenya, that delivers services to various blue-chip companies and NGOs across the country. Odanga’s deepest interest is the intersection between data and culture, and it is through this that Odipo Dev has been able to carry out data analysis and visualisation on various activities for a wide range of clients and occurrences in Kenya and the world. Some of his work has been featured in publications such as Adweek, Yahoo, BBC, CNBC, Quartz, and Daily Nation, to mention a few. He will be working on Open Contracting in Kenya during the period of his fellowship. You can follow him on Twitter @Odangaring and Odipo Dev @OdipoDev for more information.

Nzumi Malendeja is a Research Associate at the Independent Evaluation and Research Cell of BRAC International in Tanzania, where he leads larger-scale research projects in education, agriculture, and health. Here, he has developed mobile-based data collection platforms (ODK Collect and SurveyCTO), which replaced the traditional paper-based methods. Before this, Mr. Nzumi worked as a Field Monitor and Research Assistant at SoChaGlobal and Maarifa ni Ufunguo respectively, both in education and construction sector transparency projects. Mr. Nzumi attended a four-week Summer School Training on Research Methods and Teaching Skills, hosted by Hamburg University of Applied Sciences in Germany and funded by the German Academic Exchange Service (DAAD). Presently, Mr. Nzumi is working on his thesis towards the fulfillment of a Master of Research and Public Policy at the University of Dar es Salaam.

Sofia Montenegro is a fan of nature and the teachings it hides, and has dedicated herself to research in the social sciences. She studied Political Science at the Universidad Francisco Marroquin and Public Opinion and Political Behavior through a Masters degree at the University of Essex, where she deepened her interest in data methodologies in social research. Sofia is interested in academia only as long as it drives political action. She looks to help other women to be involved freely in data practice and political spaces. Sofia is also interested in network analysis, studying corruption as a social phenomenon, following electoral processes and learning research methods.
Elias Mwakilama is a lecturer at the University of Malawi-Chancellor College, where he coordinates the Research, Seminar and Consultancies and Diploma in Statistics programmes in the Mathematical Sciences Department. A computational and applied mathematician in the field of operations research, he lectures and supervises undergraduate students in Mathematics and Statistics. His research interests lie in optimisation models that use mathematical statistics techniques, integrated with computing skills, to offer solutions to industry-related problems in both theoretical and practical arenas. Elias holds a first upper class MSc degree in Mathematical Sciences from the University of Malawi. His website is here. During his fellowship, he hopes to support the “public procurement open contract platform” for Civil Society Organisations (CSOs) in Malawi with Hivos.

Ben Hur Pintor is an open-source and open-data advocate from the Philippines who believes in democratising not only data, but also the means of utilising and analysing data. He’s a geospatial generalist and software developer who’s worked on projects related to renewable energy, blue carbon ecosystems, and participatory disaster risk mapping and assessment. Ben is currently pursuing an MS Geomatics Engineering degree at the University of the Philippines. As part of his advocacy for Free and Open Source Software (FOSS), he’s a co-organiser and active participant of FOSS4G Philippines and MaptimeDiliman – avenues for sharing open source mapping technologies with the community.

Hani Rosidaini is passionate about how technology can be adopted and applied for people’s needs. She combines her technical skills, especially in information systems and data science, with social and business knowledge to help companies and organisations in Indonesia, Australia, and Japan, including her own ventures. Highly relevant to this year’s fellowship focus on data procurement, Hani has experience as a data specialist for public policy in the Indonesia Presidential Office, where she analysed the national integrated data platform, data.go.id, contributed to data-driven policy making, advised ministries and agencies, and engaged with civic and local communities.

Introducing the 2018 Class of School of Data Fellows!

- June 22, 2018 in announcement, fellowship

School of Data is delighted to announce its sixth class of fellows. From June 2018 until January 2019, the programme will allow fellows to deepen their data literacy skills and work alongside local partner organisations to enhance the data literacy network local to them. We were really pleased to receive a large number of applications and would like to both congratulate and wish all our new fellows the very best for their fellowship!

Pamela Gonzales is passionate about data visualization and bridging the digital divide for women. She is the co-founder of Bolivia Tech Hub, a collaborative space for tech projects that contributes to the prosperity of an innovative ecosystem in Bolivia. Pamela is also the Regional Ambassador for Technovation, a San Francisco-based program that equips girls with the skills needed to solve real-world problems through technology. She holds a Bachelor of Science degree in Computer Science from Universidad Mayor de San Andres.

Odanga Madung is the co-founder and Data Science Lead at Odipo Dev, a data science and analytics firm operating out of Nairobi, Kenya, that delivers services to various blue-chip companies and NGOs across the country. Odanga’s deepest interest is the intersection between data and culture, and it is through this that Odipo Dev has been able to carry out data analysis and visualisation on various activities for a wide range of clients and occurrences in Kenya and the world. Some of his work has been featured in publications such as Adweek, Yahoo, BBC, CNBC, Quartz, and Daily Nation, to mention a few. He will be working on Open Contracting in Kenya during the period of his fellowship. You can follow him on Twitter @Odangaring and Odipo Dev @OdipoDev for more information.

Nzumi Malendeja is a Research Associate at the Independent Evaluation and Research Cell of BRAC International in Tanzania, where he leads larger-scale research projects in education, agriculture, and health. Here, he has developed mobile-based data collection platforms (ODK Collect and SurveyCTO), which replaced the traditional paper-based methods. Before this, Mr. Nzumi worked as a Field Monitor and Research Assistant at SoChaGlobal and Maarifa ni Ufunguo respectively, both in education and construction sector transparency projects. Mr. Nzumi attended a four-week Summer School Training on Research Methods and Teaching Skills, hosted by Hamburg University of Applied Sciences in Germany and funded by the German Academic Exchange Service (DAAD). Presently, Mr. Nzumi is working on his thesis towards the fulfillment of a Master of Research and Public Policy at the University of Dar es Salaam.

Sofia Montenegro is a fan of nature and the teachings it hides, and has dedicated herself to research in the social sciences. She studied Political Science at the Universidad Francisco Marroquin and Public Opinion and Political Behavior through a Masters degree at the University of Essex, where she deepened her interest in data methodologies in social research. Sofia is interested in academia only as long as it drives political action. She looks to help other women to be involved freely in data practice and political spaces. Sofia is also interested in network analysis, studying corruption as a social phenomenon, following electoral processes and learning research methods.
Elias Mwakilama is a lecturer at the University of Malawi-Chancellor College, where he coordinates the Research, Seminar and Consultancies and Diploma in Statistics programmes in the Mathematical Sciences Department. A computational and applied mathematician in the field of operations research, he lectures and supervises undergraduate students in Mathematics and Statistics. His research interests lie in optimisation models that use mathematical statistics techniques, integrated with computing skills, to offer solutions to industry-related problems in both theoretical and practical arenas. Elias holds a first upper class MSc degree in Mathematical Sciences from the University of Malawi. His website is here. During his fellowship, he hopes to support the “public procurement open contract platform” for Civil Society Organisations (CSOs) in Malawi with Hivos.

Ben Hur Pintor is an open-source and open-data advocate from the Philippines who believes in democratising not only data, but also the means of utilising and analysing data. He’s a geospatial generalist and software developer who’s worked on projects related to renewable energy, blue carbon ecosystems, and participatory disaster risk mapping and assessment. Ben is currently pursuing an MS Geomatics Engineering degree at the University of the Philippines. As part of his advocacy for Free and Open Source Software (FOSS), he’s a co-organiser and active participant of FOSS4G Philippines and MaptimeDiliman – avenues for sharing open source mapping technologies with the community.

Hani Rosidaini is passionate about how technology can be adopted and applied for people’s needs. She combines her technical skills, especially in information systems and data science, with social and business knowledge to help companies and organisations in Indonesia, Australia, and Japan, including her own ventures. Highly relevant to this year’s fellowship focus on data procurement, Hani has experience as a data specialist for public policy in the Indonesia Presidential Office, where she analysed the national integrated data platform, data.go.id, contributed to data-driven policy making, advised ministries and agencies, and engaged with civic and local communities.

Announcing our new member: ‘Caribbean School of Data’

- June 21, 2017 in announcement, community

Today we’re delighted to welcome a new organisational member to our network: the Caribbean Open Institute! They will carry forward the Caribbean School of Data initiative.

The new Caribbean initiative is led by Maurice McNaughton, who coordinates the Caribbean Open Institute as the regional node for Open Data for Development network activities in the Caribbean. The COI coalition of partner organisations and individuals conducts regional open data research, advocacy, and capacity-building activities such as the Global Open Data Index and the Open Data Barometer. The new “Caribbean School of Data” will be hosted at the Mona School of Business & Management, UWI, and affiliate institutions are planned for other countries across the Caribbean (including Trinidad & Tobago, Haiti, Cuba and Guyana). Already in the group’s pipeline are a virtual incubation model to encourage and facilitate data-driven entrepreneurial startups, as well as a project to build a Caribbean data competency map, to identify individual and institutional clusters of data skills, knowledge and capabilities in the region and make them searchable and accessible.

School of Data is already working with the Caribbean Open Institute on a data literacy project in Haïti called “Going Global: Digital Jobs and Gender”, for which we have recently recruited two Fellows. Welcome, Caribbean School of Data!

About School of Data members

School of Data’s organisational members are legally independent groups, affiliated formally through a memorandum of understanding. Our members are groups whose mission and activities are aligned with ours and with whom we plan to collaborate in this data literacy work. Caribbean School of Data is our fourteenth member!