
Join #MappingEcuador

- April 19, 2016 in community, Mapping

On April 16th, a magnitude-7.8 earthquake hit the coast of Ecuador. As the victim count approached 300 people on Sunday, the Open Data community around the globe organized to create a base map for rescue and disaster relief purposes. A lot has been done in a short time, but there is still plenty to do.

Helping remotely

You don't need to live in Ecuador to help the volunteers: mapping can be done from wherever you are. If you don't know how, start with this wiki document. OpenStreetMap (OSM) is just one of the tools you can use without being in Ecuador. It essentially digitizes satellite images and turns them into an open, editable database and map, so rescue squads and people in general know where to allocate resources or avoid risks. The OSM Tasking Manager prioritizes and divides the work among users so volunteer efforts won't overlap. This is where to begin. Mapping efforts are coordinated by Humberto Yances (Humanitarian OpenStreetMap Team) and Daniel Orellana (OpenStreetMap Ecuador), who launched two tutorials in Spanish yesterday, for basic and intermediate mappers. You can find a tutorial in English here.

Helping when in Ecuador

If you are in Ecuador, there are other ways to help besides OpenStreetMap: the Mapa Desastre platform allows you to send and visualize reports on specific issues and their geographic location. You can also set GPS alerts based on their changing location. All of this data is public and available to both the general population and humanitarian squads. Another indispensable tool for those in Ecuador is Google Person Finder, an online repository where you can search for missing people or submit information about lost people you have found yourself. Mapillary and the OpenStreetMap Android tracker allow you to upload pictures of specific locations at the disaster site. To strengthen the ranks of volunteer mappers, Open Data Ecuador organized a workshop yesterday, April 18, to train volunteer mappers at the Ciespal building in Quito. This article is a translation by Gibrán Mena of an original article in Spanish published on the Escuela de Datos website.


The Data Journalism Bootcamp at AUB Lebanon

- January 29, 2015 in #OpenData Party, American University of Beirut, Big Data, bootcamp, Data Journalism, Events, fellowship, gephi, Mapping, School of Data, Workshop

Data love is spreading like never before. Unlike previous workshops we ran in the MENA region, on the 18th of January 2015 we gave an intensive four-day data journalism workshop at the American University of Beirut (AUB), in collaboration with Dr. Jad Melki, Director of the media studies program at AUB. The team at Data Aurora were really happy to share this experience with students from different academic backgrounds, including media studies, engineering and business. The workshop was led by Ali Rebaie, a Senior School of Data Fellow, and Bahia Halawi, a data scientist at Data Aurora, along with the data community team assistants, Zayna Ayyad and Noor Latif. The aim of the workshop was to give the students an introduction to the world of open data and data journalism through tutorials on the open source tools and methods used in this field, and to put students on track regarding the use of data.

On the first day, the students were introduced to data journalism from a theoretical angle, in particular the data pipeline, which outlines the different phases of any data visualization project: find, get, verify, clean, analyze and present. After that, students got hands-on experience scraping and cleaning data with tools such as OpenRefine and Tabula.

Day two was all about mapping, from best practices to map formats and shapes. Students were first exposed to different types of maps and the design styles that serve the purpose of each, and the mapping techniques and visualizations best suited to each purpose were emphasized. By the end, participants could tell dot maps from choropleth maps, among others. They then used geolocated Twitter data to contrast tweeting zones by placing tweets at their origins in CartoDB, and created further maps using QGIS and TileMill. The mapping exercises were great fun, and students were very happy to create their own maps without a single line of code.

On the third day, Bahia gave a lecture on network analysis, the mathematical notions needed for working with graphs, and possible uses and case studies in this field, while Ali walked through different open data portals to provide the students with more resources and datasets. A technical demonstration of Gephi on two topics followed: students analyzed climate change data, and then the AUB media group on Facebook, whose graph they drew. It was very cool to find out that one of the top influencers in that network was among the students taking the training. Students were also taught to run the same analysis on their own friends' lists, collecting the Facebook data through Netvizz and drawing the visualizations in Gephi.

After completing the interactive types of visualizations, the fourth day was about static ones, mainly infographics. Each student had the chance to extract the information needed on an interesting topic and transform it into a visual piece. Bahia worked with the students, teaching them how to refine the data so that it became simple and short, and thus usable for the infographic design.
Later, Yousif, a senior creative designer at Data Aurora, trained the students in Photoshop and Illustrator, two of the tools commonly used by infographic designers. At the end of the session, each student submitted a well-crafted infographic, some of which are posted below (static infographics developed by the students at the workshop).

After the workshop, Zayna had short talks with the students to get their feedback, and quoted some of their opinions: "It should be a full course. The performance and content were good, but at some point some data journalism tools need to be more mature and user-friendly to reduce the time needed to create a story," said Jad Melki, Director of the media studies program at AUB. "It was great overall." "It's really good, but the technical parts need a lot of time. We learned about new apps. Mapping, definitely I will try to learn more about it," said Carla Sertin, a media student. "It was great, we got introduced to new stuff. Mapping, I loved it and found it very useful for me," said Ellen Francis, a civil engineering student. "The workshop was a motivation for me to work more on this," she added. "It would work as a one-semester-long course." Azza El Masri, a media student, is interested in doing an MA in data journalism. "I liked it. I expected it to be a bit harder; I would prefer more advanced stuff in scraping," she added.

Instigating the Rise of Demand for Data: The #OpenData Party in Abuja

- December 8, 2014 in #OpenDataParty, Data, Data for NGos, Events, Follow the Money, Mapping, Open Data

So what happens when you have 102 Nigerians, representing all six regions of the country, in Abuja to teach and learn what they can use data, or open data, for? "It was an action-packed, idea-generating, brainstorming, mind-grooming and yet fun-filled event which will help me in my advocacy as well as in tracking how the budget of my country is being spent," as described by Clinton Ezeigwe of People to People International. "As someone working in a non-government organization, this event has boosted my knowledge of data sourcing, data collection and data analysis, and will help me in mapping my work environment," said Aniekan Archibong of Partners for Peace in Akwa Ibom state.
What participants said about the 2-day event

The 2-day event, on Friday, November 28 and Saturday, November 29, 2014 at the African University of Science and Technology, was meant to raise awareness of how NGOs can use available data to monitor service delivery in the health sector; to empower journalists to use data for creating compelling stories that can cause change; and to create, during the training, a platform that can be used to monitor service delivery in the health sector. "We will be most interested in how citizens turned professionals, like you all here, can take up stories from the data curated during this event, in asking government questions about inputs in the health sector, and other sectors as well," said Christine K, Country Director of Heinrich Boell Stiftung Nigeria, during her keynote at the event. On the minds of many participants was how we fit into this new world of Open Data, with a party at the end. Did you ever wonder why the party? Well, to clear the air: we started the "party" by helping participants see what data will mean to them as participants, and what it can change in the life of that curious woman who walks 30 km from Keta to Goronyo to join an antenatal care program, or what it means for that hardworking man who has to travel from Potiskum to Kaduna before he can get a Hepatitis C viral load test, even though he had to borrow the 23,000 Naira meant for this test. Yes, available and structured data can create a great story out of this recurring event. "If you are still looking for what could then happen from the gathering of these 102 participants, it's all written in gold here; even though these are still stories in the making, we can do much more," exclaimed Anas Sani Anka of the Nigeria Television Authority in Gusau, Zamfara.
Adam Talsma of Reboot sharing skills that can make data matter to people on the ground

Going through the data pipeline (data sourcing, collection, collation, analysis, reporting and use), surprisingly, we got a shock again: only 2% of the participants knew where to quickly find the available federal government budget data for Nigeria. While the data pipeline session was meant to guide participants through the data management process (in a participatory manner), it was another opportunity to share where the available data live online in the country, and how they can be used in advocacy and storytelling to start conversations around transparency and accountability, and to exchange feedback between the people and government. Leading the skill-share session were Adam Talsma of Reboot, taking participants through using formhub and TextIt, and Michael Egbe of eHealth Africa, introducing participants to how they are mapping Nigeria using OpenStreetMap. The storytelling sessions had Tina Armstrong, an award-winning data journalist who tells the stories of vulnerable communities using data; Joshua Olufemi, who shared the skills and tools that have made Premium Times the best online investigative media house in the country; and Ledum of Right to Know, who rounded up the session by showing participants how to use the Freedom of Information Act to get data from the government.
Joshua Olufemi of Premium Times Nigeria sharing skills on telling stories with data

The high point of the first day was the "I want to learn, and I want to teach" session, a remix of the School of Data Summer Camp world café and skill-share sessions. "Learning a particular skill in 10 minutes can be mind-blowing and something I will not want to forget for a long time; I only wish we could have had more time than the 30 minutes for the 10-minutes-per-skill session," narrated Michael Saanu of Africa Hope Foundation. Among the skills taught were using Microsoft Excel for analysis, creating survey forms with Google Forms, collaboration techniques with Google Drive, writing funding proposals, community building, using Twitter and Facebook for advocacy, and data scraping using Tabula, among others. After this session, it was clear that participants wanted to be part of all the sessions, but they were limited to three, as the night crept in faster than we expected. What an energetic way to end the first day!
Participants using sticky notes to choose what to learn and what to teach

Kick-starting day 2, with the sun and expectations high, were lessons from participants and an ice-breaker on leadership. This day was dedicated to an OpenStreetMap Goronyo mapping party and a data sprint on funds meant for inputs in the health sector, moving from scraping the data from the budget office to visualizing it and creating a monitoring instrument among the participants. Working through the available health facility data for Goronyo, we found that much of the data was not reliable. How can we have a latitude of 322 in the latitude column of the just-released NMIS data? So if we can't use that, how do we get the government health facility data? Most participants in this group concluded that the DHIS2 data could be more reliable, but its usage still remains difficult. Anyone want to help in getting geo-referenced data for Goronyo's health facilities? Please comment here. Not giving up, Sidney Bamidele of eHealth Africa trained participants on how to add and edit points on OpenStreetMap and how to create tasks in the HOT Tasking Manager.
Sidney Bamidele of eHealth Africa training participants on using OpenStreetMap

Nevertheless, the data sprint, with music and drinks, took the whole day, and I couldn't stop hearing things like: "OMG! So 20 million was budgeted for the construction of this health facility in my LGA. How come it is still in this state? I think we need to go and ask"; "I have found that many times the descriptions of budget data have been duplicated; how do we stop this?" As has always been the case, only one sprinter had an Apple laptop out of the 50 laptops on the tables. Most of the participants agreed that only 30% of Nigerians own a smartphone, so how many will use it, and how many will use Android, or that new Android app you are about to make? Maybe the future of mobile activism in the country still lies in feature phones. These and many more are the conversations that always ensue during the training and data sprint sessions I have facilitated. At the end, what did we make? An Ushahidi Crowdmap instance of where funds for health inputs will go: a first step in starting a conversation around monitoring service delivery in that sector.
Participants during the Mapping and Data Sprint

What next? In the words of Hamzat Lawal, the Chief Executive of Connected Development [CODE], it is important that we brace up and start using the data on this platform to ask questions, directed not only at the government on whether budget items reached the citizens they were meant for, but also at those citizens, on facility and health input usage and quality. As a School of Data Fellow, I have learnt that citizens need basic tools and skills to hold government accountable. As a monitoring and evaluation expert, I can see that in a few years lots of data will be released (even though much of it may not be reliable), but how citizens will identify and use the reliable parts remains a herculean task. As a human being, I learned how hardworking and brave my colleagues and participants are. At no time did I feel that facilitating data trainings was futile. Ultimately, what I really learned about data, or open data, or available data, is that NGOs, journalists, activists and governments still need more capacity building around this phenomenon. Pictures from this event are on Flickr.

Call for action: Help improve the open knowledge directory

- November 10, 2014 in directory, Featured Project, Mapping, open steps

This is a guest blog post from Open Steps, an independent blog aggregating worldwide information around Open Cultures in the form of articles, videos and other resources. Its aim is to document open knowledge (OK) related projects and keep track of the status of such initiatives worldwide: from organisations using Open Data, promoting Open Source technologies, launching Open Government initiatives or following the principles behind Open Science, to newsrooms practicing Data Journalism. In this way, the site seeks to continue, this time virtually, the globetrotter project realised between July 2013 and July 2014, and to discover further OK projects all around the world. If you followed the journey across Europe, India, Asia and South America that Margo and Alex from Open Steps undertook last year, you probably already know their open knowledge directory. During those 12 months, in each of the 24 countries they visited they had the chance to meet numerous enthusiastic activists sharing the same ideas and approaches. To keep a record of all those amazing projects they created what began as a simple contact list, but it soon evolved into a web application that has been growing ever since. After some iterations, a new version has recently been released which not only features a new user interface with better usability, but also sets the base for continuous development that aims to encourage collaboration among people across borders, while monitoring the status of open knowledge initiatives worldwide and raising awareness about relevant projects worth discovering. If you haven't done so yet, head to http://directory.open-steps.org and join it!

New version implementing PLP Profiles

One of the main features of this new version is the implementation of Portable Linked Profiles, PLP for short. In a nutshell, PLP allows you to create a profile with your basic contact information that you can use, re-use and share. Basic contact information is the kind of information you are used to typing into dozens of online forms when registering on social networks, accessing web services or leaving feedback in forums; it is always the same: name, email, address, website, Facebook, Twitter, etc. PLP addresses this issue, and also, most importantly, allows you to decide where you want your data to be stored. By implementing PLP, the directory no longer relies on the old Google Form, and now allows users to edit their data and keep it up to date easily. For the sake of re-usability and interoperability, it makes listing your profile in another directory as easy as pasting the URI of your profile into it. If you want to know more about PLP, head to the current home page, read the more extensive article about it on Open Steps, or check the GitHub repository with the documentation. PLP is Open Source software based on Open Web Standards and common vocabularies, so collaboration is more than welcome.

Participate in defining the next steps for the open knowledge directory

Speaking of collaboration: on the upcoming Wednesday, 12th of November, a discussion will take place on how the worldwide open knowledge community can benefit from such a directory, how the current Open Steps implementation can be improved, and what the next steps should be. No matter what background you have, if you are a member of the worldwide open knowledge community and want to participate in improving the open knowledge directory, please join us.
When? Wednesday, 12th November 2014. 3pm GMT

Event on Google+: https://plus.google.com/events/c46ni4h7mc9ao6b48d9sflnetvo

References

This blog post is also available on the Open Education Working Group blog.

Mapping Skillshare with Codrina

- October 10, 2014 in community, Events, Fellowships, Geocoding, HowTo, Mapping, School_Of_Data

Why are maps useful visualization tools? What doesn't work with maps? Today we hosted a School of Data skillshare with Codrina Ilie, School of Data Fellow.

Codrina Ilie shares perspectives on building a map project

What makes a good map? How can perspective, assumptions and even colour change the quality of the map? This is a one-hour video skillshare to learn all about map making from our School of Data fellow:

Learn some basic mapping skills with slides

Codrina prepared these slides with extensive notes and resources. We hope they help you on your map journey.
Hand drawn map

Resources:

(Note: the hand-drawn map was created at School of Data Summer Camp. Photo by Heather Leson, CC-BY)

Breaking the Knowledge Barrier: The #OpenData Party in Northern Nigeria

- October 1, 2014 in #OpenData Party, Budget Data, Budget Tracking, community, Data Expeditions, Data for CSOs, Events, Follow the Money, Geocoding, Mapping, Nigeria, spreadsheets, Storytelling, visualisation, Zamfara

If the only news you have been watching or listening to about Northern Nigeria is of the Boko Haram violence in that region, then you should know that other news exists: non-government organizations and media that are interested in using state and federal government budget data to monitor service delivery and to make sure funds promised by government reach the communities they were meant for. This time around, the #OpenData party moved from the Nigerian capital, Abuja, to Gusau, Zamfara, and was held at the Zamfara Zakat and Endowment Board Hall on Thursday, September 25 and Friday, September 26, 2014. The 40 participants, all set for this budget data expedition, included the state Budget Monitoring Group (a coalition of NGOs in Zamfara) coordinated by the DFID (Department for International Development) State Accountability and Voice Initiative (SAVI), and other international NGOs such as Society for Family Health (SFH) and Save the Children, amongst others.
Group picture of participants at the #OpenData Party in Zamfara

But how do you teach data and its use in a less technology-savvy region? We had to de-mystify teaching data to this community by engaging in traditional visualization and scraping, which means using paper artworks to visualize the data we had already made available on the Education Budget Tracker. "I never believed we could visualize the education budget data of the federal government as easily as what was on the wall," exclaimed Ahmed Ibrahim of SAVI.
Visualization of the Education Budget for Federal Schools in Zamfara

As budgets have become a holy grail, especially with state governments in Nigeria, what mattered most to the participants on the first day was how to find budget data and the processes involved in tracking whether services were really delivered as promised in the budget. Finding the budget data of the state has been a little hectic, but with much advocacy the government has been able to release datasets on the education and health sectors. So what have been the challenges for the NGOs in tracking or using this data, given that they have been engaged in budget tracking for a while now?
Challenges of budget tracking highlighted by participants

"Well, it is important to note that getting the government to release the data took us some time and rigorous advocacy, added to the fact that we ourselves needed training on analysis and telling stories out of the budget data," explained Joels Terks Abaver of the Christian Association of Non-Indigenes. During one of the breakout sessions, access to budget information and training on how to use this budget data emerged as prominent challenges in the resolutions of the several groups. The second day took participants through the data pipeline while running an expedition on the available education and health sector budget data that was presented on the first day. Alas! We found a big problem with this budget data: it was not location-specific. How does one track budget data that does not answer the question of where? When involved in budget tracking, it is important to have description data that states exactly where the funds will go. An example is "Construction of borehole water pump in Kaura Namoda LGA Primary School", or including the budget of Kaura Namoda LGA Primary School as a subtitle in the budget document.
Taking participants through the data pipeline and how it relates to the Monitoring and Evaluation System

In communities like this, it is important to note that soft skills need to be taught as well: 80% of the participants did not know why Excel spreadsheets are used for budget data; 70% did not know that Google Sheets works much like Microsoft Excel; and none of the participants knew where to get the Nigerian budget data, or what Open Data means. Moving the School of Data approach through the Open Data Party in this part of the world has changed that notion. "It was an interesting and educative 2-day event, taking us through the budget cycle and how budget data relates to tracking," said Babangida Ummar, the Chairman of the Budget Working Group. Going forward, this group of NGOs and journalists has decided to become trusted sources monitoring service delivery at four education institutions in the state, using the Education Budget Tracker. It was an exciting two days, and we now hope to have a monthly engagement with this working group as part of a renewed effort to ensure service delivery in the education sector. Wondering where the next data party will happen? We are going to the South-South of Nigeria in the month of October, Calabar to be precise, and on the last day of the month we will be rocking Abuja!

A Weekend of Data, Hacks and Maps in Nigeria

- September 16, 2014 in charity data, Data Cleaning, Data Expeditions, event, Mapping, maps, School_Of_Data, spreadsheets, visualisation

It was another weekend of hacking for good all around the world, and Abuja, Nigeria was not left out, as 30 participants gathered at the Indigo Trust funded space of Connected Development [CODE], scraping datasets, brainstorming technology for good, and, not leaving one thing out, talking soccer (because it was a weekend, and Nigerian "techies" love soccer, especially the English premiership).
Participants at the Hack4Good 2014 in Nigeria

Leading the team was Dimgba Kalu (Software Architect with Integrated Business Network and founder of TechNigeria), who kick-started the three-day event, built around 12 coders and 18 other participants who worked on the Climate Change adaptation stream of this year's #Hack4Good. So what data did we explore, and what was hacked over the weekend in Nigeria? Three streams were worked on:
  1. Creating a satellite imagery tagging/tasking system that can help the National Space Research and Development Agency deploy micromappers to tag satellite imagery from NigeriaSat-1 and NigeriaSat-2
  2. Creating an i-reporting system that allows citizen reporting to the Nigeria Emergency Management Agency during disasters
  3. Creating an application that lets citizens find the nearest water point and its quality within their community, using the newly released dataset on water points in the country from the Nigeria Millennium Development Goal Information System
Looking at the three systems that were proposed for development by the 12 coders, one thing stands out: application developers in Nigeria still find it difficult to produce apps that can engage citizens, one reason being that Nigerians communicate most easily through radio, followed by SMS, as was confirmed by a survey I ran during the data exploration session.
Coders Hackspace

Going forward, all participants agreed that incorporating the above media (radio and SMS) and making games out of these applications could arouse the interest of users in Nigeria. "It doesn't mean that Nigerian users are not interested in mobile apps; what we as developers need is to make our apps more interesting," confirmed Jeremiah Ageni, a participant. The three-day event started with the cleaning of the water points data, working through the data pipeline and letting the participants understand how the pipeline relates to mapping and hacking. With the 12 hackers drawn into groups, the second day saw thorough hacking into datasets and maps! Some hours into the second day, it became clear that the first task wouldn't be achievable, so the energy was channelled towards the second and third tasks.
School of Data Fellow Oludotun Babayemi taking on the Data Exploration session

Hacking can be fun at times, when side attractions and talks come up: Manchester United winning big (one coder was checking every minute and announcing scores); old laptops breaking down (it seems coders in Abuja have old ones); coffee and tea running out (we ran out of coffee as if it were a sprint); failing operating systems (interestingly, no coder in the house had a Mac); fear of power outages (all thanks to the power authority, we had 70 hours of uninterrupted power supply); and little encouragement from the opposite sex (only two ladies strolled into the hackspace).
Bring on the energy to the hackspace

As the weekend drew to a close, coders were finalizing and preparing to show their work. Demos and prototypes of streams 2 and 3 were produced. The first team (working on stream 2), which won the hackathon, developed EMERGY, an application that allows citizens to send geo-referenced reports of disasters such as floods, oil spills and deforestation to the National Emergency Management Agency of Nigeria, and also creates situational awareness around disaster-tagged and disaster-prone communities. The second team, working on stream 3, developed KNOW YOUR WATER POINT, an application that gives the geo-referenced positions of water points in the country. It lets communities, emergency managers and international aid organizations find the nearest community with a water source, along with its type and condition.
(The winning team of Hack4Good Nigeria) From left: Ben; Manga; School of Data Fellow Oludotun Babayemi; Habib; Chief Executive of CODE, Hamzat

Living with coders all through the weekend was mind-blowing, and these results and outputs will not scale without their challenges. "Bringing our EMERGY application live as an application that cuts across several platforms, such as Java that allows it to work on feature phones, can be time-consuming and needs financial and ideological support," said Manga, leader of the first team. Perhaps, if you want to code, do endeavour to code for good!

How to: Choropleth Maps with D3

- June 6, 2014 in crisis.net, Data Journalism, Geocoding, HowTo, Mapping, maps, ushahidi, visualizations


[Guest Cross-post from Jonathon Morgan of Crisis.net. CrisisNET finds, formats and exposes crisis data in a simple, intuitive structure that’s accessible anywhere. Now developers, journalists and analysts can skip the days of tedious data processing and get to work in minutes with only a few lines of code. See the Original post]
D3 is quickly becoming the de facto library for browser-based data visualizations. However, while it's widely used for line graphs and bar charts, its mapping features are still fairly underutilized, particularly in relation to more established tools like CartoDB, and of course Google Maps. Those tools have their place, but when you need fine-grained control over the presentation and interactivity of your geospatial data, D3 can be a powerful alternative. Today we'll walk through how to create a popular visualization: the choropleth map. Choropleths show the relative concentration of data points within a given region; for example, the number of people within a particular age range in every county in a state, or the number of reported cases of the flu in each state in a country. The information we'll be mapping is a little more exotic. I recently collaborated with Eliot Higgins, an arms transfer analyst focused on the ongoing conflict in Syria, to retrieve data from 1,700 Facebook pages and YouTube accounts associated with militant groups and humanitarian organizations working in Syria. We ingested that data into CrisisNET, which then made it possible for us to generate a "heat map" showing which parts of Syria are experiencing the most intense fighting. In order to do this we'll need to:
  • Work with projections to transform latitude, longitude pairs to x, y browser coordinates
  • Render city boundaries as SVG paths using D3 drawing tools
  • Shade each city relative to its reported level of violence
Let's get started. Before we can do anything we'll need some data. A geospatial "feature" (like a city, state, etc.) is defined as a polygon, which is represented as a list of latitude/longitude pairs. For example:

[
  [ 36.712428478000049, 35.83274311200006 ],
  [ 36.704171874000053, 35.830347390000043 ],
  ...
]

Each pair is a corner of the polygon, so if you plotted them on a map and connected the dots, you would get the outline of the feature. Awesome! Geospatial data comes in a variety of formats, like shapefiles and KML, but the emerging standard, particularly for use in web applications, is GeoJSON. Not surprisingly, this is the format supported by D3 and the one we'll be using. Depending on the region you're trying to map, GeoJSON polygons defining features in that region may be easy to find, like these GeoJSON files for all counties in the United States. On the other hand, particularly if you're interested in the developing world, you'll probably need to be more creative. To map cities in Syria, I tracked down a shapefile from an NGO called Humanitarian Response, and then converted that shapefile to GeoJSON using a tool called ogr2ogr (a conversion along the lines of ogr2ogr -f GeoJSON cities.json cities.shp). Fortunately for you, I've made the GeoJSON file available, so just download that and you'll be ready to go.

Let’s Talk Projections

With our polygons in hand, we can start mapping. Remember that latitude and longitude coordinates denote positions on the surface of the Earth, which is not flat (it is an ellipsoid). Your computer screen is a plane (which means it's flat), so we need some way to translate the position of a point on a curved surface to its corresponding point on a flat surface. The algorithms for doing this are called "projections." If, like me, you've forgotten most of your high school geometry, you'll be pleased to learn that D3 comes with a number of popular projections built in, so we won't need to write one. Our only job is to choose the correct projection for our visualization. The Albers and Azimuthal Equal Area projections are recommended for choropleth maps, but I found that both rendered my cities in a way that didn't connect all the points in the polygons from our shapefile, so some of the city outlines didn't form an enclosed shape. This made it impossible to shade each city without the color overflowing into other parts of the map. Although this is probably due more to my lack of familiarity with the specifics of the Albers and Azimuthal projections, I found that the Conic Conformal projection worked out of the box, so that's the one I chose.
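For reference, a quick sketch of the candidates (all from D3 v3's d3.geo namespace, the same API used later in this post; the alternatives are listed purely for comparison):

// The projections discussed above; swapping one for another is a one-line change
var albers = d3.geo.albers();                 // commonly recommended for choropleths
var azimuthal = d3.geo.azimuthalEqualArea();  // another equal-area option
var conic = d3.geo.conicConformal();          // the one that worked here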

Drawing the Map

Now that you understand the background, we can start coding. First, attach an element to the DOM that will serve as our canvas.
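For example, a minimal sketch (the original post doesn't show this snippet; a static <div id="map"></div> in your page's HTML works just as well):

// Create the container node that the code below selects as "#map"
d3.select("body").append("div").attr("id", "map");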
Next create an SVG element and append it to the map DOM node we just created. We’ll be drawing on this SVG element in just a second.
// Size of the canvas on which the map will be rendered
var width = 1000,
    height = 1100,
    // SVG element as a JavaScript object that we can manipulate later
    svg = d3.select("#map").append("svg")
        .attr("width", width)
        .attr("height", height);
Despite the rather lengthy explanation, defining the projection in our application is actually fairly straightforward.
// Normally you'd look this up. This point is in the middle of Syria
var center = [38.996815, 34.802075];

// Instantiate the projection object
var projection = d3.geo.conicConformal()
    .center(center)
    .clipAngle(180)
    // Size of the map itself; you may want to play around with this in
    // relation to your canvas size
    .scale(10000)
    // Center the map in the middle of the canvas
    .translate([width / 2, height / 2])
    .precision(.1);
With a projection ready to go, we’re ready to instantiate a path. This is the path across your browser window D3 will take as it draws the edges of all our city polygons.
// Assign the projection to a path
var path = d3.geo.path().projection(projection);
Finally, let's give some geospatial data to our path object. This data will be projected to x, y pairs, representing pixel locations on our SVG element. When D3 connects these dots, we'll see the outlines of all the cities in Syria. Let's use d3's json method to retrieve the GeoJSON file I referenced earlier.
d3.json("cities.json", function(err, data) {
  // Draw one SVG path per GeoJSON feature; the original post used jQuery's
  // $.each here, but a plain forEach avoids the extra dependency
  data.features.forEach(function(feature) {
    svg.append("path")
      .datum(feature.geometry)
      .attr("class", "border");
  });
});
That's it! Most of the heavy lifting is taken care of by D3, but in case you're curious about what's happening, here's a little more detail. Our GeoJSON file contains an array of features, each of which is a polygon (represented as an array of longitude, latitude coordinate pairs). We pass the polygon to our path using the datum method, and the polygon is then converted by our projection to a linestring of pixel positions, which the browser uses to render a path DOM node inside our svg element. Phew. With a working map of the country, we can now change its appearance and add interactivity just like any other DOM node. Next week we'll use the CrisisNET API to count reports of violent incidents for each city in Syria, and shade each city on the map with CSS based on those report counts. In the meantime you can check out the full, working map on our Syria project page.
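To preview where that shading is headed, here is a minimal sketch, not the post's actual implementation: the reportCounts object and the feature.properties.name lookup are hypothetical placeholders for whatever the CrisisNET counts and your shapefile's attribute names turn out to be.

// Hypothetical report counts per city, keyed by city name
var reportCounts = { "Aleppo": 120, "Homs": 85, "Idlib": 40 };

// Map counts onto a light-to-dark colour ramp (D3 v3 API, as above)
var color = d3.scale.linear()
    .domain([0, d3.max(d3.values(reportCounts))])
    .range(["#fee5d9", "#a50f15"]);

d3.json("cities.json", function(err, data) {
  data.features.forEach(function(feature) {
    svg.append("path")
      .datum(feature.geometry)
      .attr("class", "border")
      // Shade each city by its report count; cities without reports stay grey
      .style("fill", function() {
        var count = reportCounts[feature.properties.name];
        return count == null ? "#ccc" : color(count);
      });
  });
});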

Putting Points on Maps Using GeoJSON Created by Open Refine

- May 19, 2014 in Data Cleaning, Data for CSOs, HowTo, Mapping

Having access to geo-data is one thing; quickly sketching it onto a map is another. In this post, we look at how you can use OpenRefine to take some tabular data and export it in a format that can be quickly visualised on an interactive map.

At the School of Data, we try to promote an open-standards-based approach: if you put your data into a standard format, you can plug it directly into an application that someone else has built around that standard, confident in the knowledge that it should "just work". That's not always true, of course, but we live in hope.

In the world of geo-data, the geojson standard defines a format that provides a relatively lightweight way of representing data associated with points (single markers on a map), lines (lines on a map) and polygons (shapes or regions on a map). Many applications can read and write data in this format. In particular, GitHub's gist service allows you to paste a geojson data file into a gist, whereupon it will render it for you (Gist meets GeoJSON).

So how can we get from some tabular data, with columns for a place name and its latitude and longitude, into geojson data that looks something like this?
{"features": [   {"geometry": 
        {   "coordinates": [  0.124862,
                 52.2033051
            ],
            "type": "Point"},
         "id": "Cambridge,UK",
         "properties": {}, "type": "Feature"
    },
   {"geometry": 
        {   "coordinates": [ 151.2164539,
                 -33.8548157
            ],
            "type": "Point"},
         "id": "Sydney, Australia",
         "properties": {}, "type": "Feature"
    }], "type": "FeatureCollection"}
[We're assuming we have already geocoded the location to get latitude and longitude co-ordinates for it. To learn how to geocode your own data, see the School of Data lessons on geocoding or this tutorial on Geocoding Using the Google Maps Geocoder via OpenRefine].

One approach is to use OpenRefine [openrefine.org]. OpenRefine allows you to create your own custom export formats, so if we know what the geojson is supposed to look like (and the standard tells us that) we can create a template to export the data in that format.
Steps to use OpenRefine:

  1. Locate the template export tool in the OpenRefine Export drop-down menu.
  2. Define the template for our custom export format. The way the template is applied is to create a standard header (the prefix), apply the template to each row, separating the templated output for each row by a specified delimiter, and then add a standard footer (the suffix).

Once one person has worked out the template definition and shared it under an open license, the rest of us can copy it, reuse it, build on it, improve it, and if necessary, correct it… :-) The template definitions I've used here are a first attempt and represent a proof-of-concept demonstration: let us know if the approach looks like it could be useful and we can try to work it up some more. It would be useful if OpenRefine supported the ability to save and import different template export configuration files, perhaps even allowing them to be imported from and saved to a gist. Ideally, a menu selector would allow column names to be selected from the current data file and then used in the template.

Here are the template settings for a template that will take a column labelled "Place", a column labelled "Lat" containing a numerical latitude value and a column labelled "Long" containing a numerical longitude, and generate a geojson file that allows the points to be rendered on a map.

Prefix:
{"features": [
Row template:
 {"geometry": 
        {   "coordinates": [ {{cells["Long"].value}},
                {{cells["Lat"].value}}
            ],
            "type": "Point"},
         "id": {{jsonize(cells["Place"].value)}},
         "properties": {}, "type": "Feature"
    }

Row separator:
,

Suffix:
], "type": "FeatureCollection"}

This template information is also available as a gist: OpenRefine – geojson points export format template. Another type of data that we might want to render onto a map is a set of markers connected to each other by lines. For example, here is some data that could be seen as describing connections between two places mentioned on the same data row, with from/to place-name, latitude and longitude columns. The following template generates a place marker for each place name, and also a line feature that connects the two places.

Prefix:
{"features": [

Row template:
 {"geometry": 
        {   "coordinates": [ {{cells["from_lon"].value}},
                {{cells["from_lat"].value}}
            ],
            "type": "Point"},
         "id": {{jsonize(cells["from"].value)}},
         "properties": {}, "type": "Feature"
    },
{"geometry": 
        {   "coordinates": [ {{cells["to_lon"].value}},
                {{cells["to_lat"].value}}
            ],
            "type": "Point"},
         "id": {{jsonize(cells["to"].value)}},
         "properties": {}, "type": "Feature"
    },
{"geometry": {"coordinates": 
[[{{cells["from_lon"].value}}, {{cells["from_lat"].value}}], 
[{{cells["to_lon"].value}}, {{cells["to_lat"].value}}]], 
"type": "LineString"}, 
"id": null, "properties": {}, "type": "Feature"}

Row separator:
,

Suffix:
], "type": "FeatureCollection"}

If we copy the geojson output from the preview window, we can paste it into a gist to generate a map preview that way, or test it out in a geojson format checker such as GeoJSONLint. I have pasted a copy of the OpenRefine template I used to generate the "lines connecting points" geojson here: OpenRefine export template: connected places geojson.

Finally, it's worth noting that if we can define a standardised way of describing template-generated outputs from tabular datasets, libraries can be written for other programming tools or languages, such as R or Python. These libraries could read in a template definition file (such as the gists based on the OpenRefine export template definitions linked to above) and then, as a direct consequence, support "table2format" export data format conversions. Which makes me wonder: is there perhaps already a standard for defining custom templated export formats from a tabular data set?
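As a rough illustration of that "table2format" idea, here is a minimal JavaScript sketch, with the template hard-coded as plain functions rather than read from a shared definition file; the prefix / row template / separator / suffix model from the OpenRefine export above maps directly onto it.

// Minimal "table2format" export: prefix + templated rows joined by a separator + suffix
function templateExport(rows, tpl) {
  return tpl.prefix + rows.map(tpl.row).join(tpl.separator) + tpl.suffix;
}

// The geojson points template from above, expressed as a row function;
// JSON.stringify plays the role of OpenRefine's jsonize()
var geojsonPoints = {
  prefix: '{"features": [',
  separator: ',',
  suffix: '], "type": "FeatureCollection"}',
  row: function(cells) {
    return '{"geometry": {"coordinates": [' + cells.Long + ', ' + cells.Lat +
        '], "type": "Point"}, "id": ' + JSON.stringify(cells.Place) +
        ', "properties": {}, "type": "Feature"}';
  }
};

// Example: one row of the tabular data from the start of this post
console.log(templateExport(
  [{ Place: "Cambridge,UK", Lat: 52.2033051, Long: 0.124862 }],
  geojsonPoints
));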