Open University, Open Digital Inclusion, Open City, Open Paulista and Open Science!

Tom - August 27, 2015 in Alexandre, Av. Paulista, evaluation, CGM, open science, CMSP, Free Knowledge, Open Data, Featured, EACH, Open Education, urban space, equality, digital inclusion, Jorge Machado, launch, law, book, meritocracy, Open Knowledge Brasil, Partners, Paulista, Police Neto, São Paulo, Sarita Albagli, Municipal Secretariat of Services, Society, sustainability, USP Leste

The past week was an intense one for Open Knowledge Brasil. We took part in five events, each involving some form of openness, in line with what we promote for a fairer and more equal society. Below we describe each of these events. Photos from the events here.

Presentations at USP Leste on open data, meritocracy, public universities and open science

We were invited to take part in the 5th Information Systems Week (Semana de Sistemas de Informação) at USP, held from 18 to 21 August 2015 at the School of Arts, Sciences and Humanities (EACH) of the Universidade de São Paulo.

On the 21st (Friday), I presented on 'Meritocracy, Open Data and Public Universities', and Alexandre Abdo, advisory board member of OKBr, on 'Systems, information and the reliability of scientific-academic knowledge'.

In my talk I began by defining how I see meritocracy, distinguishing the good kind from the bad, drawing heavily on Donald Low's 'Good Meritocracy, Bad Meritocracy', which points out flaws in Singapore's meritocratic system and offers suggestions on how to address them. The question of equality of opportunity, which always comes up when meritocracy is discussed, drew on the 'Equality of Opportunity' entry in the Stanford Encyclopedia of Philosophy.

For these talks Alexandre and I received R$ 100 (R$ 50 each), which will be donated to Open Knowledge Brasil and put toward the costs of maintaining the Open Science Working Group website. I offered half of the money toward the costs of the recently launched book on open science (see below), but it was kindly donated to us by Professor Sarita! (Thank you, Sarita!)

Open Digital Inclusion

We were also invited by the Municipal Secretariat of Services of the city of São Paulo to take part in the discussion of Municipal Law no. 14,668/2008, created by councillor José Police Neto. While president of the São Paulo City Council (CMSP), councillor Police Neto had already proposed using the fund provided for by the law to create an open data portal maintained by civil society, which never happened. At the time we attempted a dialogue between the CMSP, through its presidency, and the Municipal Comptroller General's Office (CGM), but there was no progress.

We welcome this secretariat's initiative in resuming the dialogue with civil society and putting the law under public consultation for improvement, available on this site here. I believe our participation is extremely important so that this fund, drawn essentially from the ISS (tax on services) paid by technology companies, has an open and transparent process for its use, which is what we proposed almost two years ago in the dialogue between the CMSP and the CGM.

During the panel I took part in, I stressed that when we think about digital inclusion we should focus less exclusively on infrastructure and more on content as well, mentioning how much publicly funded knowledge remains locked away as if in fiefdoms, for example in public universities, where most of the knowledge produced is financed, in some cases through indirect taxes, yet only a minority has access to everything produced behind closed doors. I cited examples of our own, such as the School of Data, which offers free courses on topics highly relevant to what was being discussed.

Open and Hackable City: urban spaces


On Saturday, Heloisa Pait, advisory board member of OKBR, and I took part in an interesting debate on 'Urban space: private interest, public power, organic growth and planning', which raised the central question of how digital media can contribute to building new perspectives within this struggle.

I questioned the fact that some spaces lack public incentives for their use, such as certain campuses of the Universidade de São Paulo, which could serve on weekends to bring culture and science to the population through extension programmes; yet, for reasons unknown to us, there are no public policies encouraging this in the city.

Open Paulista: sustainable transport

Since our statute provides for the promotion of sustainable public policies, we also took part in the inauguration of another stretch of the cycle path along Av. Paulista, linking its start at Praça do Ciclista to the Paraíso area. Last Sunday, Av. Paulista was opened to everyone who wanted to stroll with their bicycles, family and friends, in a wonderful atmosphere of celebration in the city of stone.

Launch of the book 'Ciência Aberta, Questões Abertas'

We were also invited by Professor Sarita Albagli to the launch of the book 'Ciência Aberta, Questões Abertas', organised by Sarita herself, Alexandre Abdo and Maria Lucia Maciel. We were delighted that, besides receiving the R$ 50 donation mentioned above, we also received a copy for our library signed by Sarita, Alexandre and professors Jorge Machado, Henrique Parra and Luca. Many thanks to everyone!

Open Science Working Group (GT de Ciência Aberta)


Contribute to the 2015 Global Open Data Index!

Yas García - August 26, 2015 in Open Government, News

After two months, 82 dataset ideas, 386 voters, thirteen civil society organisation consultations and very active debates on the Index forum, we have finally reached a consensus on which datasets will be included in the 2015 Global Open Data Index (GODI).

This year, as part of our goal of ensuring that the Global Open Data Index is more than a simple measurement tool, we started a discussion with the open data community and our civil society partners to help determine which datasets are of high social and democratic value and should be assessed in the 2015 Index. We believe that by making the choice of datasets a collaborative decision, we will be able to raise awareness and start a conversation around the datasets the Index needs in order to truly become a civil society audit of the open data revolution. The process included a global survey, a civil society consultation and a discussion forum (read more in an earlier blog post about the process).

The community had some wonderful suggestions, which made choosing fifteen datasets no easy task. To narrow the selection, we began by eliminating datasets that were not suitable for global analysis. For example, some datasets are collected at the city level and therefore cannot easily be compared nationally. Second, we checked whether a global standard existed that would let us compare easily across countries (such as United Nations requirements for countries, etc.). Finally, we tried to strike a balance between financial datasets, environmental data, geographic datasets and datasets relating to the quality of public services. We consulted experts from different fields and refined our definitions before finally choosing the following datasets:

– See more at: http://blog.okfn.org/2015/08/20/the-2015-global-open-data-index-is-around-the-corner-these-are-the-new-datasets-we-are-adding-to-it/

Knowledge Circulation Before the Meeting

OKFN Taiwan - August 26, 2015 in Commentary

In recent years I have received meeting invitations from various think tanks several times a month; with few exceptions, the commissioning party is usually a government agency. Sometimes I am not in Taiwan and have to decline them in batches. Back and forth, some of what goes on is truly hard to watch. Those with experience may not want to write much about it, nor to offend anyone publicly. But if these think tanks fail to function well, the impact is far from small. Here I will point out just one issue, explain it briefly, and offer some thoughts on possible remedies.

Background information for consultation meetings

Consultation meetings usually do not exist on their own; they serve some larger purpose. A research project may include expert interviews, for example, and a website build-and-operate contract may include requirements interviews. Interviews may be external or internal. For external interviews in traditional fields, the list of candidates, appropriate or not, is usually already settled in the mind of the think tank's case officer. In emerging fields, where projects are less complex and deliverables simpler, the sourcing of interviewees varies. The invitations I often receive that were evidently produced by typing a meeting 'keyword' into a search engine mostly belong to this category.

Because think tank research projects in emerging fields are usually poorly funded, with personnel costs taking the largest share, and the clients are mostly government agencies, expert meetings follow a single formula: one topic, one meeting, held once, as a token gesture. Not only is there nothing new in this, but the habit has persisted for years and severely limits the depth of execution. One aspect of this depth can be singled out: the provision of background information for the meeting.

A meeting, whatever else it is, must have a framework, a timeframe and a purpose; all of these are background information. Some meetings carry grand names and sweeping scopes. I used to assume this was the exception, but after two or three years of such invitations I cannot help feeling unduly flattered each time. Everything is a 'strategy' meeting that is really a 'statement' meeting, or a 'consultation' meeting that nonetheless demands attendance in person, when a written submission of opinions might well be more effective and complete. Across this spectrum, more often than not it is unclear what the meeting's background is, what its purpose is, and at which stage of the project timeline the meeting itself sits.

All three items could be announced on an online platform. If the work involves confidential or sensitive material that cannot be made public, a general-purpose online platform can still be provided at low cost in this day and age, letting those being consulted log in and review the relevant project information themselves. This may still trouble established figures unaccustomed to working online, but basically anyone who can use email and log into a website can manage it. For those who still need to work by telephone, a dedicated channel can be opened: contact them by phone and send the information on paper.

Today, the consultation practices of many international think tanks, their publication of project information and their channels of disclosure are in another league entirely. Even the think tanks of neighbouring countries that local think tanks deal with have mostly optimised their online presence and consultation processes, if not yet to that level. Should our think tanks not realise that they are drifting ever further from the internet and from emerging fields? Should they not ask what, meeting after meeting, has actually improved?

Let us take a single scenario to analyse how the provision of meeting background information urgently needs step-by-step improvement.

Suppose a consultation meeting is to discuss the internationalisation of a science park, so at the strategic level people with varied international business experience are invited. The experienced are not necessarily in Taiwan, so they cannot attend in person: challenge one. Whether they can join by remote audio depends on the venue's network equipment and the moderator's skill: challenge two. If they are in Bahrain or some other place where connectivity is difficult: challenge three. If some invitees are foreign nationals who do not read Chinese, the meeting materials must be produced in two languages: challenge four. The invitees are asked for strategic opinions on industry development, yet the organisers draw a blank when asked about the current state of the park's industries; there is no basic information. The bigwigs each hold forth, and it is the organiser's case officer who suffers: challenge five. The host and contractor have a history of work on the consultation topic, but cross-referencing it defeats not only the interviewees but even the case officer; there is no convenient internal system to query by topic, by person or by progress, to learn lessons, or even to pull background information from other projects for reference in this one: challenge six.

Imagine a single organisation holding three to five hundred consultation meetings a year. Added up, how much time is wasted, and how many opportunities for communication are lost? From the background information of one meeting we can see a lack of careful practice, a lack of flexible structure, and an inefficiency inherited by habit. As for vision and strategy, if the basic work of building the framework is done well, the final output will not drift too far off course. Meeting background information is that basic framework work. Do not underestimate the details: it is precisely these joints that make the thousands of meetings contracted to think tanks look open while in fact being one watertight bulkhead after another. However the outside world's flood of information churns, what seeps into the reports is only a trickle. By the time the periscope breaks the surface, the whole crew can do nothing but groan.

Suggested approach: start with the responsible manager and delegate to a small team as a first step. In spare capacity, look for emerging topics; draw consultees mainly from the young, supplemented by two or three senior experts. The online platform part is slightly more complex, but a general-purpose document hosting platform with simple permission settings will do: for a single meeting, work out a single concrete dimension of consultation, and provide the available background information to participants as easily shared and convertible documents, hyperlink bookmarks and so on. Since those carrying this out are neither government agencies nor today's fashionable 'open communities', there is no need to be publicly accountable for the topics, so the operational risk is low; at most the manager and case officer put in more hours and the working culture needs some adjustment. After two or three rounds, review honestly. If variables are added, such as providing bilingual documents for foreign participants, the difficulty rises, but at least there is some basis for evaluation. After many rounds, unless the matter has to cross managerial authority into other departments, the difficulty is usually lower than imagined, nor is it purely a question of online platforms and technology.

This accumulated experience becomes internalised as the team's working DNA. Facing emerging topics in future, the team will respond faster, and the meeting information it provides will no longer be at the level of an official memo from forty or fifty years ago, the kind that makes one want to throw it straight into the water.

These are my preliminary thoughts, as set out above.

Author: TH

Masthead

Adam Green - August 26, 2015 in Uncategorized

The Public Domain Review was founded in 2011 by Adam Green and Jonathan Gray, and is a project of Open Knowledge. It is based wherever the Editor-in-Chief may lay his hat/MacBook Pro, though this is mainly London, UK. Editorial Team: Although mostly the product of the Editor-in-Chief’s many hours of lonely toil, there is a key group of volunteers who are instrumental to the running of the project: Lauren (who casts…

PASTEUR4OA Data Visualisations

Marieke Guy - August 26, 2015 in DataViz, PASTEUR4OA

As part of our work on the PASTEUR4OA project we have been creating a series of data visualisations using data from ROARMAP. ROARMAP, the Registry of Open Access Repository Mandates and Policies, is a searchable international registry charting the growth of Open Access mandates adopted by universities, research institutions and research funders. Early PASTEUR4OA work involved developing a new classification scheme for the registry, allowing users to record and search the held information in far more detail than before. The project has also added over 250 new policy entries to the ROARMAP database, which currently holds 725 policies (as of 24 August 2015).

After ROARMAP was rebuilt, a policy effectiveness exercise was carried out that examined deposit rates resulting from mandated and non-mandated policies. The exercise highlighted important evidence of three specific elements that support effectiveness: a mandatory deposit, a deposit that cannot be waived, and linking depositing with research evaluation. You can read more about these findings (including policy typology and effectiveness and a list of further policymaker targets) in the Workpackage 3 report on the policy recording exercise.

Tableau visualisation of Open Access policies in Europe


While it was agreed that the effectiveness exercise was useful, it was recognised that long, comprehensive reports often fail to have the required effect on policy makers. One idea was to carry out some data visualisation work on the ROARMAP data and create both an online data visualisation hub and a series of infographics to feature in the advocacy material being developed.

Getting started

I was chosen to lead on the data visualisation work for PASTEUR4OA, but I hadn’t created a series of visualisations like this before. The prospect was a little daunting! However I was lucky enough to have a more experienced colleague whom I could ask for help and bounce ideas around with.

My main brief was to exploit the ROARMAP database and create visuals to be produced for advocates to use in presentations, literature etc. These visuals would show the statistics in attractive and interesting ways, so for example in the form of maps etc. The visualisations would need to be useful for policy makers, institutions, researchers and individuals interested in Open Access. It was also suggested that we use live data when possible.

Some of the questions I asked myself and others prior to starting the work are listed below:

  • What is the budget for the work?
  • What is the resourcing/time available for the work?
  • How will we get the data out of the system it is in? API, URL or other?
  • Where will we store the visualisations?
  • Where will we store the new data created? Will we release it openly?
  • How often will the data be updated?
  • Who can help me with my work?
  • What is genuinely do-able given my skill set?

There are quite a few guides on the process of data visualisation creation but one that I found particularly useful was this overview from Jan Willem Tulp published on the Visual.ly blog. I also appreciated the clarification of roles in the 8 hats of data visualization design by Andy Kirk.

Early on in the process to make sure that I was thinking about the story we wanted to tell I set up a scratch pad in which to record the questions we wanted to ask of the data. So for example: How many Open Access policies are there worldwide? Can I see this on a map? Which countries have the most policies? How many policies are mandatory? How many comply with the Horizon 2020 OA policy? Does mandating deposit result in more items in repositories? How many policies mention Article Processing Charges? Etc.

We also agreed on the data we would be using:

Experimenting with the Many Eyes tool


Most of the data would be coming from ROARMAP; we worked closely with the ROARMAP developers and had had significant input into the data on the site, so we were confident that it was reliable. When selecting sources it is usually worth keeping a few questions in mind: is it a reputable source? Is it openly available? Is it easy to get out and work on? Has it been manipulated? Are there omissions in the data? Will you need to combine data sets? The ROARMAP site doesn’t have an API, but you can get a JSON feed out of the site, or search for data and create Excel dumps.
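A JSON feed like ROARMAP's can be loaded and summarised with a few lines of standard-library Python. The sketch below is illustrative only: the field names and sample records are my own assumptions standing in for the real feed (which you would fetch from the site rather than embed), not ROARMAP's actual schema.

```python
import json
from collections import Counter

# Stand-in for the JSON feed downloaded from the registry.
# Country values here use UN M49 codes, as discussed below.
sample_feed = """[
  {"title": "Policy A", "country": "056", "policymaker_type": "funder"},
  {"title": "Policy B", "country": "826", "policymaker_type": "research_org"},
  {"title": "Policy C", "country": "826", "policymaker_type": "research_org"}
]"""

records = json.loads(sample_feed)

# Count policies per country code -- the kind of aggregation
# later fed into the visualisation tools.
per_country = Counter(r["country"] for r in records)
print(per_country.most_common())  # [('826', 2), ('056', 1)]
```

From a summary like this it is a short step to a bar chart or a map layer, whichever tool you end up using.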

Manipulating data

To begin with, I started working on Excel dumps from the site. One of the first hurdles I had to jump was adding country names to the data: ROARMAP data was categorised using the United Nations geoscheme, so the country names themselves were missing. Most of the manipulation could be done in Excel; it is a pretty powerful tool, but it requires sensible handling! Some of the useful functions I learnt about include:

  • Sum – adding up
  • Count – the number of cells in a range that have numbers in them
  • Vlookup – lets you search for specific information in your spreadsheet
  • Concatenate – lets you combine text from different cells into one cell
  • Trim – removes extra spaces
  • Substitute – like replace but more versatile

Although you don’t need to be an expert in Excel or Google Spreadsheets it does help if you can use the tool fairly confidently. For me much of the confidence came from being able to manipulate how much data was shown on a sheet or page: so being able to hide rows, lock rows, filter data etc. Less is more – and if there is only the data you need on the page then life becomes a lot easier. Another lesson I learnt early on is the need for regular sanity checks to ensure you are being consistent with data and using the right version of the data set. I kept copious amounts of notes on what I’d done to the data – this proved to be very useful if I wanted to go back and repeat a process. Also I’d suggest that you learn early on how to replace a data set within a tool – you don’t want to get pretty far down the line and not be able to update your data set.
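These spreadsheet operations have direct equivalents outside Excel too. As a rough sketch (the code and the tiny lookup table are my own illustration, not project code), here is how the country-name lookup and clean-up might look in plain Python:

```python
# Spreadsheet-style clean-up in plain Python: rough equivalents of
# TRIM, SUBSTITUTE, CONCATENATE and VLOOKUP.

# Tiny stand-in for a UN M49 code -> country name lookup table
# (illustrative values only).
m49_names = {"076": "Brazil", "826": "United Kingdom", "246": "Finland"}

raw_rows = [
    {"code": " 076", "institution": "  Universidade de São Paulo "},
    {"code": "826 ", "institution": "University of Southampton"},
]

cleaned = []
for row in raw_rows:
    code = row["code"].strip()                            # TRIM
    name = row["institution"].strip().replace("  ", " ")  # TRIM + SUBSTITUTE
    country = m49_names.get(code, "Unknown")              # VLOOKUP with a default
    cleaned.append(f"{name} ({country})")                 # CONCATENATE

print(cleaned)
```

The advantage of scripting the clean-up is repeatability: when a fresh data dump arrives, the same steps run again unchanged, which is exactly the "repeat a process" problem the notes were solving.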

Data visualisation tools

Once I had an idea of which questions needed to be answered, I began to experiment with data visualisation tools. There is a great list of tools available on the datavisualisation.ch site. The main ones I tested out were:

I also experimented with the following infographic tools:

Whilst trialing each of these I had a few questions at the back of my mind:

  • How much does it cost to use?
  • What type of licence does the tool offer?
  • Do I have the correct OS?
  • Can we get the visualisation out of the tool?
  • Can it link to live data?
  • Can we embed the visualisation outside of the site?
  • Can we make a graphic of the results?
  • Can users download the visualisation, graphic or data?
  • Does the tool expect users to be able to program?

I looked primarily at free services, which obviously have some limitations. Some tools wouldn’t allow me to take the visualisations and embed them elsewhere while others required that I had significant programming skills (in SQL, PHP, Python, R or Matlab) – something I seriously didn’t have time to learn at that point.

Tableau Public came out on top as an all-round tool, and I made the decision to stick with one tool for the online visualisations (Tableau Public) and one tool for the infographics (here I chose Infogram). Unfortunately, neither tool links to live data; in fact none of the free tools seemed to do this in any user-friendly way.

Linking to live data

Whilst I’ve been working on the data visualisations for PASTEUR4OA, the number of Open Access policies submitted to ROARMAP has kept increasing. While this is great news for the project, it means my data is out of date almost as soon as I download it. However, I’ve discovered that linking to live data isn’t that easy: few of the free tools allow it, and the best way to create visualisations that do seems to require programming skills. A colleague of mine helped me pull the JSON feed into a Google spreadsheet and then build a map on top of it, but the result is slow to load and not particularly attractive. Linking to live data was going to require better skills than those I possessed, so I asked PASTEUR4OA’s project partner POLITO to help us. Their main work so far has been creating Linked Data SPARQL endpoints for some Open Access data sets, but they have also been experimenting with live data visualisations. You can see an example of their efforts so far in this dynamic ball map.

ROARMAP live data in a Google map


Delivering data visualisations

Once I started creating the data visualisations it made sense to have somewhere to store them all. I set up a Github site and worked on a series of pages. Our in-house designer added some PASTEUR4OA styling and the result is available at http://pasteur4oa-dataviz.okfn.org/. The site has information on the questions we have been asking and the data used as well as a FAQ page to explain what the visualisations are for. The visualisations site is linked to from the main menu on the PASTEUR4OA website.

At this point I spent some time thinking about the look and feel of the visualisations. The PASTEUR4OA team suggested we use the ROARMAP colours as a ‘palette’ for the visualisations.

The PASTEUR4OA palette


I also added headings, legends and explanations for the online visualisations to explain what questions they were asking. As part of this work a series of infographics (.png files) have been created from the Infogram visualisations with the intention of using them in blog posts, presentations etc. The images are embedded in the main data visualisation website.

Treemap Infographic of Open Access Policies Worldwide by Continent


Some things I thought about in more detail at this stage:

  • What are the infographics going to be used for?
  • What format should they be in?
  • Is there a colour theme? What colours look good?
  • Can I create a custom palette?
  • Can viewers distinguish between different parts of the chart?
  • Is it clear what question the visualisation is answering?
  • Is there enough information on the data visualisation?
  • Is there a heading, comment box, labels, annotation, legend etc.?
  • Is the result honest?
  • Document where all the visualisations are held

PASTEUR4OA is also keen to make the data we have been using openly available, so we have uploaded versions to Zenodo, a service which allows EU projects to share and showcase multidisciplinary research results. The data set URLs are listed on the data set page of the main visualisation website. Over time we intend to add links from the main data visualisation website to other Open Access open data that we believe could be of interest. As mentioned earlier, POLITO will be making some of this data available as linked data. The idea is that developers can use the work we’ve done as ‘proof of concept’ or inspiration and build more visualisations using the data available.

Conclusion

Through carrying out this piece of work for PASTEUR4OA I have learnt many significant lessons about the data visualisation process. Hopefully this blog post has provided a taster of the challenges and benefits such a process brings. As a newbie it wasn’t always easy, but it was certainly interesting and rewarding. If you are thinking about making your own visualisations, you might find this complementary slide set I have created useful.

I believe that the results collected on the PASTEUR4OA data visualisation website are an example of the kind of things those wishing to advocate for Open Access could do without any programming skills. They are there to inspire people, developers, researchers and those new to visualisation and interested in Open Access. It would be great to see some of the visual aids we’ve created in presentations, posters and articles – maybe they can make the (at times!) dry data we were given interesting and accessible.

References

The PASTEUR4OA Data Visualisations website


Global Open Data Index 2015 is open for submissions

Mor Rubinstein - August 25, 2015 in Featured, Global Open Data Index, Open Knowledge

The Global Open Data Index measures and benchmarks the openness of government data around the world, and then presents this information in a way that is easy to understand and easy to use. Each year the open data community and Open Knowledge produce an annual ranking of countries, peer reviewed by our network of local open data experts. The Index was launched in 2012 as a tool to track the state of open data around the world: more and more governments were beginning to set up open data portals and make commitments to release open government data, and we wanted to know whether those commitments were really translating into the release of actual data.

The Index focuses on 15 key datasets that are essential for transparency and accountability (such as election results and government spending data), and those vital for providing critical services to citizens (such as maps and water quality). Today, we are pleased to announce that we are collecting submissions for the 2015 Index!

The Global Open Data Index tracks whether this data is actually released in a way that is accessible to citizens, media and civil society, and is unique in that it crowdsources its survey results from the global open data community. Crowdsourcing this data provides a tool for communities around the world to learn more about the open data available in their respective countries, and ensures that the results reflect the experience of civil society in finding open information, rather than accepting government claims of openness. Furthermore, the Global Open Data Index is not only a benchmarking tool, it also plays a foundational role in sustaining the open government data community around the world. If, for example, the government of a country does publish a dataset, but this is not clear to the public and it cannot be found through a simple search, then the data can easily be overlooked. Governments and open data practitioners can review the Index results to locate the data, see how accessible the data appears to citizens, and, in the case that improvements are necessary, advocate for making the data truly open.


Methodology and Dataset Updates

After four years of leading this global civil society assessment of the state of open data around the world, we have learned a few things and have updated both the datasets we are evaluating and the methodology of the Index itself to reflect these learnings! One of the major changes has been to run a massive consultation of the open data community to determine the datasets that we should be tracking. As a result of this consultation, we have added five datasets to the 2015 Index. This year, in addition to the ten datasets we evaluated last year, we will also be evaluating the release of water quality data, procurement data, health performance data, weather data and land ownership data. If you are interested in learning more about the consultation and its results, you can read more on our blog!

How can I contribute?

2015 Index contributions open today! We have done our best to make contributing to the Index as easy as possible. Check out the contribution tutorial in English and Spanish, ask questions in the discussion forum, reach out on Twitter (#GODI15) or speak to one of our 10 regional community leads! There are countless ways to get help, so please do not hesitate to ask! We would love for you to be involved. Follow #GODI15 on Twitter for more updates.

Important Dates

The Index team is hitting the road! We will be talking to people about the Index at the African Open Data Conference in Tanzania next week and will also be running Index sessions at both AbreLATAM and ConDatos in two weeks! Mor and Katelyn will be on the ground so please feel free to reach out!

Contributions will be open from August 25th, 2015 through September 20th, 2015. After the 20th of September we will begin the arduous peer review process! If you are interested in getting involved in the review, please do not hesitate to contact us. Finally, we will be launching the final version of the 2015 Global Open Data Index Ranking at the OGP Summit in Mexico in late October! This will be your opportunity to talk to us about the results and what that means in terms of the national action plans and commitments that governments are making! We are looking forward to a lively discussion!

Open Access Week 2015: Open for Collaboration

Alessandro Sarretta - August 25, 2015 in Open Access, openaccess

As every autumn for the past 8 years, this October the “International Open Access Week” will take place around the world. This year the week dedicated to promoting free access to scientific knowledge will run from 19 to 25 October. On 4 March 2015 SPARC (The Scholarly Publishing and Academic Resources Coalition) announced that the theme for […]

Reflections on the Taipei Smart City Vision

OKFN Taiwan - August 25, 2015 in Commentary

Yesterday I read Taipei Deputy Mayor Lin's plans for six smart communities in Taipei. Although they broadly covered timelines, setting up a project office and communicating with vendors, and sprinkled in fashionable keywords such as smart electricity meters, water meters, gas meters, big data analytics and cloud management platforms, the whole thing still felt like a castle in the air, as if Singapore's Smart Nation had been condensed and grafted onto Taipei.

Start with home energy management. In 2014, Singapore first piloted a 'home energy management system' in housing at Yunhua, providing data that let residents understand and monitor their own energy use, saving 20% of electricity. It then encouraged the private sector to use the data to develop all kinds of home-care applications and to integrate data transfer between home appliances.

On the community-building side, environmental sensors monitor all kinds of data, such as sun angle, daylight hours, and pedestrian and vehicle flows; only after the actual conditions are understood does community planning proceed. That planning even analysed where rubbish bins should be placed, their colour and collection frequency; how pipes and cabling in buildings should be arranged; and residents' use of car parks, so that parking can be organised for peak and off-peak hours, and residents' exhaust emissions can be analysed to route pipes and even enable recycling and reuse.

Facing crowded rush hours: how should vehicles be dispatched? How can every passenger board in the shortest possible time? How can taxi drivers find the hailing customer as quickly as possible while avoiding congested roads? How can travellers check road conditions through route planning before leaving home? What is the pulse of people's everyday transport?

Everything above involves gathering data from every angle to understand residents' lives, built on data from all kinds of sensors. The Singapore government has also deployed sensor smart boxes (AG boxes) and built a heterogeneous network (HetNet) to transmit data over a stable infrastructure, letting people enjoy multimedia entertainment online in any environment, without insufficient bandwidth, poor signal or mutual interference degrading the experience in metro stations, lifts or crowded areas.

For residents of Taiwan, even understanding their own energy consumption is still difficult, and further energy-saving schemes have not yet been considered. On the streets of Taipei it is hard to find a rubbish bin 'meant for people to use'; many residents must chase the garbage truck to dispose of their rubbish, sometimes taking detours, having accidents or causing congestion. 4G is supposedly widespread, but often runs at only 3G speeds. Transport is worse still: road space for motorcycles and bicycles is insufficient, so they compete with pedestrians, and five empty buses may arrive in ten minutes while the next one is packed.

Housing prices are a sensitive topic. The phrase 'available for anyone to rent' highlights how hard it is for wage earners to have 'a home of their own'. In a community development of this scale, is there, as in the original social housing plans, provision of different floor areas and numbers of rooms for households of different sizes? How will the eligibility of the first batch of residents be determined?

For rented social housing, how do we ensure that in a digitalised environment previous tenants' usage data cannot easily be obtained by the next tenant, and that data is transmitted stably and in a secure environment? If residents of these communities must all provide data, how is the data providers' security to be protected?

Being smart means using intelligent methods to solve the problems residents actually face: analysing the available data with full public participation, finding the problems, tracing causes and effects, and either solving them or setting clear targets, for example how much electricity and water households in a district should save, how far the traffic accident rate can fall, or what help installing smart meters and similar tools can provide. Clear targets make the effort sustainable; otherwise it is just property speculation dressed up in keywords.

A 'smart city' is not built by constructing a new community, wishfully packing people into it, and providing faster internet, shared bicycles and electric cars. Only when these things provide data, and that data is analysed to solve problems, does it deserve the name 'smart city'.

Author: YZ

MyData working group meets Meeco

Antti "Jogi" Poikola - August 24, 2015 in Uncategorized

What: MyData working group meeting with special guest
When: Tue 25.8. at 16:00
Where: Finnish Broadcasting Company YLE, Radiokatu 5, Pasila
Sign-up: At the Facebook event (join the MyData Facebook group) or by email to jogi [a] okf.fi

Australian-based company Meeco develops a life management platform. Meeco’s product and ideas are well in line with MyData thinking. Meeco’s CEO and founder Katryna Dow is visiting Finland and will join the MyData working group meeting.

 

Draft agenda for the working group meeting:

  • Meeco presentation
  • Other international contacts -> White paper 2.0
  • EU General Data Protection Regulation status
  • Datam.me status check
  • Helsinki innovation fund proposal
  • Followup from meeting with minister Vehviläinen’s assistant
  • VM Digitalisaatiohaaste
  • Autumn events and activities

The post MyData working group meets Meeco appeared first on Open Knowledge Finland.

Is there a link between OER and economic growth?

elenastojanovska - August 24, 2015 in Featured, oer

Should countries invest more in education to promote economic growth? Education is crucial on the path to socio-economic development in any country. An educated population is one of the keys to enhancing economic productivity and creating a knowledge economy, and the future of countries relies on the knowledge and skills of their people.


While there has been progress towards meeting UNESCO's Education For All (EFA) goals, many challenges remain, especially in developing economies, regarding access to education, improving quality and dealing with financial constraints. To meet the demands of the labour market and offer high-quality, relevant education, developing countries need to continuously update their educational systems so they can equip students with the needed skills. This is quite difficult in a context of increasing student enrolments on one side and rising quality standards on the other. With this in mind, the concept of Open Educational Resources (OER) is becoming more and more significant around the world, with the potential to contribute to improved delivery of education and to tackle some of the key problems facing education systems. (Commonwealth of Learning, 2012)

In this regard we come to the question: are OER one of the keys to global economic growth?

This article, published in 2012 in The Guardian (by Ambassador David T. Killion, U.S. Permanent Representative to UNESCO, and Sir John Daniel, President and CEO of the Commonwealth of Learning from 2004 to 2012), discusses one of the main conclusions of the 2012 UNESCO World OER Congress: “the OER are key not only to solving the global education crisis but to unlocking sustainable global growth in the 21st century — that is, if governments are ready to seize on their potential.” The Congress concluded with a declaration urging governments to play a more active role in supporting this movement, widening the circle of those able to contribute to renewed economic growth.

The article states that the economic potential of OER is considerable and that OER can lower education costs substantially. The movement can expand the global knowledge economy by making education more accessible and more adaptable to the changing needs of the global economy. For example, companies that need experts in a particular field may collaborate with educational institutions to make sure that OER-based training and education are aligned with exactly what the business needs.

Furthermore, the authors argue that platforms offering massive open online courses, such as Coursera, edX, FutureLearn and many others providing open courses from leading world universities, are important initiatives, but that governments are by far the biggest suppliers of education worldwide: they have the most to contribute to the OER movement and the most to gain in terms of cost savings and economic growth. (Daniel and Killion, 2012)

What is more, Andreas Meiszner and Lin Squires, in their book Openness and Education, argue that when ICT and OER are appropriately developed they can significantly affect economic growth and help alleviate poverty. They state that many developed, and particularly developing, countries will gain from improved access to education and from the development of open educational services that fit their needs. According to the authors, what is required at this point is a reinforced focus on research and development in open education and open educational services, and putting these issues at the top of political agendas. Such research and development must produce convincing evidence of how open education and open educational services can affect the development of national economies and society as a whole, building policy support for open education and fostering its public adoption.

Moreover, they state that understanding the social and economic impacts of open education requires unpacking many elements, resources and activities. They also discuss the “iron triangle” of education described by Immerwahr, Johnson and Gasbarra and by Daniel and Uvalic-Trumbic: quality, cost and access. These three vectors are interdependent: a change in one leads to changes in the others. For example, if the quality of education increases, costs rise and access is reduced to a smaller proportion of the population. This is where OER come in, contributing to education that is high in quality, accessible and cost-saving. (Meiszner and Squires, 2013, pp. 138-144)

In this regard, an interesting angle is offered by the article The Economics of Open: “Making better use of what we already have generates economic benefit by increasing utilization. Given the worldwide demand for education shouldn’t we be doing a better job of using what we already have? Economic development is driven by skilled labour. The economics of open allows us to increase the skills and knowledge of all. Too many of our educational resources sit on a shelf unused or behind password protected systems.” (Stacey, 2012)

From my point of view, open education can contribute to a skilled workforce in a country if it offers quality and if students really grasp how to make use of OER. Bringing open access to educational resources is not, by itself, a sufficient precondition for skilled labor. Students need quality education and a mindset for putting the required knowledge and skills into practice. That is how the link between OER and economic growth can be made, and in the future we need more research and evidence on it. Open education will continue to expand, and I think more open resources will come from the nonprofit and business sectors, because they want a workforce with the knowledge to serve their purposes. And, as previously stated, awareness should be raised among governments that they need to play a more active role in supporting this movement, widening the circle of those able to contribute to economic growth.

References: