LUCA White Paper: Data as a Force for Good

AI of Things    6 April, 2017
LUCA’s second white paper sees Dr Richard Benjamins examining data as a force for good. The piece aims to give readers the tools to assess the potential of Big Data for reaching the UN Sustainable Development Goals.

We believe that Big Data can be extremely useful in the development of society, which is what we call Big Data for Social Good. For this reason we are contributing to the Sustainable Development Goals set by the UN for 2030. This white paper highlights research pilots, including a Colombian case study, while also considering potential barriers such as privacy concerns and data infringements.

You can download this white paper in PDF format here.

Join us for the first Big Data for Social Good in Action event

AI of Things    4 April, 2017
Mark your calendars for Thursday, 18th May to join us for LUCA‘s first-ever Big Data for Social Good in Action meetup.

One of the hallmarks of LUCA is our Big Data for Social Good initiative. This action-focused meetup unpacks the initiative by providing a forum for international changemakers to gather and discuss what it means to have a social impact through data.
Whether you work in the public or private sector, in academia or at a startup, this event is for you if you’re passionate about using data to make a difference. Our intention is to foster a community of like-minded individuals and form partnerships to drive progress toward the U.N.’s Sustainable Development Goals.
The event will take place from 19:00 – 21:00 at Wayra Madrid on Gran Via. We have several exciting talks planned, followed by a time of networking (and food!). 
Figure 2: The event’s agenda is geared towards facilitating conversation around using Big Data for good.
We have a great lineup of speakers, featuring Elena Gil, CEO of LUCA, Pedro de Alarcón, LUCA’s Senior Data Scientist, and David González, Founder and CTO of Vizzuality.

Visit the event page to register your interest in attending. Space is limited, so we will get back to you about whether your request was successful.

New developments in fitness and health data

AI of Things    3 April, 2017
The explosion of Big Data surrounding our daily lives is not a recent trend, yet it continues to grow and develop new functions every day, including the growing use of data for fitness and lifestyle management. We have seen phones evolve towards more concrete and detailed ways of measuring data, which now include our physical activity, hours spent sleeping, nutrition and vital signs. We’ve also seen a proliferation of apps developed in line with this tech trend to help with self-measurement and target setting. Although this market is already saturated, the possibilities for data in this sector are still very much in the development stages. Here we take a closer look at possible trends for a more data-driven future of health and fitness.

The current situation with our physical and health data management is not as glamorous and user-friendly as you might expect: there is a heavy reliance on user proactivity and device syncing. If you have ever tried to use Apple Health without any connected devices, you know it can be a less than intuitive experience.

Enter Mywellness, an app (yes, another one) powered by Technogym that helps you organize and measure your physical activity data, synchronized with your gym performance. How is that done? By gathering data directly from your activity on the gym’s equipment, such as connected bicycles, treadmills and weightlifting machines. First, users take a medical examination with a trainer at the gym and their baseline vital signs are uploaded to the cloud; users can update them with further measurements. Then, users follow a workout plan on gym equipment that tracks their progress towards personal goals. To provide a bigger picture of individual health, Mywellness also includes other health data collected on a user’s phone.

Another development in the Big Data and health boom is companies’ growing interest in that data. A November 2016 study by Rand found that workers who sleep less than six hours each night cost the American economy $411 billion annually. This cost is obviously detrimental to a company’s performance, which is why companies are increasingly interested in gathering health-related data from their employees.

The Outside View, a Shoreditch-based company recently acquired by Rightmove, carried out an experiment in 2014 under the name of the Health, Wealth and Happiness program. With the goal of maximizing workforce performance by looking at employee happiness and health levels, The Outside View was a corporate pioneer of the “quantified self” movement. Employees downloaded a series of apps that tracked their day-to-day physical activity, such as walking, running and hours of sleep, and even their level of happiness. Processing all this data gave employees access to specialized workout sessions perfectly complementing their needs.

Rob Symes, founder of The Outside View, commented on this initiative in an interview with The Guardian, stating that the biggest problem for the Health, Wealth and Happiness program was that the data couldn’t be unified in one place. This is the same problem that Mywellness seeks to address, so there will likely be business-focused development along these lines in the near future.
Another company taking a new approach to health and Big Data is LifeBeam, creator of VI, which launched a Kickstarter campaign last year to fund its futuristic personal trainer idea. What makes VI a very special trainer is that it doesn’t just pull your physical data from Apple Health or Google Fit; it gives you real-time insights about your performance. As LifeBeam founder Zvika Orron says in the campaign video, “there are endless amounts of wearable tech today, most of them provide lots of numbers, but no one is really telling you what is good for you”. Combining intuitive, real-time data processing with natural speech in an ultra-personalized personal trainer, this device may well be the future of fitness data.

Here at LUCA we are eager to see how this trend develops so that the application and analysis of health and fitness data becomes more efficient. As more companies become interested in analysing their employees’ health, devices like VI are likely to grow in importance. This is definitely a trend worth watching, because one day our salary may well be connected to how well we perform and take care of ourselves physically.

Could Big Data solve the music industry’s problems?

AI of Things    30 March, 2017
Musicians’ relations with their publishers and labels have always been tumultuous, and it has been alleged that, with the number of middlemen in the industry, it can take between two and three years to receive royalties. The launch of Kobalt Music, founded by Willard Ahdritz, has brought a far more transparent and data-driven approach to the music industry. We’ve investigated some of the long-standing problems within the music industry and how Kobalt Music has given musicians an effective way to access their profits.
One of the major problems in the world of music Big Data has been that although streaming services could provide accurate information to labels and publishers, it came in formats that were incompatible with their accounting systems. That meant all those reams of data (more than ever, thanks to the services’ ability to collect everything granularly) were delivered in stacks of hard copy, which then had to be manually input into the label’s or publisher’s system. The person doing the inputting was often not equipped to deal with the more complex decisions that came up along the way, which led to inaccurate statements for artists and songwriters. And let’s not forget the inevitable human error that goes along with manual data entry, which didn’t help matters.

Kobalt Music collects royalty money directly from streaming services like Spotify, iTunes and YouTube, and claims to earn clients on average around 30 per cent more than they would normally receive. As a result, it has attracted some of pop music’s reigning chart-toppers: Zayn, Alt-J, Bjork, Calvin Harris and Grimes, to name a few. Kobalt uses an online music-detection technology (ProKlaim) that seeks out unclaimed songs amongst the plethora of user-generated videos on YouTube, currently identifying up to 1.5 billion video plays per month on the platform. If the algorithm detects any commercial matches (e.g. an advertising agency using a song without permission), Kobalt goes after those responsible. The software has been so efficient that it has tagged up to 1.5 million new videos per month.

Spotify can stream songs up to 170 million times to around 3.2 million listeners, and Kobalt collects the micro-payments on each stream directly, posting them on its portal line by line in real time. In the traditional model, collection societies and territorial publishers gather these royalties and then pay them on to the labels or artists’ publishers, a process that can cause confusion and lead to money falling through the gaps. Without the tools to track and break down usage, the artist will never win.

Another level of transparency that Kobalt Music has brought to the music industry comes through its acquisition of Artists Without a Label (AWAL). This service not only provides direct distribution through platforms like Spotify but also gives artists advanced data analytics. Artists can find out where their songs are most popular, and plan marketing and tours around the results. This gives artists the chance to use Big Data to their advantage: it’s not only about having the data, but about its transparency and accessibility. This information can make artists aware of previously disconnected fan bases or of areas that need more attention.

Figure 3: Founder of Kobalt Music, Willard Ahdritz
The hype surrounding Kobalt Music led to a $60 million investment in 2016 from Google Ventures’ London office, a sign that the deployment of Big Data within the music industry will continue to grow. How do you think the creative industries will react and evolve with the use of Big Data?

4 companies using Big Data to address Water Scarcity

AI of Things    29 March, 2017
Water scarcity is an increasingly serious issue. It is estimated that by 2025, 1.8 billion people will face “absolute water scarcity”. The United Nations recognized the seriousness of this issue by including it among the Sustainable Development Goals, aiming for universally accessible clean water and sanitation by 2030. Although there is much to be done to accomplish that goal in the next thirteen years, many innovative efforts are underway to make significant strides to that end. Here are four companies creatively employing Big Data to combat water scarcity:

1. GE

A widely recognized leader in sustainability innovation, GE announced in 2016 that its Water and Process Technologies Unit would be expanding its emphasis on digital water management. GE formed a partnership with several water technology companies that will allow them to enhance their customer experience while focusing on saving water. For example, GE and its partners are working on initiatives that will help customers monitor their water consumption habits with a goal of reducing those levels. Customers will also be able to use the data to identify leaks and quickly address them, thereby saving water. In the company press release announcing the partnership, Ralph Exton, CMO of GE’s Water & Process Technologies Unit stated, “Using data, remote control and analytics, we are working to transform the way water utilities operate, helping them run more efficiently while also improving the service and value they provide to consumers.”
Figure 2: GE is focusing on expanding its digital water management profile (image source: GE)
2. WaterSmart Software

One of GE’s partners in its water reduction initiative, WaterSmart Software, has been gaining international attention for its technology. The San Francisco-based startup was a 2016 recipient of the World Economic Forum’s prestigious Technology Pioneers Award for creatively tackling water problems. WaterSmart focuses on equipping water utilities with the data analytics they need to manage their water supply. The company offers a variety of solutions, ranging from data-based drought response plans and leak monitoring to customer usage tracking that aggregates historical usage data paired with suggestions for reduction.
Figure 3: WaterSmart Software uses Big Data to help utilities better manage their water systems.

3. TaKaDu

TaKaDu is also a recipient of the World Economic Forum’s Technology Pioneers Award. The company, based out of Israel, touts itself as a pioneer in the water network monitoring space with a goal of optimizing water use. Through a combination of Internet of Things (IoT) technology and Big Data analytics, TaKaDu provides whole-system monitoring for utility companies around the world. They combine current network data with real-time monitoring in order to quickly identify any leaks, disruptions or other inefficiencies in water systems. More than just providing great customer service, this technology also allows utilities to reduce water waste connected to these issues.
Figure 4: TaKaDu leaders at a World Economic Forum conference.
4. Imagine H2O

Rather than focusing on a specific product, Imagine H2O focuses on building companies that can address water issues. Their mission is to empower people “to deploy and develop innovation to solve water challenges globally.” To accomplish this, Imagine H2O provides educational resources, partners with industry leaders and runs a start-up accelerator for water-minded entrepreneurs. They recently announced the twelve companies who earned a place in their second annual Water Data Challenge cohort. These companies, selected from over 180 entries, offered innovative business models that leveraged data-driven solutions to address water resource issues. Imagine H2O then assists these companies with resources, scaling and planning to launch businesses with a strong social good element.
Figure 5: The latest cohort from Imagine H2O’s Water Data Challenge.
There are many more companies doing exciting things with Big Data and water. This is an important issue where Big Data can be deployed for Social Good and we at LUCA are excited to watch this space develop. If you would like to discover more companies working in this space, check out the Smart Water Networks Forum (SWAN), a non-profit that serves as a facilitator for companies attempting to make data-driven decisions about water in order to tackle serious issues like water scarcity.

Hackathons are not just for developers

Ana Zamora    28 March, 2017

By Glyn Povah, Head of Global Product Development at LUCA.

Hackathon is derived from the English words “hack” and “marathon”, and if you Google the word “hackathon” you’ll get something like the following: “Noun (informal). An event, typically lasting several days, in which a large number of people meet to engage in collaborative computer programming”.
This all conjures up images of lots of developers packed into a dark room, coding on MacBooks for 24 hours, eating lots of pizza and drinking lots of soda. My experience is somewhat different. In fact, I’m not a developer at all. Well, not any more, anyway. Despite that, I still consider hackathons an indispensable part of the new product development process.

Actually, this post isn’t really about hackathons at all. The central theme is a collaborative approach to product development that puts customers front and centre, not as an afterthought.
The Smart Digits team in one of the product sessions.


Customer-centric product development

“We innovate by starting with the customer and working backwards. That becomes the touchstone for how we invent.” – Jeff Bezos, CEO of Amazon

In the Smart Digits team at LUCA, we always take a customer-led approach to new product development. It’s how we started out with our first product and it’s been part of our DNA ever since. It’s not a very telco-like approach to product development, but it has worked very well for companies like Amazon and it works very well for us.
The hackathons we run are all about collaboration: not just collaboration between engineers, but a cross-functional team of developers, UX and UI design specialists, product developers and commercial managers. Such a cross-functional team can produce a working prototype in just one day if everyone comes along with the right “hacker” mindset.
The goal of any hackathon is to have something to touch and feel, something we can show to customers. This could be a command-line tool, a GUI or a web app. Bringing the concepts and ideas we imagine to life, drawing them on napkins or whiteboards, rapidly getting them into our own hands and then very quickly into customers’ hands, is key to learning quickly and iterating rapidly. Sometimes the “experience” feels different to us or to customers from how we imagined it on paper. Sometimes we discover flaws, challenges or unexpected benefits by playing with our demos and MVPs. Our customers can give rapid feedback too, improving the next iteration.
Back to collaboration and customers. Why is our team so insanely customer-focused when developing new products?
  1. Fostering the right culture. The goal is to foster a spirit of collaboration in the team where everyone is equal regardless of their role and everyone can have their say and input. All team members across all functions are encouraged to challenge and be challenged; robust and challenging conversations deliver great outcomes and decisions. Most of all, this approach fosters trust and buy-in from the whole team, which results in a highly collaborative approach from the start of a new product development project. The same principle applies to involving customers right from the start too; it engenders trust and high levels of engagement from the get-go.
  2. Execution excellence. We believe co-creating with customers gets fast, high-quality results that meet customer needs from day one. We have plenty of examples of going from a working prototype, the result of a one-day hackathon, to a Minimum Viable Product (MVP) we can share with our customer in less than two weeks.
Collaborative teamwork at the offices of Wayra UK.


Think big, start small

As a product team with a global remit to roll out new products in 21 countries, with access to 350 million customers, our addressable market and opportunity is massive. Furthermore, many of our B2B Enterprise customers are themselves massive global businesses. So it’s natural to want to think big from day one.
In Smart Digits we like to think big but start small. Our ambition is always to scale new products globally, but we always insist on proving them first with one customer in one country. As a global product development team, we need to use our knowledge, experience and judgement to determine which ideas have global potential.
Thinking big and starting small doesn’t mean being slow to market; quite the opposite. We’ve learned to co-create with one customer in one country and quickly launch an MVP. Once proven, we can move rapidly to global scale using an Agile-based development process.
So, if you’re a new customer (or an existing one with a new product idea), be careful what you wish for. If you come to us with a new idea or use-case, you might find yourself a few weeks later in a dark room, eating pizza, drawing on whiteboards and napkins, or even coding. We’ll have a prototype ready by the end of the day that you can touch and feel, and an MVP a few weeks later. So you’d better be ready to integrate on those timescales!

4 Data Enthusiasts changing the world as we know it

AI of Things    27 March, 2017

It is becoming increasingly apparent that NGOs and governments are creating more roles in the Data Science discipline. Using Big Data for social good is all about coming up with new data-driven ways to solve the most pressing problems in our society. Today, we decided to take a look at some of the trailblazers in this space, focusing on four Data Scientists and Enthusiasts who really are changing the world with their work.

1. Miguel Luengo-Oroz

Figure 1: Miguel Luengo-Oroz has a very impressive career background.
Miguel is Chief Scientist at UN Global Pulse, an innovation initiative at the Executive Office of the United Nations Secretary-General harnessing Big Data for global development. He leads the data science team across the network of Global Pulse Labs in New York, Jakarta and Kampala, which provide “innovation as a service”, developing Big Data projects together with UN system partners. Miguel is also the founding director of MalariaSpot.org, based at the Universidad Politécnica de Madrid, which uses videogames and crowdsourcing for the diagnosis of malaria and other global health diseases. An antidisciplinary scientist, over the last 10 years he has worked on innovative projects at the crossroads of international development, social innovation, global health, systems biology and data science.

2. Bruno Sánchez-Andrade Nuño

Figure 2: Bruno Sánchez-Andrade Nuño has focused on using Big Data for social good.
Bruno Sánchez-Andrade Nuño, Ph.D. is a strategic scientific advisor and VP for Social Impact at the satellite operator Satellogic. Bruno is committed to bringing the value of science and technology to society at large, and especially to shifting the value of science away from being mostly a body of facts and knowledge towards highly transferable skill-sets and tools for understanding complexity and making better decisions (the Impact Science project). Previously, at the World Bank Innovations Lab (and the President’s Office), his team led the technical work on Big Data across operations globally. Before the World Bank, Bruno was Chief Scientist at the mapping company Mapbox during its 10-fold early growth, where he built and led scientific technical support across the company, in particular on the Satellite team.

3. Jake Porway

Figure 3: Jake presenting at the National Geographic series.
Jake turned his passion for technology and data towards seeing the good in data and harnessing it. He remains an active data scientist even after setting up DataKind, which aims to give every social organization access to data capacity so they can better serve societies across the world. He graduated from Columbia University and shortly afterwards started working as a Data Scientist at The New York Times. He has also been a TV host for National Geographic, promoting a game show that used data to create interactive gameplay. DataKind’s global presence allows it to target a range of global issues and respond more effectively through its various offices. Jake wants to promote the use of data not just to inform trivial decisions, but to actually improve the world we live in.

4. Christopher Fabian

Figure 4: Passionate from the beginning about the potential of data.
Christopher Fabian has been an integral part of the UNICEF Innovation Unit in New York since 2007. In this global role he has used research and development priorities to focus on short-term problems in parts of the world with the most difficult operating environments. Christopher states that “technology is not the end-product of innovation, but a principal driver of new ways of thinking about development problems”. We at LUCA are now actively participating in the Innovation Unit through the Magic Box project, which we have just launched in collaboration with UNICEF. Christopher is clearly aware of the importance of data and its potential when it is optimised and harnessed, and we hope this collaboration, combined with his stellar experience, will lead to more results. He has previously stated that local talent is critical in creating successful local solutions, and it’s hard for us not to agree.
How important do you think it is for NGOs to hire Data Scientists? Let us know in the comments section below.

ElevenPaths creates an addon to make Firefox compatible with Certificate Transparency

ElevenPaths    27 March, 2017
Certificate Transparency will be mandatory in Chrome for new certificates in late 2017. This means that webpages will show an alert if they are protected by certificates not present in the logs that Chrome checks by that time. No other browser supports Certificate Transparency yet. Mozilla is on its way to making it work, but there is no official release date. ElevenPaths has created an addon to cover this feature.

Checking the SCT embedded in our certificates

Certificate Transparency is a new layer of security on top of the TLS ecosystem. Sponsored by Google, it essentially requires all issued certificates to be logged (in special append-only servers), so that an attacker wanting to create a rogue certificate faces a dilemma: if the rogue certificate is not logged, it will raise some eyebrows; if it is logged, it can be detected faster.
A certificate is considered “logged” if it carries an SCT (Signed Certificate Timestamp). The SCT is given to the owner of the certificate when it is logged, and the browser has to verify it is genuine and current. This is exactly what Chrome has been doing for a while now. Thanks to this plugin, Firefox is now also able to check the SCT for certificates. But there is good news and bad news:

This is how Chrome checks the SCT

The good news

Our addon, created in cooperation with our lab in Buenos Aires, works with most known logs. It does not matter which log the SCT comes from; we will be able to check it, because we have included the public key and address of essentially every log known so far:

Google ‘Pilot’, Google ‘Aviator’, DigiCert Log Server, Google ‘Rocketeer’, Certly.IO, Izenpe, Symantec, Venafi, WoSign, WoSign ctlog, Symantec VEGA, CNNIC CT, Wang Shengnan GDCA, Google ‘Submariner’, Izenpe 2nd, StartCom CT, Google ‘Skydiver’, Google ‘Icarus’, GDCA, Google ‘Daedalus’, PuChuangSiDa, Venafi Gen2 CT, Symantec SIRIUS and DigiCert CT2.

This makes our solution quite complete but…

The bad news

An SCT may be delivered in three different ways:

  • Embedded in the certificate.
  • As a TLS extension.
  • In OCSP.

From a plugin’s technical perspective it is not easy to reach the TLS or OCSP layer to check the SCT, so for now our plugin checks only for SCTs embedded in the certificate itself. Although not ideal, this is the most common scenario: most certificates distribute their SCTs embedded.
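To make the “embedded” case more concrete, here is a minimal Python sketch that fetches a server’s certificate and lists any embedded SCTs using the cryptography library. This is only an illustration (the addon itself is Firefox JavaScript, and the hostname below is just an example): it locates and prints the SCTs, but it does not verify their signatures against the log public keys, which is the harder part the addon performs.

    # pip install cryptography
    import ssl
    from cryptography import x509
    from cryptography.x509.oid import ExtensionOID

    # Fetch the leaf certificate of a TLS server (example hostname).
    pem = ssl.get_server_certificate(("www.google.com", 443))
    cert = x509.load_pem_x509_certificate(pem.encode())

    try:
        ext = cert.extensions.get_extension_for_oid(
            ExtensionOID.PRECERT_SIGNED_CERTIFICATE_TIMESTAMPS)
    except x509.ExtensionNotFound:
        # No embedded SCTs: they may still arrive via TLS extension or OCSP.
        print("No embedded SCTs found")
    else:
        for sct in ext.value:
            # Each SCT identifies the log that issued it and when.
            print(sct.log_id.hex(), sct.timestamp)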

The other piece of bad news is that plugins have to be validated by Mozilla to be published in its addons store. Once uploaded, a plugin enters a queue; if it contains “complex code” it may stay there longer, so that Mozilla can do a better job of reviewing and checking its security and quality. After waiting for more than two months, we have decided not to wait any longer. The queue seems to have been stuck for days and days and there is no hope of it moving faster. Mozilla’s reviewers are working as hard as they can, but they cannot deal with so many addons as quickly as they would like; we thank them anyway. That is why we have decided to distribute the addon outside the addons store. Once it gets reviewed and released there, we will let you know.

The addon is available from here.

To install it, just drag and drop the file into a new tab.

Or, from the extensions menu, settings, install from a file.

Innovation and Lab
www.elevenpaths.com

When Big Data meets art

AI of Things    24 March, 2017

In this world of ours, it is normal to feel overwhelmed by the amount of information, facts and data we encounter every day, and the quantity we generate every second continues to grow exponentially. However, perhaps the problem isn’t the amount we have to digest, but rather the way in which we digest it. Maybe we are expecting the world to follow this frenetic rhythm of data consumption without taking into account what the human brain is actually capable of processing.

Nevertheless, we shouldn’t lose hope. There are many people out there thinking about how this problem can be solved, and the answer is not necessarily what you might be thinking. What if we presented data in a way that people really engage with? After all, let’s not forget that in this cold and empirical world, emotions make it easier to establish bonds with people, and art is a key way to forge these emotional connections.

Connecting Big Data with art cannot be done by Data Scientists or artists flying solo; it takes a unique and potentially beautiful partnership between both parties.

This trend began with movements such as net.art, the first group of artists to use the Internet for their creations. But who is leading the new movement of Big Data Art right now? Here are four examples of artists and projects that use Big Data:

1. Nathalie Miebach

Nathalie Miebach is the woman who inspired this blogpost. She has had a unique career path blending art and science: she studied art in great detail, but has also taken courses in physics and astronomy. The result of this combination is quite remarkable. For ten years she has been creating beautiful sculptures that represent weather data. How? She measures the data herself with basic instruments and cross-checks it against weather data from the internet. She then chooses two or three variables and translates them into a sculpture built on a basket base, with each element represented by a bead or coloured thread. An example of the result can be seen below:

Nathalie also converts this “data visualization artwork” into a musical score. She works with musicians to translate the data into musical notes and compose a musical piece.

Her artistic statement is a powerful one: Nathalie challenges how the visualization of science can be approached. At first glance, a sculpture can be read as a data visualization. Placed in a museum, it becomes a piece of art; placed in a concert hall, it is also a musical piece. In short, with her work you can see, feel and hear science, as she explains extremely well in her TED Talk and her portfolio.

2. Maotik

Mathieu Le Sourd (also known as Maotik) uses more technical tools to create innovative art that represents data visualizations. The Montreal-based artist is well known for his impressive art installations.

FLOW is the name of a beautiful installation that represents real-time nature data. Taking into account variables such as the moon, temperature, humidity and the position of spectators, this interactive installation offers a sensorial experience of the weather, shown as waves reflected across the installation. Want to see it in action? Check out the video below:

3. Fernanda B. Viegas and Martin Wattenberg

Big Data and art are also being combined by technical profiles producing beautiful visualizations. This is the case of Fernanda Viegas, a computational designer whose career has focused on data visualization, specifically on its social, collaborative and artistic aspects. Among her many interesting projects is a collaboration with IBM in which she created a public visualization tool named Many Eyes; you can see more of her work on her website. One highlight is this wind map, an amazing visualization of real-time wind data in the US, created in collaboration with Martin Wattenberg:

Figure 2: Wind map of March 24th 2017 by Fernanda Viegas and Martin Wattenberg.

Fernanda and Martin have done many other projects with data together, which you can check out here.

4. Dear Data Project

And last but not least, another example is this unusual visualization in the form of daily mail. The idea was shaped by Giorgia Lupi and Stefanie Posavec, two friends separated by the Atlantic Ocean who sent each other visualizations of their daily data, drawn on postcards and sent through the post. This mail chain has been compiled into a book, Dear Data, which is full of their beautiful handmade visualizations and is available here. To see the whole idea behind the project, watch the video below:

So fear no more, Big Data friends. There are many ways to engage non-specialized audiences and make science not only interesting but also fun to consume. Art is key to processing all the information around us, and an opportunity to creatively represent our world. We hope this discussion of art and Big Data helps you start to rethink the way we view data.

How we’re mapping Climate Change with Big Data

AI of Things    23 March, 2017
At the 2014 U.N. Climate Change Summit in New York City, U.S. President Barack Obama shared his thoughts on the seriousness of climate change. While acknowledging other pressing issues such as terrorism and inequality, he highlighted climate change as the most significant, declaring, “There’s one issue that will define the contours of this century more dramatically than any other, and that is the urgent and growing threat of a changing climate.”

Here at LUCA, we believe that Big Data can be a positive force for Social Good on key problems, including climate change. The urgency of the issue means that there is a need for clear and decisive action. But that action is only as good as the information we have about what exactly is happening, which is where Big Data can help.
Enrique Frías, one of the lead researchers in Telefónica and LUCA‘s R&D department, is currently working with a team to determine how Big Data can specifically address climate change migration. The team is in the midst of a preliminary study in Colombia to determine how call detail records (CDRs) can help map patterns and provide additional insight about people forced to migrate due to climate change-related issues.
Figure 2: Environmental migration occurs worldwide after natural disasters, but the problem is worsening with climate change.

Environmental migration is nothing new, as it occurs almost every time there is a natural disaster, such as a hurricane or a tsunami. However, the volatility and severity of weather-related incidents are increasing due to climate change. This means that people are being compelled to move more frequently, and the duration of their displacement is usually longer, if they are able to return home at all.

Research by the Internal Displacement Monitoring Centre (IDMC), a Geneva-based organization that tracks displacement around the world, shows that the number of climate migrants is rising. Its 2015 figures estimate that 19.3 million people were displaced by weather-related events in that year alone, not counting people still displaced by weather events in previous years. Based on yearly data collected since 2008, the IDMC concludes that this averages out to one person being displaced by a weather-related event every second.
Of course, it is difficult to pin down which extreme weather events are a direct result of climate change. But climate change is linked to the increasing frequency and severity of such events, including droughts, floods, heatwaves and hurricanes. This increase in extreme weather is expected to continue, which means that climate migration will only worsen.
Frías and his team hope that their use of CDRs can help address this issue. They are conducting the preliminary study at a local level, focusing on the department of La Guajira in northeastern Colombia. La Guajira declared a state of public calamity in 2014 due to a severe and prolonged drought, and the lack of water and the subsequent impact on the food supply forced many residents to leave their homes. The researchers tracked this movement using aggregated and anonymized CDRs to determine different migration patterns. According to their preliminary results, 90% of residents stayed within La Guajira but moved to areas where the drought was not as severe; the other 10% scattered across other parts of the country, primarily concentrated in nearby areas.
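As a rough illustration of the kind of analysis involved (the team’s actual methodology is more sophisticated, and the file and column names here are hypothetical), a minimal sketch with pandas might assign each anonymized user a “home” area per month based on where their night-time activity occurs, then compare homes before and after the drought declaration:

    import pandas as pd

    # Hypothetical schema: one anonymized row per call event, with columns
    # user_id (opaque hash), timestamp, region (area served by the antenna).
    cdrs = pd.read_csv("cdrs.csv", parse_dates=["timestamp"])

    def home_region(df):
        """Per user and month, the region where most night-time
        (20:00-06:00) activity occurs, used as a 'home' proxy."""
        night = df[(df.timestamp.dt.hour >= 20) | (df.timestamp.dt.hour < 6)].copy()
        night["month"] = night.timestamp.dt.to_period("M").astype(str)
        return (night.groupby(["user_id", "month"])["region"]
                     .agg(lambda s: s.mode().iat[0]))

    homes = home_region(cdrs).rename("home").reset_index()

    # Compare inferred homes before and after the 2014 drought declaration.
    before = homes[homes.month == "2014-01"].set_index("user_id")["home"]
    after = homes[homes.month == "2014-12"].set_index("user_id")["home"]
    both = pd.concat([before, after], axis=1, keys=["before", "after"]).dropna()
    movers = both[both.before != both.after]
    print(movers.groupby(["before", "after"]).size())  # origin -> destination counts

A real study would of course also have to account for data sparsity, the bias of phone ownership relative to the general population, and privacy constraints, which is why the team works only with aggregated and anonymized records.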

Movement patterns are important to track because these migrants are often in need of significant aid. Being able to predict how and where they will move will allow for aid to be more targeted and effective. While not alleviating the issue of climate migration altogether, better-deployed aid can lessen the extent of the suffering.
This research is still in the early phases, but we are excited at the initial results of the study. If it is successful on a local level in Colombia, it could be expanded and used on a larger geographic area, which would provide even more information about global climate change migration patterns. LUCA is already partnering with UNICEF’s Magic Box initiative to use Big Data to tackle problems like natural disaster response and the spread of viruses. The potential to use Big Data for Social Good in the area of climate change would be a significant opportunity to make a tangible impact with data.

Would you like to know more about this study? Check out the paper here.