What is Smart Retail?

Beatriz Sanz Baños    4 December, 2018

The implementation of IoT technology in the retail sector makes it possible to offer personalized shopping experiences through smart stores. Thanks to the information collected, processes can be optimized and more effective strategies can be applied to attract and retain customers.

5 in 5: Top 5 companies using AI (EP 3)

AI of Things    30 November, 2018

In the 3rd edition of 5 in 5, we take a look at the top 5 companies that have already put functioning Artificial Intelligence (AI) to work in their practices. As we have seen in the previous episodes, How to successfully implement AI into your business and How AI can benefit your business, AI needs to be applied with extreme care to ensure its successful use.

Briefly, AI can be defined as the endeavour to build computer systems capable of solving complex problems in the same way a human would. It draws on a multitude of techniques, including machine learning, deep learning, neural networks and Big Data.

We will now explore the top 5 companies that have already introduced AI into their systems.

1. Apple: Siri, Image Recognition

Over the course of two years, Apple has acquired 4 AI start-ups, clearly showing its dedication to entering the AI sphere. One of these acquisitions, Emotient, became the base for Face ID, the facial recognition security system that now features in many of its popular smartphones.

Rumours have been circulating about Siri and its AI capabilities, but Apple has remained quiet on the subject since acquiring the AI company VocalIQ. That business had already produced voice-control technology for General Motors, allowing drivers to command certain functions of the car with vocal cues, a clear demonstration of its potential for AI applications.

2. Amazon: Alexa, Checkout-less Stores

Amazon has been at the forefront of AI innovation and has even introduced an intelligent voice assistant (Alexa) for the home via the Amazon Echo. Alexa uses neural networks to process language: it analyses the human voice and responds appropriately. The company also offers Polly, a service that turns text into speech, and Rekognition, an image recognition service.
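
Both Polly and Rekognition are exposed through AWS's public SDK. As a rough, hedged sketch (not Amazon's internal code), assuming boto3 is installed and AWS credentials are configured, calling them looks like this:

```python
# Minimal sketch of calling Polly (text-to-speech) and Rekognition
# (image labels) via the boto3 SDK. Assumes AWS credentials and a
# default region are already configured locally.
import boto3

polly = boto3.client("polly")
speech = polly.synthesize_speech(
    Text="Hello from the smart home!", OutputFormat="mp3", VoiceId="Joanna"
)
with open("hello.mp3", "wb") as f:
    f.write(speech["AudioStream"].read())  # AudioStream is a streaming body

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # photo.jpg is a placeholder file
    result = rekognition.detect_labels(Image={"Bytes": f.read()}, MaxLabels=5)
for label in result["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```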

One of its most ambitious applications was the opening of a checkout-less store in Seattle. The technology relies on cameras and sensors that track what shoppers take off the shelves (and put back). Through the use of machine learning and intelligent algorithms, the items are automatically placed into the shoppers' virtual basket, all of which is connected to the Amazon Go app. There are still many details that need fine-tuning, but its potential to become the 'new norm' is promising.
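
The exact pipeline is proprietary, but conceptually the store emits "take" and "put back" events per shopper, and the basket is reconciled from them. A toy sketch of that bookkeeping (all names hypothetical, nothing here is Amazon's code):

```python
# Toy model of a checkout-less store's virtual basket: shelf sensors
# emit take/return events and the basket is reconciled per shopper.
from collections import Counter

class VirtualBasket:
    def __init__(self):
        self.items = Counter()

    def on_event(self, event_type: str, sku: str):
        if event_type == "take":
            self.items[sku] += 1
        elif event_type == "return" and self.items[sku] > 0:
            self.items[sku] -= 1

basket = VirtualBasket()
for event in [("take", "cola-330ml"), ("take", "chips"), ("return", "chips")]:
    basket.on_event(*event)
print(+basket.items)  # Counter({'cola-330ml': 1}); unary + drops zero counts
```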

3. Facebook: Facial Recognition, Descriptive Technology

Facebook has 4 labs around the world dedicated to AI research and recently acquired an AI company, Ozlo, to help make Messenger a more complete virtual assistant. It is also among the first companies to use AI to help blind people 'see' photos: in 2016, the firm used neural networks to generate descriptions of images that appeared in the app, for example 'three men laughing on a beach'.
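
Facebook's captioning models are proprietary, but the flavour of the idea can be approximated with off-the-shelf tools. A hedged toy sketch, assuming torchvision is installed, that names the top objects a pretrained classifier sees and strings them into a crude alt-text:

```python
# Toy illustration of automatic alt-text: approximate a description by
# naming the top objects a pretrained classifier detects. Facebook's
# production system is far more sophisticated than this.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()

def describe(path: str, top_k: int = 3) -> str:
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = model(img).softmax(dim=1)[0]
    labels = [weights.meta["categories"][i] for i in probs.topk(top_k).indices]
    return "Image may contain: " + ", ".join(labels)

print(describe("beach.jpg"))  # e.g. "Image may contain: seashore, sandbar..."
```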

Like many other big companies, it is using deep learning AI to find out what customers interact with the most. In this way they are able to customize content to each user, responding to individual preferences. However, unlike other big companies, Facebook has been in the AI game since 2010 when it introduced facial recognition technology that was able to identify people in photos.

4. Google: DeepMind, Machine Learning Algorithm 

It's clear that Google is leading the AI race after embarking on a huge acquisition spree, buying 12 AI start-ups in just 4 years. Its ultimate mission is to provide the best results for every user, using machine learning algorithms to help its AI understand and learn, thereby streamlining results.

Perhaps the most exciting (and controversial) of their endeavours is the introduction of self-driving cars. They are designed to analyse the road ahead and make advanced decisions by learning from past experience. This means that in order to become an effective and reliable technology, the AI needs to practice and learn its reactions over a period of time.

In 2014, Google bought the AI start-up DeepMind for $400 million, one of the largest AI acquisitions in history. Since then, its technology has been used to analyse the quickest routes between London Underground stations and to improve healthcare through a string of controversial data exchanges with the NHS.

5. IBM: Watson, Teaching Assistants

Since the 1950s, IBM has been a pioneer in the world of AI. We have previously discussed its application in the world of music, including the collaboration with superstar producer Alex da Kid. In recent years they too have been proactive in the race, acquiring 3 start-ups.

Well known for its Watson system, IBM uses computers to extract meaning from photos, videos, text and speech. Watson can answer questions posed in natural language, and in 2011 it famously outperformed its human counterparts on a US quiz show. IBM is also developing a teaching-assistant app that will be able to plan lessons based on previously approved material.

Another area in which IBM has used AI is the recent Lexus advert for the Lexus ES. Directed by Oscar-winning director Kevin Macdonald, the advert worked from a script developed by IBM's Watson AI system. IBM collaborated with its creative agency The&Partnership London to give the AI 15 years' worth of footage, text and audio to analyse. The ad resonates on a very emotional, human level, raising the question of whether a computer that cannot understand or 'feel' human emotions may nevertheless have the capacity to replicate them to provoke a reaction.

https://www.youtube.com/watch?v=BhKw71AeOg4

Overall, it is clear that the biggest companies in the world are leading the AI race, with each innovation introduced to make our lives more efficient. Many of these applications still need continual refinement to reduce errors, but the rate of growth and development is certainly promising.


How to measure your data maturity?

Richard Benjamins    27 November, 2018
Big Data and Artificial Intelligence (AI) have become very popular these days, and many organizations have started their data journey to become more data-driven and take automated, intelligent decisions. But a data journey is a complex journey with several intermediate stages. While it is relatively clear what the stages are and what kind of activities they comprise (illustrated in Figure 1), it is less clear how to assess the overall data maturity of an organization with respect to its goal to fuel Analytics and AI.

Figure 1 The phases of a typical data journey towards becoming a data-driven organization

Indeed, measuring the data maturity of organizations is a multi-dimensional activity, covering a wide range of areas. In this article, we will provide an overview of those dimensions and how to measure progress on each of them. Figure 2 shows the dimensions, which we explain below using examples of what it means to be less or more mature.

Figure 2 The dimensions for measuring organizational data maturity

IT, platform & tools

Anyone who wants to do something with data and AI needs a platform where data is stored and accessed. Early-stage, immature organizations will likely start with whatever platform is at hand, either in the Cloud or on-premise, with no particular strategy. Mature organizations will have a clear strategy for how to support all facets needed for Analytics and AI. The strategy will encompass whether systems will run on-premise, in the Cloud or in a hybrid approach. It will describe the reference architecture for the big data software stack, APIs for accessing data in secure ways, etc. It will also cover the analytics, data visualization and data quality tools available to users across the organization. Mature organizations will have automated most of the processes needed to run the platforms and tools on a daily basis, with minimal manual intervention. Finally, mature companies have a clear budget assigned to this, along with a data roadmap of new functionalities and new data sources to include.

Data protection

Data protection refers to the privacy and security of the organization's data. It can also be viewed as part of Data Governance but, due to its importance, it is often considered separately. With the European GDPR, it is clear to many organizations what it means to protect the privacy of customer data; for most, however, complying with all aspects of the GDPR is still a major challenge. Because GDPR has set the bar high, we can say that organizations that are fully GDPR compliant are mature on the data protection dimension. Data-mature organizations, in addition, use all kinds of privacy-enhancing technologies such as encryption, anonymization & pseudonymization, and differential privacy to reduce the risk of revealing personal information. With respect to security, apart from the technological solutions for secure data storage, transfer, access and publishing, mature organizations also have a clear policy on who has access to what types of data, with special attention given to people with administrator rights who might be able to access all data and (encryption, hashing) keys.
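
As one hedged example of such a technique, pseudonymization can be as simple as replacing a direct identifier with a keyed hash, so analysts can join records without seeing the raw identity (the key itself must then be protected):

```python
# Minimal pseudonymization sketch: replace a direct identifier with an
# HMAC keyed hash. Records remain joinable, but the raw identity is not
# exposed; the secret key must be stored and access-controlled separately.
import hmac
import hashlib

SECRET_KEY = b"rotate-and-protect-this-key"  # illustrative only

def pseudonymize(identifier: str) -> str:
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

print(pseudonymize("customer-12345"))  # stable token: same input, same token
```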

Data governance & management

This dimension measures how well data is managed as an asset. Almost all organizations that started their data journey some time ago will recognize that one of the biggest problems is having access to quality data and understanding what all the data fields mean. Managing data as an asset includes aspects such as having an up-to-date inventory of all data sources, a data dictionary, and a master-data-management solution with data quality and lineage. But it is also about processes, ownership and stewardship. Data sources typically have an owner who is responsible for the data generation, either as a consequence of an operation (e.g. payment data generated by POS devices) or through explicit data collection. A data steward takes care of the data on a daily basis in terms of availability, quality, updates, etc. Organizations that take data seriously tend to set up a "data management office" that functions as a centre of excellence to advise the different stakeholders in the organization. More advanced organizations manage not only their data but also their analytical models throughout their lifecycle. They will also consider external data, either procured or as Open Data, to increase the value potential. And the most mature organizations have a clear policy on Open Data, stating how Open Data should be managed when used (license, liability, updates, etc.), and when and under what circumstances private data can be published as Open Data, and under what licence.
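
As a hedged, minimal illustration of what "managing data as an asset" can look like in code, an inventory record might name the owner and steward of each source (the field names below are our own invention, not a formal metadata standard):

```python
# Sketch of a data-inventory entry capturing ownership and stewardship.
# Field names are illustrative, not a formal metadata standard.
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    description: str
    owner: str                   # accountable for generation of the data
    steward: str                 # cares for availability, quality, updates
    update_frequency: str
    contains_personal_data: bool

pos_payments = DataSource(
    name="pos_payments",
    description="Payment transactions generated by POS devices",
    owner="Head of Retail Operations",
    steward="data.steward@example.com",
    update_frequency="daily",
    contains_personal_data=True,
)
print(pos_payments)
```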

Organization

The organization dimension refers to how the data professionals are organized in the company. Is there a separate organization, such as one under a Chief Data Officer? How powerful is this position in terms of distance from the CEO (-1, -2, -3)? Or are the data professionals split between several organizations such as IT, Marketing and Finance? What is the function of the data team? Is it a centre of excellence, or is it operational, running all data operations of the company on a daily basis? And how well are the data professionals connected to the different businesses? Is there a company-wide "data board" where data leaders and business leaders share, discuss and take decisions to align business and data priorities? Is there an initiative to "democratize" the data beyond the data professionals to the business people? How is the next layer of people involved in creating value from data?

People

The people dimension is all about how organizations go about acquiring and retaining the skills and profiles required for the data journey towards AI and Analytics. Are data professionals treated as just one of many profiles, or is there a special focus reflecting their scarcity in the market? If hiring is hard, are there programs for training and upskilling the workforce? How refined are the profile definitions? They should recognize the different essential profiles, including data scientists (analytics and Machine Learning), data engineers (data pre-processing and cleansing), data architects (architectural design of platforms), data "translators" (who translate insights into business relevance), and AI engineers.

Business

The final dimension, which is enabled by all the other dimensions, is the business dimension, where the real value creation takes place. Mature organizations have a comprehensive data strategy in which they lay out their plans and objectives for the six dimensions discussed in this article. There is also a clear vision of how much needs to be invested in each of the dimensions to achieve the goals. A data-mature organization also has a clear view of what use cases are possible and what the expected benefits are. Moreover, such organizations measure the economic impact of use cases and report it in a consistent manner at the company level, so that there is a clear understanding of the value generated by the data investments. This is essential for continuing to invest in data. Finally, the most data-mature organizations are, apart from applying data and AI internally to optimize their business, looking at new data-driven business opportunities. These could be based on insights generated from company data that are of value for other sectors and industries. For example, mobility data generated from mobile antennas, always in an anonymous and aggregated way, and combined with external data, has value for the traffic management, retail and tourism sectors. But the new business opportunity could also be based on partnerships with companies from other sectors to combine data and generate differential insights. Data and AI can also be used for Social Good, that is, to pursue social objectives such as the Sustainable Development Goals of the UN.

How to execute a data maturity assessment?

A common way to perform a data maturity assessment is to translate each dimension into a set of questions with predefined answers ranging from 1 to 5, where 1 represents little maturity and 5 maximal maturity. This gives a questionnaire of fewer than 100 questions, which is still manageable. The questionnaire can be completed through interviews or as a self-assessment, possibly with a session afterwards where the self-assessed answers are challenged and the scores adapted. The resulting scores on each question are then aggregated per dimension, and finally into an overall data-maturity score. If done properly, avoiding tendencies to "look good", this is a powerful tool to manage the data maturity of organizations: it embodies a data-driven way to manage the data journey. It allows you to set objectives, track progress over time, prioritize data investments, and compare or benchmark different units, especially in multi-national corporations.
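
A hedged sketch of the aggregation just described: answers from 1 to 5 are averaged per dimension and then into one overall score (equal weights are our assumption; a real assessment may weight questions and dimensions differently):

```python
# Sketch of aggregating questionnaire answers (1 = little maturity,
# 5 = maximal) into per-dimension and overall scores. Equal weighting
# is an assumption for illustration; the answers below are made up.
from statistics import mean

answers = {
    "IT, platform & tools":         [3, 4, 2, 3],
    "Data protection":              [5, 4, 4],
    "Data governance & management": [2, 2, 3, 1],
    "Organization":                 [3, 3],
    "People":                       [4, 3, 3],
    "Business":                     [2, 3, 2],
}

dimension_scores = {dim: mean(vals) for dim, vals in answers.items()}
overall = mean(dimension_scores.values())

for dim, score in dimension_scores.items():
    print(f"{dim}: {score:.1f}")
print(f"Overall data maturity: {overall:.1f} / 5")
```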


Consumer Insights: Telling the stories behind the numbers

Almudena Bonet Medina    26 November, 2018

Written by Almudena Bonet, Big Data, Advanced Analytics & Consumer Insights at LUCA Consulting Analytics

“A lot of times, people don’t know what they want until you show it to them.”

Steve Jobs was clear from the dawn of his career about what it meant to find a good insight. The rest of us mortals, ordinary women and men, have had to settle for finding clues that allow us to understand the perceptions and motivations that condition the behaviour of our fellow human beings, trying to extract the maximum performance from our economic activities.

This applies specifically to the whole array of clients, consumers, visitors, patients, users, students... There are many nouns for the relationships we build with people in different areas. What they all have in common is that they are based on relationships that aim to solve a series of needs, more or less visible, constant or changing over time.

From a company's point of view, it is not just about selling a product, a service or an experience and then forgetting the user, never to see them again. The intention is to generate repetition, to retain a user and turn them into a subscriber. For this, however, it is necessary to unveil the hidden truths that impel them to act, and even to anticipate their needs.

Achieving that goal is the key to continuously being in a position to know the client better, while offering creative and satisfactory answers to their needs. Analysing, for example, the wow and pain points experienced by the user in daily life and in relation to the brand is a relevant source of information that helps surface those insights.

How do we structure this process in a systematic way? Big Data technologies allow us to work precisely with these new sources of information, both internal and external. They let us go beyond a simple questionnaire (as Jobs already knew) and process data that until now could not even be captured: data that will be key to entering the world of what is hidden below the surface. With Big Data and advanced analytics we can go beyond the mere execution of strategies and focus on building more lasting relationships.

However, all this will be of no use if the people within the organization do not adopt an experimental attitude, fleeing from the conventional and actively embracing an integral vision of the data. The mentality must also change: stop showing raw data and start telling stories with it, to facilitate decision-making and compete in optimal conditions, offering the best user experience at the right time.

A good narrative, supported by data through visualization and adapted to the audience, transmits information more effectively. At Amazon, for example, PowerPoint presentations are not allowed and bullet points, it seems, are the devil's work. The alternative is to find a narrative structure that facilitates recall and the extraction of conclusions.

As experts in advanced analytics, we are required to know the story behind the numbers in order to build an adequate script, grounded in the numbers and far from "science fiction", to fully enter the "hyperrealism" of putting ourselves in the consumer's skin. As when facing a good story, the most traditional perception of reality must be momentarily suspended to try to create new connections, which, with the partial view of the data we had so far, without the full frame, had not been possible to glimpse.

It is very easy to get lost in a sea of data without reaching relevant and actionable conclusions that yield quality insights like those of the father of information technology as we know it. It is not always necessary to be so ambitious and revolutionize the consumer industry, but let's not lose the opportunity to maximize the value that can be extracted from a project, going beyond a very specific analytical output that may not be suitable for all audiences.

Know your audience, find the right context, choose the right data and follow the advice of one of the best storytellers, Jean-Luc Godard: sometimes reality is too complex. Stories give it form.


Intelligent construction and IoT, fated to work together

Cascajo Sastre María    23 November, 2018

The Internet of Things (IoT) has come to revolutionize society. The disruptive nature of this technology optimizes all kinds of processes, and within the framework of digital innovation, construction is one of the sectors with the most potential to implement the solutions it offers.

IoT can be applied at the different stages of the construction process, be it a building or an engineering project. Already in the first phase of calculation, planning and design of structures it offers great advantages, and its possibilities are enormous for the subsequent work on site. It is also very useful during the phase of materials transportation and foundation work, where it improves the operation of the machinery used, the transfer of raw materials and the management of the available stock. The IoT connectivity does not end here: once the works are finished, it can be used for the maintenance and energy management of the buildings.

1. CALCULATION, PLANNING AND DESIGN

From the first phase of planning, calculation and design, IoT technology is a great help in carrying out design tasks in an integrated manner. Thanks to connected devices that can be carried to the construction site, modifications to the architectural model, previously simulated in 3D, become more flexible.

2. ON-SITE WORK

The use of connected sensors in construction provides up-to-date information about the works. These sensors help guarantee the safety of buildings by identifying defective parts, thus avoiding disasters after completion. Embedding them in concrete is especially useful in monuments and in structures such as bridges and tunnels. IoT technology also allows environmental conditions to be monitored at all times, an enormous advantage for protecting materials during the construction period and preventing damage, with the consequent savings in time and money.
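
A hedged sketch of that monitoring idea: embedded sensors stream readings and a simple rule flags values outside a safe band (sensor names and thresholds below are invented for illustration):

```python
# Toy structural/environmental monitor: flag sensor readings that leave
# a safe band. Sensor IDs and thresholds are invented for illustration.
SAFE_RANGES = {
    "concrete_temp_c": (5.0, 35.0),    # curing temperature window
    "bridge_strain_ue": (0.0, 800.0),  # microstrain on a girder
    "humidity_pct": (20.0, 90.0),
}

def check_reading(sensor: str, value: float) -> str:
    low, high = SAFE_RANGES[sensor]
    if low <= value <= high:
        return f"{sensor}={value}: OK"
    return f"{sensor}={value}: ALERT (safe range {low}-{high})"

for sensor, value in [("concrete_temp_c", 41.2), ("bridge_strain_ue", 350.0)]:
    print(check_reading(sensor, value))
```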

3. MATERIALS TRANSPORT AND FOUNDATION WORK

The transfer of the materials needed to carry out the construction is another part of the sector that can benefit from IoT connectivity. GPS makes it possible to track the vehicle fleets that transport them and to monitor their status, an improvement that helps avoid delays due to breakdowns, and thus unnecessary stoppages on site. Transport and foundation work go hand in hand, since both take place throughout the project.

4. MAINTENANCE AND ENERGY MANAGEMENT

As a consequence of the spread of IoT in the sector, the construction of intelligent buildings is becoming more common. These hyper-connected buildings have the ability, through different systems and technologies, to adapt their operation to the conditions at any given moment. They allow the state of the building and its environment (lighting level, temperature, presence of people, etc.) to be monitored, which favours much more efficient energy management and maintenance, as well as more efficient refurbishment. The new challenge of architecture is to create sustainable buildings that are energy efficient and capable of self-sufficiency.

Intelligent building management systems have become a reality in most countries. The almost endless benefits of intelligent buildings mark the way forward in this area and open the door to possibilities yet to be explored: sustainable, efficient and respectful cities that will improve life for the millions of people who live in them. The integration of IoT with existing construction technologies is, in this sense, a great challenge for the cities of the 21st century.

IoT technology applied to construction never stops having people and their benefit as its ultimate goal; in fact, it helps everyone involved in the construction process. Architects and engineers see their work facilitated thanks to the connectivity between the office and the construction site; technicians and site operators benefit from greater safety and precision in their work; and the final consumer enjoys, thanks to all this, higher quality and efficiency, whether on a road or bridge, at their place of work or in their own home.

The architecture of the future is already built today.


“Data is in the air” with the Data Science Awards Spain 2018

AI of Things    23 November, 2018

Winners of the Data Science Awards 2018.

On November 22nd, Telefónica held the prize-giving ceremony for the third edition of the Data Science Awards Spain 2018, the only awards in Spain that give social recognition to professionals in the world of data and Data Science.

The main hall of Telefónica's central building welcomed more than 80 people to the awards ceremony, which covered three categories: Best Data Scientist, Best Data Journalism Work and Best Entrepreneurship.

The day began with Mercedes Estrada, Product Marketing Manager at LUCA, who welcomed attendees and highlighted the strong participation in this edition, with more than 360 projects presented.

Next came Carme Artigas, founder of Synergic Partners, the company acquired by Telefónica in 2015 from which the initiative originated. Artigas presented the dynamics of the Data Science Awards, which began in May 2018 and passed through various phases before arriving at this year's winners. She also highlighted the maturity of the projects presented this year compared to previous editions:

"In this third edition, the projects presented have shown us the maturity that data science has acquired. The ecosystem of disruptive technologies keeps growing and evolving; it develops continuously in leaps and bounds," she said.

David del Val, director of Core Innovation at Telefónica and CEO of Telefónica I+D, was in charge of the prelude to the awards with a keynote entitled "GANs and AI creativity", in which he showed the state of the art in the family of artificial intelligence algorithms called Generative Adversarial Networks (GANs), used, among other things, to have machines generate content (images, photographs, videos, etc.) from scratch.

David del Val, director of Core Innovation at Telefónica and CEO of Telefónica I+D.

Then the prize-giving began. First, the prize for Best Data Scientist went to Pablo López Álvarez for his project predicting early phases of Alzheimer's. His machine learning model can estimate whether a patient is in an early phase of the disease by studying data from a MEG (magnetoencephalography) machine, which records patients' brain activity through 102 sensors placed on the head. The judges highlighted the viability of this project, which could be deployed in a real medical environment.
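
The post gives no further details of the winning model, so purely as a hedged sketch of the general approach, per-patient features derived from the 102 sensors could feed a standard binary classifier (the data below is synthetic, for illustration only):

```python
# Toy sketch of the general approach: per-patient features from 102 MEG
# sensors feeding a binary classifier. Data is synthetic noise; the
# awarded model and its real features were not published in this post.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 102))    # one aggregate reading per sensor
y = rng.integers(0, 2, size=200)   # 1 = early-phase label (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")  # ~0.5 on noise
```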

In this category, Manuel López Martín received an honourable mention for his methodology used in the presented project.

The prize for the Best Data Journalism Work went to the Civio Foundation, for its in-depth journalistic research on the use of and access to contraceptives worldwide across economic, social and religious variables, and for the visualizations presented to reflect the results obtained.

Europa Press received an honourable mention for the best journalistic strategy with its EpData proposal, a platform that aims to integrate all the opportunities offered by Big Data to improve the quality of information and collaboratively fight fake news on the Internet.

Finally, the prize for the best Big Data business initiative went to Repsol, for an ambitious commitment to digital transformation that has led it to generate more than 400 initiatives based on analytics and data management. Among them is a project that predicts, in real time and 15 minutes in advance, the quality of the product being produced, allowing modifications and improvements before completion; 16 of Repsol's production centres are already benefiting from it.

The honourable mention for the best Big Data business strategy went to Naturgy for its strategic plan to bring Big Data technologies and Artificial Intelligence into the company, ranging from Big Data training at all levels to the creation of a centre of analytical excellence where advanced analytics and Big Data use cases can be developed.

Elena Gil, CEO of LUCA and Global Director of Big Data B2B at Telefónica, closed the day by thanking the attendees, and especially the participants and winners of this edition, for the quality of the projects presented.

Elena Gil, CEO of LUCA and Global Director of Big Data B2B


Cyberintelligence Report: Global Banking Cyber Report

ElevenPaths    22 November, 2018
As the world becomes more digital, new opportunities and threats arise and we tend to focus more on our daily business. As a result, when we are trying to develop a new product, website or application, we use to prioritize speed, convenience and ease of implementation over security.
ElevenPaths has conducted an analysis of 56 of the world’s leading banks. This analysis is based on public archives, web applications and mobile applications from these banks and addresses three key aspects of cybersecurity:
  • Integrated security in mobile applications.
  • Metadata available in public documents.
  • The information we can obtain about service communications and their quality (e.g. open ports on servers, their vulnerabilities, etc.).

To collect information we used four tools:
  • FOCA OpenSource, a self-developed tool (free and open source) that obtains documents through search engines, downloads them, and extracts and analyzes their metadata (see the sketch after this list).
  • Tacyt and mASAPP, two more self-developed tools that allow the visualization of information from mobile apps in official and unofficial markets, as well as finding vulnerabilities in those mobile applications. mASAPP also rates each application using a proprietary scoring system to rank apps from most to least secure: the higher the mASAPP score, the worse the security of that application is considered.
  • Censys, a public OSINT search tool for servers and devices exposed to the Internet. It also allows finding specific hosts and services associated with each bank's domains and seeing how the websites and their certificates are configured.
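
As a hedged illustration of the kind of metadata FOCA surfaces (an analogy, not FOCA's actual implementation), the pypdf library can read the document-information fields of a single public PDF:

```python
# Sketch of FOCA-style metadata extraction from one PDF using pypdf.
# "public_report.pdf" is a placeholder file name.
from pypdf import PdfReader

reader = PdfReader("public_report.pdf")
meta = reader.metadata  # may reveal author accounts, software versions...
if meta:
    for field in ("/Author", "/Creator", "/Producer", "/CreationDate"):
        print(field, "=", meta.get(field))
```
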
mASAPP: Overall risk score per region

ANALYSIS RESULTS

Regarding mobile applications:
  • All the banks analyzed had vulnerabilities in their official applications, caused mainly by failures in code quality. The most common vulnerability was potential SQL injection (illustrated after this list).
  • Banks in Asia, Africa and Latin America had the worst results.
  • We compared what permissions each banking application requested. Despite being in the same industry and providing the same type of service, only one permission was common to all of them: Internet Access.
  • The Middle East was the region with the lowest average number of requested permissions, while Asia was the one with the highest number of requested permissions per application.
  • Intrusive permissions such as access to phone contacts, making calls without user confirmation, reading and writing SMS or reading and writing system settings were present in several analyzed applications.
  • Some African banks have never had a mobile application.
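
For readers unfamiliar with the flaw, here is a generic, hedged illustration in Python with SQLite (not code from any audited app): a query built by string concatenation lets input escape into SQL, while a parameterized query does not.

```python
# Generic SQL injection illustration (Python + SQLite); not code from
# any audited banking app. String-built queries let input become SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (user TEXT, balance REAL)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100.0), ('bob', 50.0)")

user_input = "alice' OR '1'='1"  # attacker-controlled value

# VULNERABLE: the input is spliced into the SQL text itself.
rows = conn.execute(
    f"SELECT * FROM accounts WHERE user = '{user_input}'"
).fetchall()
print("vulnerable query returned:", rows)    # every account leaks

# SAFE: a parameterized query treats the input as data, never as SQL.
rows = conn.execute(
    "SELECT * FROM accounts WHERE user = ?", (user_input,)
).fetchall()
print("parameterized query returned:", rows)  # no rows match
```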


Regarding metadata:
  • We detected hundreds of administrator accounts and several generic accounts with administrator characteristics.
  • Based on the metadata of the detected files, it is possible that many banks still use operating systems currently not supported by their manufacturers.
  • The analysis of public files allowed us to obtain the physical location and names of various servers and printers. Companies should hide this kind of information because of the use a malicious actor could make of it to harm the company.


Regarding servers, hosts and communications:
  • Although almost all hosts use HTTPS, there is still a large number of HTTP services, an insecure protocol (a simple way to flag such hosts is sketched after this list).
  • Half of the banks use Akamai. Traffic mainly passes through North American servers.
  • Banks that do not use Akamai tend to host their services locally. The only exception is Asia, where banks that do not work with Akamai also have their servers in the United States.
  • None of the banks analyzed in Africa uses Akamai; Africa is also the region where most services are hosted locally, followed by the Middle East.
  • The most popular service when Akamai is not involved is FTP, followed by SMTP and different types of databases.
  • Services are hosted mostly in North America. Europe is the second most common option, though far behind North America.
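
As a hedged sketch of how plain-HTTP exposure can be flagged for a list of hosts (the hostnames below are placeholders, not audited banks; a real survey would also inspect redirects, HSTS, certificates and ciphers):

```python
# Sketch: flag hosts that still answer on plain HTTP.
import requests

HOSTS = ["example.com", "example.org"]  # placeholders, not audited banks

for host in HOSTS:
    try:
        resp = requests.get(f"http://{host}", timeout=5, allow_redirects=False)
        print(f"{host}: answers on HTTP (status {resp.status_code})")
    except requests.RequestException as exc:
        print(f"{host}: no plain HTTP response ({exc.__class__.__name__})")
```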





Pablo Moreno González
Sebastian García de Saint-Léger
Helene Aguirre Mindeguia
Pablo Bentanachs

Smart Agriculture: from the plow to the drone

Luis Simón Gómez Semeleder    21 November, 2018

The relentless growth of the world population since the middle of the last century (it is expected that by 2050 the global population will reach 9.5 billion), coupled with the need to limit the use of diminishing natural resources, has left the agricultural sector with no choice but to look for advanced solutions that can respond effectively to this new context. New technological tools, combined with the development of the Internet, have become indispensable for the agriculture of the 21st century.

We need to produce more and better, and we have to do it rationally and efficiently, with an adequate use of resources and in a way that is more sustainable in the medium and long term. Society faces the great challenge of producing twice as much food on less cultivated land, with less labour and with serious water shortage problems that indicators suggest will worsen in the coming years.

In this context of concern for the sustainability, quality and quantity of agri-food production, the objective of the sector is to find new solutions that optimize the use of resources and arable land. Agriculture increasingly relies on technology and on the use of large volumes of data (Big Data). This is known as precision agriculture, and it consists of analysing data collected mainly by intelligent sensors to optimize the management of cultivated land from the agronomic, environmental and economic points of view. The objective is a more efficient and precise use of resources.

Agriculture increasingly relies on technology and the use of large volumes of data

The main method of precision agriculture is to apply the necessary resources at the right time and in the right place. For this, this new type of agriculture employs global positioning systems (GPS) and other electronic means, such as drones, that make it possible to obtain accurate crop data and act accordingly. Through the collection and analysis of data, farmers can optimize resources such as irrigation water, fertilizers or pesticides, which translates into a significant reduction in costs and an improvement in product quality, as well as a more responsible use of the environment.
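
A hedged toy version of that decision logic: combine per-zone soil-moisture readings with a target and irrigate only where and when needed (zones and thresholds below are invented):

```python
# Toy precision-irrigation rule: water only the zones whose soil
# moisture falls below target. Zones and thresholds are invented.
TARGET_MOISTURE = 0.30  # volumetric fraction considered adequate

readings = {"zone-A": 0.22, "zone-B": 0.35, "zone-C": 0.28}

for zone, moisture in readings.items():
    if moisture < TARGET_MOISTURE:
        deficit = TARGET_MOISTURE - moisture
        print(f"{zone}: irrigate (deficit {deficit:.2f})")
    else:
        print(f"{zone}: skip, moisture adequate")
```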

On the other hand, agricultural automation and robotics also play a leading role in the sector and, thanks to artificial intelligence, improve the yields of various processes in the field. Automation is a reality that continues to advance across many tasks: actions such as controlling irrigation from a smartphone or driving a harvester through a virtual reality system are already the day-to-day of many farmers throughout the world. Artificial Intelligence (AI) allows people to program complex tasks on computers, or to configure them to learn and refine their procedures through experience, just as humans do.

Agricultural automation and robotics also have a leading role in the agricultural sector

A technological revolution has started in the agricultural sector, and it has no intention of stopping: projects such as "Smart Agro", promoted by Telefónica and the United Nations Food and Agriculture Organization (FAO) with the aim of transforming agriculture and increasing food security, will soon be implemented in Bucayá, Colombia, benefiting 38 families in the area.

More than one particular technology, smart farming is the confluence of a diversity of technologies that is bringing a true revolution to the value chain and growing rapidly. Through current technology, farmers will improve the quantity and quality of what they produce in a sustainable way, with the consequent benefits for society. Data and technology have become part of agriculture and will mark the way forward.


The Deep Learning Hype

AI of Things    19 November, 2018

Written by Alfonso Ibañez, Data Science Manager at LUCA Consulting Analytics

In the era of Big Data, not a day passes without us reading some news about Artificial Intelligence, Machine Learning or Deep Learning, often without really knowing what these terms refer to. The "experts" of the sector mix and swap the terms with complete naturalness, which only feeds their hype. The simple fact of mentioning them catches the attention of investors and convinces them that these techniques have an almost magical power.

Machine Learning is a scientific discipline within Artificial Intelligence that studies how systems can be programmed to learn and improve with experience without human intervention. To address this problem, new paradigms emerge constantly that allow knowledge to be discovered from data, building on solid statistical and computational principles.

One of the approaches receiving the most interest from the scientific community is neural networks. These networks are inspired by the animal nervous system: a system of cells that collaborate with each other to produce a response to an external stimulus.

As the topology of these systems becomes more complicated, we approach what is known as Deep Learning, a marketing-friendly term coined to refer to complex neural networks. The idea behind this paradigm is that with a large number of neurons and many levels of interconnection between them, predictions on complex data sets can be improved.
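
To make "deep" concrete, here is a hedged, minimal sketch in PyTorch: several stacked layers of interconnected neurons are what distinguish such a network from a classical shallow one (sizes are arbitrary, for illustration):

```python
# Minimal deep neural network in PyTorch: several stacked layers of
# interconnected neurons, which is what "deep" refers to in practice.
import torch
from torch import nn

deep_net = nn.Sequential(
    nn.Linear(64, 128), nn.ReLU(),   # layer 1
    nn.Linear(128, 128), nn.ReLU(),  # layer 2
    nn.Linear(128, 128), nn.ReLU(),  # layer 3
    nn.Linear(128, 1),               # output: one prediction
)

x = torch.randn(8, 64)    # a batch of 8 examples, 64 features each
print(deep_net(x).shape)  # torch.Size([8, 1])
```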

The value of Deep Learning in companies

The use of Deep Learning in business is booming. More and more companies recognize the value of these techniques, since they allow us to work more efficiently and provide a real advantage over the competition. Their emergence in the business world was favoured by the confluence of three key factors: algorithms, computing and data.

On the one hand, the algorithms keep improving, with the continuous refinement of existing techniques and the appearance of new ones. On the other hand, the evolution of computing capacity, together with cheaper hardware, has made it possible to analyse gigabytes, terabytes or even petabytes of information in parallel at high speed, allowing us to learn and make decisions far more efficiently and quickly than was possible only a few years ago.

The last factor is access to large amounts of data from which to learn. Such data can come from multiple sources, such as traditional business systems, social networks, mobile devices, the Internet of Things or smart cities, among others.

Thanks to the presence of Deep Learning at events, in meetings and in the press, a large part of society is fascinated by the potential of these techniques and believes that these statistical models can be the perfect solution for any complex situation. However, the reality is not as glamorous as a journalist may make it look: it is the Data Scientists (holders of "the sexiest profession of the 21st century") who perform the "magic". If the knowledge of the domain in question, the ability to deal with diverse data, and the judgement to decide which algorithms to use are limited, then the machines' capacity to "learn" will be limited too.

The world as we know it is changing thanks to the potential of Deep Learning techniques, and with time it will surely be present in every aspect of our lives. According to Bill Gates, we always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten. We will have to wait until then to know whether Deep Learning has really become part of our daily lives or remains a simple hype.


5 in 5: How to successfully implement AI into your business (EP 2)

AI of Things    19 November, 2018

As many firms embark upon their digital transformation, confusion is growing around the use of Artificial Intelligence and what it will mean for the business world. As we discussed in episode 1, Artificial Intelligence (AI) is the endeavour to build computer systems capable of solving complex problems in the same way a human would.

Proper implementation, planning and care must be taken for its impact on a business to be successful. Last week, we considered 5 ways in which AI can help a business, from improving efficiency to increasing revenue; we will now delve into 5 ways AI can be successfully implemented in a business.

1) Make friends with AI

When introducing anything new into a business, it is crucial to do thorough due diligence on the topic. Firms must know exactly what AI is, its uses, and the difference it will make to them before beginning implementation. AI comprises many elements, so it is vital to become familiar with all of the areas that form it, such as machine learning, predictive analytics and big data.

Firms will also need to clean their current data sets. Often, data within a company is spread across many platforms and teams, and much of it has little to no use, remaining an unproductive resource. Firms need to decide which data holds value for them, and which is null, to help the system run at its best once online (see the sketch below).
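
As a hedged illustration of that clean-up step, with pandas one can drop duplicates and decide what to do with empty or low-value columns (the column names and data below are invented; each firm must decide what "low value" means for it):

```python
# Sketch of a pre-AI data clean-up pass with pandas.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "purchase":    [9.99, 9.99, 4.50, None],
    "legacy_code": [None, None, None, None],  # dead column, no value
})

df = df.drop_duplicates()                 # remove exact duplicate rows
df = df.dropna(axis="columns", how="all") # drop entirely empty columns
df = df.dropna(subset=["purchase"])       # drop rows missing key fields
print(df)
```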

2) Identify areas of use and what it will do: Create a strategic and specialised project

The system must be built with balance. The role of AI, and the purpose of its creation, is ultimately to aid us in our practices. Too frequently, however, AI systems are designed around the team's vision of its goals, when in reality they should be designed with the business goals at their core.

For the application of AI to be successful, firms must identify the areas where they want or need it. Without a specific role and responsibilities (similar to a human job description), money and time may be lost, and with them the very benefit of AI. A detailed description of the role AI will play and of its functions will greatly benefit the business, as it sets boundaries and helps with the adjustment to this modern technology.

It is important to bring in a team of experts to create and execute a specific pilot project; outside experts can be invaluable for their consultancy and knowledge. The initial pilot usually covers a relatively short period (2-3 months), bringing together a small team of external and internal employees to set straightforward goals. It is important to integrate external experts with internal personnel to ensure that the plans and strategies are set up with a deep understanding of the business as well as expertise in AI.

3) Teach employees

AI also offers the opportunity to become part of workers' daily routine. Many employees are wary of this technology, fearing it will make them redundant. By introducing it as a way to augment their daily tasks and, ultimately, make their lives easier, employees will feel more comfortable with the change. Companies must be transparent about their plans for AI so that employees are satisfied and workflow issues don't arise as a result.

The biggest challenge will always be people and human behaviour. It will be vital to re-skill the workforce, shifting the focus to effective decision-making. This creates a culture of continuous learning and adaptation, which can demand a lot from employees, so firms need to ensure the right training and development is provided. Flexibility is also needed to ensure positive reactions to the changes or issues that may arise.

Figure 2, Successful AI implementation can greatly improve business practice

4) Start small

By starting small, businesses will be able to learn how to use the AI, collect feedback on its performance, and expand as necessary. At first, output will be of higher value to the company if the AI is able to focus on a specific area or dataset. It will also make the integration easier for employees as they will be able to give definitive comments on its performance which makes remedying the issues a much more streamlined process. The AI needs to be integrated as part of existing business processes, so there is no need for drastic cultural changes or new processes.

5) Remember the cognitive gap, respect your intelligence

The successful implementation of AI can take a long time. Too many firms believe this process requires less time than it does; the internal preparation beforehand needs a lot of careful planning. As we have seen many times in previous blogs, human intelligence remains the most fundamental asset today. It is important to respect the capabilities we hold, to ensure that AI remains under our control.

Experts calculate that around 40% of financial processes can be automated using AI, and that it will save half a billion people two hours per day this year alone. Strong executive leadership is needed to support the implementation process, and business and technical leaders should work together to ensure business goals stay in focus.

One should note that the role of AI will be different for every business. It's important to recall that there is no 'one-size-fits-all' with this technology; the needs of a call centre will vary drastically from those of a bank's fraud detection unit. Firms must be confident and bold in their approach; an offensive digital strategy has proved the most effective way to avoid the dreaded digital disruption.
