Dare with Python: An experiment for all (intro)

Paloma Recuero de los Santos   16 January, 2019

As we did in our experiment on the Titanic dataset in Azure Machine Learning Studio, we will continue with the "learning by doing" strategy, because we believe that the best way to learn is to carry out small projects from start to finish. A Machine Learning project may not be linear, but it has a series of well-defined stages:

1. Define the problem
2. Prepare the data
3. Evaluate different algorithms
4. Refine the results
5. Present the results

On the other hand, the best way to get to know a new platform or tool is to work with it. And that is precisely what we are going to do in this tutorial: get to know Python as a language and as a platform.

What is NOT necessary to follow this tutorial?

The objective of this experiment is to show how a simple Machine Learning experiment can be carried out in Python. People with very different profiles work with ML models: a social sciences researcher, a financial expert, an insurance broker or a marketing agent who wants to apply a model (and understand how it works); a developer who already knows other languages or programming environments and wants to start learning Python; or a Data Scientist who develops algorithms in R and wants to start working in Python. So, instead of listing the prerequisites for following the tutorial, we will detail what is not needed:

- You do not have to understand everything at first. The goal is to follow the example from start to finish and obtain a real result. You can take note of the questions that arise and use Python's help("FunctionName") function to learn about the functions we are using.
- You do not need to know exactly how the algorithms work. It is useful to know their limitations and how to configure them, but you can learn that little by little. The objective of this experiment is to lose the fear of the platform and keep learning with other experiments!
- You do not have to be a programmer. The Python language has a fairly intuitive syntax. As a clue to start understanding it, look at function calls (e.g. function()) and variable assignments (e.g. a = "b"). The important thing now is to start; little by little you can learn all the details.
- You do not have to be an expert in Machine Learning. You can learn gradually about the advantages and limitations of the different algorithms, how to improve the different stages of the process, or the importance of evaluating accuracy through cross-validation. As this is our first project in Python, we will focus on the basic steps. In other tutorials we can work on other tasks, such as preparing data with Pandas or improving the results with PyBrain.

What is Python?

Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Its syntax emphasizes code readability, which makes debugging easier and therefore boosts productivity. It offers the power and flexibility of compiled languages with a smooth learning curve. Although Python was created as a general-purpose programming language, it has libraries and development environments for every phase of the Data Science process.
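To give a first flavour of that readability, here is a minimal snippet illustrating the constructs mentioned above: variable assignment (a = "b"), function calls (function()) and the built-in help(). The variable and function names are invented purely for illustration; they are not part of the experiment itself.

```python
# A first taste of Python syntax. The names below (message, describe, ...)
# are invented for illustration only.

message = "Hello, Machine Learning"          # assignment of a variable

def describe(name, n_rows, n_cols):
    """Return a short, readable description of a dataset."""
    return "{}: {} rows x {} columns".format(name, n_rows, n_cols)

print(message)                               # a function call
print(describe("my_dataset", 150, 4))        # calling our own function

# help() shows the documentation of any function, for example:
help(print)
```

Running these few lines in any of the environments described below is enough to see why Python is so often praised for being easy to read.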
Its power, its open-source nature and its ease of learning have allowed Python to take the lead over other data-analytics and Machine Learning languages such as SAS (until now the leading commercial software) and R (also open source, but more typical of academic and research environments).

Python was created by Guido van Rossum in 1991 and, curiously, owes its name to its creator's great fondness for Monty Python.

In addition to libraries for scientific and numerical computing, data analysis and data structures, and Machine Learning algorithms, such as NumPy, SciPy, Matplotlib, Pandas or PyBrain, which will be discussed in more detail in other posts of this tutorial, Python offers interactive programming environments oriented towards Data Science. Among them we find:

1. The shell, or Python interpreter, which can be launched from the Windows menu. It is interactive (it executes commands as you type them) and is useful for simple tests and calculations, but not for development.
2. IPython, an extended version of the interpreter that adds colour highlighting of lines and errors, additional shell syntax, and tab autocompletion.
3. IDEs, or Integrated Development Environments, such as Ninja IDE, Spyder, or the one we will work with: Jupyter. Jupyter is a web application that allows you to create and share documents combining executable code, equations, visualisations and explanatory text. Besides Python, it supports more than 40 programming languages, including R, Julia and Scala, and it integrates very well with Big Data tools such as Apache Spark.

What steps are we going to take in this tutorial?

So that the posts do not become too long, we are going to divide the work into several parts:

- Introduction: An experiment for all
- Python for all (1): Installation of the Anaconda environment
- Python for all (2): What are Jupyter Notebooks? Creating a notebook and practising simple commands
- Python for all (3): SciPy, NumPy, Pandas… What libraries do we need?
- Python for all (4): We start the experiment properly. Data loading and exploratory analysis (dimensions of the dataset, statistics, visualisation, etc.)
- Python for all (5), final: Creation of the models and estimation of their accuracy (a minimal preview of this workflow is sketched below)
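As a preview of where the series is heading, the sketch below strings together the steps listed above: loading a dataset, taking a quick exploratory look at it, and estimating a model's accuracy with cross-validation. It is only a hypothetical illustration under our own assumptions: the dataset (the Iris data bundled with scikit-learn) and the model (a k-nearest-neighbours classifier) are chosen here for convenience, and the series will build each step up properly.

```python
# Minimal, hypothetical sketch of the end-to-end workflow described above.
# Assumptions for illustration: the Iris dataset bundled with scikit-learn
# and a k-nearest-neighbours classifier.
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# 1. Load the data
iris = load_iris()
df = pd.DataFrame(iris.data, columns=iris.feature_names)

# 2. Exploratory analysis: dimensions of the dataset and basic statistics
print(df.shape)
print(df.describe())

# 3. Create a model and estimate its accuracy with 10-fold cross-validation
model = KNeighborsClassifier(n_neighbors=5)
scores = cross_val_score(model, iris.data, iris.target, cv=10)
print("Mean accuracy: {:.3f} (+/- {:.3f})".format(scores.mean(), scores.std()))
```

Steps 1 and 2 correspond roughly to "Python for all (4)", and step 3 to "Python for all (5)", where the choices of dataset, algorithm and evaluation strategy will be explained rather than assumed.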