Rock around the clock, our research at DEFCON

ElevenPaths    28 August, 2018

In the world of Threat Intelligence, determining the attacker's geographical location is one of the most valuable pieces of data for attribution techniques. Even if it is not always perceived that way, this information may steer an investigation one way or another. Among the most sought-after details are where the author comes from, where he or she lives, or where the computer was located at the time of an attack.

We focused our research on taking advantage of this kind of “time zone” bug to track Android malware developers. We will describe two very effective ways to find out the developer’s time zone. We have also checked whether these circumstances bear any real relation to malware, diving into our database of 10 million APKs.

AAPT time zone disclosure bug 

The Android app development kit (the Android SDK) comes with a tool called “aapt”. This program packs the files that make up the application and generates an .apk file, which basically corresponds to the ZIP format.

If the aapt tool is used directly from the command line, or via a development plugin outside Android Studio, the files composing the APK will be generated with a date following this format: 1980-01-01 [offset_GMT]:00:00, where [offset_GMT] represents the time zone configured in the operating system of the machine where the app is being packed.
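
The leak is easy to spot in the raw ZIP headers of the APK. As a minimal illustration (a sketch of our own, not part of aapt or of any official tooling), the following C++ snippet reads the first local file header of an archive and decodes its DOS modification date and time, following the public PKZIP header layout; if the date decodes to 1980-01-01, the hours field reveals the GMT offset of the machine that packed the file:

// Minimal sketch: decode the modification date/time of the first entry
// of a ZIP/APK (PKZIP "local file header" record). Error handling is minimal.
#include <cstdint>
#include <cstdio>
#include <fstream>

int main(int argc, char* argv[]) {
    if (argc != 2) { std::fprintf(stderr, "usage: %s file.apk\n", argv[0]); return 1; }

    std::ifstream f(argv[1], std::ios::binary);
    uint8_t hdr[30];                                 // fixed part of the local file header
    if (!f.read(reinterpret_cast<char*>(hdr), sizeof(hdr))) return 1;

    // The signature "PK\x03\x04" marks a local file header.
    if (!(hdr[0] == 'P' && hdr[1] == 'K' && hdr[2] == 3 && hdr[3] == 4)) return 1;

    // Offsets 10-11 hold the DOS time, offsets 12-13 the DOS date (little endian).
    uint16_t dosTime = hdr[10] | (hdr[11] << 8);
    uint16_t dosDate = hdr[12] | (hdr[13] << 8);

    int seconds = (dosTime & 0x1F) * 2;              // bits 0-4: seconds / 2
    int minutes = (dosTime >> 5) & 0x3F;             // bits 5-10: minutes
    int hours   = (dosTime >> 11) & 0x1F;            // bits 11-15: hours
    int day     = dosDate & 0x1F;                    // bits 0-4: day
    int month   = (dosDate >> 5) & 0x0F;             // bits 5-8: month
    int year    = 1980 + ((dosDate >> 9) & 0x7F);    // bits 9-15: years since 1980

    std::printf("first entry modification time: %04d-%02d-%02d %02d:%02d:%02d\n",
                year, month, day, hours, minutes, seconds);
    // If this prints 1980-01-01 HH:00:00, HH is the packer's GMT offset.
    return 0;
}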

The figure below shows a simple .apk generated from the command line with aapt on a computer whose time zone is configured to GMT+3.

GMT offset in the modified time field.

As observed, the modification time of the files is 01-01-80 and the hour is “03”, which corresponds to GMT+3. We observed this behavior with different real apps and time zones. Why?
During the process where aapt adds a new file to an .APK (ZipFile.cpp, line 358), you may observe in line 500 a call to “setModWhen”, using the variable “modWhen” as an argument.

Calling setModWhen in the aapt source code.

But going back through the code, there is no place where “modWhen” gets a useful value. It just keeps the “0” it was initially assigned in the same file (ZipFile.cpp, line 367):

Setting modWhen in the aapt code.

The setModWhen function will therefore always be called like this:

pEntry->setModWhen(0);

Inside this function (ZipEntry.cpp, line 340), the modWhen variable (from now on referred to as “when”) is used in line 351 as part of this operation:

even = (time_t)(((unsigned long)(when) + 1) & (~1));

Which, taking into account the value of “modWhen”, will be evaluated like this:

even = (time_t)(((unsigned long)(0) + 1) & (~1));

The result is (obviously) “0”. This value is stored in the variable “even”, which is later used as an argument to the “localtime” function. That function fills in the “tm *ptm” date structure, which is then used to set the date and hour of the “modified” field for the files added to the .APK itself.
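
To see this behavior outside aapt, here is a minimal, self-contained sketch (our own illustration, not aapt code) that reproduces the same arithmetic and passes the result to localtime:

// Reproduce setModWhen's arithmetic and observe the time zone leak.
#include <cstdio>
#include <ctime>

int main() {
    unsigned long modWhen = 0;                                       // aapt always passes 0
    time_t even = (time_t)(((unsigned long)(modWhen) + 1) & (~1));   // still 0

    struct tm* ptm = localtime(&even);                               // 0 is taken as the epoch
    if (ptm != nullptr) {
        // On a machine configured for GMT+3 this prints "01-01-70 03:00:00";
        // aapt later corrects the year to 80, giving the "01-01-80 03:00:00" already seen.
        std::printf("%02d-%02d-%02d %02d:%02d:%02d\n",
                    ptm->tm_mday, ptm->tm_mon + 1, ptm->tm_year % 100,
                    ptm->tm_hour, ptm->tm_min, ptm->tm_sec);
    }
    return 0;
}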

The setModWhen function inside the aapt source code.

Because the timestamp (the “even” variable) used as an argument for localtime is not valid, the date generated for the files is not the real one but 0. There is a correction for the year (it is set to 80 if lower), and the result finally gets the format already described: “01-01-80 [offset_GMT]:00:00”.

The next figure shows how “even” is set to 0, just before the localtime function receives it as an argument.

The variable “even” at runtime.

The code goes on and splits the data (day, month, year, hours, minutes and seconds) so each value can be used separately (in this case, printed separately on the screen). The order in which localtime returns the result is: seconds, minutes, hours, day, month and year. That is, for example, in the first position (0x006A0E10) you find 4 bytes for the seconds, and in the last one (0x006A0E24) another 4 bytes for the year.
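
That ordering is simply the field order of the standard C “struct tm” that localtime fills in. A quick sketch (again, just an illustration) prints the byte offset of each field and shows the same run of consecutive 4-byte integers observed in memory:

// Print the byte offset of each struct tm field: seconds, minutes, hours,
// day, month, year appear as consecutive ints on most platforms.
#include <cstddef>
#include <cstdio>
#include <ctime>

int main() {
    std::printf("tm_sec  at offset %zu\n", offsetof(struct tm, tm_sec));
    std::printf("tm_min  at offset %zu\n", offsetof(struct tm, tm_min));
    std::printf("tm_hour at offset %zu\n", offsetof(struct tm, tm_hour));
    std::printf("tm_mday at offset %zu\n", offsetof(struct tm, tm_mday));
    std::printf("tm_mon  at offset %zu\n", offsetof(struct tm, tm_mon));
    std::printf("tm_year at offset %zu\n", offsetof(struct tm, tm_year));
    return 0;
}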

Result from the localtime function as seen in memory.

Following the colours in the figure, the returned information can be read field by field.

So, definitely, the localtime function is the one returning this offset (in this case +3), taking it from the operating system. Aapt will later round the date up to 01-01-80 because this is the “epoch” of the PKZIP standard. The reason may be that localtime tries to adapt every date to the time zone where the computer is supposed to be located.


According to the documentation of the localtime function, this should not happen: it is specified that if the function gets a null or “0” value as an argument, the return value should be null. So, when is localtime getting the GMT offset and returning it? On Windows, if the TZ (time zone) variable is not set in the application itself, the localtime function will try to extract the time zone information from the system, and it will use this data whenever it receives an argument value (valid or not). An invalid timestamp such as “null” or “0” is simply taken as hour “0”, and the returned value will contain the GMT offset, which ends up neatly added to the place where the hours should be.
On UNIX/Linux this peculiarity exists as well. If a developer uses aapt from the command line, the GMT offset of his or her time zone will be “added” to the modification time of the files inside the APK. Looking at the aapt source code, the setModWhen function uses localtime_r instead of localtime (the code is the same, but which one is used depends on the system it runs on), and the argument passed to it is still the “even” variable (with a value of 0). This function behaves basically the same as on Windows, but there is no TZ variable to decide: it will always add the time zone set in the operating system.

What to conclude, then? Localtime is not handling errors as it should. When receiving a 0 or null argument, it should return null, not 0 plus whatever your GMT offset (TZ on Windows) happens to be. On the other hand, aapt makes a mistake by feeding this function a constant 0 as its argument.


GMT time zone calculation from the certificate

As said, .APK files (and .jar files, for this particular technique) follow the PKZIP standard. That is, for all intents and purposes they are .zip files and share most of the PKZIP specifications. If the APK is not built directly with aapt, there is no chance to learn the creator’s time zone this way, and all the “modification time” fields of the files inside the zip should be the “right” ones. However, a few years ago we found another factor that allows us to know the time zone where the developer compiled the application, just as interesting as the one already mentioned and useful as a complementary method. The method consists of calculating the difference between the real (local) timestamps of the files and the timestamp of the certificate used to sign the APK (the certificate date is stored in UTC, so we have enough reference points to calculate the time zone).
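
As a simplified illustration of the arithmetic (assuming the two timestamps have already been extracted with a ZIP parser and a certificate parser, and that both are expressed as epoch seconds), the developer’s offset is just the rounded difference in hours:

// Estimate the developer's GMT offset from a file's local "modification time"
// inside the APK and the certificate's UTC signing time.
// The two example values below are made up for illustration.
#include <cmath>
#include <cstdio>

int main() {
    long long fileLocalTime = 1534752000LL + 3 * 3600;  // local timestamp, e.g. GMT+3
    long long certUtcTime   = 1534752000LL;             // same instant, stored in UTC

    // Round to the nearest half hour to cope with :30 offsets and with the
    // small delay between packing and signing (and keep DST in mind!).
    double diffHours = (fileLocalTime - certUtcTime) / 3600.0;
    double offset = std::round(diffHours * 2.0) / 2.0;

    std::printf("estimated developer offset: GMT%+g\n", offset);
    return 0;
}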

UTC time vs. the ZIP file time gives the offset and, thus, the time zone (map from timeanddate.com).

Relation with malware

We tried to establish a relation between:

  • Malware/adware creators and the way APKs are compiled (using aapt from the command line). 
  • Malware/adware creators and the way ad-hoc and disposable certificates are created. 

For this experiment, we took 1000 files (unless stated otherwise) from those exhibiting the leak, in every flavor (1000 files leaking GMT+1, 1000 leaking GMT+2… etc.), and checked them for malware.
With the AAPT disclosure bug:


Samples with the AAPT time zone disclosure.

Green columns are not representative because too few samples were used.

Leaking because of a disposable certificate:

APKs with file/certificate datetimes (do not forget DST!).

So we can conclude that, basically, GMT+4, GMT+5, GMT+8, GMT-6 and GMT-7 are the time zones producing the most malware. Why this small difference between techniques? For example, with the first aapt bug, the predominant time zones producing malware are GMT+4, GMT+8 and GMT-7. With the certificate technique, GMT+5, GMT+8 and GMT-6 are the ones producing the most malware. These GMT offsets correspond to parts of Russia, China, and the west coast of the United States. We think this difference is due to Daylight Saving Time. These techniques are tied to DST, so some countries may show a ±1 hour difference depending on the season. China does not use DST (and neither has Russia for a few years now).

Aside from that, we know our database contains about 6% malware in any set we may take that does not show these characteristics. Using this as a “correction factor” for comparison, we finally get these numbers:

Comparison table.

Metadata
As one of the techniques related to metadata, we show how all the strings automatically generated by Android Studio live in specific components created by the IDE itself, while the text strings written by the developer are found in other files, not associated with any specific component. For example, when executing:


./aapt dump --values resources app.APK | grep '^ *resource.*:string/' --after-context=1 > output.txt

Extracting all the resources of an Android application, filtering by text strings.

We directly get the strings written by the developer, which will very likely be in his or her native language.



Conclusions and future work

We have presented two techniques to leak the time zone from an app. One of them, related to an aapt bug, not only shows a bug in the way dates are handled, but also a possible problem with a system function (localtime) not honoring its specification. This may affect other programs in other ways.
By studying these techniques, we gain a new way of potentially detecting automated malware creation by analyzing when and how the certificates used to sign these apps are created. Aside from the statistics about where the malware comes from, obtained by analyzing its time zones, this may be used as an important feature in machine learning systems for the early detection of Android malware.

Aside from that, we have shown some tools and tricks for a quick view of all this useful information around APK metadata.
Future work should be more accurate about DST, taking the season into account to classify malware, and maybe using more samples to get better conclusions.
This is just a briefing of the complete paper, which you may find here:


Safer and controlled waters with smart buoys

Beatriz Sanz Baños    23 August, 2018

The Internet of Things is not limited to dry land. We already saw how it inserted itself into the sea to improve the experience of bathers and ensure more sustainable tourism in the post “10 sensors that make the beaches smart”.

However, the technology has gone even further into open waters with the help of smart buoys that measure the level of sea pollution or locate larger schools of fish. They can also be used in pools to measure the level of deterioration of the water, regardless of whether it is salty or chlorinated.

These floating elements are already familiar to everyone. Until now their function has been to signal the safe bathing area, the channels for boats to enter and exit, or specific points of reference at sea. However, with the application of IoT, their functions can bring many more benefits to bathers, as well as to coastal authorities, fishermen and others.

The function of smart buoys is determined by the programming of the sensor they carry. We can find different types of uses for these new connected devices both at the beach and in swimming pools:

  • Beaches: In addition to facilitating the monitoring of the water temperature and its quality, they can help make the beaches much safer, since they can be used to monitor boats and/or control restricted access. This has been a very common problem this summer; bathers have reported boats getting too close to the shore, crossing the 200-meter safety distance established by law. In this way, the relevant authorities can receive the signal with enough time to act and stay alert. Likewise, the safety of swimmers will also be reinforced with these buoys, since they allow jellyfish banks to be monitored. Once detected, the buoys send a warning and can even be reoriented seaward to get a better reading, thus preventing the jellyfish from reaching the shore of the beaches, where they would eventually die, but not before causing problems among bathers.
  • Fishing: This sector is also going through a digital transformation and adapting new technology to its daily tasks. For example, in fishing, sailors nowadays have satellite-connected buoys that can tell them where the schools of fish are located. These buoys were created by the company Marine Instruments, which places great emphasis on control standards to ensure that fishing resources are maintained for the next generations. They also make fishing more efficient, since they reduce the fuel consumed to capture the same amount of fish and provide an estimate of the fish population in an area, which indicates how much fishing can be done and how it affects the environment.
  • Detecting polluting substances: German researchers from the Karlsruhe Institute of Technology (KIT) developed a smart monitoring project two years ago. They used a multi-sensor buoy that allowed them to make high-precision measurements as well as monitor bodies of water. In this way, they could measure the quality of water at different depths and configure parameters to measure things like oxygen concentration, temperature or the presence of greenhouse gases, then analyze the captured data, either in the buoy itself or remotely, aggregating the information from many buoys. Its power supply comes from wind energy and solar cells, and the measurement system combines methane and CO2 sensors, flow-direction sensors, sample analysis systems and a meteorological measurement station. This system was born mainly to detect the so-called blue-green algae, which grow uncontrollably in rivers, maritime areas and lakes, release toxic substances and can kill the fauna of certain aquatic environments. This process has kept evolving and, along the way, similar projects have emerged, such as BRAAVOO (Biosensors, Reporters and Algal Autonomous Vessels for Ocean Operation), a project of buoys with biosensors that will monitor marine pollutants. It was born with the aim of trying to stop the degradation of marine water quality in a biological and chemical way, thanks to buoys with chips that monitor marine pollutants in real time using three biosensors: bacterial sensors, immunosensors and algae sensors.
  • Swimming pools: the maintenance of swimming pools is a never-ending task. Since it is stagnant water in which many people bathe during the day, chlorine and pH levels have to be controlled and regulated constantly. To do this, the sensors on the buoys are the most appropriate solution, since they can carry out exhaustive control in real time and send a warning to the person responsible for the pool when the quality of the water is compromised. The message quickly reaches the wearable or the device that is connected to the buoy. Then the person in charge only has to see which level is failing, or whether the temperature has risen above or fallen below the indicated value, and correct it. To this end, mobile apps have also been created so that users who have a private pool at home can also be aware of the quality of their water at any time.

All these applications of connected sensors, together with the processing of the collected information that makes it possible to know the state of the water in detail, reflect the natural way in which we adapt our everyday lives to new technologies in any context, and how these improve our quality of life as well as the cities and natural areas that surround us. They also highlight the importance of Big Data and of knowing how to analyze the data collected by these sensors in order to solve the problems described.

The next big battle for energy efficiency … air conditioners

Beatriz Sanz Baños    21 August, 2018

“The growing demand for energy from air conditioners (ACs) is one of the critical points in the debate about energy and sustainability. It is necessary to reduce their energy consumption, generate savings and avoid emissions.” These words were delivered by the Executive Director of the International Energy Agency (IEA), Fatih Birol, in the presentation of the study “The Future of Cooling” this May.

Currently, the electricity consumption of air conditioners and air conditioning systems represents 20% of the total consumption of buildings and 10% of total world electricity consumption.

In the coming decades, the huge demand for AC is going to become one of the main sources of electricity consumption worldwide due to the growth of the global population and the increase of income level in warm countries.

For example, in Mexico and Brazil the percentage of households with AC is only 16%, while in Japan and the US it exceeds 90%. Predictions suggest that in 2050 two thirds of the world’s households could have air conditioning, whose electricity consumption would represent more than 35% of the total.

the huge demand for AC is going to become one of the main sources of electricity consumption worldwide

For this reason, the IEA highlights the need to invest in efficient ACs, which would reduce AC power consumption by 45% compared to the base case.

At Telefónica we have been working in the field of Energy Efficiency for several years, deploying Smart Energy IoT solutions so that companies can reduce energy consumption and therefore increase their competitiveness.


Specifically, the Smart Energy IoT solution allows intelligent energy management based on the measurement of consumption, remote control and big data through a cloud platform that facilitates decision-making by managers. Thus, the efficient management of lighting and air conditioners can save between 10% and 30%, contributing to the reduction in consumption that is so necessary in the medium and long term.

The evolution of TV – Addressable Advertising: Creating a Better, More Personal TV & Video Experience

AI of Things    21 August, 2018

Written by Natalia Barrera Palomero – Digital Consultant in LUCA Advertising

As a powerful channel with broad reach, it is no surprise that TV currently accounts for $70 billion in U.S. marketing spend. For nearly 70 years now, TV advertising practices have remained practically unchanged — that is until now.

A perfect storm of data proliferation, technology advancements and shifting consumer behaviour is upon us, which means marketers are entering the new age for how TV advertising is planned, targeted, purchased and measured. Brands can maximize the opportunity TV provides to connect with consumers.

Figure 2. US Ad Spending: The eMarketer Forecast for 2017

On average, we each have at least three connected devices, and throughout the day, we are consuming content wherever and whenever we like. With a Hulu and Netflix mentality, it’s no surprise that we are entering the era of “TV everywhere.” We want our content on demand so much so that even channels such as HBO, CBS and ESPN are creating TV anywhere experiences. Ten years ago, phrases such as “cord cutter” and “binge watching” were not part of a marketers’ vocabulary. Today, the reality is that consumers are seamlessly switching between screens and channels around the clock.

 Figure 3. The ESPN Lab found that combining TV commercials with digital video ads can increase awareness. 
Source: WSJ

It’s an exciting time for marketers, with more ways than ever to reach and engage with their best customers and prospects. However, this rapid channel growth also means there are more ways than ever for marketers to drop the ball. They could waste dollars by targeting less valuable prospects, overexposing others, delivering a poor brand experience or failing to coordinate multitouch, “surround sound” campaigns effectively across channels for maximum impact. For these reasons, it’s important for marketers to leverage the addressability of both traditional and emerging channels, including addressable TV. 

Top trends in shifting consumer behavior that marketers must take note of when thinking about how to devise their best TV strategy:

1. TV is still the king of video. TV is still the dominant channel when it comes to reaching the largest audience, which means addressable TV is a valuable opportunity to maximize the power of TV.

2. Mobile is the first screen for online video. The number one device for consuming streaming or downloaded video content is the mobile phone. Consider a surround-sound approach where an addressable TV strategy is paired with an addressable mobile video campaign targeted to the same audience, making it possible to reach people regardless of the screen they are viewing.

3. Cord cutting is on the rise. People are moving away from traditional pay TV. Technological and demographic trends virtually guarantee that cord cutting will continue to rise as smartphone and tablet ownership increases and millennials become a greater share of the adult population. As OTT options such as Roku, Apple TV, Chromecast and Amazon Fire are upgraded, consumers will have more and more options to consume online video without sacrificing quality or screen size. As a result, more consumers will be reconsidering whether they need to continue paying for traditional TV. By understanding this shift in consumer behaviour, marketers can devise a strong addressable media strategy that leverages the opportunity addressable TV provides, both on its own and also when it’s coordinated with additional channels such as mobile and online.

4. You better add video or OTT options if you want to reach millennials. Millennials are the most device agnostic, with more than one-third saying they don’t mind watching video on a portable device even if it means a smaller screen. That’s more than double the rate of those age 35 and older. This decentralized viewing can create headaches for marketers who need to start a relationship with millennials during this stage of their lives, when they are most open to trying out new brands and have yet to settle down. On the plus side, marketers who do manage to reach this audience will find them much more open to advertising than the average individual. In fact, millennials are more than four times more likely to say video ads they view on their cell phone are useful. While the challenge is great, so is the potential reward. Including online and mobile video into your strategy is key.

5. Online video has its perks. The growth in online video viewing creates many opportunities for marketers. Online audiences can be targeted more easily and served advertising that is relevant, responsive and measurable. While CPMs for online video ads are generally lower than those for TV, marketers can use those savings to negotiate costs based on clicks, completed views or transactions. From a creative perspective, online video also gives advertisers the opportunity to develop interactive creative and even make some elements of the creative dynamic.

TV is still the medium that provides brands with massive reach. TV commercials are effective at grabbing attention, telling a story and connecting with consumers unlike other advertising mediums. Now, with advancements in TV technology, marketers have addressable targeting options that have typically only been possible through other channels such as direct mail, email and digital.

What is Addressable TV?

Figure 4. How addressable TV works Source: Big Data made simple 

The concept of addressable TV is simple: deliver TV advertising like one targets email or direct mail, using household-level audience data such as income, lifestyle interests, shopping behaviour and family composition.

Like a piece of mail that arrives in your mailbox, the TV ad can be tailored to the household, with different ads delivered to different households simultaneously across the same ad unit. This means the art and science of traditional addressable advertising channels now is possible through TV.

To target at the household level via addressable TV, one must buy from a TV operator (some examples in the US: Cablevision, Comcast, Dish Network, DIRECTV) that has the set-top box technology to target and deliver the one-to-one ads.

Figure 5. Custom Ad delivery Source: Experian.com

Addressable TV is about the person and not the program. You and your next-door neighbor may be watching the same show, but through the power of Addressable TV, end up viewing different ads.

How Addressable TV Works

Process for making an addressable TV campaign come to life:

1. Build an audience: The advertiser determines the target audience based on first-, second- or third-party data.

Figure 6. An audience is determined based on data  Source: Experian.com

2. Match audience. A neutral third party anonymizes and matches the target audience to the TV operator’s subscriber universe in a privacy-compliant manner.

Figure 7. Audience matching based on the advertisers target audience Source:Experian.com

3. Launch campaign.

Figure 8. Campaign Launch based on zone and segment Source:Experian.com

4. Measure the results. At the end of the campaign, the TV operator sends ad exposure data and the advertiser provides cross-channel sales data, making it possible to create closed-loop campaign analytics that show how the campaign drove sales both in-store and online.

By focusing on targeting individual households rather than specific programs, addressable advertising is opening new doors. Outside of the top 20 prime-time spots, there is a lot of viewing going on. There are many audiences on the long-tail networks, some of which have not been measured by traditional TV measurement. Addressable TV will revolutionize TV advertising and change how companies approach advertising. By targeting specific households/customers, addressable TV makes the advertising about quality, not quantity. Industry leaders predict that one-quarter of TV ad budgets will be spent on addressable TV within three years.

Don’t miss out on a single post. Subscribe to LUCA Data Speaks.

You can also follow us on Twitter, YouTube and LinkedIn

The horizon of IoT

Beatriz Sanz Baños    16 August, 2018

Energy and production cost savings, the large number of connected devices, investment in this technology or the reduction of traffic accidents are just some of the data points that show the benefits of IoT for society.

10 sensors to create Smart Beaches

Beatriz Sanz Baños    14 August, 2018

From the showers that clean our feet to sensors that measure the quality of the sand or sea water. Smart beaches are the next horizon, but they’re already a reality in the Valencian Community. A pioneering project between the Valencia Tourism Agency and the Polytechnic University has made this possible in the sandy areas of Gandía, Benidorm and Benicassim.

The project consists in obtaining and analyzing data that allows us to improve our experience as bathers and achieve a competitive advantage in the tourism sector, a key industry in the Spanish economy. Likewise, it will offer information in real time not only to citizens, but also to the authorities.

But how do you do it? The new technologies of the Internet of Things will be present in this innovative management of the beaches, and thanks to them you will be able to obtain a great variety of indicators through different measurement sensors, among which are:

  1. Substrate sensors. These are quantitative sensors that allow us to know the quality and temperature of the sand, as well as its cleanliness and the presence of unwanted remains.
  2. Surveillance drones. They will watch over bathers or the presence of dangerous species such as jellyfish, significantly increasing the safety of these beaches.
  3. Monitoring bathymetries. They help monitor the seabed in real time and control the species that cross it.
  4. Smart buoys. They are in charge of controlling the limits of the bathing areas and the proximity of boats to prevent accidents. They also monitor the quality of the water and its temperature, and help detect the presence of jellyfish.
  5. UVA sensors. They are the perfect aid for sun protection. They measure radiation levels and their variation, and transfer the information so that screens can advise how much we must protect our skin and eyes from UVA rays.
  6. Smart showers. They help control water consumption and also have a timer and a self-diagnosis sensor, thus allowing for better maintenance at a lower cost.
  7. Smart Parking. Parking control that provides citizens with real-time information on available spaces, which in turn helps reduce CO2 emissions and fuel consumption.
  8. Water supply. Sensors that allow, among other things, the detection of leaks in the supply network and its appropriate maintenance. They also help control water consumption, which helps the environment.
  9. WI-FI. Besides being one of the favorite features of holidaymakers, who do not want to consume their data on the beach, the installation of Wi-Fi allows, among other things, to measure occupancy density.
  10. Smart traffic lights. They are a critical tool to help with the arrival and departure of cars at the beaches. They facilitate traffic control and pedestrian prioritization, in addition to offering safety information.

These are ten of the IoT tools that are part of these beaches, but there are many more. They provide citizens and visitors with a safer, more efficient and more comfortable vacation, with new and better advanced services at our fingertips. They allow the authorities to ensure sustainable tourism, without problems or accidents, and to obtain critical information that helps them make better decisions in the future and improve the tourist experience. Smart cities are already a reality, and projects and initiatives like this are the clearest proof.

Data Transparency Lab looks for new projects on privacy and data transparency

AI of Things    9 August, 2018

The Data Transparency Lab (DTL) call for tools opens on August 10th and will close on September 30th, aiming to attract new apps. The selected projects will get prizes ranging from €10 000 to €25 000 to complete their applications, and DTL will announce the awarded projects at its annual event, which will take place in Barcelona on November 15th.

The Data Transparency Lab (DTL) will launch a Call for Tools on August 10th that will allow entrepreneurs, startups, universities and research centres to submit new projects on privacy and data transparency. 

The aim of this call is to give economic and technical support to complete apps, tools, libraries and other forms of software that explore topics aligned with Data Transparency Lab’s mission to help users understand how their data is being used when they connect to online services: which data is collected, who is collecting that data and how it is used.

The current call will offer up to €50 000, and the number of selected projects will depend on the amounts requested by the applicants that make it to the final. Participants will be able to select the amount required to finalize the project, from €10 000 to €25 000, and will have to describe how they plan to spend the requested funds. In addition to the economic prize, the selected projects will get technical support from an advisor from Telefónica, with whom they will sign a Schedule of Work to supervise the development of the tool and guarantee the success of the project.

Submission process

In order to apply, applicants have to submit their projects through the online form that will be available on August 10th on the DTL Call For Tools page. The deadline to apply is September 30th. From that date, the Committee will judge all the proposals and will announce the winners at DTL’s annual event on November 15th in Barcelona, where they will share ideas on the advancements in this field with renowned researchers, technicians, regulators and industry representatives.


Figure 1: Call for tools key dates

Areas of interest: privacy leaks detection, advertisement transparency, personal data valuation, analysis of tracking techniques and variation of pricing depending on personal information, detection of algorithmic bias and discrimination, identification and analysis of anti-competitive practices in online platforms, detection and analysis of online behavioural targeting on advertising, search, recommendation, transparency challenges around new crypto-currencies and blockchain technologies, explainable Artificial Intelligence, among others.

About DTL

The Data Transparency Lab was founded four years ago by Telefónica at the R&D Centre in Barcelona as a clear example of the effort that Telefónica Barcelona puts into innovation. The initiative arose from the need to study transparency related to data use in the digital environment, and DTL has consolidated itself as a model organization in its sector. DTL gathers the best technologists, regulators, industry representatives and researchers worldwide to foster transparency and privacy around personal data.

The goal of DTL is to connect the best talent worldwide to work on technological challenges that data transparency requires in order to create a new trustworthy data economy. With the aim of supporting more projects DTL launches this call for tools to create a community that allows the development of open source applications designated to improve data management by individuals and companies. 

The DTL Grants Program was created in 2015 to support the research and development of tools that empower users to be in control of their online personal data. Since its foundation, DTL has invested €900 000 in grants for 18 projects from international universities such as Princeton, Berkeley, the Max Planck Institute for Software Systems or Technische Universität Berlin, among others. With the aim of supporting more projects, DTL has launched this call for tools and will soon announce the Hackathon planned to take place in November.

Don’t miss out on a single post. Subscribe to LUCA Data Speaks.

Tracking of goods for the tranquility of the consumer

Beatriz Sanz Baños    7 August, 2018

Electronic commerce plays an increasingly important role in the business of companies. In fact, the number of people who buy online at least once a week has increased more than 10% in the last two years, and the share of those who do so at least once a month is close to 60%. Companies that sell remotely must pay attention not only to the quality of their establishments (physical or digital) but also to other areas of business such as transportation or customer service.

When a purchase is made online, three objectives are usually pursued: efficiency, security and speed. The extent to which these three factors are combined will be decisive for customers to remain loyal to an online commerce page or to seek better quality from the competition. The Internet of Things is the revolution of online commerce because it offers many new possibilities to make your online purchase a success.

The IoT, linked to blockchain (simplifying a lot, a data structure in which the information is grouped in sets), offers a solution for tracking packages and merchandise that guarantees the traceability of the route and the estimated time of arrival, the registration of the shipment conditions, allowing the detection of deviations in temperature, vibration, etc., and the ability to assign responsibility for each event.

These solutions, encompassed in what is known as Cargo Tracking or Asset Tracking, work through wireless sensors located in the cargo or transport element (container, pallet, etc.), cloud technology and predictive analysis tools. Not only can we know the exact location of the order, but we can also know the climate of the route along which it is travelling, something that could allow the temperature of the package container to be changed, if necessary, to ensure that everything arrives in optimal conditions. Any minimal variation in the temperature, or movement of the package due to a bump or sudden braking, may alert the carrier and allow them to act immediately.

These solutions can be coordinated with the connected fleet services, known as Fleet Management, and increase the operational and service benefits for the final customer. A fleet of connected vehicles allows you to optimize the delivery of packages by being able to know, in real time, where they are at each moment, trace the most efficient delivery routes and solve problems and incidents such as a delay in the delivery.

In summary, thanks to the IoT, the transport company knows the conditions and variations of its journeys in real time, which allows it to increase the security of the merchandise, be more accurate about estimated delivery times and minimize transport times thanks to the optimization of the routes used by the carriers.

In addition, the benefits are not limited to the road. On rail and maritime transport, too, the IoT is improving the tracking of goods. With this technology, the communication and coordination of the entire logistics chain will be improved, including cars, trains and ships, all sharing information among themselves and with the different infrastructures to predict possible incidents. They can thus exchange information about their position, power and even composition, which will help assess and even deliver orders, if necessary, at the time of loading or unloading in stations, ports, etc.

These systems of shipment management and optimization have a direct impact on the client, since the procedures are streamlined and allow more personalized attention. Normally, what makes consumers the most nervous is not knowing the state of the product they have just purchased, but thanks to the IoT, security, comfort and higher-quality information go hand in hand.

Safer, but not immune

Beatriz Sanz Baños    30 July, 2018

Technological advances, and specifically IoT, are designed to give us more time, increase our comfort, and improve our quality of life. The digital transformation has managed to improve our lives and to bring us closer to a society that, not too long ago, we thought existed only in the future. We also imagined that in this type of society crime would be greatly reduced, because the general improvement of things would also extend to security. Although this is true, since increasingly stronger systems against criminals are being developed, the truth is that human beings are very cunning and always find ways to break those systems.

We have an example in a video created by Wired two years ago. In it we can see how two hackers take full control of an autonomous vehicle, doing whatever they wanted with it. They reduced the role of the driver to a mere spectator of a possible tragedy. Luckily, this video was recorded only to show the dangers of certain IoT advances and there was no incident to regret. What it shows us is that everything digitalized is susceptible to a cyber-attack.

However, not all the examples are negative. Months ago the motorcycle brand Honda had a problem in Peru. The problem was that it was too good, incredible as it may seem. Their motorcycles, having a reputation for good quality, were the most stolen in the country, so customers began buying from other brands to avoid losing their motorcycles. To solve this problem, Telefónica and IoT came up with a system that, with the push of a button, located the lost motorcycle and returned it to its owner. It’s called Fleet Management. This way nobody had to face the thieves, which would give us all a bit of trouble, and the motorcycles would return as if by magic to their owner. It worked and Honda’s motorcycles were bought again.

Staying in the automotive sector, the incorporation of telemetry (measuring magnitudes and sending them to distant receivers) not only makes it possible to know the location of the vehicle, but also increases the speed of assistance in case of an accident, enables preventive maintenance, and shows the fastest route at every moment of the day. The implications go beyond the safety of citizens. These advances also allow significant improvements in trade. The fact that both the business and the consumer are able to know the situation and location of the order positively affects the well-known customer experience (the level of satisfaction of a user/client after testing the services of a company).

For reasons like this, Internet of Things users should not be afraid to keep betting on innovation. Luckily for them, the number of connected objects will almost triple in the future, and investment in security will grow with it.

So far, it has not been possible to develop a technology that eliminates people’s malice and thus ends crime. That said, technology is reducing it, or at least making things more difficult for criminals. The future is here, do you want to be part of it?

A history of Lisp and its use in neural networks – Part II

AI of Things    30 July, 2018

Written by Sergio Sancho Azcoitia, Security Researcher for ElevenPaths

Last month, we started a two-article series talking about Lisp. Part I covered the history, its beginnings and uses when it comes to creating neural networks. Today however, we will show you how Lisp works, and how you can create a simple neural network. 

Before we begin, let’s refresh our memory on how neural networks work. Neural networks are based on a series of nodes or “neurons” that are connected to each other and form a series of layers. The way they work is compared to the behavior of the human brain when it solves a problem: as the signal progresses, it takes one path or another according to a series of pre-established parameters.

Our neural network in question will be simple, meaning it will only consist of three layers, and its function will be to find whom we are thinking about, out of all our team members. If we see the image below, the nodes or neurons correspond to the names of our teammates and the layers correspond to the conditions that lead us to one answer or another.

Figure 2. Even though this neural network consists of 3 layers, you can always increase the complexity of a neural network by adding more layers and sublayers (Source: Gengiskanhg, Wikipedia Spain)

 
To create the different layers in the network, we will only use questions with “yes” or “no” answers. The questions will be carefully chosen so that when answering them, either a response is offered or you move onto the next layer. This process will be repeated as many times as necessary until a final response is obtained. As the user answers questions, the number of possible candidates for a final answer will decrease. In this brief example, for each question answered, an answer is obtained and, if not, a possible candidate is discarded.

  Figure 3. Our example of a neural network that tells us whom we are thinking about

As we have said before, this example is simple, but starting with it as a base, you can create neural networks that are much more complex by adding layers and sublayers. A clear example of this is the expert system Akinator, which we have mentioned before on the blog. Akinator relies on an immense neural network that grows as new characters (nodes) are added to the game.

The Lisp language is one of the best examples for understanding some of the concepts surrounding Artificial Intelligence and functional programming (such as recursion). This is one of the reasons it became the favorite of many MIT researchers for developing their projects in the years after its appearance.

Don’t miss out on a single post. Subscribe to LUCA Data Speaks.

You can also follow us on Twitter, YouTube and LinkedIn