The attack against OpenPGP infrastructure: consequences of a SOB’s actions

Sergio de los Santos    2 July, 2019

What is happening with the attack against the OpenPGP infrastructure constitutes a disaster, according to the affected people who maintain the protocol's ecosystem. Robert J. Hansen, who disclosed the incident, literally described the attacker as a ‘son of a bitch’, since public certificates are being vandalized by abusing essentially two features that have turned into serious problems.

A little background knowledge

On the peer-to-peer public certificate networks, where anyone may look up someone else's PGP public key, nothing is ever deleted. This was decided by design in the 90s in order to withstand potential attacks from governments wishing to censor. Let us remember that the whole free encryption movement was born as an expression of ‘rebelliousness’, precisely in response to attempts by the highest circles of power to control cryptography.

Since there is no centralized certification authority, PGP is based on something much more horizontal: it is the users themselves who sign a certificate, attesting that it belongs to the person in question. Anyone can sign a public certificate, vouching that it belongs to whom it claims to belong. In the 90s, people met in person to swap floppy disks with their keys so that others, whether they knew each other or not, could sign them. The spread of the Internet brought along a network of servers hosting public keys. There, you may find a key and, if appropriate, sign it to attest that it belongs to the individual in question. Anyone can sign a certificate an unlimited number of times, and each signature attaches a given number of bytes to it. Forever.

The attack

The ongoing attack consists of signing public certificates thousands of times (up to 150,000 signatures on a single certificate) and uploading them to the peer-to-peer certificate networks, from where they will never be deleted. In practice, affected certificates soon grow to several tens of megabytes. The signatures carry no data, as you can see in the image below. So far the attack has focused on two prominent figures of the OpenPGP community: Robert J. Hansen and Daniel Kahn Gillmor.

The problem is that, under these circumstances, Enigmail and any other OpenPGP implementation simply stop working, or take a very long time to process such oversized certificates (several tens of megabytes slow the process down to tens of minutes); for example, 55,000 signatures make a certificate of 17 megabytes. In practice, this disables the keyring: anyone who tries to verify Daniel's or Robert's signatures will break their installation while importing their keys.
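
The arithmetic behind the bloat is simple to sketch. The figures below are assumptions chosen to match the numbers reported above (roughly 300 bytes per OpenPGP signature packet), not exact protocol values:

```python
# Back-of-the-envelope estimate of a poisoned certificate's size.
SIG_BYTES = 310          # assumption: rough size of one signature packet
BASE_CERT_BYTES = 5_000  # assumption: size of a normal, healthy certificate

def poisoned_size_mb(n_signatures: int) -> float:
    """Estimated certificate size in megabytes after n bogus signatures."""
    return (BASE_CERT_BYTES + n_signatures * SIG_BYTES) / 1_000_000

print(poisoned_size_mb(55_000))   # ~17 MB, matching the case above
print(poisoned_size_mb(150_000))  # ~46 MB for a fully saturated certificate
```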

Consequently, the attack takes advantage of two circumstances that are difficult to address (they are features by design), so it is pure vandalism:

  • The fact that there is no limit on the number of signatures. A limit would pose a problem as well, since an attacker could exhaust the maximum number of signatures on a certificate and thereby prevent anyone from vouching for it again.
  • The fact that SKS servers replicate their content among themselves. This is part of the design, so that no single agency can intervene; as a result, whatever is uploaded can never be deleted.

Why hasn't it been fixed?

The synchronization network, known as SKS (Synchronizing Key Server), is open source, but in practice it is unmaintained. It was created as the keystone of Yaron Minsky's Ph.D. thesis and is written in a programming language called OCaml (its author describes himself as an ‘occasional programmer’ of the language, although we hope that is just modesty). As strange as it may sound, hardly anyone understands how the code works, so it would be necessary not only to fix it, but to question the design itself as well.

Mitigations are being reported: do not refresh the affected keys, among other workarounds, or refresh them from the server keys.openpgp.org, which imposes a number of constraints that address the problem in exchange for losing other functionality. According to Hansen himself, the current network is unlikely to be saved.
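
For GnuPG users, the commonly reported mitigation is to point key retrieval at keys.openpgp.org instead of the SKS pool. A minimal sketch of the change, assuming a default GnuPG home directory:

```
# ~/.gnupg/dirmngr.conf
keyserver hkps://keys.openpgp.org
```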

But the worst may be yet to come. Software packages in distribution repositories are usually signed with OpenPGP. What if those certificates start to be attacked? Software updates from distributions could become painfully slow, or useless, endangering the updating of systems that may be critical. This may also have a copycat effect on other attackers, since exploiting the flaw is relatively simple.

Conclusions?

It was already known that the network could be misused, but no one expected such an act of wanton vandalism. According to the affected people, they cannot see its purpose other than destroying the altruistic work of people who attempt to make encryption a universal right. A certain defeatism can be perceived in the affected people's messages, which show frustration, anger and some pessimism, with sentences such as “this is a disaster that could have been foreseen”, “there is no solution”, etc. This final sentence is devastating:

But if you get hit by a bus while crossing the street, I’ll tell the driver everyone deserves a mulligan once in a while.
You fool. You absolute, unmitigated, unadulterated, complete and utter, fool.
Peace to everyone — including you, you son of a bitch. (In golf jargon, a ‘mulligan’ is a second chance.)

A number of points have caught our attention:

  • The fact that the core code of the network is written in such a little-known language, and that it has hardly been maintained since, precisely because it worked so well.
  • The fact that GnuPG (the reference OpenPGP implementation) and Enigmail (which, after all, uses GnuPG) cannot cope with certificates of several megabytes is at least surprising. Their capacity to handle large keyrings is quite poor. It reminds us of what happened with OpenSSL after Heartbleed: dozens of defects started to be detected in the code, the programmers admitted in some way that they could not spend more time auditing it, and solutions such as LibreSSL were born to attempt a more secure TLS implementation. Daniel Kahn Gillmor himself admits it: “As an engineering community, we failed”.
  • This damages the image of PGP in general, which is not in very good health as it is. Such a strong blow endangers its image as a whole, not only its servers (these being merely the vehicle of the attack).

It is an interesting issue for several reasons, and it leaves a number of unknowns: what is going to happen with OpenPGP in general; with the protocol itself; with its most common implementations; with the servers… And above all, what the attacker's plan is (or the copycat effect that may bring more attacks): whether they will move from personal certificates to those used to sign packages, and how distributions will react.

Have you ever wondered where the energy comes from?

Beatriz Sanz Baños    2 July, 2019

Thousands of companies and households already use solar energy, with the dual purpose of reducing the planet's carbon footprint and actively controlling their consumption.

Have you joined the clean energy movement yet?

IoT technology works to ensure that systems located in distant places communicate with each other and interact as a unit, so it is very useful for photovoltaic plants. The panels that collect energy from the sun use this technology, for example, to track the accumulated energy, manage the charging and discharging of batteries and offer the user precise consumption data. This is applicable to the industrial sector and also to consumption in smart homes – integrated into home automation – to allocate the collected energy both to appliances and to the charging of electric vehicles.

The Spanish company POWEN uses IoT to create customized solar energy installations for users. Under the slogan “feel your energy, control your energy”, POWEN puts the user at the centre of the energy model and offers the possibility of controlling their energy consumption, always having accurate information about the energy generated, consumed or fed back into the grid (thanks to the law approved by the Government in April that regulates residential self-consumption). With this model of distributed energy, the energy consumed by people in their homes or workplaces is generated in the same place by an autonomous installation, an increasingly widespread habit known as “self-consumption”.

Another example of commitment to sustainable development is the Colombian company Sun Supply, which provides photovoltaic electricity with a focus on environmental, economic and social responsibility. Both urban and rural populations benefit from its solar panels, which are used to power the sensors that monitor river levels in the Antioquia region, to track real-time energy consumption, and to supply electrical power to apartments, farms, hospitals or mobile phone chargers located in public spaces.

In Brazil we have Origo, which is also one of the leaders in photovoltaic generation for SMEs and companies with difficulties in accessing electricity. This company builds solar farms and rents its panels, installs solar roofs and provides self-supply kits for professional and domestic use. In addition, they bring electricity for the first time to communities located in the interior of the Amazon.

How is connectivity applied?

In general, all these new technologies are based on adding sensors to many elements, including the environment itself (sun orientation, ambient temperature, etc.). The new solar generation plants have multiple sensors that must be connected wirelessly so that installations remain flexible, adjustable and affordable.

Technologies such as LTE or 5G appear to be a key part of this revolution, transporting thousands of data points to powerful SCADA systems – software that makes it possible to control and supervise processes remotely – which store data on the performance of photovoltaic plants.

All the elements of the installation – inverters, meters, string boxes, weather stations, security systems, solar trackers – communicate with each other and upload the data they collect to the cloud through these industrial IoT networks (LTE/5G). There, the data are evaluated by algorithms that raise alerts in case of deviations from the expected results, using artificial intelligence or machine learning to optimize the capture, generation and distribution of energy. Through an app, the user can interact with the installation remotely.
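
As a purely illustrative sketch of this telemetry uplink (the broker address, topic and payload fields are invented for the example, and MQTT is just one common transport choice for this kind of sensor data; it is not necessarily what the companies above use):

```python
import json, time
import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x-style API)

# Hypothetical reading from one inverter's sensors.
reading = {
    "device_id": "inverter-01",   # invented identifier
    "timestamp": int(time.time()),
    "dc_power_w": 3120,           # invented sample values
    "panel_temp_c": 41.5,
}

client = mqtt.Client()
client.connect("broker.example.com", 1883)  # placeholder broker address
client.publish("plant/site-a/telemetry", json.dumps(reading))
client.disconnect()
```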

Solutions such as those offered by POWEN, Origo or Sun Supply show us the potential of IoT to achieve more sustainable energy management. The application of this technology is, and will be even more so in the future, fundamental for the development and consolidation of photovoltaic self-supply and, therefore, for the fight against pollution and climate change.

The truth behind Libra, Facebook’s new cryptocurrency

Alfonso de la Rocha Gómez-Arevalillo    1 July, 2019

It's been almost a week since Facebook's public release of the Libra Blockchain, a decentralized programmable database promoted by Facebook to support a low-volatility cryptocurrency. Opinionated rivers of ink have already been poured both for and against this project. Now that the initial storm of opinions has passed, and after processing Libra's whitepaper while trying not to read too many reviews so as not to pollute my own opinion, in this article I would like to share a technical overview of Libra and what, in my humble opinion, it means for the blockchain ecosystem.

Disclaimer: The purpose of this document is to share a technical analysis of Facebook’s open source reference implementation of Libra.

Rationale behind Facebook Libra

Before getting lost in the technicalities, it is worth understanding why Facebook has bothered to design yet another blockchain network. After a decade, Bitcoin and cryptocurrencies have proven to be the first success story of blockchain technology. Cryptocurrencies have served as an efficient medium of exchange for people around the world, but their high volatility, and the fact that they are backed not by big corporations or governments but by developers, engineers and decentralization maximalists, have prevented their mass adoption. Libra is Facebook's attempt to fix this.

The Libra coin will be the first asset issued over the Libra blockchain, and it will be fully backed by a diversified basket of bank deposits and treasuries from high-quality central banks to ensure low inflation and low volatility of the currency. The governance of the network, and therefore of the Libra coin and its backing assets, will be managed by the Libra Association, initially formed by a diverse set of founding members such as Mastercard, Paypal, Visa, Facebook, Ebay, etc. Thus, Libra aims to become a corporate-backed global financial network supporting a global currency managed by a brand new, corporate-formed, non-profit central bank (the Libra Association). It may seem trivial, but this is huge, and an indication of it is the fact that there are no banks among the founding members of the Libra Association.

The Libra Association is still accepting new members, and joining this select group requires an initial investment of $10 million in Libra investment tokens. Presumably, this initial investment will be used to purchase the basket of financial assets backing the currency. As mentioned above, the members of the Libra Association will be in charge of setting the governance of the network and its underlying coin. Initially, Libra will be a permissioned network, but its design is thought out in such a way that in the future it can become a permissionless network based on Proof-of-Stake. Every member of the Libra Association will run a validator node (in charge of executing and verifying transactions), and clients (i.e. users) will be able to send transactions to the network through these nodes. The aim is for Libra to become a global permissionless network.

Libra is Facebook's attempt to create a global financial network where startups and incumbents can compete with new types of business models and financial applications, lowering the barriers set by banks in the financial sector. A new generation of fintechs and financial disruptions is coming.

Technical walkthrough

Libra seems like a great disruption on the business side, but what about the technical part? Does it offer something new compared to existing technologies? Let's go through Libra Core, i.e. Facebook's open source reference implementation of the technology, to understand clearly what it offers.

For the analysis, we evaluate the main concepts and schemes underpinning the operation of blockchain technologies in order to compare Libra with existing proposals. Mainly, we discuss Libra's logical data model, its transaction execution, its scripting programming language (Move), its consensus, its networking layer and its performance.

Logical Data Model

Photo: Arif Riyanto

The Libra Blockchain is implemented as a decentralized database visible to the validators and clients in the network. All data in Libra is stored in a single versioned database, or ledger. The version of the ledger equals the number of transactions executed in the network; hence, every new successful transaction increases the version of the ledger by one. At each version, the database is made up of a tuple containing the transaction that triggered that version, the output of that transaction, and the state of the ledger after executing it. Thus, if we are at version 3 and we send a transaction to the network, after executing it the database will be at version 4 with the following data: (transaction4, output of transaction4, state after transaction4). What does this mean? That transactions are not aggregated into blocks as in technologies such as Bitcoin and Ethereum. This lack of blocks more closely resembles technologies such as Corda.
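
A minimal sketch of this versioned model (a toy illustration, not Libra Core's actual data structures; state is reduced to a dict of balances):

```python
# Toy versioned ledger: one (transaction, output, state) tuple per version.
ledger = []  # index i holds the tuple that produced version i + 1

def apply_transaction(state, txn):
    """Apply a toy 'transfer' transaction and return (output, new_state)."""
    new_state = dict(state)
    new_state[txn["from"]] -= txn["amount"]
    new_state[txn["to"]] = new_state.get(txn["to"], 0) + txn["amount"]
    return "executed", new_state

state = {"alice": 100, "bob": 0}
txn = {"from": "alice", "to": "bob", "amount": 10}
output, state = apply_transaction(state, txn)
ledger.append((txn, output, state))
print(len(ledger), ledger[-1])  # ledger version and the tuple that produced it
```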

Changes in the database are triggered through transactions, and the specific changes to be performed over an asset are determined by Move scripts. Move is the underlying programming language in Libra and, like Solidity smart contracts in Ethereum or chaincodes in Hyperledger Fabric, it enables transactions to perform changes over the data stored in the ledger. In our previous example, transaction4 operates over the state after transaction3, giving the output of transaction4. Simple, right?

What kind of operations can be performed in Libra? Through a validator, you can execute a transaction against the ledger state at the latest version (such as “transfer 10 Libras to my friend Bob”), or you can query the ledger history at both current and previous versions (so we can ask things like “what was Alice's balance in Libras at version 3 of the ledger?” or “what is Charlie's balance at the current version?”).

Finally, Libra uses an account-based data model to represent the ledger. This means that the state of the ledger is structured as a key-value store which maps account address keys to account values (i.e. users to their balance in Libras). Account address keys identify users in the system (they work like Ethereum and Bitcoin addresses; what is more, there is no enforced KYC in Libra, so, as in those technologies, users may have more than one account in the network). Account values, on the other hand, are the collection of published Move resources and modules. Libra and other assets in the network are represented through Move resources, while Move modules are the code run over resources by a transaction (like smart contracts or chaincodes). This account-based model makes every user, asset and module in the network accessible through its unique address.

Figure 1 depicts an example of Libra's account-based data model with four accounts. Ovals represent resources and rectangles represent modules. A directed edge from a resource to a module means that the type of the resource was declared by that module. Thus, account 0x12 contains a balance of a Currency.T resource declared by the Currency module.

Figure 1: Account-based data model.

Nerd alert, skip this paragraph if you are not highly technical: unlike Bitcoin or Corda, which use a UTXO transaction model, where asset histories are modelled through the consumption of inputs and the generation of outputs, Libra uses an account-based model, more similar to Ethereum's, where the history of an asset is modelled as updates to certain addresses in the ledger.
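
To make the contrast concrete, here is a toy comparison of the two bookkeeping styles (simplified far beyond either real system; names and amounts are invented):

```python
# UTXO style (Bitcoin/Corda): a payment consumes inputs and creates outputs.
utxos = {("tx0", 0): ("alice", 50)}  # (txid, index) -> (owner, amount)

def pay_utxo(spent_input, new_outputs):
    del utxos[spent_input]           # the input is consumed forever
    utxos.update(new_outputs)

pay_utxo(("tx0", 0), {("tx1", 0): ("bob", 10), ("tx1", 1): ("alice", 40)})

# Account style (Ethereum/Libra): a payment updates balances in place.
accounts = {"alice": 50, "bob": 0}

def pay_account(sender, receiver, amount):
    accounts[sender] -= amount
    accounts[receiver] += amount

pay_account("alice", "bob", 10)
print(utxos, accounts)
```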

Key concepts:

  • Data in Libra is stored in a versioned database.
  • Changes in the ledger are triggered by transactions.
  • Logic and resources in Libra are implemented as Move scripts.
  • Users in Libra can have more than one account, and KYC is not enforced.
  • Libra's data model is not UTXO (like Bitcoin or Corda) but account-based (as in Ethereum).

Transaction Execution

The only way to change the ledger in Libra is through transactions, which are triggered by users in the network. As in many other blockchain technologies, transactions in Libra must be deterministic (the output and ledger changes of a transaction are completely predictable) and hermetic (all the information required to execute the transaction is contained in the transaction itself or in the current state of the ledger).

As in Ethereum and other blockchain networks, the concept of gas is used in transactions to manage the demand for compute capacity in the Libra network. Every transaction incurs a gas expense according to the congestion of the network. This mechanism aims to reduce demand when the system is under a higher load than it was provisioned for. The system is designed to have low fees during normal operation, when sufficient capacity is available, and really high fees, for instance, in the presence of a potential denial-of-service attack. All this is regulated, as in Ethereum, using a gas price and a gas cost, where the gas cost determines the specific effort needed to run a transaction and the gas price is the price of a unit of gas according to the congestion of the network. Users may set a high gas price to ensure that the validation of their transactions is prioritized.
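
The fee arithmetic itself is simple; a hedged sketch (the numbers are invented, only the fee = gas used × gas price relationship is the point):

```python
# Fee model: total fee = units of gas consumed * price per unit of gas.
def transaction_fee(gas_used: int, gas_price: int) -> int:
    return gas_used * gas_price

# Invented numbers: the same transaction on a quiet and a congested network.
print(transaction_fee(gas_used=600, gas_price=1))   # quiet network -> 600
print(transaction_fee(gas_used=600, gas_price=50))  # congested -> 30000
```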

As mentioned above, transactions use Move scripts to determine the changes to be performed on the ledger and the specific code to be run. Initially, only the Move modules for the Libra coin are available for use, but in the future any asset and its related logic will be implementable using Move. Presumably, anyone in the network with enough gas will be able to deploy their own modules and resources to Libra (indeed, you would be able to issue CryptoKitties over Libra). What is more, all the logic in the Libra network, even the management of the gas price and gas cost of transactions and their execution, is implemented using Move scripts.

In Libra, the execution of a transaction and the update of the ledger are performed independently (see Figure 2). When a transaction is sent to the network, (i) the first thing validator nodes do is check that its signatures are valid (i.e. they match the sender's public key and the transaction data). (ii) Then the “prologue” code is run: this Move script authenticates the transaction sender, ensures that the sender has sufficient Libra coin to pay the transaction fees, and verifies that the transaction is not a replay of a previous one (to prevent replay attacks). (iii) Once the prologue has completed successfully, Move's virtual machine checks that the transaction and the code it runs are well-formed (the VM verifies this by validating the bytecode of the corresponding Move modules and resources); (iv) if this verification is successful, the transaction's modules are published under the sender's account (Libra uses an address-based data model, remember?); (v) with everything verified and the modules published, the script included in the transaction is executed: the corresponding Move code runs, the write operations it performs on the ledger are applied, and the transaction events (if any) are emitted; (vi) finally, the VM runs the transaction epilogue (another Move script, similar to the prologue) which charges the user for the gas used and increments the sender's sequence number (this sequence number counts the number of transactions performed by a user and prevents replay attacks).

Figure 2: Transaction execution flow.
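
A compressed sketch of that six-step flow (toy code: every helper is a stand-in stub for the behaviour described above, not Libra Core's real API; only the ordering and failure points are the substance):

```python
# Toy pipeline mirroring steps (i)-(vi) described above.
def signature_valid(txn): return txn.get("sig") == "ok"                       # (i)
def prologue_ok(txn, st): return st["balances"][txn["sender"]] >= txn["fee"]  # (ii)
def bytecode_well_formed(txn): return "script" in txn                         # (iii)

def execute(txn, st):
    if not signature_valid(txn):      return "rejected: bad signature"
    if not prologue_ok(txn, st):      return "rejected: prologue failed"
    if not bytecode_well_formed(txn): return "rejected: malformed code"
    st.setdefault("modules", {})[txn["sender"]] = txn["script"]  # (iv) publish
    # (v) the Move script would run here, applying writes and emitting events
    st["balances"][txn["sender"]] -= txn["fee"]                  # (vi) charge gas
    st["seq"][txn["sender"]] = st["seq"].get(txn["sender"], 0) + 1  # (vi) bump seq
    return "committed"

state = {"balances": {"0xa": 100}, "seq": {}}
txn = {"sender": "0xa", "fee": 1, "sig": "ok", "script": "<move bytecode>"}
print(execute(txn, state), state)
```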

Key concepts:

  • The execution of transactions in Libra must be deterministic and hermetic.
  • Libra uses gas (through gas price and gas cost of transactions) to handle congestion and computing demands.
  • Fees in Libra should be low with enough capacity available in the network.
  • Every logic in Libra is implemented as a Move script.
  • Initially the only available resource is the Libra coin. In the future users will be able to deploy new modules and resources to the network using Move scripts.

The Move Programming Language

Throughout this article we have been talking about the Move programming language without presenting it formally. Move is a new programming language created for executing code over the Libra protocol. Move was designed (i) to enable flexible transactions via transaction scripts; (ii) to allow user-defined code and datatypes (such as CryptoKitties), including smart contracts via modules (you could guess this from the descriptions above); (iii) and to support configuration and extensibility of the Libra protocol (remember that everything in Libra is a Move script, even the computation of the gas cost and gas price of transactions).

In Move, a resource can never be copied, only moved (got it?). In addition, a resource type can only be created and destroyed by the module that declared the type. All these guarantees are enforced by the Move virtual machine, on which every Move script runs. Relatedly, it is interesting that the Libra coin itself is implemented as a resource type in the Move language (in contrast to Ether and Bitcoin, which have a special status in their respective protocols).
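
Python has no move semantics, but the guarantee can be mimicked to give an intuition (a loose analogy only; real Move enforces this statically, in its bytecode verifier and virtual machine, rather than at run time):

```python
# A toy 'resource' that can be transferred but never duplicated.
class Resource:
    def __init__(self, amount: int):
        self.amount = amount
        self.moved = False

    def move_out(self) -> "Resource":
        """Transfer ownership; the source becomes unusable afterwards."""
        if self.moved:
            raise RuntimeError("resource already moved")
        self.moved = True
        return Resource(self.amount)

coin = Resource(10)
wallet = coin.move_out()  # fine: ownership is transferred
try:
    coin.move_out()       # using the moved-out source is an error
except RuntimeError as e:
    print(e)              # -> resource already moved
```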

There is a specific whitepaper dedicated to explaining the Move programming language in depth, so we will leave this matter for a dedicated article about Move.

Key concepts:

  • Again, everything in Libra is a Move script.
  • Move allows flexible transactions, configuration extensibility and user-defined resources and modules.
  • In Move resources can never be copied, only moved.
  • Safety guarantees of Move are enforced by its Virtual Machine.

Byzantine Fault Tolerant Consensus

And what about consensus? How do validator nodes reach consensus on the state of the shared ledger with every new transaction? The answer is LibraBFT, a Byzantine Fault Tolerant (BFT) consensus algorithm based on HotStuff. BFT consensus algorithms are designed so that the nodes reach agreement even if up to a third of them are malicious or compromised.

So how does LibraBFT work? Whenever validators receive new transactions from clients, they share them with the rest of the validators through a shared mempool protocol, which aggregates every transaction pending validation in the network. LibraBFT then proceeds in a sequence of rounds. In each round, a validator takes the role of leader and proposes a block of transactions to extend a certified sequence of blocks containing the full previous transaction history (wait, didn't you say there were no blocks in Libra? Right: blocks here are just an abstraction used by the consensus algorithm to batch pending transactions; data in the ledger is not stored in blocks).

Every validator receives the proposed block from the leader and checks its voting rules to determine whether it should vote to certify the block (according to the changes proposed by its transactions). Once the execution of the transactions has been verified, each validator's vote is sent to the leader; if the leader collects enough votes (at least 2f+1, to guarantee BFT consensus), the block is committed and propagated through the network so that every validator can update its database.
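
The quorum arithmetic is worth spelling out: with n = 3f + 1 validators the network tolerates f faulty ones, and a commit needs at least 2f + 1 votes. A quick sketch:

```python
# BFT sizing: n = 3f + 1 validators tolerate f faults; quorum is 2f + 1.
def quorum(n_validators: int) -> int:
    f = (n_validators - 1) // 3  # maximum tolerated faulty validators
    return 2 * f + 1

for n in (4, 7, 10, 100):
    f = (n - 1) // 3
    print(f"n={n}: tolerates f={f}, quorum={quorum(n)}")
# n=4 tolerates 1 fault with quorum 3; n=100 tolerates 33 with quorum 67.
```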

Again, get ready for a nerdy note. LibraBFT provides safety and liveness in the partial synchrony model, as do well-known consensus proposals such as Casper (Ethereum) and Tendermint (Cosmos). Libra does not propose anything disruptive in the consensus field, but it uses concepts that have been shown to work in the crypto world.

Key concepts:

  • LibraBFT tolerates up to a third of the nodes in the network being compromised or malicious.
  • Transactions pending validation are shared between validator nodes through a mempool protocol.
  • A set of voting rounds are required between validators to reach consensus and commit changes to the ledger.

Photo: Jordan Harrison

Networking

Like any other blockchain technology or decentralized system, Libra requires an underlying networking protocol to enable the communication between (at least) validator nodes in order to reach consensus. Libra’s networking protocol is inspired by the libp2p project, which is the distributed networking library promoted by the creators of IPFS and used in projects such as Parity Substrate.

Libra's networking layer implements well-known schemes from peer-to-peer systems: Multiaddr for peer addressing, messages over TCP for reliable transport, Noise for authentication and end-to-end encryption, Yamux for multiplexing substreams over a single connection, and push-style gossip for peer discovery. I understand that you may have no clue what all these concepts mean, but if you like distributed systems I highly recommend you read up on all these schemes, as they are the basis of almost every distributed system and blockchain protocol. They are the substrate of blockchain technology.
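
As a flavour of just one of these building blocks: a multiaddr is simply a self-describing address string built from /protocol/value pairs. A minimal parsing sketch (the address is a made-up example, and real multiaddrs can be more complex than this naive splitter handles):

```python
# A multiaddr encodes a 'how to reach me' recipe as /protocol/value pairs.
def parse_multiaddr(addr: str):
    parts = addr.strip("/").split("/")
    return list(zip(parts[0::2], parts[1::2]))

print(parse_multiaddr("/ip4/192.0.2.7/tcp/6180"))
# -> [('ip4', '192.0.2.7'), ('tcp', '6180')]
```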

In order to authenticate nodes in the network, this networking service uses the same validator smart contract as the consensus algorithm to verify the public key of the connecting node. Any change in a validator node's status is reflected in this smart contract, and to join the inter-validator network a validator must authenticate using a network public key present in the most recent validator set defined by the contract (this is the scheme Libra uses to admit nodes to the network).

And by the way, the first reference implementation of Libra Core is written in Rust (like Parity, Substrate and many other projects), in case you want to start learning this programming language to be prepared for the future to come.

Key concepts:

  • Libra is built over a P2P networking layer inspired by the well-known libp2p.
  • Permissioning in Libra is achieved through a smart contract that holds the public keys of every allowed validator in the network.

Performance

One of the main concerns when we talk about decentralized technologies and blockchain is performance. There are no performance evaluations of Libra yet but, according to the whitepaper, Libra has been designed to support at least 1,000 transactions per second with a 10-second finality time at its initial launch (far more than what we are used to achieving with technologies such as Bitcoin, Ethereum or even Hyperledger Fabric).

According to their numbers, achieving this will require validators to have a bandwidth of at least 40 Mbps, a CPU able to perform 1,000 signature verifications per second (something any modern commodity CPU achieves), and an SSD storage system, preferably of around 16 TB, to support the expected load of the network. These hardware requirements should be affordable for anyone. Unfortunately, until the network goes live we won't be able to verify this claim.
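
The storage figure can be sanity-checked with simple arithmetic (the per-transaction footprint below is an assumption chosen to make the orders of magnitude visible, not a number from the whitepaper):

```python
# Rough check of the 16 TB figure at 1,000 transactions per second.
TPS = 1_000
TXN_BYTES = 1_000         # assumption: ~1 KB stored per transaction, all told
SECONDS_PER_DAY = 86_400

daily_tb = TPS * TXN_BYTES * SECONDS_PER_DAY / 1e12
print(f"{daily_tb:.3f} TB per day")               # ~0.086 TB/day
print(f"{16 / daily_tb:.0f} days to fill 16 TB")  # ~185 days, about six months
```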

Key concepts:

  • Libra's first release is expected to support at least 1,000 transactions per second with a 10-second finality time.

Hands on Libra Core

The open source reference infrastructure is already available on GitHub, and you can start running your own node and making your first Libra transactions. So, what kind of things can we start doing with Libra?

  • After compiling and installing the node (Figure 3), we will have Libra’s command line available to start testing (Figure 4).
  • The first thing we can do is create a new user account. This command creates a new asymmetric key pair and returns its corresponding address (Figure 5).
  • With our user account ready, we need to get (mint) some coins to start transferring them. In this test environment coins may be easily obtained using the “mint” command (Figure 6).
  • Finally, we can send our newly minted coins to another user account (Figure 7). After performing the transaction, we can see all its details such as the signatures involved, the payload or the gas used.

On Libra's official website you will find easy tutorials to get your hands on the technology.

Figure 3: Libra Rust Compilation.
Figure 4: Available commands.
Figure 5: Create new account
Figure 6: Mint new coins
Figure 7: Make a transaction
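
From memory of the official “my first transaction” tutorial, the session captured in Figures 4–7 looks roughly like the transcript below (a hedged sketch: exact prompts and argument formats may differ between releases, and a second account, #1, is assumed to have been created the same way as #0):

```
libra% account create        # new key pair and address (Figure 5)
libra% account mint 0 110    # credit 110 test coins to account #0 (Figure 6)
libra% query balance 0
libra% transfer 0 1 10       # send 10 coins from account #0 to #1 (Figure 7)
libra% query balance 1
```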

Open questions

Even after reading the whitepaper and the analysis above, a set of open questions not covered by the technical whitepaper arises. I'm sure there are many more, but the ones that come to my mind are the following:

  • Right now, the only Move resource available in the network is the Libra coin. What are the rules, and what kind of things will users be able to do with Move? Will anyone be able to deploy custom Move resources or modules, or will only validators be entitled to do so?
  • The Libra coin is backed by a basket of financial assets, but who decides the monetary policy backing the coin, and how? Is it a technical scheme, or is the Libra Association responsible for this? Can new Libra coins be minted or issued at any moment, and what are the rules?
  • Taking a step back, what would be the specific rules for the governance of the network and the Libra coin?
  • Libra wants to transition from a permissioned network to a permissionless one based on Proof-of-Stake (PoS). How is this transition planned, and which specific PoS mechanism will be chosen for the network? Will performance be maintained in that scenario?
  • What will be the requirements, after the investment stage, for new members to join the permissioned network, and how is the onboarding and permissioning process performed?
  • Does Libra have any privacy scheme on its roadmap to enable private transactions? Are there any layer-2 or sharding mechanisms planned to scale Libra? And what are the regulatory requirements for Libra and the Libra coin? Do the Libra coin and its users have to comply with regulations such as KYC and AML? Is it a bank, a currency, an exchange fund, a security, all of the above, or none? What do central banks think? What about cross-border sanctions? Can the controlling group ban applications? Users? Countries? How, and why?
  • How do you get people to use it? How do you cash in and out, and what real-world distribution do you have (mobile operator payment systems like M-Pesa have worked in emerging markets because the operators had already built this)? What happens if I lose my password?

Conclusions

This was a tough ride; I hope you enjoyed it as much as I did. My conclusion, as shown in our technical analysis, is that Libra doesn't bring great disruption or innovation to current blockchain technologies and protocols. Every technical concept presented in Libra Core has been studied and tested, to a greater or lesser extent, in other blockchain proposals. The real disruption is on the business side of the proposal. Libra is a corporate-backed blockchain initiative with a clear goal: issuing a new financially-backed global currency over a shared global exchange network. Its aim is to lower the barriers to financial innovation, shifting the sovereignty of money from central banks to corporations. Let's see how it ends up and what new types of services and business models it enables.

First photo: Stanislaw Zarychta.

The data-driven transformation of businesses, a matter of survival

Antonio Pita Lozano    1 July, 2019

During the Digital Enterprise Show (#TelefónicaEmpresasDES), held in Madrid at the end of last month, we had the opportunity to look at the state of the art of the disciplines surrounding the digitalisation of companies. Data-driven transformation was raised as a key issue.

Thus, a series of seven presentations was dedicated to big data and advanced analytics, in which nine experts from the world of data shared their experiences, the challenges they faced and the ways in which they overcame them.

The day started with the presentation by Elena Gil, CEO of LUCA, Telefónica's data unit, “Connecting big data & AI with business”, in which she perfectly captured the moment organisations are currently living through and the need for data-driven transformation. “The digital world impacts every company: the rules of the game are changing and new competitors such as Spotify or Uber are arising in every sector”, she said. Competing in this environment requires, among other things, “presenting the perfect offer at the precise moment through the appropriate channel”. In this sense, many organisations are using big data and artificial intelligence in some initiatives, but find barriers to integrating the insights extracted from the data into their operational processes.

Elena Gil made it clear that the use of big data and artificial intelligence “is not a question of differentiation, but of survival”. And what happens now? The answer is that the explosion of data and the great evolution of technology have enabled the fast development of analytics, which includes machine learning and artificial intelligence.

Business Vision:

Ke Zhang, COO of Graphext, with his evocatively titled “Data science for business”, shared insights focused on solving business problems through practical cases of customer analytics and graph visualisation techniques. I would extend the title to “Data science for business, beyond Kaggle” to emphasise the business side and to warn against companies that focus everything on the modelling part which, however criticised, eclipses the ‘wow’ effect for the business, something most companies still do not address adequately. Telefónica, for its part, has already brought big data into many of its initiatives, especially the most relevant and critical ones, as Elena Gil recalled.

To complement this, from a more technological perspective, Luis Reina, global market cloud sales analytics architect at IBM, emphasised the value of what is known as the enterprise insight platform, a platform designed to facilitate extracting understanding from data so that a company becomes data-centric, much as Chema Alonso explains when he talks about the fourth platform.

These platforms face this objective through three steps:

  • The capture of information, where special relevance is given to the combination of relational databases with NoSQL databases, united through data virtualisation, which allows the whole company to speak a single data language.
  • The organisation of information, through the transformation and profiling of data, with special emphasis on quality, governance and regulation.
  • And, lastly, the inclusion of visualisation, operational and, of course, advanced analytics capabilities, through machine learning techniques and cognitive artificial intelligence models.

In this area, Elena Gil said that Telefónica has been recognised by Forrester as a leader in “The Forrester Wave” of specialist insight providers, the only telco on the list, ahead of companies such as SAP, IBM, Teradata and Alibaba, among others. Forrester recognises Telefónica as the company with the most complete and strongest offer in the global market to accompany businesses in their data-driven transformation.

The convergence of Big Data

The vision of the CEO of LUCA (Telefónica Empresas) about the coming future is clear: “Big data finds itself at the centre of the convergence of various technologies: IoT, real time, cloud and artificial intelligence, which allows the expansion of its field of action and means that it can be used as a service, simplifying the technological complexity of organisations”.

Elena stated that “the pairing of big data and artificial intelligence takes on special relevance since they feed back into each other in an endless loop, which opens up a new world of opportunities: big data facilitates artificial intelligence thanks to the generation of large quantities of data and the increase in the capacity to store and process it, whilst artificial intelligence expands the capacity of big data to extract business insights, automating processes and improving decision-making, something that, given the amount of existing data, would otherwise be impossible”.

Cognitive intelligence:

Other main themes included the use of cognitive intelligence in interactions with clients and how it improves the user experience.

Just as in February 2017 Telefónica became the first telco in the world to present a virtual assistant, AURA, this time it was the turn of Luca Pronzati, chief business innovation officer of MSC Cruises, with Zoe, the first virtual assistant in the cruise ship sector.

Future challenges:

To fully benefit from the capabilities of big data and artificial intelligence, it is necessary to carry out the data-driven transformation and become a company led by data.

Elena Gil stated that “the technology is already available to all and it is never going to be limited again”. Victor Agramunt, Director of Seidor Analytics, shared this vision in his statement: “The great challenge is the transformation of people and their mindset”. According to Elena, “the challenges at the moment are having access to talent, looking out for privacy and security and to act ethically”.

Also on ethics, Rafael Fernández, CDO of Bankia and president of the Club de CDO de España, said that “in order to establish a ‘data ethics’ strategy in companies you need sponsorship and money, because you are not only trying to put it in place but also to maintain it sustainably over time. Also, in order to have positive results it is necessary to develop a data culture within the business through data literacy plans”. “Without a mature data culture in organisations, an ethical data strategy is going to fail”, he stated.

Rafael put forward an interesting point: “A few years ago I spoke about privacy and anticipated that in the coming years we would experience discrimination”. “It is a critical issue on which Telefónica is already working, having established artificial intelligence principles and developing methodologies that allow fairer algorithms to be trained”.

José Ignacio Goicoechea, from Serikat, examined compliance with privacy regulation and, in particular, the use of anonymisation technologies that ensure the “complex equilibrium between organisations which hold data and others who would put it to good use”.

How do you capture clients on multiple channels?

Anshul Kudal    28 June, 2019

Have you heard the word omnichannel? It is used a lot in the world of marketing. As its name indicates, it refers to the multiple sales channels available to the consumer, which can include physical shops, pop-ups, the web and apps.

Omnichannel, which originally focused on the physical world, has expanded into the digital world. Digital strategy has become the most important part for the majority of businesses, yet marketing experts often don't know how to connect the online and offline worlds.

Clients use different channels when they want to buy a product. The first time that they come into contact with a product is possibly on social media or on television. Once the product has caught the attention of the public, they can find it online to look at prices or models; then, the potential customer goes to a physical store to test the product (especially with fashion and electronics) and finally they decide whether to buy it, either in the physical store or afterwards online.

The numerous points of contact between the product and potential clients are what hinders marketing professionals in understanding and interpreting the needs of the public. It is complicated to establish which proportion of clients prefer the online service and which opt for the more traditional in-store method.

One of the marketing objectives is to capture the attention of consumers in order to offer them the products that best match their needs. The multiplicity of channels makes it difficult to create a personalised experience for clients. On an online platform it is relatively straightforward to offer a personalised environment that selects the most appropriate products for each consumer; offline, however, the customer may feel lost if they do not find the product they want in the expected section of the physical store. For example, a consumer may shortlist products online and go to a local shop to try them out; once they arrive, they may be drawn to other products with better offers, or the shop may tell them that the product they saw online is not available.

These factors mean that, in the transition from online to offline, potential clients are lost without knowing for sure what led the consumer to change their mind. This causes a loss of fundamental data.

How do you resolve this loss of data?

One solution could be to encapsulate all of the business data in a large database in order to compare and analyse it, something that would take a lot of effort, since capturing information about offline shopping patterns is already quite tedious.

Audience and Attribution allows business and marketing professionals to find buying and interest patterns and to create profiles of the visitors to an online business based on data processed by Telefónica. It also identifies the interests of users who go to a physical store, including users who pass near physical advertisements such as billboards.

Audience and Attribution thus makes it possible to gather information about online and offline consumers, and also to assess the profitability of advertising investments, as it can register users who have bought a product after seeing a physical advert, or who have visited a store, whether online or offline.

In order not to lose information when consumers move to offline platforms, you can also use LUCA Store which provides information about the potential of an area of influence, zones of interest and visitors at a point of sale.

Finally, business and marketing professionals will be able to have a 360º view of their business!

Make your holidays run smoothly with IoT

Beatriz Sanz Baños    28 June, 2019

The arrival of the summer holidays allows us to visit our favorite destinations, but who isn't worried about losing their suitcase when travelling? Luckily, IoT helps us keep everything under control.

The number of brands launching connected suitcases onto the market to take care of our belongings is increasing. An example of this is Travelmate, which has a GPS chip so that, in case of loss, we can get its exact location in an instant through our mobile.

Thanks to IoT technology, Smart suitcases can be controlled at any time through a mobile app. With just one click we can know where they are, make them go from one place to another and close or open them without even touching them.

Nowadays, with Travelmate you no longer have to carry your suitcase when you are tired. This is possible because the suitcase connects to your smartphone via Bluetooth and follows you wherever you go, always a few centimeters away, using a system of anti-collision sensors to avoid any obstacle, plus an accelerometer and gyroscope to move horizontally or vertically and make 360° turns.

Travelmate also has a secure TSA locking system, a long-lasting wireless battery that can charge other devices, and an LED lighting system in various colors that indicates the battery level or the direction of movement.

Other brands such as BlueSmart have also brought to market smart suitcases that connect to the smartphone via Bluetooth and let you track their location through an app, thanks to their built-in GPS. In addition, they have proximity sensors, a digital padlock and a digital scale, so by using the app you can also open and close your suitcase, as well as know beforehand whether its weight falls within the limit allowed by airlines.

The fingerprint lock has also reached these devices. Pluggage, Delsey's smart suitcase, offers a lock of this type so that your belongings are secure in case of loss. We no longer have to worry about where we keep the padlock's key or about remembering the code to open it. It also sends a notification to our smartphone through the app when the suitcase approaches us on the conveyor belt at the airport, and informs us of its weight.

Whether by train, plane or road, with smart suitcases trips are more comfortable: one more example of the ability of connected devices to make our lives easier. Let IoT take care of your things and enjoy your vacations without any worries.

Hacker Women are driving Aura, Telefónica’s Artificial Intelligence

Cristina de la Cruz    26 June, 2019

Telefónica is the pioneering telco in introducing Artificial Intelligence (AI), through Aura, to facilitate communication with its customers. Behind this initiative, one of the company's most ambitious, stand out the profiles of Irene Gómez, Ana Molina, Marta Pérez and Sarita Saffon, some of the hackers who gave the project a feminine voice and face. All of them participate regularly in technology-related events where they highlight Telefónica's advances in AI.

Designing a personality for Artificial Intelligence

When designing Aura, one of the points that has been studied most carefully has been the type of relationship humans have with AI, taking into account the design challenges posed by new patterns of interaction such as the use of voice. Ana Molina, Service and Experience Designer at Telefónica Aura, during her speech at the sixth edition of Experience Fighters, summed up these challenges with the sentence:

“Designing an intelligent interface requires mastering conversational design, exploiting context and data to the fullest, but above all, you have to be prepared for the unexpected.”

Ana Molina during her lecture at Experience Fighters

Always considering the user experience

In terms of the overall user experience with Aura, Marta Pérez and Sarita Saffon, UX Researchers at Telefónica Aura, focus their work on user research and idea management, and their experience is based on the design of new processes and tools to help identify user behaviours and needs. That's how they explained it at Redbility's “Damn! I'm a researcher” event:

“The importance of research with users in innovation projects, especially in challenging technologies such as Artificial Intelligence, is essential in bringing the user experience to the entire design process and decision making within the company”

Marta Pérez and Sarita Saffon at Redbility's “Damn! I'm a researcher” event

Betting on multidisciplinary teams

The best way to develop an AI is to have multidisciplinary teams, so within the Aura project – with Irene Gómez as director – they try to break down barriers and eliminate biases, “with a team that develops in different ways and making sure that AI training data represents society in a balanced way”, as stated by the director of Telefónica Aura in the Tecnobienestar conference series. Another key point to highlight regarding the use of AI at Telefónica is the implementation of a series of ethical principles to regulate customer data usage through Artificial Intelligence.

Irene Gómez speaking about Aura in one of her public appearances

Telefónica has understood the importance of parity for development in the technological world, with women hackers as Aura's standard-bearers who, thanks to their work, have positioned the company at the forefront of AI use among telcos.

If you want to know more about Aura, visit volumes 1 and 2, which tell you its history from birth to the present day.

How do brands speak to you?

Beatriz Sanz Baños    Salmerón Uribes Marina    25 June, 2019

The Internet of Things gives companies a world of possibilities to improve their productivity, their management and the service they offer to their clients. This last aspect is especially important in Retail, where providing users with a satisfactory shopping experience is essential to building loyalty and standing out from the competition.

The way in which consumers make their purchases, look for products, follow trends, etc. is changing, and therefore the way in which brands interact with consumers is changing too. Technological advances and digitalization have made IoT the set of tools that provides the Retail sector with the most benefits, impacting its business model, its procedures and, of course, its users directly.

Have you ever felt that the screens of a store tell you exactly what you wanted to know?

Dynamic marketing allows the retailer to create its own channel and communicate with its customers at the point of sale. That's why, when we enter a store, go to a restaurant or visit a clinic that has screens, we see that the content shown on them is addressed to us as consumers and is related to what we are looking for: products, services, schedules… even before entering, in the shop window itself!

Because the retailer can manage all points of sale centrally, the content shown on one store's screens can be the same as, or different from, that of another store of the same brand. How do store screens know “what they have to tell us”? Retailers adapt the content along the customer journey based on knowledge of our behavior, obtained from anonymized data.

Gone are the days when a single piece of content had to be distributed to all screens, or when changes had to be made individually on each screen where the content was broadcast. Today there is software that distributes content to different screens, adapting it to each one. Have you ever noticed that many shop windows share the same creative concept, but display it differently depending on the screen? That the content changes according to age or sex? It adapts for a better viewing experience.

Transforming the customer experience at the point of sale in a multichannel, multiservice environment is crucial to giving customers what they are looking for. Dynamic marketing therefore not only allows the management, programming, distribution, segmentation and synchronization of content; it also allows proactive monitoring, with customized dashboards over the different performance indicators, and the creation of business reports, always seeking to personalize the content the client sees so that they consume what they really want to see.

One of the connected technological solutions that allows stores to adapt the content of their screens is spotsign, a service that, thanks to the collection of statistical information on customer footfall and behavior, generates deeper knowledge of the business that helps improve sales and operational processes.

Examples such as spotsign show us that IoT has a growing presence in our daily lives, in the case of dynamic marketing with a double objective: improving the customer experience by activating the point of sale through technological solutions and services, and transforming the experience with the brand while providing the retailer with customer knowledge.

Business Message main players and roles

AI of Things    24 June, 2019

Telecommunications companies have a natural role to play in enabling communication channels for their customers. However, in the case of P2P (Person-to-Person) messaging we see that Over-the-Top (OTT) players, which have built conversational messaging platforms, dominate the market, leaving telecommunications companies little, if any, role to play.

However, in the case of A2P (Application-to-Person) messaging, or Business Messaging, we see that the ubiquitous and reliable SMS is still growing. SMS reach is unmatched: it has the highest read rate within minutes and boasts the highest engagement rate compared to email and OTT mobile messaging apps.

The global A2P SMS market is estimated to grow at a CAGR of 4.4% over the forecast period 2016–2025, reaching US$62.10 Bn in 2025. The ubiquitous nature of SMS services has favoured the growth of the A2P SMS market significantly. A2P messages are used for reminders, mobile event ticketing, One-Time Passwords (OTP), flight and train updates, promotional activities, polling contests, and the list goes on…
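
As a quick sanity check of what that growth rate implies, simple compound-growth arithmetic working backwards from the 2025 figure (treating 2016–2025 as nine growth periods):

```python
# CAGR relationship: value_end = value_start * (1 + rate) ** years
rate, years, value_2025 = 0.044, 9, 62.10

value_2016 = value_2025 / (1 + rate) ** years
print(f"Implied 2016 market size: ~US${value_2016:.1f} Bn")  # ~US$42.1 Bn
```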

However, with the entry of new OTT players into messaging, customers rightly demand improved features and experiences. The result has been competition in OTT Business Messaging and the emergence of RCS (Rich Communication Services).

Although most messaging platforms such as iMessage, LINE and WeChat were created as P2P messaging platforms, they have all now enabled businesses to use their messaging channel, as this provides a powerful monetisation case. Now we will review the key OTT players offering a Business Messaging proposition.

OTT analysis

WhatsApp for Business

WhatsApp is a messaging app for smartphones created in 2009. With 1.5 billion users in 180 countries, it is the most popular messaging app in the world, and it has one billion daily active users. WhatsApp began to verify business accounts in August 2017. A few months later, in January 2018, WhatsApp Business was launched. As of May 2018, 3 million businesses had signed up for WhatsApp Business via the dedicated app, which the business is required to download. Businesses can only contact customers who have provided their number and agreed to be contacted. The app targets small and medium-sized business owners and aims to facilitate better business-customer communication. As with standard WhatsApp, messages from businesses are encrypted.

Additionally, in August 2018, WhatsApp launched the WhatsApp Business API, which allows businesses to respond to messages sent by customers, who must message first. Businesses have 24 hours to reply for free (from the last message), after which they are charged at a fixed per-message rate. This is WhatsApp's first revenue-generating enterprise product. In April 2019, Facebook released a new SDK that allows mobile app developers to integrate WhatsApp verification into Account Kit for iOS and Android. This lets developers build apps where users can opt to receive their verification codes from companies through the WhatsApp app installed on their phone instead of through SMS.

Facebook Messenger

In 2011, Facebook spun its messaging feature out into a standalone app, Facebook Messenger, and three years later the company announced that messaging would be removed from the Facebook app completely, so all users would have to download Facebook Messenger to send instant messages on Facebook. In March 2015, Facebook added a feature that allowed users to send money to their friends. In spring 2016, Facebook launched a chatbot solution for Messenger that allowed businesses to build a communication channel with their customers through the app. As of April 2017, the total number of users of the app had reached 1.3 billion. Businesses on Messenger lets consumers and businesses interact straight from the Messenger app. It can be used as a customer support channel to help retain customers and answer their questions, and the platform allows businesses to accept payments right from Messenger. The business-customer relationship is managed entirely between Facebook and Messenger, in a virtuous circle.
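As an illustration of that chatbot channel, the sketch below replies to a user through Messenger's Send API on the Graph API; the page token and PSID are placeholders, and the pinned API version is an assumption.

```python
# Sketch of a Messenger chatbot reply via the Graph API Send API.
# Page token and recipient PSID are placeholders; the API version is assumed.
import requests

PAGE_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"                      # placeholder
GRAPH_URL = "https://graph.facebook.com/v3.3/me/messages"  # version assumed

def send_text(psid: str, text: str) -> dict:
    """Send a plain-text message to a user who has messaged the page."""
    resp = requests.post(
        GRAPH_URL,
        params={"access_token": PAGE_TOKEN},
        json={
            "recipient": {"id": psid},  # page-scoped ID from the webhook event
            "message": {"text": text},
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```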

Apple Business Chat (ABC)

On June 9th, 2017, Apple released Apple Business Chat (ABC), a tool that allows businesses to offer real-time customer support. Customers can search for a business in iOS via Safari, Maps, Spotlight or Siri, and from any of those services they can open Messages and chat with the business. Many businesses also let their customers start a conversation directly from their own app or website. Only clients can start a conversation, and once they delete a thread, the business cannot contact them again until they start another one. Businesses cannot see clients' personal information, such as their name or phone number, unless clients choose to share it for appointments or deliveries. ABC is provided to platform partners for free, and therefore offers strong competition to players that monetise the Business Messaging space.

This increased competition from OTTs in Business Messaging has incentivised telecommunications companies' move to RCS, which offers a standards-based upgrade to SMS.

Rich Communication Services (RCS)

RCS brings conversational and group messaging, picture sharing, audio and video, rich cards, and delivery and read receipts to life for carriers, while providing a seamless evolution with SMS/MMS fallback. RCS is already live with 65 operators in 45 countries, has 167 million monthly active users (MAU) and is forecast to reach 1.01 billion MAU by the end of 2019. Market research shows that 74% of consumers say they are more likely to communicate with a brand over RCS, and early brand results show a 10x uplift in click-through rates, with 60% of consumers preferring RCS over SMS. RCS represents an opportunity for MNOs to monetise mobile messaging by providing brands with a secure, clean marketing channel that delivers the privacy and reliability that enterprises have come to expect from mobile operators.
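For a developer-side flavour of those rich features, here is a sketch of an RCS agent message carrying one suggested reply, loosely modelled on Google's RCS Business Messaging REST API; the endpoint path and field names are quoted from memory and should be treated as assumptions rather than a verified integration.

```python
# Sketch of an RCS agent message with one suggested reply, loosely modelled
# on Google's RCS Business Messaging REST API. Endpoint path and field names
# are assumptions; verify against the official documentation.
import uuid
import requests

RBM_BASE = "https://rcsbusinessmessaging.googleapis.com/v1"  # assumed base URL
ACCESS_TOKEN = "YOUR_OAUTH2_TOKEN"  # placeholder service-account token

def send_rcs_text(msisdn: str, text: str) -> dict:
    """Send a text message plus a tappable suggested reply to one handset."""
    resp = requests.post(
        f"{RBM_BASE}/phones/{msisdn}/agentMessages",
        params={"messageId": str(uuid.uuid4())},  # client-chosen idempotency key
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={
            "contentMessage": {
                "text": text,
                "suggestions": [
                    {"reply": {"text": "Tell me more", "postbackData": "MORE"}}
                ],
            }
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```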

But the window of opportunity to scale RCS is limited. OTT messaging applications are already entering the B2C messaging space in an attempt to generate revenue from a customer base grown on top of a free P2P messaging service. WhatsApp for Business (leveraging the WhatsApp consumer base) and Apple Business Chat (leveraging the iMessage installed base) are some clear examples.

Business Messaging is crucial for businesses to communicate with their customers, and messaging platforms have now also enabled customers to communicate with businesses. SMS Business Messaging will continue to be relevant; however, there is a land grab under way in IP-based business messaging. As a telecommunications company, Telefónica is proactively supporting the enablement and deployment of RCS across its footprint, while also supporting the integration of the other channels through which our customers want to interact.

Fig 1. RCS Business Messaging vs OTT positioning


Innovating with IoT

Beatriz Sanz Baños    21 June, 2019

Startups create products and technological solutions that anticipate the needs of the future, helping us in the transition to the digital age. However, they need resources so that time does not play against them, since between the start of a project and its final release the solution may become obsolete.

Telefónica's IoT Activation Programme helps entrepreneurs minimize the time to market of their IoT solutions. Below are three success stories of startups that have participated in IoT Activation:

AEInnova

This startup was born in 2014 with the purpose of providing solutions to the problems arising from climate change and of improving the environment. To this end, it proposes to eliminate batteries from electronic devices, which would instead power themselves by converting the heat generated by their own operation into electrical energy.

To start the project, AEInnova had three basic needs: financing, knowledge and market access. To cover them, it has the help of Telefónica, which contributes the experience of its qualified technicians, the infrastructure of The Thinx and the reach of its commercial network.

Eccocar

This shared mobility platform helps fleet managers accelerate the transition to sustainable mobility. Thanks to the Kite platform, they have hardware for connecting the vehicles to the cloud and APIs through which they receive and monitor the data in real time. This allows them to receive alerts about any technical problem, thus guaranteeing the safety of the fleet and the drivers.
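Kite's actual API is not shown in the post, so the sketch below only illustrates the pattern just described, polling a fleet telemetry endpoint and raising an alert on any reported fault; the URL and field names are invented for the example.

```python
# Illustrative pattern only: poll a fleet telemetry API and flag faults.
# The endpoint and field names are invented; Kite's real API will differ.
import time
import requests

TELEMETRY_URL = "https://fleet.example.com/api/vehicles"  # hypothetical

def watch_fleet(poll_seconds: int = 60) -> None:
    """Poll vehicle telemetry and print an alert for each reported fault."""
    while True:
        vehicles = requests.get(TELEMETRY_URL, timeout=10).json()
        for v in vehicles:
            if v.get("fault_code"):  # a non-empty code signals a problem
                print(f"ALERT: vehicle {v['plate']} reports {v['fault_code']}")
        time.sleep(poll_seconds)
```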

Eccocar is linked to Wayra Germany and maintains a double relationship with Telefónica: on the one hand, it offers corporate car-sharing services for Telefónica's operating fleet; on the other, it complements Telefónica's connected-car services by offering automated rental apps to rent-a-car companies and other mobility solutions.

Plantae

At Plantae they design and develop sensors with wireless technology to optimize irrigation in agriculture and professional gardening. These devices use radiofrequency and GPRS technology to measure soil humidity, temperature and conductivity and to send the data to the cloud in real time. The information is accessible from any mobile device, which optimizes watering, saving water and energy.
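As an illustration of the kind of reading such a sensor might push over GPRS, here is a short sketch; the endpoint, field names and units are invented for the example and are not Plantae's actual protocol.

```python
# Illustration only: endpoint, payload shape and units are invented here,
# not Plantae's actual protocol. A field sensor might upload readings like this.
import time
import requests

INGEST_URL = "https://iot.example.com/ingest"  # hypothetical cloud endpoint

def upload_reading(sensor_id: str, humidity: float,
                   temperature: float, conductivity: float) -> None:
    """Send one soil reading (humidity %, temperature degC, conductivity dS/m)."""
    requests.post(
        INGEST_URL,
        json={
            "sensor_id": sensor_id,
            "timestamp": int(time.time()),   # seconds since epoch
            "soil_humidity_pct": humidity,
            "temperature_c": temperature,
            "conductivity_ds_m": conductivity,
        },
        timeout=10,
    ).raise_for_status()
```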

Plantae's sensors are already being used in gardens, agricultural plantations, football pitches and golf courses throughout Spain. Even so, the startup continues to optimize its solution in The Thinx laboratories as a preliminary step to marketing it in Latin America with the support of Telefónica.

These digital entrepreneurship projects show the great potential of IoT to successfully develop new business models that have a positive impact on society as a whole. The support of Telefónica, through the IoT Activation Programme, has been fundamental in achieving this.