D3FEND, the other side of the ATT&CK coin

David García    14 September, 2021

We are already familiar with the ATT&CK project from the MITRE Corporation. It is a de facto standard that helps us characterise threats based on the techniques and tools used by cybercriminals, which is essential when planning defences or modelling threats. It is built on the military-rooted concept of TTPs: tactics, techniques and procedures. Ultimately, it aims to model the various actors: who they are, what they do, how they do it and why. With this information, it is possible to simulate attacks realistically, improve the detection of malicious activity on the network, and so on. But today we will talk about D3FEND.

Although ATT&CK has a chapter on mitigations, the way they were arranged did not provide a common naming scheme that would make it easy to create relationships between them. Essentially, they were brief defensive notes, from a couple of lines to a paragraph in length, with no consistency between them.

Seeing this need, MITRE itself embarked on a project (still in beta) to categorise possible defensive capabilities and countermeasures and build a common language for them. D3FEND was born with this premise in mind: to create a knowledge base, in the form of an ontology, that covers countermeasures and defensive capabilities. Moreover, it does not stand apart from ATT&CK: it maps onto it, adding to and extending the chapter on mitigations.

The framework is presented as a matrix (like the ATT&CK matrices). At the top level we find the general classification: Harden, Detect, Isolate, Deceive and Evict: different defensive fundamentals, each developed in its own column.

For example, an extremely common practice is detecting malware by analysing DNS traffic, either on the surrounding network or during the detonation of a sample or executable. This corresponds to an item in the Detect area, under the network traffic analysis sub-area. There we find the item “DNS Traffic Analysis”, which contains a complete definition, its relations with other artefacts of the framework and (especially useful) the corresponding mapping to ATT&CK.
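As an illustration of the kind of signal this technique covers, here is a minimal Python sketch of one classic heuristic: measuring the entropy of queried names, which tends to be high for algorithmically generated (DGA) domains. Both the feature and the threshold are assumptions chosen for illustration; real detectors combine many more signals.

```python
import math
from collections import Counter

def label_entropy(domain):
    """Shannon entropy of the first label of a domain name."""
    label = domain.split(".")[0].lower()
    counts = Counter(label)
    total = len(label)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def looks_like_dga(domain, threshold=3.5):
    # High-entropy labels are one (noisy) hint of algorithmically
    # generated domains; common dictionary words score far lower.
    return label_entropy(domain) > threshold
```

Running this over a DNS query log would surface names like `xjq7mzk4p9w2r8tq.example` while leaving ordinary domains untouched, which is roughly the kind of triage the “DNS Traffic Analysis” item describes.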

Once we obtain the definition and relationships with ATT&CK, we can expand the relationships with the rest of the artefacts by clicking on the graph in the Digital Artifact Relationships section:

Here we can see, in a practical way, the relationships both with other defensive techniques and with the mapping to the ATT&CK framework, which is where it is particularly useful. For example, if we zoom in on the image, we can see that the offensive techniques described in ATT&CK that produce outgoing DNS traffic are grouped and identified:

The point now is not to adopt countermeasures for every technique identified, but to note that all of these techniques derive from the concept of Outbound Internet DNS Lookup Traffic or, in other words, performing a DNS lookup.

This is where we should place our emphasis, focusing on the countermeasures that facilitate the adoption of possible controls and defences: the defensive side of the tree.

What do we do to detect and prevent the resolution of malicious domains? That is the question D3FEND answers at this point.

For example, one of the countermeasures is the classic list of blocked domains, usually fed from a feed of domains classified as malicious or belonging to similar categories: spam, porn, gambling, etc.
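A minimal sketch of how such a blocklist might be applied is shown below. The feed format and function names are illustrative, not those of any particular product: a feed of known-bad domains is loaded into a set, and each observed DNS query is matched against it, including parent domains, so blocking `evil.example` also catches `cdn.evil.example`.

```python
def load_blocklist(lines):
    """Parse a feed of blocked domains, one per line; '#' starts a comment."""
    blocked = set()
    for line in lines:
        domain = line.split("#", 1)[0].strip().lower().rstrip(".")
        if domain:
            blocked.add(domain)
    return blocked

def is_blocked(query, blocklist):
    """True if the queried name or any parent domain is on the blocklist."""
    labels = query.lower().rstrip(".").split(".")
    # Check the full name and every parent, stopping before the bare TLD.
    for i in range(len(labels) - 1):
        if ".".join(labels[i:]) in blocklist:
            return True
    return False
```

In practice this check would sit in a DNS resolver or proxy, and the feed would be refreshed periodically from the threat-intelligence source.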

As we can see, defending is an art, and here is the science that classifies, documents and interrelates every concept, giving us a tactical view and helping us in the arduous task of building walls and digging trenches.

D3FEND is in beta, as we have already mentioned, and much of its content has yet to be detailed and expanded. Nevertheless, it clearly has potential and, above all, it complements its twin, ATT&CK, very well.

Victory is on your ideas

ElevenPaths    13 September, 2021

Victory is not always a question of numbers. In order to win, you have to believe in what you do, rely on your team and always be willing to improve. Overcoming the odds is not an easy task, and our #LadyHackers know this. Only through analysis of the situation, the ability to convince and, above all, the value of teamwork do we win games.


Because victory is on your ideas, we present the fourth video of the #LadyHacker 2021 campaign, Telefónica’s global initiative that aims to make the role of women in the technology sector more visible and raise awareness among our girls about their potential to study STEM careers. PLAY IT!

Join the #LadyHacker initiative and WE ARE WAITING FOR YOU!

Cyber Security Weekly Briefing 4-10 September

Telefónica Tech    10 September, 2021

Critical vulnerability in Zoho ADSelfService Plus

The company Zoho has issued a security advisory warning of a critical vulnerability in ADSelfService Plus, an enterprise password and login management product. The vulnerability is an authentication bypass affecting REST API URLs in ADSelfService Plus that would allow a threat actor to achieve remote code execution (RCE). Although it has been assigned the identifier CVE-2021-40539, it has not yet received a CVSSv3 score; several sources, however, define it as critical. Both Zoho itself and the US Cybersecurity and Infrastructure Security Agency (CISA) have confirmed the vulnerability, and CISA has reported evidence of active exploitation in the wild, so it is recommended to apply as soon as possible the updates already released by Zoho, which fix the problem in all versions of ADSelfService Plus prior to build 6114.

More: https://www.manageengine.com/products/self-service-password/kb/how-to-fix-authentication-bypass-vulnerability-in-REST-API.html

New Windows 0-day actively exploited

Microsoft has published a security advisory revealing details of a new remote code execution vulnerability in Microsoft MSHTML, the component that handles the rendering of web documents in the now obsolete Internet Explorer (IE) browser but also in Office products. This flaw, listed as CVE-2021-40444 with a CVSSv3 score of 8.8, is being exploited by threat actors in targeted attacks that deliver a specially crafted document requiring user interaction.

For the time being there is no patch, but the flaw can be mitigated by disabling new ActiveX controls in IE. It should also be noted that, according to Microsoft, the attack is thwarted if Office’s default “Protected View” configuration is maintained. Security researchers claim to have located malicious Word documents used in attacks, obtaining more information about the exploitation and confirming that its criticality is greater than initially thought. It has also been found that the Protected View that Office applies by default to files downloaded from the internet (via the Mark of the Web, MotW) is not triggered if, for example, the malicious document is contained in a ZIP or ISO file or is an RTF document. At this point, it is unclear whether Microsoft will release an official patch on Tuesday 14 September to address this vulnerability, so it is strongly recommended to apply the described mitigation measures and not to open attachments that do not come from a trusted source.

All details: https://msrc.microsoft.com/update-guide/vulnerability/CVE-2021-40444

0-day in Ghostscript allows servers to be compromised

Vietnamese security researcher Nguyen The Duc published last Sunday a proof of concept (PoC) for an unpatched 0-day vulnerability in Ghostscript. The exploit, published on GitHub and confirmed to work by several researchers, poses a risk to all servers using this component. Ghostscript is a small library that allows applications to process PDF documents and PostScript-based files. Although it is most commonly used in desktop software, it is also widely used on servers, where it is bundled with image conversion and file upload processing tools such as ImageMagick. The published proof of concept exploits this second scenario, allowing an attacker to upload a crafted SVG file that bypasses the image processing step and executes malicious code on the system. It is worth noting that this 0-day was discovered last year by researcher Emil Lerner; however, it was not made public until last month, when it was presented at a security conference.

More information: https://therecord.media/ghostscript-zero-day-allows-full-server-compromises/

ProxyShell exploit to deploy Conti ransomware

New research by Sophos has revealed that the operators of the Conti ransomware have added to their arsenal the exploitation of recent vulnerabilities in Microsoft Exchange that form the exploit chain known as ProxyShell (CVE-2021-34473, CVE-2021-34523, CVE-2021-31207). While this technique was already used as an access vector by the LockFile ransomware just a few weeks ago, the Conti intrusions show an improvement in technique that allows a network to be completely compromised in just five days. Within one minute of successfully exploiting ProxyShell, the attackers had a remote web shell; within four hours they had obtained domain administrator credentials; and within 48 hours they had exfiltrated 1 TB of sensitive data. In total, over the course of one intrusion, up to seven backdoors (web shells, Cobalt Strike and commercial tools such as Atera or Splashtop) were observed maintaining access to the compromised environment.

More: https://news.sophos.com/en-us/2021/09/03/conti-affiliates-use-proxyshell-exchange-exploit-in-ransomware-attacks/

‘Choosing people to whom you can entrust is one of the biggest challenges for entrepreneurs’, Dani Aldea – Altostratus

Telefónica Tech    8 September, 2021

Altostratus is a company specialised in software development and digital transformation processes. Tell us about its beginnings: how did the idea come about?

The idea came about in 2009, when I came across an article in the newspaper where Google Cloud was looking for local partners in Spain. Without thinking too much about it, I decided to create Altostratus with the idea that it would be a company focused on marketing Google Cloud products, mainly what was then known as Google Apps for work and which later became Google Workspace, Google Cloud’s collaboration and communication suite.

Over the years we have expanded into consulting based on Google Cloud Platform and the development of innovative cloud solutions for businesses. Altostratus is currently one of the main Google Cloud Partners in the country and has been part of Telefónica Tech since June this year.

As a digital transformation company, what would be the differentiating elements of Altostratus, what is its value proposition?

Our value as a company is that we use cloud technology to deliver innovative solutions to operational business needs. In other words, we act as a link between business and technology, we understand both worlds and we connect them.

And although we are familiar with other technologies because of our background, where we have specialised most is in the Google Cloud environment.

As values that define us, I would highlight our agility when it comes to tackling projects and our close relationship with the client, whether it is an SME or a large company. And, above all, the personalisation of the service, we always analyse the company’s environment and its internal processes to be able to select and offer the best technological solutions in each case.

Undoubtedly, one of the major milestones so far is the entry into the Telefónica Tech group. What does Telefónica Tech bring to Altostratus and, on the other hand, what is the value of Altostratus for Telefónica at a strategic level?

Being part of Telefónica Tech is the culmination of a journey that began some time ago for Altostratus to become the leader in Google Cloud services in Spain. For us it is a natural step in our relationship with Telefónica, after several years of jointly accompanying customers in their process of digital transformation or outsourcing to the cloud. Thanks to the size of Telefónica Tech, we will be able to count on the technical and commercial resources of a multinational and take on larger projects.

With respect to Telefónica Tech, this acquisition reinforces the Telefónica Group’s MultiCloud strategy, internalising highly specialised technical teams with in-depth knowledge of one of the three public clouds it covers, in this case Google Cloud.

To simplify it… We provide know-how and Telefónica Tech provides a strong structure, with the possibility of reaching all parts of the market.

What customer segment is your offer aimed at and what kind of projects do you develop?

Our services are aimed at large companies, where heavy investments in IT infrastructure or development make more sense, because they also have a great impact on cost reduction or return at a business level.

In terms of sectors, our most relevant projects are in companies that need a high level of automation in their operations and services, or those that handle a large volume of data and make decisions based on it. Some common sectors are retail, banking, insurance, logistics or outsourcing.

As an outstanding case, I can tell you about a project developed for the banking sector, in which we built a multi-platform, multi-language voice-enabled virtual assistant using ML and AI technology. The aim was to complement the bank’s call centre with a comprehensive, personalised 24-hour service, capable of handling queries but also carrying out banking operations.

Another flagship case, in sensorisation and ML, was the development of a highly innovative predictive maintenance solution that we carried out for a company in the energy sector. It consisted of calculating the status of the entire network in real time, anticipating losses in the network and forecasting consumption increases, with all the economic savings and improvement in service that this entailed.

How is the Cloud sector evolving in Spain and how do you see it developing in the medium term?

The sector is growing rapidly, and in Spain alone it is estimated that it will grow by 20% annually in the coming years.

On the one hand, companies increasingly see the value of moving their IT infrastructure to the cloud, both for the scalability of their applications and for cost reasons. The cloud offers a great financial advantage, as it allows IT spending to be treated as an operating cost (OPEX) instead of CAPEX, that is, large investments in computer equipment that are amortised over time.

On the other hand, the context we have been living in has meant a strong acceleration in the demand for digitalisation. Teleworking has accelerated the adoption of new SaaS work tools, as they allow access to company data from any place and device, and also to communicate globally in an efficient way. In the end, all of this has given a strong boost to our sector.

What personal learning would you highlight?

One of the great lessons for me has been the importance of honesty in decision making, honesty with yourself and with others. When you make a decision thinking about personal ethics and the impact it will have on others, only then will you be able to rest easy, even though some people may not like that decision.

And my second learning is the importance of choosing a good team. When you start a project, you feel capable of everything, but as it progresses you need to delegate more and more, otherwise you can’t move forward. Choosing people to whom you can entrust important responsibilities is one of the biggest challenges for entrepreneurs.

I fully identify with the saying that alone you go faster, but as a team you go further.

What Little Red Riding Hood teaches us about cyber security

José Vicente Catalán    7 September, 2021

They say that a good fairy tale is one that admits many different readings and can convey hundreds of different messages. Little Red Riding Hood can be read from a cyber security point of view and is therefore one of the best stories ever written. It will certainly make it easier for you to make the little ones at home aware of the risks on the net and, while you’re at it, the grown-ups can brush up on basic security notions.

Perhaps it has been a while since you last read Little Red Riding Hood and you don’t remember it well, so let’s refresh your memory, without being exhaustive: Little Red Riding Hood’s mother asks the girl to go to her grandmother’s house to carry a basket of food and reminds her not to leave the road or talk to strangers; on the way a hunter warns her of the presence of a dangerous wolf; Little Red Riding Hood leaves the road to go through the woods and talks happily with the wolf, giving too many details of her errand; the wolf convinces the girl to pick flowers for the grandmother; the wolf arrives before Little Red Riding Hood, eats the grandmother and dresses up as her; Little Red Riding Hood then, although suspicious of the strange grandmother, stays with her and ends up being eaten by the wolf; finally, the hunter arrives at the house and rescues both grandmother and granddaughter, causing the wolf to run off.

What does this have to do with cyber security?

A lot! Let’s go step by step:

  • Little Red Riding Hood’s mother asks the girl to go to her grandmother’s house to take a basket of food and reminds her not to leave the road or talk to strangers: when we move around the internet it is important that, like Little Red Riding Hood, we stick to safe roads. That means both the connection, always using secure and trusted networks (public WiFi networks can be traps set to steal your data), and the “path” we follow on the internet, avoiding dubious websites where an attack is more likely.
  • Along the way, a hunter warns her of the presence of a dangerous wolf: it is important to keep your security software up to date. The hunter can be understood as an antivirus that is aware of the nearby presence of a malicious agent; if it is not up to date, it cannot fulfil its preventive function.
  • Little Red Riding Hood leaves the road to go through the forest and talks happily with the wolf, giving too many details of her errand: perhaps the most important lesson in cyber security is exposed here. It is vital to protect your information, because with it attackers can design a thousand different attacks. Little Red Riding Hood tells the wolf where she is going, where granny’s house is, what she has in her basket… and the wolf uses it all for his criminal purpose. So we should never give more information than is strictly necessary, and even less to someone we do not know.
  • The wolf convinces the girl to pick flowers for her grandmother: it is important that on the internet we do not do what a stranger asks us to do: do not click on that link or open the file sent to you by a strange sender because it is very likely that you are infecting your computer.
  • The wolf arrives before Little Red Riding Hood, eats Grandma and dresses up as Grandma: as a result of the previous mistakes, the wolf perpetrates what in cyber security is called a “Man-in-the-Middle attack“: an attacker (wolf) knows that a message (in this case Little Red Riding Hood) must get from a sender (mother) to a receiver (grandma) and uses information he has collected (my name is Little Red Riding Hood and I’m bringing you a basket of food) to get there first and deceive the receiver, thus achieving a benefit (eating grandma).
  • Even though Little Red Riding Hood is suspicious of this strange grandmother, she stays with her and ends up being eaten by the wolf: this point is a perfect description of a phishing attack. On the Internet it is common for a website to cross our path that, at first sight, seems authentic and trustworthy (like the wolf dressed up as granny seen from the door of the house by Little Red Riding Hood), but when we get closer and look closely we see that it has spelling mistakes, that the logo is not exactly the original and that the URL has a hyphen between words that the original website does not have (as when Little Red Riding Hood gets closer and sees those big ears, those paws and those fangs so unlike her granny). Caution on networks is essential and if something, whether it is a website or an email, looks suspicious, it is more than likely that it hides some malicious intent behind it.
  • Finally, the hunter arrives at the house and rescues both grandmother and granddaughter, causing the wolf to run away: Although Little Red Riding Hood is not an example of caution and her carelessness makes her an easy prey for the wolf, the truth is that sometimes on the networks we end up being victims of some kind of attack even if we have taken all the preventive measures. In these cases, we can rely on cyber security experts, who, like the hunter who rescues grandma and Little Red Riding Hood, can act quickly to minimise the damage caused by a wolf.
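The “look closely” advice in the phishing point above can be caricatured in code. The following Python sketch flags hostnames that embed a trusted name without actually being it, like a wolf in granny’s nightgown; the trusted domain is hypothetical, and real anti-phishing checks are of course far more sophisticated.

```python
TRUSTED = {"grandma-bank.example"}  # hypothetical trusted domain

def looks_suspicious(hostname):
    """Toy lookalike check: a trusted name embedded in a different domain."""
    host = hostname.lower().rstrip(".")
    if host in TRUSTED or any(host.endswith("." + t) for t in TRUSTED):
        return False  # the real site, or a genuine subdomain of it
    # Classic lookalike tricks: extra hyphens ("grandma--bank.example")
    # or the trusted name used as a subdomain of an attacker's domain
    # ("grandma-bank.example.attacker.test").
    stripped = host.replace("-", "")
    return any(t.replace("-", "") in stripped for t in TRUSTED)
```

The point of the exercise is the same as the fairy tale’s: the closer you look at the ears, paws and fangs of a URL, the less it resembles the grandmother it claims to be.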

Cyber Security Weekly Briefing 28 August – 3 September

Telefónica Tech    3 September, 2021

PoC available and scans detected for RCE in Confluence

On Wednesday 25 August, Atlassian published a security advisory warning of a vulnerability in Confluence Server and Data Center in versions prior to 6.13.23, 7.4.11, 7.11.6, 7.12.5 and 7.13.0. In the advisory, the firm clarified that the flaw does not affect Confluence Cloud customers. The vulnerability, which has been given the identifier CVE-2021-26084 and a CVSS score of 9.8, is an OGNL (Object-Graph Navigation Language) injection flaw that would allow an authenticated user, and in some cases even an unauthenticated user, to execute arbitrary code on a Confluence Server or Data Center instance. Just a few days later, on Sunday 29 August, some security researchers announced that they had managed to execute code remotely without authentication in a relatively simple way, although they withheld the details of the PoC for a few days, until 1 September. Even before the PoC was made public, mass scans for vulnerable Confluence servers were already being reported on 31 August.

More: https://therecord.media/confluence-enterprise-servers-targeted-with-recent-vulnerability/

ChaosDB – Critical vulnerability in Microsoft Azure Cosmos DB

Security researchers from Wiz have discovered a critical vulnerability in Azure, Microsoft’s cloud platform, that allows the complete remote takeover of Cosmos DB accounts with admin privileges. Due to the severity of the flaw, the researchers have not published all its technical details or the means to exploit it. However, they have confirmed that ChaosDB is triggered by the chained exploitation of a series of vulnerabilities in the Jupyter Notebook feature of Cosmos DB. By exploiting these flaws, a threat actor could obtain credentials for the targeted Cosmos DB, Jupyter Notebook and Jupyter Notebook Storage accounts. With those credentials, the attacker could view, modify and delete data in the Cosmos DB accounts. In their article, Wiz have posted a video showing the exploitation chain. Microsoft patched the flaw on 12 August, less than 48 hours after being warned by Wiz, but it was not until 26 August that it sent a warning to 30% of Cosmos DB users. In that warning, Microsoft stated that there was no evidence of the vulnerability being exploited, but urged users to regenerate their primary keys as a security measure. Meanwhile, Wiz has indicated that the number of potentially affected customers could be larger than Microsoft’s estimate and has recommended that all users take the necessary security measures.

All the details: https://chaosdb.wiz.io/

ProxyToken – New Microsoft Exchange vulnerability

Security researchers at the Zero Day Initiative have published technical details of a severe vulnerability in Microsoft Exchange Server dubbed ProxyToken. The flaw, listed under the identifier CVE-2021-33766 with a CVSSv3 score of 7.3, is an information disclosure vulnerability that could reveal victims’ personal information or sensitive company data, among other things. Microsoft Exchange uses two sites: the front end, which users connect to in order to access email and which largely functions as a proxy for the back end, to which it passes authentication requests. The problem lies in a feature called DelegatedAuthModule: when the front end receives a request carrying a SecurityToken cookie, it forwards it without authenticating it, on the assumption that the back end is solely responsible for authenticating such requests. However, the back end is completely unaware that it needs to authenticate incoming requests based on the SecurityToken cookie, because DelegatedAuthModule is not loaded on installations that have not been configured to use the special delegated authentication feature. The result is that requests can pass through without being subjected to authentication on either the front end or the back end. Microsoft addressed the issue in its July updates and recommends that all Exchange server administrators who have not yet installed the appropriate patches prioritise this task.
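The broken trust relationship is easier to see in a toy model. The Python sketch below is a deliberate simplification for illustration, not Exchange’s actual code: the front end skips its own check whenever a SecurityToken cookie is present, while a back end without delegated authentication configured performs no check at all, so the request sails through both tiers.

```python
def front_end(request, back_end):
    """Front-end tier: authenticates, except when it thinks the back end will."""
    if "SecurityToken" in request.get("cookies", {}):
        # Delegate: assume the back end is responsible for authentication.
        return back_end(request)
    if not request.get("authenticated"):
        return "401 Unauthorized"
    return back_end(request)

def back_end_without_delegated_auth(request):
    # DelegatedAuthModule is not loaded, so the SecurityToken cookie is
    # never validated: the request is served unconditionally.
    return "200 OK"
```

An unauthenticated request without the cookie is rejected, but the same request with any SecurityToken cookie attached is served, which is the essence of the bypass.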

Learn more: https://www.zerodayinitiative.com/blog/2021/8/30/proxytoken-an-authentication-bypass-in-microsoft-exchange-server

BrakTooth: vulnerabilities affecting Bluetooth devices

The ASSET research team has published a total of 16 security advisories addressing 20 vulnerabilities that affect the Bluetooth software stack on System-on-Chip (SoC) boards from eleven different vendors. It is estimated that billions of devices are affected, including mobile devices, computers, tablets, etc. According to the researchers, exploiting these security flaws could allow denial-of-service attacks or the execution of malicious code, although the impact differs depending on the SoC board model and Bluetooth software stack used. The vulnerabilities identified include CVE-2021-28139, which allows remote code execution on devices with ESP32 SoC boards from Espressif Systems via Bluetooth LMP packets. So far, only three of the affected vendors have released patches: Espressif Systems, Infineon and Bluetrum. Others, such as Intel, continue to work on the issue; some, such as Texas Instruments, have indicated that they will not address it; and Qualcomm will only fix part of it.

Info: https://asset-group.github.io/disclosures/braktooth/

Blockchain Expectations and Realities From 2017 To 2021 And Beyond

Jorge Ordovás    30 August, 2021

It has been several years since 2017, when Gartner first included Blockchain at the top of its “Hype Cycle” for emerging technologies. At the time, it estimated that the technology would take 5-10 years to reach the mass market, and it was already beginning its descent into the “trough of disillusionment” alongside companions such as machine learning, autonomous vehicles and drones.

Today we are going to review the experience of companies like Telefónica in turning Blockchain’s expectations into reality. What has happened during this time? Are there already successful applications of this technology, four years later? In which areas is it being applied? What does the future hold for us?

The Ancient Age: The Beginning of Cryptocurrencies

Strictly speaking, 2017 was not the first time that Gartner introduced something related to Blockchain in its “Hype Cycle”. Two years earlier, in 2015, we could find cryptocurrencies on this curve.

Probably 2015 was the turning point for the first wave of bitcoin adoption. Companies such as WordPress, Microsoft, DELL or Destinia (to mention a Spanish one) saw in this cryptocurrency an alternative means of payment for their customers. But the use of cryptocurrencies to purchase goods and services, integrated into these companies’ online shops, quickly fell into the trough of disillusionment. The reason was simple: it did not solve any problem, for either party.

There was no clear return for the companies: the number of users was low, and the cost savings from lower fees compared to alternatives were unattractive. On top of that came the risk (technological, legal, economic, reputational, etc.) involved in adopting bitcoin. Nor did users have any incentive to replace credit cards or services such as PayPal, which were easy to use and already accepted as the usual, secure means of making online purchases.

Gartner Hype Cycle for Emerging Technologies – 2015

The Middle Ages: From Cryptocurrencies to Smart Contracts

Between 2015 and 2017, expectations of the massive use of cryptocurrencies deflated, weighed down by their use as a means of payment. Companies began to look at the underlying technology that made cryptocurrencies possible. They thought of a “decentralised database” where information was stored, without intermediaries, secure and unalterable, and which could be used for more than just payments. Thus, in 2017, cryptocurrencies disappeared from Gartner’s “Hype Cycle” to make way for Blockchain.

Gartner Hype Cycle for Emerging Technologies – 2017

During this period, new technological solutions began to emerge, based on the core concepts of cryptocurrencies but applied to the deployment of general-purpose applications. They no longer focused solely on payments, and used both public and private Blockchain networks. These applications, known as “smart contracts”, make it possible to define business logic which, once deployed on these networks, implements different use cases securely (without any single actor being able to affect their operation). They take advantage of the immutability, traceability and transparency of information in Blockchain networks, in scenarios where these properties represent differential value.

The Modern Age: Technology in Search of a Problem

Well, that’s great, but… ‘what’s the point of it’, you may ask. Where does it make sense for a company to integrate some Blockchain technology? Does it really add value to the business? Is it just one more of these purely technological trends that do not go beyond proofs of concept and pilots?

At that time, the objective of those of us working with this technology was to answer these questions: to identify use cases where its application was differential, solving a problem that had not been solved until then, or solving it in a different way. Generating revenue seemed difficult, but obtaining efficiencies (cost and time savings) did seem possible, especially in complex processes involving multiple actors.

During those two years we probably discarded 90% of the scenarios where applying Blockchain was proposed, because the initial analysis showed no clear benefit. Sometimes it introduced more complexity than it solved, whether technologically, operationally or in terms of governance between the actors involved. In other cases it required a technological maturity that did not yet exist, or it simply made no sense to apply Blockchain at all. But we did tackle projects where it made sense, especially complex processes with multiple actors: cases where intermediaries exist because the parties do not trust information generated by third parties, or where there are conflicting interests (as in the reconciliation of international call billing between operators, for example).

The French Revolution: Supply Chains

Among the recurring use cases that were implemented in companies, one stood out above the rest. This was the application of Blockchain in supply chains. It had all the characteristics to become the ideal candidate: different actors involved, a linear process to monitor, multiple entities to control, potential impact on the business, etc.

Telefónica was a pioneer in the application of Blockchain (along with other technologies such as IoT, Big Data or AI) in the transformation of one of our supply chains. We managed to implement a project with real returns: ROI in less than a year and substantial savings. It was not just a matter of applying Blockchain, but also of highlighting the operational improvements that the incorporation of these technologies allowed, as the Global Supply Chain business area managers who made it possible explain in detail in this webinar.

The Contemporary Age: Blockchain for All

And so we arrive at 2019, the year in which Gartner stopped including Blockchain in its “Hype Cycle” of emerging technologies. It was already a sufficiently established technology to have a specific “Hype Cycle” predicting which application areas would have the greatest development expectations in the coming years, but also showing which ones were already at the bottom of the “trough of disillusionment”, struggling to find their “product-market fit” in order to reach the market definitively.

Gartner Hype Cycle for Blockchain Business – 2019

If we look first at the lower end of the “trough of disillusionment”, we find the generic concepts of “Blockchain” and “Distributed ledgers” (the superset of distributed ledger solutions, which also includes Blockchain). However, in 2019 cryptocurrencies started to “return” to the curve, along with several related trends (ICOs and Digital Asset Exchanges).

Taking this graph as a reference, in the last two years we can identify two very distinct trends:

  • The increase in the use of private and consortium Blockchain technologies in the business environment, typically for process optimisation.
  • The development of decentralised financial services based on public Blockchain technologies and cryptoassets.

Blockchain In the Company

In the business sphere, different realities have become evident (which we have broken down in more detail in the following article):

  • The maturity of Blockchain technologies specifically designed for the business environment
  • The move from proof-of-concept to production environments
  • The importance of return on investment
  • The need to interoperate between the different networks (private, public and consortium) to obtain the greatest benefit
  • The importance of having reusable components (allowing us not to “start from scratch” in each project)
  • The value of effective decentralisation
  • The importance of combining Blockchain technology with other technologies, such as IoT (to give greater veracity to the information that is recorded)

Blockchain In the Financial Sector

Since 2019, the development of decentralised financial services has increased significantly. There is growing interest from both customers who make use of them and companies in the financial sector. It is worth noting:

  • The use of cryptocurrencies as an investment asset (not as a means of payment), both among individuals and by companies such as Tesla and Microstrategy (which have been publicly acquiring bitcoin since 2020)
  • The growth of “stablecoins”, cryptocurrencies designed to remain stable in value (avoiding volatility, and thus reducing the risk of traditional cryptocurrencies)
  • The development of DeFi (Decentralised Finance), a set of “decentralised” protocols and services (loans, deposits, derivatives, etc.), supported by the ecosystem of cryptocurrencies and stablecoins, whose volume of business has increased sharply since 2020
  • The emergence of NFTs (Non-fungible tokens), which make it possible to develop business models based on unique assets (works of art, collectibles, etc.)
  • The interest of central banks in the development of “digital money”, the so-called CBDCs (Central Bank Digital Currency)

Blockchain Today

Gartner recently published its latest Blockchain Hype Cycle to date, where we can see some of these trends at different stages of the curve.

Gartner Hype Cycle for Blockchain – 2021

Some of these trends will be key to the ultimate development of this technology, especially in the business environment:

  • The combination of technologies to bring the greatest value to the business.
  • Consolidating PaaS solutions that make it easy to use Blockchain for quick and easy integration into enterprises, such as TrustOS.
  • Consolidation of Layer 2 solutions to increase the performance of existing networks.
  • The development of interoperability solutions between networks, to connect and extend services and avoid “silos”.
  • The promotion of digital identity supported by distributed ledger technologies, where Spain is one of the leading nations.
  • The development of new business models based on tokenisation.

It is clear that this has only just begun. At Telefónica we have an exciting challenge ahead of us over the next few years to continue to lead the adoption of Blockchain in the enterprise, to successfully address all these lines of development and to realise the expectations of Blockchain in Gartner’s Hype Cycle, as we will discuss in a future article.


Cyber Security Weekly Briefing 14-27 August

Telefónica Tech    27 August, 2021

Exploitation of vulnerabilities in Exchange ProxyShell​

Security researcher Kevin Beaumont has analysed the recent massive exploitation of the Microsoft Exchange Server vulnerabilities known as ProxyShell. These are a set of flaws revealed by Orange Tsai during the Black Hat conferences, comprising CVE-2021-34473, CVE-2021-34523 and CVE-2021-31207. In his publication, Beaumont explains how to identify potentially affected systems and urges security teams to patch the flaws as soon as possible. This is because, as revealed by Symantec, the LockFile ransomware team has been taking advantage of these vulnerabilities to access victims’ networks and then using the PetitPotam vulnerability, yet to be fully patched, to access the domain controller and spread through the network. So far, at least 10 companies affected by this campaign have been identified, mainly located in the US and Asia. Given the circumstances, CISA has published guidelines to identify affected systems and possible mitigations.

The Microsoft Exchange team has published a new warning updating last week’s information on ProxyShell, confirming that Exchange servers are protected if Microsoft’s monthly patches for May and July are installed, and recommending that this type of software be kept constantly updated. The article includes a series of guidelines for identifying vulnerable Exchange servers. Moreover, researchers from Huntress have issued several updates to the post in which they have been analysing these vulnerabilities, reporting the detection of over 140 webshells already installed on vulnerable servers belonging to companies from various sectors. According to the researchers, some of the dates on which configurations were tampered with date back to March, April, June and July, which suggests a possible connection with ProxyLogon.

All the details: https://doublepulsar.com/multiple-threat-actors-including-a-ransomware-gang-exploiting-exchange-proxyshell-vulnerabilities-c457b1655e9c

Realtek vulnerabilities exploited to distribute malware

In mid-August, four vulnerabilities were disclosed by IoT Inspector Research Lab in an SDK distributed as part of Realtek chipsets used in hundreds of thousands of smart devices from at least 65 vendors. Among the four issues discovered, the critical vulnerability classified as CVE-2021-35395 received the highest severity rating, 9.8 CVSSv3. Effective exploitation of these bugs could allow unauthenticated attackers to fully compromise the target device and execute arbitrary code with the highest level of privilege. Although Realtek released patches a day before IoT Inspector published its findings, researchers at Seamless Network have detected attempts to exploit these vulnerabilities to propagate a variant of the Mirai malware. Furthermore, according to Seamless Network’s scans, the most common device models currently running the vulnerable Realtek SDK are the Netis E1+ extender, the Edimax N150 and N300 Wi-Fi routers and the Repotec RP-WR5444 router; owners of such devices are advised to check with their vendors for new firmware patches.

Learn more: https://securingsam.com/realtek-vulnerabilities-weaponized/

38 million records exposed due to Microsoft Power Apps misconfiguration

The UpGuard team has published a report about a misconfiguration in Microsoft Power Apps which would have resulted in the exposure of more than 38 million personal data records. Microsoft Power Apps allows companies and institutions to create custom applications and can enable the OData (Open Data Protocol) API to retrieve user data from Power Apps lists. On May 24, UpGuard detected that lists with Power Apps data could be accessed anonymously via the OData API, because access is not restricted by default. The investigation discovered thousands of lists accessible on hundreds of portals, including those of private companies and public administrations, with data ranging from email addresses and vaccination appointments to first and last names, phone numbers and social security numbers. Microsoft has changed the default settings to address the problem and has contacted affected customers, as has UpGuard, which has alerted 47 affected entities.

Full info: https://www.upguard.com/breaches/power-apps

New iPhone exploit used to deploy Pegasus spyware

Researchers at Citizen Lab have detected a new zero-click iMessage exploit, called FORCEDENTRY, that was used to deploy NSO Group’s Pegasus spyware. FORCEDENTRY was used to target the devices of at least nine Bahraini activists, including members of the Bahrain Center for Human Rights, Waad and Al Wefaq, between June 2020 and February 2021. At least four of the activists are believed to have been compromised by LULU, a Pegasus operator attributed with high confidence to the government of Bahrain. Furthermore, the report points out that one of the hacked activists was living in London at the time of the compromise, making this the first documented compromise by the Bahraini government of a device used by an activist in Europe. The Citizen Lab report also states that some of the activists’ phones suffered zero-click iMessage attacks that, besides FORCEDENTRY, also included the 2020 KISMET exploit. Experts recommend disabling iMessage and FaceTime to prevent the attacks mentioned in the report, although powerful spyware like that developed by NSO Group has many other exploits in its arsenal.

More: https://citizenlab.ca/2021/08/bahrain-hacks-activists-with-nso-group-zero-click-iphone-exploits/

Vulnerability in Kalay protocol affects millions of IoT devices

Researchers at Mandiant have discovered, in coordination with the Cybersecurity and Infrastructure Security Agency (CISA), a vulnerability in IoT devices using the Kalay network protocol from the manufacturer ThroughTek. The vulnerability, classified as CVE-2021-28372, allows unauthorised remote connection to the devices by an attacker, thus compromising their integrity and allowing audio eavesdropping, real-time video viewing and even the compromise of device credentials. The manufacturer has so far been unable to determine the number of affected devices due to the way the protocol is integrated into the products’ software, although it is estimated that there are at least 85 million active devices using this protocol. Versions prior to 3.1.10 and 3.4.2.0 are affected by this vulnerability.

All the details: https://www.fireeye.com/blog/threat-research/2021/08/mandiant-discloses-critical-vulnerability-affecting-iot-devices.html

‘A Data-Driven Company’ the bedside reading for leading this transformation

Antonio Pita Lozano    25 August, 2021

A Data-Driven Company is the name given to companies that use data as a fundamental asset for decision-making in all their processes, doing so optimally through the use of advanced analytical techniques such as Artificial Intelligence. As has been said before, this is vital for companies, to the point of becoming a matter of survival.

A Data-Driven Company is also the title of the latest book by Dr. Richard Benjamins, Chief AI & Data Strategist at Telefónica, recently published with the aim of helping companies accelerate their data-driven transformation. The book also includes mini-articles by industry experts that provide a transversal view of the major issues companies face in this process.

What I liked most is that it is real. It deals, in a pragmatic way, with the most important questions that companies face in their data-driven transformation process, avoiding the euphoria and overselling that surrounds the world of Artificial Intelligence.

This book, full of experience and advice, helps to identify and recognise the challenges companies face and the decisions to be made. Different alternatives are presented and analysed for each of them, facilitating the selection of the one that best suits the company according to its particular characteristics. It is structured in 21 lessons grouped into 5 thematic parts, which I will summarise at a high level in this post.

Transforming the company

The first part focuses on organisational transformation: the company’s organisational chart and the relationships between areas that ensure goals are met. It starts with the key profile, the Chief Data Officer (CDO), and moves on to the relationships between the data, IT and Artificial Intelligence areas and the CDO itself. These relationships will be established depending on the company’s maturity, and for this it is convenient to have maturity measurement methodologies, as indicated in the book.

This first part is aimed at the management committees of companies or those responsible for their organisation, so that they can successfully plan the transformation. The remaining parts are aimed at the people who must lead this data-driven transformation within their companies, with the CDO being the main but not the only target of these lessons.

Use cases accelerate transformation

The second part deals with the business side and the financing of transformation plans. The first step is the selection of use cases, an arduous task given the lack of knowledge and the uncertainty surrounding the new technologies to be used, together with the measurement of the economic impact of the use case to be developed.

At this point, I liked the proposed measurement approach: comparing the economic impact of the use case, both in terms of cost reduction and increased revenue, with the impact if the analytical use case were not available. This is not easy to do, so it is necessary to establish the measurements before the use case is built in order to be able to measure the economic impact. But economic impact is not the only thing we get from use cases. Each use case also has an impact on the cultural change of the company; although this is difficult to measure, it is undeniable that successful use cases accelerate the company’s transformation.

Technology, a key pillar in this transformation

The third part is focused on technology, one of the fundamental pillars of the data-driven transformation. Apart from the well-known debate between Cloud and On-premises (with permission of hybrid systems), also known as Capex to Opex transformation strategy, other key aspects in the development of the technological strategy are discussed.

Deciding between a local or a global strategy for the administration and management of data and for the development of advanced analytics may seem easy, but many aspects must be taken into account, such as technological maturity or the impact on data governance, among others. The book even contemplates the use of MLaaS (Machine Learning as a Service) for more mature organisations.

People at the heart of transformation

The fourth part is all about the people, which in my opinion is the most important pillar to address in the data-driven transformation of companies. The book does not focus on hiring architects, engineers or data scientists to execute the use cases, but on managing the members of the company who must participate in the transformation through the democratisation of data, thanks to Data Literacy programmes and self-service tools. These tools should enable the entire company to extract the maximum value from data, each person at his or her level of knowledge and skills.

A lesson is also reserved for managing employee scepticism, which is very common in transformation plans and is often their main challenge. We must be able to identify and reverse it as early as possible.

In this fourth part we find what for me is the most important lesson: how to create “momentum with data”, in other words, how to approach the execution of use cases, not from the planning of tasks and activities, but from the management of expectations and the management of those involved or affected within the company. A good strategy of interactions and communications that creates positive momentum is key to the success of the use cases and, therefore, of the transformation. 

Without ignoring the responsibility

Part five focuses on the responsibility of data-driven companies in the face of the societal challenges presented by Artificial Intelligence. Issues such as discrimination and bias in algorithms, the appropriateness of the use of black box algorithms, data privacy and security, and the use of autonomous decisions are presented in the book to ensure that those in charge of managing teams using Artificial Intelligence are aware of the impact of their actions. To address these challenges, companies are developing their AI ethics principles to mitigate the aforementioned risks.

In short, it is a must-have book for anyone who wants to lead or participate in the data-driven transformation of their company or public administration, and who wants to anticipate the challenges that will arise, thanks to the experience of all the people who have collaborated in the book.

The Malware Created in Go Is A Trend And Is Here To Stay

David García    17 August, 2021

Even though it cannot be said that Go is a new programming language (it is already more than ten years old), it does belong to that new batch of languages that, together with Rust, Typescript, Zig, etc., are becoming more and more popular in current times. Go was born, in a way, from Google’s need to find a substitute for the efficient but immensely complex C++. This complexity produced two undesirable effects in a company: very long compilation times and a shortage of engineers with a high level of knowledge, very necessary if you want to take advantage of the language (and not generate security flaws).

Thus, well-known names such as Rob Pike, Ken Thompson and others designed a programming language to cover the needs described above: a language that would be secure, without the memory errors that result in security vulnerabilities, with reduced compilation times, multiplatform, Internet-oriented and built for concurrent programming. A priori, the task seemed an insurmountable challenge, like squaring the circle.

But it worked, Go was (and is) simple, secure, powerful and modern. Its success is beyond doubt. It is already an ecosystem that has produced projects such as Docker, Kubernetes or CoreOS. Its syntax, which inherits the simplicity of the C language, has opened the doors to a huge number of programmers with its gentle learning curve and automatic memory management.

Is it this ease of use that has caused malware to adopt it as an “official” language?

One of the first opportunities to see Go used as part of a malware infrastructure came with the Mirai botnet (whose source code was released and can be read here). If you recall, Mirai focused on finding devices with easy or default passwords. It was a tremendous success, and within days of going live it had millions of nodes under its control.

Although Mirai was written primarily in the venerable C language, its control centre (C2) was written in Go.

Those who design and program malware are not much different from any other developer, except for their obviously malicious intent. In the end, it is a well-defined evolutionary process that includes updates, improvements and bug fixes, with the continuous release of new versions. Nor can we forget that the life of malware is short and ends when the antivirus industry deploys the signatures necessary for detection. New versions of the same malware can extend that window of infection by slipping under the antivirus radar, if only momentarily.

Since the creation of malware is not very different from that of ordinary software, its tools are the same and are no strangers to changing trends. Moreover, in Go malware authors have found a tool with highly desirable features: cross-platform builds, automatic memory management, unit tests integrated into the language, etc.

Not only did they find a good tool: Go binaries statically carry the libraries necessary for their execution, i.e. they do not need a specific library to be present on the system, as it is already embedded in the executable. This, however, causes binaries to reach “abnormal” sizes of several megabytes, a circumstance that cuts both ways: it allows them to be characterised, while the large size hinders automatic inspection of the binaries.

Is this enough to explain the relative success Go is having in its adoption by malware authors?

When an analyst is interested in a particular sample, he or she uses a set of tools to start to get an idea of what is in front of him or her. Between static and dynamic analysis, a picture gradually emerges of the malware’s intentions.

Analysts have decades of experience dealing with executables produced by the usual compilers and overwhelmingly made in C or C++. Go binaries are structured differently and when opening them for inspection, the analyst encounters unfamiliar territory. The tools used therefore need to adapt to this new canon in order to find an agile analysis process.

While this is happening, malware writers see an added benefit in the fact that producing a Go binary allows them to slow down analysis and increase the window of malware activity.

A small sample: two basic programs, two “Hello World”, one in C and one in Go. Let’s compare their sizes.

Now let’s look at the number of functions (symbols) carried by each of the binaries produced.

As we can see, the Go binary carries not only the program but also a group of functions from its standard library that allow the classic “Hello World” sentence to be printed on the terminal. This behaviour could be reproduced in C if we statically compiled everything needed to perform the same function, but that is not common.

However, the number of functions should not daunt an experienced analyst. It is only chaff that needs to be removed until the true root of the binary, the behaviour coded by its authors, is reached. Thus, it is just a matter of adapting the already available reverse engineering and analysis tools to the new landscape.

By the way, if we strip a Go binary (deleting debug information and symbols such as function names), the symbols disappear, just as with its C counterpart, making analysis even more difficult.

This is only the tip of the iceberg. There are many more problems for analysts to unravel: string detection, memory allocations, and so on. It all “looks” similar to the binaries produced by other compilers, but the way normal operations are performed is different. It is like speaking assembler, but with a noticeably different accent.

Is this a strong trend to follow?

The number of malware samples grows enormously every day. Obviously, many are variants of the same family, and there is a melting pot of programming languages in which they are designed and released.

As early as 2019, a study by Palo Alto Networks’ Unit 42 reported 13,000 unique malware samples written in Go. A couple of years later, the trend was confirmed by other labs, with Intezer’s report, published last March, showing a 2,000% growth over the last few years.

Not only Go

Another modern programming language is Rust, born in the heart of Mozilla and now independent of the organisation behind the Firefox browser. It, too, is not free from being used by cybercriminals, who have included it in their arsenal of tools.

Interestingly enough, Rust, a language destined to replace the complex and enormous C++, has a meme called “Rewrite it in Rust”, which consists of asking projects not written in Rust to be rewritten in this language, evidently in a playful and mocking tone.

Well, it has happened for real in the malware world with “Buer”. This malware became known written in the C language… until a few months ago, when a complete rewrite in Rust was found.

Malware is an industry, and no wonder the processes inherent to industry eventually permeate it. Its design, assembly, deployment and quality chains are all modelled on those used by legitimate organisations. As we have seen, anything that gives a competitive advantage will be adopted, including the latest trends in development.