by Maria Perez | Nov 15, 2019 | Compliance News
Microsoft has issued a further warning about the need to patch the BlueKeep vulnerability (CVE-2019-0708). Immediate patching is essential given the mass attack that began exploiting the vulnerability on October 23.
The attack was first identified on November 2, but the attacker was unable to fully exploit the vulnerability. The threat actor appears to have a low skill level and launched the campaign to deploy cryptocurrency mining malware. Microsoft has warned that things could get much worse.
The first attempt at mass exploitation attracted a great deal of media attention, but it appears to have had little effect on the urgency of patching. A scan by the SANS Institute showed that the pace of patching barely changed after the mass attack. Microsoft released the patch in May and the number of unpatched devices has fallen, yet many devices remain vulnerable to BlueKeep.
Even though the attack was extensive, it had minimal success. In most cases the exploit failed to work properly and the targeted devices simply crashed. If a skilled threat actor successfully exploited the vulnerability, it would be possible to connect to a vulnerable device via RDP without any user interaction. Code could then be executed on the compromised system, allowing the attacker to access, modify, and steal data, install malware, and launch attacks on other unpatched devices on the network, including those not exposed to the internet.
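To illustrate why internet exposure matters here, the following is a minimal sketch (not from the article) of how an administrator might check whether a host still exposes RDP's default TCP port 3389. The host name and timeout are hypothetical, and an open port alone indicates exposure, not vulnerability; patched systems also listen on 3389.

```python
import socket

def rdp_port_open(host: str, port: int = 3389, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the given RDP port succeeds.

    Note: this only shows the port is reachable; it does NOT confirm
    the host is unpatched against CVE-2019-0708.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical internal host an admin might audit:
# rdp_port_open("192.0.2.10")  -> True if port 3389 is reachable
```

A result of True would flag the host for follow-up: verifying that the May 2019 patch is installed, or disabling RDP where it is not needed.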
In 2017, security researcher Marcus Hutchins discovered and activated a 'kill switch' that limited the damage caused by the WannaCry ransomware. He is now warning that a ransomware attack could cause major disruption even without a worm component, given that the vulnerable devices are servers.
Microsoft said that although the current BlueKeep attacks have had limited impact, more dangerous exploits could be developed and used in mass attacks on vulnerable devices. Microsoft customers should identify and update all vulnerable devices immediately.
by Maria Perez | Nov 6, 2019 | Compliance News
The National Institute of Standards and Technology (NIST) has released the final version of its Big Data Interoperability Framework (NBDIF) to support the design of data analysis software that can run on almost any computing platform and be easily moved from one platform to another.
NBDIF is the result of several years of work and the collaboration of more than 800 experts from government, academia, and the private sector. The final document consists of nine volumes covering big data definitions and taxonomies, use cases and requirements, reference architecture, standards roadmap, security and privacy, a reference architecture interface, and adoption and modernization.
The primary purpose of NBDIF is to guide developers in designing and deploying broadly useful tools for big data analysis that can be used on diverse computing platforms, from a single laptop to multi-node cloud-based environments. Developers should build their big data analysis tools so that they can be readily migrated from platform to platform, and so that data analysts can switch to more sophisticated algorithms without having to retool their computing environments.
Developers can use the framework to create an agnostic environment for building big data analysis tools, so that analysts' work can keep running even as their targets change and technology improves.
The volume of data requiring analysis has increased dramatically in recent years. Data now comes from an enormous range of devices, including the many sensors connected to the internet of things. A few years ago, around 2.5 exabytes (an exabyte is a billion billion bytes) of data were generated worldwide each day. By 2025, global data generation is estimated to reach 463 exabytes per day.
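The scale of those figures can be checked with quick arithmetic. Treating an exabyte as 10^18 bytes (a billion billion), a short sketch:

```python
EXABYTE = 10**18  # one exabyte = a billion billion bytes

daily_recent = 2.5 * EXABYTE   # ~2.5 EB generated per day a few years ago
daily_2025 = 463 * EXABYTE     # projected daily generation by 2025

# The projection implies roughly a 185-fold increase in daily data volume.
growth_factor = daily_2025 / daily_recent
print(f"Projected growth: {growth_factor:.1f}x")  # → 185.2x
```

The exact multiplier matters less than the point the article makes: datasets are growing far faster than any single-machine analysis tool can keep up with.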
Data scientists can use large datasets to obtain valuable insights, and big data analysis tools will allow them to scale up their analyses from a single laptop to distributed, cloud-based environments that run across many nodes and analyze vast amounts of data.
To do that today, data analysts may have to rebuild their tools from scratch and use different programming languages and algorithms so they can run on different platforms. Use of the framework will improve interoperability and substantially reduce the burden on data analysts.
The final version of the framework includes consensus definitions and taxonomies to ensure developers understand each other when discussing options for new analysis tools, along with data security and privacy requirements and a reference architecture interface specification to guide the deployment of their tools.
The reference architecture interface specification helps vendors build flexible environments in which any tool can operate. Previously, no standard existed for developing interoperable solutions. Now there is one.
Big data analysis tools can be used in many ways, for example in drug discovery, where researchers must assess the behavior of candidate drug proteins in one round of tests, then feed that information into the next round. The ability to make changes quickly will help to speed up analyses and reduce drug development costs. NIST also suggests the tools could help analysts detect healthcare fraud more easily.
The reference architecture will let users choose whether to perform analytics using the latest machine learning and AI techniques or conventional statistical methods.