
Blockchain technology could help usher in a whole new era for UK businesses, new research has claimed.

A report from Capgemini has found the blockchain could become ubiquitous across many businesses by 2025, primarily through playing a key role in supporting supply chains across the world.

The UK currently leads the way in supporting blockchain initiatives and implementation, ahead of the USA, and businesses here are set to boost their investment by 30 percent over the next three years, the company found.

Blockchain UK boost

Overall, only three percent of organisations are currently deploying blockchain at scale (with 10 percent saying they have some kind of pilot scheme underway); however, 87 percent of respondents said they were experimenting with the technology in some form.

The study also found that cost saving (89 percent), enhanced traceability (81 percent) and enhanced transparency (79 percent) are the top three drivers behind current investments in blockchain. 

“There are some really exciting use cases in the marketplace that are showing the benefits of blockchain for improving the supply chain, but blockchain is not a silver bullet solution for an organization’s supply chain challenges,” said Sudhir Pai, chief technology officer for financial services at Capgemini. 

“Blockchain’s ROI has not yet been quantified, and business models and processes will need to be redesigned for its adoption. Effective partnerships are needed across the supply chain to build an ecosystem-based blockchain strategy, integrated with broader technology deployments, to ensure that it can realize its potential.”

TechRadar: latest computing news

As enterprises strive for ways to better interpret data, they traditionally have relied on predictive analytics. However, as business becomes more competitive, organizations have realized they need to tap into even deeper insights and are beginning to look at the next level. By going beyond predictive analytics to leverage more advanced tools, such as prescriptive analytics, enterprises are able to leverage recommended actions that address uncovered insights.

What is traditional analytics and what is advanced analytics? Is there a natural progression toward advanced or are they different deployments?

The promise of analytics has always been about freedom of information and knowledge. The challenge has been how to glean meaningful knowledge from data and present it to an end user in a way they can understand and use to make critical business decisions in a timely manner. 

Traditional analytics has focused on providing humans with tools to assist us with understanding data so we can become the interpreters of data. Human minds are very bad at detecting trends and patterns in raw data, so we create tools that give us the ability to manipulate data into visual constructs that the brain can better interpret. The next level, advanced analytics, gives us new solutions with advanced statistical techniques to uncover patterns in the data and serve up results to the end user, rather than relying on humans to build visualizations (which can be subject to bias).

There is a natural progression towards advanced analytics – it is a journey that does not have to be on separate deployments. In fact, it is enhanced by having it on the same deployment, and embedding it in a platform that brings together data visualization, planning, insight, and steering/oversight functions. 

Business analysts who start this journey will need advanced analytics features to be easy to use and accessible in their data visualization or planning tools at the click of a button. This allows enterprises to take the first step towards advanced analytics without needing significant retraining or upskilling. Users just need to know what kind of advanced analytics problem they are trying to solve and then the software should help solve it and provide the working model.

What is the main differentiator of predictive analytics, machine learning and prescriptive analytics?

This can be viewed as part of a continuum – predictive analytics is about the application of mathematics to data to uncover patterns and relationships. The question becomes “who is building these mathematical models?” 

In the case of predictive analytics, there is a human or team of humans that are explicitly building these models. Machine learning is about removing the human element from the model creation so that software can build the models on behalf of the end user, and wherever possible, automate all other steps in the advanced analytics process. 

Finally, prescriptive analytics is about taking the results of those models and making a recommended action for addressing the uncovered insights. Prescriptive analytics further removes the human element from the decision-making process and instead allows software to identify and suggest the next course of action. 
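The continuum described above can be sketched in a few lines of code. This is an illustrative toy example, not taken from the interview: the function and variable names (`forecast_demand`, `recommend_action`) are hypothetical, and the linear trend model stands in for whatever model a human analyst or a machine learning system might build. The predictive step projects a number; the prescriptive step turns that projection into a suggested action.

```python
# Toy sketch of the predictive -> prescriptive continuum.
# All names here are hypothetical, for illustration only.

def forecast_demand(history):
    """Predictive step: fit a simple linear trend and project one period ahead."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * n  # projection for the next period

def recommend_action(forecast, current_stock):
    """Prescriptive step: turn the prediction into a recommended action."""
    if forecast > current_stock:
        return f"reorder {round(forecast - current_stock)} units"
    return "hold"

sales = [100, 110, 120, 130]           # toy sales history
prediction = forecast_demand(sales)    # predictive analytics: 140.0
print(recommend_action(prediction, 125))  # prescriptive analytics: "reorder 15 units"
```

In machine learning terms, the difference is simply who builds `forecast_demand`: a human team (predictive analytics) or software that fits and selects the model automatically (machine learning), with `recommend_action` layered on top either way.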

Are they used in different business processes or is one more advanced than the other?

They are used in the same business process, and enable users to automate at significant scale. However, prescriptive analytics makes possible the pursuit of new business processes and those are definitely more advanced than what a user can do with traditional analytics or even predictive analytics on its own. 

For example, if someone can automate the end-to-end decision-making process with prescriptive analytics based on live data coming from sensors in the manufacturing plant, why not take that one step further? Use the software to automatically take action and record the results of those actions without human assistance. The human effort is then focused on a higher level, ensuring process execution is effective in the aggregate.

Are businesses currently utilizing both or just one?

Businesses are using a mix of both to varying degrees of maturity. Some applications have embedded predictive models developed with machine learning techniques, which expose the results directly in the application and deliver suggested recommendations for the next action to take. These sit side by side with the workflow-driven approach that would be more common in an organization.

What are the biggest barriers to successfully leveraging predictive analytics across an organization? 

The biggest barrier has been a lack of human resources capable of building predictive analytics in a usable timeframe. In many ways, this is similar to what the traditional analytics market experienced. Data was locked away in databases, and only those database administrators who could write structured query language (SQL) were able to access it. Because of that shortage, and the importance of that information, tools became available to make it possible for a non-technical person to access that data and create meaningful visualizations.

What are the challenges of prescriptive analytics and how are some businesses preparing to offset those?

The most obvious challenge with regards to prescriptive analytics is governance. If you are removing humans from the decision-making process and your model is flawed (either because the situation has changed, or because you have outlier situations), then you will not only make mistakes, but you will make them automatically and at scale. In areas of sensitivity or human risk, it is becoming common to enable a human with the insights and benefits of prescriptive analytics, but ultimately the human makes the final decision.
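The human-in-the-loop pattern described above can be sketched as a simple approval gate. This is a hypothetical illustration, not drawn from the interview: the names (`execute`, `approve`, the `0.7` risk threshold) are assumptions. Low-risk recommendations run automatically; anything above the threshold is held unless a human reviewer signs off.

```python
# Hypothetical sketch of prescriptive-analytics governance:
# the model proposes, but a human approves high-risk actions.

HIGH_RISK = 0.7  # assumed threshold; a real system would tune this

def execute(recommendation, risk_score, approve):
    """`approve` is a callable standing in for a human reviewer."""
    if risk_score >= HIGH_RISK:
        # Sensitive action: require explicit human sign-off.
        return recommendation if approve(recommendation) else "escalated: held for review"
    return recommendation  # low-risk actions run automatically

# A low-risk action executes without review; a high-risk one does not.
print(execute("throttle job", 0.2, approve=lambda r: False))    # throttle job
print(execute("shut down line", 0.9, approve=lambda r: False))  # escalated: held for review
```

The design choice here mirrors the text: automation scales both good decisions and mistakes, so the gate confines fully automatic execution to the cases where a flawed model does the least damage.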

How can companies prepare to go from predictive analytics to prescriptive?

Companies can prepare for this by changing the way they frame traditional analytics questions. Instead of thinking only about how to get a result to answer a business question, companies need to think about the new types of business questions they can ask. In other words, if you could look at historical patterns over time and project what is likely to occur in future periods, where would you apply this? Next, empower business analysts to actually solve these problems with tools that give them the ability to easily use machine learning techniques to develop the models. 

David Judge, Vice President of Leonardo | Analytics at SAP 


The fact that Samsung is working on an in-screen fingerprint reader for the Galaxy S10 has been well documented, but it’s clear now that Samsung has a gamut of in-screen features in the works.

A photo posted to Twitter by user Ice Universe during Samsung Display’s ‘2018 Samsung OLED Forum’ event in China reveals the entire 2019 product roadmap of Samsung’s display division. Of course, the long-rumored in-screen fingerprint reader is shown clearly on the presentation slide, but so are three other exciting developments.

The slide briefly describes an Under Panel Sensor, or UPS, that will likely house a camera (i.e. image sensor) in addition to other sensors. There are also plans to implement haptics beneath the screen, or HoD, which we imagine would work similarly to Apple’s 3D Touch on iOS devices. Finally, the slide reveals plans for sound-on-display technology, or SoD, as in front-facing speakers and earphones beneath the display.

This collection of features could be the key to truly bezel-less, 100% screen phones and tablets (perhaps even laptops). Up until now, these nearly all-screen smartphone designs have required ‘notches’ of various sizes.

Now, it’s well known that the in-screen fingerprint reader is already available in smartphones, but those phones are currently exclusive to China and surrounding regions. This expansion of in-screen technologies, from a global technology firm, could allow for all sorts of new device form factors, especially if Samsung Display licenses out the technology to prospective buyers.

As for when we can expect to see these developments in products on the shelves, it could be as early as 2019, especially the well-established fingerprint reader, but more likely to be 2020.

  • Everything you need to know about OLED displays

Via MSPowerUser


After decades of an eager and unbridled embrace of the online ecosystem, companies and consumers are becoming painfully aware of the vulnerabilities associated with the digital age. A relentless series of high-profile data breaches, cases of corporate mismanagement, and tragic insider thefts are causing many people to question the efficacy of our relationship with the internet and its abundance of immersive platforms.

As a result, positive brand awareness is quickly becoming intertwined with competent data protection, and, as expected, many companies are struggling in this regard. 

Big data has big problems

For instance, in March, Facebook launched an ambitious public relations campaign intended to help the company restore its damaged reputation after the often-reported Cambridge Analytica scandal revealed that an unprecedented level of mismanagement and ignorance allowed a third-party to siphon the personal data of 87 million users.

To support the initiative, Facebook CEO Mark Zuckerberg released a cogent statement explaining, “We have a responsibility to protect your data, and if we can't then we don't deserve to serve you.”

Nevertheless, the company closed out September with news of yet another data breach, this time compromising 50 million accounts. Publicly, Facebook says that it’s prioritizing data security, but protecting their users’ data continually proves challenging. 

To be sure, Facebook is not alone. Yahoo, Equifax, Under Armour, and even the U.S. Department of Defense and the U.S. electrical grid have endured network breaches.

For all their problematic peculiarities, each of these incidents shares a common theme: they were perpetrated by malicious external actors. Unfortunately, these and other organizations are also frequently plagued by numerous internal risks that threaten to undermine data security. Internal threats like accidental sharing or intentional data theft actually lead to more data loss than exploitation of software vulnerabilities and external threats combined.

What’s more, since social media conglomerates aren’t the only companies with significant amounts of data to protect, this shifting and increasingly perilous digital landscape can seem daunting for companies to navigate.

A problem with a solution

Fortunately, the current data loss prevention crisis is not a problem without a solution. In fact, there are tangible steps that any company can take to secure their data, protect their brand, and promote their bottom line. 

Data visibility

According to Forbes, 2.5 quintillion bytes of data are created every day, a number that is only growing as the internet becomes more ingrained in every platform, service, and activity. Companies contribute a significant share of that data, and they are responsible for managing and securing their portion.

Everything from the plethora of emails sent each day to the sensitive customer information that underpins modern platforms collectively creates this incredible amount of data, and companies need visibility into this information to protect its integrity. 

It’s common for many employees to transfer data across multiple platforms through several different access points. Data visibility initiatives can provide companies with a real-time snapshot of their data landscape and security vulnerabilities, allowing them to make informed and immediate decisions to protect their data. 

Nobody can afford to be ignorant about their data, and a software-empowered snapshot can help IT administrators best understand and respond to their active data environment.

Activity monitoring

User activity monitoring software can provide companies with a more detailed view of user activity with sensitive data, with the ability to identify usage patterns, unintentional misuse, and intentional data exfiltration attempts. 

As a result, using employee monitoring software to create, implement, and enforce company policies on technology use and data handling is a highly effective way for organizations to protect their information against insider threats.

In a data environment where compromise can be catastrophic, all avenues for data protection should be available to IT administrators. 

User behavior analytics

Advancements in machine learning are making it possible to combine data visibility and activity monitoring activities into a comprehensive user behavior assessment. 

When employees and team members become insider threats, user behavior analytics can indicate behavioral changes and other markers that can help identify and address potential threats. Real-time identification allows companies to pursue quick investigations to mitigate the risk of intentional or accidental data theft. 
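A minimal sketch of the behavioral-change detection described above might compare a user's activity today against their own historical baseline. This is an assumption-laden illustration, not a description of any real product: the names (`is_anomalous`, the z-score threshold of 3.0, "files copied per day" as the metric) are all hypothetical stand-ins for whatever signals a user behavior analytics system actually tracks.

```python
import statistics

# Hypothetical user-behavior-analytics check: flag activity that deviates
# sharply from a user's own baseline, as a precursor to a security alert.
def is_anomalous(baseline, today, threshold=3.0):
    """Return True if today's count is more than `threshold` standard
    deviations above the user's historical mean."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    z = (today - mean) / stdev
    return z > threshold

transfers = [12, 9, 11, 10, 13, 12, 10]  # files copied per day, a normal week
print(is_anomalous(transfers, 11))   # False: within the usual range
print(is_anomalous(transfers, 120))  # True: possible exfiltration attempt
```

Real systems combine many such signals with machine-learned models rather than a single z-score, but the principle is the same: the user's own history defines "normal," and real-time deviations trigger investigation rather than automatic punishment.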

A differentiating factor

Increasingly, companies can differentiate their business model and their brand by providing a secure and intelligent data monitoring apparatus. It can be difficult or impossible to repair a brand’s image after a data breach – look no further than Facebook’s public relations nightmare and its subsequent $120 billion devaluation – but companies that secure their information on the front end can benefit from the inverse effect: a brand boost predicated on integrity and intentionality.

The current technological landscape may be perilous, but there are clear steps that organizations can take to better protect their data and support their customers, employees, and bottom lines.

Isaac Kohen, Founder and Chief Technology Officer of Teramind 


The successor to Samsung’s take on the Windows tablet is here: the Samsung Galaxy Book 2, and it’s got a new friend under the hood – a Qualcomm processor. The two companies made the announcement during an in-depth panel discussion in New York City.

Of course, the Galaxy Book 2’s marquee feature is LTE connectivity through the modem within Qualcomm’s Snapdragon 850 system on a chip, or SoC. However, another major point for this 2-in-1 tablet is that the essential stylus and keyboard cover accessories come in the box.

That box, complete product and all, will cost $999 (about £757, AU$1,399) in the US when it launches on November 2 through AT&T and the Microsoft and Samsung online stores. The tablet is also going straight to AT&T, Sprint and Verizon stores later this month. Naturally, you'll need a data plan from one of these carriers if you want that connectivity anywhere.

This is an awfully impressive price given what appears to be on offer on paper, if only for including those arguably crucial accessories.

Galaxy Book 2 swims with the tide

Previous iterations of the Galaxy Book, and Samsung Windows tablets like it, went against the trend of kickstand designs in favour of ones that rely more on the keyboard for their various modes of use. Now, the Galaxy Book 2 has gotten in line with the overwhelming majority in its design.

This may make the Galaxy Book 2 a bit less unique in its look and feel, but it may well have been a worthy decision given feedback on previous models provided by many, including yours truly.

As for the tablet’s speeds and feeds, you’re of course looking at a Qualcomm Snapdragon 850 SoC backed up by 4GB of memory and 128GB of solid-state storage. This all sits behind a 12-inch Super AMOLED display with a 2,160 x 1,440-pixel resolution.

Samsung and Qualcomm promise a battery life of up to 20 hours from the tablet. However, this number was achieved using Windows 10 in S Mode, which the tablet ships with.

On the outside, the tablet sports two USB-C 3.1 ports, a microSD card reader and a headphone jack.

Of course, Qualcomm’s struggles with launching its computing platform are well documented, so only a full review will tell whether this is the product to turn things around for it.
