Digital Strategies:  Are Analytics Disrupting the World?

Keith Manthey

CTO of Analytics at EMC Emerging Technologies Division

“Software is eating the world.”  It is a phrase we often see written but sometimes do not fully understand.  More recently I have read derivations of that phrase positing that “analytics are disrupting the world.”  Both phrases hold a lot of truth.  But why?  Some of the major disruptions of the last five years can be attributed to analytics.  Most companies that serve as an intermediary, such as Uber or Airbnb, with a business model of making consumer and supplier “connections,” are driven by analytics.  Pricing surges, routing optimizations, available rentals, available drivers, and so on are all algorithms to these “connection” businesses that are disrupting the world.  It could be argued that analytics is their secret weapon.

It is normal for startups to make new, and sometimes risky, bets on technologies like Hadoop and analytics.  The trend is carrying over into traditional industries and established businesses as well.  What are the analytics use cases in industries like Financial Services (FSI)?

Established Analytics Plays in FSI

Two use cases naturally come to mind when I think of “analytics” and “financial services”: high-frequency trading and fraud detection, both of which have long utilized analytics and are well documented for doing so.  I recently blogged (From Kinetic to Synthetic) on behalf of Equifax regarding market trends in synthetic fraud.  Beyond these obvious examples, though, where are analytics impacting the Financial Services industry?  What use cases are relevant and impactful in 2016, and why?

Telematics

The insurance industry has been experimenting for several years with opt-in programs that monitor driving behavior, collectively known as telematics or usage-based insurance.  Insurance companies hold varying opinions of its usefulness, but it is clear that driving behavior is (1) a heavy use of unstructured data and (2) a dramatic leap from the statistics-based approach built on financial data and actuarial tables.  Telematics borrows a belief long held in other verticals such as fraud: pin behavior down to an individual pattern instead of trying to predict broad swaths of patterns.  More precisely, telematics looks to derive a “behavior of one” rather than a generalized driving pattern for a thousand individuals.  To see why this differs from past insurance practice, compare the two methods directly.  Method one: historical actuarial tables of life expectancy, combined with demographic and financial data, to denote risk.  Method two: how ONE individual actually drives, based on real driving data received from their car.  Which is more predictive of the expected rate of accidents is the question for analytics.  While this is a gross over-simplification of the entire process, it is a radical shift in both the types of data and the analytical methods of deriving value from the data available to the industry.  Truly transformational.
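To make the “behavior of one” idea concrete, here is a minimal sketch of how raw opt-in driving events might be aggregated into an individual profile rather than a population-wide average. The telemetry fields and values are invented for illustration:

```python
from collections import defaultdict

# Hypothetical telemetry records: (driver_id, speed_mph, hard_brake_flag)
events = [
    ("driver_a", 62, 0), ("driver_a", 71, 1), ("driver_a", 66, 0),
    ("driver_b", 48, 0), ("driver_b", 52, 0), ("driver_b", 50, 1),
]

def driver_profiles(records):
    """Aggregate raw telemetry into a per-driver 'behavior of one' profile."""
    acc = defaultdict(lambda: {"n": 0, "speed_sum": 0.0, "hard_brakes": 0})
    for driver, speed, brake in records:
        a = acc[driver]
        a["n"] += 1
        a["speed_sum"] += speed
        a["hard_brakes"] += brake
    # Each driver gets their own averages, not a blended population figure.
    return {
        d: {"avg_speed": a["speed_sum"] / a["n"],
            "hard_brake_rate": a["hard_brakes"] / a["n"]}
        for d, a in acc.items()
    }

profiles = driver_profiles(events)
```

A real program would of course compute far richer features (time of day, cornering forces, trip context), but the shape is the same: one profile per insured individual, derived from their own data.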

Labor Arbitrage

The insurance industry has also been experimenting with analytics based on past performance data.  The industry holds years of predictive information (i.e., claim reviews along with their actual outcomes) from past claims.  By exploring this past performance data, insurance companies can apply logistic regression to derive weighted scores, and those scores are then analyzed to determine a path forward.  For example, if claims scoring greater than 50 were almost always paid after evaluation, then all claims scoring above 50 should be immediately approved and paid.  The inverse also holds: low-scoring claims can be quickly rejected, since they are rarely appealed and are regularly upheld as denials when they are.  The analytics of each present case are compared against the corpus of past outcomes to derive its most likely outcome.  The resulting business effect is that the workforce reviewing medical claims is given only the files that genuinely need to be worked, improving workforce productivity.  Labor arbitrage, with data and analytics as the disruptor of workforce trends.
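As a sketch of the scoring-and-routing idea described above (the feature names, weights, and thresholds here are invented for illustration, not taken from any real insurer's model):

```python
import math

# Hypothetical weights, as if learned offline by logistic regression on
# past claim reviews and their outcomes.
WEIGHTS = {"claim_amount_k": -0.08, "prior_claims": -0.4, "in_network": 1.2}
BIAS = 1.5

def claim_score(features):
    """Score 0-100: modeled likelihood a claim would be approved on review."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 100 / (1 + math.exp(-z))  # logistic function scaled to 0-100

def route(score, approve_above=80, reject_below=20):
    """Auto-handle the confident extremes; only the ambiguous middle
    band is sent to a human reviewer."""
    if score > approve_above:
        return "auto-approve"
    if score < reject_below:
        return "auto-reject"
    return "manual review"
```

The workforce effect falls directly out of `route`: the review team only ever sees claims landing in the middle band.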

Know Your Customer

Retail banking has turned to analytics in its focus on attracting and retaining customers.  After a large wave of acquisitions in the last decade, retail banks are working to integrate their various portfolios.  In some cases, resolving the identity of a client across all of their accounts is not as straightforward as it sounds, especially with dormant accounts that might carry maiden names, mangled data attributes, or old addresses.  The ultimate goal of co-locating all customer data in an analytics environment is a “customer 360”: full insight into each customer.  That insight can surface upsell opportunities by understanding a customer’s peer set and which products a similar demographic shows strong interest in.  For example, if individuals of a given demographic typically subscribe to three of a company’s five products, an individual matching that demographic who subscribes to only one should be targeted for upsell on the others.  This uses large swaths of data and a company’s own product-adoption history to build upsell and marketing strategies for its existing customers.  If someone was both a small-business owner and a personal consumer of the retail bank, the bank may not previously have tied those accounts together.  Identity resolution gives the bank a whole new perspective on who its customer base really is.
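A minimal sketch of the peer-set upsell logic, using invented customers and products:

```python
from collections import Counter

# Hypothetical product holdings after identity resolution (customer 360).
holdings = {
    "cust_1": {"checking", "savings", "credit_card"},
    "cust_2": {"checking", "savings", "mortgage"},
    "cust_3": {"checking", "savings", "credit_card"},
    "cust_4": {"checking"},  # same demographic, holds only one product
}
peer_group = ["cust_1", "cust_2", "cust_3"]

def upsell_targets(customer, peers, min_adoption=0.5):
    """Products popular with a customer's demographic peers that the
    customer does not yet hold."""
    counts = Counter(p for peer in peers for p in holdings[peer])
    popular = {p for p, c in counts.items() if c / len(peers) >= min_adoption}
    return popular - holdings[customer]

targets = upsell_targets("cust_4", peer_group)
```

Here two of three peers hold a credit card and all hold savings, so `cust_4`, who has only a checking account, becomes an upsell target for both.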

Wrap Up

Why are these trends interesting?  In most of the cases above, people are familiar with certain portions of the story, but the underlying why or what often gets missed.  It is important to understand not only the technology and capabilities involved in transformation, but also the underlying shift being caused.  EMC has a long history of helping customers through these journeys, and we look forward to helping even more clients face them.

MLBAM Goes Over the Top: The Case for a DIY Approach to OTT

James Corrigan

Advisory Solutions Architect at EMC

When looking at the current media landscape, the definition of what constitutes a “broadcaster” is undergoing a serious overhaul. Traditional linear TV might not be dead just yet, but it is clearly having to reinvent itself in order to stay competitive amid rapidly evolving media business models and increasingly diverse content distribution platforms.

The concept of “binge watching” a TV show, for example, was non-existent only a few years ago. Media consumption is shifting toward digital and online viewing on a myriad of devices such as smartphones, tablets and PCs. Subscription on-demand services are becoming the consumption method of choice, while broadcast-yourself platforms like Twitch and YouTube are fast becoming a cornerstone of millennials’ viewing habits. Horowitz Research found that over 70 percent of millennials have access to an OTT SVOD service, and that they are three times as likely to have an OTT SVOD service without a pay-TV subscription. PricewaterhouseCoopers (PwC) estimates that OTT video streaming will grow to be a $10.1 billion business by 2018, up from $3.3 billion in 2013.

As a result, broadcast operators are evolving into media aggregators, and content providers are transforming into “entertainment service providers,” expanding into platforms ranging from mobile to digital to even virtual theme parks.

Building Versus Buying

This change in media consumption requires media organizations to consider a more efficient storage, compute and network infrastructure. Media organizations need flexible and agile platforms not only to expand their content libraries but also to meet the dynamic growth in the number of subscribers and in how they consume and experience media and entertainment.

Competing successfully in the OTT market depends on the “uniqueness” of your service to the consumer. That uniqueness comes either from having unique or exclusive content, or from having a platform that can adapt and offer the customer more than just watching content. For the latter, how you deploy your solution is key to success: (1) build your own (“DIY”), (2) buy a turn-key solution, or (3) take a hybrid approach.

MLBAM Hits a Home Run with a DIY Approach

A key advantage of the DIY approach is that it increases business agility, allowing media organizations to adapt and change as consumers demand more from their services. For some media organizations, this allows them to leverage existing content assets, infrastructure and technology teams while keeping deployment costs low. Further, layering OTT video delivery on top of regular playout enables organizations to incrementally add the new workflow to the existing content delivery ecosystem. For new entrants, the DIY approach enables new development methodologies, allowing these “new kids on the block” to develop micro-services unencumbered by legacy services.

One example of an organization taking the DIY approach is Isilon customer Major League Baseball Advanced Media (MLBAM), which has created a streaming media empire. MLBAM’s success underscores the voracious and rapid growth in consumer demand for streaming video; it streams sporting events, and also supports the streaming service HBO GO, as well as mobile, web and TV offerings for the NHL.

“The reality is that now we’re in a situation where digital distribution isn’t just a ‘nice to have’ strategy, it’s an essential strategy for any content company,” said Joe Inzerillo, CTO for MLBAM. “When I think about…how we’re going to be able to innovate, I often tell people ‘I don’t manage technology, I actually manage velocity.’ The ability to adapt and innovate and move forward is absolutely essential.”

Alternatively, the turn-key approach, which either outsources your media platform or gives you a pre-built video delivery infrastructure, can offer benefits such as increased speed to market. However, selecting the right outsourcing partner is critical; choose incorrectly and you risk vendor lock-in, loss of control and flexibility, and higher operational costs.

Making it Personal: Analytics’ Role

Being able to access content when and where consumers want it, on the device they want, is one part of the challenge posed by the rise of digital and online content. Another key component is personalizing that content for viewers. Making content more relevant and tailored to subscribers is critical to the success of alternative broadcast business models, and EMC and Pivotal are helping media companies extract customer insights through the development and use of analytics, which should be key to any OTT strategy. Analyzing data on what consumers are watching can drive both content acquisition and personalized recommendation engines. Targeted ad insertion adds the further benefit of increasing revenue through tailored advertisements.
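The recommendation idea can be sketched with a simple co-watching heuristic. The users and titles below are invented, and production engines use far richer models, but the core signal is the same: suggest what similar viewers watched that you have not.

```python
from collections import Counter

# Hypothetical viewing histories (user -> set of titles watched).
history = {
    "u1": {"ballgame", "highlights", "documentary"},
    "u2": {"ballgame", "highlights"},
    "u3": {"ballgame", "documentary"},
}

def recommend(user, histories, top_n=2):
    """Recommend unseen titles, weighted by how much each other viewer's
    history overlaps with this user's (a crude taste-similarity score)."""
    seen = histories[user]
    scores = Counter()
    for other, titles in histories.items():
        if other == user:
            continue
        overlap = len(seen & titles)
        for title in titles - seen:
            scores[title] += overlap
    return [title for title, _ in scores.most_common(top_n)]
```

For `u2`, both other viewers share watched titles and both watched the documentary, so it surfaces as the top recommendation.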

Scaling for the future

An infrastructure platform that scales is the final consideration for new-age media platforms. Being able to scale “apps” based on containers or virtual instances is key. To do that you need a platform that scales compute, network and storage independently or together, like EMC’s scale-out NAS with Isilon or scale-out compute with VCE or VxRail/VxRack. MLBAM’s Inzerillo explains: “The ability to have a technology like Isilon that’s flexible, so that the size of the data lake can grow as we onboard clients, is increasingly important to us. That kind of flexibility allows you to really focus on total cost of ownership of the custodianship of the data.”

Inzerillo continues, “If you’re always worried about the sand that you’re standing on, because it’s shifting, you’re never going to be able to jump, and  what we need to be able to do is sprint.”

It’s an exciting time to be in the ever-evolving media and entertainment space. The breadth of offerings that broadcasters and media companies are developing today, and the range of devices and distribution models for reaching subscribers, will only continue to grow.

Infrastructure Convergence Takes Off at Melbourne Airport

Yasir Yousuff

Sr. Director, Global Geo Marketing at EMC Emerging Technologies Division

By air, by land, or by sea? Which do you reckon is the most demanding means of travel these days? In asking this, I’d like to steer your thoughts to the institutions and businesses that provide transportation in each of these segments.

Hands down, my pick would be aviation, where the heaviest burden falls on any international airport operating 24/7. Take Melbourne Airport in Australia, for example. In a typical year, some 32 million passengers transit through its doors – almost a third more than Australia’s entire population. And if you think that’s a lot, that figure looks set to double to 64 million by 2033.

As the threat of terrorism grows, so will the criteria for stringent checks. And as travelers become more affluent, so will their expectations. Put the two together and you get somewhat of a paradoxical dilemma that needs to be addressed.

So how does Australia’s only major 24/7 airport cope with these present and future demands?

First Class Security Challenges

Beginning with security, airports have come to terms with the fact that passport checks alone are no longer sufficient in the immigration process. Thanks to Hollywood’s depictions of how easy it is to obtain “fake” passports – think Jason Bourne, but as a “bad” guy out to harm innocents – a large majority of the public would agree that more detailed levels of screening are a necessity.

“Some of the things we need to look at are new technologies associated with biometrics, new methods of running through our security and our protocols. Biometrics will require significant compute power and significant storage ability,” says Paul Bunker, Melbourne Airport’s Business Systems & ICT Executive.

With biometrics, Bunker is referring to breakthroughs such as fingerprint and facial recognition. While these data-dense technologies are typically developed in silos, airports like Melbourne Airport need them to function coherently as part of an integrated security ecosystem, processed in near real-time so that authorities have ample time to respond to threats.

First Class Service Challenges

Then there are the all-important passengers who travel in and out for a plethora of reasons: some for business, some for leisure, and some on transit to other destinations.

Whichever the case, most, if not all of them, expect a seamless experience: freedom from long waits to clear immigration, luggage arriving at the belt almost immediately after, and so on.

With the airport’s IT systems increasingly strained in managing these operational outcomes, a more sustainable way forward is inevitable.

First Class Transformative Strategy

Melbourne Airport has historically been more reactive, focused heavily on maintenance, but that has changed in recent times. Terminal 4, which opened in August 2015, became the airport’s first terminal to embrace digital innovation, boasting Asia-Pacific’s first end-to-end self-service model, from check-in kiosks to automated bag-drop facilities.

This comes against the backdrop of a new charter that aims to enable IT to take on a more strategic role and drive greater business value through technology platforms.

“We wanted to create a new terminal that was effectively as much as possible a fully automated terminal where each passenger had more control over the environment,” Bunker explained. “Technical challenges associated with storing massive amounts of data generated not only by our core systems but particularly by our CCTV and access control solutions is a major problem we had.”

First Class Solution

In response, Melbourne Airport implemented two VCE Vblock System 340s with a VNX5600, a converged infrastructure solution featuring 250 virtual servers and 2.5 petabytes of storage capacity. Two EMC Isilon NL-Series clusters were further deployed across two sites for production and disaster recovery.

The new converged infrastructure has allowed Melbourne Airport to dramatically simplify its IT operations, creating a comfortable buffer able to support future growth as the business matures. It has also guaranteed high availability for key applications like baggage handling and check-in, crucial to the development of Terminal 4 as a fully automated self-service terminal.

While key decision-makers may have a rational gauge on where technological trends are headed, their foresight is far from 100%. These sweeping reforms have effectively laid the foundations for flexibly adopting new technologies across the board – biometrics for security, analytics for customer-experience enhancement – whenever the need arises. Furthermore, the airport can now do away with separate IT vendors, reducing management complexity.

Yet all this pales in comparison to the long-term collaborative working relationship Melbourne Airport has forged with EMC to support its bid to become an industry-leading innovation driver of the future.

Read the Melbourne Airport Case Study to learn more.

Digital Health Strategies – An introduction to Elastic Cloud Storage (ECS)

Nathan Bott

Healthcare Solutions Architect at EMC

This past April, my father reached two important milestones – he turned 70 and retired from a 40-plus year career in food science.  He is now planning to head back to Spain to complete the Camino de Santiago – or the Way of St. James – a journey he started in 2014.  Unfortunately, he had to stop 150 miles into the 500-mile trek because of severe back and hip pain due to the emergence of degenerative disc disease.  After working with his physician to manage this new condition, he started to prepare for the upcoming trip by walking between 5 and 10 miles three times a week.  Along with this training came other ailments that would be expected for anybody his age: pulled muscles, strained knees, and “light-headedness.”  This last ailment can be attributed to another condition he happens to have – Type 2 Diabetes.  And so it goes: as he gets older and tries to maintain a high level of activity, he will suffer more ailments and spend more time and money (via Medicare benefits) managing these chronic conditions.

And he will not be alone.  My father was born in 1946 and is thus a first year baby-boomer, the first wave of new Medicare beneficiaries in which about 10,000 enroll every day.  The Congressional Budget Office expects over 80 million Americans will be Medicare eligible by 2035, an almost 50% increase in enrollment from 2015.  The cost per beneficiary is expected to increase even more as each patient will have multiple chronic conditions to manage; per the National Council on Aging:

  • About 68% of Medicare beneficiaries have two or more chronic diseases and 36% have four or more.
  • More than two-thirds of all health care costs are for treating chronic diseases.

The US government and the healthcare industry are well aware of the current “silver tsunami” and planning has been underway.

For the past seven years, since the passage of the HITECH provisions of the American Recovery and Reinvestment Act (ARRA) in 2009 and the Medicare Shared Savings Program (MSSP) in 2011, the groundwork has been laid for various programs and incentives that distribute the effort of managing the cost of delivering healthcare to an ever-expanding beneficiary population.  The prolific adoption of electronic health record technology by healthcare providers, combined with the reorganization of provider reimbursements from fee-for-service to an outcomes-based model, has become a catalyst for a digital revolution in healthcare.

Government-led healthcare reform programs like Accountable Care Organizations (ACOs), the Patient-Centered Medical Home, and the Precision Medicine Initiative are predicated on having a digital technology platform that can use the demographic, financial, clinical and genetic data acquired from a vast population of patients to develop evidence-based plans of care tailored to the genetic disposition and the disease(s) of a given patient.

Regardless of the industry, product or service, a disruptive technology that drives innovation through digitization requires a re-assessment of the infrastructure that supports it; the healthcare industry is no different.  As healthcare providers have implemented electronic medical record systems, deployed enterprise imaging solutions, piloted next-generation sequencing programs, and developed clinical informatics capabilities, new infrastructure requirements and operating modes have emerged.  Furthermore, in response to the evolving markets and reimbursement models explained above, many healthcare entities – providers, payers, and pharmaceutical companies alike – have consolidated through mergers and acquisitions, which also necessitates re-evaluating infrastructure architectures in order to rationalize operational capabilities, drive utilization efficiency and decrease both operational and capital costs.

Working directly with healthcare customers, collaborating with healthcare software vendors, and partnering with IT service providers, EMC has been on the front line to provide architectural guidance and infrastructure solutions to support this digital revolution and its emerging infrastructure requirements. A key infrastructure solution to support the digitization revolution in healthcare is a highly durable, geo-distributed, performant storage platform that will work with legacy monolithic systems using file system interfaces as well as cloud-native distributed applications using standard storage APIs like AWS S3 or OpenStack Swift.
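As a hedged illustration of what a “standard storage API” means in practice, the sketch below stores an object on an S3-compatible endpoint using the third-party boto3 SDK. The host name, bucket, credentials, and key layout are invented; the default ECS S3 ports (9020 for HTTP, 9021 for HTTPS) are an assumption about a typical deployment:

```python
def ecs_endpoint(host, port=9020, secure=False):
    """Build the S3-compatible endpoint URL for an object store such as ECS.
    Ports 9020/9021 are assumed defaults for HTTP/HTTPS S3 access."""
    scheme = "https" if secure else "http"
    return f"{scheme}://{host}:{port}"

def put_study(host, bucket, study_id, payload, access_key, secret_key):
    """Write an object through the standard S3 API. boto3 (the AWS SDK)
    is imported lazily here so the URL helper above has no dependencies;
    the same client code works against any S3-compatible endpoint."""
    import boto3  # third-party package: pip install boto3
    s3 = boto3.client(
        "s3",
        endpoint_url=ecs_endpoint(host),
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
    )
    s3.put_object(Bucket=bucket, Key=f"studies/{study_id}", Body=payload)
```

The point of the sketch is portability: an application written against the S3 API in this way does not care whether the bytes land in AWS or in an on-premises object platform.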

EMC’s Elastic Cloud Storage (ECS) system is a modern object storage platform that does just that…and more.  Just as important, the ECS object platform can support a myriad of use cases specific to the healthcare industry:

  • Innovative technology platforms which enable coordinated and accessible medical services such as outlined by the Patient-Centered Medical Home program
  • Collaboration and data sharing as needed for programs such as the Accountable Care Organization initiative
  • An increase in IT operational agility using a storage platform that can be provisioned with cloud-based APIs
  • A decrease in costs through storage utilization efficiency at scale using modern data protection and replication methods

In my follow-up blog entries here, I will provide more details on the functional capabilities of ECS and map those capabilities to the specific use cases driving the digital revolution: taking on the challenge of delivering collaborative, personalized healthcare services to an aging population with multiple complex chronic conditions, while driving down IT operational costs as well as the overall cost of the healthcare system.

Examples of the use cases I mentioned above include various new technology trends like the emerging Internet of Things (IoT) solutions that support remote patient monitoring, telehealth, and behavior modification tools to help manage chronic diseases; data lake functionality with the Hadoop ecosystem for population and precision health based analytics programs; and cloud-native development efforts to launch distributed mobile applications that can capture and access data from any location.

I look forward to exploring these use cases and examining how ECS’s unique capabilities will help our healthcare customers move towards meeting their technical, operational, and “digitized-mission” goals.

A Co-Design of Hardware and Software: Part 3 – How Do You Connect to 18,000 NAND Die in Parallel?

Donald Wake

Consultant Technical Marketing Engineer at EMC | DSSD

DSSD D5 and the world’s first and largest NVMe over PCIe Mesh Fabric

In this third and final installment of our three-part blog series, “A Co-Design of Hardware and Software,” we will literally open up the box and show you all the hardware that makes the D5 work.  In Part 1 of the series, “A Flood of Data Needs Flood Software,” we discussed and linked to Mike Shapiro’s overview video explaining how the EMC DSSD D5 Rack-Scale Flash solution delivers both extreme performance and the lowest possible latency using a radical new approach to hardware and software co-design.

In Part 2 of the series, we discussed and linked to Jeff Bonwick’s video showcasing EMC DSSD’s patented Cubic RAID algorithm.  We learned how Cubic RAID makes the DSSD D5 “The Safest Place In Flash For Your Data” while also being the fastest storage appliance on Earth!

In this final installment, Chris Frank, Senior Director of Hardware at DSSD, peels back the metal and shows you the inside of the world’s largest, fastest NVMe over PCIe Mesh Fabric.  As Chris shows, the EMC DSSD D5 is a quantum leap in shared flash storage technology that required some incredible new hardware inventions, including:

  • The world’s densest flash modules, with 512 distinct NAND die per FM
  • Dual-ported Gen 3 PCIe x4-lane client cards that can be installed in up to 48 servers
  • I/O modules that provide 48 PCIe connections each
  • An ultra-dense PCIe connector and cable to connect each client port to a DSSD D5 IOM port
  • Separation of control and data planes to prevent I/O bottlenecks

Along the way, Chris shows how these elements are shareable across servers without traditional, high-latency networking technology.

Delivering the Best Patient Care Possible

Barry Morris

Sales DVP at EMC Federal Division

Military Health – From Volume to Value

The digital world is driving us to act as both patients and consumers of healthcare information. For example, wearable devices such as Fitbits provide users with personalized data and the insight required to make more informed lifestyle decisions.

At the same time, healthcare providers are shifting to value-based care – from “pay per pill” to “pay for performance” and “pay for outcomes” – while working to meet meaningful use goals. Comprehensive patient and population health data, along with the data collection opportunities enabled by the Internet of Things (IoT), give healthcare providers – including military providers – the chance to attain new insights and deliver the best care possible.

In a recent article in Health IT Analytics, EMC’s Roberta Katz writes, “In the current accountable care environment, where electronic health record documentation is being prioritized, this new realm of patient generated data can build on a caregiver’s clinical expertise and augment hospital protocols…With the use of IoT tools and sensors, we can review our own data in real-time, from the number of steps we are taking, cardio output, sleep cycles, blood pressure, and even mood, to become an ‘empowered’ patient.”

The Military Health System (MHS) (http://www.health.mil/) is working toward transformative goals, including the results Katz describes, through a disruptive application migration from its current system (AHLTA) to the new Cerner-based MHS GENESIS, targeted for launch at the end of this year. The goal: the ability to share health records electronically and document the complete continuum of care between MHS locations, private providers, and possibly Veterans Affairs.

Centralized data collection and analysis as part of an extended Electronic Health Record (EHR) system can provide a picture otherwise impossible to obtain – bringing together disparate information that alone does not raise an alert, but pulled together, can signal a need for intervention.
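The “alone does not raise an alert, but pulled together can signal intervention” pattern can be sketched as a simple rule over combined signals. The readings, sources, and thresholds below are invented for illustration only:

```python
def needs_intervention(readings, min_signals=2):
    """Count borderline signals from disparate data sources; flag the
    patient if several coincide, even though no single reading crosses
    an alarm level on its own."""
    borderline = [
        readings["resting_hr"] > 90,         # wearable device
        readings["glucose_mgdl"] > 140,      # home glucose monitor
        readings["weight_gain_lbs_7d"] > 3,  # connected scale
        readings["missed_refills"] >= 1,     # pharmacy record
    ]
    return sum(borderline) >= min_signals

# Each value here is only mildly elevated, yet two coincide.
patient = {"resting_hr": 94, "glucose_mgdl": 132,
           "weight_gain_lbs_7d": 4, "missed_refills": 0}
```

A real EHR analytics pipeline would use clinically validated risk models rather than a hand-written rule, but the structural point is the same: the alert only exists once the disparate sources are centralized.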

EMC is proud to participate in the Defense Health Information Technology Symposium (DHITS) 2016 on August 2-4, where we will focus on supporting the MHS transformation. Please visit us at booth 401 to learn more about how Big Data analytics and data lakes are transforming military health.

For additional information, check out:
