Posts Tagged ‘Scale-out’

Converged Infrastructure + Isilon: Better Together

David Noy

VP Product Management, Emerging Technologies Division at EMC

You can’t beat Isilon for simplicity, scalability, performance and savings. We’re talking world-class scale-out NAS that stores, manages, protects and analyzes your unstructured data with a powerful platform that stays simple, no matter how large your data environment. And Dell EMC already has the #1 converged infrastructure with blocks and racks. So bringing these two superstars together into one converged system is truly a case of one plus one equals three.

This convergence—pairing Vblock/VxBlock/VxRack systems and the Technology Extension for Isilon—creates an unmatched combination that flexibly supports a wide range of workloads with ultra-high performance, multi-protocol NAS storage. And the benefits really add up, too:

As impressive as these numbers are, it all boils down to value and versatility. These converged solutions give you more value for your investment because, quite simply, they store more data for less. And their versatility allows you to optimally run both traditional and nontraditional workloads. These include video surveillance, SAP/Oracle/Microsoft applications, mixed workloads that generate structured and unstructured data, Electronic Medical Records and Medical Imaging, and more – all on infrastructure built and supported as one product.

With a Dell EMC Converged System, you’ll see better, faster business outcomes through simpler IT across a wide range of application workloads. For more information on modernizing your data center with the industry’s broadest converged portfolio, visit emc.com/ci or call your Dell EMC representative today.

 

Learn more about Converged Infrastructure and Isilon. Also, check out the full infographic.

When It Comes To Data, Isolation Is The Enemy Of Insights

Brandon Whitelaw

Senior Director of Global Sales Strategy for Emerging Technologies Division at Dell EMC


Within IT, data storage, servers and virtualization, there have always been ebbs and flows of consolidation and deconsolidation. You had the transition from terminals to PCs and now we’re going back to virtual desktops – it flows back and forth from centralized to decentralized. It’s also common to see IT trends repeat themselves.

In the mid-to-late 90s, the major trend was to consolidate structured data sources into a single platform: to go from direct-attached storage with dedicated servers per application to a consolidated central storage platform, called a storage area network (SAN). SANs allowed organizations to go from a shared-nothing (SN) architecture to a shared-everything (SE) architecture, where you have a single point of control, allowing users to share available resources without data being trapped or siloed within independent direct-attached storage systems.

The benefit of consolidation is an ongoing IT trend that repeats itself on a regular basis, whether in storage, servers or networking. What’s interesting is that once all the data sources are consolidated, IT can finally look at doing more with them. Consolidation onto a SAN enables cross-analysis of data sources that were previously isolated from each other, something that was practically infeasible before. With these sources in one place, systems such as the enterprise data warehouse emerged: ingest and transform all the data onto a common schema to allow for reporting and analysis. As companies embraced this process, IT consumption grew because of the value gained from that data. It also led to new insights, and today most of the world’s finance, strategy, accounting, operations and sales groups rely on the data they get from these enterprise data warehouses.
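To make the warehouse idea concrete, here is a minimal sketch in Python (with pandas) of what “transforming all the data onto a common schema” looks like; the file names and column mappings are hypothetical, invented purely for illustration:

    # Minimal ETL sketch: consolidate two formerly siloed sources onto a
    # common schema so they can be reported on together.
    # File names and column mappings are hypothetical.
    import pandas as pd

    # Each source system names the same facts differently.
    orders = pd.read_csv("erp_orders.csv").rename(
        columns={"cust_no": "customer_id", "amt": "revenue"})
    invoices = pd.read_csv("billing_invoices.csv").rename(
        columns={"customer": "customer_id", "total": "revenue"})

    # Conform both sources to one schema in a single reporting table.
    common_cols = ["customer_id", "revenue"]
    warehouse = pd.concat([orders[common_cols], invoices[common_cols]])

    # Cross-source analysis that was infeasible while the data was siloed.
    print(warehouse.groupby("customer_id")["revenue"].sum())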

Next, companies started giving employees PCs, and what do you do on PCs? Create files. Naturally, the next step is to ask, “How do I share these files?” and “How do I collaborate on these files?” The end result is home directories and file shares. From an infrastructure perspective, there needed to be a shared common platform for this data to come together. Regular PCs can’t talk to a SAN without direct block-level access over Fibre Channel or a connection to a server in the data center, so unless you want everyone to physically sit in the data center, you run Ethernet.

Businesses ended up building Windows file servers to be the middleman brokering the data between the users on Ethernet and the backend SAN. This method worked until the Windows file servers steadily grew to dozens. Yet again, IT teams were left with complexity and inefficiency, facing the original problem of several isolated silos of data and multiple different points of management.

So what’s the solution? Let’s take the middleman out of this. Let’s take the file system that was sitting on top of the file servers, move it directly onto the storage system, and allow Ethernet to go directly to it. Thus network-attached storage (NAS) was born.

However, continuing the cycle, what started as a single NAS eventually became dozens for organizations. Each NAS device served specific applications with different performance characteristics and protocol access. And each system could only store so much data before it no longer had enough performance to keep up, so systems would continue expanding and replicating to accommodate.

This escalates until an administrator is startled to realize that 80 percent of the data his or her company creates is unstructured. The biggest challenge of unstructured data is that it’s not confined to the four walls of a data center. Once again, we find ourselves with silos that aren’t being shared (notice the trend repeating itself?). Ultimately, this creates the need for a scale-out architecture with multiprotocol data access that can combine and consolidate unstructured data sources to optimize collaboration.

Doubling every two years, unstructured data is the vast majority of all data being created. Traditionally, the approach to gaining insights from this data has involved building yet another silo, which prevents having a single source of truth and having your data in one place. Due to the associated cost and complexity, not all of the data goes into a data lake, for instance, but only the sub-samples relevant to an individual query. One option for ending this particular cycle is investing in a storage system that not only has the protocol access and tiering capabilities to consolidate all your unstructured data sources, but can also serve as your analytics platform. Your primary storage then becomes the single source of truth, and its ease of management lends itself to the next phase: unlocking its insights.
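To illustrate what multiprotocol consolidation buys you, the sketch below reads the same file over an NFS mount and over HDFS without copying it into a separate silo. The host, port and paths are assumptions, and the HDFS access uses the open-source pyarrow library, not any vendor-specific API:

    # Sketch: one data set, two protocol views -- no copy into a second silo.
    # Host, port and paths are hypothetical; pyarrow is an open-source
    # library used here purely for illustration.
    import pyarrow.fs as pafs

    # A file server or PC sees the data as an ordinary NFS share...
    with open("/mnt/nfs/analytics/events.csv", "rb") as f:
        head = f.read(1024)

    # ...while an analytics cluster reads the very same bytes over HDFS.
    hdfs = pafs.HadoopFileSystem(host="storage.example.com", port=8020)
    with hdfs.open_input_file("/analytics/events.csv") as f:
        assert f.read(1024) == head  # same data, no duplicate silo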

Storing data is typically viewed as a red-ink line item, but it can actually be to your benefit. Not because of regulation or policies dictating it, but because a deeper, wider set of data can provide better answers. Often, you may not know what questions to ask until you’re able to see data sets together. Consider the painting technique pointillism. If you look too closely, it’s just a bunch of dots of paint. However, if you stand back, a landscape emerges, ladies with umbrellas materialize and suddenly you realize you’re staring at Georges Seurat’s famous painting, A Sunday Afternoon on the Island of La Grande Jatte. As with pointillism, in data analytics you never think of connecting the dots if you don’t realize they’re next to one another.

Dell & EMC Showcase Their Synergies with the All-Flash ScaleIO Ready Node

Jason Brown

Consultant Product Marketing Manager at Dell EMC

In case you missed it, on September 15th we announced the Dell EMC ScaleIO Ready Node. This announcement highlights months of collaboration between Dell and EMC to combine the best of both worlds – Dell PowerEdge servers and Dell EMC ScaleIO. The ScaleIO Ready Node brings All-Flash capabilities to Software-Defined Storage to enable customers to transform their data centers, making the path to the modern data center easier with Dell EMC.

There are tons of specs and details about the ScaleIO Ready Node which I won’t rehash here. You can check out the data sheet, spec sheet, and FAQ to get all the details you need. What I’d like to highlight are two key points regarding this announcement:

  1. Bringing best of breed EMC software-defined storage and Dell server hardware together
  2. Optimizing All-Flash with SDS

The first point is really important. There’s a reason why Dell spent 500 gazillion dollars on EMC… oh, it was only $67 billion you say? Peanuts then! But seriously, there are obviously a lot of synergies and opportunities between Dell and EMC, and the ScaleIO Ready Node is one of the first examples. Dell builds best-of-breed servers with its PowerEdge line. EMC is a leader in software-defined storage (SDS), with ScaleIO as its block storage offering. And guess what? ScaleIO runs on x86-based industry standard servers! Bringing PowerEdge servers and ScaleIO together was a no-brainer. It’s like peanut butter and jelly. Or mac and cheese. Or even “peas and carrots”*.

But, it’s not as simple or straightforward as you think. A ton of thought and work went into the planning and R&D processes associated with the ScaleIO Ready Node. Yeah, we’ve loaded ScaleIO onto a variety of Dell PowerEdge servers. But it doesn’t stop there. We’ve introduced a system that is pre-configured, pre-validated, and optimized for running ScaleIO. Plus, it comes with “one throat to choke” for procurement and support: Dell EMC.

I can’t emphasize enough how important that is. When I talk to customers, they get SDS, they understand there can be significant 3-5-year TCO savings, and they absolutely love the performance, scalability, and flexibility of ScaleIO. But when the rubber meets the road, a majority of customers are not going to buy ScaleIO software and then procure <insert your favorite brand> servers from another vendor (but if they do, I hear Dell has good stuff). So, we’ve simplified the process and enabled faster time-to-market by using Dell EMC’s supply chain and services so customers can hit the ground running – while preserving the flexibility that is a huge differentiator for ScaleIO. See what two ScaleIO veterans have to say about this:

The second point is the formal introduction of ScaleIO into the All-Flash (AF) arena. Yeah I know, every product out there has AF capabilities, and yeah, flash is becoming commoditized, and yeah, you could run AF ScaleIO clusters before the ScaleIO Ready Node. Regardless, AF is the way of the future and one of the foundations of the modern data center. So, we’re combining two key foundations to transform your data center – All-Flash and Software-Defined Storage – into a single platform to make it much easier for customers to start their journey to the modern data center.

What’s important about the AF ScaleIO Ready Node is how we optimize flash with SDS. ScaleIO’s architecture is unique and is the key to unlocking the power of AF. All of the SSD drives within the All-Flash ScaleIO Ready Node work in parallel, eliminating any and all bottlenecks. Each node participates in IOPS and there’s no cache to get in the way – for reads or writes. The ability to take full advantage of the aggregate performance of all SSDs makes it possible for performance to scale linearly as you add more servers.
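As a back-of-the-envelope illustration of that linear scaling (my own model with made-up numbers, not published ScaleIO benchmarks): if every SSD serves I/O in parallel and nothing sits in the data path, aggregate IOPS is simply the per-SSD figure times the drive and node counts.

    # Back-of-the-envelope model of linear scale-out.
    # All numbers are illustrative, not ScaleIO benchmarks.
    def aggregate_iops(nodes: int, ssds_per_node: int, iops_per_ssd: int) -> int:
        # Every SSD serves I/O in parallel and no shared cache or controller
        # sits in the path, so performance adds up across the cluster.
        return nodes * ssds_per_node * iops_per_ssd

    for nodes in (4, 8, 16):
        print(nodes, "nodes ->", aggregate_iops(nodes, 10, 50_000), "IOPS")

Doubling the node count doubles the aggregate figure, which is the linear behavior described above.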

Customers have the ability to migrate Tier 1 and Tier 2 application workloads with high-performance requirements to All-Flash ScaleIO Ready Nodes without missing a beat! Check out Dell EMC’s All-Flash page for more details, and if you want to see some guy talking about the All-Flash ScaleIO Ready Node, click here.

We’re extremely excited about the release of the ScaleIO Ready Node. It’s awesome to be one of the first products to be released by Dell Technologies that highlights the synergies between Dell and EMC. With this collaboration, we’re able to bring peace of mind to customers and provide unique product capabilities now and in the future. Please visit the Dell EMC ScaleIO Ready Node page to learn more! #GoBigWinBig

*If you’ve never seen Forrest Gump, go watch it. Now!

Bullseye: NCC Media Helps Advertisers Score a Direct Hit with Their Audiences

Keith Manthey

CTO of Analytics at EMC Emerging Technologies Division

Can you recall the last commercial you watched? If you’re having a hard time remembering, don’t beat yourself up. Nowadays, it’s all too easy for us to fast forward, block or avoid ads entirely. Advertising is no longer a necessary precursor to enjoying our favorite shows on any of our many devices. While people may be consuming more media in more places than ever before, it’s more challenging than ever for advertisers to reach their target audiences. The rise of media fragmentation and ad-blocking technology has dramatically increased the cost of reaching the same number of people advertisers could reach years ago.

As a result, advertisers are under pressure to accurately allocate spending across TV and digital media and to ensure those precious ad dollars are reaching the right people. Gone are the days when a media buyer’s gut instinct determined which block of airtime to purchase on which channel; advanced analytics have taken its place.

By aggregating hundreds of terabytes of data from sources like Nielsen ratings, Scarborough local market data and even voting and census data, companies such as NCC Media are able to provide targeted strategies for major retailers, nonprofits and political campaigns. In one of the most contentious election years to date, AdAge reported data and analytics are heavily influencing how political advertisers are purchasing cable TV spots. According to Tim Kay, director of political strategy at NCC, “Media buyers…are basing TV decisions on information such as whether a program’s audience is more likely to vote or be registered gun owners.”

In order to identify its customers’ targets more quickly (for example, matching data about cable audiences to voter information and zip codes), NCC built an enterprise data lake on EMC Isilon scale-out storage with the Hortonworks Data Platform, allowing it to streamline data aggregation and analytics.
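For a sense of what matching cable audiences to voter information by zip code can look like on a Hadoop platform, here is a hedged PySpark sketch; the paths and column names are hypothetical and do not describe NCC’s actual pipeline:

    # Hypothetical PySpark sketch of an audience-to-voter match by zip code.
    # Paths and column names are illustrative, not NCC's actual pipeline.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("audience-match").getOrCreate()

    audiences = spark.read.csv("/datalake/nielsen_audiences.csv", header=True)
    voters = spark.read.csv("/datalake/voter_file.csv", header=True)

    # Join the two formerly siloed sources on zip code, then profile which
    # programs reach households that are likely to vote.
    matched = audiences.join(voters, on="zip", how="inner")
    matched.groupBy("program").count().orderBy("count", ascending=False).show()

Because Isilon exposes HDFS as a protocol on the same storage, a job like this can read the aggregated data in place rather than importing it overnight into a separate cluster.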

To learn more about how Isilon has helped NCC eliminate time-consuming overnight data importation processes and tackle growing data aggregation requirements, check out the video below or read the NCC Media Case Study.

 

Telemedicine Part 1: TeleRadiology as the growth medium of Precision Medicine

Sanjay Joshi

CTO, Healthcare & Life-Sciences at EMC
Sanjay Joshi is the Isilon CTO of Healthcare and Life Sciences at the EMC Emerging Technologies Division. Based in Seattle, Sanjay's 28+ year career has spanned the entire gamut of life-sciences and healthcare from clinical and biotechnology research to healthcare informatics to medical devices. His current focus is a systems view of Healthcare, Genomics and Proteomics for infrastructures and informatics. Recent experience has included information and instrument systems in Electronic Medical Records; Proteomics and Flow Cytometry; FDA and HIPAA validations; Lab Information Management Systems (LIMS); Translational Genomics research and Imaging. Sanjay holds a patent in multi-dimensional flow cytometry analytics. He began his career developing and building X-Ray machines. Sanjay was the recipient of a National Institutes of Health (NIH) Small Business Innovation Research (SBIR) grant and has been a consultant or co-Principal-Investigator on several NIH grants. He is actively involved in non-profit biotech networking and educational organizations in the Seattle area and beyond. Sanjay holds a Master of Biomedical Engineering from the University of New South Wales, Sydney and a Bachelor of Instrumentation Technology from Bangalore University. He has completed several medical school and PhD level courses.

Real “health care” happens when telemedicine is closely joined to a connected-care delivery model that has prevention and continuity-of-care at its core. This model has been defined well, but only sparsely adopted. As John Hockenberry, host of the morning show “The Takeaway” on National Public Radio, eloquently puts it: “health is not episodic.” We need a continuous care system.

Telemedicine makes it possible for you to see a specialist like me without driving hundreds of miles

Image source: Chest. 2013;143(2):295. doi:10.1378/chest.143.2.295

How do we get the “right care to the right patient at the right time”? Schleidgen et al. define Precision Medicine, also known as Personalized Medicine (1), as seeking “to improve stratification and timing of health care by utilizing biological information and biomarkers on the level of molecular disease pathways, genetics, proteomics as well as metabolomics.” Precision Medicine (2) is an orthogonal, multimodal view of the patient, from her/his cells to pathways to organs to health and disease. Several classes of devices and transducers could catalyze telemedicine: Radiology, Pathology, and Wearables. I will focus on Radiology in this part of my three-part series, since all of these modalities use multi-spectral imaging.

Where first?
The world is still mostly rural. According to World Bank statistics, 19% of the USA is rural, while the worldwide average is about 30%, spanning a spectrum from 0% rural (Hong Kong) to 74% rural (Afghanistan). With the recent consolidations of hospitals into larger organizations (since 2010 in the US) (3), it is this 30% to 70% of the world with sparse network connectivity that needs telemedicine sooner than the well-off “worried well” who live in dense urban areas with close access to healthcare. China has the world’s largest number of hospitals at around 60,000, followed by India at around 15,000; the US tally is approximately 5,700 hospitals. The counter-argument to prioritizing rural needs in the US is the risk of shrinking physician numbers (4) and the growing numbers of the urban poor and the elderly. Then there is the plight of poor health amongst the world’s millions of refugees, usually stuck in no-man’s-lands, fleeing conflicts that never seem to wane. All these use-cases are valid, but they need prioritization.

Connected Health and the “Saver App”
Many a fortune has been made by devising and selling “killer apps” on mobile platforms. In healthcare, what we need is a “saver app.” Using the psycho-social keys to the success of these “sticky” technologies, Dr. Joseph C. Kvedar succinctly builds the case for connected health in his recent book “The Internet of Healthy Things” with three strategies and three tactics:

Strategies: (1) Make It about Life; (2) Make It Personal; and (3) Reinforce Social Connections.

Tactics: (1) Employ Messaging; (2) Use Unpredictable Rewards; and (3) Use the Sentinel Effect.

Dr. Kvedar calls this “digital therapies.”

The Vendor Neutral Archive (VNA) and Virtual Radiology
The Western Roentgen Society, a predecessor of the Radiological Society of North America (RSNA), was founded in 1915 in St. Louis, Missouri (soon after Röntgen’s discovery of X-rays in Bavaria in 1895). An interactive timeline of Radiology events can be seen here. Innovations in Radiology have always accelerated innovations in healthcare.

The Radiology value chain is in its images and clinical reporting, as summarized in the diagram below (5):

Radiology value chain

To scale this value-chain for telemedicine, we need much larger adoption of VNA, which is an “Enterprise Class” data management system. A VNA consolidates multiple Imaging Departments into:

  • a master directory,
  • associated storage and
  • lifecycle management of data

The difference between a PACS (Picture Archiving and Communication System) (6) and a VNA lies in the Image Display and Image Manager layers, respectively.

The Image Display layer is a PACS vendor’s or a cloud-based “image program.” All Admit, Discharge and Transfer (ADT) information must reside with the image. This means DICOM standards and HL7/X12N interoperability (using service protocols like FHIR) are critical. The Image Manager layer for a VNA is the “storage layer of images,” either local or cloud-based. For telemedicine to be successful, VNA must “scale out” exponentially and in a distributed manner, within a privacy and security context.
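To ground the point that ADT information must travel with the image: DICOM files carry patient and study identifiers inline, which is what lets a VNA’s master directory index them. The sketch below reads those attributes with the open-source pydicom library; pydicom and the file path are my assumptions for illustration, not part of any VNA vendor’s implementation:

    # Sketch: DICOM images carry patient/study metadata inline, which is
    # what a VNA indexes against ADT records. pydicom is an open-source
    # library used here purely for illustration; the path is hypothetical.
    import pydicom

    ds = pydicom.dcmread("study/slice_001.dcm")

    # Standard DICOM attributes a VNA's master directory would index:
    print("Patient ID:        ", ds.PatientID)
    print("Study Instance UID:", ds.StudyInstanceUID)
    print("Modality:          ", ds.Modality)
    print("Study Date:        ", ds.StudyDate)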

The largest VNA players (alphabetically) are: Agfa, CareStream, FujiFilm (TeraMedica), IBM (Merge), Perceptive Software (Acuo), Philips and Siemens. The merger of NightHawk Radiology with vRad, which was then acquired by MedNax, and IBM’s acquisition of Merge Healthcare (in Aug 2015) are important landmarks in this trend.

One of the most interesting journal articles in 2015 was on “Imaging Genomics” (or Radiomics) of glioblastoma, a brain cancer. By bidirectionally linking imaging features to the underlying molecular features, the authors (7) have created a new field of non-invasive genomic biomarkers.

Imagine this “virtual connected hive” of patients on one side and physicians, radiologists and pathologists on the other, constantly monitoring and improving the care of a population in health and disease at the individual and personal level. Telemedicine needs to be the anchor architecture for Precision Medicine. Without Telemedicine (and VNA), there is no Precision Medicine.

Postscript: Telepresence in mythology
Let me end this tale of distance and care with a little echo from my namesake, Sanjaya, who is mentioned in the first verse of the first chapter of the Bhagavad Gita (literally translated as the “Song of the Lord”) – an existential dialog between the warrior Arjuna and his charioteer, Krishna. The Gita, as it is commonly known, is set within the longest Big Data poem of them all, the Mahabharata, with over 100,000 verses (and 1.8 million words), estimated to have first been written down around 400 BCE.

Dhritarashtra, the blind king, starts this great book-within-book by enquiring: “O Sanjaya, what did my sons and the sons of Pandu decide about battle after assembling at the holy land of righteousness Kurukshetra?”

Sanjaya starts the Gita by peering into the great yonder. He is bestowed with the divine gift of seeing events afar (divya-drishti); he is the king’s tele-vision – and Dhritarashtra’s advisor and charioteer (just like Krishna in the Gita). The other great religions and mythologies also mention telepresence in their seminal books.

My tagline for the “trickle down” in technology innovation flow is “from Defense to Life Sciences to Pornography to Finance to Commerce to Healthcare.” One interpretation of the Mahabharata is that it did not have any gods – all miracles were added later. Perhaps we have now reached the pivot point for telepresence which has happened in war to “trickle down” into population scale healthcare without divine intervention or miracles!

References:

  1. Schleidgen et al, “What is personalized medicine: sharpening a vague term based on a systematic literature review”, BMC Medical Ethics, Dec 2013, 14:55
  2. “Toward Precision Medicine”, Natl. Acad. Press, June 2012
  3. McCue MJ, et al, “Hospital Acquisitions Before Healthcare Reform”, Journal of Healthcare Management, 2015 May-Jun; 60(3):186-203.
  4. Petterson SM, et al, “Estimating the residency expansion required to avoid projected primary care physician shortages by 2035”, Annals of Family Medicine 2015 Mar; 13(2):107-14. doi: 10.1370/afm.1760
  5. Enzmann DR, “Radiology’s Value Chain”, Radiology: Volume 263: Number 1, April 2012, pp 243-252
  6. Huang HK, “PACS and Imaging Informatics: Basic Principles and Applications”, Wiley-Blackwell; 2nd edition (January 12, 2010)
  7. Moton S, et al, “Imaging genomics of glioblastoma: biology, biomarkers, and breakthroughs”, Topics in Magnetic Resonance Imaging. 2015

 


A new application model requires a new toolbox

The rise of the internet and the prevalence of mobile devices, including smartphones and tablets, have driven a revolution in application design towards what is commonly known as a Platform 3 application. Platform 3 applications are characterized by a scope of use that spans potentially millions of users; worldwide access 24 hours a day; a volume and variety of data storage and access needs that includes both traditional applications and big data analytic platforms; and a need to share data sets across multiple instances of a single application as well as across multiple independent applications. As businesses increasingly move towards such Platform 3 applications, they can take advantage of a number of tools available in the industry to help them create and deploy their applications. However, it is not sufficient to have a great set of tools in your box; knowing how to use them effectively is just as important. (more…)
