Posts Tagged ‘cloud’

Why Healthcare IT Should Abandon Data Storage Islands and Take the Plunge into Data Lakes

One of the most significant technology-related challenges of the modern era is managing data growth. As healthcare organizations leverage new data-generating technology, and as medical record retention requirements evolve, the exponential rise in data (already growing at 48 percent each year, according to the Dell EMC Digital Universe Study) could continue for decades.

Let’s start by first examining the factors contributing to the healthcare data deluge:

  • Longer legal retention times for medical records – in some cases up to the lifetime of the patient.
  • Digitization of healthcare and new digitized diagnostics workflows such as digital pathology, clinical next-generation sequencing, digital breast tomosynthesis, surgical documentation and sleep study videos.
  • With more digital images to store and manage, there is also an increased need for bigger picture archive and communication system (PACS) or vendor-neutral archive (VNA) deployments.
  • Finally, more people are having these digitized medical tests (especially given the large aging population), resulting in a higher number of yearly studies with larger data sizes.

Healthcare organizations also face frequent and complex storage migrations, rising operational costs, storage inefficiencies, limited scalability, increasing management complexity and storage tiering issues caused by storage silo sprawl.

Another challenge is the growing demand to understand and utilize unstructured clinical data. Mining this data requires a storage infrastructure that supports in-place analytics, which in turn enables better patient insights and the evolution of healthcare toward precision medicine.

Isolated Islands Aren’t Always Idyllic When It Comes to Data

The way that healthcare IT has approached data storage infrastructure historically hasn’t been ideal to begin with, and it certainly doesn’t set up healthcare organizations for success in the future.

Traditionally, when adding new digital diagnostic tools, healthcare organizations have provided a dedicated storage infrastructure for each application or diagnostic discipline. For example, to deal with the growing storage requirements of digitized X-rays, an organization creates a new storage system solely for the radiology department. As a result, isolated storage silos, or data islands, must be individually managed, making processes and infrastructure complicated and expensive to operate and scale.

Isolated silos further undermine IT goals by increasing the cost of data management and compounding the complexity of performing analytics, which may require copying large amounts of data into yet another dedicated storage infrastructure that can't be shared with other workflows. Even maintaining these silos is involved and expensive because tech refreshes require migrating medical data to new storage. Each migration, typically performed every three to five years, is labor-intensive and complicated. Frequent migrations not only strain resources, but also take IT staff away from projects aimed at modernizing the organization, improving patient care and increasing revenue.

Further, silos make it difficult for healthcare providers to search data and analyze information, preventing them from gaining the insights they need for better patient care. Healthcare providers are also looking to tap potentially important medical data from Internet-connected medical devices or personal technologies such as wireless activity trackers. If healthcare organizations are to remain successful in a highly regulated and increasingly competitive, consolidated and patient-centered market, they need a simplified, scalable data management strategy.

Simplify and Consolidate Healthcare Data Management with Data Lakes

The key to modern healthcare data management is to employ a strategy that simplifies storage infrastructure and storage management and supports multiple current and future workflows simultaneously. A Dell EMC healthcare data lake, for example, leverages scale-out storage to house data for clinical and non-clinical workloads across departmental boundaries. Such healthcare data lakes reduce the number of storage silos a hospital uses and eliminate the need for data migrations. This type of storage scales on the fly without downtime, addressing IT scalability and performance issues and providing native file and next-generation access methods.

Healthcare data lake storage can also:

  • Eliminate storage inefficiencies and reduce costs by automatically moving data that can be archived to denser, more cost-effective storage tiers.
  • Allow healthcare IT to expand into private, hybrid or public clouds, enabling IT to leverage cloud economies by creating storage pools for object storage.
  • Offer long-term data retention without the security risks of the public cloud or the loss of data sovereignty; the same cloud expansion can be utilized for next-generation use cases such as healthcare IoT.
  • Enable precision medicine and better patient insights by fostering advanced analytics across all unstructured data, such as digitized pathology, radiology, cardiology and genomics data.
  • Reduce data management costs and complexities through automation, and scale capacity and performance on demand without downtime.
  • Eliminate storage migration projects.

 

The greatest technical challenge facing today's healthcare organizations is effectively leveraging and managing data. By employing a healthcare data management strategy that replaces siloed storage with a Dell EMC healthcare data lake, however, healthcare organizations will be better prepared to meet the requirements of today's and tomorrow's next-generation infrastructure and to usher in advanced analytics and new storage access methods.

 

Get your fill of news, resources and videos on the Dell EMC Emerging Technologies Healthcare Resource Page

 

 

Using a World Wide Herd (WWH) to Advance Disease Discovery and Treatment

Patricia Florissi

Vice President & Global Chief Technology Officer, Sales at Dell EMC
Patricia Florissi is Vice President and Global Chief Technology Officer (CTO) for Sales. As Global CTO for Sales, Patricia helps define mid- and long-term technology strategy, representing the needs of the broader EMC ecosystem in EMC strategic initiatives. Patricia is an EMC Distinguished Engineer, holds a Ph.D. in Computer Science from Columbia University in New York, graduated valedictorian with an MBA from the Stern School of Business at New York University, and has a Master's and a Bachelor's degree in Computer Science from the Universidade Federal de Pernambuco in Brazil. Patricia holds multiple patents and has published extensively in periodicals including Computer Networks and IEEE Proceedings.


Analysis of very large genomic datasets has the potential to radically alter the way we keep people healthy. Whether it is quickly identifying the cause of a new infectious outbreak to prevent its spread or personalizing a treatment based on a patient’s genetic variants to knock out a stubborn disease, modern Big Data analytics has a major role to play.

By leveraging cloud, Apache™ Hadoop®, next-generation sequencers, and other technologies, life scientists potentially have a new, very powerful way to conduct innovative global-scale collaborative genomic analysis research that has not been possible before. With the right approach, there are great benefits that can be realized.


To illustrate the possibilities and benefits of using coordinated worldwide genomic analysis, Dell EMC partnered with researchers at Ben-Gurion University of the Negev (BGU) to develop a global data analytics environment that spans across multiple clouds. This environment lets life sciences organizations analyze data from multiple heterogeneous sources while preserving privacy and security. The work conducted by this collaboration simulated a scenario that might be used by researchers and public health organizations to identify the early onset of outbreaks of infectious diseases. The approach could also help uncover new combinations of virulence factors that may characterize new diseases. Additionally, the methods used have applicability to new drug discovery and translational and personalized medicine.

 

Expanding on past accomplishments

In 2003, SARS (severe acute respiratory syndrome) was the first infectious outbreak where fast global collaborative genomic analysis was used to identify the cause of a disease. The effort was carried out by researchers in the U.S. and Canada who decoded the genome of the coronavirus to prove it was the cause of SARS.

The Dell EMC and BGU simulated disease detection and identification scenario makes use of technological developments (the much lower cost of sequencing, the availability of greater computing power, the use of cloud for data sharing, etc.) to address some of the shortcomings of past efforts and enhance the outcome.

Specifically, some diseases are caused by a combination of virulence factors. They may all be present in one pathogen or spread across several pathogens in the same biome. There can also be geographical variations. This makes it very hard to identify the root causes of a disease when pathogens are analyzed in isolation, as has been the case in the past.

Addressing these issues requires sequencing entire micro-biomes from many samples gathered worldwide. The computational requirements for such an approach are enormous. A single facility would need a compute and storage infrastructure on a par with major government research labs or national supercomputing centers.

Dell EMC and BGU simulated a scenario of distributed sequencing centers scattered worldwide, where each center sequences entire micro-biome samples. Each center analyzes the sequence reads generated against a set of known virulence factors. This is done to detect the combination of these factors causing diseases, allowing for near-real time diagnostic analysis and targeted treatment.

To carry out these operations in the different centers, Dell EMC extended the Hadoop framework to orchestrate distributed and parallel computation across clusters scattered worldwide. This pushed computation as close as possible to the source of data, leveraging the principle of data locality at world-wide scale, while preserving data privacy.

Since a single Hadoop instance is represented by an elephant, Dell EMC concluded that a set of Hadoop instances scattered across the world but working in tandem formed a World Wide Herd, or WWH. This is the name Dell EMC has given to its Hadoop extensions.


Using WWH, Dell EMC wrote a distributed application in which each of a set of collaborating sequencing centers calculates a profile of the virulence factors present in each micro-biome it sequenced and sends just these profiles to a center selected to perform the global computation.

That center then uses bi-clustering to uncover common patterns of virulence factors among subsets of micro-biomes that may originally have been sampled in any part of the world.
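To make that division of labor concrete, here is a minimal, self-contained sketch of the pattern described above. It is not the WWH API: the function names, the virulence-factor catalog and the synthetic reads are illustrative assumptions, and scikit-learn's spectral bi-clustering stands in for whatever bi-clustering method the real analysis uses. Each center reduces its raw reads to a small profile vector, and only those profiles travel to the site performing the global computation.

```python
# Illustrative sketch only - not the WWH API. All names and data are placeholders.
import numpy as np
from sklearn.cluster import SpectralBiclustering

KNOWN_VIRULENCE_FACTORS = [f"vf_{i:03d}" for i in range(50)]   # placeholder catalog

def local_profile(sequence_reads):
    """Runs at each sequencing center: reduce raw reads to per-factor hit counts.
    Only this small vector, never the raw reads, leaves the center."""
    return np.array([sum(vf in read for read in sequence_reads)
                     for vf in KNOWN_VIRULENCE_FACTORS], dtype=float)

def global_biclustering(profiles, n_clusters=3):
    """Runs at the center chosen for global computation: bi-cluster the
    micro-biome x virulence-factor matrix to expose co-occurring factors."""
    matrix = np.vstack(profiles) + 1.0          # pseudo-count keeps the matrix positive
    model = SpectralBiclustering(n_clusters=n_clusters, random_state=0)
    model.fit(matrix)
    return model.row_labels_, model.column_labels_

# Synthetic stand-in for micro-biome samples sequenced at distributed centers.
rng = np.random.default_rng(0)
samples = [[rng.choice(KNOWN_VIRULENCE_FACTORS) for _ in range(200)] for _ in range(12)]

profiles = [local_profile(reads) for reads in samples]        # computed locally
biome_groups, factor_groups = global_biclustering(profiles)   # computed centrally
print(biome_groups, factor_groups)
```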

This approach could allow researchers and public health organizations to potentially identify the early onset of outbreaks and also uncover new combinations of virulence factors that may characterize new diseases.

There are several biological advantages to this approach. The approach eliminates the time required to isolate a specific pathogen for analysis and for re-assembling the genomes of the individual microorganisms. Sequencing the entire biome lets researchers identify known and unknown combinations of virulence factors. And collecting samples independently world-wide helps ensure the detection of variants.

On the compute side, the approach uses local processing power to perform the biome sequence analysis. This reduces the need for a large centralized HPC environment. Additionally, the method overcomes the matter of data diversity. It can support all data sources and any data formats.

This investigative approach could be used as a next-generation outbreak surveillance system. It allows collaboration in which different geographically dispersed groups simultaneously investigate different variants of a new disease. In addition, the WWH architecture has great applicability to pharmaceutical industry R&D efforts, which increasingly rely on a multi-disciplinary approach in which geographically dispersed groups investigate different aspects of a disease or drug target using a wide variety of analysis algorithms on shared data.

 

Learn more about modern genomic Big Data analytics

 

 

Solving the Video Vortex at the Secured Cities Conference

Gary Buonacorsi

CTO of State and Local Government at Dell EMC


I’m in Houston today at the Secured Cities conference, the leading government security and public safety event, to participate on the “Video Vortex Drives Public Safety to the Datacenter” panel. I’ll be joined by Kenneth Baker, director of Infrastructure Support at the Metropolitan Transit Authority of Harris County (METRO), who recently helped implement a citywide video surveillance system for the bus and trolley service. I’m looking forward to hearing more about METRO’s specific architecture, the pain points and challenges the department faced and what problems it hopes to solve with the new system.

For those of you unable to join us in the “Space City” of Houston, here’s a glimpse of what I’ll be covering in the session:

 

What is driving the increase in data for state and local government? 

One key factor is the emergence of new surveillance technology, such as drones, body cameras, license plate trackers and audio/video recognition. In particular, drone usage in the public safety arena has seen significant growth for providing situational awareness in tactical events such as bank robberies or hostage situations. In addition to tactical operations, drones are also being used around the country for policing activities. Pilot programs are popping up in cities like Modesto, California, where law enforcement is using drones to assist with search warrants and surveying crime scenes. The sky's the limit for drone usage in law enforcement, as evidenced by Amazon patenting a voice-activated shoulder-mounted drone earlier this month that officers can use to help assess dangerous situations.

Secondly, resolution requirements are increasing. Grainy pictures are ineffectual when it comes to facial recognition, analytics and post-evaluation, forcing the transition from standard definition to 4K. As new tools and analytics are adopted, resolution requirements climb even higher.

Perhaps the most common reason for the increase in data for public safety organizations is growing camera counts and longer video retention times. With the rise of citywide surveillance, cities such as London and New York City are moving toward having cameras on practically every street corner. Discovery activities in legal proceedings are also extending retention periods and chain-of-evidence storage requirements.

 

Given this exponential data growth, how is it impacting organizations and what do they need to focus on?

IT departments at these organizations should look for architectures that are open source, scalable and enterprise-ready so that they integrate with the systems they currently have, as well as with any changes they may make in the future. Simply put, department heads should avoid point solutions and instead adopt an integrated, strategic approach to help plan for the years ahead. I would counsel them to look for a solution that allows them to start small but grow big, easily adding more cameras and scaling without disrupting the current environment.

The next major area to consider is life cycle management. Previously, video footage was kept for a week before it was written over or deleted. Now, long-term archiving is critical, with the potential for courts to mandate that digital assets, such as video evidence in a capital case, be maintained indefinitely.

Organizations must embrace the shift to an enterprise model. For police departments, having body cameras isn't enough. They must consider how to integrate them with dashboard cameras, 911 call centers and other systems, combining each of these point solutions into an enterprise approach.

 

Which platform will support retention policies and what are the three different storage architectures? How can organizations escape the video vortex?

Early video surveillance solutions presented a host of challenges, including restricting departments to certain file and storage protocols, and communication channels. Combine those factors with non-IP-based cameras, and modernizing existing systems became extremely difficult. The first step for organizations to solve the video vortex is to select an open platform that not only allows them to migrate and move data from system to system, but also enables them to shift providers easily. Open platforms also present more options in terms of analytics and security, enabling departments to apply more traditional security tools on top of their data storage and data transportation needs.

Compute and data storage are the key elements in eliminating the video vortex. Storage is the foundation layer of a sound architecture and must address the needs of the organization, including scaling, an enterprise approach and an open platform to avoid lock-in. Three storage architectures exist today: distributed, centralized and cloud. Police forces that are relatively small typically still rely on a distributed architecture, capturing the data from their cars and body cameras and physically transporting it from a mobile storage device back to a centralized repository where it can then be analyzed and managed. Distributed architectures can be folded into centralized architectures, allowing them to be part of an enterprise approach with a centralized location like police headquarters, schools, airports or the METRO. A centralized architecture makes it possible to gather all of these remote data feeds from video surveillance solutions and bring them back to a centralized repository. In a case like this, the architecture must be efficient, storing only essential data to minimize utilization rates and costs. It must also be capable of supporting thousands of surveillance devices in order to scale to the multiple distributed architectures that feed back to one location.

The third architecture to consider is cloud. Cloud presents a useful solution in that it is elastic, scalable, expands very easily and can ramp up very quickly. However, cloud storage can be very costly in light of potential retention policy changes, data set growth and cloud size – all of a sudden, the portability of those cloud data sets becomes much more complex. From an architecture perspective, organizations must consider how to bridge that gap and determine the amount of data that can be returned to a more cost-effective on-premise solution without compromising the capabilities that cloud offers.

Finally, distributed, centralized and cloud platforms all underlie the data lake architecture, which is really the foundation for evidence management and helps solve the video vortex public safety organizations are facing.

Embrace Digital Transformation with Elastic Cloud Storage (ECS) 3.0

Sam Grocott

Senior Vice President, Marketing & Product Management at EMC ETD

Digital Transformation is drastically changing the business landscape, and the effects are being felt in every industry, and every region of the world. For some, the goal of this transformation is to use technology to leapfrog the competition by offering innovative products and services. For others, the focus is on avoiding disruption from new market entrants. Whatever your situation might be, it’s clear that you can’t ignore the change. In a recent study by Dell Technologies, 78% of global businesses surveyed believe that digital start-ups will pose a threat to their organization, while almost half (45%) fear they may become obsolete in the next three to five years due to competition from digital-born start-ups. These numbers are a stark indication of the pressure that business leaders are feeling to adapt or fall by the wayside.

But for IT leaders, this raises an uncomfortable question: Where will you find the money to make this transformation? You’re already under constant pressure to lower IT costs. How can you invest in new technologies while still doing this?

Elastic Cloud Storage (ECS), Dell EMC's object storage platform, was built to help organizations with precisely this challenge. After being in market for just under two years, the latest release, ECS 3.0, is being announced at Dell EMC World today. ECS is a next-generation storage platform that simplifies storage and management of your unstructured data, increases your agility and, most importantly, lowers your costs. Let's take a look at some of the ways ECS can help modernize your datacenter, clearing the way for you to embrace Digital Transformation.

Simplify and Accelerate Cloud-Native Development

The success of companies like Uber and Airbnb has highlighted the transformative power of "cloud-native" mobile and web apps. Enterprises everywhere are taking note – in the previously mentioned Dell Technologies survey, 72% of companies indicated that they are expanding their software development capabilities. Often, these software development efforts are directed towards "cloud-native" applications designed for the web and mobile devices.

ECS is designed for cloud-native applications that utilize the S3 protocol (or other REST-based APIs like OpenStack Swift). ECS natively performs many functions like geo-distribution, ensuring strong data consistency and data protection, freeing up application developers to focus on what moves their business forward. This greatly increases developer productivity, and reduces the time to market for new applications that can unlock greater customer satisfaction, as well as new sources of revenue.
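As a rough illustration of what that S3 compatibility looks like in practice, a cloud-native app can use a standard S3 client and simply point it at an ECS endpoint. The sketch below uses boto3; the endpoint URL, bucket name and credentials are placeholders, not real values.

```python
# Minimal sketch: a standard S3 client (boto3) talking to an ECS S3-compatible
# endpoint. Endpoint, bucket and credentials below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ecs.example.local:9021",   # hypothetical ECS endpoint
    aws_access_key_id="OBJECT_USER_ID",              # placeholder object-user credentials
    aws_secret_access_key="OBJECT_USER_SECRET",
)

s3.create_bucket(Bucket="app-images")
s3.put_object(Bucket="app-images", Key="uploads/avatar-001.png", Body=b"\x89PNG...")
obj = s3.get_object(Bucket="app-images", Key="uploads/avatar-001.png")
print(len(obj["Body"].read()), "bytes retrieved")
```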

Reduce storage TCO and complexity

Legacy storage systems that sit in most enterprise datacenters are struggling to keep up with the explosion in unstructured data. Primary storage platforms are constantly running out of capacity, and it is expensive to store infrequently accessed data on these platforms. Additionally, as many businesses operate on a global scale, data coming in from different corners of the world ends up forming silos, which increase management complexity and lower agility in responding to business needs.

ECS is compatible with a wide range of cloud-enabled tiering solutions for Dell EMC primary storage resources like VMAX, VNX, Isilon and Data Domain. Additionally, ECS is certified on many third-party tiering solutions, which enable it to act as a low-cost, global cloud tier for third-party storage platforms. These solutions drive up primary storage efficiency and drive down cost by moving data to a lower-cost tier on ECS. Tiering to ECS is friction-free, which means that apps or users accessing primary storage don't have to change any behavior at all.


Tape Replacement

The new ECS dense compute rack, the D-Series, increases storage density by more than 60%, making it an ideal replacement for tape archives. The D-Series comes as an eight-node system that provides the highest-density configurations for ECS, at 4.5PB (D-4500) and 6.2PB (D-6200) in a single rack.

These new configurations provide the low cost and scalability benefits of traditional tape solutions, but without the lack of agility, poor reliability and operational difficulties associated with storing data on tape. Additionally, ECS makes business data available to business units in an on-demand fashion. This allows organizations to fully embrace Digital Transformation, which relies on insights mined from business data to create more compelling experiences for customers.

Legacy application modernization

ECS can serve as an ideal storage platform for organizations looking to modernize legacy LoB applications that utilize or generate large amounts of unstructured data. Modifying legacy apps to point to ECS using the S3 protocol (or other REST-based APIs like OpenStack Swift) can help reduce costs, simplify maintenance of the application, and allow those applications to scale to handle massive amounts of data.
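As a hedged sketch of what such a modification can look like, the change to a legacy app is often little more than swapping a local filesystem write for an object PUT. Client setup mirrors the earlier example; all names and values are again placeholders.

```python
# Before: a legacy LoB app writing reports to a local or NAS path.
#   with open("/mnt/reports/2016-10-31.pdf", "wb") as f:
#       f.write(report_bytes)

# After: the same write directed at ECS over the S3 protocol (placeholders throughout).
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ecs.example.local:9021",
    aws_access_key_id="OBJECT_USER_ID",
    aws_secret_access_key="OBJECT_USER_SECRET",
)

def save_report(report_bytes, name):
    # Objects scale out with the ECS cluster; no volumes or filesystems to manage.
    s3.put_object(Bucket="lob-reports", Key="reports/" + name, Body=report_bytes)

save_report(b"%PDF-1.4 ...", "2016-10-31.pdf")
```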

Take the Next Step

Learn more about how ECS can enable your transformation, follow @DellEMCECS on Twitter, or try it out – for free!

 

 

IACP: Body Cam Storage Success

Ken Mills

CTO Surveillance & Security


Marking the 123rd IACP with Tips to Make Selecting On-Premise Body Cam Storage & Management as Easy as 1, 2, 3

We’re excited to attend the IACP Annual Conference and Exposition in San Diego this week, Oct. 15-18. Each year, thousands of dedicated professionals from federal, state, county, local and tribal agencies attend IACP to learn about the newest intelligence, strategies and tech solutions available to law enforcement.

Among the topics likely to attract attention and spark discussions are body cams and the importance of gathering electronic evidence. With an overwhelming 99 percent of public safety experts agreeing that video surveillance technology will play a significant role in their ability to prevent crime, theft and terrorism over the next five years, it’s more critical than ever to ensure we’re utilizing video data to its potential.

The increase in video data means there is a massive potential for enhanced situational awareness and better intelligence – but only if the data is analyzed.

In honor of the IACP’s 123rd year, we’re sharing tips to help make selecting on-premise body cam storage and management as easy as 1, 2, 3.

1. Beyond Body Cams

While body cams are certainly getting their share of coverage lately, it’s important to remember body cams are just one component of the video data that public safety departments are tasked with managing. Today’s public safety environments also include video surveillance cameras, drones, in-car video, mobile devices and more. Progressive public safety departments must build a data platform that can collect, store and manage these individual pools of data. A common infrastructure provides a more cost-effective storage environment, more control of the data and better security.


2. Costly Clouds

Last month, the Associated Press reported that police departments in Indiana and Kentucky have halted the use of body cams, citing new laws that would require the video to be stored longer, thereby significantly increasing the cost. On average, each body cam requires a minimum of 1TB of storage per year. Competing cloud solutions charge over $1,400/year – per camera. For a police department with 500 body cameras, that quickly adds up, with the cost of storage for body cams totaling approximately $700,000 annually, in perpetuity. Department heads trying to maintain budgets and plan for additional personnel to monitor the data should consider alternative storage solutions that cost considerably less to deploy and provide a better overall total cost of ownership.
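The arithmetic behind that figure is easy to verify; the short sketch below simply multiplies out the per-camera price and fleet size quoted above (illustrative numbers only, not a quote for any specific solution).

```python
# Worked example using the figures quoted above (illustrative only).
cameras = 500
cloud_cost_per_camera_per_year = 1400   # USD/year per body cam for a competing cloud service
storage_per_camera_per_year_tb = 1      # minimum TB of footage generated per body cam per year

annual_cloud_cost = cameras * cloud_cost_per_camera_per_year      # 500 * 1,400 = 700,000
annual_new_footage_tb = cameras * storage_per_camera_per_year_tb  # 500 TB of new video per year

print(f"${annual_cloud_cost:,} per year to store roughly {annual_new_footage_tb} TB of new footage")
```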

3. Open to New Solutions

An open platform enables departments to integrate body cam data with the best available industry applications. To avoid the risk of limiting video to a single company’s platform, departments should bypass closed solutions, which may prevent other key applications from gaining access to that data. Because the video world is constantly changing, an open platform will enable departments to implement the best solutions today and tomorrow.

Read more about our storage solutions here or visit us at Booth 820 and Booth 5307 at IACP. We look forward to seeing you there!

 

 

Survey findings show close alignment with Dell EMC strategy

Charles Sevior

Chief Technology Officer at EMC Emerging Technologies Division

Media Workflow Trends Survey: Industry Transformation is Underway

Earlier in 2016, Dell EMC commissioned Gatepoint Research to conduct an extensive survey of media industry executives.  The survey, entitled Media Workflow Trends, yielded some interesting results that point to a good understanding of the pace of change, and the need to stay agile for competitive advantage.

The results of that survey are summarised in a new Infographic which, apart from being much more interesting than a series of pie charts, brings to the surface the key themes that align with the technology development strategy of Dell EMC.

Content Storage Demands Are Exploding

I have worked in the media industry for decades, so this is hardly a surprising finding.  Early in my career, it was commonplace to find production offices full of shelves and compactus storage units, all crammed with videotapes. Then there were boxes stacked everywhere – also full of tapes with titles scrawled on the back.  There were colour-coded stickers – “Master”, “Protection Master”, “Edit Copy”, “HOLD”… There was a warehouse full of tapes of various types, even old films.  One thing you learned is that nothing was ever thrown away (but plenty of things went missing).

Fast-forward to 2016, and most media companies involved in production and distribution of content have shifted to file-based Media Asset Management systems – or at least a media content archive repository.  This has helped to contain the data sprawl into a central location, but it has done nothing to reduce the total storage capacity requirement.  Think about the increasing resolution of content, the increasing number of channels, multiple versions for different delivery platforms and, of course, the increasing “shoot to use” ratio.  Sports events have an increasing number of cameras with retained ISO recordings for highlights and post-match inquiries, and reality TV formats are based on multi-cam techniques to capture every reaction from different angles.  Whilst these programs are in production, the storage capacity demands can skyrocket.

Only 3% of our survey respondents replied that storage needs are flat or negative – and 50% responded that the demand for storage capacity is growing rapidly and is a major concern.

Multiplatform Content Delivery

Pretty much every major media company is either doing this already or has a plan to extend its audience reach beyond simple linear broadcast channels in the next few years.  But what is interesting is the increasingly careful way in which media companies are deploying their solutions.

Recognising that the simple approach of outsourcing multiplatform content delivery to a third-party OVP (Online Video Platform) is not very revenue accretive, media companies are now starting to embrace DIY in order to pull back some profit margin in what is otherwise a very difficult-to-monetise delivery strategy.  As we learn more from some of the leaders in this industry – such as MLBAM – we can see the benefits of taking control and managing as much of the content delivery process end to end as possible, just as we always did with linear content delivery over terrestrial RF transmitters, satellite transponders and cable TV networks.

One of the key tips is being ready to scale.  As streaming demand spikes and grows with popular content, how can every incremental viewer bring incremental profit – not just rising CDN costs?  Taking a tip from Netflix, you can build a distributed origin and control the CDN deeper into the delivery network.  Dell EMC has repeatedly partnered with some of the leading solution vendors in this space, who make it easier to deploy a well-managed and profitable multiplatform content delivery system.

IP-Based Workflows are here

Most industry commentators seem to get pretty excited about “the death of SDI” and how soon IP networking can completely replace the dedicated video & audio circuits of the past.  But really, that is just a side show in which we will soon lose interest.  There is no “right or wrong” way to build a media facility.  The engineers and technical architects will select the appropriate technology on a case-by-case basis as they always have, based on reliability, quality, cost, ease of management and so on.  And over time, there will simply be more connections made using IP network technology and fewer using dedicated single-purpose technology.

But what is the end-game?  I see it as moving our media equipment technology stacks (also known as the “rack room” or “central technical facility”) away from dedicated single-purpose vendor solutions built and managed carefully by Broadcast Engineers into a flexible virtualised technology stack that looks identical to a cloud-scale data centre – built and managed by IT and Media Technologists.  It will be open architecture, built on software-defined principles and capable of easy repurposing as the application technology needs of the business shift more frequently than they did in the past.

It is important to select your partners carefully as you make this transition to IP and software-defined infrastructure.  Dell EMC has deliberately remained vendor neutral and standards-based.  We have aligned with SMPTE and AIMS, two organisations that we believe have the broad interests of the industry (both end-users and vendors) at heart and whose work will result in practical, cost-effective and widely adopted solutions.

As a pioneer and leader in scale-out storage, virtualisation and converged infrastructure, Dell EMC is in a great position to help you avoid costly mistakes during your transition to IP-based workflows.


Click to see the full M&E trends infographic

Ultra-HD Is Coming

Well, it’s already here.  Of course, most people shopping for a new flat screen TV today will see that their options include 4K resolution sets that are increasingly affordable compared to default HD TV models.  Some in the industry will say that 4K is unnecessary and is being pushed by the consumer electronics manufacturers – but when has that ever been a different story in the past?  There is no doubt that consumers appreciate improved quality of content, and story-tellers love the creative opportunities afforded by the latest technology.  When we can finally deliver all of the aspects of Ultra-HD – HDR (high dynamic range), HFR (high frame rates) and multi-channel surround sound – we will be one step closer to that reality.

At the SMPTE Future of Cinema Keynote during NAB 2016, pioneering movie director Ang Lee said:

“Technology must work for us to help tell the human story.  Whether it is from 2K to 4K, or 24 to 60fps, it improves the sensory experience and as a viewer, you become more relaxed and less judgmental.  We will always be chasing god’s work – which is the natural vision and sensory experience. We are getting closer and learning more about how we communicate with each other.”

In the world of content creation and media distribution, we will increasingly adopt 4K cameras, render graphics and animations at increased resolution and ensure the product we make has an increased shelf life.  This is natural, even if it is happening before we have the ability to deliver this content to our viewers.  And while it is difficult to “rip and replace” cable, satellite and terrestrial networks that are still only shifting from SD to HD with new 4K solutions, OTT content delivery using internet broadband and mobile networks will probably be the way most consumers first access Ultra-HD.

Dell EMC Isilon is a scale-out storage solution that grows in capacity and bandwidth as more nodes combine into a single-volume multi-tier cluster.  We already have numerous customers using Isilon for 4K editing and broadcast today.  As we constantly innovate and bring new technology to market, we continue to deliver to our customers the benefits of Moore’s Law.  The real key to Isilon technology is the way that we deliver platform innovation in an incremental and backward-compatible way – supporting the ability to scale and grow non-disruptively.

Beyond LTO Archiving

I mentioned earlier in this blog how my early career was defined by shelves and boxes of tapes – videotapes everywhere.  I spent my days handling tape, winding tape into cartridges, even editing audio and videotape using a razor blade!  The most important machine in the building (a commercial TV station) was the cart machine.  That was because it held all of the commercial 30-second spots, and if those did not play, the TV station did not make money and we would not get paid.

Finally we replaced cart machines and replay videotape machines with hard disk servers that were highly reliable, fast to respond to late changes and very flexible.  So I wonder when we will say it is time to replace the data tape archive library with a cloud store?  Certainly we are all familiar with and probably daily users of one of the biggest media archives in the world (I refer to Google’s YouTube).  Wouldn’t it be great if your company had its own YouTube?  A content repository that was always online, instantly searchable, growing with fresh material and just as easy to use?

So then we get down to cost.  It turns out that, even though public cloud stores seem cheap, the cost of actually using one for long-term retention is a lot higher than that of existing data tape technology – especially as the LTO industry brings innovation beyond LTO-6 into the latest LTO-7 data tape format with 6TB native capacity.

But the migration process to move all of your media from one standard to the next is painful and time-consuming – introducing cost and wear & tear, and impacting end-user search & retrieval times from the library.

From our survey respondents, the top considerations for a storage solution are performance, scalable capacity and efficient use of resources (floor space, power, personnel).  So if we took those criteria into account, cloud storage should win hands-down – if only the price were right.

Well, now it finally is.  Dell EMC has been developing an innovative product called ECS (Elastic Cloud Storage), which meets all of the requirements of a modern archive: scalable, multi-site geo-replication, open architecture and software-defined.  And it is now available on a range of hardware platforms that offer high packing density using large-capacity, highly efficient hard drives – today 8TB drives are supported, and that native capacity will clearly grow.

Increasingly, customers are asking us whether this technology is price-competitive with LTO libraries, and whether it is reliable and ready for mission-critical, high-value archives.  The answer to both questions is yes, and the benefits of moving to your own cloud store are significant (whether you choose to deploy it within your own premises or have it hosted for you).

Cloud Solutions are gathering converts

When you boil it all down, our industry is in transformation from a legacy, bespoke architecture to that of a cloud. The great thing about a cloud is that it is flexible and can easily change shape, scale and take on new processes and workloads.  And it doesn’t have to be the public cloud.  It can be “your cloud”.  Or it can be a mix of both – which really gives you the best of both worlds: public cloud for burst, private cloud for base load and deterministic performance.

Building clouds and bringing technology innovation to industry is what Dell EMC is really good at.  Speak with us to learn more about how to embark on this journey and the choices available to you.

SUMMARY

So we find that across the media industry the evolution is underway.  This is a multi-faceted transformation.  We are not just switching from “SD to HD”; we are evolving at the business, operations, culture and technology levels.

Dell EMC is positioned as an open-architecture, vendor-neutral infrastructure provider offering best-in-class storage, servers, networking, workstations, virtualisation and cloud management solutions.  Engage with us to secure your infrastructure foundation, to be future-ready, and to simplify your technology environment so that you can focus on what really matters to your business – what makes your offering attractive to viewers (on any platform).

 

