Archive for the ‘Industry Trends’ Category

The Next Element for IT Service Providers in the Digital Age

Diana Gao

Senior Product Marketing Manager at EMC² ECS

Digital technology has disrupted large swaths of the economy and is generating huge amounts of data, with the average backup now hovering at around a petabyte. Not all organizations can cope with this data deluge, and many look to service providers for storage and protection. Many service providers offer tape-based backup and archiving services, but despite their best efforts to innovate, data volumes always seem to grow faster, pushing the boundaries of tape capacity.

Today, companies of all sizes still use tape to store business information, but now it is more for cold storage than for data that needs to be accessed frequently. While tape is a low-cost, reliable option for data that is rarely accessed, maintaining multiple software versions and legacy infrastructure can put a burden on already taxed resources. These challenges come at a cost: software licenses, maintenance, and technical resources that could be spent on more important initiatives to help drive business innovation. As a service provider, you need a secure and compliant data storage option that enables you to sell more value-added services.

As reported in TechTarget, a Storage magazine Purchasing Intentions survey showed that the trend away from tape continues: 76% of IT professionals see their use of tape as a backup format either declining or staying the same.

Some service providers are considering offering cloud-based backup-as-a-service, but must do so without raising security concerns for their customers. Others are looking for a solution that combines the benefit of faster data access with the cost advantages of tape.

More than a few service providers have discovered a solution that delivers all of these benefits: the Elastic Cloud Storage (ECS) object storage platform. As a highly scalable, multi-tenant, multi-protocol object storage system, ECS helps service providers better meet their service-level agreement (SLA) commitments by offering highly resilient, reliable, and low-cost storage services with enterprise-class security.
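
For those who want to see what "multi-protocol" means in practice: ECS exposes an S3-compatible API, so a tenant can read and write objects with any stock S3 client. Here is a minimal sketch; the endpoint URL, port, bucket name, and credentials are hypothetical placeholders, not real ECS defaults:

```python
# Minimal sketch: writing to an ECS tenant bucket over the S3-compatible API.
# The endpoint, port, bucket, and credentials below are hypothetical placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ecs.provider.example:9021",
    aws_access_key_id="TENANT_ACCESS_KEY",
    aws_secret_access_key="TENANT_SECRET_KEY",
)

s3.create_bucket(Bucket="tenant-a-archive")
s3.put_object(
    Bucket="tenant-a-archive",
    Key="backups/2017-02-01.tar.gz",
    Body=b"backup payload bytes",
)
```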

Iron Mountain® Incorporated (NYSE: IRM), a leading provider of storage and information management services, is one of those who have discovered this solution. In addition to its traditional tape-based storage-as-a-service, it partnered with Dell EMC to provide a cost-effective, scalable, and modern Cloud Archive as part of its services portfolio. Designed to scale as data volumes grow, with ECS as the backend storage platform, the Cloud Archive solution is ideal for organizations needing offsite, pay-as-you-use archival storage with near-infinite scalability.

“Our customers trust that we know where the data is by having those cloud-based solutions in our datacenters. It gives them a peace of mind where they know where their data is at rest,” said Eileen Sweeney, SVP Data Management at Iron Mountain.

Watch the video below to hear more about how Iron Mountain uses ECS to modernize its storage management services for 95% of Fortune 1000 companies. 

You’ll find the full rundown of the Iron Mountain Cloud Archive solution with ECS here.

Planning on getting away to Barcelona for Mobile World Congress (MWC) 2017? Stop by the VMware stand (Hall 3, Stand 3K10) to meet with Dell EMC experts!

What’s Next for Hadoop? Examining its Evolution, and its Potential

John Mallory

CTO of Analytics at EMC Emerging Technologies Division

In my last blog post, I talked about one of the most popular buzzwords in the IT space today – the Internet of Things – and offered some perspective in terms of what’s real and what’s hype, as well as which use cases make the most sense for IoT in the short-term.

Today I’d like to address the evolution of Apache Hadoop, and the factors that will drive Hadoop adoption to a wider audience beyond the early use cases.

First, consider that data informs nearly every decision an organization makes today. Customers across virtually every industry expect to interact with businesses wherever they go, in real time, across a myriad of devices and applications. This results in piles and mounds of information that need to be culled, sorted, and organized to find actionable data to drive businesses forward.

This evolution mirrors much of what’s taking place in the Apache-Hadoop ecosystem as it continues to mature and find its place among a broader business audience.

The Origins & Evolution of Hadoop

Let’s look at the origins of Hadoop as a start. Hadoop started out as a framework for big batch processing, which is exactly what early adopters like Yahoo! needed: a framework that could crawl all of the content on the Internet to help build big search engines, then take the outputs and monetize them with targeted advertising. That type of use case is entirely predicated on batch processing at a very large scale.

The next phase centered on how Hadoop would reach a broader customer base. The challenge was to make Hadoop easier to use for a wider audience. Sure, it’s possible to do very rich processing with Hadoop, but jobs also have to be programmed very specifically, which can make Hadoop difficult for enterprise users to apply to business intelligence or reporting. This drove the trend around SQL on Hadoop, which was the big thing about two years ago, with companies like Cloudera, IBM, Pivotal, and others entering the space.
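
To see why SQL on Hadoop lowered the bar, compare the two programming models. The sketch below is illustrative only (the table and column names in the SQL comment are hypothetical): a hand-rolled word count in the Hadoop Streaming style, versus the one line of SQL an analyst would write instead.

```python
# Hand-written MapReduce logic (Hadoop Streaming style), illustrative only.
import sys
from collections import defaultdict

def mapper(lines):
    # Emit (word, 1) for every word, as a streaming mapper would.
    for line in lines:
        for word in line.split():
            yield word, 1

def reducer(pairs):
    # Sum the counts per word, as the reduce phase would.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    print(reducer(mapper(sys.stdin)))

# The SQL-on-Hadoop equivalent an analyst would write instead:
#   SELECT word, COUNT(*) AS n FROM words GROUP BY word;
```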

Examining the Internet of Things: What’s hype? What’s real?

John Mallory

CTO of Analytics at EMC Emerging Technologies Division

The Internet of Things is one of the biggest buzzwords in technology today, and indeed, it has the potential to be a truly transformational force in the way that we live and work.

However, if you peel back the “potential” and excitable future-speak surrounding IoT and look at the actual reality of where it is today, the story is much, much different. Yes, Internet-enabled “things” ranging from phones to watches to cars are getting smarter by being able to access, share, and interpret data in new ways. But in our enthusiasm to embrace a Jetsons-like future powered by IoT, we’re losing sight of the infrastructure required (at both the literal hardware and organizational/institutional levels) to actually elevate this technology beyond buzzword status.

Consider, for example, the hype cycle over “big data” about three years ago, when it became the industry’s hot topic without much, well, data to back it up. Hadoop is another example: it too had early adopters, but even now it is only being rolled out into Fortune 1000/5000 companies, and organizations are still struggling with how to monetize it.


A new application model requires a new toolbox

The rise of the internet and the prevalence of mobile devices, including smartphones and tablets, have driven a revolution in application design towards what is commonly known as a Platform 3 application. Platform 3 applications are characterized by a scope of use that spans potentially millions of users; worldwide access 24 hours a day; a volume and variety of data storage and access needs that spans both traditional applications and big data analytic platforms; and the need to share data sets across multiple instances of a single application as well as across multiple independent applications. As businesses increasingly move towards such Platform 3 applications, they can take advantage of a number of tools available in the industry to help them create and deploy their applications. However, it is not sufficient to have a great set of tools in your box; knowing how to use them effectively is just as important.

What do Analytics and the Suez Canal have in common?

Suresh Sathyamurthy

Sr. Director, Product Marketing & Communications at EMC


1859: Egyptian workers under French engineers begin construction of the Suez Canal. A canal across the Isthmus of Suez would cut the ocean distance from Europe to Asia by up to 6,000 miles, and it could be built at sea level, without any locks. Avoiding the additional travel would reduce risk, require fewer supplies, and need fewer sailors. Completed ten years later, its effect on world trade was immediate. This wonder shrunk the globe, not just topographically but also in the time it traditionally took to realize business and economic benefits.

The economic benefits of the Suez Canal and In-Place Hadoop analytics

My metaphor here is that, like sailing the old 12,000+ mile route around the coast of Africa, the traditional method of storing and moving data for analysis is a long and arduous journey that limits your business and economic benefits. Just as the pre-Suez Canal journey from Europe to Asia required significantly more time, larger ships, more crew, and more provisions, the traditional route to analytics requires more time (copying and moving data), bigger ships (3x storage capacity), more crew (IT resources), and more provisions (overhead). Now, imagine taking the EMC data lake route, which reduces overhead, takes much less time, and offers increased flexibility. The EMC Isilon data lake, with its native Hadoop Distributed File System (HDFS) support, is the modern route to actionable results. It effectively brings Hadoop to where your data exists today, as opposed to having to ship and replicate your data to a separate Hadoop stack for analysis.
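
Concretely, "in-place" means the compute layer connects to the data lake's HDFS endpoint directly rather than ingesting a copy first. A minimal sketch of that pattern, assuming a hypothetical SmartConnect zone name, port, user, and path (all placeholders), using the pyarrow HDFS client:

```python
# Minimal sketch of in-place HDFS access against a data lake.
# The zone name, port, user, and path are hypothetical placeholders.
from pyarrow import fs

hdfs = fs.HadoopFileSystem(
    "hdfs.isilon.example.com",  # assumed SmartConnect zone name
    port=8020,
    user="analyst",
)

# Read the data where it already lives -- no copying and no 3x
# re-replication into a separate Hadoop cluster.
with hdfs.open_input_stream("/ifs/data/clickstream/part-0001.csv") as stream:
    header = stream.read(1024)
print(header[:120])
```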

The Open Data Platform Initiative (ODPi), IBM and EMC Isilon

The Isilon data lake’s shared storage architecture natively supports HDFS and the ODPi common platform. IBM, EMC, Pivotal, and Hortonworks established the ODPi to create a standardized, common platform for Hadoop analytics that enables organizations to realize business results more quickly. Which brings us to the EMC and IBM analytics collaboration: because IBM BigInsights is part of the ODPi, there is now another choice for in-place analytics with the EMC data lake. It quickly became evident to both EMC and IBM that there was strong customer demand for IBM BigInsights and EMC Isilon to align on a data lake approach to analytics. The collaboration enables analytics on your data right where it is, within the EMC Isilon data lake, while IBM BigInsights provides the separate compute resources that analyze the data. Now you’re on the expedited route to business analytics with EMC Isilon and IBM.

Whether you are looking to gain a 360-degree view of your customers, attempting to prevent fraud in the financial markets, or making smarter infrastructure investments, the increased efficiencies of the partnership allows you to be nimble in understanding and reacting to what your data is telling you.

About 15,000 ships make the 11-hour journey through the canal each year. It’s estimated that the canal bears roughly 8 percent of the world’s shipping and is recognized as one of the most important waterways in the world. Forrester Research1 predicts big data analytics will be the number 2 priority of corporations, and states that Hadoop has already disrupted the economics of data. Just as the Suez Canal offers key business benefits for trade between Europe and Asia, so does in-place analytics. Here’s how:

  • No moving and copying of data
  • No 3X replication of data
  • Increased storage utilization efficiency (to an average of 80%)
  • Enterprise data resiliency and availability
  • Enterprise grade security features
  • Quicker time to business insight
  • Smarter infrastructure investments
  • Reduction of CAPEX and OPEX
  • Increased choice and flexibility

In summary, back to the metaphor: the modern route to analytics saves on time to benefit, and can be achieved with smaller ships, less crew, and fewer provisions.

Where can I get more details?

The EMC Hadoop Starter Kit for IBM BigInsights is available with instructions on how to build and deploy the IBM BigInsights Open Platform with EMC Isilon. You can also learn more about the Hadoop-enabled EMC Data Lake here.

1 Source: Forrester Predictions 2015: Hadoop Will Become a Cornerstone of Your Business Technology Agenda

Thoughts on the future of M&E: A wrap-up of interesting observations from SMPTE

Tom Burns & Charles Sevior

Chief Technology Officers at EMC Emerging Technologies Division

The EMC Media and Entertainment team recently returned from the 2015 SMPTE (Society of Motion Picture & Television Engineers) conference in Los Angeles. It was four jam-packed days of technical presentations, and we all came back with a greater understanding of the future of media workflows and thoughts on how we can help our customers take those next steps.

From the agenda we expected a lot of conversations around Virtual/Augmented Reality, the transition from SDI to IP, cloud and hybrid cloud, and future delivery methods. We weren’t disappointed; we had some great conversations with people throughout the industry. Here’s an overview:

Tom Burns – CTO M&E

At this year’s SMPTE Annual Technical Conference, I attended the “Broadcast Infrastructure” track, even though I really wanted to see what was happening with High Dynamic Range. (Psst – I found out that cinematic HDR is a reality at a few AMC Prime venues! Wait for a full subjective review in an upcoming post…)

The most exciting infrastructure trend I encountered (detailed in a number of papers and presentations at the SMPTE ATC) is the ability to replicate the real-time capability of Serial Digital Interface (SDI) video over coax within an all-IP plant. This feat is accomplished via IP encapsulation using a group of unmodified switches in an Ethernet fabric (often in a leaf-and-spine topology).

Sending high-quality video via a point-to-point IP connection has been possible for a while, at a range of prices, quality settings, and codecs. The last technical hurdle, however, is to provide frame-accurate switching of an SD, HD, or UHD video signal with embedded audio, timecode, genlock, and other ancillary data.

The IEEE 1588 Precision Time Protocol (PTP) was presented for our enjoyment, with the addition of SMPTE extensions to become ST 2059, a protocol for genlock over IP networks. A new slate of SMPTE standards for video transport over IP networks, the ST 2022 family (parts 1–7), was also detailed.
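
The core idea behind ST 2059 is that every device derives the same frame alignment from the shared PTP clock by counting frame periods from a common epoch, which is what makes frame-accurate switching over IP possible. A toy illustration of that arithmetic (not the spec's actual math) for 29.97 fps:

```python
# Toy sketch of the ST 2059 idea: devices sharing a PTP clock can all agree
# on the next frame boundary by counting frame periods from a common epoch.
# The numbers are illustrative, not the full spec calculation.
from fractions import Fraction

def next_frame_boundary(ptp_seconds: Fraction, frame_rate: Fraction) -> Fraction:
    """Seconds (since the shared epoch) of the next frame boundary."""
    period = 1 / frame_rate
    frames_elapsed = ptp_seconds // period      # whole frames since epoch
    return (frames_elapsed + 1) * period

# 29.97 fps (30000/1001) at an arbitrary PTP timestamp:
now = Fraction(1446336000)
boundary = next_frame_boundary(now, Fraction(30000, 1001))
print(float(boundary - now))                    # seconds until the next frame
```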

Charles Sevior – CTO APJ

Cloud
The “Cloud” is really starting to change the way broadcast IT technology teams think about storage and compute infrastructure, and the distribution of rich media content, for both B2B and B2C requirements. I spent a full day in the SMPTE TC Cloud track, consuming and questioning speakers from AWS, Sundog Media, ETC (Entertainment Technology Center) at USC, Telestream, and Levels Beyond.

There are plenty of advocates from the cloud industry pointing to global and cost-effective solutions, consuming resources on demand, and so on. There seems little doubt that in a few years the dedicated racks of carefully constructed equipment and software stacks powering most media companies will be replaced with general-purpose technology, operating systems, and application stacks.

I personally think these application stacks will tend towards a hybrid of on-premise and off-premise infrastructure, with the off-premise portion perhaps also a hybrid of private specialist hosting and public service providers. That decision is primarily guided by cost and expertise, which is in turn guided by the service provider’s “value-add” in terms of high-speed connectivity to the content provider or recipient (I am thinking here of the difference between multiple uncompressed HD/UHD feeds from sports venues and delivery to a viewer’s device, broadcast or unicast).

Ultimately, as universally adopted cloud stack frameworks arrive, it will not be difficult to spin them up and down on different platforms in different environments. This work is progressing well, and I am pleased to see that EMC is well-positioned to deliver this technology, whether it be cloud storage, cloud computing, or open-source frameworks that provide resources to third-party vendor application solutions. Stay tuned for more announcements on this in 2016!

Anyone who brings a digital file-based workflow solution into a dynamic media organization, such as a live newsroom, knows that the hardest problem to solve is the file naming convention. Most facilities have their own bespoke solutions, and that doesn’t really matter as long as the convention is documented and everybody follows the rules! However, as our file counts grow into the billions, and automatic conform and transcode processes constantly create new files, file naming remains a big problem, one that SMPTE has been working to solve and standardize.

One very cool concept that was presented (and still has my head spinning) came from Joshua Kolden of ETC@USC. He presented a solution more advanced than a UMID or MD5 hash: it produces a unique 90-character, human- and machine-readable code for every single file on the planet. The short summary is below, along with the link to the paper – recommended reading!

“The C4 ID system provides an unambiguous, universally unique ID for any file or block of data. However, not only is the C4 ID universally unique, it is also universally consistent. This means that given identical files in two different organizations, both organizations would independently agree on the C4 ID, without the need for a central registry or any other shared information.” – ETC@USC
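
The key property is that the ID is derived purely from the file’s bytes, so anyone hashing the same content independently arrives at the same ID. A rough sketch of the idea (not the official C4 spec; the real system builds on SHA-512 with its own base-58 encoding rules, and the alphabet and padding below are assumptions):

```python
# Toy content-addressable ID in the spirit of C4: 'c4' + base58(SHA-512(data)).
# The alphabet, padding, and exact encoding here are illustrative assumptions,
# not the official C4 specification.
import hashlib

ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def c4_style_id(data: bytes) -> str:
    """Return a fixed-width 90-character ID derived only from the bytes."""
    n = int.from_bytes(hashlib.sha512(data).digest(), "big")
    chars = []
    while n:
        n, rem = divmod(n, 58)
        chars.append(ALPHABET[rem])
    body = "".join(reversed(chars)).rjust(88, ALPHABET[0])
    return "c4" + body

# Two organizations hashing identical bytes agree with no central registry.
assert c4_style_id(b"same content") == c4_style_id(b"same content")
print(len(c4_style_id(b"example")))  # 90
```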

OTT
Delivery of content directly to consumers “over the top” of the Internet is what we all experience these days when we watch media on YouTube, Facebook, Netflix, or any of the myriad other platforms. It is one of the biggest consumption growth patterns our industry is tracking, and every traditional broadcaster is actively making content available via OTT platforms. It is both a threat and an opportunity, and a major disruption to what has been a pretty stable advertising- and subscription-funded business model over the past decades.

There were three thought-provoking sessions covering what SMPTE described as “the wild west.” Prime Focus Technologies, the India-based media platform solution provider, presented a dynamic metadata tagging solution for live sports content creation that dramatically increased the “speed to screen” from live event to mobile catch-up consumption. Comcast spent some time delving into the real-time packaging and repurposing of linear content for OTT distribution and consumption, including just-in-time packaging and dynamic ad insertion (server-side vs. client-side). Everything has to be just right to deliver a good viewer experience with no buffering or pauses. The final session was a student paper from USC; the student, Arnav Mendiratta, was also honored by SMPTE as the 2015 recipient of the Louis F. Wolf Jr. Memorial Scholarship. He explored the application and benefits of big data analytics (such as the Hadoop ecosystem) to improve viewer satisfaction and increase monetization.

VR
The pre-symposium conference track was dedicated to the emerging technology and consumer category of Virtual Reality / Augmented Reality. You will be familiar with this as typified by somebody strapping on a viewing headset to enjoy a role-playing game. While currently in the realm of gaming, this may extend into movies and television as the logical progression beyond stereoscopic (3D) viewing technology. It is extraordinary to contemplate the storage, computing, and bandwidth issues when you consider that each “camera” is now a 360-degree dome with between 12 and 30 HD/UHD cameras running at a high frame rate (>50 fps). 3D computer stitching of all cameras for every frame creates a high-resolution 360-degree “canvas,” and unicast delivery of this to every viewer, each free to choose their angle of view and direction in real time, means extremely high data rates and storage requirements (see the back-of-envelope sketch below). These are problems that will take some time to become commercially viable.
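
To put rough numbers on that, here is a back-of-envelope calculation for an illustrative rig; the camera count, bit depth, and frame rate are assumptions within the ranges above, not figures from any real product.

```python
# Back-of-envelope raw data rate for a hypothetical VR capture rig.
# All parameters are illustrative assumptions within the ranges above.
cameras = 24                  # dome rigs run roughly 12-30 cameras
width, height = 3840, 2160    # UHD per camera
bytes_per_pixel = 2.5         # ~10-bit 4:2:2, uncompressed
fps = 60                      # high frame rate (> 50 fps)

rate = cameras * width * height * bytes_per_pixel * fps  # bytes/second
print(f"{rate / 1e9:.1f} GB/s raw capture")              # ~29.9 GB/s
print(f"{rate * 3600 / 1e12:.0f} TB per hour")           # ~107 TB/hour
```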

However, as my thoughts turned to what it would be like to enjoy my favorite sports event from a virtual seat hovering close to the on-field umpire, my immediate concern was: how do I reach my beer and drink it while wearing a headset, without spilling a drop? Such is the serious nature of these practical concerns. (I discovered that those on the inside are actually working on this problem by designing a “beer caddy” with the electronic visibility of a game controller.) Maybe VR technology does have a future!

Media and entertainment is such an important aspect of our lives, and the technology to create, deliver, and archive media continues to drive towards ever more efficient workflows. It’s clear that the media industry continues to evolve; these are just a few of the technologies transforming our industry today.

 
