
Need 32 Gb/s Fibre Channel? Got Connectrix MDS 9700s? You’re All Set…Almost!

Deirdre Wassell

On April 11, 2017 Cisco announced the availability of a 48-port 32Gb/s switching module for their MDS 9700 enterprise director series.

Why is this important to Dell EMC Connectrix customers? Dell EMC resells and services Cisco MDS products under the Dell EMC Connectrix brand. Many of our largest storage customers choose Connectrix MDS for their storage networking needs. If you are a Connectrix MDS 9700 customer, you will be able to upgrade to 32Gb/s Fibre Channel while preserving your current investment.

Since Cisco entered the storage networking business in 2003, their philosophy has been to design for the long term to sustain their customers’ investments as Fibre Channel (FC) speed bumps occur. The MDS 9500 series, for example, supported 4Gb/s and 8Gb/s FC. With this philosophy, there’s no need for full-chassis or “fork-lift” upgrades.

Dell EMC expects to release the new Connectrix MDS switching module in the third calendar quarter of 2017. Stay tuned for more details about new features and functionality coming soon.

Cisco MDS 48-port 32Gb/s Switching Module for the MDS 9700 Series

Galaxy: A Workflow Management System for Modern Life Sciences Research

Nathan Bott

Healthcare Solutions Architect at EMC

Am I a life scientist or an IT data manager? That’s the question many researchers are asking themselves in today’s data-driven life sciences organizations.

Whether it is a bench scientist analyzing a genomic sequence or an M.D. exploring biomarkers and a patient’s genomic variants to develop a personalized treatment, researchers are spending a great amount of time searching for, accessing, manipulating, analyzing, and visualizing data.

Organizations supporting such research efforts are trying to make it easier to perform these tasks without the user needing extensive IT expertise and skills. This mission is not easy.

Focus on the data

Modern life sciences data analysis requirements are vastly different than they were just a handful of years ago.

In the past, once data was created, it was stored, analyzed soon after, and then archived to tape or another long-term medium. Today, not only is more data being generated, but the need to re-analyze that data means it must be retained where it can be easily accessed for longer periods.

Additionally, today’s research is much more collaborative and multi-disciplinary. As a result, organizations must provide an easy way for researchers to access data, ensure that results are reproducible, and provide transparency to ensure best practices are used and that procedures adhere to regulatory mandates.

Analytics and collaboration are two areas where The Galaxy Project (also known simply as Galaxy) can help. Galaxy is a scientific workflow, data integration, and data and analysis persistence and publishing platform designed to make computational biology accessible to research scientists who do not have computer programming experience.

In practice, Galaxy serves as a general-purpose bioinformatics workflow management system that automatically tracks and manages data while capturing the context and intent of computational methods.

Organizations have several ways to make use of Galaxy. They include:

Free public instance: The Galaxy Main instance is available as a free public service at UseGalaxy.org. This is the Galaxy Project’s primary production Galaxy instance and is useful for sharing or publishing data and methods with colleagues for routine analysis or with the larger scientific community for publications.

Anyone can use the public servers, with or without an account. (With an account, data quotas are increased and full functionality across sessions opens up, such as naming, saving, sharing, and publishing Galaxy-defined objects).

Publicly available instances: Many other Galaxy servers besides Main have been made publicly available by the Galaxy community. Specifically, a number of institutions have installed Galaxy and have made those installations either accessible to individual researchers or open to certain organizations or communities.

For example, the Centre de Bioinformatique de Bordeaux offers a general purpose Galaxy instance that includes EMBOSS (a software analysis package for molecular biology) and fibronectin (diversity analysis of synthetic libraries of a Fibronectin domain). Biomina offers a general purpose Galaxy instance that includes most standard tools for DNA/RNA sequencing, plus extra tools for panel resequencing, variant annotation, and some tools for Illumina SNP array analysis.

A list of the publicly available installations of Galaxy can be found here.

Do-it-yourself: Organizations also have the choice of deploying their own Galaxy installations. There are two options: an organization can install a local instance of Galaxy (more information on setting up a local instance of Galaxy can be found here), or Galaxy can be deployed to the cloud. The Galaxy Project supports CloudMan, a software package that provides a common interface to different cloud infrastructures.
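For teams that want to script against any of these options, every Galaxy server also exposes a REST API. The following is a minimal sketch using the BioBlend Python client; the server URL and API key are placeholders, and the exact fields returned may vary by Galaxy release.

from bioblend.galaxy import GalaxyInstance

# Placeholder URL and API key; any Galaxy server (public or local) exposes the same API.
gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

# List the workflows visible to this account.
for wf in gi.workflows.get_workflows():
    print(wf["id"], wf["name"])

# Create a history to hold the results of a new analysis.
history = gi.histories.create_history(name="example-analysis")
print("Created history:", history["id"])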

How it works

Architecturally, Galaxy is a modular, Python-based web application that provides a data abstraction layer to integrate with various storage platforms. This allows researchers to access data on a variety of storage back-ends, such as standard direct-attached storage, S3 object-based cloud storage, storage management systems like iRODS (the Integrated Rule-Oriented Data System), or a distributed file system.

For example, a Galaxy implementation might use object-based storage such as that provided by Dell EMC Elastic Cloud Storage (ECS). ECS is a software-defined, cloud-scale object storage platform that combines the cost advantages of commodity infrastructure with the reliability, availability, and serviceability of traditional storage arrays.

With ECS, any organization can deliver scalable and simple public cloud services with the reliability and control of a private-cloud infrastructure.

ECS provides comprehensive protocol support, like S3 or Swift, for unstructured workloads on a single, cloud-scale storage platform. This would allow the user of a Galaxy implementation to easily access data stored on such cloud storage platforms.
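Because ECS speaks standard object protocols, any S3-compatible client can read and write this data. The snippet below is an illustrative sketch using the Python boto3 library; the endpoint, credentials, and bucket name are placeholders rather than actual ECS values.

import boto3

# Placeholder endpoint and credentials; an ECS deployment (or ECS Test Drive account)
# would supply real values. The calls themselves are standard S3 API calls.
s3 = boto3.client(
    "s3",
    endpoint_url="https://your-ecs-endpoint.example.com",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.create_bucket(Bucket="galaxy-datasets")
s3.put_object(Bucket="galaxy-datasets", Key="runs/sample1.fastq", Body=b"...")

# List what the bucket now holds.
for obj in s3.list_objects_v2(Bucket="galaxy-datasets").get("Contents", []):
    print(obj["Key"], obj["Size"])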

With ECS, organizations can easily manage a globally distributed storage infrastructure under a single global namespace with anywhere access to content. ECS features a flexible software-defined architecture that is layered to promote limitless scalability. Each layer is completely abstracted and independently scalable with high availability and no single points of failure.

Get first access to our Life Sciences Solutions

You can test drive Dell EMC ECS by registering for an account and getting access to our APIs at https://portal.ecstestdrive.com/.

Or you can download the Dell EMC ECS Community Edition here and try it for FREE in your own environment, with no time limit for non-production use.

At the Speed of Light

Keith Manthey

CTO of Analytics at EMC Emerging Technologies Division

For the last year, an obvious trend in analytics has been emerging: batch analytics are getting bigger and real-time analytics are getting faster. This divergence has never been more apparent than it is now.

Batch Analytics

Batch analytics primarily comprise descriptive analytics, massive-scale analytics, and the development of models that are later deployed online. Descriptive analytics are still the main purview of data warehouses, but Hadoop has expanded the ability to ask “what if” questions across far more data types and with far more analytics capabilities. Some Hadoop descriptive analytics installations have reached rather massive scale.

The successes of massive-scale analytics are well documented. Cross-data analytics (such as disease detection across multiple data sets), time-series modeling, and anomaly detection rank as particularly impressive because of their depth of adoption across several verticals. The instances of healthcare analytics with Hadoop in the past year alone are numerous, and they show the potential of this use case to provide remarkable insights into caring for our aging population as well as treating rare, highly individual diseases.

Model development is an application that highlights the groundbreaking potential unlocked by Hadoop’s newest capabilities and analytics. Creating real-time models based upon trillions of transactions for a hybrid architecture is a good example of this category. Because only a tiny percentage of daily records are actual fraud, trillions of transactions are required before a fraud model can be certified as effective. The model is then deployed into production, which is often a real-time system.
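As a rough illustration of that batch-train, real-time-score pattern, the sketch below trains a classifier offline and then scores individual transactions one at a time. It uses scikit-learn with synthetic placeholder data and is not tied to any particular customer’s pipeline.

import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Batch phase: train on historical transactions (synthetic stand-in features and labels).
X_hist = np.random.rand(100_000, 8)            # engineered transaction features
y_hist = np.random.rand(100_000) < 0.001       # fraud is a tiny fraction of records
model = RandomForestClassifier(n_estimators=50).fit(X_hist, y_hist)
joblib.dump(model, "fraud_model.joblib")       # hand the certified model to the real-time tier

# Real-time phase: load the model once, then score each transaction as it arrives.
scorer = joblib.load("fraud_model.joblib")

def score_transaction(features):
    """Return the model's estimated fraud probability for one transaction."""
    return scorer.predict_proba(np.asarray(features).reshape(1, -1))[0, 1]

print(score_transaction([0.2] * 8))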

One data point behind my belief that batch is getting “bigger”: I have been engaged with no fewer than 10 Hadoop clusters that have crossed the 50 PB threshold this year alone. In each case, the cluster has hit a logical pause point, causing the customer to re-evaluate the architecture and operations. The catalyst may be cost, scale, limitations, or something else, and these are often the moments when I get involved. Not every client reaches these catalysts at a consistent size or time, so it is striking that 10 clusters greater than 50 PB have hit this point in 2017 alone. Nonetheless, customers continue to set all-time records for Hadoop cluster size.

Real-Time Analytics

While hybrid analytics were certainly in vogue last year, real-time or streaming analytics appear to be the hottest trend of late. Real-time analytics, such as efforts to combat fraudulent authorizations, are not new endeavors. So why is the latest big push for streaming analytics the “new hot thing” now? There are several factors at play.

Data is growing at an ever-increasing rate. One contributing factor can be categorized as “to store or not to store.” While this step usually takes place in conjunction with more complex processes, it clearly involves some form of analytics to decide whether the data is useful. Not every piece of data is valuable, and an enormous amount of data is being generated. Determining whether a particular artifact of data is worth landing in batch storage is one use for real-time analytics.

Moving up the value chain, the more significant factor is that the value proposition of real-time often far outweighs that of batch. This doesn’t mean that batch and real-time are de-coupled or no longer symbiotic. In high-frequency trading, fraud authorization detection, cyber security, and other streaming use cases, gaining insights in real time rather than days later can be especially critical. Real-time systems have historically not relied upon Hadoop for their architectures, which has not gone unnoticed in the traditional Hadoop ecosystem, particularly around Spark. The University of California, Berkeley recently shifted the focus of its AMPLab to create RISELab, greenlighting projects such as Drizzle that aim to bring low-latency streaming capabilities to Spark. The ultimate goal of Drizzle and RISELab is to increase the viability of Spark for real-time, non-Hadoop workloads. The emphasis on lower-latency tools will certainly escalate the usage of streaming analytics as real time continues to get “faster.”
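To make the streaming pattern concrete, here is a minimal Spark Structured Streaming sketch that filters a synthetic event stream and aggregates it in one-second windows. The “rate” source and the filter predicate are placeholders for illustration; a production pipeline would read from a real message bus and apply a genuine business rule.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# The built-in "rate" source generates synthetic (timestamp, value) rows for demos.
events = spark.readStream.format("rate").option("rowsPerSecond", 100).load()

# Keep only the "valuable" records (placeholder predicate) and count them per 1-second window.
valuable = events.filter(col("value") % 10 == 0)
counts = valuable.groupBy(window(col("timestamp"), "1 second")).count()

query = (counts.writeStream
         .outputMode("update")
         .format("console")
         .trigger(processingTime="1 second")
         .start())
query.awaitTermination()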

The last factor is the “Internet of Everything,” often referred to as “IoT” or “M2M.” While sensors are top of mind, most companies are still finding their way in this new world of streaming sensor data. Highly advanced use cases and designs are already in place, but the installations remain bespoke and limited in nature; mass adoption is still a work in progress. The theoretical value of this data for governance analytics or for improving business operations is massive. Given the sheer volume of data being generated, landing everything in batch storage is not feasible at scale, so most IoT analytics are streaming-based. The value proposition is truly outstanding even though IoT analytics remain in the hype phase, and the enthusiasm and spending are pushing toward full-scale deployment regardless.

In closing, the divergence between batch and online analytics is growing. The symbiotic relationship remains strong, but the architectures are quickly separating. Most predictions from IDC, Gartner, and Forrester indicate streaming analytics will grow at a far greater rate than batch analytics, largely for the reasons above. It will be interesting to see how this trend continues to manifest itself. Dell EMC is always interested in learning more about specific use cases, and we welcome your stories on how these trends are impacting your business.

Overcoming the Exabyte-Sized Obstacles to Precision Medicine

Wolfgang Mertz

CTO of Healthcare, Life Sciences and High Performance Computing

As we make strides towards a future that includes autonomous cars and grocery stores sans checkout lines, concepts that once seemed reserved only for utopian fiction, it seems there’s no limit to what science and technology can accomplish. It’s an especially exciting time for those in the life sciences and healthcare fields, with 2016 seeing breakthroughs such as a potential “universal” flu vaccine and CRISPR, a promising gene editing technology that may help treat cancer.

Several of Dell EMC’s customers are also making significant advances in precision medicine, the medical model that focuses on using an individual’s specific genetic makeup to customize and prescribe treatments.

Currently, physicians and scientists are researching myriad applications for precision medicine, including oncology, diabetes, and cardiology. Before we can realize the vision President Obama shared in his 2015 Precision Medicine Initiative of “the right treatments at the right time, every time, to the right person,” there are significant challenges to overcome.

Accessibility

For precision medicine to become available to the masses, researchers and doctors will need not only the technical infrastructure to support genomic sequencing, but also the storage capacity and resources to access, view, and share additional relevant data. They will need visibility into patients’ electronic health records (EHR), along with information on environmental conditions, lifestyle behaviors, and biological samples. While increased data sharing may sound simple enough, the reality is that there is still much work to be done on the storage infrastructure side to make this possible. Much of this data is typically siloed, which impedes healthcare providers’ ability to collaborate and review critical information that could impact a patient’s diagnosis and treatment. To fully take advantage of the potential life-saving insights available from precision medicine, organizations must implement a storage solution that enables high-speed access anytime, anywhere.

Volume

Another issue to confront is the storage capacity needed to house and preserve the petabytes of genomic data, medical imaging, EHR, and other data. Thanks to the decreased cost of genomic sequencing and the growing number of genomes being analyzed, the sheer volume of genomic data alone is quickly eclipsing the storage available in most legacy systems. According to a scientific report by Stephens et al. published in PLOS Biology, between 100 million and two billion human genomes may be sequenced by 2025. This could drive storage demands of 2 to 40 exabytes, since storage requirements must take into consideration the accuracy of the data collected. The paper states that, “For every 3 billion bases of human genome sequence, 30-fold more data (~100 gigabases) must be collected because of errors in sequencing, base calling and genome alignment.” With this exponential projected growth, scale-out storage that can simultaneously manage multiple current and future workflows is necessary now more than ever.
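As a rough, back-of-envelope illustration of how those figures reach exabyte scale, the short sketch below multiplies the paper’s genome counts and per-genome data volume by an assumed storage density. The bytes-per-base figure is an assumption for illustration only; real requirements depend on file formats, quality scores, and compression, which is why the published estimate spans such a wide range.

# Back-of-envelope storage estimate; the constants below are illustrative assumptions.
GENOMES = (100e6, 2e9)        # 100 million to 2 billion genomes by 2025 (per the paper)
BASES_PER_GENOME = 100e9      # ~100 gigabases collected per genome at ~30-fold redundancy
BYTES_PER_BASE = 0.25         # assumption: ~2 bits per base after compression, ignoring quality scores

for n in GENOMES:
    exabytes = n * BASES_PER_GENOME * BYTES_PER_BASE / 1e18
    print(f"{n:.0e} genomes -> ~{exabytes:.0f} EB of sequence data")
# Prints roughly 2.5 EB and 50 EB, the same order of magnitude as the paper's 2-40 exabyte range.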

Early Stages 

Finally, while it’s easy to get caught up in the excitement of the advances made thus far in precision medicine, we have to remember this remains a young discipline. At the IT level, there’s still much to be done around network and storage infrastructure and workflows in order to develop the solutions that will make this ground-breaking research readily available to the public, the physician community, and healthcare professionals. Third-generation platform applications need to be built to make this more mainstream. Fortunately, major healthcare technology players such as GE and Philips have undertaken initiatives to attract independent software vendor (ISV) applications. With high-profile companies willing to devote time and resources to supporting ISV applications, it is more likely that scientists will have access to more sophisticated tools sooner.

More cohort analyses such as Genomics England’s 100,000 Genomes Project must be put in place to ensure researchers have sufficient data to develop new forms of screening and treatment, and these efforts will also necessitate additional storage capabilities.

Conclusion

Despite these barriers, the future remains promising for precision medicine. With the proper infrastructure in place to provide reliable shared access and massive scalability, clinicians and researchers will have the freedom to focus on discovering the breakthroughs of tomorrow.

Get first access to our Life Sciences Solutions

Examining TCO for Object Storage in the Media and Entertainment Industry

The cloud has changed everything for the media and entertainment industry when it comes to storage. The economies of scale that cloud-based storage can support have transformed the way media organizations archive multi-petabyte amounts of media.

Tape-based multi-petabyte archives present a number of challenges, including a host of implementation and maintenance issues. Data stored on tape is not accessible until the specific tape is located, loaded onto a tape drive, and then positioned to the proper location on the tape. Then there is the physical footprint of the library frame and the real estate required for frame expansions – tape libraries are huge. This becomes all the more problematic in densely populated, major media hubs such as Hollywood, Vancouver, and New York.

At first, the public cloud seemed like a good alternative to tape, providing lower storage costs. But while it’s cheaper to store content in the public cloud, you must also factor in the high costs associated with data retrieval, which can be prohibitive given data egress fees. The public cloud also requires moving your entire media archive library to the cloud and giving up the freedom to use the applications of your choice. Suddenly the public cloud’s lower initial costs can add up to a significantly larger price to pay.

Object storage is emerging as a viable option that offers media companies a number of benefits and efficiencies that the public cloud and tape-based archives simply cannot provide. In fact, object storage is rapidly becoming mandatory for applications that must manage large, constantly growing repositories of media for long-term retention.

Dell EMC Elastic Cloud Storage (ECS) blends next-generation object storage with traditional storage features, offering the media and entertainment world an on-premises cloud storage platform that is cost-competitive with multi-petabyte tape libraries. ECS not only simplifies the archive infrastructure, it enables critical new cloud-enabled workflows not possible with a legacy tape library.

Instant Availability of Content

The greatest benefit of object storage for media and entertainment companies is the instant availability of their media content – you can’t access media on tape without a planned and scheduled retrieval from a robotic tape library. For a broadcast company, the delay in data availability could result in a missed air date, advertiser revenue loss, and legal fees.

With instant access to their entire archives, a whole new world of possibilities opens up for content creators. Archives aren’t often considered when it comes to content creation – accessing archived media has historically been difficult, and obtaining the data often takes far too long. However, with instant access to archived media, archives can effectively be monetized rather than just sitting on tape in a dark closet gathering dust. Being able to access all of your media content at any time allows rapid deployment of new workflows and new revenue opportunities. Further, with object storage, engineering resources that were focused on tape library maintenance can be re-focused on new projects.

Operational Efficiencies

Object storage can also offer increased operational efficiencies – eliminating annual maintenance costs, for example. One of the biggest – and least predictable – expenses of operating a tape library is maintenance. Errors on a tape library are commonplace; drive failures and the downtime to fix them can impact deadlines, cause data availability issues, consume valuable engineering time, and result in lost revenue.

Going Hot and Cold: Consolidation and Prioritization

Public cloud storage services can enable users to move cold or inactive content off of tier 1 storage for archiving, but concerns around security, compliance, vendor lock-in, and unpredictable costs remain. Cold content can still deliver value, and ECS allows organizations to monetize this data with an active archive that offers the same scalability and low-cost benefits, but without the loss of IT agility or the reliability concerns.

ECS allows organizations to consolidate their backup and archive storage requirements into a single platform. It can replace tape archives for long-term retention and near-line purposes, and surpass public cloud service for backup.

In the video below, Dell EMC’s Tom Burns and Manuvir Das offer some additional perspective on how the media and entertainment industry can benefit from object storage: 

Stay current with Media & Entertainment industry trends

Dispelling Common Misperceptions About Cloud-Based Storage Architectures

As the media and entertainment industry moves to 4K resolution and virtual/augmented content formats, the storage and archive requirements for media content have grown exponentially. But while storage requirements continue to skyrocket, industry revenue has not grown accordingly – and M&E organizations are finding themselves challenged to “do more with less.” More organizations are looking to leverage the cost efficiencies, scalability and flexibility that cloud storage can offer, but many remain apprehensive about taking the plunge.

To be clear, in this post when we talk about “the cloud,” we’re talking cloud architectures, versus the public cloud provided by vendors such as Microsoft, AWS and Google, among others. Unlike public clouds, cloud architectures can be used completely within your facility if desired and they are designed with infinite scalability and ease of access in mind.

There are a number of misperceptions about moving data to cloud architectures that are (wait for it) clouding people’s judgment. It’s time we busted some of the bigger myths and misperceptions out there about cloud storage.

Myth #1: I’ll have to learn a whole new interface – false! Dell EMC’s Elastic Cloud Storage (ECS) employs a tiered system, where it sits under a file system – in our case, Isilon. For organizations already deploying Isilon SAN or NAS storage platforms, the workflows stay exactly as they were, as does users’ interface to the file system.

This tiered approach helps companies to “do more with less” by allowing them to free up primary storage and consolidate resources. By tiering down “cold,” inactive data to ECS, you can better optimize your tier-one higher performance storage and drive down costs.
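To show the idea in the simplest possible terms, here is a standalone sketch of a cold-data sweep that copies files untouched for 90 days into an S3-compatible object bucket. This is an illustration of the tiering concept only – not how Isilon and ECS implement it, since their tiering is policy-driven within the platform – and the paths, endpoint, and bucket name are placeholders.

import os
import time
import boto3

# Placeholder values for illustration only.
SOURCE_DIR = "/mnt/primary/projects"
BUCKET = "cold-archive"
AGE_DAYS = 90

# Any S3-compatible endpoint works here; credentials are read from the environment.
s3 = boto3.client("s3", endpoint_url="https://your-ecs-endpoint.example.com")
cutoff = time.time() - AGE_DAYS * 86400

for root, _dirs, files in os.walk(SOURCE_DIR):
    for name in files:
        path = os.path.join(root, name)
        if os.stat(path).st_mtime < cutoff:          # untouched for 90+ days
            key = os.path.relpath(path, SOURCE_DIR)  # preserve the directory layout as the object key
            s3.upload_file(path, BUCKET, key)        # copy to object storage; deleting or stubbing is left out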

Myth #2: My data won’t be safe in the cloud – false! ECS features a geo-efficient architecture that stores, distributes and protects data both locally and geographically, eliminating any single point of failure and providing a seamless failover from site to site with no impact to business. Further, even though the data within ECS is distributed, it’s still a secure, private environment so users won’t run into scenarios where anyone can access information without the right credentials.

Myth #3: Collaboration and access is going to be negatively impacted – false! If you look at the VFX industry, for example, teams are frequently spread across the world and working across time zones on a 24/7 basis. ECS enables global teams to work on the same piece of data at the same time from one system – it’s true collaboration. ECS’s multi-site, active-active architecture and universal accessibility enables anywhere access to content from any application or device.

Myth #4: Moving to the cloud is an all-or-nothing approach – false! ECS can be deployed when your organization is ready for it – whether that’s in a month, or six months, or a year. We realize a lot of operations personnel like to “see” their data and know first-hand that it’s there. We get that. But as things evolve, it’s likely that organizations will face pressure to take at least some of their data offsite. With ECS, you can still keep your data in the data center and, when the time is right to take it off-site, Dell EMC can work with your organization to move your infrastructure to a hosted facility or a co-lo where you can continue to access your data just as you did when it was on-premises. ECS is available in a variety of form factors that can be deployed and expanded incrementally, so you can choose the right size for your immediate needs and projected growth.

Because it is designed with “limitless scale” in mind, ECS eliminates concerns about running out of storage. It can meet the needs of today’s M&E organizations, as well as those of the future, simply by adding additional storage – just as you used to do with tapes.

Hopefully we’ve been able to bust a few of the myths around adopting a cloud-based storage architecture. This video featuring Dell EMC’s Tom Burns and Manuvir Das can offer additional insight into ECS’s tiered approach and how media organizations can begin seeing benefits from day one.

Stay current with Media & Entertainment industry trends here or listen to Broadcast Workflows webcast recording.
