
Clivegold

EMC Forum 2013 Announced

Posted by Clivegold May 6, 2013

EMC Australia & New Zealand is excited to announce EMC Forum 2013, which marks the event's 10th year in Australia!

 

We have come a long way in ten years. In 2004, when the first EMC Forum was held in Australia, most PC manufacturers still provided floppy disks as standard, with each disk having a capacity of 1.44MB. If you were to use this technology to save the information the world generates in just one day in 2013, you would need 763 billion floppy disks!
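As a rough back-of-the-envelope check of that comparison (a sketch only; it assumes "1.44MB" means 1.44 million bytes per disk, and the daily data figure itself is not stated in the post), the 763 billion disks quoted above imply roughly an exabyte of new data every day:

```python
# Back-of-the-envelope check of the floppy-disk comparison above.
# Assumption (mine, not the post's): "1.44MB" is treated as 1.44 million bytes.

floppy_bytes = 1.44e6        # capacity of one 3.5" floppy disk, in bytes
floppies_per_day = 763e9     # the 763 billion disks quoted above

daily_data_exabytes = floppy_bytes * floppies_per_day / 1e18
print(f"Implied data generated per day: {daily_data_exabytes:.1f} EB")
# -> roughly 1.1 exabytes of new data every day
```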

 

EMC Forum is an industry-leading event for technology professionals to learn about the trends that are transforming industries, markets and organisations. It features dozens of exhibitors, keynotes from widely respected industry leaders and the opportunity to network with thousands of industry peers!

 

Last year’s event welcomed more than 2,000 delegates from Sydney and Melbourne, and built a discussion around the transformational nature of cloud and big data. This year at EMC Forum the discussion will focus on the ten years of technology which have brought us to this exciting point in time and how to lead your transformation into the next decade. 

 

Attendees will learn how to harness the latest IT developments, such as cloud computing, exploding data volumes and growing data security demands, to create more efficient organisations.

 

At the event, you will be given the opportunity to choose from over 20 technical sessions, product demonstrations, leadership presentations from local and international experts and numerous networking opportunities.

 

Register for free today!

Sydney: Tuesday, 13 August

Melbourne: Friday, 16 August

 

Connect with us to stay up to date on all announcements, session details and news from EMC Forum 2013! Here’s how:

The convergence of Big Data with mobility, cloud, and a social media-driven society has created unparalleled opportunities alongside an equally evolving threat landscape.

Art Coviello, RSA Executive Vice President and Executive Chairman, announced that Big Data Analytics will help security practitioners regain the advantages of vigilance and time over sophisticated attackers. More here

EMC’s updated, more powerful Data Protection Advisor 6 brings new capabilities to the table that streamline the monitoring and management of backup.

 

Learn how EMC solutions have helped companies strengthen their businesses with greater efficiency and lower costs. Through EMC solutions, Monde Nissin has increased efficiency in its operations by reducing its backup storage by 40 per cent. EMC has also helped strengthen Panduit's global presence, saving approximately $40,000 in hardware per remote office.

 

It has been a busy month for EMC product launches in Australia and New Zealand. We announced a new Flash software strategy and the introduction of XtremSF, a new family of Flash cards that will dramatically increase application performance.

 

EMC introduced Pivotal HD, the most flexible, scalable, inexpensive and fault-tolerant Apache Hadoop distribution ever developed.

 

We also launched VMAX Cloud Edition, a self-service, enterprise-class storage-as-a-service cloud delivery platform optimised for the private cloud, complete with automated provisioning and tiered service offerings.



The days of securing your corporate data via perimeter protection are over! The nature of attacks has evolved to the extent that you have to work under the premise that your perimeter has already been breached, and the Security Compliance Officer has to structure their defence on the assumption that the bad guys are already inside.

 

Traditional defences do not adequately address this problem, and intruders can be left to wander free in your network for weeks, or even months, before they are detected.

 

To face this challenge, RSA, the security division of EMC, has launched RSA Security Analytics. The revolutionary security monitoring system uses big data analytics to detect and aid the investigation of threats. We reveal how RSA Security Analytics can help organisations protect their most valuable asset - the company’s data.
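RSA has not published the internals of Security Analytics here, so the following is only an illustrative sketch of the general idea of baseline-versus-deviation analysis over authentication logs; the data, threshold and function name are all invented for the example and are not RSA's algorithm:

```python
# Purely illustrative: flag accounts whose login volume deviates sharply from
# their own historical baseline. This is NOT RSA Security Analytics, just a
# toy example of the kind of analysis a big-data security tool might run.
from statistics import mean, pstdev

# Daily login counts per user; in practice this would come from parsed logs.
history = {
    "alice": [12, 9, 11, 10, 13, 12, 10],
    "bob":   [3, 4, 2, 3, 5, 4, 3],
}
today = {"alice": 11, "bob": 41}   # bob's spike should stand out

def anomalies(history, today, threshold=3.0):
    """Return users whose activity today exceeds their historical mean
    by more than `threshold` standard deviations."""
    flagged = []
    for user, counts in history.items():
        mu, sigma = mean(counts), pstdev(counts) or 1.0
        if (today.get(user, 0) - mu) / sigma > threshold:
            flagged.append(user)
    return flagged

print(anomalies(history, today))   # ['bob']
```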

 

Groundbreaking legislation has been introduced in Asia Pacific that requires companies in certain industries to ensure their IT systems function under stress. You can read more on how security leaders are urging organisations to prepare for the big data revolution in information security in this blog post by Par Botes, CTO, EMC APJ.

A well-known Australian journalist, one of the doyens of IT news, coined the term "Storage is Snorage" about 10 years ago, and for the most part he has been right! EMC and the industry have improved the hardware by riding the price/performance curves of the underlying components: disks have grown 1000x larger, processors 10,000x faster and RAM far cheaper. However, until recently the basic architecture has not really changed!

But that has all changed and two new architectures reach prime-time this year!

The first is scale-out. Yes, I know Isilon has been in the market for over five years, but mostly in niche application areas. Now two trends are converging: mainstream computing environments are experiencing massive growth in unstructured data while the traditional architectures are creaking, and Isilon now has 'Enterprise' features.

A quick word of warning as you look around: the value is in the architecture, not in the mere fact that there is a single file system. I say this because whenever there is a major advance in technology, you get the 'horseless carriages': people who take the old technology, substitute some part of it and think it's all new (or, less kindly, you can put lipstick on a pig, but it's still a pig!). The reason I make this point is that 'traditional' storage, with its RAID groups and LUN size limits (or aggregates), is the source of the management nightmare when you scale to the petabyte level. Putting a wrapper or layer above it does not remove those management overheads. To do scale-out properly, you need to design from the ground up.
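To put a rough number on that management overhead (a sketch only; the 2TB per-LUN limit is my assumption for illustration, not a figure from any particular product):

```python
# Why RAID-group/LUN-based designs become a management burden at PB scale.
# The 2 TB per-LUN limit is an assumed, illustrative figure.

capacity_pb = 1        # total usable capacity, in petabytes
lun_limit_tb = 2       # assumed per-LUN size limit

luns_needed = capacity_pb * 1024 / lun_limit_tb
print(f"{luns_needed:.0f} LUNs to carve, map, zone and monitor per PB")
# -> 512 separate objects to manage, before you even get to RAID groups
```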

Talking about ground-up design brings me to the second exciting architecture this year: the all-flash array. Once again, all storage designed before this had one overriding design consideration: locality matters! Because of mechanical drives, the position of the head is a major determinant of performance and throughput. (Throw random requests at a disk drive and it will perform like a stuck pig; order them and it screams!)
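Here is a toy illustration of that point (a sketch, not a real drive model: seek cost is crudely approximated as the distance the head travels between block addresses):

```python
# Toy model of "order the requests and it screams": approximate seek cost as
# the head-travel distance between logical block addresses. Not a real drive
# simulator, just an illustration of why locality matters on spinning disks.
import random

random.seed(1)
requests = [random.randrange(1_000_000) for _ in range(10_000)]  # random LBAs

def total_seek_distance(order):
    head, travelled = 0, 0
    for lba in order:
        travelled += abs(lba - head)
        head = lba
    return travelled

print(f"random order: {total_seek_distance(requests):,} units of head travel")
print(f"sorted order: {total_seek_distance(sorted(requests)):,} units of head travel")
# Sorted requests sweep the address range roughly once; random requests travel
# orders of magnitude further. On flash there is no head, so neither matters.
```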

Now imagine starting from scratch and designing around a storage medium that has no locality, with no penalty for writing to or reading from anywhere. That is a major change in design. And RAID? That becomes an absurd notion: there is no 'disk' involved, just an address space!

Preparing for a talk this week, I went searching for the latest estimates of how large the Australian cloud market is. Two companies are often quoted in the IT press as the market analysts: Gartner and IDC. True to form, I found two recent estimates: IDC says A$2.33 billion by 2016 (a CAGR of 24.8%) (IDC here), but Gartner says it's already there, at $2.4 billion in 2012 (Gartner here).

See the issue? They would argue they count different things, so you can't really compare the numbers, which I kind of understand. But what is the size of the market? Maybe that doesn't really matter; what is important is that it is now substantial and growing fast!
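For what it's worth, the two figures really cannot be describing the same market. A quick sketch (assuming, which IDC may not, that the 24.8% CAGR runs over the four years from 2012 to 2016):

```python
# Rough reconciliation of the two analyst figures quoted above.
# Assumption (mine, not IDC's): the 24.8% CAGR applies from a 2012 base.

idc_2016 = 2.33    # A$ billion, IDC's 2016 forecast
cagr = 0.248
years = 4

implied_2012 = idc_2016 / (1 + cagr) ** years
print(f"IDC's forecast implies a 2012 market of about A${implied_2012:.2f}B")
# -> roughly A$0.96B for 2012, versus Gartner's $2.4B for the same year;
#    clearly the two firms are not counting the same things.
```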

What I'm seeing is that 'cloud' is now mainstream; just about everyone I speak to has deployed at least one cloud service. To generalise: SMBs are almost totally cloud, and why would you purchase and look after infrastructure if you were a small business or a start-up? In the mid-tier, the cloud seems to be the second datacentre, used for backup and DR. And at the high end, the cloud has become the third datacentre (perhaps active-active between the two primary datacentres, with the cloud providing the third for test, dev and triangulated DR).

Confidence is building as well, though perhaps not in the traditional public cloud providers as fast as it is growing in the home-grown, enterprise-centric providers like Telstra, Optus and others. (In fact, the IDC article above says 70% of their survey group agree with this.)

Anyone who is still riding the data-sovereignty and "Patriot Act" rationale for not adopting cloud is breeding shadow IT in their organisation, which is very dangerous! There is now nothing stopping any user from pulling out the corporate credit card and setting up that Dropbox or AWS service to get something done quickly. Then IT has no control and data security is totally breached! Providing a trusted, secure, enterprise-grade facility that is readily available in Australia from some of Australia's most trusted brands does seem more logical to me.

Now, if I have convinced you to take the leap into the Australian cloud, be aware of another prediction: 30% of cloud suppliers will be out of business by 2015 (IDC). So when choosing a vendor you still need to do your due diligence, because no matter how good your contract is, once an organisation goes under, your data, systems and processes go with it!

What’s your experience with the Australian Cloud?

Occasionally, we get very clear signs that significant change is in the air.  When large sums of money unexpectedly change hands, people take notice that something interesting and perhaps unexpected is happening.


This morning, GE announced that they were investing $105m to take a significant stake in Pivotal, the new initiative jointly owned by EMC and VMware.

 

On one hand, we have GE: perhaps one of the best examples of an exceedingly well-run global corporation with a market cap of over $220B.  On the other hand, we have Pivotal: a nascent analytics platform company formed from EMC and VMware assets, with an exceptional leader at the helm: Paul Maritz.

 

Why would a well-resourced and exceptional global corporation take a significant stake in what might appear to be a technology startup venture? 

 

And what might this signal going forward?


Context Matters

I need to start with a disclaimer: I have no "inside knowledge" whatsoever in regards to this.  Consider what follows only informed speculation on my part.

 

That being said, let's dive in ...

 

The first big idea in play is the "internet of things", or what GE calls the "machine internet".

 


The first wave of the internet was driven by people interacting with computers in new and different ways.  We saw the seismic changes that occurred, and are still happening to this day.

The second wave is now upon us: machines talking to machines in new and different ways.  Not to overstate the obvious, but there are many more machine sensors than people, and they don't get tired, enter into flame wars, embarrass themselves on Facebook, etc.

 

The result is data -- and lots of it.  Incredibly rich, valuable and relatively new data sources that can be harnessed in ways that we can only begin to imagine.

Enter the second big idea -- big data analytics, or more specifically -- an opportunity to harvest and monetize all that data in exciting, novel and occasionally revolutionizing ways.

 

Early pioneers in this space had to basically roll their own platforms to do this work.  It wasn't easy, but they persevered and are now starting to clearly reap the rewards.  But not every potential beneficiary of big data analytics applications has the required resources to follow their path.

 

Which brings us to our third big idea: the compelling goal of Pivotal is to create an extensible and industrialized software platform that brings these capabilities to businesses and organizations everywhere.

It's one of those "we can change the world" missions.  I, for one, am completely bought in.

 

Three big ideas: the internet of things, monetizing data through a new generation of big data analytics applications, and an industrial-strength platform to do it on.

 

Consider GE


If you follow Jeff Immelt (and you should), it is obvious that he is on a mission to completely reposition GE's business model for this new world.

It is not an experiment or a side show -- it's the core strategy of the company going forward.  While only an exceptional leader can take on a challenge of this magnitude at GE scale -- he is clearly making progress.


Digging deeper, if you consider the core vertical industries where GE plays, it's not hard to be dazzled by the amazing transformational potential of big data analytics in each and every one of them.  It is a tide that raises each and every boat in the GE harbor.

 

The GE leadership team clearly has the big data analytics bug.  They're not the first, and they won't be the last.


Build, Buy or Partner?


While GE certainly has the resources to build their own software platforms, it's not one of their core competencies at present.  I believe that GE would want to focus their energies on using and exploiting the technology, and not hand-crafting several million lines of code.

 

If today you went looking for products to buy to create the required platform capabilities, it'd be more of the same: you'd find bits and pieces here and there -- and you'd have to invest massively to create the required industrialized capabilities.




They're just not in the marketplace today -- that's one of the reasons Pivotal was formed.

 

That leaves you with a partnering option.  Here again, your potential choices are limited to a handful of familiar names: perhaps IBM, Oracle, maybe Microsoft or an SI partner.  And, of course, the new Pivotal venture.

 

Once you've selected your partner, you'll want to partner deep.  Remember, GE is betting their future business model on having a platform that's capable of serving all their potential needs -- now, and into the future.  Nothing says "partnership" quite like investing more than $100m to take a meaningful stake.

 

Laid out this way, it's not quite so surprising GE did what they did.   It meets a strategic business requirement, and will probably turn out to be a smart balance sheet investment in its own right.


The Pivotal Impact


The analyst community was duly impressed by the announcement.  John Furrier declared Pivotal the "new superpower".  Steve Duplessie was unusually at a loss for words, except "Wow, wow, wow".  And I'm sure there will be more responses along this line.


Not that Pivotal needed any more attention or focus -- they've got plenty of that already -- but this announcement took the buzz to a whole new level.



 

It's a largely unprecedented model: a large, strategic user of technology taking a sizable investment stake in the technology's provider, versus simply buying the product, or perhaps buying the provider.


Pivotal also gets a unique opportunity to closely partner with a very motivated and challenging customer without the usual inefficiencies associated with the traditional vendor/customer relationship.  Both parties own a big stake in the successful outcome -- at the CEO level.  Speaking solely as a technology vendor, that's a very valuable opportunity indeed.

 

But these are interesting times indeed.

 

And interesting times call for innovative approaches.


By Peter Smalls – Senior Director, Product Marketing, Backup Recovery Systems Division at EMC

Today, EMC announced updates to EMC Data Domain deduplication storage systems and EMC SourceOne archiving software, changing the data protection game by delivering enhanced backup and archiving performance and efficiency, an expanded partner ecosystem, and reduced cost and complexity.

There is no more important job for an IT department than protecting its company’s data.  But IT faces the Herculean task of ensuring that data is both RECOVERABLE and ACCESSIBLE for as long as needed, all while dealing with limited resources and flat budgets.  That’s why EMC delivers data protection solutions for backup AND archiving, helping organizations address their recoverability and accessibility requirements while reducing cost and complexity through consolidation.


Today’s announcement expands Data Domain’s market leadership in the purpose-built-backup-appliance market, delivering:

  • Oracle optimized deduplication – enabling customers to easily integrate Data Domain into multiplexed Oracle RMAN deployments while maximizing deduplication rates.
  • Data Domain Boost over Fibre Channel – enabling faster, more efficient backup and simplified management in Fibre Channel environments.
  • Enhanced Data Domain Boost for Symantec NetBackup – simplifying cross-domain disaster recovery and file system backup in NetBackup environments.
  • New Data Domain Boost support for Dell NetVault Backup – delivering 50% faster and more efficient backup and disaster recovery in Dell NetVault Backup environments.

 

Check out Data Domain’s market leading backup application ecosystem:


Data Domain systems are also making big waves in archiving.  As the first and only inline deduplication solution optimized for consolidating backup and archive data on a single system, Data Domain systems deliver unbeatable TCO:

  • 3x Faster Ingest Performance – ensuring quick and efficient archiving and accelerated archive storage migrations from legacy archive platforms to Data Domain systems.
  • Expanded Archive Application Support – including integration with EMC Documentum and IBM InfoSphere Optim.
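For readers new to the underlying idea, here is a minimal sketch of how content deduplication works in principle. Data Domain's own implementation uses variable-length segmentation and is far more sophisticated; this fixed-size-chunk example only illustrates why mostly-unchanged backup and archive streams consume so little extra space:

```python
# Minimal sketch of deduplication with fixed-size chunks and content hashes.
# Data Domain uses its own variable-length segmentation; this toy version
# only illustrates storing each unique chunk once and referencing duplicates.
import hashlib

CHUNK_SIZE = 4096
store = {}            # fingerprint -> chunk bytes (the pool of unique chunks)

def write_stream(data: bytes):
    """Return the list of fingerprints referencing the stored chunks."""
    refs = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        fp = hashlib.sha256(chunk).hexdigest()
        if fp not in store:          # only new content consumes space
            store[fp] = chunk
        refs.append(fp)
    return refs

backup_monday = b"A" * 8192 + b"B" * 4096
backup_tuesday = b"A" * 8192 + b"C" * 4096   # mostly unchanged since Monday

write_stream(backup_monday)
write_stream(backup_tuesday)
print(f"logical data: {len(backup_monday) + len(backup_tuesday)} bytes")
print(f"stored data : {sum(len(c) for c in store.values())} bytes")
```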

 

Data Domain systems can now be deployed with over 16 archiving applications, a few of which are highlighted below:


Today’s announcement also reflects EMC’s commitment to archiving software.  EMC SourceOne 7 is a milestone release with enhancements for file, email, and SharePoint archiving, including:

  • Faster Performance – a new indexing engine, support for Office 2010, and support for over 40 additional file types.
  • New File System Archiving – in-place indexing and tiered storage options reduce primary storage costs and allow for discovery of content regardless of location.
  • Enhanced Reporting, Auditing, and Monitoring – offers detailed auditing and reporting for all of the content types stored in the archive, and provides tighter integration with system management tools.

 

We’re also announcing major enhancements to EMC SourceOne Discovery Manager, including faster performance, a new GUI, expanded content support, and enhanced auditing and reporting!

Now customers have one solution for all their archiving and discovery needs:


We’re not the only ones excited about this news; check out what our customers and partners are saying.


Visit the Data Domain and SourceOne pages on emc.com and let us know what you think below.

By Peter C. Conway – Vice President, Enterprise Storage Division at EMC

Today at VMware’s Partner Exchange, EMC launched VMAX Cloud Edition. I’ve been a part of some pretty big transformations leading product management and technical marketing teams at companies like EMC and Microsoft. I can testify that it’s truly a fundamental change to the whole idea of what “enterprise-class storage” is, and how “as-a-service” delivery works for cloud. For more detail, and to participate in the conversation, join our launch webcast today at 11 am ET.

This is a first for the industry. It’s a self-service, enterprise-class, as-a-service cloud delivery platform that makes it easy to consume storage for public, private and hybrid clouds.

The breakthrough is self-service access. It’s easy, flexible and fast—with enterprise storage attributes. It’s all made possible by the pre-engineered and pre-configured service levels that form a service catalog for tenants to choose from. Gone are engines, drives and nerd-knobs with VMAX Cloud Edition. It’s about right-sizing your storage for your application needs. Also, all of the service levels spring from the most powerful, Tier-1 storage in the industry: VMAX.

What’s the impact?

VMAX Cloud Edition makes the design and implementation of “as-a-service” delivery up to 4.5X faster than any other multi-tenant storage in the industry today. How’s that possible? VMAX Cloud Edition essentially eliminates the steps needed to design, model and test service levels, create array configurations, and implement the configuration. VMAX Cloud Edition automates all of those day-to-day storage tasks.

Here’s how it works.

Need higher performance? Pick a higher-performance service level like Gold, Platinum or Diamond. Activity is low? Change on the fly to a service level like Silver or Bronze. A DBA, a business application owner or a virtualization manager – with no specialized storage skills – can easily provision storage 6X faster than a storage expert provisioning it traditionally.
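As a purely hypothetical sketch of what that service-catalog model looks like in practice (the level names mirror the post, but the performance targets and the provision() helper are invented for illustration and are not the VMAX Cloud Edition API):

```python
# Hypothetical sketch of the service-catalog idea: the tenant picks an
# outcome (a service level and a size), not engines, drives or RAID layouts.
# The performance numbers below are made up for illustration only.

CATALOG = {
    "Bronze":   {"target_iops_per_tb": 150},
    "Silver":   {"target_iops_per_tb": 300},
    "Gold":     {"target_iops_per_tb": 600},
    "Platinum": {"target_iops_per_tb": 1200},
    "Diamond":  {"target_iops_per_tb": 2400},
}

def provision(tenant: str, service_level: str, capacity_tb: int) -> dict:
    """Create a storage request purely in terms of the desired outcome."""
    spec = CATALOG[service_level]
    return {
        "tenant": tenant,
        "service_level": service_level,
        "capacity_tb": capacity_tb,
        "target_iops": spec["target_iops_per_tb"] * capacity_tb,
    }

# A DBA asks for 10 TB of Gold; no storage expertise required.
print(provision("payroll-db", "Gold", 10))
```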

Cool, huh?

Brian Garrett, Vice President, ESG Lab at Enterprise Strategy Group, offers his take on the experience here:

 

 

Jason Currill, CEO of Ospero, a UK-based worldwide cloud utility company, shares his reaction:


 

Like I said – a fundamental change.

VMAX Cloud Edition includes everything needed to run IT as a business. It’s got tenant-level metering and chargeback or show-back reporting, as well as REST APIs to integrate reporting, operation and self-service into your existing management and orchestration layer.

There’s also a linear cost model for predictable management of the business of cloud. Each service level is priced consistently per TB regardless of the quantity purchased, so private IT and cloud service providers can buy in and scale up without a cost penalty. That removes a lot of risk in establishing chargeback pricing.
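A small, hypothetical example of that linear chargeback model (the per-TB prices are invented; the point is simply that cost scales linearly with capacity at each service level):

```python
# Illustration of a linear, per-TB chargeback model. The prices are made up;
# the point is that cost scales linearly with capacity at each service level,
# with no volume penalty.

PRICE_PER_TB_PER_MONTH = {"Bronze": 80, "Silver": 120, "Gold": 200}  # $, illustrative

tenants = [
    ("test-dev",   "Bronze", 40),   # (tenant, service level, TB consumed)
    ("erp",        "Gold",   15),
    ("file-share", "Silver", 25),
]

for tenant, level, tb in tenants:
    monthly = PRICE_PER_TB_PER_MONTH[level] * tb
    print(f"{tenant:<10} {level:<7} {tb:>3} TB  ->  ${monthly:,}/month")
```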

One question remains. If users can easily make the choices they need based on service level outcome, and enterprise-class service level delivery is automated, will anyone ever look at storage the same way again?

Clivegold

The Big Data Storymap

Posted by Clivegold May 5, 2013

By Bill Schmarzo

I wanted to share some recent work that we have been doing inside EMC Global Services, to create a “Big Data Storymap” that would help clients understand the big data journey in a pictorial format.


The goal of a storymap is to provide a graphical visualization that uses metaphors and themes to educate our clients about the key components of a successful big data strategy[1].  And like any good map, there are important “landmarks” that I want to make sure you visit.

Landmark #1:  Explosive Market Dynamics


Market dynamics are changing due to big data.  Data, like water, is powerful.  Massive volumes of structured and unstructured data, a wide variety of internal and external data, and high-velocity data can either power organizational change and business innovation, or swamp the unprepared.  Organizations that don’t adapt to big data risk:

  • Profit and margin declines
  • Market share losses
  • Competitors innovating faster
  • Missed business opportunities

 

On the other hand, organizations that aggressively integrate big data thinking and capabilities will be able to:

 

  • Mine social and mobile data to uncover customers’ interests, passions, associations, and affiliations
  • Exploit machine data for predictive maintenance and operational optimization
  • Leverage behavioral insights to create a more compelling user experience
  • Integrate new big data innovations to modernize data warehouse and business intelligence environments (real-time, predictive)
  • Become a data-driven culture
  • Nurture and invest in data assets
  • Cultivate analytic models and insights as intellectual property

 

Landmark #2:  Business And IT Challenges




Big Data enables business transformation, moving from a “rearview mirror” view of the business using a subset of the data in batch to monitor business performance, to the predictive enterprise that leverages all available data in real-time to optimize business performance.  However, organizations face significant challenges in leveraging big data to transform their businesses, including:

  • Rigid architectures that impede exploiting immediate business opportunities
  • Retrospective reporting that doesn’t guide business decisions
  • Social, mobile, or machine insights that are not available in an actionable manner

 

Traditional business intelligence and data warehouses struggle to manage and analyze new data sources.  Their architectures are:


  • Batch-oriented, which delays access to the data for analysis
  • Brittle and labor intensive to add new data sources, reports, and analytics
  • Performance and scalability challenged as data scales to petabytes
  • Limited to aggregated and sampled data views
  • Unable to handle the tsunami of new, external unstructured data sources

 

Landmark #3: Big Data Business Transformation



Where are an organization’s aspirations with respect to leveraging big data analytics to power value creation processes?  Some organizations struggle to understand the business potential of big data and are unclear about the different stages of business maturity.  Our Big Data Maturity Model benchmarks an organization’s big data business aspirations and provides a way to identify the level of sophistication desired for data monetization opportunities:

  • Business Monitoring – deploys business intelligence to monitor on-going business performance
  • Business Insights – leverages predictive analytics to uncover actionable insights that can be integrated into existing reports and dashboards
  • Business Optimization – embeds predictive analytics into existing business processes to optimize select business operations
  • Data Monetization – creates new revenue opportunities by reselling data and analytics, creating “intelligent” products, or over-hauling the customer engagement experience
  • Business Metamorphosis – leverages customers’ usage patterns, product performance behaviors, and market trends to create entirely new business models

 

Landmark #4: Big Data Journey



The big data journey requires collaboration between business and IT stakeholders along a path that identifies the right business opportunities and the necessary big data architectures.  The journey needs to 1) focus on powering one of the organization’s key business initiatives while 2) ensuring that the big data business opportunities can be implemented by IT.  The big data journey follows this path:

  • Identify the targeted business initiative where big data can provide competitive advantage or business differentiation
  • Determine – and envision – how big data can deliver the required analytic insights
  • Define over-arching data strategy (acquisition, transformation, enrichment)
  • Build analytic models and insights
  • Implement big data infrastructure, technologies, and architectures
  • Integrate analytic insights into applications and business processes

 

Landmark #5: Operationalize Big Data



Successful organizations define a process to continuously uncover and publish new insights about the business.  Organizations need a well-defined process to tease out and integrate analytic insights back into the operational systems.  The process should clearly define roles and responsibilities between business users, the BI/DW team, and data scientists to operationalize big data:

  • Collaborate with the business stakeholders to capture new business requirements
  • Acquire, prepare, and enrich the data; acquire new structured and unstructured sources of data from internal and external sources
  • Continuously update and refine analytic models; embrace an experimentation approach to ensure on-going model relevance
  • Publish analytic insights back into applications and operational and management systems
  • Measure decision and business effectiveness in order to continuously fine-tune analytic models, business processes, and applications

 

Landmark #6: Value Creation City


Big data holds the potential to transform or rewire your value creation processes to create competitive differentiation.  Organizations need a big data strategy that links their aspirations to the organization’s key business initiatives.  Envisioning workshops and analytic labs identify where and how big data can power the organization’s value creation processes.  There is almost no part of the organization that can’t improve its value creation capabilities with big data, including:

  • Procurement to identify which suppliers are most cost-effective in delivering high-quality products on-time
  • Product Development to identify product usage insights to speed product development and improve new product launches
  • Manufacturing to flag machinery and process variances that might be indicators of quality problems
  • Distribution to quantify optimal inventory levels and supply chain activities
  • Marketing to identify which marketing campaigns are the most effective in driving engagement and sales
  • Operations to optimize prices for “perishable” goods such as groceries, airline seats, and fashion merchandise
  • Sales to optimize account targeting, resource allocation, and revenue forecasting
  • Human Resources to identify the characteristics and behaviors of the most successful and effective employees

 

The Big Data Journey Storymap

The big data storymap provides an engaging visual for helping organizations understand some of the key components of a successful big data strategy.  I hope that you will enjoy the storymap as much as I enjoyed the opportunity to work with Mark Lawson and Glenn Steinhandler to pull it together!
