September 2018: Data Virtualization

Presented by Robert Eve & Leo Duncan

TIBCO Software & Nordisk Systems

Modern development cycles are aggressively short. Agile enterprises can no longer afford to wait weeks for a new object to be delivered to a data store. Data integration must be immediate to maximize the data's value to the organization.
Extract, Transform, and Load (ETL) has been the backbone of data integration for as long as most of us can remember. We take something from platform X, massage it, and put it on platform Y. Then users have to find that data and manually combine it with data from other sources. This is a complex, time-consuming, and cumbersome process.

We’ve been doing ETL for so long that it has become synonymous with the “nightly cycle” at many shops. Missing that nightly cycle can often result in Day +2 analytics, or even worse. We’ve become so comfortable with ETL that we’ve forgotten to ask the basic question: “Is there a better way to integrate data?” The answer is absolutely yes.

Come join us to see how you can free yourself from the historical constraints of ETL. Maximize the benefits of your existing technology investments while also building a bridge to cloud-based platforms. Learn how data virtualization can enable you to leverage your data without unnecessary movement. See for yourself how data virtualization can reduce multiple-week integration projects to just a few hours.
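To make the contrast concrete, the sketch below shows the federated idea behind data virtualization in plain Python. It is only an illustration, not TIBCO's product interface: the connections, tables, and the virtual_customer_orders() helper are hypothetical stand-ins for two live source systems and a virtual view. The join happens at request time against the sources where they live; the ETL alternative would extract both tables, transform them, and load a copy into a third store on a nightly schedule.

```python
# Illustrative sketch only: contrasts a nightly ETL copy with an on-demand,
# federated ("virtualized") join across two live sources. The source systems,
# tables, and virtual_customer_orders() are hypothetical; a real data
# virtualization layer exposes this as a SQL view rather than application code.
import sqlite3

# Two independent systems of record, each with its own store.
crm = sqlite3.connect(":memory:")      # stands in for a CRM database
orders = sqlite3.connect(":memory:")   # stands in for an order-entry system

crm.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO customer VALUES (?, ?)",
                [(1, "Acme Corp"), (2, "Globex")])

orders.execute("CREATE TABLE sales_order (customer_id INTEGER, amount REAL)")
orders.executemany("INSERT INTO sales_order VALUES (?, ?)",
                   [(1, 1200.0), (1, 450.0), (2, 300.0)])

def virtual_customer_orders():
    """Federate the join at query time: read each source where it lives,
    combine the results in flight, and return the integrated view without
    first landing a copy in a warehouse."""
    customers = dict(crm.execute("SELECT id, name FROM customer"))
    totals = orders.execute(
        "SELECT customer_id, SUM(amount) FROM sales_order "
        "GROUP BY customer_id ORDER BY customer_id")
    return [(customers[cid], total) for cid, total in totals]

print(virtual_customer_orders())  # [('Acme Corp', 1650.0), ('Globex', 300.0)]
```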

In this session you will learn:

  • What data virtualization is and how it works
  • Twelve tangible ways data virtualization can help you overcome analytics data challenges
  • Case studies in, and a proven strategy for, data virtualization implementation success

And as a bonus, session attendees will each receive a free copy of Data Virtualization: Going Beyond Traditional Data Integration to Achieve Business Agility.

June 2018: Data Management for the Internet of Things

Presented by Michael Scofield, M.B.A.

Assistant Professor, Loma Linda University

The “internet of things” depends upon communication between various devices, and that communication carries data. When data moves, it has architecture, and it is that architecture of “data in motion” (albeit small records within a transaction) that must be astutely designed.

The quality of any business or industrial process outcome depends upon three major foundations:

  1. Quality and reliability of the hardware (and physical network) supporting it.
  2. Quality of design of the process and decision rules. This includes anticipating all contingencies that would influence a decision made without human judgment or involvement.
  3. Quality of the data at capture, and quality of the definition and clarity of the data conveyed between devices (see the sketch following this list).
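As a purely illustrative sketch of that third foundation, the Python record below gives a sensor reading an explicit definition: named fields, stated units, a timezone-aware timestamp, and a plausibility check applied at capture. The class, field names, units, and ranges are hypothetical, not drawn from the presentation.

```python
# Hypothetical sketch of a well-defined "data in motion" record for one sensor
# reading. Each element conveyed between devices has an explicit definition
# (units, allowed range, timestamp semantics) and is validated at capture.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class TemperatureReading:
    device_id: str          # globally unique device identifier
    observed_at: datetime   # timezone-aware timestamp of the measurement itself
    celsius: float          # degrees Celsius, not an unlabeled "value"

    def __post_init__(self):
        # Validate at capture so downstream decision rules can trust the data.
        if self.observed_at.tzinfo is None:
            raise ValueError("observed_at must carry an explicit timezone")
        if not -80.0 <= self.celsius <= 150.0:
            raise ValueError(f"implausible temperature: {self.celsius} C")

reading = TemperatureReading(
    device_id="sensor-042",
    observed_at=datetime(2018, 6, 1, 12, 30, tzinfo=timezone.utc),
    celsius=21.5,
)
print(reading)
```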
