Need help with your GRC Data Lake?

The GRC industry is awash with individuals, like many of our Members, who are looking for help in planning, designing, implementing and managing the deployment of a new era of GRC Data Lake. As Big Data Quarterly magazine reports in its Spring 2016 Best Practice Series (by Joe McKendrick), data lakes have become a mainstream strategy for many enterprises over the past couple of years, with promises of greater flexibility in the way data is handled and made available to decision makers. A recent survey by Unisphere Research, a division of Information Today, Inc., found that 20% of data managers and professionals are currently deploying data lakes, and 45% are learning about and researching them. A majority, 56%, have a positive impression of the concept and believe it may provide value to their businesses, and at least 38% indicate their companies are committed to data lake strategies ("Data Lake Adoption and Maturity Survey Findings Report," Unisphere Research, October 2015).

In most cases, data lakes are defined as data environments that capture and store raw data. A data lake comprises data in its original format, to be transferred and transformed at a later date as applications and end users demand. The thinking behind the concept is that the analytics or questions to be applied against the data may not yet have been identified, and by holding the data in a relatively accessible environment, it remains open for innovation. However, as with any major enterprise data initiative, the concept has to be sold to the enterprise. That's where our GRCme Brain Trust members come into play. We can help your team get running quickly through our GRCme University educational curricula, while GRC Tenders and e-commerce bring you into contact with the movers and shakers of our GRC industry. We are focused on helping our Members understand how their organizations can leverage a new era data architecture that offers:

1. Running 100% of analytics in-database

2. Saving time and money with an intelligent modeling layer

3. Building a modern web architecture with a closed-loop control architecture at its core, to measure enterprise performance and values against industry peers

4. Planning for GRC industry-driven crowdsourcing and benchmarking

5. Showing how consultancies and end users alike can benefit from near real-time data feeds of GRC best practices, KPIs, and Key Risk Indicators (KRIs) that return "peer average" and "best-in-class" performance data

6. Addressing GRC-specific application needs for audit trends, anti-fraud, cyber security, crisis response, defense, insider threat, intelligence, legal intelligence, law enforcement, your own custom applications, and more
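To make point 5 concrete, here is a minimal sketch of how a benchmarking feed might roll peer-reported KPI/KRI readings up into "peer average" and "best-in-class" figures. All metric names and values below are hypothetical, and a real feed would of course be far richer; the point is only to show the shape of the computation.

```python
from statistics import mean

# Hypothetical peer-reported KPI readings: (metric, peer values, higher_is_better).
# For some KRIs (e.g. time to remediate), lower values are better.
kpi_feed = [
    ("audit_findings_closed_pct", [72.0, 88.5, 91.0, 64.5], True),
    ("mean_time_to_remediate_days", [12.0, 30.0, 21.5, 18.0], False),
]

def summarize(readings, higher_is_better):
    """Return peer-average and best-in-class values for one metric."""
    best = max(readings) if higher_is_better else min(readings)
    return {"peer_average": round(mean(readings), 2), "best_in_class": best}

benchmarks = {name: summarize(vals, hib) for name, vals, hib in kpi_feed}
# e.g. benchmarks["audit_findings_closed_pct"]
#      → {"peer_average": 79.0, "best_in_class": 91.0}
```

Each member organization could then compare its own reading against `peer_average` and `best_in_class` for the same metric.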

Data lakes absorb data from a variety of sources and store it all in one place, with all the necessary requirements for integration and security. Data lakes are a response to the eternal problem of data silos, attempting to bypass these various, fragmented environments to finally maintain data all in one place. The data lake also reduces the requirement for immediately processing or integrating the wide variety of data formats that comprise big data.
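This "store now, structure later" idea is often called schema-on-read. The sketch below illustrates it under stated assumptions: the in-memory `lake` list stands in for object storage, and the source names and payloads are invented for the example. Records land untouched with provenance metadata; parsing happens only when an application reads them.

```python
import json
from datetime import datetime, timezone

lake = []  # stands in for the lake's storage layer (e.g. object storage)

def ingest(source, raw_bytes):
    """Store the payload as-is, tagged with its source and arrival time.

    No parsing and no schema are applied at write time.
    """
    lake.append({
        "source": source,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "payload": raw_bytes,
    })

def read_as_json(source):
    """Apply structure only at read time (schema-on-read)."""
    return [json.loads(rec["payload"]) for rec in lake if rec["source"] == source]

# Heterogeneous sources land side by side in their original formats.
ingest("vendor_risk_feed", b'{"vendor": "Acme", "risk_score": 42}')
ingest("access_logs", b"2016-04-01T00:00:00Z login alice")  # stays raw until needed

vendors = read_as_json("vendor_risk_feed")  # structured only when queried
```

Because the access-log records are never forced through the JSON schema, the two formats coexist in one store, and each consumer applies only the structure it needs.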

We will be covering a number of best practices from Joe McKendrick and GRC Sphere to help you get off to a quick start. We are also in the planning stages for a Master Class that addresses the Plan, Design, Implement and Manage stages of creating a world-class, new era GRC data lake with teeth. Contact us at [email protected] if you would like to join our GRCme Brain Trust and help our Members get their GRC data lakes started.

Category: GRC Data Lake
