SQLServerCentral - www.sqlservercentral.com

A community of more than 1,600,000 database professionals and growing

Featured Contents

The Voice of the DBA

A Large GDPR Victim

During the last two years, Redgate has been preparing for the GDPR to take effect in the European Union. As a company based in the UK, we recognized that there were both challenges and opportunities for our business. We needed to ensure we were compliant with the regulations, which would likely require us to change processes and educate our employees. At the same time, our customers would face similar challenges and there was an opportunity to help them achieve their own compliance with software tools.

GDPR enforcement began last May, though fairly slowly, with few fines and decisions handed down. Across the EU, it seems to have been a quiet period, with few companies told they were non-compliant. Most organizations likely think all their preparation has been worth the effort and believe that they are prepared for any complaints from customers or investigations from regulatory authorities. That confidence may have been shaken in the last week, as Google was assessed a fine of over $50 million for violations. In particular, the regulators in France found that Google had not obtained the consent needed to use certain data in personalizing ads. They also decreed that Google had not clearly presented information about how users' data would be handled and stored, and had made the process to opt out unnecessarily difficult.

This fine isn't much for the tech giant, but it's just the start and will likely force Google to change the way it handles data. It may also have implications for other tech companies of all sizes. Google is appealing the decision, and this will be an interesting case for data professionals to follow, since we may need to ensure that we can comply with the final ruling. Many of us view the data in our organizations as belonging to our employer, with free rein in how we handle, process, and store it. That may change quickly if the ruling is upheld.

Many of the decisions about how companies will deal with data are made by others, but data professionals often need to ensure that we comply with whatever rules our organizations decide to adopt. This means there are a number of practices we must consider. At a high level, we need to know what data is affected by the GDPR, or any other privacy regulation. This requires that organizations have a data catalog that allows them to track which data is sensitive and must be handled carefully. Few organizations have a comprehensive data catalog already, so this will be an area in which to focus resources during 2019.
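
As a minimal sketch of the idea (the label, table, and column names here are only illustrative), extended properties offer a lightweight, built-in way to start cataloging sensitive columns:

-- Tag a column as containing personal data (illustrative names)
EXEC sys.sp_addextendedproperty
    @name = N'Sensitivity', @value = N'GDPR - Personal Data',
    @level0type = N'SCHEMA', @level0name = N'dbo',
    @level1type = N'TABLE',  @level1name = N'Customers',
    @level2type = N'COLUMN', @level2name = N'Email';

-- List every column that has been classified so far
SELECT OBJECT_SCHEMA_NAME(major_id) AS table_schema,
       OBJECT_NAME(major_id) AS table_name,
       COL_NAME(major_id, minor_id) AS column_name,
       CAST(value AS nvarchar(128)) AS sensitivity_label
FROM sys.extended_properties
WHERE class = 1 AND name = N'Sensitivity' AND minor_id > 0;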

Once we are aware of where our sensitive data is stored, we must take precautions to protect this data throughout our organization. Most companies have implemented security in their production environments, but their data handling practices in test and development areas are often not the same. The GDPR calls for anonymization, randomized data, encryption, and other protections, which data professionals will need to implement in a consistent manner throughout their IT infrastructure.
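
As one hedged example of that last point, SQL Server 2016 and later include Dynamic Data Masking, which hides sensitive values from non-privileged logins (the table and column names below are illustrative). Note that masking only obfuscates query results; copies of production data headed for development or test environments generally need the values themselves replaced or randomized:

-- Mask an email address for non-privileged users (illustrative table/column)
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Show only the last four digits of a phone number
ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');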

Finally, accidents and malicious attacks will take place. This means that every organization really needs a process to detect data loss and a plan for disclosing the issues to customers. Auditing of activity, forensic analysis, and communication plans need to be developed, practiced, and distributed to the employees that may be involved in security incidents.
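
For the auditing piece, a rough sketch using SQL Server Audit might look like the following; the audit names, file path, and table are placeholders, not a recommendation:

-- Create a server audit that writes to a file (placeholder path and names)
CREATE SERVER AUDIT GDPR_Audit
    TO FILE (FILEPATH = N'C:\SQLAudit\');
ALTER SERVER AUDIT GDPR_Audit WITH (STATE = ON);

-- Record reads and changes against a sensitive table
CREATE DATABASE AUDIT SPECIFICATION GDPR_Customer_Access
    FOR SERVER AUDIT GDPR_Audit
    ADD (SELECT, INSERT, UPDATE, DELETE ON OBJECT::dbo.Customers BY public)
    WITH (STATE = ON);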

There may be other preparations needed, and the larger the company, the more work that will be required. Tools are critical to ensuring this process can be completed in a timely manner, both to save time in implementing processes and to show regulators that actions are underway to better protect data. Fine levels aren't mandated, and the more effort put into achieving compliance, the less likely it is that regulators will assess a fine equivalent to 4% of your annual revenue.

There will be plenty of other GDPR fines in the future, and it is worth following this case with Google to see how stringently the regulations will be enforced. The world of data handling practices is changing and all organizations need to get used to better disclosure of practices, tooling for customers, and protection of the data assets they hold.

Steve Jones from SQLServerCentral.com

Join the debate, and respond to today's editorial on the forums


The Voice of the DBA Podcast

Listen to the MP3 Audio (5.5 MB) podcast or subscribe to the feed at iTunes and Libsyn.

The Voice of the DBA podcast features music by Everyday Jones. No relation, but I stumbled onto them and really like the music.

ADVERTISEMENT
SQL Monitor

How SQL Server monitoring benefits your whole organization

SQL Server monitoring doesn’t just benefit your DBAs. In this new guide from Redgate, we take you through the different ways a robust monitoring solution has a positive impact across your organization, from your development teams to IT management, and from finance to your C-suite. Download your free copy now

SQL Prompt

Write, format, analyze, and refactor SQL fast with SQL Prompt

Writing SQL is 50% faster with SQL Prompt. Your SQL code can be formatted just the way you like it, you can create and share snippets with your team, and with code analysis you get suggestions to improve your SQL as you type. Download your free trial

Featured Contents

 

SSRS Scale Out with Standard Edition Containers and Instances

Paul Stanton from SQLServerCentral.com

Learn how to use database cloning to scale out your reporting services workload. More »


 

Overview of Azure SQL Database Performance Monitoring

Additional Articles from Database Journal

Transitioning to the Platform-as-a-Service model typically implies relinquishing a certain degree of control over your computing environment. One of the primary concerns related to this transition is a diminished level of transparency into the performance of cloud-resident workloads. Fortunately, with Azure SQL Database, you have a wide range of options that address this concern, allowing you to identify and remediate the overwhelming majority of performance-related issues. More »
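
For instance (a minimal sketch, not from the article itself), the sys.dm_db_resource_stats DMV in Azure SQL Database shows recent resource consumption as a percentage of the service tier's limits, sampled roughly every 15 seconds and retained for about an hour:

-- Recent CPU, IO, and memory utilization relative to the service tier limits
SELECT end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       avg_memory_usage_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;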


 

Database Development in Visual Studio using SQL Change Automation: Getting Started

SQL Change Automation (SCA) is a tool designed to help you automate and optimize the way you develop, build, test, and release SQL Server databases, including integration with source control. In this article, the first in a short series, Steve shows you how a team can use SQL Change Automation in Visual Studio to do development work on an existing database. More »


 

From the SQLServerCentral Blogs - Deploying SQL Server to Kubernetes using Helm

Andrew Pruski from SQLServerCentral Blogs

In previous posts I've run through how to deploy SQL Server to Kubernetes using YAML files. That's a great way... More »


 

From the SQLServerCentral Blogs - Controlling the firewall for an Azure SQL DB via T-SQL

Kenneth Fisher from SQLServerCentral Blogs

The other day I took a Microsoft Learn course about securing Azure SQL DB. It was really enjoyable and I... More »
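
For reference (a hedged sketch, not taken from the post itself), database-level firewall rules in Azure SQL DB can be managed with stored procedures like these; the rule name and address range are examples only:

-- Add or update a database-level firewall rule (example name and range)
EXECUTE sp_set_database_firewall_rule
    @name = N'OfficeRange',
    @start_ip_address = '203.0.113.1',
    @end_ip_address = '203.0.113.50';

-- Review existing rules, and remove one when it is no longer needed
SELECT * FROM sys.database_firewall_rules;
EXECUTE sp_delete_database_firewall_rule @name = N'OfficeRange';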

Question of the Day

Today's Question (by Steve Jones):

I want to change the default data directory on my SQL Server on Linux installation from /var/opt/data to /prod/data. What command should I run?

Think you know the answer? Click here, and find out if you are right.


We keep track of your score to give you bragging rights against your peers.
This question is worth 1 point in this category: SQL Server on Linux.

We'd love to give you credit for your own question and answer.
To submit a QOTD, simply log in to the Contribution Center.

ADVERTISEMENT

T-SQL Querying (Developer Reference)

Squeeze maximum performance and efficiency from every T-SQL query you write or tune. Four leading experts take an in-depth look at T-SQL’s internal architecture and offer advanced practical techniques for optimizing response time and resource usage. Get your copy from Amazon today.

Yesterday's Question of the Day

Yesterday's Question (by Steve Jones):

The Buffer Pool Extension file size can be how many times the setting of max server memory in SQL Server 2017?

Answer: 32

Explanation:

The Buffer Pool Extension can be set to a value up to 32 times the size of max server memory.

Ref: Buffer Pool Extension - click here
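
As a quick illustrative sketch (the file path and size below are placeholders), the extension is enabled with ALTER SERVER CONFIGURATION; the SIZE must be at least as large as max server memory and no more than 32 times that value:

-- Enable a buffer pool extension file on fast storage (placeholder path/size)
ALTER SERVER CONFIGURATION
SET BUFFER POOL EXTENSION ON
    (FILENAME = N'F:\SSDCACHE\ExtensionFile.BPE', SIZE = 64 GB);

-- Check the current state of the extension
SELECT path, state_description, current_size_in_kb
FROM sys.dm_os_buffer_pool_extension_configuration;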


» Discuss this question and answer on the forums

Database Pros Who Need Your Help

Here are a few of the new posts today on the forums. To see more, visit the forums.

SQL Server 2017 : SQL Server 2017 - Development

2017 SSIS PackageFormatVersion? - Would someone please look in a 2017 SSIS package and post the PackageFormatVersion value? I expect that it is 9...

What is Alternative of numbered_procedures? - Hi every body I use numbered_procedures because of versioning purpose. (Notice: versioning means version of live and up procedures and Doesn't...


SQL Server 2016 : SQL Server 2016 - Administration

Basic Availability Group vs. Database Mirroring - any experiences with large scale deployments? - Hello The company I work for hosts hundreds of databases and uses database mirroring as a high availability solution.  Microsoft's suggested...

TDE and absolutely ridiculous amounts of corruption? - I've searched around quite a bit and can't find anyone else with this problem, but ever since we implemented TDE...


SQL Server 2016 : SQL Server 2016 - Development and T-SQL

Dynamically outputting a set of columns based on another Tables records - I have a table (called #OutputTable) which i would like to be able to dynamically select certain columns from For...


SQL Server 2014 : Administration - SQL Server 2014

Compatibility - I have changed the compatibility of database accidentally and then reverted it back. Would that cause any problem? Please advise? Thanks


SQL Server 2014 : Development - SQL Server 2014

Combining Two Queries into Single Query - The view  gives the below output. CREATE VIEW V1 AS SELECT SELECT DISTINCT    PT.PRODUCT AS PRODUCT, PT.TEXT_CODE AS TEXT_CODE,    PHT.PHRASE...

SQL Query Help - PIVOT or CROSS APPLY? - Hi, I have a dataset that looks like this: CREATE TABLE .(      (255) NULL,      (255) NULL,      (255) NULL,      NULL,

Performance Issue with Simple Query /Big Tables - This query (attached estimated query plan) returns Exception of type 'System.OutOfMemoryException' was thrown.  Are queried tables just too huge or can query performance...

Recursive CTE for Supervisor Hierarchy WITH Effective Date of Reporting Relationship - Updated with DDL and Sample Data - Okay, I'm editing my request per Lynn's suggestion. Below, in the comments, I'm posting the DDL to generate the sample...


SQL Server 2012 : SQL 2012 - General

how to track DB read-write operations for 30 days without using Profiler - Hi, Before database decommission, there is task to trace that whether database is involving in any read-write operations or not for...


SQL Server 2012 : SQL Server 2012 - T-SQL

Query Question - I have a table that contains multiple fields: Diagnosis1 Diagnosis2 Diagnosis3 Diagnosis4 Diagnosis5 Diagnosis6 Diagnosis7 Diagnosis8 Diagnosis9 each field only contain codes, the descriptions are in another table. What I...

Question on grouping - SELECT AUC.AUCTION_NAME_LONG,

2008 -> 2012 Performance Issues - Ok very strange one this that is outside of my skill level unfortunately. We've got a fairly large database (35GB)...


SQL Server 2008 : SQL Server 2008 - General

storing Image in Database (SQL Server ) vs File System - Dear All In My scenario we are handled more then 1 TB size images.Last couple of  years we are storing all...

ssis package succesffuly executes when we run from visual studio, but when we run from sql server agent it still is sucessfully running but doesnt pull any data - I have 3 packages that run by a Proxy account who has all the permissions to access the files in...


Reporting Services : Reporting Services

How can I add the server name to the subject line of a SSIS subscription email? - How can I add the server name to the subject line of a SSIS subscription email? In Microsoft SQL Server Integration...


Data Warehousing : Strategies and Ideas

Polling Datawarehouse Design - Hello I really need your help. I am currently working on a BI project for a polling institute. We have surveys that we...

Build dimension as a VIEW - Hello, I have build a dimension as a VIEW. Definition of the view is below: CREATE VIEW dbo.PowerBI_DimClient(ClientId,Client) WITH SCHEMABINDING AS WITH CTE AS (     SELECT    DISTINCT...


Database Design : Design Ideas and Questions

Entity Framework & RowVersion - Hi all, One of our developers has put a rowversion field with the datatype timestamp into all of the tables of...

This email has been sent to newsletter@newslettercollector.com. To be removed from this list, please click here.
If you have any problems leaving the list, please contact the webmaster@sqlservercentral.com.
This newsletter was sent to you because you signed up at SQLServerCentral.com.
Feel free to forward this to any colleagues that you think might be interested.
If you have received this email from a colleague, you can register to receive it here.
This transmission is ©2018 Redgate Software Ltd, Newnham House, Cambridge Business Park, Cambridge, CB4 0WZ, United Kingdom. All rights reserved.
Contact: webmaster@sqlservercentral.com