One of the interesting things about this trip was that I was very unwired. In fact, I didn't even take a laptop, which is a rarity for me. Often I'll take one to jot notes, sketch an editorial, or just have around to check on things. This time, I didn't bother, and neither did my wife. One kid did, but only because he was enrolled in a couple of classes (CS and Math) this summer and had assignments to do.
The only device I had was my phone, which didn't have service for much of the trip. During long drives (I was glad Google Maps downloads directions), at many of our campsites, and in the national parks, there was little service. In fact, I was surprised when I did have service. The camping locations did have wi-fi, but I used it only to upload pictures of the trip, mostly to ensure I had a backup, and didn't even bother checking social media as I posted.
I did check email once early on to be sure that I hadn't forgotten anything from work, and forwarded a couple of items to people, but when I arrived home last week, I had hundreds of messages from Redgate and probably close to a thousand from SQLServerCentral. Handling and deleting them all was quite a chore, and took half of my first Monday back in the office.
What was interesting to me is that I didn't really miss social media, certainly didn't miss the news, and nothing that important came up. I didn't worry about work, knowing that something would go wrong (such as a few buggy questions) and that someone would handle the situation (thanks to Grant for fixing things) or delay the item until I returned. I found numerous meetings canceled or rescheduled without me, and quite a few hastily dashed-off emails or queries that were later retracted or amended. In fact, going through email from newest to oldest let me delete older, out-of-date messages without reading them.
I rarely take more than a few days off at a time, usually just a long weekend here and there, but this proved (again) something I've suspected for a long time: much of the pressure we feel from deadlines at work is arbitrarily imposed. Whether we do it to ourselves or someone else picks a date and time, there often isn't any real rationale for the choice. Usually the goal is to complete work as fast as possible, but if there are delays, if life, other tasks, or unforeseen hardware or software failures get in the way, the business will survive.
I learned a long time ago that despite my best efforts, things will go wrong and there will be delays. I've learned to expect them, accept them, and try not to add any extra stress on myself when things slip. Go home at a reasonable hour most of the time, and live the rest of your life, while doing the best professional job you can. There will always be more work, and the business will survive if some of your work ends up being delayed slightly.
The Voice of the DBA podcast features music by Everyday Jones. No relation, but I stumbled onto them and really like their music.
ADVERTISEMENT
Benchmark your Database DevOps maturity level
Get a better understanding of how advanced your current processes are, receive recommendations for improvements, and see how your maturity level compares with that of your peers. Complete the Database DevOps Maturity Assessment
What’s the top challenge faced by SQL Server professionals in 2018?
Learn how 626 SQL Server professionals monitor their estates in our new report on the State of SQL Server Monitoring. Discover the challenges currently facing the industry, and what is coming next. Download your free copy of the report
Being a database administrator is much more than knowing how to install SQL Server and set up a database. One of the most important responsibilities is being proactive by monitoring the instances in your care. But what should be monitored? Here are the top five things to monitor when you are a SQL Server DBA More »
By encouraging collaboration and teamwork, removing the barriers between development and operations, and introducing automation, DevOps speeds up software delivery and enables features to move from the fingers of developers into the hands of customers faster More »
SELECT 'Player' = s.playername,
       'CurrentValue' = s.points,
       'Behind' = LAST_VALUE(s.points) OVER (ORDER BY points DESC) - s.points
FROM dbo.Scorers AS s
ORDER BY s.points;
What results do I get for the Behind column?
Think you know the answer? Click here, and find out if you are right.
We keep track of your score to give you bragging rights against your peers. This question is worth 1 point in this category: LAST_VALUE.
We'd love to give you credit for your own question and answer. To submit a QOTD, simply log in to the Contribution Center.
This book shows how to deliver eye-catching Business Intelligence with Microsoft Power BI Desktop. You can now take data from virtually any source and use it to produce stunning dashboards and compelling reports that will seize your audience’s attention. Slice and dice the data with remarkable ease then add metrics and KPIs to project the insights that create your competitive advantage.
Yesterday's Question of the Day
Yesterday's Question (by Steve Jones):
I want to detect in my Python source file whether the file is being run as a script or imported into a REPL or another module. What variable allows me to detect if I've run the file as a script?
Answer: __name__
Explanation:
The __name__ variable will contain either the name of the module or "__main__", depending on context. If the file is run as a script, then the value is "__main__".
This stored procedure converts a table or SELECT query to an HTML table format, with some customization options.
As a base, I took a script Carlos Robles (dbamastery.com) provided me for a static table, and modified it to accept any table and apply different styles (or none). You can also choose whether to output the column names in the table.
NOTES:
This SP works with dynamic queries, and the input is not validated, so it is vulnerable to SQL injection attacks; always validate your queries first.
Null values are not converted in this initial release, so remove them from your data before using it.
Some special datatypes, such as geography, timestamp, xml, and image, are not supported; if you try to use them, an error will be raised, so remove these columns first.
This tool is not designed to handle huge amounts of data, so for very large data sets, split the work into multiple executions.
PARAMETERS:
@stTable: the input, either a schema.object table name or a SELECT query
@RawTableStyle: OUTPUT variable, to use in another process or programmatically
@includeColumnName: 0=does not include column names | 1=include column names (DEFAULT)
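To make the transformation concrete, here is a minimal Python sketch of the same idea: rendering rows as an HTML table, with a flag mirroring @includeColumnName. The function name and parameters are my own invention for illustration, not part of the stored procedure, and unlike the SP this sketch escapes values and renders NULL/None as an empty cell:

```python
from html import escape

def rows_to_html_table(rows, column_names=None, include_column_names=True):
    """Render a list of row tuples as a single HTML table string.

    Cell values are HTML-escaped; None is rendered as an empty cell
    (the stored procedure, by contrast, does not handle NULLs yet).
    """
    parts = ["<table>"]
    if include_column_names and column_names:
        header = "".join(f"<th>{escape(str(c))}</th>" for c in column_names)
        parts.append(f"<tr>{header}</tr>")
    for row in rows:
        cells = "".join(
            f"<td>{'' if v is None else escape(str(v))}</td>" for v in row
        )
        parts.append(f"<tr>{cells}</tr>")
    parts.append("</table>")
    return "".join(parts)

# Example usage: two rows, one containing a NULL-like value.
html = rows_to_html_table(
    [("Alice", 10), ("Bob", None)],
    column_names=("name", "points"),
)
```

Escaping each cell is the detail that matters most here: without it, a value like `<script>` in the data would be injected straight into the generated markup.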
Separate DB, log and Temp drives? - These days, is there actually any point? With VMs and SANs, is there any real point in separating data, logs...
Execute same SELECT with a different condition - I have this code. SELECT CT.claim_type_name AS ClaimType, COUNT(CM.claim_type) AS Cases, ISNULL(SUM(CC.original_amount),0) AS SumInsuredTotal INTO #ChosenYear FROM ..dbo.t_claim_type CT LEFT OUTER JOIN ..dbo.p_claim_main CM ON CM.claim_type = C
Partitioning Huge Table (about 650GB) - Hi all, in my data warehouse (SQL Server 2016 Std) I need to maintain 36 (!!) versions of a table in order...
This newsletter was sent to you because you signed up at SQLServerCentral.com. Feel free to forward this to any colleagues that you think might be interested. If you have received this email from a colleague, you can register to receive it here.