Joshua Lynn started programming databases when he was 14 years old. Professionally, Joshua has been in the IT industry since 1991, specializing in database design, development, and performance enhancement. Throughout his career he has led IT and development teams in projects ranging from COLO fit-outs to full n-tier application development on the various incarnations of the Microsoft platform. His experience with SQL Server goes back to version 6.5, where he thought triggers were really neat. Currently he is working on SQL Server 2008 R2 projects and still believes CTEs have enhanced the quality of his professional life — although the Tablix feature of SSRS is pretty neat, too. Surprisingly, Joshua's educational background is in mechanical engineering; he has no formal training in what he does for a living, but he does maintain an unusual passion for SQL and high-performance database query algorithms. In his spare time over the last 14 years he has mentored high school students in building and programming robots for the FIRST Robotics Challenge, an international competition with over 1,500 teams and 35,000 students. Additionally, Mr. Lynn is an advocate for deadlock victims' rights.
A DBA's Prayer: "May the deities of databases grant us the serenity to accept the things we cannot change, the courage to change the things we can, and the wisdom to know the difference."
This presentation is about putting some wisdom behind how we deal with changing data. How is changing data detected? Did the data really change, or does it just look like it may have changed? How do we know the old data we're comparing against hasn't itself changed? What does the change mean, and why the !@#$% doesn't the application layer deal with this so I can get back to "real" database work?
Dealing with changing data is a big part of managing data. There are as many techniques as there are reasons for detecting change, and the rules for dealing with changing data are not as well defined as the normal forms for persisting data. This presentation analyzes some use cases behind detecting and managing changing data and provides consistent solutions using a logical framework that can be adapted to purpose and technology.
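To make the flavor of the problem concrete, here is a minimal sketch (not from the presentation itself) of one common change-detection approach: fingerprinting tracked columns and classifying incoming rows as inserts, updates, deletes, or unchanged. The function names and the NULL-sentinel convention are illustrative assumptions; note how the sketch distinguishes NULL from the empty string, one of the classic "weak data" pitfalls.

```python
import hashlib


def row_fingerprint(row, columns):
    """Build a stable fingerprint over the tracked columns of a row.

    A sentinel distinguishes NULL (None) from the empty string, so a
    column changing from NULL to '' registers as a real change.
    """
    parts = []
    for col in columns:
        value = row.get(col)
        parts.append("\x00NULL" if value is None else str(value))
    # Unit-separator join avoids accidental collisions between columns.
    joined = "\x1f".join(parts)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()


def detect_changes(old_rows, new_rows, key, columns):
    """Classify incoming rows against the stored set by business key."""
    old_by_key = {r[key]: row_fingerprint(r, columns) for r in old_rows}
    new_keys = {r[key] for r in new_rows}

    inserts, updates, unchanged = [], [], []
    for r in new_rows:
        if r[key] not in old_by_key:
            inserts.append(r[key])
        elif old_by_key[r[key]] != row_fingerprint(r, columns):
            updates.append(r[key])
        else:
            unchanged.append(r[key])
    deletes = [k for k in old_by_key if k not in new_keys]
    return inserts, updates, deletes, unchanged
```

In a SQL Server setting the same idea typically shows up as a HASHBYTES comparison inside a MERGE, but the permutations of change (insert, update, delete, no-op) and the NULL pitfalls are the same.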
So the next time you're handed a project where the requirements say to load or extract only the changes, you'll have a new perspective to complete an analysis and formulate a solution that you know covers the permutations of change, avoids the pitfalls of weak data, and has the ability to adapt.